Hi Snowplowers,
I ran a manual test on a React site to exercise each of the Snowplow events and validate their schemas.
A weird situation happens when I click, for example, the ad click event button quickly about 10 times: usually only 6-8 events end up saved in BQ and the rest go missing, even though all 10 events fire successfully on the client.
code:
doAdClick = () => {
  // Handler wired to the test button; sends an 'AdClick' event through our analytics wrapper
  analytics.track('AdClick', {
    targetUrl: "http://www.test.com",
    clickId: "11233",
    costModel: "cpm",
    cost: 3,
    zoneId: "9",
    impressionId: "9/10/3:40",
    advertiserId: "201",
    campaignId: "123124"
  })
}
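For reference, we believe analytics.track('AdClick', ...) is a thin wrapper that ends up as Snowplow's ad_click event. A minimal sketch of what the equivalent direct call would look like with the v3 browser tracker and the ad-tracking plugin (the tracker name and collector URL below are placeholders, not our real setup):

// Sketch only: assumes @snowplow/browser-tracker v3 plus the ad-tracking plugin
import { newTracker } from '@snowplow/browser-tracker';
import { AdTrackingPlugin, trackAdClick } from '@snowplow/browser-plugin-ad-tracking';

// Placeholder tracker name and collector endpoint
newTracker('sp', 'https://collector.example.com', {
  appId: 'react-test-site',
  plugins: [AdTrackingPlugin()],
});

const doAdClick = () => {
  // Same fields as the analytics.track call above
  trackAdClick({
    targetUrl: 'http://www.test.com',
    clickId: '11233',
    costModel: 'cpm',
    cost: 3,
    zoneId: '9',
    impressionId: '9/10/3:40',
    advertiserId: '201',
    campaignId: '123124',
  });
};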
While debugging, we saw all 10 events in the raw good topic (good-sub), but not all of them in the enriched-good topic.
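(For reference, here is roughly how we counted messages on each subscription while pressing the button. This is a minimal sketch with the Node Pub/Sub client; the subscription names are illustrative, not our real ones.)

// Counts messages arriving on a subscription over a short window
const { PubSub } = require('@google-cloud/pubsub');

const pubsub = new PubSub();

function countMessages(subscriptionName, windowSeconds = 30) {
  let count = 0;
  const subscription = pubsub.subscription(subscriptionName);

  const onMessage = (message) => {
    count += 1;
    message.ack();
  };
  subscription.on('message', onMessage);

  // Stop listening after the window and report the total
  setTimeout(() => {
    subscription.removeListener('message', onMessage);
    console.log(`${subscriptionName}: received ${count} messages`);
  }, windowSeconds * 1000);
}

countMessages('good-sub');          // raw collector good topic
countMessages('enriched-good-sub'); // enriched good topic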
In BQ we also saw quite a large number of adapter_failures bad rows:
{
  "app_id": "",
  "timestamp": "2021-10-13T22:49:50.175Z",
  "event_name": "",
  "error_type": "adapter_failures",
  "schema": "iglu:com.snowplowanalytics.snowplow.badrows/adapter_failures/jsonschema/1-0-0",
  "data": {
    "failure": "{\"timestamp\": \"2021-10-13T22:49:50.175Z\", \"vendor\": \"snowplow\", \"version\": \"health\", \"messages\": [{\"field\": \"vendor/version\", \"value\": \"snowplow/health\", \"expectation\": \"vendor/version combination is not supported\"}]}",
    "payload": "{\"vendor\": \"snowplow\", \"version\": \"health\", \"querystring\": [], \"contentType\": null, \"body\": null, \"collector\": \"ssc-2.3.2-rc1-googlepubsub\", \"encoding\": \"UTF-8\", \"hostname\": \"x.x.x.x\", \"timestamp\": \"2021-10-13T22:49:49.215Z\", \"ipAddress\": \"35.191.x.x\", \"useragent\": \"GoogleHC/1.0\", \"refererUri\": null, \"headers\": [\"Timeout-Access: <function1>\", \"Host\", \"User-Agent: GoogleHC/1.0\", \"Connection: Keep-alive\"], \"networkUserId\": \"xxx\"}",
    "processor": {
      "artifact": "beam-enrich",
      "version": "1.2.3"
    }
  }
}
Could it be that Snowplow treated some of the repeated events as robot-generated events?
If so, how can we configure the enrichment to avoid this? Thank you!