GTM pre-pipeline event forwarding (Azure): defining self-describing events

Hey all,

I’ve been setting up the pipeline on Azure and ran into a few Azure-specific limitations. One of them seems to be that there is no ready-made streaming service for post-pipeline event forwarding. Has anyone found a way to relay events post-pipeline with the Azure setup, by any chance? If you could share, I’d be immensely grateful!

Because of that, I set up GTM to relay events pre-pipeline, by installing the Snowplow Client and Tag.

To enable forwarding of self-describing events (uri_redirect, link_click) to my Collector, I had to define each of them (screenshot below):
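For context, each of these is a standard Snowplow self-describing event: a JSON payload wrapped in its Iglu schema reference. A rough sketch of the two event shapes (the schema URIs are the standard Snowplow ones; the data values are made up):

```js
// Illustrative payloads only - the schema URIs are the standard Snowplow
// schemas for these events, the data values are invented examples.
const linkClick = {
  schema: 'iglu:com.snowplowanalytics.snowplow/link_click/jsonschema/1-0-1',
  data: {
    targetUrl: 'https://example.com/landing',
  },
};

const uriRedirect = {
  schema: 'iglu:com.snowplowanalytics.snowplow/uri_redirect/jsonschema/1-0-0',
  data: {
    uri: 'https://example.com/destination',
  },
};
```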

Am I doing something wrong, or is it really the case that these definitions need to be implemented for every unstructured or custom event to be forwarded?

Many thanks in advance and have a great day! :smile:

If the Snowplow Client in GTM has the “Include Original tp2 Event” option enabled (it defaults to enabled), requests that reach GTM via the standard collector endpoints (POST /com.snowplowanalytics.snowplow/tp2 or GET /i) are automatically forwarded verbatim by any Snowplow Tags to their configured collector. Event definitions like the ones in your screenshot are ignored in that case; they only apply to requests that aren’t handled by the Snowplow Client.
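To be concrete about what “verbatim” means here: what gets forwarded is the original tracker envelope. A sketch of a standard tp2 POST body, with invented field values (the payload_data and unstruct_event schema URIs are the real ones):

```js
// Shape of a /com.snowplowanalytics.snowplow/tp2 POST body. Each entry in
// `data` is one tracker-protocol event; `e: 'ue'` marks a self-describing
// ("unstructured") event carried in ue_pr (or base64-encoded in ue_px).
const tp2Body = {
  schema: 'iglu:com.snowplowanalytics.snowplow/payload_data/jsonschema/1-0-4',
  data: [
    {
      e: 'ue',        // event type: self-describing event
      p: 'web',       // platform
      tv: 'js-3.5.0', // tracker version (example value)
      eid: '3a6b2f18-55cd-4e52-9a6d-1f0d9c2b7a41', // example event id
      ue_pr: '{"schema":"iglu:com.snowplowanalytics.snowplow/unstruct_event/jsonschema/1-0-0","data":{"schema":"iglu:com.snowplowanalytics.snowplow/link_click/jsonschema/1-0-1","data":{"targetUrl":"https://example.com/landing"}}}',
    },
  ],
};
```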

So it should just work without those definitions, unless you have other Clients you want to generate events from; those do require mapping.

The exception is the built-in URI redirect, which isn’t supported by the GTM Client/Tag, so you would probably need to implement that in GTM yourself and do the mapping for it. That should be the only real exception, though.
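If you go down that route, a minimal sketch of a custom server-side GTM Client in sandboxed JavaScript might look like the following. Assumptions to flag: the collector-style redirect path /r/tp2 with its u query parameter, and the event field names (event_name, the x-sp-… key), which mimic the Snowplow Client’s conventions but should be checked against what your Snowplow Tag actually expects:

```js
// Hypothetical GTM server-side Client (sandboxed JS) for a collector-style
// /r/tp2 redirect endpoint. Not an official template - a sketch only.
const claimRequest = require('claimRequest');
const getRequestPath = require('getRequestPath');
const getRequestQueryParameter = require('getRequestQueryParameter');
const runContainer = require('runContainer');
const setResponseHeader = require('setResponseHeader');
const setResponseStatus = require('setResponseStatus');
const returnResponse = require('returnResponse');

if (getRequestPath() === '/r/tp2') {
  claimRequest();

  // `u` holds the redirect target, per the collector's redirect endpoint.
  const targetUri = getRequestQueryParameter('u');

  // Map the request to a uri_redirect self-describing event so tags in
  // the container can forward it. The field names here are assumptions.
  const event = {
    event_name: 'uri_redirect',
    'x-sp-self_describing_event': {
      schema: 'iglu:com.snowplowanalytics.snowplow/uri_redirect/jsonschema/1-0-0',
      data: { uri: targetUri },
    },
  };

  runContainer(event, function () {
    // Once tags have fired, answer the original request with the redirect.
    setResponseStatus(302);
    setResponseHeader('Location', targetUri);
    returnResponse();
  });
}
```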

Snowbridge is the standard solution for post-pipeline event forwarding and, assuming you’re OK/compliant with the Snowplow Community License, it should work on Azure with a Kafka source – there’s just no pre-canned Terraform for deploying it that way.