How to send this raw data (visitorId, screen size, revisit or not, etc.) to Kafka?

Hi team,
Is there any native method available in Snowplow to push these UI analytics after enrichment (visitorId, screen size, revisit, device name, country, etc.) to Kafka in real time, whether on-premise or in the cloud?



The enrichment process can push to Kafka natively, if that's what you are asking.


See stream-enrich-kafka:3.7.0.
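As a rough sketch, an enrich-kafka configuration wires Kafka on both sides of the enrichment step. The broker address and topic names below are placeholders, and only a minimal subset of options is shown; check the configuration reference for your version for the full set:

```hocon
{
  "input": {
    "type": "Kafka"
    # Raw collector payloads (assumed topic name)
    "topicName": "collector-payloads"
    "bootstrapServers": "localhost:9092"
  }
  "output": {
    "good": {
      "type": "Kafka"
      # Enriched events, ready for downstream consumers
      "topicName": "enriched-good"
      "bootstrapServers": "localhost:9092"
    }
    "bad": {
      "type": "Kafka"
      # Events that failed validation or enrichment
      "topicName": "enriched-bad"
      "bootstrapServers": "localhost:9092"
    }
  }
}
```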

Hi @mike

Thanks for taking the time to reply.
As of now I am using Snowplow Micro, because it is easy to use in development, and I read events from Micro's "good" API.
Whatever we get as a response from the Micro good API, I want to forward to Apache Kafka in real time.
I understand this should be possible both self-hosted and in the cloud, but not through Snowplow Micro itself.
If we push these real-time events to Kafka, is there also a way to map custom schemas to Kafka topics?
For example, for an Add-to-Cart event (1-0-0 schema), we want to push that event only into an Add-to-Cart Kafka topic.


Hi @pkr2.

An alternative to using Enrich Kafka is to use Snowbridge. Enrich Kinesis outputs enriched events (in TSV) to Kinesis, and Snowbridge can convert those to JSON and forward to a Kafka topic in real time.
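As a rough sketch of that pipeline in Snowbridge's HCL configuration (the stream, region, broker, and topic names here are placeholders; verify the exact option names against the Snowbridge docs for your version):

```hcl
# Read enriched TSV events from the Kinesis stream written by Enrich Kinesis
source {
  use "kinesis" {
    stream_name = "enriched-good"
    region      = "eu-west-1"
  }
}

# Convert the enriched TSV format into JSON
transform {
  use "spEnrichedToJson" {}
}

# Write the JSON events to a Kafka topic in real time
target {
  use "kafka" {
    brokers    = "localhost:9092"
    topic_name = "snowplow-enriched-json"
  }
}
```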

You can also use Snowbridge with Micro! Check out this video where I demo this setup: Expanding our range of real-time destinations with Snowbridge - YouTube.

Snowbridge currently does not support routing events to different topics directly, i.e. you have to provide a single topic name. (Open source contributions are welcome!)

If you have a limited number of event types, you could run multiple instances of Snowbridge, each configured to filter by a specific event type and send it to a specific topic.
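For instance, one such Snowbridge instance might filter on the atomic event_name field and forward only add-to-cart events. This is a sketch: the filter option names, field value, and Kafka settings below are assumptions to illustrate the idea, so check them against your Snowbridge version:

```hcl
# Keep only events whose event_name matches "add_to_cart";
# everything else is dropped by this instance
transform {
  use "spEnrichedFilter" {
    atomic_field = "event_name"
    regex        = "^add_to_cart$"
  }
}

# Convert the surviving enriched TSV events to JSON
transform {
  use "spEnrichedToJson" {}
}

# This instance writes only to the add-to-cart topic
target {
  use "kafka" {
    brokers    = "localhost:9092"
    topic_name = "add-to-cart"
  }
}
```

A second instance would run the same config with a different regex and topic_name.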

If you want to do this (effectively demultiplexing), then Benthos (similar to Snowbridge) supports it, including a Kafka example that shows topic-based routing driven by the input message: Outputs | Benthos
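In Benthos, the Kafka output topic can be an interpolated function of each message, so routing by event name reduces to something like the following sketch (broker addresses, topic names, and the assumption that events are JSON with an event_name field are all illustrative):

```yaml
input:
  kafka:
    addresses: [ "localhost:9092" ]
    topics: [ "snowplow-enriched-json" ]
    consumer_group: "topic-router"

output:
  kafka:
    addresses: [ "localhost:9092" ]
    # Route each message to a topic derived from its event_name field,
    # e.g. an add_to_cart event lands on the "add_to_cart" topic
    topic: '${! json("event_name") }'
```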