Snowplow Analytics w/ Kafka - single Topic for events?

Hello,

I’m new to Snowplow Analytics and I’m exploring using it with Kafka as the back-end. I’m interested in using Snowplow Analytics both as a multi-system audit log and as a backbone or central event broker, through which various systems can send and receive pertinent events for further processing.

When Kafka is used instead of Kinesis, does Snowplow send all events through a single Kafka topic?

If so, how do you scale horizontally, based on some key and partitions, if all messages (and schema/content types) go through a single topic?

Thanks very much!

Hi @dbh,

Yes, it’s the same behavior as with Kinesis: one stream / one topic.

A topic partition is the unit of parallelism in Kafka, so you need to make sure your enriched-events topic has enough partitions. Each partition can be consumed by at most one consumer in a consumer group, and records with the same key always land on the same partition, so the partition count is what caps your read parallelism.
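
To make that concrete, here’s a minimal sketch using the plain Kafka Java clients (not Snowplow-specific; the topic name `enriched`, the partition and replication counts, the broker address, and the `user-123` key are all assumptions for illustration). It creates a topic with several partitions and then sends a keyed record, which is how you get per-key ordering while consumers scale out across partitions:

```java
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.NewTopic;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class EnrichedTopicSketch {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // assumption: local broker
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        // 1. Create the enriched-events topic with enough partitions up front;
        //    the partition count caps how many consumers in a group can read in parallel.
        try (AdminClient admin = AdminClient.create(props)) {
            NewTopic enriched = new NewTopic("enriched", 12, (short) 3); // name and counts are illustrative
            admin.createTopics(Collections.singleton(enriched)).all().get();
        }

        // 2. Records with the same key hash to the same partition, so per-key
        //    ordering is preserved while different keys spread across partitions.
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("enriched", "user-123", "<enriched event payload>"));
        }
    }
}
```

Downstream, you scale by running more consumers in the same consumer group, up to one consumer per partition.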

Does that answer your questions?