Events are not sent to Kafka as soon as they are created

This is the JavaScript tracker setup I am using:

;(function(p,l,o,w,i,n,g){if(!p[i]){p.GlobalSnowplowNamespace=p.GlobalSnowplowNamespace||[];
p.GlobalSnowplowNamespace.push(i);p[i]=function(){(p[i].q=p[i].q||[]).push(arguments)
};p[i].q=p[i].q||[];n=l.createElement(o);g=l.getElementsByTagName(o)[0];n.async=1;
n.src=w;g.parentNode.insertBefore(n,g)}}(window,document,"script","//d1fc8wv8zag5ca.cloudfront.net/2.9.0/sp.js","snowplow"));

window.snowplow('newTracker', 'cf', '192.168.55.229:1234', {
  appId: '69b8e43473749800', // app_id
  discoverRootDomain: true,
  platform: 'web',
  cookieDomain: 'localhost',
  post: false,
  encodeBase64: false,
  pageUnloadTimer: 100,
  sessionCookieTimeout: 120,
  cookieName: 'bpr',
  cookieLifetime: 86400 * 31,
  stateStorageStrategy: 'cookie',
  contexts: {
    webPage: true,
    performanceTiming: true,
    gaCookies: true,
    geolocation: false
  }
});
window.snowplow('trackPageView');

Events are not sent to Kafka as soon as they are created; only after multiple hits are the events pushed to Kafka.
I want to push each event to Kafka as soon as it is created.

It looks like you won't experience any buffering in the JavaScript tracker with that combination of settings.

The latency between your events being fired and ending up in the Kafka topic is more likely due to the configuration of the collector, which is responsible for receiving the events and forwarding them to Kafka - it has its own buffering logic, separate from the Snowplow trackers.
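
For example, if you are running the Scala Stream Collector with the Kafka sink, the flush behaviour is controlled by the buffer block in its config.hocon. A rough sketch (key names and values are illustrative and may differ between collector versions, so check them against your own config):

collector {
  streams {
    # Raw events are accumulated here before being forwarded to the Kafka topic
    buffer {
      byteLimit = 4000000   # flush once this many bytes are buffered
      recordLimit = 500     # flush once this many events are buffered
      timeLimit = 5000      # flush at least this often (milliseconds)
    }
  }
}

Lowering recordLimit and timeLimit (for example recordLimit = 1) makes the collector push events to Kafka almost as soon as they arrive, at the cost of many small produce requests.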


Thanks @mike for the heads up. I used bufferSize: 1 in my JavaScript tracker, but I am still facing the same issue. Can you tell me which setting I have to change in my collector configuration or JavaScript tracker?
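
For reference, this is roughly where I set it (a sketch; the other options are unchanged from the snippet above):

window.snowplow('newTracker', 'cf', '192.168.55.229:1234', {
  post: false,
  bufferSize: 1 // send each event in its own request
  // ...other options as above
});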

Hello @vinayakfutak, if you use the POST method then you can set the maxPostBytes parameter, which defines how many events are sent out in one request. I faced the same issue and came across this: https://github.com/snowplow/snowplow-javascript-tracker/issues/631
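
For example, a tracker initialisation that uses POST and keeps the batches small could look roughly like this (a sketch; maxPostBytes caps the size of a single POST body, and the values shown are illustrative):

window.snowplow('newTracker', 'cf', '192.168.55.229:1234', {
  appId: '69b8e43473749800',
  post: true,           // send events via POST instead of GET
  bufferSize: 1,        // flush the outbound queue after every event
  maxPostBytes: 40000,  // upper bound on the size of one POST request body
  contexts: {
    webPage: true,
    performanceTiming: true,
    gaCookies: true,
    geolocation: false
  }
});
window.snowplow('trackPageView');

Even with these tracker settings, how quickly events land in the Kafka topic still depends on the collector-side buffering mentioned above.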