Why is the connections limit 2048?
Hi @Antony_Aleksandrov ,
Welcome to Snowplow community!
That’s just an example, feel free to change that depending on your needs and the machine where the collector runs.
For example, for customers we often use a t3.small instance on AWS.
@BenB thank you for the answer.
Do you have a reference table, or a personal rule of thumb?
cpu - memory - connections
For example, 1 vCPU / 1 GB RAM / 2048 connections.
Maybe I can do 1 vCPU / 1 GB RAM / 8092 connections?
Do you have any production recommendations?
Hi @Antony_Aleksandrov, the type of events sent to a collector can vary wildly, so to get the best performance you will need to do some tuning that suits your pipeline exactly and the type of data you are sending in.
Ultimately you want to increase the allowed connections so that the server you have provisioned can be fully saturated at load (without letting it crash entirely). This ensures that auto-scaling based on CPU will work and that you can run the minimal number of Collectors to collect as many events as possible, keeping costs down.
First I would recommend asserting that your server is correctly set up and that you:
- Have increased the max-open-files limit to a much larger number than the default on most Unix systems (otherwise you will see issues when you open too many connections)
- Have assigned enough memory to the Collector pod and have also increased the JVM heap allocation from its default 25% (e.g. via `-XX:MaxRAMPercentage`)
- Particularly for the pod, you can set a memory cap using `--memory`; the logic we generally follow can be found here
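The host-level checks above can be sketched roughly like this; the image tag, memory cap, and heap percentage are illustrative assumptions, not recommendations:

```shell
# Inspect the current open-file limit; many distros default to 1024,
# far below what thousands of concurrent connections require.
ulimit -n

# Raise the soft limit for this session (raising the hard limit may
# require root or an entry in /etc/security/limits.conf):
ulimit -S -n 65536 2>/dev/null || echo "soft limit unchanged; raise the hard limit first"

# Hypothetical container run: --memory caps the pod's memory, and the
# JVM flag raises the heap share above its 25% default. Whether the
# image reads JAVA_OPTS this way is an assumption; check your image docs.
# docker run --memory=2g \
#   -e JAVA_OPTS="-XX:MaxRAMPercentage=75.0" \
#   snowplow/scala-stream-collector-kinesis:latest \
#   --config /snowplow/config/collector.hocon
```

The commented `docker run` is only a shape to copy; the important parts are capping the container's memory and telling the JVM it may use more of that cap for heap.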
Once you have tuned these base settings, slowly increase the max connections until you hit an optimal number for your ingress. It's important to note that any downstream back-pressure (like an under-provisioned Kinesis stream) will distort these tests, so take that into account when tuning.
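If you are running the akka-http based stream collector, the connection cap is an akka server setting, so the knob you step up between test runs looks something like this (the 4096 value is just an illustrative step, not a recommendation):

```hocon
# Fragment of collector.hocon: raise akka-http's per-server
# connection cap step by step while load testing (default: 1024).
akka {
  http.server {
    max-connections = 4096
    # Raising this usually also means raising max-open-files on the host.
  }
}
```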
All that being said, 8192 has worked well for us as a general rule of thumb to saturate servers.
Hope this helps!