Over the last couple of days, @AmiBenson and I have been working on a new task that involves consuming large amounts of data from multiple Kafka topics into a NodeJS service.
2. Processing/consuming delay (lag) is unacceptable.
3. Single consumer*: the data must be consumed in the same order in which the producer produces it. Let's leave that aside for now.
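To make the setup concrete, here is a minimal sketch of a KafkaJS consumer subscribing to several topics at once. The broker address, group id, and topic names are made up for illustration; with a single consumer in the group, Kafka's per-partition ordering matches requirement 3 above.

```javascript
// Sketch: one KafkaJS consumer reading from multiple topics.
// Broker, group id and topic names are hypothetical.

async function runConsumer() {
  // kafkajs is required lazily so the pure helpers below stay testable
  const { Kafka } = require('kafkajs');
  const kafka = new Kafka({ clientId: 'my-service', brokers: ['localhost:9092'] });
  const consumer = kafka.consumer({ groupId: 'my-service-group' });

  await consumer.connect();
  // Subscribe to several topics; fromBeginning replays the topics' history
  await consumer.subscribe({ topics: ['topic-a', 'topic-b'], fromBeginning: true });

  await consumer.run({
    // eachMessage is awaited per message, so a slow handler shows up as consumer lag
    eachMessage: async ({ topic, partition, message }) => {
      handleEvent(topic, partition, decode(message));
    },
  });
}

// Pure helper: decode a Kafka message value (a Buffer) into a JS object
function decode(message) {
  return JSON.parse(message.value.toString('utf8'));
}

function handleEvent(topic, partition, event) {
  // ...business logic goes here...
}

module.exports = { runConsumer, decode };
```

Keeping the decode step as a pure function makes it easy to unit-test the parsing logic without a running broker.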
Tech & Env: Prometheus for monitoring, NodeJS, KafkaJS; this article is based on experiments on my MacBook Pro 2020 (16 GB RAM and 8 CPUs); Postgres (using…
As part of building one of our NodeJS services, part of my team (@NoyEliyahu and @OrFins) and I investigated a critical performance problem. The service wasn't ready yet anyway, but the story was cool enough to share.
The flow was: at service initialization, we connect over a web socket (https://github.com/websockets/ws) to another service (let's call it ws-producer). At the start of the connection we receive ~100k events (a snapshot of everything the other side has from the beginning of time until now), and then receive real-time events at a much lower rate:
Each incoming ws-message…
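The client side of that flow can be sketched roughly as follows, assuming the `ws` package from the linked repo; the URL and the event shape are assumptions for illustration, not the actual service's API.

```javascript
// Sketch of the ws client flow: connect to ws-producer on startup,
// drain the snapshot burst, then keep handling real-time events.

// Pure helper: ws delivers Buffers by default; normalize to a JS object
function parseEvent(raw) {
  return JSON.parse(raw.toString('utf8'));
}

function startConsumer(url) {
  const WebSocket = require('ws'); // required lazily; parseEvent stays testable
  const ws = new WebSocket(url);

  let received = 0;
  ws.on('open', () => console.log('connected to ws-producer'));
  ws.on('message', (raw) => {
    const event = parseEvent(raw);
    received += 1;
    // The first ~100k messages arrive as one snapshot burst; after that
    // the rate drops to occasional real-time updates.
    handleEvent(event);
  });
  ws.on('close', () => console.log(`connection closed after ${received} events`));
}

function handleEvent(event) {
  // ...handle a single event...
}

module.exports = { parseEvent, startConsumer };
```

Note that the `message` handler is synchronous here; as the rest of the story shows, what happens inside it under a 100k-message burst is exactly where the performance problem hides.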