ggm 3 hours ago

I don't disagree with the tone of this, but I think people need to understand the limits of what "make async/concurrency easier" means. It means being a LOT smarter about logical flows, and about the assumptions you make regarding pipeline ordering over your data streams, for one thing.

If you have a single-entry bottleneck, all your concurrency does is maximise the length of the queue backed up behind it, very quickly. I've hit this several times.
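To make that concrete, here's a minimal Python sketch (all the names are mine, for illustration): ten concurrent producers feed one serialized sink, and the only thing the concurrency buys you is a longer backlog.

    import asyncio

    async def producer(q: asyncio.Queue, n: int) -> None:
        for i in range(n):
            await q.put(i)                # cheap, concurrent, fast

    async def slow_sink(q: asyncio.Queue) -> None:
        while True:
            await q.get()
            await asyncio.sleep(0.01)     # serialized work: the bottleneck

    async def main() -> None:
        q: asyncio.Queue = asyncio.Queue()
        sink = asyncio.create_task(slow_sink(q))
        # ten concurrent producers, one serialized consumer
        await asyncio.gather(*(producer(q, 100) for _ in range(10)))
        print("backlog once producers finish:", q.qsize())  # ~1000
        sink.cancel()

    asyncio.run(main())

The producers finish almost instantly while the sink still has ~10 seconds of work queued; a bounded queue wouldn't fix it either, it would just move the stall upstream.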

Sometimes the easier path is to shard the input data, use heavyweight processes, and then re-order/sort/insert at the end, when all is said and done. Easier means more comprehensible. If you can make the final state accept random-order inserts, then sure, async/concurrent processing of the easy stuff from the input can be a dream.
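Roughly this shape, as a sketch. It assumes each record carries a sortable key, which is the whole trick; everything here is made up for illustration:

    import multiprocessing as mp

    def process_shard(shard: list[str]) -> list[tuple[int, str]]:
        # keep each record keyed by its sequence number so order
        # can be restored after the shards come back
        return [(int(line.split()[0]), line.upper()) for line in shard]

    def make_shards(records: list[str], n: int) -> list[list[str]]:
        return [records[i::n] for i in range(n)]   # round-robin shards

    if __name__ == "__main__":
        records = [f"{i} record-{i}" for i in range(1000)]
        with mp.Pool(processes=4) as pool:         # heavyweight processes
            parts = pool.map(process_shard, make_shards(records, 4))
        # re-order/sort/insert at the end, when all is said and done
        merged = sorted(row for part in parts for row in part)
        print(merged[:3])

And if the final state really does tolerate random-order inserts, you can drop the sorted() and write each part out as it arrives.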

I think I live in a very stone-age world. My primary data sources are logfiles and pcaps, and they tend to be linear inputs.