Does it make sense to run a simple Akka Streams application containerized and spawn 30 instances of the container? This means 30 instances of the Akka ActorSystem running in different containers.
It probably makes sense. It obviously depends on what you are consuming from and producing to, but for something like Kafka (just as an example), which handles multiple consumers/producers effectively, yes, it's a good strategy.
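To make the Kafka case concrete, here is a minimal sketch of such a stream app using Alpakka Kafka. The broker address, group id, and topic name are placeholders; the key point is that all 30 container instances use the same group id, so Kafka spreads the topic's partitions across them:

```scala
import akka.actor.ActorSystem
import akka.kafka.{ConsumerSettings, Subscriptions}
import akka.kafka.scaladsl.Consumer
import akka.stream.scaladsl.Sink
import org.apache.kafka.common.serialization.StringDeserializer

object StreamApp extends App {
  implicit val system: ActorSystem = ActorSystem("stream-app")

  // Every container instance shares the same group id, so Kafka
  // assigns each one a disjoint subset of the topic's partitions.
  val settings =
    ConsumerSettings(system, new StringDeserializer, new StringDeserializer)
      .withBootstrapServers("kafka:9092") // placeholder broker address
      .withGroupId("stream-app")          // shared consumer group

  Consumer
    .plainSource(settings, Subscriptions.topics("events")) // placeholder topic
    .map(_.value())
    .runWith(Sink.foreach(println))
}
```

Note that you get at most one active consumer per partition, so 30 instances only help if the topic has at least 30 partitions.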
Is this an anti-pattern for Akka?
Not in the general case, no. As I mention in the next answer, that is specifically what Akka Data Pipelines is designed to do, so it’s arguably a documented best practice.
What is the right way to deploy ~30 (or many) instances of the same small stream application on different nodes? Could you point me to a good reference or documentation for this?
Again, it might depend on the details. But what you are describing sounds a lot like a typical use case for Akka Data Pipelines.
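Since Akka Data Pipelines runs on Kubernetes, and the instances here are independent of each other, the deployment itself can be as simple as a replica count. A minimal sketch of a plain Kubernetes Deployment (image name and labels are placeholders):

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: stream-app
spec:
  replicas: 30                # one ActorSystem per pod
  selector:
    matchLabels:
      app: stream-app
  template:
    metadata:
      labels:
        app: stream-app
    spec:
      containers:
        - name: stream-app
          image: example/stream-app:latest  # placeholder image
```

Because there is no clustering or shared state between the instances, scaling up or down is just a change to `replicas`.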