I have a question about the Alpakka S3 connector; mainly I want to make sure I understand the problem we are currently facing. Our clients upload hundreds of files to S3 and then send a download link out to whoever needs to view them. The link gathers all the files from S3 (or the user can download each one individually), zips them, and streams the archive to the user. We use Play to serve the download requests.

The problem is that max-open-requests seems to overflow, so users can't download the files and get a Network Error. We increased max-open-requests to 128, which solves it except when 2 users try to download at the same time (the same files or different files, it doesn't seem to matter). I've traced the download request here: https://github.com/akka/alpakka/blob/master/s3/src/main/scala/akka/stream/alpakka/s3/impl/S3Stream.scala#L447. Essentially I think what is happening is that the connector uses cachedHostConnectionPool, and that means that no matter how many different incoming connections are made, it will limit the outgoing connections to 1? So even if we increase host-connection-pool.max-connections it will not make a difference? Are these assumptions correct, and is there anything we can do to work around this issue?
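For context, here is a minimal sketch of the kind of workaround I'm considering: fetching the files strictly one at a time with flatMapConcat, so each zip download keeps only one outgoing S3 request in flight instead of materializing hundreds of S3.download sources at once and flooding the pool. The bucket/key names and the zipStream helper are hypothetical; this assumes the Alpakka S3 scaladsl and the Archive.zip flow from the alpakka-file module.

```scala
import akka.NotUsed
import akka.stream.alpakka.file.ArchiveMetadata
import akka.stream.alpakka.file.scaladsl.Archive
import akka.stream.alpakka.s3.scaladsl.S3
import akka.stream.scaladsl.Source
import akka.util.ByteString

// Hypothetical helper: stream a zip of the given S3 objects without
// issuing more than one S3 request at a time for this download.
def zipStream(bucket: String, keys: List[String]): Source[ByteString, NotUsed] =
  Source(keys)
    .flatMapConcat { key =>
      // flatMapConcat runs the inner sources strictly sequentially, so the
      // next S3.download request is only made after the previous file has
      // been fully streamed into the archive.
      S3.download(bucket, key).collect {
        case Some((data, _metadata)) => (ArchiveMetadata(key), data)
      }
    }
    .via(Archive.zip()) // emits the zip archive as a ByteString stream
```

The resulting Source[ByteString, NotUsed] could be fed straight into a Play chunked/streamed result. Even so, this only bounds requests per download; with N concurrent users there would still be N requests in flight, which is why I'd also like to understand whether host-connection-pool.max-connections actually applies here.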
Thanks so much!