Handle post with big raw data

Hello,
I have a question about the following warning:

[WARN] [06/28/2018 16:09:07.337] [QuickStart-akka.actor.default-dispatcher-3] [akka.actor.ActorSystemImpl(QuickStart)] Sending an 2xx ‘early’ response before end of request was received… Note that the connection will be closed after this response. Also, many clients will not read early responses! Consider only issuing this response after the request data has been completely read!

For a request with a raw string of 120 kB, the following code:
return extractEntity(entity ->
    extractMaterializer(mat ->
        Directives.<ByteString>onSuccess(
            entity.withoutSizeLimit().getDataBytes().runWith(Sink.<ByteString>head(), mat),
            bytes -> {
                System.out.println("request size : " + bytes.utf8String().length());
                return complete(bytes.utf8String());
            }
        )
    )
);
produces this result:

request size : 120456
request size : 120456
request size : 114688
[WARN] [06/28/2018 16:09:07.337] [QuickStart-akka.actor.default-dispatcher-3] [akka.actor.ActorSystemImpl(QuickStart)] Sending an 2xx ‘early’ response before end of request was received… Note that the connection will be closed after this response. Also, many clients will not read early responses! Consider only issuing this response after the request data has been completely read!
request size : 120456
request size : 120456
request size : 65536
[WARN] [06/28/2018 16:09:08.619] [QuickStart-akka.actor.default-dispatcher-4] [akka.actor.ActorSystemImpl(QuickStart)] Sending an 2xx ‘early’ response before end of request was received… Note that the connection will be closed after this response. Also, many clients will not read early responses! Consider only issuing this response after the request data has been completely read!

Over several requests, the result is not received properly.

Do you have any advice on how to handle this?
Many thanks in advance,
Stefan

That warning appears because you’re only getting the first (‘head’) segment of the data and then responding. You probably want to process the whole upload before responding?
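
A minimal sketch of that idea, assuming the Java DSL: buffer the whole entity with toStrict before completing (the 5000 ms timeout is just an example value, not a recommendation):

import akka.http.javadsl.server.Route;
import static akka.http.javadsl.server.Directives.*;

// Sketch: wait until the full request body has been read before
// responding, so no 'early' response is sent.
Route route =
    extractEntity(entity ->
        extractMaterializer(mat ->
            onSuccess(
                entity.toStrict(5000, mat), // 5000 ms is an arbitrary example timeout
                strict -> complete(strict.getData().utf8String())
            )
        )
    );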

Hello,
Thank you very much.
Yes. How can I do this? Is there any related documentation?

Many thanks in advance,
Stefan

It depends on what you want to do with the uploaded file: ideally you’d stream it to its destination, whatever that is.

If you don’t mind keeping the whole upload in memory you could use Sink.reduce, but that’s almost certainly not the nicest solution.
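
A minimal sketch of the Sink.reduce approach, assuming the Java DSL (note that reduce fails on an empty stream, so an empty request body would need extra handling):

import akka.http.javadsl.server.Directives;
import akka.http.javadsl.server.Route;
import akka.stream.javadsl.Sink;
import akka.util.ByteString;
import static akka.http.javadsl.server.Directives.*;

// Sketch: concatenate every incoming chunk, then respond once the
// request stream has completed, so the whole payload is consumed first.
Route route =
    extractEntity(entity ->
        extractMaterializer(mat ->
            Directives.<ByteString>onSuccess(
                entity.withoutSizeLimit()
                      .getDataBytes()
                      .runWith(Sink.<ByteString>reduce(ByteString::concat), mat),
                bytes -> complete(bytes.utf8String())
            )
        )
    );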

Hello,
The payload is a big XML file that I want to parse, but first I need the whole file loaded.
I also tried:

CompletionStage<String> entityStage = Unmarshaller.entityToString().unmarshal(entity, ExecutionContexts.global(), mat);

and after two consecutive requests the server hangs without any message.
How do you suggest I proceed?

Many thanks in advance,
Stefan

What do you want to do with the parsed XML file though? If it is big, then loading it as a whole might fill up your memory. If possible it might be nice to use a streaming XML parser/processor, for example as described at Extensible Markup Language - XML • Alpakka Documentation.
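
A minimal sketch along those lines, assuming the Alpakka XML connector (akka-stream-alpakka-xml) is on the classpath; the foreach handler is only a placeholder where real event processing would go:

import akka.stream.alpakka.xml.ParseEvent;
import akka.stream.alpakka.xml.javadsl.XmlParsing;
import akka.stream.javadsl.Sink;
import akka.http.javadsl.server.Route;
import static akka.http.javadsl.server.Directives.*;

// Sketch: parse the request body as a stream of XML events instead of
// loading the whole document into memory.
Route route =
    extractEntity(entity ->
        extractMaterializer(mat ->
            onSuccess(
                entity.withoutSizeLimit()
                      .getDataBytes()
                      .via(XmlParsing.parser())               // ByteString -> ParseEvent
                      .runWith(Sink.foreach(System.out::println), mat),
                done -> complete("parsed")
            )
        )
    );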

Hard to say why it hangs; could it be that it has already hit your maximum heap size? Have you checked with something like VisualVM or a profiler?