How to persist a collection of entities?

How can I implement a service like this:

ServiceCall<PSequence,Done> receive(String mission, String asset);

where the intention is to persist a collection of records. All the examples I have seen so far
deal with one entity at a time and use something like

return persistentEntities.refFor(MyEntity.class, entityId).ask(cmd)

to return the ServiceCall. But in my case I can't use this.

This is not possible.

Lagom’s PersistentEntity follows the design principles of aggregate development, which state that each single aggregate (a PersistentEntity in Lagom) maps to one single transaction. Therefore you can’t mutate more than one PersistentEntity in the same transaction.

If you don’t need all the records to be persisted in the same transaction, you can chain multiple ask calls using thenCompose/thenApply. However, you’ll need to be mindful of how to handle failures in the chain of calls: if call 7 of 10 fails, do you care, and if so, how do you respond? The saga pattern might be useful if you need them all to succeed or fail together.
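A JDK-only sketch of the chaining idea. The askEntity stub stands in for persistentEntities.refFor(...).ask(...); all names here (ChainAsks, askEntity, persistAll) are illustrative assumptions, not Lagom’s API:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.CompletionStage;

public class ChainAsks {
    // Stand-in for persistentEntities.refFor(...).ask(cmd); in a real Lagom
    // service this would be the entity's reply. The name is hypothetical.
    static CompletionStage<String> askEntity(String record) {
        return CompletableFuture.completedFuture("persisted:" + record);
    }

    // Chains one ask per record so they run one after another; the combined
    // future fails as soon as any single ask fails, leaving earlier records
    // already persisted (this is the failure-handling concern mentioned above).
    static CompletionStage<List<String>> persistAll(List<String> records) {
        CompletionStage<List<String>> acc =
            CompletableFuture.completedFuture(new ArrayList<>());
        for (String r : records) {
            acc = acc.thenCompose(results ->
                askEntity(r).thenApply(reply -> {
                    results.add(reply);
                    return results;
                }));
        }
        return acc;
    }

    public static void main(String[] args) {
        System.out.println(persistAll(List.of("a", "b")).toCompletableFuture().join());
    }
}
```

Note that thenCompose is what actually sequences the asks; thenApply alone would only transform an already-completed value.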

Alternatively, you can change the bounded context of your aggregate: change your PersistentEntity to model a collection of records (that are related in some way). This way all the records can be sent to a single PersistentEntity and persisted in a single transaction. Determining the right bounded context can be tricky, so you’ll need to be thoughtful about whether this change makes sense for your domain. Larger aggregates make complex transactions simpler, but they can hurt scalability.
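A minimal JDK-only sketch of the collection-aggregate idea, leaving out the Lagom API: one aggregate owns all records for a (mission, asset) pair, so a single command can append a whole batch atomically. The class and method names are illustrative assumptions; in Lagom the addRecords command handler would persist the batch of events via ctx.thenPersistAll(...) in one transaction:

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

// Hypothetical aggregate for a (mission, asset) pair that owns a whole
// collection of records, rather than one entity per record.
public class MissionAssetAggregate {
    private final List<String> records = new ArrayList<>();

    // In Lagom this would be a command handler that persists one event per
    // record (or a single batch event) in a single transaction.
    public synchronized void addRecords(List<String> batch) {
        records.addAll(batch);
    }

    // Read-side view of the aggregate's current state.
    public synchronized List<String> currentRecords() {
        return Collections.unmodifiableList(new ArrayList<>(records));
    }
}
```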

Thanks for your replies. Chaining might be a possibility here, but what about converting those records to events without persisting them, producing them to a Kafka topic, and having the next service subscribe to this topic and persist them (I need to retain ordering as well)? From the docs I see that only persistentEntityRegistry.eventStream can be used for the topic implementation. Is that really the only way?

I don’t think it’s necessary to use PersistentEntity if it’s not needed. I like the ServiceCall API for its composition and elegance.

You can consider using Akka Streams Kafka to manually push events to the Kafka topic in each ServiceCall, which I think is a bit of a simple hack.
Or
Set up an actor which receives events fired from within the ServiceCall. This actor would be connected to a Source and stream the events using Akka Streams to the TopicProducer. With this approach you can use the TopicProducer API.
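The shape of that second option can be sketched with only the JDK, using SubmissionPublisher as a stand-in for the actor backed by a Source, and an ordered subscriber playing the role of the TopicProducer. Everything here (EventBridge, runDemo, the event names) is an illustrative assumption, not Lagom or Akka API:

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.Flow;
import java.util.concurrent.SubmissionPublisher;

public class EventBridge {
    // ServiceCall handlers push events into the publisher (the actor/Source
    // stand-in); the single subscriber (the TopicProducer stand-in) receives
    // them asynchronously but in submission order.
    public static List<String> runDemo() {
        List<String> received = Collections.synchronizedList(new ArrayList<>());
        CountDownLatch done = new CountDownLatch(3);
        SubmissionPublisher<String> publisher = new SubmissionPublisher<>();
        publisher.subscribe(new Flow.Subscriber<String>() {
            public void onSubscribe(Flow.Subscription s) { s.request(Long.MAX_VALUE); }
            public void onNext(String item) { received.add(item); done.countDown(); }
            public void onError(Throwable t) { }
            public void onComplete() { }
        });
        // Each "ServiceCall" just submits its event and returns.
        publisher.submit("event-1");
        publisher.submit("event-2");
        publisher.submit("event-3");
        try {
            done.await();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        publisher.close();
        return received;
    }

    public static void main(String[] args) {
        System.out.println(runDemo());
    }
}
```

The per-subscriber ordering guarantee is what preserves the event order the original poster asked about; the real Akka version would get the same property from a single Source.queue feeding the TopicProducer.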

I found an example in the Lagom source code which uses Source.Queue to produce events
for a TopicProducer. The class there is called EventJournal, and when I append an event to
it, it is inserted into the stream. Perhaps it is similar to your second suggestion.

Can you provide a link to the example?

But now I think this EventJournal is good for testing, but not for production code, because it keeps all events
in memory due to the offset parameter. Perhaps your ActorRef version doesn’t help either, as I can’t see how we
could use offsets in that case.

I guess offsets are tracked and provided by the TopicProducer.

My use case is when the service performs, say, analysis on requests/subscribed events and pushes its own events without necessarily keeping track of them explicitly. If you do need to keep events for later use, then there is no better option than PersistentEntity.