The documentation for the message broker API states that persistent entity events are the primary source of messages Lagom is designed to produce. However, I need to produce persistent entity commands which are sent from one service to another via Kafka. My use case is this: I have one service representing model entities, which accept action commands and produce changed events, and a separate service representing agent entities, which subscribe to model changed events and, based on their view of the model and their internal logic, act on the model via action commands that they generate. An agent consuming model changed events fits Lagom's message broker approach well, but I am having difficulty figuring out how to properly produce commands that can then be fed into the corresponding model entity. Keep in mind that these commands are really requests to change the model, and it is not guaranteed that the model will be updated accordingly, so they truly are not events.
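To make the command/event distinction concrete, here is a minimal plain-Java sketch (not the Lagom API; all names are invented for illustration). The point is that a command goes through a validating handler that may reject it, whereas an event is an already-accepted fact that is simply applied:

```java
import java.util.Optional;

// Command: a request to change the model; it may be rejected.
record ChangeValue(int newValue) {}

// Event: an accepted fact; applying it never fails.
record ValueChanged(int newValue) {}

class ModelState {
    final int value;
    ModelState(int value) { this.value = value; }

    // Command handler: validate, then either reject or emit an event.
    Optional<ValueChanged> handle(ChangeValue cmd) {
        return cmd.newValue() < 0
                ? Optional.empty()                          // validation failure
                : Optional.of(new ValueChanged(cmd.newValue()));
    }

    // Event handler: apply an already-accepted change; no failure path.
    ModelState apply(ValueChanged evt) {
        return new ModelState(evt.newValue());
    }
}
```

This is why simply treating agent-generated commands as events loses something essential: the rejection path only exists on the command side.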
I could wrap the agent-generated commands as events, which a model entity could then consume and handle in its event handler. However, this would bypass the model entity's regular command handling mechanism (including validation and replying with a failure), and would obviously be the wrong approach. Alternatively, I could use entityRef and ask within the topic implementation in the model service to forward the wrapped commands, but then I'm not sure how to properly handle the event stream, and there may be performance/consistency implications since I would not be using Lagom's topic producer methods. Or perhaps going through the message broker API is the wrong approach altogether, but I have been thinking along these lines because both services are Lagom services.
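To clarify the second alternative I mean, here is a hedged plain-Java sketch (no Lagom types; askEntity stands in for what entityRef(id).ask(cmd) would do) of consuming wrapped commands and feeding them through the entity's normal command path, so validation and failure replies are preserved:

```java
import java.util.List;
import java.util.concurrent.CompletableFuture;
import java.util.stream.Collectors;

class CommandForwarder {
    // A command consumed from the broker, addressed to one model entity.
    record WrappedCommand(String entityId, String payload) {}

    // Stand-in for an ask-style call: the entity's command handler replies
    // asynchronously with either an acceptance or a rejection.
    static CompletableFuture<String> askEntity(WrappedCommand cmd) {
        return CompletableFuture.supplyAsync(() ->
                cmd.payload().isEmpty()
                        ? "rejected by " + cmd.entityId()   // command invalid
                        : "accepted by " + cmd.entityId());
    }

    // Forward every consumed command and collect the entity replies.
    static List<String> forwardAll(List<WrappedCommand> cmds) {
        return cmds.stream()
                .map(CommandForwarder::askEntity)
                .map(CompletableFuture::join)
                .collect(Collectors.toList());
    }
}
```

My concern is precisely what this sketch glosses over: backpressure, ordering, and failure handling for the stream, which Lagom's topic producer methods would normally manage for me.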
So what is the recommended approach for publishing and subscribing to persistent entity commands between Lagom services/persistent entities via Kafka? Or is there another architectural approach to solving this with Lagom? Any guidance or examples would be greatly appreciated!