How to keep an external application in sync with a Lagom microservice

My legacy system (a Java application) needs to be kept in sync with a Lagom microservice.
In other words, if my legacy system could receive events from the microservice, it could update its database accordingly.

In such a use case, I think the best way is to get the events from a Kafka topic.

I tried to implement a WebSocket entry point that returns events by subscribing to a topic, but then I got this error:

Topic#subscribe is not permitted in the service's topic implementation

So I think this is not the right way to do it.

For me, a WebSocket is the easiest way to connect to Lagom. How should I do it? Any example?

Hi,

If you only need to update the database of your legacy application, one solution could be to create a separate service that subscribes to the Kafka topic and updates the database accordingly.
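
A minimal sketch of what such a subscribing service could look like with the Lagom Java API. `HelloService`, `greetingsTopic`, `GreetingMessage` and `LegacyDatabase` are made-up names standing in for your real producing service, topic, message type and database access:

```java
import javax.inject.Inject;
import akka.Done;
import akka.stream.javadsl.Flow;

public class LegacySyncServiceImpl implements LegacySyncService {

  @Inject
  public LegacySyncServiceImpl(HelloService helloService, LegacyDatabase db) {
    // Subscribe to the producing service's topic. Lagom stores the Kafka offset
    // and redelivers messages until the flow has processed them (at-least-once).
    helloService
        .greetingsTopic()
        .subscribe()
        .atLeastOnce(
            Flow.fromFunction(
                (GreetingMessage message) -> {
                  db.update(message); // apply the event to the legacy database
                  return Done.getInstance();
                }));
  }
}
```

With `subscribe().atLeastOnce(...)` you don't have to manage the offset yourself; the downside is exactly what I mentioned: the new service and the legacy application have to share the database.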

Hope this helps.

Br,
Alan

@domschoen

If you don’t want to create a separate service for this, you must implement a subscriber in your legacy application. In Java you can use https://doc.akka.io/docs/alpakka-kafka/current/consumer.html to do that. Another thing: you need to configure Kafka to accept access from external environments. Another option is to disable the embedded Kafka and use an external Kafka that will be accessible to other applications like your legacy system =D
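
If you go that route, a rough sketch of a standalone Alpakka Kafka consumer in the legacy Java application could look like this. The broker address, topic name and plain-String deserialization are assumptions you would adapt to your setup:

```java
import akka.actor.ActorSystem;
import akka.kafka.ConsumerSettings;
import akka.kafka.Subscriptions;
import akka.kafka.javadsl.Consumer;
import akka.stream.ActorMaterializer;
import akka.stream.Materializer;
import akka.stream.javadsl.Sink;
import org.apache.kafka.common.serialization.StringDeserializer;

public class LegacyKafkaSubscriber {

  public static void main(String[] args) {
    ActorSystem system = ActorSystem.create("legacy-subscriber");
    Materializer materializer = ActorMaterializer.create(system);

    // Point this at the external Kafka your Lagom service publishes to.
    ConsumerSettings<String, String> settings =
        ConsumerSettings.create(system, new StringDeserializer(), new StringDeserializer())
            .withBootstrapServers("localhost:9092") // assumption: your broker address
            .withGroupId("legacy-app")
            .withProperty("auto.offset.reset", "earliest");

    // "greetings" is a placeholder for the topic id declared in your Lagom service.
    Consumer.plainSource(settings, Subscriptions.topics("greetings"))
        .runWith(Sink.foreach(record -> {
          // Deserialize record.value() (e.g. JSON) and update the legacy database here.
          System.out.println("Received: " + record.value());
        }), materializer);
  }
}
```

For at-least-once processing you would use `Consumer.committableSource` instead of `plainSource` and commit the offsets only after the database update has succeeded.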

Finally,

if what you want is to expose the persistent entity events over a WebSocket:

then you could use persistentEntityRegistry.eventStream to produce a Source that you can connect to the WebSocket. The issue, though, is that you’ll have to manage the offset and the WebSocket reconnection yourself.
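
A rough sketch with the Lagom Java API, assuming a single event tag `HelloEvent.TAG` and a `toJson` helper (both placeholders for whatever your service actually defines):

```java
import javax.inject.Inject;
import java.util.concurrent.CompletableFuture;
import akka.NotUsed;
import akka.stream.javadsl.Source;
import com.lightbend.lagom.javadsl.api.ServiceCall;
import com.lightbend.lagom.javadsl.persistence.Offset;
import com.lightbend.lagom.javadsl.persistence.PersistentEntityRegistry;

public class HelloServiceImpl implements HelloService {

  private final PersistentEntityRegistry persistentEntityRegistry;

  @Inject
  public HelloServiceImpl(PersistentEntityRegistry persistentEntityRegistry) {
    this.persistentEntityRegistry = persistentEntityRegistry;
  }

  // Streamed call: Lagom serves ServiceCalls that return a Source over a WebSocket.
  @Override
  public ServiceCall<NotUsed, Source<String, NotUsed>> events() {
    return request ->
        CompletableFuture.completedFuture(
            persistentEntityRegistry
                .eventStream(HelloEvent.TAG, Offset.NONE) // Offset.NONE: replay from the beginning
                .map(pair -> toJson(pair.first())));      // pair.second() is the offset you'd have to track yourself
  }

  private String toJson(HelloEvent event) {
    // placeholder: serialize the event however your clients expect
    return event.toString();
  }
}
```

The call is declared in the service descriptor as a pathCall returning a Source, which Lagom serves over a WebSocket. Note that with Offset.NONE every new connection replays the stream from the beginning, which is why you would need your own offset handling for a real integration.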

I think the best move forward is @domschoen’s proposal:

implement a subscriber in your legacy application.

@aklikic’s suggestion is a good step forward too, but it requires sharing the database between the new service and the legacy one.

Cheers