Is it possible to have 2 or more different read-side DBs with Lagom?
You can certainly implement multiple event consumers, either from the published Kafka event stream or from the event journal directly.
If you can build the read side from the published API event data, all you need is some form of Kafka topic consumer that writes to the database.
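As a minimal sketch of that first approach, a Lagom service can subscribe to another service's topic with at-least-once delivery and write each message to its own read-side store. The names `UpstreamService`, `orderEventsTopic`, `OrderEvent`, and `ordersDb` here are illustrative assumptions, not part of the Lagom API:

```scala
import akka.Done
import akka.stream.scaladsl.Flow

// Hypothetical consumer: subscribes to an upstream service's Kafka topic
// and projects each event into this service's own read-side database.
class OrderEventConsumer(upstream: UpstreamService, ordersDb: OrdersDb) {
  upstream.orderEventsTopic.subscribe.atLeastOnce(
    Flow[OrderEvent].mapAsync(parallelism = 1) { event =>
      // ordersDb.save is assumed to return a Future[Done];
      // the offset is only committed after the write succeeds
      ordersDb.save(event)
    }
  )
}
```

Because `atLeastOnce` commits offsets only after the flow emits, the write side of the projection should be idempotent, since the same event may be delivered more than once after a restart.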
If you can only build the read side from the full journal events, you'll need to implement an event processor extending `ReadSideProcessor[Event]` (assuming a Scala implementation).
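A skeleton of such a processor, using Lagom's Cassandra read-side support, might look like the following. The event types `OrderEvent` and `OrderCreated`, the tag `OrderEvent.Tag`, and the table layout are all illustrative assumptions:

```scala
import com.lightbend.lagom.scaladsl.persistence.cassandra.{CassandraReadSide, CassandraSession}
import com.lightbend.lagom.scaladsl.persistence.{AggregateEventTag, ReadSideProcessor}
import scala.concurrent.ExecutionContext

// Hypothetical processor: projects journal events into a read-side table.
class OrderReadSideProcessor(readSide: CassandraReadSide, session: CassandraSession)
                            (implicit ec: ExecutionContext)
    extends ReadSideProcessor[OrderEvent] {

  override def buildHandler(): ReadSideProcessor.ReadSideHandler[OrderEvent] =
    readSide.builder[OrderEvent]("orderOffset") // offset table id for this projection
      .setEventHandler[OrderCreated] { element =>
        // Each handler returns the bound statements to execute for this event;
        // Lagom tracks the offset so the projection resumes where it left off.
        session
          .prepare("INSERT INTO orders (id, status) VALUES (?, 'created')")
          .map(ps => List(ps.bind(element.event.orderId)))
      }
      .build()

  override def aggregateTags: Set[AggregateEventTag[OrderEvent]] =
    OrderEvent.Tag.allTags
}
```

Nothing stops you from registering several such processors, each writing to a different read-side database, which is how multiple read sides coexist in one service.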
The services I’ve implemented are multi-tenant, and data separation must be maintained. With the API events being published to a Kafka topic, there could be a data security risk: once data is in a message and out of your hands, you can never count on it staying where you expect it to be. For that reason, we publish primarily event notifications to our Kafka event topics. They contain only enough information to let an authorized consumer know that something happened and request the details it needs through the services.
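Such a "notification only" event might be shaped like the case class below. The field names are illustrative; the point is that it carries identifiers rather than tenant data:

```scala
// Illustrative thin event published to Kafka: identifiers only, no payload.
final case class OrderChangedNotification(
  tenantId:   String,            // routing/authorization key only
  orderId:    String,            // consumers fetch details via the protected service API
  eventType:  String,            // e.g. "OrderCreated"
  occurredAt: java.time.Instant  // when the change happened
)
```

A consumer that receives this notification must still call the service, and pass its authorization checks, before it can see any of the underlying data.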
The read-side projections are produced directly from the journal events, so they have access to every detail. Since they’re only accessible through the AAA-protected services, those details aren’t exposed.