JDBC (Oracle) persistence of JSON payload

Hi Everyone,

My main question is: is there a way to use the JDBC plugin with Oracle, just replacing the default serializer with a JSON serializer?

I have found some content in the official Akka Persistence documentation and in the GitHub repository of the JDBC plugin, but it seems something is still missing to glue all the parts together.

The documentation says:

Serialization is NOT handled automatically by Akka Persistence itself. Instead, it only provides the above described serializers, and in case a AsyncWriteJournal plugin implementation chooses to use them directly, the above serialization scheme will be used. Please refer to your write journal’s documentation to learn more about how it handles serialization!

So, I went to the documentation of Akka Persistence JDBC. The repository was migrated from https://github.com/dnvriend to the Akka GitHub account.

Even so, I was able to find an example pretty close to what I need, showing how to create a custom serializer for JSON and use it in the JDBC plugin.

However, it uses PostgreSQL, and it relies on a journal-converter = “journal.JsonJournalSerializer” setting in the JDBC plugin settings. But there is no reference to the journal-converter parameter in the newest version of the JDBC plugin code.

That said, I am not sure if the current version of the JDBC plugin still supports a custom serializer, or if I should take the path of writing a completely custom DAO for this purpose.

Many thanks in advance,

Hi Richard,

It depends on what you mean by serializing to JSON on an Oracle DB.

The plugin will use whatever serializer you configure for your types. Check the Akka Serialization documentation for more details.
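For reference, registering a serializer for an event type is done in `application.conf`. This is a minimal sketch; `com.example.MyJsonSerializer` and `com.example.UserCreated` are hypothetical names standing in for your own serializer and event classes:

```hocon
akka {
  actor {
    serializers {
      # "json" is an arbitrary key; the value is your serializer's FQCN
      json = "com.example.MyJsonSerializer"
    }
    serialization-bindings {
      # Bind the event type (or a marker trait) to that serializer
      "com.example.UserCreated" = json
    }
  }
}
```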

Basically, if you have an event like UserCreated, you can register a serializer for it using your JSON library of choice. However, at the end of the day, when it comes to adding the event to the journal, the JSON body will have to be converted to a byte array. So JSON will be an intermediate format between your event and the byte array.
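To make the "JSON as intermediate format" point concrete, here is a self-contained sketch. The event class and the hand-rolled `toJson` are illustrative (a real serializer would extend Akka's `SerializerWithStringManifest` and use a proper JSON library); the point is only that what ultimately reaches the journal is a byte array:

```scala
import java.nio.charset.StandardCharsets

// Hypothetical event type
final case class UserCreated(userId: String, name: String)

// Illustrative hand-rolled JSON rendering (use a real JSON library in practice)
def toJson(e: UserCreated): String =
  s"""{"userId":"${e.userId}","name":"${e.name}"}"""

// What a JSON-based serializer hands to the journal: bytes, not text
def toBinary(e: UserCreated): Array[Byte] =
  toJson(e).getBytes(StandardCharsets.UTF_8)

// Reading back: bytes -> JSON text (a real serializer would then parse it)
def fromBinary(bytes: Array[Byte]): String =
  new String(bytes, StandardCharsets.UTF_8)

val event = UserCreated("42", "Richard")
println(fromBinary(toBinary(event)))
```

With the default plugin schema the column stores those bytes as a BLOB, which is why the stored payload is not human-readable even though JSON was used in between.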

This is how we do it in Lagom: by default, all events are first serialized to JSON and then to a byte array before being saved to the journal.

The advantage of that is that you can then write custom serializers and manipulate the payload. This is especially useful when supporting event migration, which is the case in Lagom.



Hi Renato,

Just to clarify, I’d like to store the events in JSON format in the DB. I already have a custom serializer for them, but, as you said, they are stored as a byte array.

I’d like to store them in a human-readable format. The Akka documentation mentions this, saying to check what the plugin supports. When I tried to check the JDBC plugin, I ran into the doubts that led to the original post here.

Many thanks for your help,

Best Regards

Ok, I see now.

It’s possible to do it with the akka-persistence-jdbc plugin by implementing your own DAO. You can swap DAOs.

But that means you will need to define a new schema with a JSON column (but that’s exactly what you want) and then have a DAO that knows how to save to and read from that new schema.
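Such a schema could look something like the following. This is only a sketch, not the plugin's schema: the table and column names are illustrative and must match whatever your custom DAO expects, and the `IS JSON` check constraint assumes Oracle 12c or later:

```sql
-- Hypothetical journal table with a human-readable JSON payload column
CREATE TABLE event_journal (
  persistence_id  VARCHAR2(255) NOT NULL,
  sequence_number NUMBER        NOT NULL,
  event_payload   CLOB          NOT NULL,
  -- Oracle 12c+: reject rows whose payload is not well-formed JSON
  CONSTRAINT event_payload_is_json CHECK (event_payload IS JSON),
  PRIMARY KEY (persistence_id, sequence_number)
);
```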

If you take that path, be aware that the current plugin implementation stores the PersistentImpl with the payload inside it, like a wrapper.

PersistentImpl is an Akka internal class and it’s not a good idea to store it in the DB. We have a branch with a fix for it; check this PR: https://github.com/akka/akka-persistence-jdbc/pull/180
This is not on the main branch, but on akka-serialization branch.

Version 4.0.0 will contain that refactoring (or a variation of it).

That is to say, if you plan to have your own schema and DAO, you will want to take that into consideration.

Best regards,



@richardbezerra, you may also be interested in this issue: https://github.com/akka/akka-persistence-jdbc/issues/144

Thanks a lot, @octonato. I had found this issue while I was searching, but I was not sure if it would still be valid for the current JDBC plugin version, since it was from 2017.

Anyway, I will explore these options.

Best regards,