Akka Artery ByteBuffer Serialization

I am using the ByteBuffer-based serialization mechanism suggested for use with Akka Artery remoting. Snapshots are failing as the data to be persisted grows. The problem is that the data is very dynamic, so I can't measure or predict in advance the largest buffer size a persistent actor's state will need.

How can I acquire a ByteBuffer from a BufferPool, as the comment suggests in the code section at https://doc.akka.io/docs/akka/current/remoting-artery.html#serialization?

import java.nio.ByteBuffer;

import akka.serialization.ByteBufferSerializer;
import akka.serialization.SerializerWithStringManifest;

class ExampleByteBufSerializer extends SerializerWithStringManifest
    implements ByteBufferSerializer {

  @Override
  public int identifier() {
    return 1337;
  }

  @Override
  public String manifest(Object o) {
    return "serialized-" + o.getClass().getSimpleName();
  }

  @Override
  public byte[] toBinary(Object o) {
    // in production code, acquire this from a BufferPool
    final ByteBuffer buf = ByteBuffer.allocate(256);

    toBinary(o, buf);
    buf.flip();
    final byte[] bytes = new byte[buf.remaining()];
    buf.get(bytes);
    return bytes;
  }

  @Override
  public Object fromBinary(byte[] bytes, String manifest) {
    return fromBinary(ByteBuffer.wrap(bytes), manifest);
  }

  @Override
  public void toBinary(Object o, ByteBuffer buf) {
    // Implement actual serialization here
  }

  @Override
  public Object fromBinary(ByteBuffer buf, String manifest) {
    // Implement actual deserialization here
    return null;
  }
}

NB:

  public byte[] toBinary(Object o) {
    // in production code, acquire this from a BufferPool
    final ByteBuffer buf = ByteBuffer.allocate(256);
Can anyone point me in the right direction here?

We have two implementations in akka.io, but those are internal APIs, so you'd have to either find a third-party library that provides one (no recommendation off the top of my head) or implement one in your own application. You can take inspiration from those two implementations: https://github.com/akka/akka/blob/master/akka-actor/src/main/scala/akka/io/DirectByteBufferPool.scala
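A minimal application-side pool along those lines could look like the sketch below. SimpleByteBufferPool and its acquire/release methods are made-up names for illustration, not part of Akka's public API; the structure just mirrors the idea of DirectByteBufferPool (keep a bounded stack of pre-allocated direct buffers, hand one out per serialization call, and return it afterwards).

  import java.nio.ByteBuffer;
  import java.util.ArrayDeque;

  // Hypothetical minimal buffer pool, for illustration only.
  public class SimpleByteBufferPool {
    private final int bufferSize;
    private final int maxPooled;
    private final ArrayDeque<ByteBuffer> pool = new ArrayDeque<>();

    public SimpleByteBufferPool(int bufferSize, int maxPooled) {
      this.bufferSize = bufferSize;
      this.maxPooled = maxPooled;
    }

    // Hand out a pooled buffer, or allocate a fresh one when the pool is empty.
    public synchronized ByteBuffer acquire() {
      final ByteBuffer buf = pool.pollFirst();
      return buf != null ? buf : ByteBuffer.allocateDirect(bufferSize);
    }

    // Clear and return the buffer; drop it if the pool is already full.
    public synchronized void release(ByteBuffer buf) {
      buf.clear();
      if (pool.size() < maxPooled) {
        pool.addFirst(buf);
      }
    }
  }

The serializer's array-based toBinary could then acquire and release a buffer instead of calling ByteBuffer.allocate(256) directly (the pool size and buffer size below are arbitrary examples):

  private final SimpleByteBufferPool pool = new SimpleByteBufferPool(256 * 1024, 16);

  @Override
  public byte[] toBinary(Object o) {
    final ByteBuffer buf = pool.acquire();
    try {
      toBinary(o, buf);
      buf.flip();
      final byte[] bytes = new byte[buf.remaining()];
      buf.get(bytes);
      return bytes;
    } finally {
      pool.release(buf);
    }
  }

Note that a fixed-size pool does not by itself solve the growing-state problem: if a state outgrows bufferSize, the buffer-based toBinary will fail with a BufferOverflowException, so for the snapshot path you may still want to catch that and retry with a larger (unpooled) buffer, or size the pooled buffers generously.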