Connect Akka Streams flow output to an Apache Flink stream processor?

Hi,

I am looking for some information about incorporating Flink into one part of our stream processing architecture. We already use Akka Streams, with a basic flow that looks like this:

 // Kafka Source -> Read Flow involving Azure Cosmos DB -> Evaluate Flow -> Write Flow -> Logger Flow -> Sink
  private lazy val converterGraph: Graph[ClosedShape.type, Future[Done]] =
    GraphDSL.create(sink) { implicit builder => out =>
      source.watchTermination()(Keep.none) ~> prepareFlow ~> readFlow ~> evaluateFlow ~> writeFlow ~> loggerFlow ~> out
      ClosedShape
    }

The flows above are defined like this:

def prepareFlow: Flow[FromSource, ToRead, NotUsed]

def readFlow: Flow[ToRead, ToEvaluate, NotUsed]

Now, instead of readFlow being an Akka flow, I would like to replace it with a Flink stream processor. The output of prepareFlow would become the input to the Flink-based readFlow, and its output would in turn be the input to evaluateFlow.

Basically, is it possible to do something like this:

  prepareFlow ~> [Flink source ->read -> result] ~> evaluateFlow ~> writeFlow ~> loggerFlow ~> out
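From what I understand, this cannot happen in-process: Flink runs on its own (possibly distributed) runtime, so a Flink operator cannot be spliced directly into a GraphDSL graph with ~>. One option I am considering is bridging the two through Kafka, which we already have: prepareFlow writes to an intermediate topic, the Flink job consumes it and produces results to a second topic, and the Akka graph resumes from there. A rough sketch of what I mean (topic names "to-flink"/"from-flink", the simplified ToRead type, and the codec are all hypothetical; the Alpakka Kafka and Flink Kafka connector wiring is shown as comments since it needs those dependencies):

```scala
// Shared wire contract between the Akka side and the Flink side.
// Both sides must agree on how a record is (de)serialized on the topic.
final case class ToRead(id: String, payload: String)

object ToReadCodec {
  // Deliberately simple line-based codec; a real setup would use JSON/Avro.
  def encode(r: ToRead): String = s"${r.id}\t${r.payload}"

  def decode(s: String): ToRead = {
    // limit = 2 so tabs inside the payload survive the round trip
    val Array(id, payload) = s.split("\t", 2)
    ToRead(id, payload)
  }
}

// --- Akka side: end the graph at the bridge topic (assumes Alpakka Kafka) ---
// val toFlink =
//   Flow[ToRead]
//     .map(r => new ProducerRecord[String, String]("to-flink", r.id, ToReadCodec.encode(r)))
//     .to(Producer.plainSink(producerSettings))

// --- Flink side: do the readFlow work between the two topics ---
// val env = StreamExecutionEnvironment.getExecutionEnvironment
// env
//   .addSource(new FlinkKafkaConsumer[String]("to-flink", new SimpleStringSchema(), kafkaProps))
//   .map(s => ToReadCodec.decode(s))
//   .map(readFromCosmos _)              // the Cosmos DB lookup, now inside Flink
//   .map(ev => encodeToEvaluate(ev))    // hypothetical codec for ToEvaluate
//   .addSink(new FlinkKafkaProducer[String]("from-flink", new SimpleStringSchema(), kafkaProps))

// --- Akka side: resume the graph from the result topic ---
// val fromFlink =
//   Consumer.plainSource(consumerSettings, Subscriptions.topics("from-flink"))
//     .map(rec => decodeToEvaluate(rec.value))
```

If that works, I could presumably keep the rest of the graph unchanged by packaging the "write to topic" sink and "read from topic" source back into a single Flow[ToRead, ToEvaluate, NotUsed] with Flow.fromSinkAndSourceCoupled, though I am not sure what the ordering/delivery guarantees of such a round trip would be.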

I see there is a Flink Akka connector (a sink) in Apache Bahir, but I am not sure whether it can be used only with plain Akka actors or also with Akka Streams.

Thanks in advance

lakshmi