Kafka streams creating multiple producer threads

I am running an Alpakka Kafka producer inside an Akka Stream, and multiple producer network threads are being created:

"kafka-producer-network-thread | producer-20" #49 daemon prio=5 os_prio=31 tid=0x00007f80562c1800 nid=0x8903 runnable [0x000070000e3f9000]

I am using a singleton and not creating multiple KafkaProducer instances, yet I still see ~10k kafka-producer-network-thread threads created. This drives the servers to 99% CPU usage.
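Each KafkaProducer instance owns exactly one network (sender) thread, so counting threads by name is a quick way to confirm how many producer instances are actually alive in the JVM. A small diagnostic sketch (the thread-name prefix is taken from the dump above; `ProducerThreadCount` is a hypothetical class name):

```java
import java.util.Set;

public class ProducerThreadCount {
    // Each KafkaProducer starts one sender thread named
    // "kafka-producer-network-thread | <client-id>", so counting these
    // threads tells you how many producer instances currently exist.
    public static long countProducerNetworkThreads() {
        Set<Thread> threads = Thread.getAllStackTraces().keySet();
        return threads.stream()
                .filter(t -> t.getName().startsWith("kafka-producer-network-thread"))
                .count();
    }

    public static void main(String[] args) {
        System.out.println("producer network threads: " + countProducerNetworkThreads());
    }
}
```

If this number grows with every stream run, a new producer is being constructed each time rather than reused.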

The code is:

```java
CompletionStage<Done> done =
    Source.range(1, 100)
        .map(number -> number.toString())
        .map(value -> new ProducerRecord<String, String>(topic, value))
        .runWith(Producer.plainSink(producerSettings, kafkaProducer), materializer);
```
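Seeing ~10k network threads suggests that roughly 10k KafkaProducer instances were constructed, i.e. the "singleton" is probably being re-created for every stream run. A minimal sketch of a JVM-wide holder, in plain Java with a counter standing in for the real producer (`KafkaProducerHolder` and `FakeProducer` are hypothetical names for illustration):

```java
import java.util.concurrent.atomic.AtomicInteger;

public class KafkaProducerHolder {
    // Stand-in for org.apache.kafka.clients.producer.KafkaProducer:
    // counts how many "producers" (and hence network threads) were created.
    static final AtomicInteger constructed = new AtomicInteger();

    static final class FakeProducer {
        FakeProducer() { constructed.incrementAndGet(); }
    }

    // Class-initialization-based singleton: the JVM guarantees the
    // initializer runs exactly once, however many streams call get().
    private static final FakeProducer INSTANCE = new FakeProducer();

    public static FakeProducer get() { return INSTANCE; }
}
```

Running many streams against `KafkaProducerHolder.get()` leaves the construction count at 1; if your own count grows per run, the producer is not actually shared.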

Do I need to call producer.close()? If so, where should I do that in this stream?

A few more things I tried:

  1. Is there a way to not provide the producer from outside, e.g. by using Producer.flexiFlow or akka.kafka.javadsl.Producer.plainSink(producerSettings)? I tried that too, and it still created the network threads.
  2. For a producer provided from outside, should we close it after sending every message (or every few messages) and then create a new instance? Is there any documentation or example I can refer to for closing the producer?
  3. Is there a way to limit the threads?

1 & 3: The Kafka producer API creates its threads internally; the configuration options for threading are limited: http://kafka.apache.org/documentation/#producerconfigs

  2. No: as long as the stream is running you must not close the producer. Closing it means calling producer.close(..).
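If you do own the producer, the place to close it is after the stream's CompletionStage completes, not inside the stream. A minimal sketch of that pattern with a stand-in AutoCloseable producer (in the real code the stage would be the `CompletionStage<Done>` returned by `runWith`, and the close call would be `kafkaProducer.close()`; `FakeProducer` and `closeWhenDone` are hypothetical names):

```java
import java.util.concurrent.CompletionStage;
import java.util.concurrent.atomic.AtomicBoolean;

public class CloseAfterStream {
    // Stand-in for KafkaProducer; records whether close() was called.
    static final class FakeProducer implements AutoCloseable {
        final AtomicBoolean closed = new AtomicBoolean(false);
        @Override public void close() { closed.set(true); }
    }

    // Close the shared producer only once the stream's completion
    // stage is done (normally or with an error), never mid-stream.
    static void closeWhenDone(CompletionStage<?> done, FakeProducer producer) {
        done.whenComplete((result, error) -> producer.close());
    }
}
```

This keeps the producer open for the stream's whole lifetime while still guaranteeing its network thread is released afterwards.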

Is this also the case for Producer.flexiFlow?