Play framework + Apache Spark


(Hugo Maitre) #1

Hello,

For a project we need to create a Play web server that uses the Apache Spark libraries.

When launching the server after simply creating a SparkContext, we got this error:

We read that we need to use spark-submit to run the Spark libraries, but spark-submit requires a main class, which a Play application doesn't have.

Hope someone can help us figure this out.

Best Regards,
Hugo Maitre


(Aditya Athalye) #2

The SparkContext class lives in the spark-core library.

Check whether you have the following dependency in your build.sbt:

libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.0"

(The %% tells sbt to append your Scala binary version to the artifact name; the version number may differ.)
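One more thing that might help (a sketch on my side, not tested in your exact setup): if Spark only needs to run inside the Play JVM, you can set the master to local[*] programmatically and avoid spark-submit entirely, so no separate main class is needed. Assuming spark-core 2.4.x is on the classpath, and with a made-up app name:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object SparkLocalDemo extends App {
  // Run Spark inside the current JVM: no cluster, no spark-submit.
  val conf = new SparkConf()
    .setAppName("play-spark-demo") // hypothetical app name
    .setMaster("local[*]")         // use all local cores

  val sc = new SparkContext(conf)

  // Quick smoke test: sum 1..100 on the embedded "cluster".
  val total = sc.parallelize(1 to 100).reduce(_ + _)
  println(total) // 5050

  sc.stop() // release the context when the app shuts down
}
```

In a Play app you would typically create one such context lazily (e.g. in a singleton injected by Guice) rather than in a main object, so it is shared across requests.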

HTH


(Hugo Maitre) #3

I was using the org.spark-packages plugin.

I am now trying to import the packages one by one.

Thank you


(Hugo Maitre) #4

Okay,
I used this plugin

It was too old and made the build fail, so that was probably the cause.
My build.sbt now looks like this.
The dependencyOverrides are there because I got an error about incompatible Jackson library versions.

dependencyOverrides += "com.fasterxml.jackson.core" % "jackson-core" % "2.9.7"
dependencyOverrides += "com.fasterxml.jackson.core" % "jackson-databind" % "2.9.7"
dependencyOverrides += "com.fasterxml.jackson.module" % "jackson-module-scala_2.11" % "2.9.7"


libraryDependencies ++= Seq(
  jdbc,
  ehcache,
  ws,
  specs2 % Test,
  guice,
  "org.apache.spark" %% "spark-core" % "2.4.0",
  "org.apache.spark" %% "spark-sql" % "2.4.0",
  "org.apache.spark" %% "spark-mllib" % "2.4.0",
  "org.apache.spark" %% "spark-streaming" % "2.4.0"
)

And now it works like a charm.

Thank you !