Spark Error: Could not initialize class org.apache.spark.rdd.RDDOperationScope
I've created a Spark standalone cluster on my laptop, then I go into an sbt console on a Spark project and try to embed a Spark instance like so:
import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf().setAppName("foo").setMaster(/* Spark Master URL */)
val sc = new SparkContext(conf)
Up to that point everything works fine, but then I try:
sc.parallelize(Array(1,2,3))
// and I get: java.lang.NoClassDefFoundError: Could not initialize class org.apache.spark.rdd.RDDOperationScope$
How do I fix this?
You may be missing the library below. RDDOperationScope relies on Jackson for JSON serialization, so a missing or incompatible jackson-databind on the classpath typically triggers this NoClassDefFoundError.
<dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-databind</artifactId>
    <version>2.4.4</version>
</dependency>
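Since the question uses sbt rather than Maven, a rough equivalent in build.sbt would be the line below (the 2.4.4 version simply mirrors the Maven snippet above; adjust it to whatever Jackson version your Spark distribution expects):

// build.sbt -- sbt equivalent of the Maven dependency above; version assumed from the answer
libraryDependencies += "com.fasterxml.jackson.core" % "jackson-databind" % "2.4.4"

After adding the dependency, reload sbt and restart the console so the new jar is on the classpath.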