Spark error: Could not initialize class org.apache.spark.rdd.RDDOperationScope
I created a Spark standalone cluster on my laptop, then opened the sbt console of a Spark project and tried to embed a Spark instance like this:
import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf().setAppName("foo").setMaster(/* Spark Master URL */)
val sc = new SparkContext(conf)
Everything works fine up to that point, but then I try:
sc.parallelize(Array(1,2,3))
// and I get: java.lang.NoClassDefFoundError: Could not initialize class org.apache.spark.rdd.RDDOperationScope$
How can I fix this?
Maybe you are missing the library below. The static initializer of RDDOperationScope builds a Jackson ObjectMapper, so a missing or incompatible jackson-databind on the classpath makes that initializer fail with a NoClassDefFoundError.
<dependency>
  <groupId>com.fasterxml.jackson.core</groupId>
  <artifactId>jackson-databind</artifactId>
  <version>2.4.4</version>
</dependency>
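Since the question runs inside an sbt console, the sbt equivalent of that Maven dependency may be more convenient. This is a minimal sketch of a build.sbt line, assuming a Spark 1.x setup where jackson-databind 2.4.4 is the compatible version:

// build.sbt: pin jackson-databind so that Spark's Jackson-based
// RDDOperationScope initializer finds a version it can load
libraryDependencies += "com.fasterxml.jackson.core" % "jackson-databind" % "2.4.4"

In general, the jackson-databind version has to match what your Spark build was compiled against; a newer Jackson pulled in by another dependency can break the static initializer just as a missing one does.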