Spark 1.5.1, Cassandra Connector 1.5.0-M2, Cassandra 2.1, Scala 2.10: NoSuchMethodError from the Guava dependency
I'm new to the Spark ecosystem (and still fairly new to Maven), so I'm struggling to work out how to ship the dependencies I need correctly.
It looks like Spark 1.5.1 ships with a guava-14.0.1 dependency that it tries to use, while isPrimitive was only added in Guava 15+. What is the correct way to make sure my uber jar wins? I have tried spark.executor.extraClassPath in my spark-defaults.conf, to no avail.
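Besides extraClassPath, Spark 1.5 also has the (experimental) user-classpath-first switches, which tell the driver and executors to prefer classes from the user's jar over Spark's own. A sketch of what that would look like in spark-defaults.conf (whether it resolves this particular conflict is an assumption to verify):

```
spark.driver.userClassPathFirst    true
spark.executor.userClassPathFirst  true
```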
Possible duplicate of [this question]: Spark 1.5.1 + Scala 2.10 + Kafka + Cassandra = Java.lang.NoSuchMethodError, but for Maven (and the comments there have not been answered yet).
A stripped-down version of my dependencies:
    <dependencies>
      <dependency>
        <groupId>com.google.guava</groupId>
        <artifactId>guava</artifactId>
        <version>18.0</version>
      </dependency>
      <dependency>
        <groupId>org.apache.commons</groupId>
        <artifactId>commons-compress</artifactId>
        <version>1.10</version>
      </dependency>
      <dependency>
        <groupId>com.esotericsoftware.kryo</groupId>
        <artifactId>kryo</artifactId>
        <version>2.21</version>
      </dependency>
      <dependency>
        <groupId>org.objenesis</groupId>
        <artifactId>objenesis</artifactId>
        <version>2.1</version>
      </dependency>
      <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>1.5.0</version>
        <exclusions>
          <exclusion>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-log4j12</artifactId>
          </exclusion>
          <exclusion>
            <groupId>log4j</groupId>
            <artifactId>log4j</artifactId>
          </exclusion>
        </exclusions>
      </dependency>
      <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.10</artifactId>
        <version>1.5.0</version>
      </dependency>
      <dependency>
        <groupId>com.datastax.spark</groupId>
        <artifactId>spark-cassandra-connector_2.10</artifactId>
        <version>1.5.0-M2</version>
      </dependency>
    </dependencies>
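To confirm which Guava version Maven actually resolves from this tree, the dependency plugin can filter on the artifact (a diagnostic sketch; run it from the project root):

```shell
# Show every path through which Guava enters the build; in verbose mode,
# "omitted for conflict" lines reveal which version Maven picked.
mvn dependency:tree -Dincludes=com.google.guava:guava -Dverbose
```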
I bundle all the dependencies into my JAR with the following:
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <version>2.3</version>
      <executions>
        <execution>
          <phase>package</phase>
          <goals>
            <goal>shade</goal>
          </goals>
          <configuration>
            <artifactSet>
              <excludes>
                <exclude>org.apache.hadoop:*</exclude>
                <exclude>org.apache.hbase:*</exclude>
              </excludes>
            </artifactSet>
            <filters>
              <filter>
                <artifact>*:*</artifact>
                <excludes>
                  <exclude>META-INF/*.SF</exclude>
                  <exclude>META-INF/*.DSA</exclude>
                  <exclude>META-INF/*.RSA</exclude>
                </excludes>
              </filter>
              <filter>
                <artifact>org.apache.spark:spark-network-common_2.10</artifact>
                <excludes>
                  <exclude>com.google.common.base.*</exclude>
                </excludes>
              </filter>
            </filters>
            <transformers>
              <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
                <resource>reference.conf</resource>
              </transformer>
            </transformers>
          </configuration>
        </execution>
      </executions>
    </plugin>
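One common way to make the uber jar "win" regardless of what Spark puts on the classpath is to relocate Guava inside the shaded jar, so the Cassandra driver links against your own copy under a private package name. A hedged sketch (the shadedPattern prefix is arbitrary, not something the source prescribes), added inside the shade plugin's <configuration>:

```
<relocations>
  <relocation>
    <pattern>com.google</pattern>
    <!-- hypothetical target package; any unused prefix works -->
    <shadedPattern>shaded.com.google</shadedPattern>
  </relocation>
</relocations>
```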
And here is the glorious explosion I get when I run:
./spark-submit --master local --class
    Exception in thread "main" java.lang.NoSuchMethodError: com.google.common.reflect.TypeToken.isPrimitive()Z
        at com.datastax.driver.core.TypeCodec.<init>(TypeCodec.java:142)
        at com.datastax.driver.core.TypeCodec.<init>(TypeCodec.java:136)
        at com.datastax.driver.core.TypeCodec$BlobCodec.<init>(TypeCodec.java:609)
        at com.datastax.driver.core.TypeCodec$BlobCodec.<clinit>(TypeCodec.java:606)
        at com.datastax.driver.core.CodecRegistry.<clinit>(CodecRegistry.java:147)
        at com.datastax.driver.core.Configuration$Builder.build(Configuration.java:259)
        at com.datastax.driver.core.Cluster$Builder.getConfiguration(Cluster.java:1135)
        at com.datastax.driver.core.Cluster.<init>(Cluster.java:111)
        at com.datastax.driver.core.Cluster.buildFrom(Cluster.java:178)
        at com.datastax.driver.core.Cluster$Builder.build(Cluster.java:1152)
        at com.datastax.spark.connector.cql.DefaultConnectionFactory$.createCluster(CassandraConnectionFactory.scala:85)
        at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:155)
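When a NoSuchMethodError like this appears, it helps to ask the JVM which jar actually supplied the offending class. A small diagnostic sketch (the WhichJar class name is mine, not from the source; pass whatever class you suspect as the argument):

```java
// WhichJar.java: print the jar (code source) a class was loaded from.
// Useful to see whether your Guava 18 or Spark's Guava 14 wins.
public class WhichJar {
    public static void main(String[] args) throws ClassNotFoundException {
        String name = args.length > 0 ? args[0]
                : "com.google.common.reflect.TypeToken";
        Class<?> c = Class.forName(name);
        java.security.CodeSource src =
                c.getProtectionDomain().getCodeSource();
        // JDK bootstrap classes report a null code source.
        System.out.println(name + " loaded from "
                + (src == null ? "<bootstrap>" : src.getLocation()));
    }
}
```

Run it on the driver's classpath (e.g. via spark-submit with the same jars) and the printed location tells you which copy of Guava the executor actually links against.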
Edit: fixed my dependency issue by including the Guava jar I need in /conf/spark-defaults.conf:
    spark.driver.extraClassPath    /home/osboxes/Packages/guava-18.0.jar
    spark.executor.extraClassPath  /home/osboxes/Packages/guava-18.0.jar