Spark fails with org/apache/twill/zookeeper/ZKClient when accessing HBase through Phoenix

The twill jars were specified when submitting the Spark job, and the twill dependency was also added to the program, but the error still occurs.
Submit command:
/opt/cloudera/parcels/CDH-5.13.1-1.cdh5.13.1.p0.2/bin/spark-submit \
--master yarn-client \
--class CrossRoadPrice /mapbar/CDH-procedure/cross_road_price_spark-1.0-SNAPSHOT.jar \
-jars \
/opt/cloudera/parcels/CDH-5.13.1-1.cdh5.13.1.p0.2/jars/hbase-server-1.2.0-cdh5.13.1.jar,\
/opt/cloudera/parcels/CDH-5.13.1-1.cdh5.13.1.p0.2/jars/hbase-spark-1.2.0-cdh5.13.1.jar,\
/opt/cloudera/parcels/CDH-5.13.1-1.cdh5.13.1.p0.2/jars/hbase-thrift-1.2.0-cdh5.13.1.jar,\
/opt/cloudera/parcels/CDH-5.13.1-1.cdh5.13.1.p0.2/jars/hbase-client-1.2.0-cdh5.13.1.jar,\
/opt/cloudera/parcels/CDH-5.13.1-1.cdh5.13.1.p0.2/jars/hbase-common-1.2.0-cdh5.13.1.jar,\
/root/kryo-2.21.jar,\
/opt/cloudera/parcels/APACHE_PHOENIX-4.14.0-cdh5.13.2.p0.3/lib/phoenix/lib/tephra-api-0.14.0-incubating.jar,\
/opt/cloudera/parcels/APACHE_PHOENIX-4.14.0-cdh5.13.2.p0.3/lib/phoenix/lib/tephra-core-0.14.0-incubating.jar,\
/opt/cloudera/parcels/APACHE_PHOENIX-4.14.0-cdh5.13.2.p0.3/lib/phoenix/lib/tephra-hbase-compat-1.2-cdh-0.14.0-incubating.jar,\
/opt/cloudera/parcels/APACHE_PHOENIX-4.14.0-cdh5.13.2.p0.3/lib/phoenix/phoenix-4.14.0-cdh5.13.2-client.jar,\
/opt/cloudera/parcels/APACHE_PHOENIX-4.14.0-cdh5.13.2.p0.3/lib/phoenix/phoenix-4.14.0-cdh5.13.2-queryserver.jar,\
/opt/cloudera/parcels/APACHE_PHOENIX-4.14.0-cdh5.13.2.p0.3/lib/phoenix/phoenix-4.14.0-cdh5.13.2-server.jar,\
/opt/cloudera/parcels/APACHE_PHOENIX-4.14.0-cdh5.13.2.p0.3/lib/phoenix/lib/twill-api-0.8.0.jar,\
/opt/cloudera/parcels/APACHE_PHOENIX-4.14.0-cdh5.13.2.p0.3/lib/phoenix/lib/twill-common-0.8.0.jar,\
/opt/cloudera/parcels/APACHE_PHOENIX-4.14.0-cdh5.13.2.p0.3/lib/phoenix/lib/twill-core-0.8.0.jar,\
/opt/cloudera/parcels/APACHE_PHOENIX-4.14.0-cdh5.13.2.p0.3/lib/phoenix/lib/twill-zookeeper-0.8.0.jar,\
/opt/cloudera/parcels/APACHE_PHOENIX-4.14.0-cdh5.13.2.p0.3/lib/phoenix/lib/twill-discovery-api-0.8.0.jar,\
/opt/cloudera/parcels/APACHE_PHOENIX-4.14.0-cdh5.13.2.p0.3/lib/phoenix/lib/twill-discovery-core-0.8.0.jar \
--executor-memory 4G --total-executor-core 6 --driver-memory 10G
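
One thing worth double-checking in the command above: spark-submit only recognizes `--jars` with two leading dashes, and all options must appear before the application jar — anything placed after the jar is passed to the program as ordinary arguments rather than parsed by spark-submit. A corrected skeleton (same paths as above; `JARS` stands for the comma-separated jar list already shown, elided here for brevity):

```shell
# Sketch only: the flag is --jars (two dashes, not -jars), and every option
# must come before the application jar. JARS is the comma-separated list of
# the HBase/Phoenix/tephra/twill jars from the command above (elided).
/opt/cloudera/parcels/CDH-5.13.1-1.cdh5.13.1.p0.2/bin/spark-submit \
  --master yarn-client \
  --class CrossRoadPrice \
  --jars "$JARS" \
  --executor-memory 4G \
  --driver-memory 10G \
  /mapbar/CDH-procedure/cross_road_price_spark-1.0-SNAPSHOT.jar
```

Note also that `--total-executor-core` is not a valid flag: the standalone/Mesos flag is `--total-executor-cores`, and on YARN the equivalents are `--executor-cores` and `--num-executors`.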
Error:
18/12/19 09:52:48 INFO spark.SparkContext: Created broadcast 4 from newAPIHadoopRDD at PhoenixRDD.scala:49
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/twill/zookeeper/ZKClient
at org.apache.phoenix.transaction.TransactionFactory$Provider.<clinit>(TransactionFactory.java:27)
at org.apache.phoenix.query.QueryServicesOptions.<clinit>(QueryServicesOptions.java:270)
at org.apache.phoenix.query.QueryServicesImpl.<init>(QueryServicesImpl.java:36)
at org.apache.phoenix.jdbc.PhoenixDriver.getQueryServices(PhoenixDriver.java:197)
at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:235)
at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.createConnection(PhoenixEmbeddedDriver.java:150)
at org.apache.phoenix.jdbc.PhoenixDriver.connect(PhoenixDriver.java:221)
at java.sql.DriverManager.getConnection(DriverManager.java:664)
at java.sql.DriverManager.getConnection(DriverManager.java:208)
at org.apache.phoenix.mapreduce.util.ConnectionUtil.getConnection(ConnectionUtil.java:113)
at org.apache.phoenix.mapreduce.util.ConnectionUtil.getInputConnection(ConnectionUtil.java:58)
at org.apache.phoenix.mapreduce.util.PhoenixConfigurationUtil.getSelectColumnMetadataList(PhoenixConfigurationUtil.java:354)
at org.apache.phoenix.spark.PhoenixRDD.toDataFrame(PhoenixRDD.scala:118)
at org.apache.phoenix.spark.SparkSqlContextFunctions.phoenixTableAsDataFrame(SparkSqlContextFunctions.scala:39)
at CrossRoadPrice$$anonfun$main$1.apply(CrossRoadPrice.scala:341)
at CrossRoadPrice$$anonfun$main$1.apply(CrossRoadPrice.scala:78)
at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:108)
at CrossRoadPrice$.main(CrossRoadPrice.scala:78)
at CrossRoadPrice.main(CrossRoadPrice.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:730)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.twill.zookeeper.ZKClient
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 29 more
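
Before tuning the classpath further, it can help to confirm that the missing class actually ships in the jar being passed. A quick check with the JDK's `jar` tool, using the parcel path from the command above:

```shell
# List the jar's entries and look for the missing class file.
jar tf /opt/cloudera/parcels/APACHE_PHOENIX-4.14.0-cdh5.13.2.p0.3/lib/phoenix/lib/twill-zookeeper-0.8.0.jar \
  | grep 'org/apache/twill/zookeeper/ZKClient'
```

If the grep matches `org/apache/twill/zookeeper/ZKClient.class`, the jar itself is fine and the problem is how it reaches the driver/executor classpath (for example via `--jars`, or `spark.driver.extraClassPath` / `spark.executor.extraClassPath`).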

IvanLeung (a handsome code dog):


Most likely a jar conflict. Try setting the Maven scope of those dependencies to provided.
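
The suggestion above can be sketched as a Maven dependency entry. The coordinates below are one illustrative example (twill-zookeeper, matching the parcel's 0.8.0 jars); the same `provided` scope would apply to the other cluster-supplied artifacts such as the HBase and Phoenix jars:

```xml
<!-- Compile against this jar, but do not bundle it into the application jar;
     at runtime the copy shipped with the cluster/parcel is used instead,
     avoiding duplicate (and possibly conflicting) versions on the classpath. -->
<dependency>
  <groupId>org.apache.twill</groupId>
  <artifactId>twill-zookeeper</artifactId>
  <version>0.8.0</version>
  <scope>provided</scope>
</dependency>
```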
