Problem writing to Phoenix from Spark

I submitted a Spark job that writes to Phoenix. Checking the deployment, disruptor-3.3.0.jar is present; even after replacing it with disruptor-3.3.8.jar, the job still fails with this error:
Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 410.0 failed 4 times, most recent failure: Lost task 0.3 in stage 410.0 (TID 821, hadoop-slave02, executor 1): com.google.common.util.concurrent.ExecutionError: java.lang.NoSuchMethodError: com.lmax.disruptor.dsl.Disruptor.<init>(Lcom/lmax/disruptor/EventFactory;ILjava/util/concurrent/ThreadFactory;Lcom/lmax/disruptor/dsl/ProducerType;Lcom/lmax/disruptor/WaitStrategy;)V
at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2232)
at com.google.common.cache.LocalCache.get(LocalCache.java:3965)
at com.google.common.cache.LocalCache$LocalManualCache.get(LocalCache.java:4764)
at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:240)
at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.createConnection(PhoenixEmbeddedDriver.java:150)
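A `NoSuchMethodError` like this usually means an older disruptor jar is found first on the classpath, shadowing the version Phoenix needs. A minimal sketch (the class name comes from the stack trace; the helper name `JarLocator` is hypothetical) for pinpointing which jar a class is actually loaded from at runtime:

```java
import java.security.CodeSource;

public class JarLocator {
    // Returns the jar (or directory) a class was loaded from, so you can
    // see which disruptor version actually wins on the classpath.
    static String locate(String className) {
        try {
            Class<?> c = Class.forName(className);
            CodeSource src = c.getProtectionDomain().getCodeSource();
            // JDK core classes have no code source (bootstrap loader).
            return src == null ? "(bootstrap/platform classloader)"
                               : src.getLocation().toString();
        } catch (ClassNotFoundException e) {
            return "(not on classpath)";
        }
    }

    public static void main(String[] args) {
        String target = args.length > 0 ? args[0]
                                        : "com.lmax.disruptor.dsl.Disruptor";
        System.out.println(target + " -> " + locate(target));
    }
}
```

Running this inside the failing executor's classpath (for example via `spark-submit` with the same `--conf` options) shows whether the 3.3.0 or 3.3.8 jar is being picked up.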
Answer from IvanLeung (a handsome code dog):


There is no need to replace the jar in place. Put disruptor-3.3.8.jar into Phoenix's lib directory, then add these options to spark-submit:
--conf spark.driver.extraClassPath=/opt/cloudera/parcels/APACHE_PHOENIX/lib/phoenix/lib/* \
--conf spark.executor.extraClassPath=/opt/cloudera/parcels/APACHE_PHOENIX/lib/phoenix/lib/* \
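This works because `extraClassPath` prepends the Phoenix lib directory to both the driver's and the executors' classpaths, so the 3.3.8 jar there wins. It only helps if the directory contains exactly one disruptor jar. A small sketch (the helper name `check_disruptor_jars` is hypothetical, not part of any tool) to confirm that before submitting:

```shell
# Count disruptor-*.jar files in a directory; more than one means two
# versions could still race for the classpath.
check_disruptor_jars() {
  dir="$1"
  count=$(find "$dir" -maxdepth 1 -name 'disruptor-*.jar' 2>/dev/null | wc -l | tr -d ' ')
  if [ "$count" -gt 1 ]; then
    echo "WARNING: $count disruptor jars in $dir - remove the old one"
  else
    echo "OK: $count disruptor jar(s) in $dir"
  fi
}

# Path taken from the answer above; adjust for your install.
check_disruptor_jars /opt/cloudera/parcels/APACHE_PHOENIX/lib/phoenix/lib
```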
 
 


