Phoenix + HBase + Kerberos: CsvBulkLoadTool fails

HBase has Kerberos authentication enabled, and the following sqlline command works:

/usr/lib/phoenix/bin/sqlline.py test1,test2,test3:/hbase-secure:hbase-test@TEST:/etc/security/keytabs/hbase.headless.keytab
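For context, the connect string passed to sqlline follows Phoenix's `quorum[:port][:root-znode][:principal][:keytab]` URL anatomy; the ZooKeeper port defaults to 2181 when omitted, as in the command above. A minimal sketch assembling the fully qualified form from its parts (all values taken from this post):

```shell
# Assemble a secure Phoenix connect string piece by piece.
QUORUM="test1,test2,test3"                              # ZooKeeper quorum
PORT="2181"                                             # ZK client port (default if omitted)
ZNODE="/hbase-secure"                                   # HBase root znode on a secure cluster
PRINCIPAL="hbase-test@TEST"                             # Kerberos principal
KEYTAB="/etc/security/keytabs/hbase.headless.keytab"    # keytab for that principal

URL="${QUORUM}:${PORT}:${ZNODE}:${PRINCIPAL}:${KEYTAB}"
echo "$URL"
# Then: /usr/lib/phoenix/bin/sqlline.py "$URL"
```

The fully qualified form (with the explicit port) is the same string the `-z` option of CsvBulkLoadTool expects.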
 
However, running CsvBulkLoadTool fails:

/usr/lib/hadoop/bin/hadoop jar /usr/lib/phoenix/phoenix-xxxxx-client.jar org.apache.phoenix.mapreduce.CsvBulkLoadTool -Dhbase.client.retries.number=1 --table STOCK_SYMBOL --input /tmp/STOCK_SYMBOL.csv -z "test1,test2,test3:2181:/hbase-secure:hbase-test@TEST:/etc/security/keytabs/hbase.headless.keytab"
The error is as follows:

19/05/14 17:11:55 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=test1:2181,test2:2181,test3:2181 sessionTimeout=90000 watcher=org.apache.hadoop.hbase.zookeeper.ReadOnlyZKClient$$Lambda$14/1787127182@2785174a
19/05/14 17:11:55 INFO zookeeper.ClientCnxn: Opening socket connection to server test1/192.168.1.2:2181. Will not attempt to authenticate using SASL (unknown error)
19/05/14 17:11:55 INFO zookeeper.ClientCnxn: Socket connection established, initiating session, client: /192.168.1.2:52996, server: test1/192.168.1.2:2181
19/05/14 17:11:55 INFO zookeeper.ClientCnxn: Session establishment complete on server test1/192.168.1.2:2181, sessionid = 0x100d35746c1004d, negotiated timeout = 90000
19/05/14 17:11:56 INFO query.ConnectionQueryServicesImpl: HConnection established. Stacktrace for informational purposes: hconnection-0x4de3d79d java.lang.Thread.getStackTrace(Thread.java:1556)
org.apache.phoenix.util.LogUtil.getCallerStackTrace(LogUtil.java:55)
org.apache.phoenix.query.ConnectionQueryServicesImpl.openConnection(ConnectionQueryServicesImpl.java:432)
org.apache.phoenix.query.ConnectionQueryServicesImpl.access$400(ConnectionQueryServicesImpl.java:272)
org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:2556)
org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:2532)
org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:76)
org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:2532)
org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:255)
org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.createConnection(PhoenixEmbeddedDriver.java:150)
org.apache.phoenix.jdbc.PhoenixDriver.connect(PhoenixDriver.java:221)
java.sql.DriverManager.getConnection(DriverManager.java:664)
java.sql.DriverManager.getConnection(DriverManager.java:208)
org.apache.phoenix.util.QueryUtil.getConnection(QueryUtil.java:400)
org.apache.phoenix.util.QueryUtil.getConnection(QueryUtil.java:392)
org.apache.phoenix.mapreduce.AbstractBulkLoadTool.loadData(AbstractBulkLoadTool.java:206)
org.apache.phoenix.mapreduce.AbstractBulkLoadTool.run(AbstractBulkLoadTool.java:180)
org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
org.apache.phoenix.mapreduce.CsvBulkLoadTool.main(CsvBulkLoadTool.java:109)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
org.apache.hadoop.util.RunJar.run(RunJar.java:308)
org.apache.hadoop.util.RunJar.main(RunJar.java:222)

19/05/14 17:11:57 INFO zookeeper.ZooKeeper: Session: 0x100d35746c1004d closed
19/05/14 17:11:57 INFO zookeeper.ClientCnxn: EventThread shut down for session: 0x100d35746c1004d
19/05/14 17:11:57 INFO log.QueryLoggerDisruptor: Shutting down QueryLoggerDisruptor..
Exception in thread "main" org.apache.phoenix.exception.PhoenixIOException: Failed after attempts=2, exceptions:
Tue May 14 17:11:57 CST 2019, RpcRetryingCaller{globalStartTime=1557825116950, pause=100, maxAttempts=2}, org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=2, exceptions:
Tue May 14 17:11:57 CST 2019, RpcRetryingCaller{globalStartTime=1557825116975, pause=100, maxAttempts=2}, org.apache.hadoop.hbase.exceptions.ConnectionClosedException: Connection closed
Tue May 14 17:11:57 CST 2019, RpcRetryingCaller{globalStartTime=1557825116975, pause=100, maxAttempts=2}, org.apache.hadoop.hbase.exceptions.ConnectionClosedException: Connection closed

Tue May 14 17:11:57 CST 2019, RpcRetryingCaller{globalStartTime=1557825116950, pause=100, maxAttempts=2}, org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=2, exceptions:
Tue May 14 17:11:57 CST 2019, RpcRetryingCaller{globalStartTime=1557825117760, pause=100, maxAttempts=2}, org.apache.hadoop.hbase.exceptions.ConnectionClosedException: Connection closed
Tue May 14 17:11:57 CST 2019, RpcRetryingCaller{globalStartTime=1557825117760, pause=100, maxAttempts=2}, org.apache.hadoop.hbase.exceptions.ConnectionClosedException: Connection closed


at org.apache.phoenix.util.ServerUtil.parseServerException(ServerUtil.java:138)
at org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:1204)
at org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:1501)
at org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:2721)
at org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:1114)
at org.apache.phoenix.compile.CreateTableCompiler$1.execute(CreateTableCompiler.java:192)
at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:408)
at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:391)
at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
at org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:390)
at org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:378)
at org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:1806)
at org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:2569)
at org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:2532)
at org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:76)
at org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:2532)
at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:255)
at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.createConnection(PhoenixEmbeddedDriver.java:150)
at org.apache.phoenix.jdbc.PhoenixDriver.connect(PhoenixDriver.java:221)
at java.sql.DriverManager.getConnection(DriverManager.java:664)
at java.sql.DriverManager.getConnection(DriverManager.java:208)
at org.apache.phoenix.util.QueryUtil.getConnection(QueryUtil.java:400)
at org.apache.phoenix.util.QueryUtil.getConnection(QueryUtil.java:392)
at org.apache.phoenix.mapreduce.AbstractBulkLoadTool.loadData(AbstractBulkLoadTool.java:206)
at org.apache.phoenix.mapreduce.AbstractBulkLoadTool.run(AbstractBulkLoadTool.java:180)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
at org.apache.phoenix.mapreduce.CsvBulkLoadTool.main(CsvBulkLoadTool.java:109)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:308)
at org.apache.hadoop.util.RunJar.main(RunJar.java:222)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=2, exceptions:
Tue May 14 17:11:57 CST 2019, RpcRetryingCaller{globalStartTime=1557825116950, pause=100, maxAttempts=2}, org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=2, exceptions:
Tue May 14 17:11:57 CST 2019, RpcRetryingCaller{globalStartTime=1557825116975, pause=100, maxAttempts=2}, org.apache.hadoop.hbase.exceptions.ConnectionClosedException: Connection closed
Tue May 14 17:11:57 CST 2019, RpcRetryingCaller{globalStartTime=1557825116975, pause=100, maxAttempts=2}, org.apache.hadoop.hbase.exceptions.ConnectionClosedException: Connection closed

Tue May 14 17:11:57 CST 2019, RpcRetryingCaller{globalStartTime=1557825116950, pause=100, maxAttempts=2}, org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=2, exceptions:
Tue May 14 17:11:57 CST 2019, RpcRetryingCaller{globalStartTime=1557825117760, pause=100, maxAttempts=2}, org.apache.hadoop.hbase.exceptions.ConnectionClosedException: Connection closed
Tue May 14 17:11:57 CST 2019, RpcRetryingCaller{globalStartTime=1557825117760, pause=100, maxAttempts=2}, org.apache.hadoop.hbase.exceptions.ConnectionClosedException: Connection closed


at org.apache.hadoop.hbase.client.RpcRetryingCallerImpl.callWithRetries(RpcRetryingCallerImpl.java:144)
at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:3084)
at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:3076)
at org.apache.hadoop.hbase.client.HBaseAdmin.tableExists(HBaseAdmin.java:442)
at org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:1105)
... 32 more
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=2, exceptions:
Tue May 14 17:11:57 CST 2019, RpcRetryingCaller{globalStartTime=1557825117760, pause=100, maxAttempts=2}, org.apache.hadoop.hbase.exceptions.ConnectionClosedException: Connection closed
Tue May 14 17:11:57 CST 2019, RpcRetryingCaller{globalStartTime=1557825117760, pause=100, maxAttempts=2}, org.apache.hadoop.hbase.exceptions.ConnectionClosedException: Connection closed

at org.apache.hadoop.hbase.client.RpcRetryingCallerImpl.callWithRetries(RpcRetryingCallerImpl.java:144)
at org.apache.hadoop.hbase.client.HTable.get(HTable.java:386)
at org.apache.hadoop.hbase.client.HTable.get(HTable.java:360)
at org.apache.hadoop.hbase.MetaTableAccessor.getTableState(MetaTableAccessor.java:1078)
at org.apache.hadoop.hbase.MetaTableAccessor.tableExists(MetaTableAccessor.java:403)
at org.apache.hadoop.hbase.client.HBaseAdmin$6.rpcCall(HBaseAdmin.java:445)
at org.apache.hadoop.hbase.client.HBaseAdmin$6.rpcCall(HBaseAdmin.java:442)
at org.apache.hadoop.hbase.client.RpcRetryingCallable.call(RpcRetryingCallable.java:58)
at org.apache.hadoop.hbase.client.RpcRetryingCallerImpl.callWithRetries(RpcRetryingCallerImpl.java:107)
... 36 more
Caused by: org.apache.hadoop.hbase.exceptions.ConnectionClosedException: Connection closed
at org.apache.hadoop.hbase.ipc.NettyRpcDuplexHandler.channelInactive(NettyRpcDuplexHandler.java:211)
at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:245)
at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:231)
at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:224)
at org.apache.hbase.thirdparty.io.netty.handler.codec.ByteToMessageDecoder.channelInputClosed(ByteToMessageDecoder.java:377)
at org.apache.hbase.thirdparty.io.netty.handler.codec.ByteToMessageDecoder.channelInactive(ByteToMessageDecoder.java:342)
at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:245)
at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:231)
at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:224)
at org.apache.hbase.thirdparty.io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:75)
at org.apache.hbase.thirdparty.io.netty.handler.timeout.IdleStateHandler.channelInactive(IdleStateHandler.java:277)
at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:245)
at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:231)
at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:224)
at org.apache.hbase.thirdparty.io.netty.channel.DefaultChannelPipeline$HeadContext.channelInactive(DefaultChannelPipeline.java:1354)
at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:245)
at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:231)
at org.apache.hbase.thirdparty.io.netty.channel.DefaultChannelPipeline.fireChannelInactive(DefaultChannelPipeline.java:917)
at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannel$AbstractUnsafe$8.run(AbstractChannel.java:822)
at org.apache.hbase.thirdparty.io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
at org.apache.hbase.thirdparty.io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:403)
at org.apache.hbase.thirdparty.io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:463)
at org.apache.hbase.thirdparty.io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
at org.apache.hbase.thirdparty.io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
at java.lang.Thread.run(Thread.java:745)

hmaster (HBase practitioner) answered:

Answering my own question. The following invocation works:

/usr/lib/hadoop/bin/hadoop jar /usr/lib/phoenix/phoenix-xxxxx-client.jar org.apache.phoenix.mapreduce.CsvBulkLoadTool -Dhbase.client.retries.number=1 -Dhbase.security.authentication=kerberos -Dphoenix.schema.isNamespaceMappingEnabled=true -Dhbase.master.kerberos.principal=hbase/_HOST@xxx -Dhbase.regionserver.kerberos.principal=hbase/_HOST@xxx --table STOCK_SYMBOL --input /tmp/STOCK_SYMBOL.csv -z "test1,test2,test3:2181:/hbase-secure:hbase-test@TEST:/etc/security/keytabs/hbase.headless.keytab"

Whether to pass -Dphoenix.schema.isNamespaceMappingEnabled=true depends on your Phoenix configuration.
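One way to decide whether that last flag is needed is to read the server-side hbase-site.xml: the client's phoenix.schema.isNamespaceMappingEnabled value must agree with the server's, or Phoenix refuses the connection with a namespace-mapping inconsistency error. A sketch of extracting the value (a sample file stands in for the real config here; the actual path varies by distribution, e.g. /etc/hbase/conf/hbase-site.xml):

```shell
# Sample standing in for the cluster's hbase-site.xml.
cat > /tmp/hbase-site-sample.xml <<'EOF'
<configuration>
  <property>
    <name>phoenix.schema.isNamespaceMappingEnabled</name>
    <value>true</value>
  </property>
</configuration>
EOF

# Print the <value> on the line following the matching <name>.
MAPPING=$(grep -A1 "phoenix.schema.isNamespaceMappingEnabled" /tmp/hbase-site-sample.xml \
          | sed -n 's:.*<value>\(.*\)</value>.*:\1:p')
echo "$MAPPING"
```

If this prints `true`, pass -Dphoenix.schema.isNamespaceMappingEnabled=true on the client as well; if the property is absent or `false`, omit it.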


