Spark throws NoServerForRegionException when reading HBase

When Spark 2.1.0 reads an HBase table on CDH 5.7.0, 19 of the 20 tasks succeed, but the last task fails with the following error:
(hdp02101, executor 3): org.apache.hadoop.hbase.client.NoServerForRegionException: Unable to find region for 123 in graph_dev:tt_dm after 35 retries
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegionInMeta(ConnectionManager.java:1329)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(ConnectionManager.java:1199)
at org.apache.hadoop.hbase.client.RpcRetryingCallerWithReadReplicas.getRegionLocations(RpcRetryingCallerWithReadReplicas.java:305)
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:156)
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:60)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:200)
at org.apache.hadoop.hbase.client.ClientScanner.call(ClientScanner.java:320)
at org.apache.hadoop.hbase.client.ClientScanner.nextScanner(ClientScanner.java:295)
......
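
For reference, the read uses the usual newAPIHadoopRDD + TableInputFormat pattern, roughly like the sketch below (simplified, not my exact code; only the table name graph_dev:tt_dm is taken from the error above):

    import org.apache.hadoop.hbase.HBaseConfiguration
    import org.apache.hadoop.hbase.client.Result
    import org.apache.hadoop.hbase.io.ImmutableBytesWritable
    import org.apache.hadoop.hbase.mapreduce.TableInputFormat
    import org.apache.spark.sql.SparkSession

    object HBaseReadSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().appName("hbase-read").getOrCreate()
        val sc = spark.sparkContext

        // Standard HBase client configuration; hbase-site.xml must be on the
        // classpath so executors can reach ZooKeeper and the region servers.
        val hbaseConf = HBaseConfiguration.create()
        hbaseConf.set(TableInputFormat.INPUT_TABLE, "graph_dev:tt_dm")

        // TableInputFormat creates one input split (one Spark task) per HBase
        // region, so 20 tasks here correspond to 20 regions of the table.
        val rdd = sc.newAPIHadoopRDD(
          hbaseConf,
          classOf[TableInputFormat],
          classOf[ImmutableBytesWritable],
          classOf[Result])

        println(s"row count: ${rdd.count()}")
        spark.stop()
      }
    }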

My HBase version is 1.2.0-cdh5.7.0. Do I need to restart HBase, or do I need to change some configuration?
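
In case it helps narrow this down, I could also try to locate the region for that row key directly with the plain HBase 1.2 client, outside of Spark, along these lines (just a sketch; the row key "123" is taken from the error message, and the encoding would need adjusting if the real key is not a plain string):

    import org.apache.hadoop.hbase.{HBaseConfiguration, TableName}
    import org.apache.hadoop.hbase.client.ConnectionFactory
    import org.apache.hadoop.hbase.util.Bytes

    object LocateRegionCheck {
      def main(args: Array[String]): Unit = {
        val conf = HBaseConfiguration.create()
        val connection = ConnectionFactory.createConnection(conf)
        try {
          val locator = connection.getRegionLocator(TableName.valueOf("graph_dev:tt_dm"))
          // Ask hbase:meta (bypassing the location cache) which region and
          // server currently hold the row the failing task was looking for.
          val location = locator.getRegionLocation(Bytes.toBytes("123"), true)
          println(s"region: ${location.getRegionInfo.getRegionNameAsString}")
          println(s"server: ${location.getHostnamePort}")
        } finally {
          connection.close()
        }
      }
    }

If this fails the same way outside Spark, the problem is presumably on the HBase side (the region for that key is not assigned anywhere) rather than in the Spark job itself.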