【Background】After configuring a Hive catalog, Hive data cannot be queried; the log reports connection-reset errors.
【StarRocks version】3.0.2
【Cluster size】3 FE + 10 BE
【Machine info】10 physical machines, 48 cores and 256 GB RAM each
【HMS version】2.3.8
【Details】
The Hive catalog was configured by following https://docs.starrocks.io/zh-cn/latest/data_source/catalog/hive_catalog#hive-catalog. The main steps were:
- run kinit periodically on every physical machine
- add JAVA_OPTS="-Djava.security.krb5.conf=/etc/krb5.conf" to the BE and FE configuration files
- add export HADOOP_USER_NAME="<user_name>" at the very top of hadoop_env.sh
- copy core-site.xml and hdfs-site.xml into the conf directories of the BEs and FEs
- restart all FEs and BEs
- create the catalog:
CREATE EXTERNAL CATALOG hive_catalog
PROPERTIES
(
    "type" = "hive",
    "hive.metastore.uris" = "thrift://#######:9083,thrift://#######:9083"
);
- query the Hive catalog:
mysql> show databases from hive_catalog;
ERROR 1064 (HY000): Failed to getAllDatabases, msg: null
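As a basic sanity check before digging into Kerberos settings, plain TCP reachability of the metastore endpoints can be probed from the FE hosts. This is only a sketch; the hostnames below are placeholders for the masked thrift://#######:9083 addresses in the catalog definition:

```python
import socket

def check_hms_port(host: str, port: int = 9083, timeout: float = 3.0) -> bool:
    """Return True if a plain TCP connection to the HMS thrift port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Placeholder hostnames; substitute the real (masked) HMS hosts.
for host in ("hms-host-1", "hms-host-2"):
    print(host, "reachable" if check_hms_port(host) else "unreachable")
```

If the port is reachable but the connection is still reset during set_ugi(), the problem is more likely at the protocol/authentication layer than at the network layer.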
The log contains the following error:
2023-08-02 10:15:42,103 WARN (starrocks-mysql-nio-pool-80|728) [HiveMetaStoreThriftClient.open():556] set_ugi() not successful, Likely cause: new client talking to old server. Continuing without it.
org.apache.thrift.transport.TTransportException: java.net.SocketException: Connection reset
at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:127) ~[libthrift-0.13.0.jar:0.13.0]
at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86) ~[libthrift-0.13.0.jar:0.13.0]
at org.apache.thrift.protocol.TBinaryProtocol.readStringBody(TBinaryProtocol.java:411) ~[libthrift-0.13.0.jar:0.13.0]
at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:254) ~[libthrift-0.13.0.jar:0.13.0]
at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:77) ~[libthrift-0.13.0.jar:0.13.0]
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_set_ugi(ThriftHiveMetastore.java:4931) ~[hive-apache-3.1.2-13.jar:?]
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.set_ugi(ThriftHiveMetastore.java:4917) ~[hive-apache-3.1.2-13.jar:?]
at com.starrocks.connector.hive.HiveMetaStoreThriftClient.open(HiveMetaStoreThriftClient.java:548) ~[starrocks-fe.jar:?]
at com.starrocks.connector.hive.HiveMetaStoreThriftClient.&lt;init&gt;(HiveMetaStoreThriftClient.java:302) ~[starrocks-fe.jar:?]
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_131]
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:1.8.0_131]
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_131]
at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[?:1.8.0_131]
at org.apache.hadoop.hive.metastore.utils.JavaUtils.newInstance(JavaUtils.java:84) ~[hive-apache-3.1.2-13.jar:?]
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.&lt;init&gt;(RetryingMetaStoreClient.java:95) ~[hive-apache-3.1.2-13.jar:?]
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:148) ~[hive-apache-3.1.2-13.jar:?]
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:119) ~[hive-apache-3.1.2-13.jar:?]
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:112) ~[hive-apache-3.1.2-13.jar:?]
at com.starrocks.connector.hive.HiveMetaClient$RecyclableClient.&lt;init&gt;(HiveMetaClient.java:93) ~[starrocks-fe.jar:?]
at com.starrocks.connector.hive.HiveMetaClient$RecyclableClient.&lt;init&gt;(HiveMetaClient.java:82) ~[starrocks-fe.jar:?]
at com.starrocks.connector.hive.HiveMetaClient.getClient(HiveMetaClient.java:137) ~[starrocks-fe.jar:?]
at com.starrocks.connector.hive.HiveMetaClient.callRPC(HiveMetaClient.java:153) ~[starrocks-fe.jar:?]
at com.starrocks.connector.hive.HiveMetaClient.callRPC(HiveMetaClient.java:145) ~[starrocks-fe.jar:?]
at com.starrocks.connector.hive.HiveMetaClient.getAllDatabaseNames(HiveMetaClient.java:176) ~[starrocks-fe.jar:?]
at com.starrocks.connector.hive.HiveMetastore.getAllDatabaseNames(HiveMetastore.java:56) ~[starrocks-fe.jar:?]
at com.starrocks.connector.hive.CachingHiveMetastore.loadAllDatabaseNames(CachingHiveMetastore.java:178) ~[starrocks-fe.jar:?]
at com.google.common.cache.CacheLoader$SupplierToCacheLoader.load(CacheLoader.java:227) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.CacheLoader$1.load(CacheLoader.java:192) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3529) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2278) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2155) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2045) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache.get(LocalCache.java:3962) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3985) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4946) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$LocalLoadingCache.getUnchecked(LocalCache.java:4952) ~[spark-dpp-1.0.0.jar:?]
at com.starrocks.connector.hive.CachingHiveMetastore.get(CachingHiveMetastore.java:447) ~[starrocks-fe.jar:?]
at com.starrocks.connector.hive.CachingHiveMetastore.getAllDatabaseNames(CachingHiveMetastore.java:174) ~[starrocks-fe.jar:?]
at com.starrocks.connector.hive.CachingHiveMetastore.loadAllDatabaseNames(CachingHiveMetastore.java:178) ~[starrocks-fe.jar:?]
at com.google.common.cache.CacheLoader$SupplierToCacheLoader.load(CacheLoader.java:227) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.CacheLoader$1.load(CacheLoader.java:192) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3529) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2278) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2155) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2045) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache.get(LocalCache.java:3962) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3985) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4946) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$LocalLoadingCache.getUnchecked(LocalCache.java:4952) ~[spark-dpp-1.0.0.jar:?]
at com.starrocks.connector.hive.CachingHiveMetastore.get(CachingHiveMetastore.java:447) ~[starrocks-fe.jar:?]
at com.starrocks.connector.hive.CachingHiveMetastore.getAllDatabaseNames(CachingHiveMetastore.java:174) ~[starrocks-fe.jar:?]
at com.starrocks.connector.hive.HiveMetastoreOperations.getAllDatabaseNames(HiveMetastoreOperations.java:42) ~[starrocks-fe.jar:?]
at com.starrocks.connector.hive.HiveMetadata.listDbNames(HiveMetadata.java:73) ~[starrocks-fe.jar:?]
at com.starrocks.server.MetadataMgr.listDbNames(MetadataMgr.java:105) ~[starrocks-fe.jar:?]
at com.starrocks.qe.ShowExecutor.handleShowDb(ShowExecutor.java:770) ~[starrocks-fe.jar:?]
at com.starrocks.qe.ShowExecutor.execute(ShowExecutor.java:264) ~[starrocks-fe.jar:?]
at com.starrocks.qe.StmtExecutor.handleShow(StmtExecutor.java:1140) ~[starrocks-fe.jar:?]
at com.starrocks.qe.StmtExecutor.execute(StmtExecutor.java:516) ~[starrocks-fe.jar:?]
at com.starrocks.qe.ConnectProcessor.handleQuery(ConnectProcessor.java:349) ~[starrocks-fe.jar:?]
at com.starrocks.qe.ConnectProcessor.dispatch(ConnectProcessor.java:463) ~[starrocks-fe.jar:?]
at com.starrocks.qe.ConnectProcessor.processOnce(ConnectProcessor.java:729) ~[starrocks-fe.jar:?]
at com.starrocks.mysql.nio.ReadListener.lambda$handleEvent$0(ReadListener.java:69) ~[starrocks-fe.jar:?]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_131]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_131]
at java.lang.Thread.run(Thread.java:748) ~[?:1.8.0_131]
Caused by: java.net.SocketException: Connection reset
at java.net.SocketInputStream.read(SocketInputStream.java:210) ~[?:1.8.0_131]
at java.net.SocketInputStream.read(SocketInputStream.java:141) ~[?:1.8.0_131]
at java.io.BufferedInputStream.read1(BufferedInputStream.java:284) ~[?:1.8.0_131]
at java.io.BufferedInputStream.read(BufferedInputStream.java:345) ~[?:1.8.0_131]
at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:125) ~[libthrift-0.13.0.jar:0.13.0]
... 64 more