To help us locate your issue faster, please provide the following information. Thank you.
【Details】
I hit an error while loading data from Hive into StarRocks. The SQL I executed:
INSERT OVERWRITE xxx.xxxx
SELECT * FROM hive_catalog_hms.xxx.xxx where pt_d = '2025-09-18';
Error message:
[1064] [42000]: Failed to get remote files, msg: com.starrocks.connector.exception.StarRocksConnectorException: Failed to get hive remote file's metadata on path: RemotePathKey{path='hdfs://mycluster/user/hive/warehouse/xxx.db/xxx/pt_d=2025-09-18', isRecursive=true}. msg: java.net.UnknownHostException: mycluster
However, running SELECT * FROM hive_catalog_hms.xxx.xxx where pt_d = '2025-09-18'; by itself does return data.
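My understanding (an assumption on my part, not confirmed): `mycluster` is the logical name of an HDFS HA nameservice rather than a resolvable hostname, so the StarRocks FE can only resolve it through the HA properties in an hdfs-site.xml placed under fe/conf (and be/conf). For reference, a minimal sketch of what that nameservice definition typically looks like; the NameNode hostnames and ports below are illustrative placeholders, not my actual values:

<configuration>
  <!-- Logical nameservice name; must match the authority in hdfs://mycluster/... -->
  <property>
    <name>dfs.nameservices</name>
    <value>mycluster</value>
  </property>
  <!-- NameNode IDs belonging to this nameservice -->
  <property>
    <name>dfs.ha.namenodes.mycluster</name>
    <value>nn1,nn2</value>
  </property>
  <!-- RPC addresses of the NameNodes (placeholder hosts/ports) -->
  <property>
    <name>dfs.namenode.rpc-address.mycluster.nn1</name>
    <value>namenode1.example.com:8020</value>
  </property>
  <property>
    <name>dfs.namenode.rpc-address.mycluster.nn2</name>
    <value>namenode2.example.com:8020</value>
  </property>
  <!-- Client-side failover proxy used to pick the active NameNode -->
  <property>
    <name>dfs.client.failover.proxy.provider.mycluster</name>
    <value>org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider</value>
  </property>
</configuration>

If such a file went missing or stopped being picked up after the restart, the FE would hit exactly this UnknownHostException when listing partition files directly on HDFS.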
【Background】The cluster crashed once recently; after I restarted it, the load stopped working. It worked fine before.
【Business impact】Significant
【Shared-data (storage-compute separation)?】No
【StarRocks version】e.g., 3.1.13
【Cluster size】e.g., 3 FE (1 follower + 2 observers) + 3 BE (FE and BE co-located)
【Machine specs】vCPU/memory/NIC, e.g., 48C/64G/10GbE
【Table model】e.g., Primary Key model
【Load or export method】Internal: INSERT OVERWRITE
【Contact】446477483@qq.com
【Attachment】2025-09-19 18:38:29.927+08:00 INFO (pull-hive-remote-files-10|11259) [FileSystem.createFileSystemInternal():3730] [hadoop-ext] FileSystem.createFileSystem
2025-09-19 18:38:29.928+08:00 WARN (pull-hive-remote-files-10|11259) [FileSystem.createFileSystemInternal():3744] Failed to initialize filesystem hdfs://mycluster/user/hive/warehouse/xxx_dm.db/kkkk/pt_d=2025-09-18: java.lang.IllegalArgumentException: java.net.UnknownHostException: mycluster
2025-09-19 18:38:29.928+08:00 ERROR (pull-hive-remote-files-10|11259) [HiveRemoteFileIO.getRemoteFiles():120] Failed to get hive remote file's metadata on path: hdfs://mycluster/user/hive/warehouse/xxx_dm.db/kkkk/pt_d=2025-09-18
java.lang.IllegalArgumentException: java.net.UnknownHostException: mycluster
at org.apache.hadoop.security.SecurityUtil.buildTokenService(SecurityUtil.java:475) ~[hadoop-common-3.3.6.jar:?]
at org.apache.hadoop.hdfs.NameNodeProxiesClient.createProxyWithClientProtocol(NameNodeProxiesClient.java:134) ~[hadoop-hdfs-client-3.3.6.jar:?]
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:374) ~[hadoop-hdfs-client-3.3.6.jar:?]
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:308) ~[hadoop-hdfs-client-3.3.6.jar:?]
at org.apache.hadoop.hdfs.DistributedFileSystem.initDFSClient(DistributedFileSystem.java:204) ~[hadoop-hdfs-client-3.3.6.jar:?]
at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:189) ~[hadoop-hdfs-client-3.3.6.jar:?]
at org.apache.hadoop.fs.FileSystem.createFileSystemInternal(FileSystem.java:3740) ~[starrocks-hadoop-ext.jar:?]
at org.apache.hadoop.fs.FileSystem.lambda$createFileSystem$0(FileSystem.java:3760) ~[starrocks-hadoop-ext.jar:?]
at com.starrocks.connector.hadoop.HadoopExt.doAs(HadoopExt.java:61) ~[starrocks-hadoop-ext.jar:?]
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:3760) ~[starrocks-hadoop-ext.jar:?]
at org.apache.hadoop.fs.FileSystem.access$300(FileSystem.java:199) ~[starrocks-hadoop-ext.jar:?]
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:3859) ~[starrocks-hadoop-ext.jar:?]
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:3807) ~[starrocks-hadoop-ext.jar:?]
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:601) ~[starrocks-hadoop-ext.jar:?]
at com.starrocks.connector.hive.HiveRemoteFileIO.getRemoteFiles(HiveRemoteFileIO.java:78) ~[starrocks-fe.jar:?]
at com.starrocks.connector.hive.HiveRemoteFileIO.getRemoteFiles(HiveRemoteFileIO.java:67) ~[starrocks-fe.jar:?]
at com.starrocks.connector.CachingRemoteFileIO.loadRemoteFiles(CachingRemoteFileIO.java:87) ~[starrocks-fe.jar:?]
at com.google.common.cache.CacheLoader$FunctionToCacheLoader.load(CacheLoader.java:169) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.CacheLoader$1.load(CacheLoader.java:192) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3570) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2312) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2189) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2079) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache.get(LocalCache.java:4011) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:4034) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:5010) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$LocalLoadingCache.getUnchecked(LocalCache.java:5017) ~[spark-dpp-1.0.0.jar:?]
at com.starrocks.connector.CachingRemoteFileIO.getRemoteFiles(CachingRemoteFileIO.java:78) ~[starrocks-fe.jar:?]
at com.starrocks.connector.CachingRemoteFileIO.getRemoteFiles(CachingRemoteFileIO.java:70) ~[starrocks-fe.jar:?]
at com.starrocks.connector.CachingRemoteFileIO.loadRemoteFiles(CachingRemoteFileIO.java:87) ~[starrocks-fe.jar:?]
at com.google.common.cache.CacheLoader$FunctionToCacheLoader.load(CacheLoader.java:169) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.CacheLoader$1.load(CacheLoader.java:192) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3570) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2312) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2189) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2079) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache.get(LocalCache.java:4011) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:4034) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:5010) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$LocalLoadingCache.getUnchecked(LocalCache.java:5017) ~[spark-dpp-1.0.0.jar:?]
at com.starrocks.connector.CachingRemoteFileIO.getRemoteFiles(CachingRemoteFileIO.java:78) ~[starrocks-fe.jar:?]
at com.starrocks.connector.RemoteFileOperations.lambda$getRemoteFiles$0(RemoteFileOperations.java:86) ~[starrocks-fe.jar:?]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_212]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_212]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_212]
at java.lang.Thread.run(Thread.java:748) ~[?:1.8.0_212]
Caused by: java.net.UnknownHostException: mycluster
… 46 more
2025-09-19 18:38:29.928+08:00 ERROR (pull-hive-remote-files-10|11259) [CachingRemoteFileIO.getRemoteFiles():80] Error occurred when getting remote files from cache
com.google.common.util.concurrent.UncheckedExecutionException: com.starrocks.connector.exception.StarRocksConnectorException: Failed to get hive remote file's metadata on path: RemotePathKey{path='hdfs://mycluster/user/hive/warehouse/xxx_dm.db/kkkk/pt_d=2025-09-18', isRecursive=true}. msg: java.net.UnknownHostException: mycluster
at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2085) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache.get(LocalCache.java:4011) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:4034) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:5010) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$LocalLoadingCache.getUnchecked(LocalCache.java:5017) ~[spark-dpp-1.0.0.jar:?]
at com.starrocks.connector.CachingRemoteFileIO.getRemoteFiles(CachingRemoteFileIO.java:78) ~[starrocks-fe.jar:?]
at com.starrocks.connector.CachingRemoteFileIO.getRemoteFiles(CachingRemoteFileIO.java:70) ~[starrocks-fe.jar:?]
at com.starrocks.connector.CachingRemoteFileIO.loadRemoteFiles(CachingRemoteFileIO.java:87) ~[starrocks-fe.jar:?]
at com.google.common.cache.CacheLoader$FunctionToCacheLoader.load(CacheLoader.java:169) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.CacheLoader$1.load(CacheLoader.java:192) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3570) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2312) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2189) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2079) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache.get(LocalCache.java:4011) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:4034) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:5010) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$LocalLoadingCache.getUnchecked(LocalCache.java:5017) ~[spark-dpp-1.0.0.jar:?]
at com.starrocks.connector.CachingRemoteFileIO.getRemoteFiles(CachingRemoteFileIO.java:78) ~[starrocks-fe.jar:?]
at com.starrocks.connector.RemoteFileOperations.lambda$getRemoteFiles$0(RemoteFileOperations.java:86) ~[starrocks-fe.jar:?]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_212]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_212]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_212]
at java.lang.Thread.run(Thread.java:748) ~[?:1.8.0_212]
Caused by: com.starrocks.connector.exception.StarRocksConnectorException: Failed to get hive remote file's metadata on path: RemotePathKey{path='hdfs://mycluster/user/hive/warehouse/xxx_dm.db/kkkk/pt_d=2025-09-18', isRecursive=true}. msg: java.net.UnknownHostException: mycluster
at com.starrocks.connector.hive.HiveRemoteFileIO.getRemoteFiles(HiveRemoteFileIO.java:122) ~[starrocks-fe.jar:?]
at com.starrocks.connector.hive.HiveRemoteFileIO.getRemoteFiles(HiveRemoteFileIO.java:67) ~[starrocks-fe.jar:?]
at com.starrocks.connector.CachingRemoteFileIO.loadRemoteFiles(CachingRemoteFileIO.java:87) ~[starrocks-fe.jar:?]
at com.google.common.cache.CacheLoader$FunctionToCacheLoader.load(CacheLoader.java:169) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.CacheLoader$1.load(CacheLoader.java:192) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3570) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2312) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2189) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2079) ~[spark-dpp-1.0.0.jar:?]
… 23 more
2025-09-19 18:38:29.928+08:00 ERROR (pull-hive-remote-files-10|11259) [CachingRemoteFileIO.getRemoteFiles():80] Error occurred when getting remote files from cache
com.google.common.util.concurrent.UncheckedExecutionException: com.starrocks.connector.exception.StarRocksConnectorException: Failed to get hive remote file's metadata on path: RemotePathKey{path='hdfs://mycluster/user/hive/warehouse/xxx_dm.db/kkkk/pt_d=2025-09-18', isRecursive=true}. msg: java.net.UnknownHostException: mycluster
at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2085) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache.get(LocalCache.java:4011) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:4034) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:5010) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$LocalLoadingCache.getUnchecked(LocalCache.java:5017) ~[spark-dpp-1.0.0.jar:?]
at com.starrocks.connector.CachingRemoteFileIO.getRemoteFiles(CachingRemoteFileIO.java:78) ~[starrocks-fe.jar:?]
at com.starrocks.connector.RemoteFileOperations.lambda$getRemoteFiles$0(RemoteFileOperations.java:86) ~[starrocks-fe.jar:?]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_212]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_212]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_212]
at java.lang.Thread.run(Thread.java:748) ~[?:1.8.0_212]
Caused by: com.starrocks.connector.exception.StarRocksConnectorException: Failed to get hive remote file's metadata on path: RemotePathKey{path='hdfs://mycluster/user/hive/warehouse/xxx_dm.db/kkkk/pt_d=2025-09-18', isRecursive=true}. msg: java.net.UnknownHostException: mycluster
at com.starrocks.connector.hive.HiveRemoteFileIO.getRemoteFiles(HiveRemoteFileIO.java:122) ~[starrocks-fe.jar:?]
at com.starrocks.connector.hive.HiveRemoteFileIO.getRemoteFiles(HiveRemoteFileIO.java:67) ~[starrocks-fe.jar:?]
at com.starrocks.connector.CachingRemoteFileIO.loadRemoteFiles(CachingRemoteFileIO.java:87) ~[starrocks-fe.jar:?]
at com.google.common.cache.CacheLoader$FunctionToCacheLoader.load(CacheLoader.java:169) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.CacheLoader$1.load(CacheLoader.java:192) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3570) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2312) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2189) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2079) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache.get(LocalCache.java:4011) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:4034) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:5010) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$LocalLoadingCache.getUnchecked(LocalCache.java:5017) ~[spark-dpp-1.0.0.jar:?]
at com.starrocks.connector.CachingRemoteFileIO.getRemoteFiles(CachingRemoteFileIO.java:78) ~[starrocks-fe.jar:?]
at com.starrocks.connector.CachingRemoteFileIO.getRemoteFiles(CachingRemoteFileIO.java:70) ~[starrocks-fe.jar:?]
at com.starrocks.connector.CachingRemoteFileIO.loadRemoteFiles(CachingRemoteFileIO.java:87) ~[starrocks-fe.jar:?]
at com.google.common.cache.CacheLoader$FunctionToCacheLoader.load(CacheLoader.java:169) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.CacheLoader$1.load(CacheLoader.java:192) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3570) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2312) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2189) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2079) ~[spark-dpp-1.0.0.jar:?]
… 10 more
2025-09-19 18:38:29.929+08:00 ERROR (thrift-server-pool-9614|11258) [MetadataMgr.getRemoteFileInfos():364] Failed to list remote file's metadata on catalog [hive_catalog_hms], table [HiveTable{catalogName='hive_catalog_hms', hiveDbName='xxx_dm', hiveTableName='kkkk', resourceName='hive_catalog_hms', id=100001833, name='kkkk', type=HIVE, createTime=1731292184}]
com.starrocks.connector.exception.StarRocksConnectorException: Failed to get remote files, msg: com.starrocks.connector.exception.StarRocksConnectorException: Failed to get hive remote file's metadata on path: RemotePathKey{path='hdfs://mycluster/user/hive/warehouse/xxx_dm.db/kkkk/pt_d=2025-09-18', isRecursive=true}. msg: java.net.UnknownHostException: mycluster
at com.starrocks.connector.RemoteFileOperations.getRemoteFiles(RemoteFileOperations.java:94) ~[starrocks-fe.jar:?]
at com.starrocks.connector.RemoteFileOperations.getRemoteFiles(RemoteFileOperations.java:56) ~[starrocks-fe.jar:?]
at com.starrocks.connector.hive.HiveMetadata.getRemoteFileInfos(HiveMetadata.java:155) ~[starrocks-fe.jar:?]
at com.starrocks.server.MetadataMgr.getRemoteFileInfos(MetadataMgr.java:361) ~[starrocks-fe.jar:?]
at com.starrocks.server.MetadataMgr.getRemoteFileInfos(MetadataMgr.java:352) ~[starrocks-fe.jar:?]
at com.starrocks.catalog.HiveTable.toThrift(HiveTable.java:313) ~[starrocks-fe.jar:?]
at com.starrocks.analysis.DescriptorTable.toThrift(DescriptorTable.java:178) ~[starrocks-fe.jar:?]
at com.starrocks.qe.StmtExecutor.handleDMLStmt(StmtExecutor.java:1754) ~[starrocks-fe.jar:?]
at com.starrocks.load.InsertOverwriteJobRunner.executeInsert(InsertOverwriteJobRunner.java:322) ~[starrocks-fe.jar:?]
at com.starrocks.load.InsertOverwriteJobRunner.doLoad(InsertOverwriteJobRunner.java:161) ~[starrocks-fe.jar:?]
at com.starrocks.load.InsertOverwriteJobRunner.handle(InsertOverwriteJobRunner.java:141) ~[starrocks-fe.jar:?]
at com.starrocks.load.InsertOverwriteJobRunner.transferTo(InsertOverwriteJobRunner.java:201) ~[starrocks-fe.jar:?]
at com.starrocks.load.InsertOverwriteJobRunner.prepare(InsertOverwriteJobRunner.java:223) ~[starrocks-fe.jar:?]
at com.starrocks.load.InsertOverwriteJobRunner.handle(InsertOverwriteJobRunner.java:138) ~[starrocks-fe.jar:?]
at com.starrocks.load.InsertOverwriteJobRunner.run(InsertOverwriteJobRunner.java:126) ~[starrocks-fe.jar:?]
at com.starrocks.load.InsertOverwriteJobMgr.executeJob(InsertOverwriteJobMgr.java:87) ~[starrocks-fe.jar:?]
at com.starrocks.qe.StmtExecutor.handleInsertOverwrite(StmtExecutor.java:1578) ~[starrocks-fe.jar:?]
at com.starrocks.qe.StmtExecutor.handleDMLStmt(StmtExecutor.java:1657) ~[starrocks-fe.jar:?]
at com.starrocks.qe.StmtExecutor.handleDMLStmtWithProfile(StmtExecutor.java:1587) ~[starrocks-fe.jar:?]
at com.starrocks.qe.StmtExecutor.execute(StmtExecutor.java:621) ~[starrocks-fe.jar:?]
at com.starrocks.qe.ConnectProcessor.proxyExecute(ConnectProcessor.java:728) ~[starrocks-fe.jar:?]
at com.starrocks.service.FrontendServiceImpl.forward(FrontendServiceImpl.java:1124) ~[starrocks-fe.jar:?]
at com.starrocks.thrift.FrontendService$Processor$forward.getResult(FrontendService.java:3721) ~[starrocks-fe.jar:?]
at com.starrocks.thrift.FrontendService$Processor$forward.getResult(FrontendService.java:3701) ~[starrocks-fe.jar:?]
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:38) ~[libthrift-0.13.0.jar:0.13.0]
at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:38) ~[libthrift-0.13.0.jar:0.13.0]
at com.starrocks.common.SRTThreadPoolServer$WorkerProcess.run(SRTThreadPoolServer.java:311) ~[starrocks-fe.jar:?]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_212]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_212]
at java.lang.Thread.run(Thread.java:748) ~[?:1.8.0_212]
2025-09-19 18:38:29.929+08:00 WARN (thrift-server-pool-9614|11258) [HiveTable.toThrift():315] table kkkk gets partition info failed.
com.starrocks.connector.exception.StarRocksConnectorException: Failed to get remote files, msg: com.starrocks.connector.exception.StarRocksConnectorException: Failed to get hive remote file's metadata on path: RemotePathKey{path='hdfs://mycluster/user/hive/warehouse/xxx_dm.db/kkkk/pt_d=2025-09-18', isRecursive=true}. msg: java.net.UnknownHostException: mycluster
at com.starrocks.connector.RemoteFileOperations.getRemoteFiles(RemoteFileOperations.java:94) ~[starrocks-fe.jar:?]
at com.starrocks.connector.RemoteFileOperations.getRemoteFiles(RemoteFileOperations.java:56) ~[starrocks-fe.jar:?]
at com.starrocks.connector.hive.HiveMetadata.getRemoteFileInfos(HiveMetadata.java:155) ~[starrocks-fe.jar:?]
at com.starrocks.server.MetadataMgr.getRemoteFileInfos(MetadataMgr.java:361) ~[starrocks-fe.jar:?]
at com.starrocks.server.MetadataMgr.getRemoteFileInfos(MetadataMgr.java:352) ~[starrocks-fe.jar:?]
at com.starrocks.catalog.HiveTable.toThrift(HiveTable.java:313) ~[starrocks-fe.jar:?]
at com.starrocks.analysis.DescriptorTable.toThrift(DescriptorTable.java:178) ~[starrocks-fe.jar:?]
at com.starrocks.qe.StmtExecutor.handleDMLStmt(StmtExecutor.java:1754) ~[starrocks-fe.jar:?]
at com.starrocks.load.InsertOverwriteJobRunner.executeInsert(InsertOverwriteJobRunner.java:322) ~[starrocks-fe.jar:?]
at com.starrocks.load.InsertOverwriteJobRunner.doLoad(InsertOverwriteJobRunner.java:161) ~[starrocks-fe.jar:?]
at com.starrocks.load.InsertOverwriteJobRunner.handle(InsertOverwriteJobRunner.java:141) ~[starrocks-fe.jar:?]
at com.starrocks.load.InsertOverwriteJobRunner.transferTo(InsertOverwriteJobRunner.java:201) ~[starrocks-fe.jar:?]
at com.starrocks.load.InsertOverwriteJobRunner.prepare(InsertOverwriteJobRunner.java:223) ~[starrocks-fe.jar:?]
at com.starrocks.load.InsertOverwriteJobRunner.handle(InsertOverwriteJobRunner.java:138) ~[starrocks-fe.jar:?]
at com.starrocks.load.InsertOverwriteJobRunner.run(InsertOverwriteJobRunner.java:126) ~[starrocks-fe.jar:?]
at com.starrocks.load.InsertOverwriteJobMgr.executeJob(InsertOverwriteJobMgr.java:87) ~[starrocks-fe.jar:?]
at com.starrocks.qe.StmtExecutor.handleInsertOverwrite(StmtExecutor.java:1578) ~[starrocks-fe.jar:?]
at com.starrocks.qe.StmtExecutor.handleDMLStmt(StmtExecutor.java:1657) ~[starrocks-fe.jar:?]
at com.starrocks.qe.StmtExecutor.handleDMLStmtWithProfile(StmtExecutor.java:1587) ~[starrocks-fe.jar:?]
at com.starrocks.qe.StmtExecutor.execute(StmtExecutor.java:621) ~[starrocks-fe.jar:?]
at com.starrocks.qe.ConnectProcessor.proxyExecute(ConnectProcessor.java:728) ~[starrocks-fe.jar:?]
at com.starrocks.service.FrontendServiceImpl.forward(FrontendServiceImpl.java:1124) ~[starrocks-fe.jar:?]
at com.starrocks.thrift.FrontendService$Processor$forward.getResult(FrontendService.java:3721) ~[starrocks-fe.jar:?]
at com.starrocks.thrift.FrontendService$Processor$forward.getResult(FrontendService.java:3701) ~[starrocks-fe.jar:?]
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:38) ~[libthrift-0.13.0.jar:0.13.0]
at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:38) ~[libthrift-0.13.0.jar:0.13.0]
at com.starrocks.common.SRTThreadPoolServer$WorkerProcess.run(SRTThreadPoolServer.java:311) ~[starrocks-fe.jar:?]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_212]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_212]
at java.lang.Thread.run(Thread.java:748) ~[?:1.8.0_212]
2025-09-19 18:38:29.930+08:00 INFO (thrift-server-pool-9614|11258) [QeProcessorImpl.registerQuery():103] register query id = c846996c-9544-11f0-8a6b-00163e121b02
2025-09-19 18:38:29.930+08:00 INFO (pull-hive-remote-files-11|11260) [FileSystem.createFileSystemInternal():3730] [hadoop-ext] FileSystem.createFileSystem
2025-09-19 18:38:29.930+08:00 WARN (pull-hive-remote-files-11|11260) [FileSystem.createFileSystemInternal():3744] Failed to initialize filesystem hdfs://mycluster/user/hive/warehouse/xxx_dm.db/kkkk/pt_d=2025-09-18: java.lang.IllegalArgumentException: java.net.UnknownHostException: mycluster
2025-09-19 18:38:29.930+08:00 ERROR (pull-hive-remote-files-11|11260) [HiveRemoteFileIO.getRemoteFiles():120] Failed to get hive remote file's metadata on path: hdfs://mycluster/user/hive/warehouse/xxx_dm.db/kkkk/pt_d=2025-09-18
java.lang.IllegalArgumentException: java.net.UnknownHostException: mycluster
at org.apache.hadoop.security.SecurityUtil.buildTokenService(SecurityUtil.java:475) ~[hadoop-common-3.3.6.jar:?]
at org.apache.hadoop.hdfs.NameNodeProxiesClient.createProxyWithClientProtocol(NameNodeProxiesClient.java:134) ~[hadoop-hdfs-client-3.3.6.jar:?]
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:374) ~[hadoop-hdfs-client-3.3.6.jar:?]
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:308) ~[hadoop-hdfs-client-3.3.6.jar:?]
at org.apache.hadoop.hdfs.DistributedFileSystem.initDFSClient(DistributedFileSystem.java:204) ~[hadoop-hdfs-client-3.3.6.jar:?]
at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:189) ~[hadoop-hdfs-client-3.3.6.jar:?]
at org.apache.hadoop.fs.FileSystem.createFileSystemInternal(FileSystem.java:3740) ~[starrocks-hadoop-ext.jar:?]
at org.apache.hadoop.fs.FileSystem.lambda$createFileSystem$0(FileSystem.java:3760) ~[starrocks-hadoop-ext.jar:?]
at com.starrocks.connector.hadoop.HadoopExt.doAs(HadoopExt.java:61) ~[starrocks-hadoop-ext.jar:?]
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:3760) ~[starrocks-hadoop-ext.jar:?]
at org.apache.hadoop.fs.FileSystem.access$300(FileSystem.java:199) ~[starrocks-hadoop-ext.jar:?]
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:3859) ~[starrocks-hadoop-ext.jar:?]
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:3807) ~[starrocks-hadoop-ext.jar:?]
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:601) ~[starrocks-hadoop-ext.jar:?]
at com.starrocks.connector.hive.HiveRemoteFileIO.getRemoteFiles(HiveRemoteFileIO.java:78) ~[starrocks-fe.jar:?]
at com.starrocks.connector.hive.HiveRemoteFileIO.getRemoteFiles(HiveRemoteFileIO.java:67) ~[starrocks-fe.jar:?]
at com.starrocks.connector.CachingRemoteFileIO.loadRemoteFiles(CachingRemoteFileIO.java:87) ~[starrocks-fe.jar:?]
at com.google.common.cache.CacheLoader$FunctionToCacheLoader.load(CacheLoader.java:169) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.CacheLoader$1.load(CacheLoader.java:192) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3570) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2312) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2189) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2079) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache.get(LocalCache.java:4011) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:4034) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:5010) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$LocalLoadingCache.getUnchecked(LocalCache.java:5017) ~[spark-dpp-1.0.0.jar:?]
at com.starrocks.connector.CachingRemoteFileIO.getRemoteFiles(CachingRemoteFileIO.java:78) ~[starrocks-fe.jar:?]
at com.starrocks.connector.CachingRemoteFileIO.getRemoteFiles(CachingRemoteFileIO.java:70) ~[starrocks-fe.jar:?]
at com.starrocks.connector.CachingRemoteFileIO.loadRemoteFiles(CachingRemoteFileIO.java:87) ~[starrocks-fe.jar:?]
at com.google.common.cache.CacheLoader$FunctionToCacheLoader.load(CacheLoader.java:169) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.CacheLoader$1.load(CacheLoader.java:192) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3570) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2312) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2189) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2079) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache.get(LocalCache.java:4011) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:4034) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:5010) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$LocalLoadingCache.getUnchecked(LocalCache.java:5017) ~[spark-dpp-1.0.0.jar:?]
at com.starrocks.connector.CachingRemoteFileIO.getRemoteFiles(CachingRemoteFileIO.java:78) ~[starrocks-fe.jar:?]
at com.starrocks.connector.RemoteFileOperations.lambda$getRemoteFiles$0(RemoteFileOperations.java:86) ~[starrocks-fe.jar:?]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_212]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_212]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_212]
at java.lang.Thread.run(Thread.java:748) ~[?:1.8.0_212]
Caused by: java.net.UnknownHostException: mycluster
… 46 more
2025-09-19 18:38:29.931+08:00 ERROR (pull-hive-remote-files-11|11260) [CachingRemoteFileIO.getRemoteFiles():80] Error occurred when getting remote files from cache
com.google.common.util.concurrent.UncheckedExecutionException: com.starrocks.connector.exception.StarRocksConnectorException: Failed to get hive remote file's metadata on path: RemotePathKey{path='hdfs://mycluster/user/hive/warehouse/xxx_dm.db/kkkk/pt_d=2025-09-18', isRecursive=true}. msg: java.net.UnknownHostException: mycluster
at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2085) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache.get(LocalCache.java:4011) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:4034) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:5010) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$LocalLoadingCache.getUnchecked(LocalCache.java:5017) ~[spark-dpp-1.0.0.jar:?]
at com.starrocks.connector.CachingRemoteFileIO.getRemoteFiles(CachingRemoteFileIO.java:78) ~[starrocks-fe.jar:?]
at com.starrocks.connector.CachingRemoteFileIO.getRemoteFiles(CachingRemoteFileIO.java:70) ~[starrocks-fe.jar:?]
at com.starrocks.connector.CachingRemoteFileIO.loadRemoteFiles(CachingRemoteFileIO.java:87) ~[starrocks-fe.jar:?]
at com.google.common.cache.CacheLoader$FunctionToCacheLoader.load(CacheLoader.java:169) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.CacheLoader$1.load(CacheLoader.java:192) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3570) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2312) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2189) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2079) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache.get(LocalCache.java:4011) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:4034) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:5010) ~[spark-dpp-1.0.0.jar:?]
at com.google.common.cache.LocalCache$LocalLoadingCache.getUnchecked(LocalCache.java:5017) ~[spark-dpp-1.0.0.jar:?]
at com.starrocks.connector.CachingRemoteFileIO.getRemoteFiles(CachingRemoteFileIO.java:78) ~[starrocks-fe.jar:?]
at com.starrocks.connector.RemoteFileOperations.lambda$getRemoteFiles$0(RemoteFileOperations.java:86) ~[starrocks-fe.jar:?]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_212]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_212]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_212]
at java.lang.Thread.run(Thread.java:748) ~[?:1.8.0_212]