Hive external table queries fail after creation

[Details] Queries against a newly created Hive external table fail. Not sure whether this was caused by the version upgrade: many external tables created on 2.2 can no longer be queried after upgrading to 2.3.4.
[Business impact]
[StarRocks version] 2.3.4
CREATE EXTERNAL TABLE hdop.ext_test (
    user_id string,
    year_no string
) ENGINE=HIVE
PROPERTIES (
    "resource" = "hive_qdcdh",
    "database" = "di_yh_appop",
    "table" = "dwd_shop_user_register"
);

select * from hdop.ext_test;

This fails with the following error:
SQL 错误 [1064] [42000]: hdfsOpenFile failed, file=hdfs://nameservice1/user/hive/warehouse/di_yh_appop.db/dwd_shop_user_register/year_no=2021/000000_0:file = hdfs://nameservice1/user/hive/warehouse/di_yh_appop.db/dwd_shop_user_register/year_no=2021/000000_0

Could you run a refresh first and then try the query again? The same table was queryable on 2.2 but now fails on 2.3.4? Please post the error from the be.out log.
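A minimal sketch of that refresh step, using the table from the original post. REFRESH EXTERNAL TABLE re-syncs the Hive metadata StarRocks caches for an external table; the partition value below is only inferred from the path in the error message:

REFRESH EXTERNAL TABLE hdop.ext_test;
-- or, if only some partitions are stale, refresh just the one from the error:
REFRESH EXTERNAL TABLE hdop.ext_test PARTITION ('year_no=2021');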

Also, please paste the be.out log.

Same problem on 2.3.4. On 2.2 these queries worked fine; after the upgrade the Hive external tables became unusable.
I manually queried all of the external tables and every one returns the same error, so the Hive external table feature is effectively broken.

klist on every node shows the tickets are still within their validity period.
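For reference, a rough sketch of that per-node check; the principal and keytab path are placeholders, not values from this thread:

# Show the cached ticket and its expiry time
klist
# Re-obtain a ticket from the keytab if it is near expiry
# (hypothetical principal and keytab path)
kinit -kt /etc/security/keytabs/starrocks.keytab starrocks@EXAMPLE.COM
klist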

be.out log:

hdfsOpenFile(hdfs://slcluster01/hive_warehouse/sl_dm.db/sl_dm_promt_order_attribution_sct_temp/dt=2022-10-17/part-00003-9152b64e-a0e3-48b2-8c16-a057416859d6-c000): FileSystem#open((Lorg/apache/hadoop/fs/Path;I)Lorg/apache/hadoop/fs/FSDataInputStream;) error:
AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]java.io.IOException: DestHost:destPort metastore-adress:8020 , LocalHost:localPort centos/10.0.0.1:0. Failed on local exception: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
    at sun.reflect.GeneratedConstructorAccessor14.newInstance(Unknown Source)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:837)
    at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:812)
    at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1566)
    at org.apache.hadoop.ipc.Client.call(Client.java:1508)
    at org.apache.hadoop.ipc.Client.call(Client.java:1405)
    at org.apache.hadoop.ipc.ProtobufRpcEngine2$Invoker.invoke(ProtobufRpcEngine2.java:234)
    at org.apache.hadoop.ipc.ProtobufRpcEngine2$Invoker.invoke(ProtobufRpcEngine2.java:119)
    at com.sun.proxy.$Proxy9.getBlockLocations(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getBlockLocations(ClientNamenodeProtocolTranslatorPB.java:333)
    at sun.reflect.GeneratedMethodAccessor15.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:422)
    at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:165)
    at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:157)
    at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:359)
    at com.sun.proxy.$Proxy10.getBlockLocations(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient.callGetBlockLocations(DFSClient.java:892)
    at org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:881)
    at org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:870)
    at org.apache.hadoop.hdfs.DFSClient.open(DFSClient.java:1038)
    at org.apache.hadoop.hdfs.DistributedFileSystem$4.doCall(DistributedFileSystem.java:333)
    at org.apache.hadoop.hdfs.DistributedFileSystem$4.doCall(DistributedFileSystem.java:329)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.open(DistributedFileSystem.java:346)
Caused by: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
    at org.apache.hadoop.ipc.Client$Connection$1.run(Client.java:778)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1845)
    at org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:732)
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:835)
    at org.apache.hadoop.ipc.Client$Connection.access$3800(Client.java:413)
    at org.apache.hadoop.ipc.Client.getConnection(Client.java:1636)
    at org.apache.hadoop.ipc.Client.call(Client.java:1452)
    ... 22 more
Caused by: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
    at org.apache.hadoop.security.SaslRpcClient.selectSaslClient(SaslRpcClient.java:179)
    at org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:392)
    at org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:622)

I deployed the CN nodes on k8s, version 2.4.0, and hit the same problem:
1. The keytab is already placed in the k8s CN pod and is refreshed on a schedule (and kinit was run before the cn process started); see the sketch after this list.
2. core-site.xml and hdfs-site.xml have been placed under $BE_HOME/conf.
3. If the k8s-deployed CN is removed, the cluster can query external tables; once it is added back the queries fail, and the error in cn.out inside the pod is identical to the OP's.
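For completeness, a minimal sketch of what steps 1 and 2 amount to inside the CN pod. The keytab path, principal, realm, and cron schedule are assumptions for illustration, not values from this thread:

# Step 1: obtain a ticket from the keytab before the cn process starts
kinit -kt /opt/starrocks/keytab/starrocks.keytab starrocks@EXAMPLE.COM
# keep the ticket fresh, e.g. via a cron entry inside the pod:
#   0 */8 * * * kinit -kt /opt/starrocks/keytab/starrocks.keytab starrocks@EXAMPLE.COM

# Step 2: the Hadoop client configs must be readable by the cn process
cp core-site.xml hdfs-site.xml "$BE_HOME/conf/"
# core-site.xml should set hadoop.security.authentication=kerberos; without it
# the client falls back to SIMPLE auth and fails with exactly the
# "Client cannot authenticate via:[TOKEN, KERBEROS]" error shown above.

One thing worth double-checking in this setup is whether the cn process can actually see the ticket cache: a kinit run as a different user or in a different container context produces a cache the JVM inside the cn process never reads.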