To help us locate your issue faster, please provide the following information. Thanks!
【Details】
StarRocks itself works normally. A Hive catalog was created in StarRocks:
CREATE EXTERNAL CATALOG hive_catalog
PROPERTIES(
"type" = "hive",
"hive.metastore.uris" = "thrift://cdp02.:9083",
"hadoop.security.authentication" = "kerberos",
"hadoop.kerberos.keytab" = "/home/bsmp/portal/.keytab/hive.keytab",
"hadoop.kerberos.principal" = "hive/admin@AOTAIN.COM"
);
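Since the error below is a GSSException ("Failed to find any Kerberos tgt"), it may help to first confirm, on every FE and BE node, that the keytab referenced by the catalog actually yields a ticket. A hedged diagnostic sketch (the keytab path and principal are taken from the catalog properties above; run as the OS user that starts the FE/BE processes):

```shell
# List the principals stored in the keytab -- the principal configured in
# the catalog ("hive/admin@AOTAIN.COM") must appear in this list.
klist -kt /home/bsmp/portal/.keytab/hive.keytab

# Try to obtain a TGT with that keytab on every FE and BE node.
kinit -kt /home/bsmp/portal/.keytab/hive.keytab hive/admin@AOTAIN.COM

# Confirm a valid ticket now exists in the credential cache.
klist
```

If `kinit` fails here, the problem is in the Kerberos setup (keytab, principal, or krb5.conf) rather than in StarRocks itself.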
【Background】core-site.xml, hdfs-site.xml, hive-site.xml, and krb5.conf were added to both the FE and BE; `export HADOOP_USER_NAME=hive` (no spaces around `=`) was added to the fe/conf/fe.conf file.
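For a Kerberos-enabled Hive metastore, the StarRocks documentation also suggests pointing the JVM of every FE and BE at the krb5 configuration. A sketch of the relevant config lines, assuming krb5.conf lives at /etc/krb5.conf:

```shell
# fe/conf/fe.conf (each FE node) -- shell-style assignment, no spaces around "="
JAVA_OPTS="$JAVA_OPTS -Djava.security.krb5.conf=/etc/krb5.conf"
export HADOOP_USER_NAME=hive

# be/conf/be.conf (each BE node) -- the BE also opens HDFS files directly,
# as the hdfsOpenFile error below shows, so it needs the same setting.
JAVA_OPTS="$JAVA_OPTS -Djava.security.krb5.conf=/etc/krb5.conf"
```

Both FE and BE processes must be restarted after changing these files.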
【Business impact】
【StarRocks version】3.1.11
【Machine info】
【Contact】email: z13672289825@163.com
【Attachments】
Error message: querying any table fails with this error:
mysql> select * from test_with_gzip_compression limit 1;
ERROR 1064 (HY000): hdfsOpenFile failed, file=hdfs://nameservice1/user/hive/warehouse/starrocks_test20260119_db.db/test_with_gzip_compression/000000_0. err_msg: error=Error(255): Unknown error 255, root_cause=GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt): file = hdfs://nameservice1/user/hive/warehouse/starrocks_test20260119_db.db/test_with_gzip_compression/000000_0
mysql>