StarRocks 3.3.0: creating a Hive catalog with Kerberos authentication fails

To help us locate your issue faster, please provide the following information. Thanks!
【Details】Creating the Hive catalog succeeds, but show databases fails with a Kerberos authentication error
【Background】What operations were performed?
【Business impact】
【Storage-compute separated?】
【StarRocks version】3.3.0
【Java version】jdk-11.0.21
【Cluster size】1 FE (1 follower + 2 observers) + 5 BEs
【Machine specs】vCPU / memory / NIC: 48C / 64 GB / 10 GbE
【Contact】648380139@qq.com
【Attachments】

  • fe.conf
    JAVA_OPTS="-Dsun.security.krb5.debug=true -Djava.security.krb5.conf=/opt/StarRocks/krb5.conf -Djavax.security.auth.useSubjectCredsOnly=false -Dlog4j2.formatMsgNoLookups=true -Xmx125536m "

  • be.conf
    JAVA_OPTS="-Xmx1024m -Djava.security.krb5.conf=/opt/StarRocks/krb5.conf -Djavax.security.auth.useSubjectCredsOnly=false"

  • Catalog statement:
    CREATE EXTERNAL CATALOG bigdata_hive_test
    PROPERTIES (
    "type" = "hive",
    "hive.metastore.type" = "hive",
    "hive.metastore.uris" = "thrift://hivenode1:9083",
    "hadoop.security.authentication" = "kerberos",
    "hadoop.kerberos.keytab" = "/opt/StarRocks/be/conf/hive.keytab"
    );

  • fe.log
    2024-08-16 17:25:48.281+08:00 INFO (starrocks-mysql-nio-pool-3|1242) [HiveMetaStoreClient.openInternal():461] Trying to connect to metastore with URI thrift://hivenode1:9083
    2024-08-16 17:25:48.282+08:00 INFO (starrocks-mysql-nio-pool-3|1242) [HiveMetaStoreClient.openInternal():513] HMSC::open(): Could not find delegation token. Creating KERBEROS-based thrift connection.
    2024-08-16 17:25:48.285+08:00 ERROR (starrocks-mysql-nio-pool-3|1242) [TSaslTransport.open():271] SASL negotiation failure
    javax.security.sasl.SaslException: GSS initiate failed
    at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:222) ~[jdk.security.jgss:?]
    at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94) ~[libthrift-0.20.0.jar:0.20.0]
    at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:231) ~[libthrift-0.20.0.jar:0.20.0]
    at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37) ~[libthrift-0.20.0.jar:0.20.0]
    at org.apache.hadoop.hive.metastore.security.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:51) ~[hive-apache-3.1.2-22.jar:?]
    at org.apache.hadoop.hive.metastore.security.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:48) ~[hive-apache-3.1.2-22.jar:?]
    at java.security.AccessController.doPrivileged(Native Method) ~[?:?]
    at javax.security.auth.Subject.doAs(Subject.java:423) ~[?:?]
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1953) ~[hadoop-common-3.4.0.jar:?]
    at org.apache.hadoop.hive.metastore.security.TUGIAssumingTransport.open(TUGIAssumingTransport.java:48) ~[hive-apache-3.1.2-22.jar:?]
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.openInternal(HiveMetaStoreClient.java:540) ~[starrocks-fe.jar:?]
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.lambda$open$1(HiveMetaStoreClient.java:444) ~[starrocks-fe.jar:?]

  • Client log

MySQL [(none)]> show databases;
ERROR 1064 (HY000): Failed to getAllDatabases, msg: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient

  • Remarks
    Ports and network connectivity are both fine (a quick verification sketch follows below).
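
Before digging further into Kerberos itself, the basics can be sanity-checked from every FE/BE node with standard tools. A minimal sketch (host, port, and keytab path are taken from this post; nc and klist must be available on the nodes):

    # Confirm the Hive metastore port is reachable
    nc -vz hivenode1 9083

    # List the principals stored in the keytab referenced by the catalog
    klist -kt /opt/StarRocks/be/conf/hive.keytab

    # Confirm the KDC actually accepts a login with that keytab
    KRB5_CONFIG=/opt/StarRocks/krb5.conf kinit -kt /opt/StarRocks/be/conf/hive.keytab hive
    klist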

1. Run kinit periodically on every FE and BE node
export KRB5_CONFIG=/opt/StarRocks/krb5.conf
kinit -kt /opt/StarRocks/hive.keytab hive
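
A minimal sketch of wiring this up with cron on each node (the file name and the 6-hour interval are assumptions; pick an interval comfortably shorter than the ticket lifetime configured in krb5.conf):

    # /etc/cron.d/starrocks-kinit (hypothetical file name)
    # Renew the ticket cache every 6 hours so the FE/BE processes always hold a valid TGT
    0 */6 * * * root KRB5_CONFIG=/opt/StarRocks/krb5.conf /usr/bin/kinit -kt /opt/StarRocks/hive.keytab hive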

2. Create the catalog
CREATE EXTERNAL CATALOG bigdata_hive
PROPERTIES ("hive.metastore.type" = "hive",
"hadoop.security.authentication" = "kerberos",
"hadoop.kerberos.keytab" = "/opt/doris/be/conf/hive.keytab",
"hive.metastore.uris" = "thrift://hivenode1:9083",
"type" = "hive"
);

3. Run show databases

('root'@10.192.12.51) 13:11:04 [dwd]> set catalog bigdata_hive;
('root'@10.192.12.51) 13:11:14 [test]> show databases;
+--------------------+
| Database           |
+--------------------+
| test               |
| tmp                |
| tpcds_text_3       |
| ywfx               |
+--------------------+
41 rows in set (0.18 sec)

4. Query

('root'@10.192.12.51) 13:11:24 [test]> select * from test1;
ERROR 1064 (HY000): HdfsOrcScanner::do_open failed. reason = Failed to read hdfs://nameservice1/user/hive/warehouse/test.db/test1/dt=2024-05-01/000000_0: Internal error: fail to connect hdfs namenode, namenode=hdfs://nameservice1/, err=error=Error(255): Unknown error 255, root_cause=KerberosName.NoMatchingRule: No rules applied to hive@CDH.COM
be/src/fs/hdfs/hdfs_fs_cache.cpp:121 create_hdfs_fs_handle(namenode, hdfs_client, options)
be/src/fs/hdfs/fs_hdfs.cpp:45 HdfsFsCache::instance()->get_connection(namenode, _hdfs_cl

5. BE log

hdfsBuilderConnect(forceNewInstance=1, nn=hdfs://nameservice1/, port=0, kerbTicketCachePath=(NULL), userName=(NULL)) error:
KerberosName.NoMatchingRule: No rules applied to hive@CDH.COM
org.apache.hadoop.security.KerberosAuthException: failure to login: javax.security.auth.login.LoginException: java.lang.IllegalArgumentException: Illegal principal name hive@CDH.COM: org.apache.hadoop.security.authentication.util.KerberosName$NoMatchingRule: No rules applied to hive@CDH.COM
at org.apache.hadoop.security.UserGroupInformation.doSubjectLogin(UserGroupInformation.java:2064)
at org.apache.hadoop.security.UserGroupInformation.createLoginUser(UserGroupInformation.java:733)
at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:683)
at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:590)
at org.apache.hadoop.security.UserGroupInformation.getBestUGI(UserGroupInformation.java:611)
at org.apache.hadoop.fs.FileSystem.newInstance(FileSystem.java:621)
Caused by: javax.security.auth.login.LoginException: java.lang.IllegalArgumentException: Illegal principal name hive@CDH.COM: org.apache.hadoop.security.authentication.util.KerberosName$NoMatchingRule: No rules applied to hive@CDH.COM
at org.apache.hadoop.security.UserGroupInformation$HadoopLoginModule.commit(UserGroupInformation.java:227)
at java.base/javax.security.auth.login.LoginContext.invoke(LoginContext.java:750)
at java.base/javax.security.auth.login.LoginContext$4.run(LoginContext.java:672)
at java.base/javax.security.auth.login.LoginContext$4.run(LoginContext.java:670)
at java.base/java.security.AccessController.doPrivileged(Native Method)
at java.base/javax.security.auth.login.LoginContext.invokePriv(LoginContext.java:670)
at java.base/javax.security.auth.login.LoginContext.login(LoginContext.java:582)
at org.apache.hadoop.security.UserGroupInformation$HadoopLoginContext.login(UserGroupInformation.java:2148)
at org.apache.hadoop.security.UserGroupInformation.doSubjectLogin(UserGroupInformation.java:2053)
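
The KerberosName$NoMatchingRule error means Hadoop's auth_to_local mapping has no rule that turns hive@CDH.COM into a short local name; this typically happens when CDH.COM is not the default_realm in the krb5.conf the FE/BE loads, so the built-in DEFAULT rule does not apply. A sketch of one common fix (paths are assumptions; the property must sit inside the <configuration> element of the core-site.xml that FE and BE pick up, e.g. under fe/conf and be/conf, followed by a restart):

    <!-- Sketch only: map 1- and 2-component principals in realm CDH.COM
         to their first component (hive@CDH.COM -> hive), then fall back
         to the built-in DEFAULT rule. -->
    <property>
      <name>hadoop.security.auth_to_local</name>
      <value>
        RULE:[1:$1@$0](.*@CDH\.COM)s/@.*//
        RULE:[2:$1@$0](.*@CDH\.COM)s/@.*//
        DEFAULT
      </value>
    </property>

Alternatively, if CDH.COM is in fact the default realm of this cluster, setting default_realm = CDH.COM in /opt/StarRocks/krb5.conf on every node achieves the same mapping through the DEFAULT rule, without any custom rules.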