StarRocks 3.2.3: creating a Hive catalog with Kerberos authentication fails

To help us pinpoint your issue faster, please provide the following information. Thanks.
[Details] We have a 4-node Hadoop cluster (s1, s2, s3, s4), with Kerberos installed on s1. StarRocks 3.2.3 is now deployed on s5-s7. We followed the official docs to set up Kerberos authentication. In the DBeaver client we can create a Hive catalog successfully, but we cannot `use` any of the databases in Hive.

The user set in the FE's hadoop_env.sh:
export HADOOP_USER_NAME="hive"

klist output on s5:
[root@10 ~]# klist
Ticket cache: FILE:/tmp/krb5cc_0
Default principal: hive/bigdata-1@ZETAPROD.COM

Valid starting Expires Service principal
03/13/2024 20:31:50 03/14/2024 20:31:50 krbtgt/ZETAPROD.COM@ZETAPROD.COM
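One thing worth checking here: the catalog hive_padmin6 below authenticates as padmin@ZETAPROD.COM but points at hive.service.keytab, and a service keytab typically holds only hive/... principals; authentication fails if the configured principal is not in the keytab. A minimal, hypothetical Python sketch that extracts principals from `klist -kt` output (the sample output is hard-coded for illustration; in practice you would capture `klist -kt /etc/security/keytabs/hive.service.keytab`):

```python
# Sketch: list the principals stored in a keytab from `klist -kt` output,
# then check whether the principal used in the catalog DDL is among them.
# The sample output below is hard-coded for illustration only.

def keytab_principals(klist_output: str) -> set:
    """Extract principal names from `klist -kt` output."""
    principals = set()
    for line in klist_output.splitlines():
        parts = line.split()
        # Entry lines look like: "   1 03/13/2024 20:31:50 hive/bigdata-1@REALM"
        if len(parts) >= 4 and parts[0].isdigit() and "@" in parts[-1]:
            principals.add(parts[-1])
    return principals

sample = """\
Keytab name: FILE:/etc/security/keytabs/hive.service.keytab
KVNO Timestamp           Principal
---- ------------------- ------------------------------------------------------
   1 03/13/2024 20:31:50 hive/bigdata-1@ZETAPROD.COM
"""

print("hive/bigdata-1@ZETAPROD.COM" in keytab_principals(sample))  # → True
print("padmin@ZETAPROD.COM" in keytab_principals(sample))          # → False
```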

Statements used to create the Hive catalog:
CREATE EXTERNAL CATALOG hive_padmin5
PROPERTIES (
    "hive.metastore.type" = "hive",
    "hive.metastore.uris" = "thrift://10.153.11.71:10000",
    "type" = "hive"
)

CREATE EXTERNAL CATALOG hive_padmin6
PROPERTIES (
    "hive.metastore.type" = "hive",
    "hadoop.security.authentication" = "kerberos",
    "hadoop.kerberos.keytab" = "/etc/security/keytabs/hive.service.keytab",
    "hadoop.kerberos.principal" = "padmin@ZETAPROD.COM",
    "hive.metastore.uris" = "thrift://bigdata-2:9083",
    "type" = "hive"
)


Both statements execute without error.
But `use catalog.db` fails, and running `SET CATALOG` first and then `USE db` fails as well.


Error message:
org.jkiss.dbeaver.model.sql.DBSQLException: SQL Error [1064] [42000]: Unknown database 'ads'
at org.jkiss.dbeaver.model.impl.jdbc.exec.JDBCStatementImpl.executeStatement(JDBCStatementImpl.java:133)
at org.jkiss.dbeaver.ui.editors.sql.execute.SQLQueryJob.executeStatement(SQLQueryJob.java:614)
at org.jkiss.dbeaver.ui.editors.sql.execute.SQLQueryJob.lambda$2(SQLQueryJob.java:505)
at org.jkiss.dbeaver.model.exec.DBExecUtils.tryExecuteRecover(DBExecUtils.java:190)
at org.jkiss.dbeaver.ui.editors.sql.execute.SQLQueryJob.executeSingleQuery(SQLQueryJob.java:524)
at org.jkiss.dbeaver.ui.editors.sql.execute.SQLQueryJob.extractData(SQLQueryJob.java:976)
at org.jkiss.dbeaver.ui.editors.sql.SQLEditor$QueryResultsContainer.readData(SQLEditor.java:4133)
at org.jkiss.dbeaver.ui.controls.resultset.ResultSetJobDataRead.lambda$0(ResultSetJobDataRead.java:123)
at org.jkiss.dbeaver.model.exec.DBExecUtils.tryExecuteRecover(DBExecUtils.java:190)
at org.jkiss.dbeaver.ui.controls.resultset.ResultSetJobDataRead.run(ResultSetJobDataRead.java:121)
at org.jkiss.dbeaver.ui.controls.resultset.ResultSetViewer$ResultSetDataPumpJob.run(ResultSetViewer.java:5148)
at org.jkiss.dbeaver.model.runtime.AbstractJob.run(AbstractJob.java:114)
at org.eclipse.core.internal.jobs.Worker.run(Worker.java:63)
Caused by: java.sql.SQLSyntaxErrorException: Unknown database 'ads'
at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:121)
at com.mysql.cj.jdbc.exceptions.SQLExceptionsMapping.translateException(SQLExceptionsMapping.java:122)
at com.mysql.cj.jdbc.StatementImpl.executeInternal(StatementImpl.java:770)
at com.mysql.cj.jdbc.StatementImpl.execute(StatementImpl.java:653)
at org.jkiss.dbeaver.model.impl.jdbc.exec.JDBCStatementImpl.execute(JDBCStatementImpl.java:330)
at org.jkiss.dbeaver.model.impl.jdbc.exec.JDBCStatementImpl.executeStatement(JDBCStatementImpl.java:131)
… 12 more


I also tried using Broker Load to read from HDFS. The statement executes normally, but no data shows up in the StarRocks table.
LOAD LABEL ads.label1
(
    DATA INFILE("hdfs://10.150.18.70:50070/apps/hive/warehouse/ads.db/t1/000000_0")
    INTO TABLE t1
)
WITH BROKER my_hdfs_broker
(
    "hadoop.security.authentication" = "kerberos",
    "kerberos_principal" = "hive/bigdata-1@ZETAPROD.COM",
    "kerberos_keytab" = "/etc/security/keytabs/hive.service.keytab"
)

show load where label like 'label'

errMsg:type:ETL_RUN_FAIL; msg:Unknown broker name(my_hdfs_broker)

LOAD LABEL ads.label2
(
    DATA INFILE("hdfs://10.150.18.70:50070/apps/hive/warehouse/ads.db/t1/000000_0")
    INTO TABLE t1
)
WITH BROKER
(
    "hadoop.security.authentication" = "kerberos",
    "kerberos_principal" = "hive/bigdata-1@ZETAPROD.COM",
    "kerberos_keytab" = "/etc/security/keytabs/hive.service.keytab"
)

Error: type:ETL_RUN_FAIL; msg:invalid load_properties, kerberos should be set in hdfs/core-site.xml for load without broker. For broker load with broker, you can set namenode HA in the load_properties
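Reading the two error messages together: the first load fails simply because no broker named my_hdfs_broker is registered (SHOW BROKER; lists the available names), and the broker-free second form expects the Kerberos settings to live in the hdfs-site.xml/core-site.xml files deployed with StarRocks rather than in the load properties. Note also that hdfs:// URIs should use the NameNode RPC port (commonly 8020); 50070 is the NameNode HTTP port. A hedged sketch of the broker variant (the broker name and RPC port are assumptions to verify against your cluster):

```sql
-- Sketch only: replace <broker_name> with a name returned by SHOW BROKER;
-- 8020 is the usual NameNode RPC port (50070 is the HTTP port).
LOAD LABEL ads.label3
(
    DATA INFILE("hdfs://10.150.18.70:8020/apps/hive/warehouse/ads.db/t1/000000_0")
    INTO TABLE t1
)
WITH BROKER <broker_name>
(
    "hadoop.security.authentication" = "kerberos",
    "kerberos_principal" = "hive/bigdata-1@ZETAPROD.COM",
    "kerberos_keytab" = "/etc/security/keytabs/hive.service.keytab"
);
```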


Questions for the experts here:

Is the user I set in hadoop_env.sh correct?

How can I verify that I am genuinely authenticated against the Kerberos server?

Are my CREATE EXTERNAL CATALOG statements correct?

Most directly: how can I connect to Hive or HDFS and actually read the data there?
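Before digging further into Kerberos, it may help to rule out plain connectivity problems from the FE/BE nodes to the metastore and the NameNode RPC port. A minimal Python sketch (host names and ports are this thread's values and common defaults, not verified):

```python
# Sketch: quick TCP reachability check for the services this thread touches --
# the Hive metastore (thrift, usually 9083) and the NameNode RPC port
# (usually 8020; 50070 is the NameNode HTTP port and will not serve hdfs:// RPC).
import socket

def port_open(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # refused, timed out, or unresolvable host
        return False

# Example checks (run from an FE/BE node; hosts are assumptions from the thread):
# port_open("bigdata-2", 9083)       # Hive metastore
# port_open("10.150.18.70", 8020)    # NameNode RPC
```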

Have you done this step?

Yes, that's done. Running klist on every machine shows a valid ticket:
[root@10 ~]# klist
Ticket cache: FILE:/tmp/krb5cc_0
Default principal: padmin@ZETAPROD.COM

Valid starting Expires Service principal
03/14/2024 12:08:04 03/15/2024 12:08:04 krbtgt/ZETAPROD.COM@ZETAPROD.COM

Hi! Did you manage to solve this problem?

You could open a separate post and include the job-creation commands plus the configuration changes you made.