StarRocks Hive external table permissions

StarRocks version: 2.5.5

I need to query a Hive external table, so I created a Hive resource following the command in the official docs:

CREATE EXTERNAL RESOURCE "hive0"
PROPERTIES (
    "type" = "hive",
    "hive.metastore.uris" = "thrift://myhadoop:9083"
);

Querying it then fails with:
starrocks [42000][1064] hdfsOpenFile failed

My HDFS requires a username and password for access. Where do I configure them for the Hive resource?
The Spark resource has broker.user and broker.password properties, and with those configured the Spark resource works fine.
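For context, a Spark resource that passes HDFS credentials through broker properties looks roughly like the sketch below. This is only an illustration: the resource name, addresses, working directory, and broker name are placeholders, and the credential property names follow the poster's description (the official Spark Load docs should be checked for the exact spelling, e.g. broker.username vs. broker.user).

```sql
-- Sketch only: every name and address below is a placeholder.
CREATE EXTERNAL RESOURCE "spark0"
PROPERTIES (
    "type" = "spark",
    "spark.master" = "yarn",
    "spark.hadoop.fs.defaultFS" = "hdfs://myhadoop:8020",
    "working_dir" = "hdfs://myhadoop:8020/tmp/starrocks",
    "broker" = "broker0",
    -- HDFS credentials are carried as broker properties; verify the
    -- exact key names against the Spark Load documentation.
    "broker.user" = "hdfs_user",
    "broker.password" = "hdfs_password"
);
```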

2.5.5 supports Hive catalogs. Put the corresponding hive-site.xml and hdfs-site.xml into the FE and BE conf directories; see https://docs.starrocks.io/zh-cn/latest/data_source/catalog/hive_catalog for details.
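A minimal deployment sketch for that step, assuming StarRocks is installed under /opt/starrocks and the Hadoop client configs live in /etc/hadoop/conf (both paths are assumptions; adjust them to your environment):

```shell
# Copy the Hadoop/Hive client configs into each FE and BE conf directory,
# then restart so the processes pick them up. Repeat on every FE/BE node.
STARROCKS_HOME=/opt/starrocks      # assumed install path
HADOOP_CONF=/etc/hadoop/conf       # assumed Hadoop client config path
for f in hive-site.xml hdfs-site.xml; do
    cp "$HADOOP_CONF/$f" "$STARROCKS_HOME/fe/conf/"
    cp "$HADOOP_CONF/$f" "$STARROCKS_HOME/be/conf/"
done
"$STARROCKS_HOME/fe/bin/stop_fe.sh" && "$STARROCKS_HOME/fe/bin/start_fe.sh" --daemon
"$STARROCKS_HOME/be/bin/stop_be.sh" && "$STARROCKS_HOME/be/bin/start_be.sh" --daemon
```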

Using a Hive catalog triggers an internal error:

2023-06-12 06:58:40,009 WARN (thrift-server-pool-316490|338626) [Coordinator.updateFragmentExecStatus():2476] one instance report fail errorCode INTERNAL_ERROR hdfsOpenFile failed, path=hdfs://cdh1:8020/:file = hdfs://cdh1:8020/**, params=TReportExecStatusParams(protocol_version:V1, query_id:TUniqueId(hi:-8145920752530615826, lo:-8390768710694777104), backend_num:1, fragment_instance_id:TUniqueId(hi:-8145920752530615826, lo:-8390768710694777103), status:TStatus(status_code:INTERNAL_ERROR, error_msgs:[hdfsOpenFile failed, path=hdfs://cdh1:8020/:file = hdfs://cdh1:8020/), done:true, error_log:[, , , , , , , , , , , , , , , ], backend_id:172518) query_id=8ef3e03c-08ee-11ee-8b8e-005056934ef0 instance_id=8ef3e03c-08ee-11ee-8b8e-005056934ef1
2023-06-12 06:58:40,009 WARN (thrift-server-pool-316490|338626) [Coordinator.updateStatus():1526] one instance report fail throw updateStatus(), need cancel. job id: -1, query id: 8ef3e03c-08ee-11ee-8b8e-005056934ef0, instance id: 8ef3e03c-08ee-11ee-8b8e-005056934ef1
2023-06-12 06:58:40,010 INFO (thrift-server-pool-316490|338626) [Coordinator.cancelInternal():1627] unfinished instance: 8ef3e03c-08ee-11ee-8b8e-005056934efa

2023-06-12 06:58:40,013 WARN (starrocks-mysql-nio-pool-10302|338592) [Coordinator.getNext():1546] get next fail, need cancel. status errorCode CANCELLED InternalError, query id: 8ef3e03c-08ee-11ee-8b8e-005056934ef0
2023-06-12 06:58:40,013 WARN (starrocks-mysql-nio-pool-10302|338592) [Coordinator.getNext():1567] query failed: hdfsOpenFile failed, path=hdfs://cdh1:8020/:file = hdfs://cdh1:8020/
2023-06-12 06:58:53,222 WARN (starrocks-mysql-nio-pool-10302|338592) [BackendServiceClient.execBatchPlanFragmentsAsync():115] Execute batch plan fragments catch a exception, address=starrock1:8060
java.lang.RuntimeException: Unable to validate object
at com.baidu.jprotobuf.pbrpc.transport.ChannelPool.getChannel(ChannelPool.java:86) ~[jprotobuf-rpc-core-4.2.1.jar:?]
at com.baidu.jprotobuf.pbrpc.transport.RpcChannel.getConnection(RpcChannel.java:73) ~[jprotobuf-rpc-core-4.2.1.jar:?]
at com.baidu.jprotobuf.pbrpc.client.ProtobufRpcProxy.invoke(ProtobufRpcProxy.java:499) ~[jprotobuf-rpc-core-4.2.1.jar:?]
at com.sun.proxy.$Proxy34.execBatchPlanFragmentsAsync(Unknown Source) ~[?:?]
at com.starrocks.rpc.BackendServiceClient.execBatchPlanFragmentsAsync(BackendServiceClient.java:101) ~[starrocks-fe.jar:?]
at com.starrocks.qe.Coordinator$BackendExecState.execRemoteBatchFragmentsAsync(Coordinator.java:3056) ~[starrocks-fe.jar:?]
at com.starrocks.qe.Coordinator.deliverExecBatchFragmentsRequests(Coordinator.java:1227) ~[starrocks-fe.jar:?]
at com.starrocks.qe.Coordinator.deliverExecFragments(Coordinator.java:771) ~[starrocks-fe.jar:?]
at com.starrocks.qe.Coordinator.exec(Coordinator.java:674) ~[starrocks-fe.jar:?]
at com.starrocks.qe.StmtExecutor.handleQueryStmt(StmtExecutor.java:750) ~[starrocks-fe.jar:?]
at com.starrocks.qe.StmtExecutor.execute(StmtExecutor.java:442) ~[starrocks-fe.jar:?]
at com.starrocks.qe.ConnectProcessor.handleQuery(ConnectProcessor.java:323) ~[starrocks-fe.jar:?]
at com.starrocks.qe.ConnectProcessor.dispatch(ConnectProcessor.java:440) ~[starrocks-fe.jar:?]
at com.starrocks.qe.ConnectProcessor.processOnce(ConnectProcessor.java:698) ~[starrocks-fe.jar:?]
at com.starrocks.mysql.nio.ReadListener.lambda$handleEvent$0(ReadListener.java:55) ~[starrocks-fe.jar:?]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_333]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_333]
at java.lang.Thread.run(Thread.java:750) [?:1.8.0_333]
Caused by: java.util.NoSuchElementException: Unable to validate object
at org.apache.commons.pool2.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:506) ~[commons-pool2-2.3.jar:2.3]
at org.apache.commons.pool2.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:363) ~[commons-pool2-2.3.jar:2.3]
at com.baidu.jprotobuf.pbrpc.transport.ChannelPool.getChannel(ChannelPool.java:80) ~[jprotobuf-rpc-core-4.2.1.jar:?]
… 17 more
2023-06-12 06:58:53,274 WARN (starrocks-mysql-nio-pool-10302|338592) [Coordinator.deliverExecBatchFragmentsRequests():1259] exec plan fragment failed, errmsg=Unable to validate object, host: 172.16.10.35, code: THRIFT_RPC_ERROR, fragmentId=F01, backend=172.16.10.35:9060
2023-06-12 06:58:53,278 WARN (starrocks-mysql-nio-pool-10302|338592) [BackendServiceClient.cancelPlanFragmentAsync():158] Cancel plan fragment catch a exception, address=172.16.10.35:8060
java.lang.RuntimeException: Unable to validate object
at com.baidu.jprotobuf.pbrpc.transport.ChannelPool.getChannel(ChannelPool.java:86) ~[jprotobuf-rpc-core-4.2.1.jar:?]
at com.baidu.jprotobuf.pbrpc.transport.RpcChannel.getConnection(RpcChannel.java:73) ~[jprotobuf-rpc-core-4.2.1.jar:?]
at com.baidu.jprotobuf.pbrpc.client.ProtobufRpcProxy.invoke(ProtobufRpcProxy.java:499) ~[jprotobuf-rpc-core-4.2.1.jar:?]
at com.sun.proxy.$Proxy34.cancelPlanFragmentAsync(Unknown Source) ~[?:?]
at com.starrocks.rpc.BackendServiceClient.cancelPlanFragmentAsync(BackendServiceClient.java:141) ~[starrocks-fe.jar:?]
at com.starrocks.qe.Coordinator$BackendExecState.cancelFragmentInstance(Coordinator.java:2964) ~[starrocks-fe.jar:?]
at com.starrocks.qe.Coordinator.cancelRemoteFragmentsAsync(Coordinator.java:1634) ~[starrocks-fe.jar:?]
at com.starrocks.qe.Coordinator.cancelInternal(Coordinator.java:1623) ~[starrocks-fe.jar:?]
at com.starrocks.qe.Coordinator.handleErrorBackendExecState(Coordinator.java:780) ~[starrocks-fe.jar:?]
at com.starrocks.qe.Coordinator.deliverExecBatchFragmentsRequests(Coordinator.java:1278) ~[starrocks-fe.jar:?]
at com.starrocks.qe.Coordinator.deliverExecFragments(Coordinator.java:771) ~[starrocks-fe.jar:?]
at com.starrocks.qe.Coordinator.exec(Coordinator.java:674) ~[starrocks-fe.jar:?]
at com.starrocks.qe.StmtExecutor.handleQueryStmt(StmtExecutor.java:750) ~[starrocks-fe.jar:?]
at com.starrocks.qe.StmtExecutor.execute(StmtExecutor.java:442) ~[starrocks-fe.jar:?]
at com.starrocks.qe.ConnectProcessor.handleQuery(ConnectProcessor.java:323) ~[starrocks-fe.jar:?]
at com.starrocks.qe.ConnectProcessor.dispatch(ConnectProcessor.java:440) ~[starrocks-fe.jar:?]
at com.starrocks.qe.ConnectProcessor.processOnce(ConnectProcessor.java:698) ~[starrocks-fe.jar:?]
at com.starrocks.mysql.nio.ReadListener.lambda$handleEvent$0(ReadListener.java:55) ~[starrocks-fe.jar:?]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_333]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_333]
at java.lang.Thread.run(Thread.java:750) [?:1.8.0_333]
Caused by: java.util.NoSuchElementException: Unable to validate object
at org.apache.commons.pool2.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:506) ~[commons-pool2-2.3.jar:2.3]
at org.apache.commons.pool2.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:363) ~[commons-pool2-2.3.jar:2.3]
at com.baidu.jprotobuf.pbrpc.transport.ChannelPool.getChannel(ChannelPool.java:80) ~[jprotobuf-rpc-core-4.2.1.jar:?]
… 20 more

Try the S3 protocol:

CREATE EXTERNAL CATALOG hive_catalog_hms
PROPERTIES (
    "type" = "hive",
    "hive.metastore.uris" = "thrift://34.132.15.127:9083",
    "aws.s3.enable_path_style_access" = "true",
    "aws.s3.endpoint" = "<s3_endpoint>",
    "aws.s3.access_key" = "<iam_user_access_key>",
    "aws.s3.secret_key" = "<iam_user_secret_key>"
);

I doubt that works. Have you actually tested it?

Besides, my underlying storage is plain HDFS. What would I even put for s3_endpoint and the IAM access/secret keys?

This is exactly what we use, though our scenario differs a bit from yours: our files are stored in COS.

Then it won't work for me.
The strange thing is that the Hive external table is accessible through a Spark resource, but accessing it directly triggers an internal error.

Do catalog external tables support specifying a username and password yet? I don't see it in the official docs, so I assume it's still unsupported?

Not supported at the moment. I was just pointing out how strange it is that the Hive external table can't be accessed directly, yet works through a Spark resource.

See my other post; someone there gave me the answer.
The name spark-2x.zip is fixed and cannot be changed. After correcting it in the FE configuration file and restarting, both the Hive external table and Hive catalog problems resolved themselves.

The Spark jar archive must be named spark-2x.zip.

https://docs.starrocks.io/zh-cn/latest/loading/SparkLoad#配置-spark-客户端
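For reference, the relevant FE settings look roughly like the fragment below. The paths are examples only, and the key names are my best recollection of the FE configuration (verify them against the Spark Load docs linked above); the one hard requirement from this thread is that the archive keeps the fixed name spark-2x.zip.

```properties
# fe.conf -- paths are examples; the archive name must stay spark-2x.zip
spark_home_default_dir = /opt/starrocks/fe/lib/spark2x
spark_resource_path = /opt/starrocks/fe/lib/spark2x/jars/spark-2x.zip
```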