CREATE TABLE through a catalog fails

[Details]
Client-side error: Failed to create table ods.test0502, msg: null
FE-side error: Failed to create table ods.test0410
Caused by: org.apache.hadoop.hive.metastore.api.AlreadyExistsException: Table test0410 already exists
`SHOW TABLES` lists the table, but it has no corresponding HDFS directory.
[StarRocks version] StarRocks-3.2.4
[Cluster size] 3 FEs (1 follower + 2 observers) + 3 BEs (FE and BE co-located)
[Contact] 578086335@qq.com
[Attachments]
【附件】

The StarRocks cluster is started as root; hadoop_env.sh already exports the Hive environment, and this is deployed on every node.

Statement: `SET CATALOG tmp; CREATE TABLE tmp.ods.test0502 (id int);`

The exact errors are as follows:

FE log:
2024-04-01 16:23:04,164 ERROR (thrift-server-pool-21|239) [HiveMetaClient.callRPC():163] Failed to create table ods.test0410
Caused by: org.apache.hadoop.hive.metastore.api.AlreadyExistsException: Table test0410 already exists
2024-04-01 16:23:04,164 ERROR (thrift-server-pool-21|239) [HiveMetaClient.callRPC():170] An exception occurred when using the current long link to access metastore. msg: Failed to create table ods.test0410
2024-04-01 16:23:04,165 ERROR (thrift-server-pool-21|239) [HiveMetastoreOperations.createTable():189] Failed to create table ods.test0410
2024-04-01 16:23:04,189 WARN (thrift-server-pool-21|239) [StmtExecutor.handleDdlStmt():1599] DDL statement (create table test0410(id int)) process failed.
com.starrocks.common.DdlException: Failed to create table ods.test0410. msg: Failed to create table ods.test0410, msg: null

Hive Metastore log:
2024-04-01 17:00:39,020 INFO org.apache.hadoop.hive.metastore.HiveMetaStore.audit: [pool-8-thread-107246]: ugi=root ip=172.21.98.247 cmd=source:172.21.98.247 create_table: Table(tableName:test0502, dbName:ods, owner:root, createTime:1711962038, lastAccessTime:0, retention:0, sd:StorageDescriptor(cols:[FieldSchema(name:id, type:int, comment:null)], location:null, inputFormat:org.apache.hadoop.mapred.TextInputFormat, outputFormat:org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat, compressed:false, numBuckets:-1, serdeInfo:SerDeInfo(name:null, serializationLib:org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe, parameters:{serialization.format=1}), bucketCols:[], sortCols:[], parameters:{}, skewedInfo:SkewedInfo(skewedColNames:[], skewedColValues:[], skewedColValueLocationMaps:{}), storedAsSubDirectories:false), partitionKeys:[], parameters:{totalSize=0, numRows=0, rawDataSize=0, COLUMN_STATS_ACCURATE={"BASIC_STATS":"true"}, numFiles=0, numFilesErasureCoded=0}, viewOriginalText:null, viewExpandedText:null, tableType:MANAGED_TABLE, privileges:PrincipalPrivilegeSet(userPrivileges:{root=[PrivilegeGrantInfo(privilege:INSERT, createTime:-1, grantor:root, grantorType:USER, grantOption:true), PrivilegeGrantInfo(privilege:SELECT, createTime:-1, grantor:root, grantorType:USER, grantOption:true), PrivilegeGrantInfo(privilege:UPDATE, createTime:-1, grantor:root, grantorType:USER, grantOption:true), PrivilegeGrantInfo(privilege:DELETE, createTime:-1, grantor:root, grantorType:USER, grantOption:true)]}, groupPrivileges:null, rolePrivileges:null), temporary:false, ownerType:USER)

Is this a Hive catalog? Can you query other tables under this Hive catalog? Could you paste the statement you used to create the Hive catalog?

Queries work fine, no problem.

CREATE EXTERNAL CATALOG tmp
PROPERTIES
(
"type" = "hive",
"hive.metastore.type" = "hive",
"hive.metastore.uris" = "thrift://$ip:9083"
);

From the error, it looks like you are creating table test0410, which already exists. Could you retry with a new table name, one that does not currently exist, and see whether it can be created? If it still fails, please share the corresponding stack trace.

The table did not exist before the CREATE statement:
MySQL [test]> create table test.test04011947(id int);
ERROR 1064 (HY000): Unexpected exception: Failed to create table test.test04011947. msg: Failed to create table test.test04011947, msg: null

FE log:
2024-04-01 19:47:51,489 WARN (thrift-server-pool-362|629) [HiveMetaStoreClient.getTable():641] Failed to get table test.test04011947
org.apache.hadoop.hive.metastore.api.NoSuchObjectException: test.test04011947 table not found
2024-04-01 19:48:02,726 ERROR (thrift-server-pool-362|629) [HiveMetaClient.callRPC():163] Failed to create table test.test04011947
Caused by: org.apache.hadoop.hive.metastore.api.AlreadyExistsException: Table test04011947 already exists
2024-04-01 19:48:02,727 ERROR (thrift-server-pool-362|629) [HiveMetaClient.callRPC():170] An exception occurred when using the current long link to access metastore. msg: Failed to create table test.test04011947
2024-04-01 19:48:02,727 ERROR (thrift-server-pool-362|629) [HiveMetastoreOperations.createTable():189] Failed to create table test.test04011947
2024-04-01 19:48:02,759 WARN (thrift-server-pool-362|629) [StmtExecutor.handleDdlStmt():1599] DDL statement (create table test.test04011947(id int)) process failed.
com.starrocks.common.DdlException: Failed to create table test.test04011947. msg: Failed to create table test.test04011947, msg: null

Hive Metastore log:
2024-04-01 19:47:51,479 INFO org.apache.hadoop.hive.metastore.HiveMetaStore.audit: [pool-8-thread-107854]: ugi=hive ip=172.28.163.148 cmd=source:172.28.163.148 get_table : db=test tbl=test04011947
2024-04-01 19:47:51,487 INFO org.apache.hadoop.hive.metastore.HiveMetaStore.audit: [pool-8-thread-107759]: ugi=hive ip=172.21.99.72 cmd=source:172.21.99.72 add_partition
2024-04-01 19:47:51,498 INFO org.apache.hadoop.hive.metastore.HiveMetaStore.audit: [pool-8-thread-107759]: ugi=hive ip=172.21.99.72 cmd=Cleaning up thread local RawStore…
2024-04-01 19:47:51,498 INFO org.apache.hadoop.hive.metastore.HiveMetaStore.audit: [pool-8-thread-107759]: ugi=hive ip=172.21.99.72 cmd=Done cleaning up thread local RawStore
2024-04-01 19:47:51,500 INFO org.apache.hadoop.hive.metastore.HiveMetaStore.audit: [pool-8-thread-107759]: ugi=hive ip=172.21.99.72 cmd=source:172.21.99.72 get_table : db=ods_domino tbl=ods_domino_binlog__hdfs_split_incr
2024-04-01 19:47:51,513 INFO org.apache.hadoop.hive.metastore.HiveMetaStore.audit: [pool-8-thread-107854]: ugi=hive ip=172.28.163.148 cmd=source:172.28.163.148 create_table: Table(tableName:test04011947, dbName:test, owner:hive, createTime:0, lastAccessTime:0, retention:0, sd:StorageDescriptor(cols:[FieldSchema(name:id, type:int, comment:null)], location:hdfs://nameservice/user/hive/warehouse/test.db/test04011947, inputFormat:org.apache.hadoop.hive.ql.io.parquet.MapredParquetInputFormat, outputFormat:org.apache.hadoop.hive.ql.io.parquet.MapredParquetOutputFormat, compressed:false, numBuckets:0, serdeInfo:SerDeInfo(name:test04011947, serializationLib:org.apache.hadoop.hive.ql.io.parquet.serde.ParquetHiveSerDe, parameters:null), bucketCols:null, sortCols:null, parameters:{}), partitionKeys:[], parameters:{totalSize=0, numRows=0, starrocks_version=3.2.4-613f0b5, starrocks_query_id=ab3a816a-f01d-11ee-94aa-0242e67d7f23}, viewOriginalText:null, viewExpandedText:null, tableType:MANAGED_TABLE, ownerType:USER)

After further investigation, the root cause turned out to be that the CREATE TABLE RPC to the HMS took too long: the client timed out and retried, but the first attempt had already committed on the metastore side, so the retry failed with "table already exists" (note the ~11 s gap between 19:47:51 and 19:48:02 in the FE log). Increasing the timeout fixes it; add the following parameter to fe.conf:

hive_meta_store_timeout_s = 30
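The failure mode can be sketched as follows. This is a minimal Python simulation, not StarRocks code; `MetastoreStub` and `create_with_retry` are hypothetical names standing in for the HMS and the FE's RPC retry wrapper, and the 10 s / 30 s values mirror the timeout change described above.

```python
class MetastoreStub:
    """Stands in for the Hive Metastore: create_table commits, then replies slowly."""

    def __init__(self, reply_delay_s):
        self.tables = set()
        self.reply_delay_s = reply_delay_s

    def create_table(self, name):
        if name in self.tables:
            raise RuntimeError(f"AlreadyExistsException: Table {name} already exists")
        # The commit happens server-side first...
        self.tables.add(name)
        # ...and the reply takes this many seconds to reach the client.
        return self.reply_delay_s


def create_with_retry(store, name, client_timeout_s, max_attempts=2):
    """Client-side wrapper: treat a reply slower than the timeout as a failure and retry."""
    for attempt in range(1, max_attempts + 1):
        try:
            reply_delay = store.create_table(name)
        except RuntimeError as e:
            return f"attempt {attempt} failed: {e}"
        if reply_delay <= client_timeout_s:
            return f"attempt {attempt} succeeded"
        # Reply arrived after the timeout: the client assumes the RPC failed and retries.
    return "gave up"


# HMS needs 11 s to answer but the client timeout is 10 s: the retry hits
# AlreadyExists even though the table did not exist before the statement ran.
print(create_with_retry(MetastoreStub(reply_delay_s=11), "test04011947", client_timeout_s=10))
# -> attempt 2 failed: AlreadyExistsException: Table test04011947 already exists

# With a 30 s timeout the slow-but-successful first attempt is accepted.
print(create_with_retry(MetastoreStub(reply_delay_s=11), "test04011947", client_timeout_s=30))
# -> attempt 1 succeeded
```

This also shows why the orphaned metadata appears: the first attempt really did create the table in the HMS, so `SHOW TABLES` sees it even though the client reported a failure.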