Inserting into a Hive table from StarRocks: querying the table in Hive fails

【Description】Data selected from a StarRocks table was inserted into a Hive table through StarRocks; querying that table afterwards fails in both Hive and Impala.
【Shared-data mode (storage-compute separation)】No
【StarRocks version】3.2.14
CDH: 6.3.2
Hive: 2.1.1
Impala: 3.2.0
Parquet: 1.9.0
【Cluster size】3 FE + 3 BE (FE and BE co-located)

Steps to reproduce:
1. Create the StarRocks table and insert data:
CREATE TABLE user_data (
uid bigint(20) NOT NULL COMMENT "user ID",
uname varchar(65533) NULL COMMENT "name",
birth varchar(65533) NULL COMMENT "date of birth",
bz varchar(65533) NULL COMMENT "remarks",
address varchar(65533) NULL COMMENT "home address"
) ENGINE=OLAP
PRIMARY KEY(uid)
DISTRIBUTED BY HASH(uid);

INSERT INTO test.user_data VALUES
(1,'lizhi','2024-12-01','','河南郑州'),(2,'lizhi','2024-12-01','','河南郑州'),(3,'lizhi','2024-12-01','','河南郑州'),(12,'lizhi','2024-12-01','','河南郑州');

2. Create the Hive table:
CREATE TABLE test.user_data(
uid decimal(10,0) COMMENT 'user ID',
uname string COMMENT 'name',
birth string COMMENT 'date of birth',
bz string COMMENT 'remarks',
address string COMMENT 'home address')
STORED AS parquet;
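Note that `uid` is declared `bigint(20)` in the StarRocks table but `decimal(10,0)` here, so the two schemas do not match. As a hedged alternative (an assumption, not a confirmed fix), keeping the Hive column as BIGINT would match the INT64 values StarRocks writes into the Parquet file:

```sql
-- Hypothetical alternative DDL: declare uid as BIGINT so the Hive schema
-- matches the integer physical type StarRocks writes to Parquet.
CREATE TABLE test.user_data (
  uid bigint COMMENT 'user ID',
  uname string COMMENT 'name',
  birth string COMMENT 'date of birth',
  bz string COMMENT 'remarks',
  address string COMMENT 'home address')
STORED AS parquet;
```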

3. In StarRocks, run SQL to insert the data into Hive:
set catalog hive_catalog;
insert into test.user_data select * from default_catalog.test.user_data ;

4. Querying the Hive table from StarRocks returns the data normally:
select * from test.user_data;

5. Querying the table in Hive fails with the following error:

  • Bad status for request TFetchResultsReq(fetchType=0, operationHandle=TOperationHandle(hasResultSet=True, modifiedRowCount=None, operationType=0, operationId=THandleIdentifier(secret='\xb4H\x1d\x03t8D=\x8e\xc6\xe7\x88\x83]\x01:', guid='W/\x9fH_\xb3Kh\x92\xf1 \x8c\xd5\x93k\xe8')), orientation=4, maxRows=100): TFetchResultsResp(status=TStatus(errorCode=0, errorMessage='java.io.IOException: org.apache.parquet.io.ParquetDecodingException: Can not read value at 0 in block -1 in file hdfs://nameservice1/user/hive/warehouse/ods_sap.db/user_data_sr2/472551d1-e5c9-11ef-92f9-005056a47d33_0_1.parquet', sqlState=None, infoMessages=['*org.apache.hive.service.cli.HiveSQLException:java.io.IOException: org.apache.parquet.io.ParquetDecodingException: Can not read value at 0 in block -1 in file hdfs://nameservice1/user/hive/warehouse/ods_sap.db/user_data_sr2/472551d1-e5c9-11ef-92f9-005056a47d33_0_1.parquet:14:13', 'org.apache.hive.service.cli.operation.SQLOperation:getNextRowSet:SQLOperation.java:496', 'org.apache.hive.service.cli.operation.OperationManager:getOperationNextRowSet:OperationManager.java:297', 'org.apache.hive.service.cli.session.HiveSessionImpl:fetchResults:HiveSessionImpl.java:869', 'org.apache.hive.service.cli.CLIService:fetchResults:CLIService.java:507', 'org.apache.hive.service.cli.thrift.ThriftCLIService:FetchResults:ThriftCLIService.java:708', 'org.apache.hive.service.rpc.thrift.TCLIService$Processor$FetchResults:getResult:TCLIService.java:1717', 'org.apache.hive.service.rpc.thrift.TCLIService$Processor$FetchResults:getResult:TCLIService.java:1702', 'org.apache.thrift.ProcessFunction:process:ProcessFunction.java:39', 'org.apache.thrift.TBaseProcessor:process:TBaseProcessor.java:39', 'org.apache.hive.service.auth.TSetIpAddressProcessor:process:TSetIpAddressProcessor.java:56', 'org.apache.thrift.server.TThreadPoolServer$WorkerProcess:run:TThreadPoolServer.java:286', 'java.util.concurrent.ThreadPoolExecutor:runWorker:ThreadPoolExecutor.java:1149',
'java.util.concurrent.ThreadPoolExecutor$Worker:run:ThreadPoolExecutor.java:624', 'java.lang.Thread:run:Thread.java:748', '*java.io.IOException:org.apache.parquet.io.ParquetDecodingException: Can not read value at 0 in block -1 in file hdfs://nameservice1/user/hive/warehouse/ods_sap.db/user_data_sr2/472551d1-e5c9-11ef-92f9-005056a47d33_0_1.parquet:18:4', 'org.apache.hadoop.hive.ql.exec.FetchOperator:getNextRow:FetchOperator.java:521', 'org.apache.hadoop.hive.ql.exec.FetchOperator:pushRow:FetchOperator.java:428', 'org.apache.hadoop.hive.ql.exec.FetchTask:fetch:FetchTask.java:146', 'org.apache.hadoop.hive.ql.Driver:getResults:Driver.java:2227', 'org.apache.hive.service.cli.operation.SQLOperation:getNextRowSet:SQLOperation.java:491', '*org.apache.parquet.io.ParquetDecodingException:Can not read value at 0 in block -1 in file hdfs://nameservice1/user/hive/warehouse/ods_sap.db/user_data_sr2/472551d1-e5c9-11ef-92f9-005056a47d33_0_1.parquet:25:7', 'org.apache.parquet.hadoop.InternalParquetRecordReader:nextKeyValue:InternalParquetRecordReader.java:223', 'org.apache.parquet.hadoop.ParquetRecordReader:nextKeyValue:ParquetRecordReader.java:213', 'org.apache.hadoop.hive.ql.io.parquet.read.ParquetRecordReaderWrapper::ParquetRecordReaderWrapper.java:101', 'org.apache.hadoop.hive.ql.io.parquet.read.ParquetRecordReaderWrapper::ParquetRecordReaderWrapper.java:63', 'org.apache.hadoop.hive.ql.io.parquet.MapredParquetInputFormat:getRecordReader:MapredParquetInputFormat.java:75', 'org.apache.hadoop.hive.ql.exec.FetchOperator$FetchInputFormatSplit:getRecordReader:FetchOperator.java:695', 'org.apache.hadoop.hive.ql.exec.FetchOperator:getRecordReader:FetchOperator.java:333', 'org.apache.hadoop.hive.ql.exec.FetchOperator:getNextRow:FetchOperator.java:459', '*java.lang.UnsupportedOperationException:org.apache.parquet.column.values.dictionary.PlainValuesDictionary$PlainLongDictionary:36:11', 'org.apache.parquet.column.Dictionary:decodeToBinary:Dictionary.java:44',
'org.apache.hadoop.hive.ql.io.parquet.convert.ETypeConverter$BinaryConverter:setDictionary:ETypeConverter.java:269', 'org.apache.parquet.column.impl.ColumnReaderImpl::ColumnReaderImpl.java:346', 'org.apache.parquet.column.impl.ColumnReadStoreImpl:newMemColumnReader:ColumnReadStoreImpl.java:82', 'org.apache.parquet.column.impl.ColumnReadStoreImpl:getColumnReader:ColumnReadStoreImpl.java:77', 'org.apache.parquet.io.RecordReaderImplementation::RecordReaderImplementation.java:272', 'org.apache.parquet.io.MessageColumnIO$1:visit:MessageColumnIO.java:145', 'org.apache.parquet.io.MessageColumnIO$1:visit:MessageColumnIO.java:107', 'org.apache.parquet.filter2.compat.FilterCompat$NoOpFilter:accept:FilterCompat.java:155', 'org.apache.parquet.io.MessageColumnIO:getRecordReader:MessageColumnIO.java:107', 'org.apache.parquet.hadoop.InternalParquetRecordReader:checkRead:InternalParquetRecordReader.java:136', 'org.apache.parquet.hadoop.InternalParquetRecordReader:nextKeyValue:InternalParquetRecordReader.java:194'], statusCode=3), results=None, hasMoreRows=None)

6. Running the same query in Impala fails with:

  • File: hdfs://nameservice1/user/hive/warehouse/test.db/user_data_1/478d86b3-e5c5-11ef-99f9-005056a47d35_0_1.parquet is of an unsupported version. file version: 2