StarRocks 3.1
CDP 7.1.7
Single FE and single BE. After setting up a shared-data (storage-compute separation) deployment, CREATE TABLE fails immediately with a timeout (both with the default timeout and with it raised to 60 seconds).
The temporary files do show up in HDFS.
Any ideas?
Setting the timeout:
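For reference, a minimal sketch of one way to raise the timeout, assuming the session variable query_timeout is the one being adjusted and that the FE is reachable on the default MySQL protocol port 9030 (the host below is a placeholder):

    # Raise the query timeout (in seconds) globally; 127.0.0.1 is a placeholder for the FE host.
    mysql -h 127.0.0.1 -P 9030 -u root -e "SET GLOBAL query_timeout = 300;"
    # A new connection picks up the new global default, so this should now show 300.
    mysql -h 127.0.0.1 -P 9030 -u root -e "SHOW VARIABLES LIKE 'query_timeout';"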
be.out (9.2 KB) be.WARNING.log.20230810-132100 (4.2 KB)
be.INFO.log.20230810-131523 (53.5 KB)
Looking at be.out, there are errors like the following. This points to a problem on the HDFS side, so troubleshoot the HDFS cluster first (two basic checks are sketched after the log below):
hdfsHFlush: FSDataOutputStream#hflush error:
IOException: Could not get block locations. Source file "/user/root/starrocks/5572960c-bd3e-4c90-8225-b9220e398a9d/10013/SCHEMA_000000000000271E.633a519a-4cb4-4904-b8f0-8a219b026af0.TEMP." - Aborting...block==nulljava.io.IOException: Could not get block locations. Source file "/user/root/starrocks/5572960c-bd3e-4c90-8225-b9220e398a9d/10013/SCHEMA_000000000000271E.633a519a-4cb4-4904-b8f0-8a219b026af0.TEMP." - Aborting...block==null
    at org.apache.hadoop.hdfs.DataStreamer.setupPipelineForAppendOrRecovery(DataStreamer.java:1525)
    at org.apache.hadoop.hdfs.DataStreamer.processDatanodeOrExternalError(DataStreamer.java:1305)
    at org.apache.hadoop.hdfs.DataStreamer.run(DataStreamer.java:668)
FSDataOutputStream#close error:
IOException: Could not get block locations. Source file "/user/root/starrocks/5572960c-bd3e-4c90-8225-b9220e398a9d/10013/SCHEMA_000000000000271E.633a519a-4cb4-4904-b8f0-8a219b026af0.TEMP." - Aborting...block==nulljava.io.IOException: Could not get block locations. Source file "/user/root/starrocks/5572960c-bd3e-4c90-8225-b9220e398a9d/10013/SCHEMA_000000000000271E.633a519a-4cb4-4904-b8f0-8a219b026af0.TEMP." - Aborting...block==null
    at org.apache.hadoop.hdfs.DataStreamer.setupPipelineForAppendOrRecovery(DataStreamer.java:1525)
    at org.apache.hadoop.hdfs.DataStreamer.processDatanodeOrExternalError(DataStreamer.java:1305)
    at org.apache.hadoop.hdfs.DataStreamer.run(DataStreamer.java:668)
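As a sketch, two standard checks to run from a node with the HDFS client configured (the path is the parent directory taken from the error above):

    # Overall DataNode health: live/dead nodes, capacity, last contact.
    hdfs dfsadmin -report
    # Scan the StarRocks storage path for missing or corrupt blocks.
    hdfs fsck /user/root/starrocks -files -blocks -locations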
It turned out to be a firewall issue. Besides the NameNode RPC port 8020, the BE also needs to reach the DataNodes on port 9866 (the default data transfer port in Hadoop 3). I temporarily spun up a VM on the same subnet and the connection test went through.
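For anyone hitting the same thing, a quick connectivity check from the BE host, sketched with placeholder hostnames:

    # NameNode RPC port.
    nc -zv namenode.example.com 8020
    # DataNode data transfer port (dfs.datanode.address, default 9866 in Hadoop 3).
    nc -zv datanode1.example.com 9866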