【Spark Load + CDH】Spark Load job fails

【Details】
(1) Following the StarRocks 3.3.3 documentation, we are testing Spark Load. The StarRocks cluster is a shared-nothing (combined compute/storage) deployment on three physical machines, running three FEs and three BEs, with the FEs in HA mode.
(2) The Spark client is spark-2.4.5-bin-hadoop2.7 and the Hadoop client is hadoop-2.7.7. Since the company already runs a CDH cluster, we use its YARN resources and HDFS directly; the CDH version is 6.3.2, which ships Hadoop 3.0.0.
(3) The Spark client and YARN client are configured in every fe.conf as follows:
spark_home_default_dir=/opt/module/spark
spark_resource_path=/opt/module/spark/jars/spark-2x.zip
yarn_client_path=/opt/module/hadoop/bin/yarn
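As a quick sanity check, the three fe.conf keys above can be validated before restarting the FEs. The sketch below is purely illustrative (the parser is a hypothetical helper, not part of StarRocks); it only confirms the required keys are present in a key=value fragment:

```python
# Illustrative sketch: verify the fe.conf fragment above contains the three
# Spark/YARN client keys Spark Load needs. Hypothetical helper, not StarRocks code.
REQUIRED_KEYS = {"spark_home_default_dir", "spark_resource_path", "yarn_client_path"}

def parse_fe_conf(text):
    """Parse key=value lines, ignoring blank lines and '#' comments."""
    conf = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        conf[key.strip()] = value.strip()
    return conf

def missing_spark_keys(conf):
    """Return the required keys absent from the parsed config."""
    return REQUIRED_KEYS - conf.keys()

fragment = """\
spark_home_default_dir=/opt/module/spark
spark_resource_path=/opt/module/spark/jars/spark-2x.zip
yarn_client_path=/opt/module/hadoop/bin/yarn
"""
print(missing_spark_keys(parse_fe_conf(fragment)))  # set() when all keys are present
```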
(4) Create the Spark resource:
create external resource "spark_resource"
properties
(
    "type" = "spark",
    "spark.master" = "yarn",
    "spark.submit.deployMode" = "cluster",
    "spark.executor.memory" = "1g",
    "spark.yarn.queue" = "cxys",
    "spark.hadoop.yarn.resourcemanager.address" = "xxx:8032", -- the active RM of the CDH YARN cluster
    "spark.hadoop.fs.defaultFS" = "hdfs://xxx:8020", -- the active NameNode of the CDH HDFS cluster
    "working_dir" = "hdfs://xxx/test/spark"
);
(5) Create the load job:
load label spark_load_test
(
    data infile("hdfs://xxx:8020/test/test001.csv")
    into table test001
    columns terminated by "|"
    (id, name, score, mark)
)
with resource "spark_resource"
(
    "spark.executor.memory" = "2g",
    "spark.shuffle.compress" = "true"
)
properties
(
    "timeout" = "300"
);
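Before submitting, it can also help to confirm the source file actually matches the column spec in the load statement above ("|" separator, four columns). A minimal, hypothetical pre-flight check:

```python
def rows_with_wrong_column_count(lines, sep="|", ncols=4):
    # Return 1-based indices of rows whose field count does not match the
    # load spec above (columns terminated by "|"; id, name, score, mark).
    # Hypothetical helper for pre-flight validation, not part of Spark Load.
    return [i for i, line in enumerate(lines, 1)
            if line.rstrip("\n").count(sep) != ncols - 1]

sample = ["1|alice|90|ok\n", "2|bob|85\n"]  # second row is missing a field
print(rows_with_wrong_column_count(sample))  # [2]
```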
【Business impact】Testing only
【Shared-data or shared-nothing】Shared-nothing
【StarRocks version】3.3.3
【Cluster size】3 FE (1 leader + 2 followers) + 3 BE (FE and BE co-located)
【Machine specs】48 cores / 64 GB RAM / 10 GbE
【Table model】Duplicate Key (detail) table
【Load/unload method】Spark Load
【Contact】
【Attachments】
(1) Error from the MySQL client:

State: CANCELLED
ErrorMsg: type:ETL_SUBMIT_FAIL; msg:start spark app failed. error: spark app state: KILLED, loadJobId:26018, logPath:/opt/module/starrocks-3.3.3/fe/log/spark_launcher_log/spark_launcher_26018_spark_load_test.log

(2) The error in fe.log is identical to the MySQL client message above.
(3) Error from spark_launcher_log:
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform (file:/opt/module/spark/jars/spark-unsafe_2.11-2.4.5.jar) to method java.nio.Bits.unaligned()
WARNING: Please consider reporting this to the maintainers of org.apache.spark.unsafe.Platform
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
24/10/18 16:35:38 WARN DependencyUtils: Skip remote jar hdfs://10.0.24.26/test/spark/925821022/__spark_repository__spark_resource/__archive_1.0.0/__lib_7517ce5604e91e88a250750d54b38432_spark-dpp-1.0.0-jar-with-dependencies.jar.
24/10/18 16:35:39 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
24/10/18 16:35:39 WARN DomainSocketFactory: The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
24/10/18 16:35:39 INFO Client: Requesting a new application from cluster with 4 NodeManagers
24/10/18 16:35:39 INFO Client: Verifying our application has not requested more than the maximum memory capability of the cluster (358400 MB per container)
24/10/18 16:35:39 INFO Client: Will allocate AM container, with 1408 MB memory including 384 MB overhead
24/10/18 16:35:39 INFO Client: Setting up container launch context for our AM
24/10/18 16:35:39 INFO Client: Setting up the launch environment for our AM container
24/10/18 16:35:39 INFO Client: Preparing resources for our AM container
24/10/18 16:35:39 INFO Client: Uploading resource hdfs://10.0.24.26/test/spark/925821022/__spark_repository__spark_resource/__archive_1.0.0/__lib_e9da07934b1e652fcd8bf1ac1a91861f_spark-2x.zip -> hdfs://10.0.24.26:8020/user/root/.sparkStaging/application_1712473418517_28095/__lib_e9da07934b1e652fcd8bf1ac1a91861f_spark-2x.zip
24/10/18 16:35:40 INFO Client: Uploading resource hdfs://10.0.24.26/test/spark/925821022/__spark_repository__spark_resource/__archive_1.0.0/__lib_7517ce5604e91e88a250750d54b38432_spark-dpp-1.0.0-jar-with-dependencies.jar -> hdfs://10.0.24.26:8020/user/root/.sparkStaging/application_1712473418517_28095/__lib_7517ce5604e91e88a250750d54b38432_spark-dpp-1.0.0-jar-with-dependencies.jar
24/10/18 16:35:40 INFO Client: Uploading resource file:/tmp/spark-fe3ef97e-5dc7-447c-89e5-377c48cd2858/__spark_conf__4193515778977338812.zip -> hdfs://10.0.24.26:8020/user/root/.sparkStaging/application_1712473418517_28095/spark_conf.zip
24/10/18 16:35:40 INFO SecurityManager: Changing view acls to: root
24/10/18 16:35:40 INFO SecurityManager: Changing modify acls to: root
24/10/18 16:35:40 INFO SecurityManager: Changing view acls groups to:
24/10/18 16:35:40 INFO SecurityManager: Changing modify acls groups to:
24/10/18 16:35:40 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); groups with view permissions: Set(); users with modify permissions: Set(root); groups with modify permissions: Set()
24/10/18 16:35:41 INFO Client: Submitting application application_1712473418517_28095 to ResourceManager
24/10/18 16:35:41 INFO YarnClientImpl: Submitted application application_1712473418517_28095
24/10/18 16:35:42 INFO Client: Application report for application_1712473418517_28095 (state: ACCEPTED)
24/10/18 16:35:42 INFO Client:
client token: N/A
diagnostics: AM container is launched, waiting for AM container to Register with RM
ApplicationMaster host: N/A
ApplicationMaster RPC port: -1
queue: root.cxys
start time: 1729240541740
final status: UNDEFINED
tracking URL: http://haedahp001.innotron.com:8088/proxy/application_1712473418517_28095/
user: root
[... the same "Application report for application_1712473418517_28095 (state: ACCEPTED)" line repeated once per second from 16:35:43 through 16:36:01 ...]
24/10/18 16:36:02 INFO Client: Application report for application_1712473418517_28095 (state: FAILED)
(4) stderr from the YARN container:
Error: Could not find or load main class org.apache.spark.deploy.yarn.ApplicationMaster