The project contains the following list of jars:
connect-api-2.7.0.jar
fastjson-1.2.68.jar
flink-connector-kafka_2.12-1.13.6.jar
flink-connector-starrocks-1.2.3_flink-1.13_2.11.jar
flink-csv-1.13.6.jar
flink-dist_2.12-1.13.6.jar
flink-json-1.13.6.jar
flink-s3-fs-hadoop-1.14.2.jar
flink-shaded-hadoop-uber-3.1.2.jar
flink-shaded-zookeeper-3.4.14.jar
flink-sql-connector-mysql-cdc-2.2.0.jar
flink-table_2.12-1.13.6.jar
flink-table-blink_2.12-1.13.6.jar
kafka-clients-2.4.1.jar
log4j-1.2-api-2.17.1.jar
log4j-api-2.17.1.jar
log4j-core-2.17.1.jar
log4j-slf4j-impl-2.17.1.jar
mysql-connector-java-8.0.16.jar
The configuration contains the following settings:
s3a.endpoint: http://XXX.XXX.XXX
s3a.access.key: XXX
s3a.secret.key: XXX
s3a.path.style.access: true
s3a.connection.ssl.enabled: false
high-availability.storageDir: s3a://XXX/XXX/
state.checkpoints.dir: s3a://XXX/XXX
state.savepoints.dir: s3a://XXX/XXX
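As far as I understand, Flink's flink-s3-fs-hadoop filesystem mirrors these s3a.* keys onto the Hadoop S3A configuration, so they should correspond to Hadoop-side keys roughly like the following (all values here are hypothetical, for illustration only):

# Hadoop-side equivalents of the s3a.* entries above (hypothetical values)
fs.s3a.endpoint: http://minio.internal:9000
fs.s3a.access.key: EXAMPLE_ACCESS_KEY
fs.s3a.secret.key: EXAMPLE_SECRET_KEY
fs.s3a.path.style.access: true
fs.s3a.connection.ssl.enabled: false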
Starting Flink fails with the error below:
org.apache.flink.runtime.entrypoint.ClusterEntrypointException: Failed to initialize the cluster entrypoint StandaloneSessionClusterEntrypoint.
…
Caused by: java.lang.NoSuchMethodError: com.fasterxml.jackson.databind.ObjectMapper.configure(Lcom/fasterxml/jackson/core/JsonParser$Feature;Z)Lcom/fasterxml/jackson/databind/ObjectMapper;
…
After modifying the pom of flink-connector-starrocks-1.2.3_flink-1.13_2.11.jar and repackaging, writing data to StarRocks fails continuously; the data never reaches the database, and there are no logs to consult.
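I have not reproduced the exact pom change here. For illustration only, a common way to package around this kind of jackson NoSuchMethodError is to relocate com.fasterxml.jackson inside the connector jar with maven-shade-plugin; a hypothetical sketch, not necessarily the change that was actually made:

<!-- Hypothetical maven-shade-plugin relocation sketch (not the actual change) -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.2.4</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <relocations>
          <!-- Move the connector's bundled jackson under a private package so
               it cannot clash with other jackson copies on the Flink classpath -->
          <relocation>
            <pattern>com.fasterxml.jackson</pattern>
            <shadedPattern>com.starrocks.shade.com.fasterxml.jackson</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>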
When the S3 settings above are removed, the related jars are deleted, and the original flink-connector-starrocks-1.2.3_flink-1.13_2.11.jar is used instead, the same Flink SQL script writes data to the same table normally.
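The Flink SQL script itself is not reproduced here; a minimal sketch of the shape of such a script, with every host, name, and credential a hypothetical placeholder:

-- Hypothetical source using Flink's built-in datagen connector
CREATE TABLE src (
  id BIGINT,
  name STRING
) WITH (
  'connector' = 'datagen',
  'rows-per-second' = '1'
);

-- Hypothetical StarRocks sink using flink-connector-starrocks 1.2.x options
CREATE TABLE sr_sink (
  id BIGINT,
  name STRING
) WITH (
  'connector' = 'starrocks',
  'jdbc-url' = 'jdbc:mysql://fe-host:9030',
  'load-url' = 'fe-host:8030',
  'database-name' = 'demo_db',
  'table-name' = 'demo_table',
  'username' = 'root',
  'password' = ''
);

INSERT INTO sr_sink SELECT id, name FROM src;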
Versions: StarRocks 2.1.10, flink-connector-starrocks 1.2.3 (Scala 2.11)