Writing data to StarRocks fails when a row contains too many NULL values

【Details】When writing data to StarRocks, a row that contains many NULL values causes the load to fail.
Detailed error message:
18845 [Thread-8] ERROR com.starrocks.connector.flink.manager.StarRocksStreamLoadVisitor - Stream Load response:
{"Status":"Fail","BeginTxnTimeMs":0,"Message":"too many filtered rows","NumberUnselectedRows":0,"CommitAndPublishTimeMs":0,"Label":"f597a963-ebbf-49f3-abe0-c983d9905cc4","LoadBytes":451854,"StreamLoadPlanTimeMs":1,"NumberTotalRows":1603,"WriteDataTimeMs":197,"TxnId":13591,"LoadTimeMs":199,"ErrorURL":"http://172.30.16.26:18040/api/_load_error_log?file=error_log_f9430f845a1f5c29_5a40ba4658efeab4","ReadDataTimeMs":0,"NumberLoadedRows":0,"NumberFilteredRows":1603}

18845 [Thread-8] WARN com.starrocks.connector.flink.manager.StarRocksSinkManager - Failed to flush batch data to StarRocks, retry times = 2
com.starrocks.connector.flink.manager.StarRocksStreamLoadFailedException: Failed to flush data to StarRocks, Error response:
{"Status":"Fail","BeginTxnTimeMs":0,"Message":"too many filtered rows","NumberUnselectedRows":0,"CommitAndPublishTimeMs":0,"Label":"f597a963-ebbf-49f3-abe0-c983d9905cc4","LoadBytes":451854,"StreamLoadPlanTimeMs":1,"NumberTotalRows":1603,"WriteDataTimeMs":197,"TxnId":13591,"LoadTimeMs":199,"ErrorURL":"http://172.30.16.26:18040/api/_load_error_log?file=error_log_f9430f845a1f5c29_5a40ba4658efeab4","ReadDataTimeMs":0,"NumberLoadedRows":0,"NumberFilteredRows":1603}

at com.starrocks.connector.flink.manager.StarRocksStreamLoadVisitor.doStreamLoad(StarRocksStreamLoadVisitor.java:89)
at com.starrocks.connector.flink.manager.StarRocksSinkManager.asyncFlush(StarRocksSinkManager.java:258)
at com.starrocks.connector.flink.manager.StarRocksSinkManager.access$000(StarRocksSinkManager.java:52)
at com.starrocks.connector.flink.manager.StarRocksSinkManager$1.run(StarRocksSinkManager.java:120)
at java.lang.Thread.run(Thread.java:748)
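
The write goes through flink-connector-starrocks (see the stack trace above), which submits data to StarRocks via Stream Load. A minimal sketch of such a sink, assuming the Flink SQL connector rather than the DataStream API, is shown below; the hosts, ports, credentials and the abbreviated column list are placeholders, not the actual job. The sink.properties.* options are forwarded to Stream Load, where max_filter_ratio controls how many filtered rows are tolerated before the load fails with "too many filtered rows"; the ErrorURL in the response above lists the filtered rows and the reason each one was rejected.

CREATE TABLE starrocks_sink (
    op STRING,
    id BIGINT,
    ac_template_detailed_id BIGINT,
    score DECIMAL(5,2),
    -- ... remaining columns mirror the StarRocks table below ...
    ts_ms BIGINT,
    ts_date DATE
) WITH (
    'connector' = 'starrocks',
    'jdbc-url' = 'jdbc:mysql://<fe_host>:9030',
    'load-url' = '<fe_host>:<fe_http_port>',
    'database-name' = '<db>',
    'table-name' = 'ods_ac_score_detail_duplicate',
    'username' = '<user>',
    'password' = '<password>',
    'sink.properties.max_filter_ratio' = '0.1'
);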

【Background】If the columns that were NULL are given values instead, the error no longer occurs.
【Business impact】
【StarRocks version】2.5.2
【Cluster size】3 FE (1 follower + 2 observers) + 3 BE (FE and BE co-located)
【Contact】392388393@qq.com
【Attachments】
CREATE TABLE statement:
CREATE TABLE IF NOT EXISTS ods_ac_score_detail_duplicate (
    op varchar,
    id bigint(20) NOT NULL COMMENT 'id',
    ac_template_detailed_id bigint(20) DEFAULT NULL COMMENT 'detail item id',
    score decimal(5,2) DEFAULT NULL COMMENT 'score',
    ac_score_id bigint(20) DEFAULT NULL COMMENT 'performance score id',
    create_date_time datetime DEFAULT NULL COMMENT 'creation time',
    create_user_id bigint(20) DEFAULT NULL COMMENT 'creator',
    update_date_time datetime DEFAULT NULL COMMENT 'update time',
    update_user_id bigint(20) DEFAULT NULL COMMENT 'updater',
    before_id bigint(20) NOT NULL COMMENT 'id',
    before_ac_template_detailed_id bigint(20) DEFAULT NULL COMMENT 'detail item id',
    before_score decimal(5,2) DEFAULT NULL COMMENT 'score',
    before_ac_score_id bigint(20) DEFAULT NULL COMMENT 'performance score id',
    before_create_date_time datetime DEFAULT NULL COMMENT 'creation time',
    before_create_user_id bigint(20) DEFAULT NULL COMMENT 'creator',
    before_update_date_time datetime DEFAULT NULL COMMENT 'update time',
    before_update_user_id bigint(20) DEFAULT NULL COMMENT 'updater',
    ts_ms bigint,
    ts_date date
)
DUPLICATE KEY(op)
DISTRIBUTED BY HASH(ts_date) BUCKETS 366
PROPERTIES (
    "replication_num" = "2"
);

This error is strange: if the same data is inserted directly with INSERT INTO, there is no error. The error above occurs only when the data is written to StarRocks through Flink.
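
For reference, the direct insert that succeeds looks roughly like the statement below; the concrete values are made up for illustration, and every nullable column is deliberately left NULL:

INSERT INTO ods_ac_score_detail_duplicate VALUES
    -- column order follows the CREATE TABLE above; only op, id and before_id are NOT NULL
    ('c', 1, NULL, NULL, NULL, NULL, NULL, NULL, NULL,
     1, NULL, NULL, NULL, NULL, NULL, NULL, NULL,
     1677650000000, '2023-03-01');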