StarRocks 2.1: Spark connector throws an error when loading data to Hive

[Version] 2.1.11
The following load fails with an error:
val startRocksDF = spark.read.format("starrocks")
  .option("starrocks.table.identifier", name)
  .option("starrocks.fenodes", fenodes)
  .option("user", username)
  .option("password", srpd)
  .load()
startRocksDF
Error message:
22/08/30 00:48:40 task-result-getter-1 WARN TaskSetManager: Lost task 0.0 in stage 12.0 (TID 950, CNSH434565.app.paic.com.cn, executor 2): com.starrocks.connector.spark.exception.StarrocksException: Load StarRocks data failed, schema size of fetch data is wrong.
at com.starrocks.connector.spark.serialization.RowBatch.<init>(RowBatch.java:116)
at com.starrocks.connector.spark.rdd.ScalaValueReader.hasNext(ScalaValueReader.scala:200)
at com.starrocks.connector.spark.rdd.AbstractStarrocksRDDIterator.hasNext(AbstractStarrocksRDDIterator.scala:58)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)

Background: our cluster was originally on 2.1.12, and this job ran without problems there. However, because of the base compaction issue in 2.1.12, we rolled back to 2.1.11, and this error started appearing after the rollback.
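Since "schema size of fetch data is wrong" suggests the column count the connector expects no longer matches what the BE returns after the rollback, one diagnostic step is to pin the column list explicitly instead of reading all columns. This is a hypothetical sketch, assuming the connector's documented `starrocks.read.field` read option and placeholder table/FE/credential values; verify the option name against the docs for your connector version:

```scala
// Sketch: build the reader options with an explicit column list so the
// connector only fetches columns known to exist in both 2.1.11 and 2.1.12.
// All values below are placeholders, not from the original post.
val readerOptions = Map(
  "starrocks.table.identifier" -> "db.table",      // placeholder
  "starrocks.fenodes"          -> "fe_host:8030",  // placeholder
  "user"                       -> "user",          // placeholder
  "password"                   -> "pwd",           // placeholder
  "starrocks.read.field"       -> "col1,col2"      // read only these columns
)
// With a live SparkSession, the read would then be:
// val df = spark.read.format("starrocks").options(readerOptions).load()
```

If the load succeeds with a restricted column list but fails without it, that points at a schema mismatch (e.g. a column added or altered on 2.1.12 that 2.1.11 reports differently) rather than a data problem.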