PIVOT function error in version 3.3.2

SR version: 3.3.2-857dd73
Using the PIVOT function raises: java.sql.SQLSyntaxErrorException: Getting analyzing error. Detail message: Column '`db`.`t_auto_schedule_rate_aps_eq_detail`.`version_code`' cannot be resolved
The same SQL runs fine in Spark.
The SQL is:

SELECT *
FROM (SELECT 1                                                                                                                    flg,
             version_code,
             factory_code,
             material_code,
             resources_code,
             IFNULL(detail_actual_demand_quantity, 0)                                                                             dv,
             ROW_NUMBER() over (PARTITION BY version_code, factory_code, material_code, resources_code ORDER BY demand_date desc) row_num
      FROM db.t_auto_schedule_rate_aps_eq_detail
      WHERE date(demand_date) = '2024-08-22') t1
    PIVOT
( SUM(dv) AS sum_dv
    FOR row_num IN (1, 2, 3, 4))
    ORDER BY flg, version_code, factory_code, material_code, resources_code
    LIMIT 20;

Table schema:

CREATE TABLE `t_auto_schedule_rate_aps_eq_detail`
(
    `id`                                 bigint(20)     NOT NULL COMMENT '',
    `version_code`                       varchar(64)    NOT NULL COMMENT '',
    `factory_code`                       varchar(64)    NULL COMMENT '',
    `material_code`                      varchar(64)    NULL COMMENT '',
    `resources_code`                     varchar(64)    NULL COMMENT '',
    `daily_plan_type`                    varchar(64)    NOT NULL COMMENT '',
    `demand_date`                        date           NULL COMMENT '',
    `detail_actual_demand_quantity`      decimal(15, 5) NULL COMMENT '',
    `detail_back_actual_demand_quantity` decimal(15, 5) NULL COMMENT '',
    `create_time`                        datetime       NULL COMMENT ''
) ENGINE = OLAP UNIQUE KEY(`id`)
COMMENT 'A'
DISTRIBUTED BY HASH(`id`) BUCKETS 6 
PROPERTIES (
'compression' = 'LZ4',
'fast_schema_evolution' = 'true',
'replicated_storage' = 'true',
'replication_num' = '1'
);

I'm hitting the exact same problem: same version, nearly identical SQL. Did you ever solve it?
My temporary workaround is to route any query containing the PIVOT keyword to Spark…

Not solved. I only set up a test cluster; for now this kind of workload still runs on Spark. 3.3.3 has the same problem.
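Until a fix lands, one way to avoid routing these queries to Spark is to rewrite the PIVOT as conditional aggregation (SUM over CASE WHEN), which sidesteps the PIVOT column-resolution error entirely. This is a sketch based on the query above; the output column aliases (`1_sum_dv` etc.) are my own naming and may differ from what PIVOT would produce:

```sql
-- Workaround sketch: emulate PIVOT ... FOR row_num IN (1, 2, 3, 4)
-- with one conditional aggregate per pivot value.
SELECT flg,
       version_code,
       factory_code,
       material_code,
       resources_code,
       SUM(CASE WHEN row_num = 1 THEN dv END) AS `1_sum_dv`,
       SUM(CASE WHEN row_num = 2 THEN dv END) AS `2_sum_dv`,
       SUM(CASE WHEN row_num = 3 THEN dv END) AS `3_sum_dv`,
       SUM(CASE WHEN row_num = 4 THEN dv END) AS `4_sum_dv`
FROM (SELECT 1                                        AS flg,
             version_code,
             factory_code,
             material_code,
             resources_code,
             IFNULL(detail_actual_demand_quantity, 0) AS dv,
             ROW_NUMBER() OVER (PARTITION BY version_code, factory_code,
                                             material_code, resources_code
                                ORDER BY demand_date DESC) AS row_num
      FROM db.t_auto_schedule_rate_aps_eq_detail
      WHERE date(demand_date) = '2024-08-22') t1
GROUP BY flg, version_code, factory_code, material_code, resources_code
ORDER BY flg, version_code, factory_code, material_code, resources_code
LIMIT 20;
```

Since `row_num` is unique within each PARTITION BY group, each SUM collapses to at most one row's value, matching the PIVOT semantics here.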

Confirmed, there is a bug here; we will fix it.

Great, thanks!

Hi, is there any update on this? Which version will the fix land in?