FE module build fails: package com.starrocks.connector.hadoop does not exist

Compiling the FE module with the following commands fails:

cd fe
mvn install -DskipTests

The error output is as follows:

[WARNING] COMPILATION WARNING :
[INFO] -------------------------------------------------------------
[WARNING] /D:/github.com/qingzhongli/starrocks/fe/fe-core/src/main/java/com/starrocks/credential/aws/AWSCloudCredential.java: Some input files use or override a deprecated API.
[WARNING] /D:/github.com/qingzhongli/starrocks/fe/fe-core/src/main/java/com/starrocks/credential/aws/AWSCloudCredential.java: Recompile with -Xlint:deprecation for details.
[INFO] 2 warnings
[INFO] -------------------------------------------------------------
[INFO] -------------------------------------------------------------
[ERROR] COMPILATION ERROR :
[INFO] -------------------------------------------------------------
[ERROR] /D:/github.com/qingzhongli/starrocks/fe/fe-core/src/main/java/com/starrocks/credential/CloudConfiguration.java:[18,38] package com.starrocks.connector.hadoop does not exist
[ERROR] /D:/github.com/qingzhongli/starrocks/fe/fe-core/src/main/java/org/apache/hadoop/hive/metastore/HiveMetaStoreClient.java:[38,38] package com.starrocks.connector.hadoop does not exist
[INFO] 2 errors
[INFO] -------------------------------------------------------------
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] starrocks-fe 3.4.0 ................................. SUCCESS [  0.199 s]
[INFO] plugin-common 1.0.0 ................................ SUCCESS [  0.723 s]
[INFO] fe-common 1.0.0 .................................... SUCCESS [  1.315 s]
[INFO] spark-dpp 1.0.0 .................................... SUCCESS [ 10.614 s]
[INFO] fe-core 3.4.0 ...................................... FAILURE [01:14 min]
[INFO] hive-udf 1.0.0 ..................................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  01:27 min
[INFO] Finished at: 2024-01-09T09:21:17+08:00
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.1:compile (default-compile) on project fe-core: Compilation failure: Compilation failure:
[ERROR] /D:/github.com/qingzhongli/starrocks/fe/fe-core/src/main/java/com/starrocks/credential/CloudConfiguration.java:[18,38] package com.starrocks.connector.hadoop does not exist
[ERROR] /D:/github.com/qingzhongli/starrocks/fe/fe-core/src/main/java/org/apache/hadoop/hive/metastore/HiveMetaStoreClient.java:[38,38] package com.starrocks.connector.hadoop does not exist
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <args> -rf :fe-core

That directory is a symlink:

../../../../../../../../java-extensions/hadoop-ext/src/main/java/com/starrocks/connector/hadoop

Because FE and BE need to share the same Java definitions, a symlink is used.
The symlink probably isn't working on Windows. You can try copying the directory over instead.
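A rough sketch of the manual workaround, run from the repository root on Windows. The destination under fe-core is an assumption inferred from the package name in the compile errors, so verify where the symlink actually sits in your checkout:

REM destination path is assumed from the failing import com.starrocks.connector.hadoop
xcopy /E /I java-extensions\src\main\java\com\starrocks\connector\hadoop fe\fe-core\src\main\java\com\starrocks\connector\hadoop

REM alternatively, enable symlink support in Git (needs admin rights or Windows Developer Mode)
git config core.symlinks true

With core.symlinks enabled you would still need to re-check out (or re-clone) the affected path so Git recreates the entry as a real symlink; otherwise the copied directory should let the FE build proceed, at the cost of keeping it in sync manually.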


@dongquan Hi, I'm also on Windows. Which directory should I copy it to?