Production environment: StarRocks 2.3.4.
People here often write very resource-intensive SQL, so we enabled the SQL blacklist feature per the documentation.
In practice, however, as soon as the feature is switched on, DataX fails even when not a single rule has been configured:
com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Unknown error
Meanwhile our real-time jobs writing through the Flink connector work fine.
Could the team please confirm whether there is a bug here? The SQL blacklist feature is quite useful for us.
What is the exact error code following "Unknown error"? To confirm: DataX reports the above error as soon as the blacklist feature is enabled, and writes to SR without error when it is disabled?
Yes, it recovered as soon as we turned the switch off. We use the DataX StarRocks plugin, version 1.1.5 I believe.
The detailed error message is as follows (DataX log kept verbatim; the Chinese description translates to "failed to connect to the database; please check your account, password, database name, IP, and port, or ask your DBA; mind the network environment"):
[2022-11-13 07:04:57,313] {{bash.py:182}} INFO - 2022-11-13 07:04:57.313 [job-0] INFO HdfsReader$Job - 您即将读取的文件数为: [1], 列表为: [com.alibaba.datax.plugin.reader.hdfsreader.FileInfo@e56d292f]
[2022-11-13 07:04:57,314] {{bash.py:182}} INFO - 2022-11-13 07:04:57.313 [job-0] INFO JobContainer - DataX Writer.Job [doriswriter] do prepare work .
[2022-11-13 07:04:57,612] {{bash.py:182}} INFO - 2022-11-13 07:04:57.611 [job-0] ERROR RetryUtil - Exception when calling callable, 异常Msg:Code:[DBUtilErrorCode-10], Description:[连接数据库失败. 请检查您的 账号、密码、数据库名称、IP、Port或者向 DBA 寻求帮助(注意网络环境).]. - 具体错误信息为:com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Unknown error
[2022-11-13 07:04:57,612] {{bash.py:182}} INFO - com.alibaba.datax.common.exception.DataXException: Code:[DBUtilErrorCode-10], Description:[连接数据库失败. 请检查您的 账号、密码、数据库名称、IP、Port或者向 DBA 寻求帮助(注意网络环境).]. - 具体错误信息为:com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Unknown error
[2022-11-13 07:04:57,612] {{bash.py:182}} INFO - at com.alibaba.datax.common.exception.DataXException.asDataXException(DataXException.java:26) ~[datax-common-0.0.1-SNAPSHOT.jar:na]
[2022-11-13 07:04:57,612] {{bash.py:182}} INFO - at com.alibaba.datax.plugin.rdbms.util.RdbmsException.asConnException(RdbmsException.java:23) ~[plugin-rdbms-util-0.0.1-SNAPSHOT.jar:na]
[2022-11-13 07:04:57,612] {{bash.py:182}} INFO - at com.alibaba.datax.plugin.rdbms.util.DBUtil.connect(DBUtil.java:394) [plugin-rdbms-util-0.0.1-SNAPSHOT.jar:na]
[2022-11-13 07:04:57,612] {{bash.py:182}} INFO - at com.alibaba.datax.plugin.rdbms.util.DBUtil.connect(DBUtil.java:384) [plugin-rdbms-util-0.0.1-SNAPSHOT.jar:na]
[2022-11-13 07:04:57,613] {{bash.py:182}} INFO - at com.alibaba.datax.plugin.rdbms.util.DBUtil.access$000(DBUtil.java:22) [plugin-rdbms-util-0.0.1-SNAPSHOT.jar:na]
[2022-11-13 07:04:57,613] {{bash.py:182}} INFO - at com.alibaba.datax.plugin.rdbms.util.DBUtil$3.call(DBUtil.java:322) ~[plugin-rdbms-util-0.0.1-SNAPSHOT.jar:na]
[2022-11-13 07:04:57,613] {{bash.py:182}} INFO - at com.alibaba.datax.plugin.rdbms.util.DBUtil$3.call(DBUtil.java:319) ~[plugin-rdbms-util-0.0.1-SNAPSHOT.jar:na]
[2022-11-13 07:04:57,613] {{bash.py:182}} INFO - at com.alibaba.datax.common.util.RetryUtil$Retry.call(RetryUtil.java:164) ~[datax-common-0.0.1-SNAPSHOT.jar:na]
[2022-11-13 07:04:57,613] {{bash.py:182}} INFO - at com.alibaba.datax.common.util.RetryUtil$Retry.doRetry(RetryUtil.java:111) ~[datax-common-0.0.1-SNAPSHOT.jar:na]
[2022-11-13 07:04:57,613] {{bash.py:182}} INFO - at com.alibaba.datax.common.util.RetryUtil.executeWithRetry(RetryUtil.java:30)
The moment the blacklist switch is turned on with
ADMIN SET FRONTEND CONFIG ("enable_sql_blacklist" = "true");
DataX errors out even with no rules configured at all.
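For reference, the current value of the switch can be checked on the FE with the standard ADMIN syntax (shown as a sketch, not taken from the reporter's session):

    ADMIN SHOW FRONTEND CONFIG LIKE "enable_sql_blacklist";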
The DataX plugin uses MySQL driver 5.1.34, while the Flink connector uses MySQL driver 5.1.49.
Not sure whether that difference is the cause.
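One possibly relevant detail (an assumption, not verified against these exact driver versions): Connector/J issues its own housekeeping statements while opening a connection, so a blacklist check that mishandles any of them would make the connection itself fail, which matches the DBUtil.connect frame in the stack trace above. An illustrative sketch of the kind of statements issued at connect time:

    -- NOTE: illustrative assumption; the exact set varies between Connector/J versions
    SHOW VARIABLES;                              -- fetch server variables
    SELECT @@session.auto_increment_increment;   -- session variable probe
    SET NAMES utf8;                              -- charset negotiation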
If convenient, please share a DataX job so we can take a look, along with your sqlblacklist.
Does the sqlblacklist apply to all CRUD statements?
Yes, but it only applies to SQL statements.
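For context, a minimal sketch of what rule management looks like once the switch is on (regex-based; the patterns below are made-up examples, not the reporter's rules):

    -- block full scans of a wide table
    ADD SQLBLACKLIST "select \\* from .+";
    -- block expensive distinct counts
    ADD SQLBLACKLIST "select count\\(distinct .+\\) from .+";
    SHOW SQLBLACKLIST;       -- lists rules with their index numbers
    DELETE SQLBLACKLIST 1;   -- removes a rule by index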
{
    "content":[
        {
            "reader":{
                "name":"hdfsreader",
                "parameter":{
                    "column":[
                        {"index":"0","type":"String"}, {"index":"1","type":"String"}, {"index":"2","type":"String"}, {"index":"3","type":"String"}, {"index":"4","type":"String"},
                        {"index":"5","type":"String"}, {"index":"6","type":"String"}, {"index":"7","type":"String"}, {"index":"8","type":"String"}, {"index":"9","type":"String"},
                        {"index":"10","type":"String"}, {"index":"11","type":"String"}, {"index":"12","type":"String"}, {"index":"13","type":"String"}, {"index":"14","type":"String"},
                        {"index":"15","type":"String"}, {"index":"16","type":"String"}, {"index":"17","type":"String"}, {"index":"18","type":"String"}, {"index":"19","type":"String"},
                        {"index":"20","type":"String"}, {"index":"21","type":"String"}, {"index":"22","type":"String"}, {"index":"23","type":"String"}, {"index":"24","type":"String"},
                        {"index":"25","type":"String"}, {"index":"26","type":"String"}, {"index":"27","type":"String"}, {"index":"28","type":"String"}, {"index":"29","type":"String"},
                        {"index":"30","type":"String"}, {"index":"31","type":"String"}, {"index":"32","type":"String"}, {"index":"33","type":"String"}, {"index":"34","type":"String"},
                        {"index":"35","type":"String"}, {"index":"36","type":"String"}, {"index":"37","type":"String"}, {"index":"38","type":"String"}, {"index":"39","type":"String"},
                        {"index":"40","type":"String"}, {"index":"41","type":"String"}, {"index":"42","type":"String"}, {"index":"43","type":"String"}, {"index":"44","type":"String"},
                        {"index":"45","type":"String"}, {"index":"46","type":"String"}, {"index":"47","type":"String"}, {"index":"48","type":"String"}, {"index":"49","type":"String"},
                        {"index":"50","type":"String"}, {"index":"51","type":"String"}, {"index":"52","type":"String"}, {"index":"53","type":"String"}, {"index":"54","type":"String"},
                        {"index":"55","type":"String"}, {"index":"56","type":"String"}, {"index":"57","type":"String"}, {"index":"58","type":"String"}, {"index":"59","type":"String"},
                        {"index":"60","type":"String"}, {"index":"61","type":"String"}, {"index":"62","type":"String"}, {"index":"63","type":"String"}, {"index":"64","type":"String"},
                        {"index":"65","type":"String"}, {"index":"66","type":"String"}, {"index":"67","type":"String"}, {"index":"68","type":"String"}, {"index":"69","type":"String"},
                        {"index":"70","type":"String"}, {"index":"71","type":"String"}, {"index":"72","type":"String"}, {"index":"73","type":"String"}, {"index":"74","type":"String"},
                        {"index":"75","type":"String"}, {"index":"76","type":"String"}, {"index":"77","type":"String"}, {"index":"78","type":"String"}, {"index":"79","type":"String"},
                        {"index":"80","type":"String"}, {"index":"81","type":"String"}, {"index":"82","type":"String"}, {"index":"83","type":"String"}, {"index":"84","type":"String"},
                        {"index":"85","type":"String"}, {"index":"86","type":"String"}, {"index":"87","type":"String"}, {"index":"88","type":"String"}, {"index":"89","type":"String"},
                        {"index":"90","type":"String"}, {"index":"91","type":"String"}, {"index":"92","type":"String"}, {"index":"93","type":"String"}, {"index":"94","type":"String"},
                        {"index":"95","type":"String"}, {"index":"96","type":"String"}, {"index":"97","type":"String"}, {"index":"98","type":"String"}, {"index":"99","type":"String"},
                        {"index":"100","type":"String"}, {"index":"101","type":"String"}, {"index":"102","type":"String"}, {"index":"103","type":"String"}, {"index":"104","type":"String"},
                        {"index":"105","type":"String"}, {"index":"106","type":"String"}, {"index":"107","type":"String"}, {"index":"108","type":"String"}, {"index":"109","type":"String"},
                        {"index":"110","type":"String"}, {"index":"111","type":"String"}, {"index":"112","type":"String"}, {"index":"113","type":"String"}, {"index":"114","type":"String"},
                        {"index":"115","type":"String"}, {"index":"116","type":"String"}, {"index":"117","type":"String"}, {"index":"118","type":"String"}, {"index":"119","type":"String"},
                        {"index":"120","type":"String"}, {"index":"121","type":"String"}, {"index":"122","type":"String"}, {"index":"123","type":"String"}, {"index":"124","type":"String"},
                        {"index":"125","type":"String"}, {"index":"126","type":"String"}, {"index":"127","type":"String"}, {"index":"128","type":"String"}, {"index":"129","type":"String"}
                    ],
                    "compress":"",
                    "defaultFS":"hdfs://nameservice1",
                    "encoding":"utf-8",
                    "excludeRegex":"hdfs://nameservice1/user/hive/warehouse/xxxxxx.db/xxxxxx/_impala_insert_staging",
                    "fieldDelimiter":"\u0001",
                    "fileType":"par",
                    "hadoopConfig":{
                        "dfs.client.failover.proxy.provider.nameservice1":"org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider",
                        "dfs.ha.namenodes.nameservice1":"namenode45,namenode130",
                        "dfs.namenode.rpc-address.nameservice1.namenode130":"xxxxxx:8020",
                        "dfs.namenode.rpc-address.nameservice1.namenode45":"xxxxxx:8020",
                        "dfs.nameservices":"nameservice1"
                    },
                    "haveKerberos":true,
                    "kerberosKeytabFilePath":"/bigdata/xxxxxx/bdmp/keytab/xxxxxx/xxxxxx.keytab",
                    "kerberosPrincipal":"xxxxxx",
                    "nullFormat":"\\N",
                    "path":"/user/hive/warehouse/xxxxxx.db/xxxxxx"
                }
            },
            "writer":{
                "name":"doriswriter",
                "parameter":{
                    "column":[
                        "pt", "yq_code", "yq_name", "factory_code", "factory_name",
                        "id", "base_code", "assign_code", "assign_line", "assign_group_code",
                        "purchase_order_code", "order_code", "order_line", "customer_code", "customer_name",
                        "channel", "delivery_sign", "center_code", "target_center_code", "unloading_point",
                        "sale_code", "payer", "product", "factory", "material_code",
                        "material_name", "material_group", "batch", "amount", "amount_unit",
                        "volume", "volume_unit", "weight", "weight_unit", "length_width_height",
                        "consignment_bill", "consignment_bill_line", "delivery_bill", "delivery_bill_line", "waybill_code",
                        "delivery_address", "delivery_phone", "delivery_contacts", "ptd_pickpoint", "ptd_actual_datetime",
                        "ptd_plan_datetime", "plan_arrive_time", "center_register_time", "center_instock_time", "vehicle_status",
                        "recall_status", "wholecar_flag", "place_flag", "transfer_flag", "delete_flag",
                        "remark", "remarks", "gmt_create", "gmt_modified", "create_by",
                        "last_modified_by", "shortconsign_bill", "shortconsign_bill_line", "library_position", "custom_flag",
                        "custom_cause", "custom_create_date", "custom_creator", "custom_actual_arrive_date", "custom_actual_loaded_date",
                        "custom_done_creator", "custom_done_date", "strative_area_code", "modify_reason", "transform_center_code",
                        "route", "handle_pay_date", "waybill_create_date", "first_center_instock_time", "business_type",
                        "custom_num", "reser_code", "reser_date", "last_update_time", "order_sign",
                        "customer_order_code", "twice_route", "src_etl_date", "trg_etl_date", "dl_etl_date",
                        "plan_shipment_starttime", "plan_shipment_endtime", "thd_register_time", "thd_scanning_starttime", "thd_scanning_endtime",
                        "plan_transport_time", "ptd_pickpoint_name", "wms_code", "xx_qty", "dr_qty",
                        "tm_qty", "2h_qty", "4h_qty", "6h_qty", "ol_pro_qty",
                        "zt_qty", "scan_amount", "avail_stock_amount", "occupy_stock_amount", "total_stock_amount",
                        "sd_stock_amount", "dj_stock_amount", "now_plan_sum_amount", "now_actual_sum_amount", "ptwotwoh_plan_sum_amount",
                        "ptwotwoh_actual_sum_amount", "ptwoh_plan_sum_amount", "ptwoh_actual_sum_amount", "ptwo_plan_sum_amount", "pfour_plan_sum_amount",
                        "psix_plan_sum_amount", "peight_plan_sum_amount", "pten_plan_sum_amount", "ptwelve_plan_sum_amount", "pfourteen_plan_sum_amount",
                        "psixteen_plan_sum_amount", "peighteen_plan_sum_amount", "ptwenty_plan_sum_amount", "ptwentytwo_plan_sum_amount", "ptwentyfour_plan_sum_amount"
                    ],
                    "database":"xxxxxx",
                    "jdbcUrl":"jdbc:mysql://xxxxxx:8888/xxxxxx",
                    "loadProps":{
                        "column_separator":"\\x01",
                        "row_delimiter":"\\x02"
                    },
                    "loadUrl":[
                        ":"
                    ],
                    "password":"*******",
                    "preSql":[
                        "truncate table xxxxxx"
                    ],
                    "table":"xxxxxx",
                    "username":"xxxxxx"
                }
            }
        }
    ],
    "jobname":"F_RRS_VEHICLE@DORIS_DWD_NYC_TPT_ASSIGNMENT_DETAIL_TEST",
    "setting":{
        "errorLimit":{
            "percentage":0.02,
            "record":0
        },
        "speed":{
            "channel":3
        }
    }
}
The DataX job reads data from HDFS and syncs it into a StarRocks table. StarRocks merely has the blacklist feature enabled, with no rules configured at all, and the error occurs (per the stack trace above, the failure is inside DBUtil.connect, i.e. before the preSql even runs). We also tested DataX reading from StarRocks and syncing into StarRocks, and that produced no error.
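Condensed, the reported repro boils down to the following (a hypothetical minimal sequence against a 2.3.4 FE; hosts and ports omitted as in the job above):

    ADMIN SET FRONTEND CONFIG ("enable_sql_blacklist" = "true");
    SHOW SQLBLACKLIST;   -- returns nothing: no rules configured
    -- any DataX job writing via the MySQL protocol now fails at connect with
    -- "MySQLSyntaxErrorException: Unknown error"; after
    ADMIN SET FRONTEND CONFIG ("enable_sql_blacklist" = "false");
    -- the same job succeeds again.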
Has anyone been able to reproduce this?
Hi, please use our official starrockswriter.
We are using the DataX starrockswriter; it is named doriswriter in the job only for compatibility with older jobs. The program actually in use is starrockswriter.
To help us investigate, could you provide the fe.log and be.INFO logs? We will look into it.