We used routine load to ingest 200+ million rows from Kafka into Doris, but more than 2 million rows ended up missing, and the logs show nothing abnormal. Has anyone run into this?
- What Doris version are you on?
- Please paste the routine load creation statement.
- Please paste the output of SHOW ROUTINE LOAD (commands sketched below).
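For reference, a minimal sketch of how to pull that information (`<job_name>` is a placeholder for your routine load job's name):

-- Doris version: the Version column of the FE list
SHOW FRONTENDS;
-- Job definition, current state and statistics for one job:
SHOW ROUTINE LOAD FOR <job_name>\G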
1. Doris version: 1.2.6
2. Routine load statement:
CREATE ROUTINE LOAD iot_detail_routine_load2 ON iot_detail2
COLUMNS(
vt, valueTime = FROM_UNIXTIME(vt), devId, dataCode, orgId, devType, gatewayType, timeInterval, Villageid, CustomerId, Devstationid, val
)
PROPERTIES (
"desired_concurrent_number" = "3",
"max_error_number" = "0",
"max_batch_interval" = "60",
"max_batch_rows" = "200000",
"max_batch_size" = "209715200",
"format" = "json",
"jsonpaths" = "[\"$.valueTime\",\"$.devId\",\"$.dataCode\",\"$.orgId\",\"$.devType\",\"$.gatewayType\",\"$.timeInterval\",\"$.Villageid\",\"$.CustomerId\",\"$.Devstationid\",\"$.val\"]",
"strip_outer_array" = "false",
"num_as_string" = "false",
"fuzzy_parse" = "false",
"strict_mode" = "true",
"timezone" = "Asia/Shanghai"
)
FROM KAFKA (
"kafka_broker_list" = "192.168.0.14:9092,192.168.0.15:9092,192.168.0.16:9092",
"kafka_topic" = "iot-detail-test",
"property.group.id" = "iot_detail_routine_load2_6bfc05c2-048b-4b11-ba00-fc2553cdc57f",
"kafka_partitions" = "0, 1, 2",
"kafka_offsets" = "270000000, 270000000, 270000000"
);
3. SHOW ROUTINE LOAD:
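For what it's worth, reconciling what the job thinks it loaded against what the table actually holds usually narrows this down. A minimal sketch using the names from the statement above (the Statistic field names are assumed from the usual SHOW ROUTINE LOAD output):

-- Rows the job counts as loaded / filtered; Statistic is a JSON string,
-- roughly {"totalRows":..., "loadedRows":..., "errorRows":..., "unselectedRows":...}
SHOW ROUTINE LOAD FOR iot_detail_routine_load2\G
-- Rows actually visible in the table:
SELECT COUNT(*) FROM iot_detail2;

If loadedRows matches the Kafka side but COUNT(*) is lower, the rows were accepted by the load and then merged away inside the table, which points at the table model rather than the ingestion.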
Take a look at the Doris CREATE TABLE statement for the table. Could rows with the same key be overwriting each other?
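To illustrate that point (a hypothetical schema, not the actual table): on the Unique Key model, rows sharing the same key columns are merged and only the latest version is kept, so the visible row count can be well below the number of rows ingested without any error being logged.

-- Hypothetical example only; check the real model with: SHOW CREATE TABLE iot_detail2;
CREATE TABLE iot_detail2_demo (
    devId    VARCHAR(64),
    dataCode VARCHAR(64),
    vt       BIGINT,
    val      DOUBLE
)
UNIQUE KEY (devId, dataCode, vt)
DISTRIBUTED BY HASH(devId) BUCKETS 10
PROPERTIES ("replication_num" = "1");

-- Two source rows with the same (devId, dataCode, vt) end up as one visible row:
INSERT INTO iot_detail2_demo VALUES ('dev1', 'temp', 1700000000, 1.0);
INSERT INTO iot_detail2_demo VALUES ('dev1', 'temp', 1700000000, 2.0);
SELECT * FROM iot_detail2_demo;  -- one row, val = 2.0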