【Posted at】: 2021-02-14 09:27:15
【Problem description】:
I am trying to understand how to use withTimestampAssigner() in the WatermarkStrategy of a Kafka source. The "time" I need to use is in the message payload.
For this I have the following code:
FlinkKafkaConsumer<Event> kafkaData =
        new FlinkKafkaConsumer("CorID_0", new EventDeserializationSchema(), p);

kafkaData.assignTimestampsAndWatermarks(
        WatermarkStrategy
                .forMonotonousTimestamps()
                .withTimestampAssigner(Event, Event.time));

DataStream<Event> stream = env.addSource(kafkaData);
The EventDeserializationSchema() looks like this:
public class EventDeserializationSchema implements DeserializationSchema<Event> {

    private static final long serialVersionUID = 1L;

    private static final CsvSchema schema = CsvSchema.builder()
            .addColumn("firstName")
            .addColumn("lastName")
            .addColumn("age", CsvSchema.ColumnType.NUMBER)
            .addColumn("time")
            .build();

    private static final ObjectMapper mapper = new CsvMapper();

    @Override
    public Event deserialize(byte[] message) throws IOException {
        return mapper.readerFor(Event.class).with(schema).readValue(message);
    }

    @Override
    public boolean isEndOfStream(Event nextElement) {
        return false;
    }

    @Override
    public TypeInformation<Event> getProducedType() {
        return TypeInformation.of(Event.class);
    }
}
And the Event:
import java.io.Serializable;

public class Event implements Serializable {

    public String firstName;
    public String lastName;
    private int age;
    public String time;

    public Event() {
    }

    public String getFirstName() {
        return firstName;
    }

    public void setFirstName(String firstName) {
        this.firstName = firstName;
    }

    public String getLastName() {
        return lastName;
    }

    public void setLastName(String lastName) {
        this.lastName = lastName;
    }

    public int getAge() {
        return age;
    }

    public void setAge(int age) {
        this.age = age;
    }

    public String getTime() {
        return time;
    }

    public void setTime(String time) {
        this.time = time;
    }
}
What I want to understand is how to feed the time to withTimestampAssigner():
.withTimestampAssigner(???))
The variable should be Event.time, but from the Flink documentation page I don't quite get how.
I have been looking at kafka flink timestamp Event time and watermark, which confuses me a bit because I can't tell whether the solution in my case is very simple or whether I need additional context. All the examples I have found either use .forBoundedOutOfOrderness() or target earlier versions of Flink, where the implementation is different.
Thanks!
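For context on what the assigner needs: in Flink 1.11+, withTimestampAssigner() takes a SerializableTimestampAssigner<Event> lambda of the shape (element, recordTimestamp) -> long, so the String time field has to be converted to epoch milliseconds first. A minimal sketch of that conversion, assuming the payload carries ISO-8601 timestamps (the actual format is not shown above, so that is an assumption):

```java
import java.time.Instant;

public class EventTimeHelper {
    // Converts the Event's String "time" field into the epoch-millisecond
    // long that a Flink timestamp assigner must return.
    // Assumes ISO-8601 input such as "2021-02-14T09:27:15Z" (an assumption;
    // adjust the parsing to the real payload format).
    public static long toEpochMillis(String time) {
        return Instant.parse(time).toEpochMilli();
    }
}
```

With a helper like this, the assigner could then be written as .withTimestampAssigner((event, recordTimestamp) -> EventTimeHelper.toEpochMillis(event.getTime())).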
【Discussion】:
Tags: apache-kafka flink-streaming