Posted: 2018-11-26 03:40:07
Problem description:
For a simple proof of concept, I'm trying to display click data in two-minute windows. All I want to do is print each window's count, along with the window boundaries, to BigQuery. When I run my pipeline, I keep getting the following error:
org.apache.beam.sdk.Pipeline$PipelineExecutionException: java.lang.RuntimeException: java.io.IOException: Insert failed: [{"errors":[{"debugInfo":"","location":"windowend","message":"This field is not a record.","reason":"invalid"}],"index":0}]
The pipeline looks like this:
// Create the pipeline
Pipeline p = Pipeline.create(options);

// Read, timestamp, window, and count the items
PCollection<TableRow> counts = p
    .apply("ReadFromPubSub", PubsubIO.readStrings().fromTopic(options.getTopic()))
    .apply("AddEventTimestamps", WithTimestamps.of(TotalCountPipeline::ExtractTimeStamp)
        .withAllowedTimestampSkew(Duration.standardDays(10000)))
    .apply("Window", Window.<String>into(
            FixedWindows.of(Duration.standardHours(options.getWindowSize())))
        .triggering(AfterWatermark.pastEndOfWindow()
            .withLateFirings(AfterPane.elementCountAtLeast(1)))
        .withAllowedLateness(Duration.standardDays(10000))
        .accumulatingFiredPanes())
    .apply("CalculateSum", Combine.globally(Count.<String>combineFn()).withoutDefaults())
    .apply("BigQueryFormat", ParDo.of(new FormatCountsFn()));

// Write to BigQuery
counts.apply("WriteToBigQuery", BigQueryIO.writeTableRows()
    .to(options.getOutputTable())
    .withSchema(getSchema())
    .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND));

// Execute the pipeline
p.run().waitUntilFinish();
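I haven't included getSchema() above; it builds a table schema roughly equivalent to the following JSON (a sketch from memory, not the exact code — the error message mentions a windowend field, so the real schema also has a timestamp column for the window end):

```json
[
  {"name": "windowStart", "type": "TIMESTAMP", "mode": "NULLABLE"},
  {"name": "windowEnd",   "type": "TIMESTAMP", "mode": "NULLABLE"},
  {"name": "count",       "type": "INTEGER",   "mode": "NULLABLE"}
]
```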
I'm guessing it has to do with the BigQuery formatting function, which is implemented as follows:
static class FormatCountsFn extends DoFn<Long, TableRow> {
    @ProcessElement
    public void processElement(ProcessContext c, BoundedWindow window) {
        TableRow row = new TableRow()
            .set("windowStart", window.maxTimestamp().toDateTime())
            .set("count", c.element().intValue());
        c.output(row);
    }
}
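My suspicion is that setting a Joda-Time DateTime object directly on the TableRow makes it serialize as a nested object rather than as a timestamp string, which would match the "This field is not a record" complaint from BigQuery. If that's right, the streaming-insert payload would look something like this (a sketch of the JSON I suspect is being sent — I haven't captured the actual request):

```json
{
  "windowStart": {"iMillis": 1543113600000, "iChronology": {"...": "..."}},
  "count": 42
}
```

whereas BigQuery presumably expects a scalar value like "2018-11-25T00:00:00.000Z" for a TIMESTAMP column.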
Inspired by this post. Can anyone shed some light on this? I can't seem to figure it out.
Tags: java google-bigquery google-cloud-dataflow apache-beam