【Question Title】: Beam: writing per-window element count with window boundaries
【Posted】: 2018-11-26 03:40:07
【Question Description】:

For a simple proof of concept, I am trying to display click data in two-minute windows. All I want to do is write the count for each window, along with the window boundaries, to BigQuery. When running my pipeline, I keep getting the following error:

org.apache.beam.sdk.Pipeline$PipelineExecutionException: java.lang.RuntimeException: java.io.IOException: Insert failed: [{"errors":[{"debugInfo":"","location":"windowend","message":"This field is not a record.","reason":"invalid"}],"index":0}]

The pipeline looks like this:

// Creating the pipeline
Pipeline p = Pipeline.create(options);

// Window items
PCollection<TableRow> counts = p
        .apply("ReadFromPubSub", PubsubIO.readStrings().fromTopic(options.getTopic()))
        .apply("AddEventTimestamps", WithTimestamps.of(TotalCountPipeline::ExtractTimeStamp)
                .withAllowedTimestampSkew(Duration.standardDays(10000)))
        .apply("Window", Window.<String>into(
                FixedWindows.of(Duration.standardHours(options.getWindowSize())))
                .triggering(
                        AfterWatermark.pastEndOfWindow()
                                .withLateFirings(AfterPane.elementCountAtLeast(1)))
                .withAllowedLateness(Duration.standardDays(10000))
                .accumulatingFiredPanes())
        .apply("CalculateSum", Combine.globally(Count.<String>combineFn()).withoutDefaults())
        .apply("BigQueryFormat", ParDo.of(new FormatCountsFn()));

// Writing to BigQuery
counts.apply("WriteToBigQuery", BigQueryIO.writeTableRows()
        .to(options.getOutputTable())
        .withSchema(getSchema())
        .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND));

// Execute pipeline
p.run().waitUntilFinish();

I am guessing it is related to the BigQuery formatting function, implemented as follows:

static class FormatCountsFn extends DoFn<Long, TableRow> {
    @ProcessElement
    public void processElement(ProcessContext c, BoundedWindow window) {
        TableRow row =
                new TableRow()
                        .set("windowStart", window.maxTimestamp().toDateTime())
                        .set("count", c.element().intValue());
        c.output(row);
    }
}

Inspired by this post. Can anyone shed some light on this? I can't seem to figure it out.

【Question Comments】:

    Tags: java google-bigquery google-cloud-dataflow apache-beam


    【Solution 1】:

    As it turns out, the answer to this has nothing to do with Beam windowing, only with BigQuery. Writing a timestamp to a BigQuery row requires a properly formatted yyyy-MM-dd HH:mm:ss string, as opposed to the DateTime object I was providing.
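    To illustrate the fix, here is a minimal, self-contained sketch of formatting a window boundary into the `yyyy-MM-dd HH:mm:ss` UTC string that BigQuery expects, which could then be passed to `TableRow.set(...)` instead of the `DateTime` object. It uses `java.time` so it runs standalone; inside an actual Beam `DoFn` one would format the Joda-Time `Instant` returned by `window.maxTimestamp()` the same way (e.g. with Joda's `DateTimeFormat.forPattern("yyyy-MM-dd HH:mm:ss").withZoneUTC().print(...)`). The class and method names below are illustrative, not from the original post.

    ```java
    import java.time.Instant;
    import java.time.ZoneOffset;
    import java.time.format.DateTimeFormatter;

    public class WindowTimestampFormat {
        // BigQuery accepts timestamps as "yyyy-MM-dd HH:mm:ss" strings (UTC).
        private static final DateTimeFormatter BQ_TIMESTAMP =
                DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss")
                        .withZone(ZoneOffset.UTC);

        // Format epoch millis (as returned by window.maxTimestamp().getMillis()
        // in a Beam DoFn) into a BigQuery-compatible timestamp string.
        static String toBigQueryTimestamp(long epochMillis) {
            return BQ_TIMESTAMP.format(Instant.ofEpochMilli(epochMillis));
        }

        public static void main(String[] args) {
            // The epoch start formats to the expected BigQuery string.
            System.out.println(toBigQueryTimestamp(0L)); // 1970-01-01 00:00:00
        }
    }
    ```

    With this, the row would be built as `.set("windowStart", toBigQueryTimestamp(window.maxTimestamp().getMillis()))` rather than setting the raw `DateTime`.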

    【Discussion】:
