[Posted]: 2017-06-27 22:42:37
[Problem description]:
I have a seemingly simple scenario where I'm using Python Dataflow to query data from BigQuery.
When the BigQuery query returns zero rows, I hit an AssertionError; the script and the assertion error are shown below. I'm wondering whether this is a bug, or whether there is a recommended way to handle zero rows from the BigQuery reader in Python Dataflow.
Dataflow script:
import argparse
import apache_beam as beam
from apache_beam.io import WriteToText
from apache_beam.options.pipeline_options import PipelineOptions, SetupOptions
from apache_beam.typehints import Any, Dict

# Minimal argument parsing; the original post elides this setup.
parser = argparse.ArgumentParser()
parser.add_argument('--output', required=True)
known_args, pipeline_args = parser.parse_known_args()

pipeline_options = PipelineOptions(pipeline_args)
pipeline_options.view_as(SetupOptions).save_main_session = True
p = beam.Pipeline(options=pipeline_options)

BIGQUERY_ROW_TYPE = Dict[str, Any]  # each row arrives as a dict keyed by column name
query_sql = Query().build_sql()  # Query is the poster's own SQL-building helper

lines = p \
    | 'read from bigquery' >> beam.io.Read(beam.io.BigQuerySource(
        query=query_sql, validate=True)).with_output_types(BIGQUERY_ROW_TYPE) \
    | 'write to test' >> WriteToText(known_args.output)

result = p.run()
Error seen when the query returns zero rows:
(98b5a6e4c0cd002e): Traceback (most recent call last):
File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 581, in do_work
work_executor.execute()
File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 166, in execute
op.start()
File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/native_operations.py", line 48, in start
for value in reader:
File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/nativefileio.py", line 186, in __iter__
for eof, record, delta_offset in self.read_records():
File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/nativeavroio.py", line 102, in read_records
assert block.num_records() > 0
AssertionError
2017-06-27 (13:55:58) Workflow failed.
(bb74ab934e658b06): Workflow failed. Causes: (7390b72dc5ceedb6): S04:read from bigquery+write to test/Write/WriteImpl/WriteBundles/Do+write to test/Write/WriteImpl/Pair+write to test/Write/WriteImpl/WindowInto(WindowIntoFn)+write to test/Write/WriteImpl/GroupByKey/Reify+write to test/Write/WriteImpl/GroupByKey/Write failed.
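Until a fix lands in the worker, one way to avoid the crash is to check for rows before launching the job. A minimal sketch, not from the original post: it assumes the google-cloud-bigquery client library is installed and the query is standard SQL, and query_returns_rows is a hypothetical helper.

import logging
from google.cloud import bigquery

def query_returns_rows(sql):
    # Wrap the query in a COUNT so only a single scalar row comes back.
    # Assumption: standard SQL; a legacy-SQL query needs a different wrapper.
    client = bigquery.Client()
    count_job = client.query('SELECT COUNT(1) AS n FROM ({})'.format(sql))
    return list(count_job.result())[0].n > 0

# Only launch the Dataflow job when there is data to read.
if query_returns_rows(query_sql):
    result = p.run()
else:
    logging.info('Query returned no rows; skipping the Dataflow job.')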
[Comments]:
-
Do you have the actual query?
-
I believe this is a bug in the Python Dataflow worker. The Avro spec does not require blocks to have a non-zero record count, but the worker incorrectly enforces that requirement. We are working on a fix.
-
@jkff Any news on this?
-
This is a bug. A fix will be available in a few weeks.
-
@Pablo Has this been resolved? I'm running code from a Datalab VM that builds a pipeline with Dataflow. I get this exact error regardless of my LIMIT size. I also double-checked my query in BigQuery, and it does return data.
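Another mitigation sometimes used for this kind of empty-export failure (a hedged sketch, not from this thread): union a sentinel row into the query so the BigQuery export always contains at least one record, then drop the sentinel inside the pipeline. The id/name schema and the sentinel values below are hypothetical.

import apache_beam as beam

# Hypothetical schema (id INT64, name STRING); adjust the column list
# and sentinel values to match the real query.
guarded_sql = '''
SELECT id, name FROM ({original})
UNION ALL
SELECT -1 AS id, '__sentinel__' AS name
'''.format(original=query_sql)

rows = (
    p
    | 'read from bigquery' >> beam.io.Read(beam.io.BigQuerySource(
        query=guarded_sql, use_standard_sql=True))
    | 'drop sentinel' >> beam.Filter(lambda row: row['name'] != '__sentinel__')
)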
Tags: python google-cloud-dataflow