【Posted】: 2016-12-29 08:11:26
【Problem description】:
So far, the examples I have found stream JSON into BQ, e.g. https://cloud.google.com/bigquery/streaming-data-into-bigquery
How can I stream a CSV file, or any other file type, into BQ? Below is the streaming code block; the issue seems to be in insert_all_data, where 'rows' is defined as JSON. Thanks.
# [START stream_row_to_bigquery]
def stream_row_to_bigquery(bigquery, project_id, dataset_id, table_name, row,
                           num_retries=5):
    insert_all_data = {
        'rows': [{
            'json': row,
            # Generate a unique id for each row so retries don't accidentally
            # duplicate insert
            'insertId': str(uuid.uuid4()),
        }]
    }
    return bigquery.tabledata().insertAll(
        projectId=project_id,
        datasetId=dataset_id,
        tableId=table_name,
        body=insert_all_data).execute(num_retries=num_retries)
# [END stream_row_to_bigquery]
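Since the tabledata().insertAll endpoint only accepts row data as JSON objects, one common approach is to parse the CSV client-side into dicts (e.g. with csv.DictReader) and feed those dicts into the same 'json' field. A minimal sketch of that conversion step, assuming the CSV has a header row (the helper name csv_to_insert_all_body is hypothetical, not part of any API):

```python
import csv
import io
import uuid

def csv_to_insert_all_body(csv_text):
    """Convert CSV text (first line = header) into an insertAll request body.

    Each CSV row becomes a dict keyed by the header columns, which is
    exactly the shape the 'json' field of insertAll expects.
    """
    reader = csv.DictReader(io.StringIO(csv_text))
    return {
        'rows': [{
            'json': dict(row),
            # Unique id per row so retries don't duplicate inserts
            'insertId': str(uuid.uuid4()),
        } for row in reader]
    }
```

The resulting body can then be passed to bigquery.tabledata().insertAll(...) exactly as insert_all_data is in stream_row_to_bigquery above. Note that DictReader leaves every value as a string; if the target table has INTEGER or FLOAT columns, convert those values before inserting.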
【Discussion】:
Tags: python streaming google-bigquery