【Question Title】: BigQuery streaming data with insertAll
【Posted】: 2014-09-19 14:57:48
【Question】:

We are implementing a Google Cloud solution and have two questions about insertAll:

  1. Is there a timeout while waiting for a file import?
  2. We ran into the following error while testing the streaming code:

Traceback (most recent call last):

  File "/base/data/home/runtimes/python27/python27_lib/versions/1/google/appengine/runtime/wsgi.py", line 266, in Handle
    result = handler(dict(self._environ), self._StartResponse)
  File "/base/data/home/runtimes/python27/python27_lib/versions/third_party/webapp2-2.5.2/webapp2.py", line 1529, in __call__
    rv = self.router.dispatch(request, response)
  File "/base/data/home/runtimes/python27/python27_lib/versions/third_party/webapp2-2.5.2/webapp2.py", line 1278, in default_dispatcher
    return route.handler_adapter(request, response)
  File "/base/data/home/runtimes/python27/python27_lib/versions/third_party/webapp2-2.5.2/webapp2.py", line 1102, in __call__
    return handler.dispatch()
  File "/base/data/home/runtimes/python27/python27_lib/versions/third_party/webapp2-2.5.2/webapp2.py", line 570, in dispatch
    return method(*args, **kwargs)
  File "/base/data/home/apps/s~silicon-alpha-636/mytest.378795683110553780/oauth2client/appengine.py", line 714, in check_oauth
    resp = method(request_handler, *args, **kwargs)
  File "/base/data/home/apps/s~silicon-alpha-636/mytest.378795683110553780/main.py", line 378, in get
    get_cloud_storage(self, http)
  File "/base/data/home/apps/s~silicon-alpha-636/mytest.378795683110553780/main.py", line 359, in get_cloud_storage
    jsonData = json.dumps(json_row, ensure_ascii = False, sort_keys = True, indent = 4).encode('utf-8')
  File "/base/data/home/runtimes/python27/python27_dist/lib/python2.7/json/__init__.py", line 250, in dumps
    sort_keys=sort_keys, **kw).encode(obj)
  File "/base/data/home/runtimes/python27/python27_dist/lib/python2.7/json/encoder.py", line 209, in encode
    chunks = list(chunks)
  File "/base/data/home/runtimes/python27/python27_dist/lib/python2.7/json/encoder.py", line 434, in _iterencode
    for chunk in _iterencode_dict(o, _current_indent_level):
  File "/base/data/home/runtimes/python27/python27_dist/lib/python2.7/json/encoder.py", line 408, in _iterencode_dict
    for chunk in chunks:
  File "/base/data/home/runtimes/python27/python27_dist/lib/python2.7/json/encoder.py", line 332, in _iterencode_list
    for chunk in chunks:
  File "/base/data/home/runtimes/python27/python27_dist/lib/python2.7/json/encoder.py", line 409, in _iterencode_dict
    yield chunk
DeadlineExceededError

【Question Discussion】:

    Tags: google-app-engine google-bigquery


    【Solution 1】:

    The Python runtime currently has several exceptions named DeadlineExceededError:

    google.appengine.runtime.DeadlineExceededError: raised if the overall request times out, typically after 60 seconds (10 minutes for task queue requests).

    google.appengine.runtime.apiproxy_errors.DeadlineExceededError: raised if an RPC exceeds its deadline. The deadline is typically 5 seconds, but it is settable for some APIs via the 'deadline' option.

    google.appengine.api.urlfetch_errors.DeadlineExceededError: raised if a URLFetch times out.

    Read more: Dealing with DeadlineExceededErrors
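
    As a minimal sketch (the StreamHandler class, the /stream route, and API_URL are hypothetical, not from the question), the overall-request variant can be caught and the per-RPC deadline raised like this:

        import logging
        import webapp2
        from google.appengine.api import urlfetch
        from google.appengine.runtime import DeadlineExceededError

        API_URL = 'https://www.googleapis.com/bigquery/v2/'  # placeholder URL

        class StreamHandler(webapp2.RequestHandler):  # hypothetical handler
            def get(self):
                try:
                    # Raise the per-RPC deadline for an outbound HTTP call;
                    # the urlfetch default is about 5 seconds.
                    result = urlfetch.fetch(API_URL, deadline=60)
                    self.response.write(result.content)
                except DeadlineExceededError:
                    # The 60-second request deadline fired; very little time
                    # is left, so log and return an error response quickly.
                    logging.exception('Overall request deadline exceeded')
                    self.response.set_status(500)
                    self.response.write('Timed out')

        app = webapp2.WSGIApplication([('/stream', StreamHandler)])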

    【Discussion】:

    • Thanks for the answer, Pentium 10. I have read that link, and I think my error is probably google.appengine.runtime.DeadlineExceededError, but I still don't know how to fix it in my code.
    • insertAll accepts JSON as the body of the payload, not a file; make sure there is no confusion there. On our side, we see response times of about 2 seconds for these calls, which is not excessive. If you are still blocked, check your firewall and the API URLs; the timeout may come from there. (A sketch of such a call is shown below.)
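
    As a minimal sketch of that payload shape (the project, dataset, and table identifiers are hypothetical placeholders), a streaming insert via the google-api-python-client of that era looks roughly like this:

        import uuid
        from apiclient.discovery import build  # google-api-python-client

        # Hypothetical identifiers, for illustration only.
        PROJECT_ID = 'my-project'
        DATASET_ID = 'my_dataset'
        TABLE_ID = 'my_table'

        def stream_row(http, row_dict):
            """Streams a single row into BigQuery with tabledata().insertAll()."""
            bigquery = build('bigquery', 'v2', http=http)
            body = {
                'kind': 'bigquery#tableDataInsertAllRequest',
                'rows': [{
                    # insertId lets BigQuery de-duplicate retried rows.
                    'insertId': str(uuid.uuid4()),
                    # The row itself is a plain dict, not a serialized file.
                    'json': row_dict,
                }],
            }
            response = bigquery.tabledata().insertAll(
                projectId=PROJECT_ID,
                datasetId=DATASET_ID,
                tableId=TABLE_ID,
                body=body).execute()
            # Per-row failures, if any, come back in 'insertErrors'.
            return response.get('insertErrors')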