Closed
Description
We have an HTTP Cloud Function that does some data processing and then streams the results to BigQuery. The function errors out intermittently, because either the BigQuery client loses its connection or the `insert_rows_json` call itself can't connect.
See below an example stack trace captured in the GCP logs:
File "/layers/google.python.pip/pip/lib/python3.8/site-packages/flask/app.py", line 2447, in wsgi_app
response = self.full_dispatch_request()
File "/layers/google.python.pip/pip/lib/python3.8/site-packages/flask/app.py", line 1952, in full_dispatch_request
rv = self.handle_user_exception(e)
File "/layers/google.python.pip/pip/lib/python3.8/site-packages/flask/app.py", line 1821, in handle_user_exception
reraise(exc_type, exc_value, tb)
File "/layers/google.python.pip/pip/lib/python3.8/site-packages/flask/_compat.py", line 39, in reraise
raise value
File "/layers/google.python.pip/pip/lib/python3.8/site-packages/flask/app.py", line 1950, in full_dispatch_request
rv = self.dispatch_request()
File "/layers/google.python.pip/pip/lib/python3.8/site-packages/flask/app.py", line 1936, in dispatch_request
return self.view_functions[rule.endpoint](**req.view_args)
File "/layers/google.python.functions-framework/functions-framework/lib/python3.8/site-packages/functions_framework/__init__.py", line 66, in view_func
return function(request._get_current_object())
File "/workspace/main.py", line 162, in stream_tax
errors = bq.insert_rows_json(table=dataset_table,
File "/layers/google.python.pip/pip/lib/python3.8/site-packages/google/cloud/bigquery/client.py", line 3013, in insert_rows_json
response = self._call_api(
File "/layers/google.python.pip/pip/lib/python3.8/site-packages/google/cloud/bigquery/client.py", line 636, in _call_api
return call()
File "/layers/google.python.pip/pip/lib/python3.8/site-packages/google/api_core/retry.py", line 281, in retry_wrapped_func
return retry_target(
File "/layers/google.python.pip/pip/lib/python3.8/site-packages/google/api_core/retry.py", line 184, in retry_target
return target()
File "/layers/google.python.pip/pip/lib/python3.8/site-packages/google/cloud/_http.py", line 427, in api_request
response = self._make_request(
File "/layers/google.python.pip/pip/lib/python3.8/site-packages/google/cloud/_http.py", line 291, in _make_request
return self._do_request(
File "/layers/google.python.pip/pip/lib/python3.8/site-packages/google/cloud/_http.py", line 329, in _do_request
return self.http.request(
File "/layers/google.python.pip/pip/lib/python3.8/site-packages/google/auth/transport/requests.py", line 464, in request
response = super(AuthorizedSession, self).request(
File "/layers/google.python.pip/pip/lib/python3.8/site-packages/requests/sessions.py", line 542, in request
resp = self.send(prep, **send_kwargs)
File "/layers/google.python.pip/pip/lib/python3.8/site-packages/requests/sessions.py", line 655, in send
r = adapter.send(request, **kwargs)
File "/layers/google.python.pip/pip/lib/python3.8/site-packages/requests/adapters.py", line 498, in send
raise ConnectionError(err, request=request)
requests.exceptions.ConnectionError: ('Connection aborted.', ConnectionResetError(104, 'Connection reset by peer'))
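The reset happens inside the HTTP call made by `insert_rows_json`. One mitigation is to retry the insert when the connection is reset. This is only a sketch with stand-in names (`retry_on_connection_error`, `flaky_insert` are hypothetical, and the builtin `ConnectionError` stands in for `requests.exceptions.ConnectionError` so the example is self-contained):

```python
import time

def retry_on_connection_error(func, attempts=3, base_delay=1.0):
    # Retry `func` on connection resets with exponential backoff.
    # In the real function, `func` would be a closure over
    # bq.insert_rows_json(table=..., json_rows=...).
    for attempt in range(attempts):
        try:
            return func()
        except ConnectionError:
            if attempt == attempts - 1:
                raise  # out of attempts: propagate the original error
            time.sleep(base_delay * 2 ** attempt)

# Usage: simulate a call that fails once with a reset, then succeeds.
calls = {"n": 0}

def flaky_insert():
    calls["n"] += 1
    if calls["n"] < 2:
        raise ConnectionError("Connection reset by peer")
    return []  # insert_rows_json returns an empty list on success

errors = retry_on_connection_error(flaky_insert, base_delay=0.01)
```

After the retry, `errors` is `[]` and the insert succeeded on the second attempt.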
Notes:
- `bq` in the trace is `bigquery.Client()`, instantiated as a global variable as recommended here: https://cloud.google.com/functions/docs/bestpractices/networking#accessing_google_apis
- The error is logged about 30 seconds after the function is invoked, so it can't be the 60-second default timeout in `_http`.
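Because the client is global, its pooled connection can be a socket the server already closed while the function instance sat idle; the next request then fails with `Connection reset by peer`. A sketch of one mitigation (all names below are stand-ins, not the real client API): drop and rebuild the cached client once when a reset occurs, keeping the global-reuse pattern for the common case.

```python
_client = None

def get_client(factory):
    # Create the client once and cache it at module scope, per the
    # GCP best-practices guide, so connections are reused.
    global _client
    if _client is None:
        _client = factory()
    return _client

def insert_with_refresh(factory, do_insert, rows):
    # Retry once with a fresh client if the cached client's pooled
    # socket was reset while the instance was idle.
    global _client
    try:
        return do_insert(get_client(factory), rows)
    except ConnectionError:
        _client = None  # discard the stale client and its socket pool
        return do_insert(get_client(factory), rows)

# Usage with fakes: the first client fails once, the rebuilt one succeeds.
made = []

def factory():
    obj = object()  # stands in for bigquery.Client()
    made.append(obj)
    return obj

def do_insert(client, rows):
    if client is made[0]:
        raise ConnectionError("Connection reset by peer")
    return []  # stands in for insert_rows_json's empty error list

errors = insert_with_refresh(factory, do_insert, ["row"])
```

Alternatively, `insert_rows_json` accepts a `retry` argument, so passing a `google.api_core.retry.Retry` whose predicate also treats `requests.exceptions.ConnectionError` as retryable may help; the default transient-error predicate does not appear to retry this client-side error (the trace shows it propagating straight through `retry_target`).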
Thoughts?