Many thanks for this and I look forward to the blog post :)
I have just seen your slides, so perhaps you covered this, but I have a case with 2.5 million records in the BigQuery table that I think will cause out-of-memory issues. That solution uses Dataiku since it's within an existing setup, but I'd like it to be able to work with Workflows too. For that I think possible modifications would be:
- Export the BigQuery table in chunks of 500 rows each, which matches the limit for a Firestore batched write. I think this may be possible with Workflows iteration? https://cloud.google.com/workflows/docs/reference/syntax/iteration#yaml (a Python sketch of the chunking logic follows this list)
- Then have each export file that lands in Cloud Storage start a Cloud Function, via a Cloud Storage trigger, which imports that chunk into Firestore (see the second sketch after this list)
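The chunking itself would be driven from Workflows iteration as you suggest; just to make the per-chunk logic concrete, here is a minimal Python sketch of the same idea using the BigQuery and Cloud Storage client libraries. The table ID, bucket name, and object naming are assumptions, not anything from your setup:

```python
import json

from google.cloud import bigquery, storage

CHUNK_SIZE = 500                                 # Firestore batched writes allow at most 500 operations
TABLE_ID = "my-project.my_dataset.my_table"      # hypothetical source table
BUCKET_NAME = "my-export-bucket"                 # hypothetical export bucket

bq = bigquery.Client()
gcs = storage.Client()
bucket = gcs.bucket(BUCKET_NAME)

table = bq.get_table(TABLE_ID)
offset = 0
chunk_index = 0

while offset < table.num_rows:
    # Read one 500-row page of the table; only this page is held in memory.
    rows = bq.list_rows(table, start_index=offset, max_results=CHUNK_SIZE)
    # Serialise the page as newline-delimited JSON, one record per line.
    lines = "\n".join(json.dumps(dict(row.items()), default=str) for row in rows)
    # Each uploaded object will later fire the Cloud Storage trigger.
    bucket.blob(f"exports/chunk-{chunk_index:06d}.json").upload_from_string(
        lines, content_type="application/json"
    )
    offset += CHUNK_SIZE
    chunk_index += 1
```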
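And a minimal sketch of the import side, assuming a first-generation Python Cloud Function fired by an object-finalize trigger on that bucket, newline-delimited JSON files of at most 500 rows, and a hypothetical `id` field used as the Firestore document ID:

```python
import json

from google.cloud import firestore, storage

COLLECTION = "records"   # hypothetical target collection
ID_FIELD = "id"          # hypothetical field used as the Firestore document ID

db = firestore.Client()
gcs = storage.Client()


def import_chunk(event, context):
    """Triggered by a Cloud Storage object-finalize event; imports one <=500-row export file."""
    blob = gcs.bucket(event["bucket"]).blob(event["name"])
    lines = blob.download_as_text().splitlines()

    # One batched write per file; Firestore allows up to 500 operations per batch,
    # which is why the export step keeps each file to 500 rows.
    batch = db.batch()
    for line in lines:
        record = json.loads(line)
        doc_ref = db.collection(COLLECTION).document(str(record[ID_FIELD]))
        batch.set(doc_ref, record)
    batch.commit()
```

Keying the documents on a deterministic ID would also make the import idempotent, so a retried trigger just overwrites the same documents.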
What do you think?