I have a routine that returns 2900 rows daily. This information is sent to BigQuery. Is there a way to send today's 2900 rows to a BigQuery table and then append tomorrow's 2900 rows to that same table?
This question is very broad and not reproducible, but I will offer my two cents. There is a cloud service called Stitchdata that integrates data sources with a data warehouse such as BigQuery.
For example, I have an API that writes its usage logs to a Postgres database. This data is important for later analysis, so every 30 minutes I send the new rows to a table in BigQuery. To work in this incremental mode, simply add an incremental field to the table, such as an ID, or a field with temporal information, such as a timestamp, as sketched below.
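Here is a minimal sketch of the append-plus-watermark pattern using the `google-cloud-bigquery` Python client. The project/dataset/table path and the `created_at` column are hypothetical placeholders; adjust them to your own schema. The key idea is `WRITE_APPEND`, which adds new rows without touching the rows loaded on previous days, plus a `MAX()` query so the next run only fetches rows newer than what is already loaded.

```python
from google.cloud import bigquery

# Hypothetical identifiers: replace with your own project, dataset,
# table, and timestamp column.
TABLE_ID = "my-project.my_dataset.daily_rows"

client = bigquery.Client()

def append_rows(rows):
    """Append a batch of rows (a list of dicts) to the table.

    WRITE_APPEND adds the new rows on top of whatever is already
    there, so each daily run grows the table instead of replacing it.
    """
    job_config = bigquery.LoadJobConfig(
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    )
    job = client.load_table_from_json(rows, TABLE_ID, job_config=job_config)
    job.result()  # block until the load job finishes

def latest_watermark():
    """Return the newest timestamp already in BigQuery, so the source
    query can select only rows created after this value."""
    sql = f"SELECT MAX(created_at) AS watermark FROM `{TABLE_ID}`"
    row = next(iter(client.query(sql).result()))
    return row.watermark

# Example daily run: fetch rows newer than the watermark from your
# source (Postgres, an API, etc.), then append them.
todays_rows = [
    {"id": 1, "created_at": "2023-05-01T00:00:00Z", "value": 42},
]
append_rows(todays_rows)
```

Using a load job (as above) is free and suits a once-a-day batch of ~2900 rows; `client.insert_rows_json()` is the streaming alternative if you need the rows queryable within seconds, at a small cost per insert.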