A common way to approach this problem is to use Airflow and write a DAG that meets your requirements.
But if you want to iterate over tables and dump them to GCS on a regular schedule using only BigQuery, the following could be another option.
1. Export Data
You can export data to GCS with the EXPORT DATA statement in a BigQuery script.
```sql
EXPORT DATA OPTIONS(
  uri='gs://bucket/folder/*.csv',
  format='CSV',
  overwrite=true,
  header=true,
  field_delimiter=';') AS
SELECT field1, field2 FROM mydataset.table1 ORDER BY field1 LIMIT 10
```
2. Loops and Dynamic SQL
If you have a list of tables you want to dump, you can loop over them with a BigQuery FOR loop.
Inside the loop you need to generate the EXPORT DATA statement dynamically for each table, which you can do with EXECUTE IMMEDIATE dynamic SQL.
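Putting the loop and the dynamic SQL together, a minimal sketch could look like the one below. The dataset `mydataset` and bucket `my-bucket` are placeholder names, and querying INFORMATION_SCHEMA.TABLES is just one way to build the table list; you could also loop over a hard-coded array of table names.

```sql
-- Sketch only: `mydataset` and `my-bucket` are placeholders for your
-- own dataset and bucket names.
FOR t IN (
  -- List the tables to export; filter this query to match your needs.
  SELECT table_name
  FROM mydataset.INFORMATION_SCHEMA.TABLES
  WHERE table_type = 'BASE TABLE'
)
DO
  -- Build and run an EXPORT DATA statement per table.
  EXECUTE IMMEDIATE FORMAT("""
    EXPORT DATA OPTIONS(
      uri='gs://my-bucket/%s/*.csv',
      format='CSV',
      overwrite=true,
      header=true,
      field_delimiter=';') AS
    SELECT * FROM mydataset.%s
  """, t.table_name, t.table_name);
END FOR;
```

Note the wildcard `*` in the URI is required, since BigQuery may split the export into multiple files.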
3. Scheduling
BigQuery provides scheduled queries, and since a scheduled query can run a multi-statement script, you can schedule the export script from step 2 to run on a regular basis.
![Scheduled query in the BigQuery console](https://cdn.statically.io/img/cloud.google.com/bigquery/images/new-scheduled-query-top-console.png)