BigQuery hosts a number of public datasets that are available for anyone to query. In this tutorial, you create a workflow that runs multiple BigQuery query jobs in parallel, demonstrating an improvement in performance compared to running the jobs serially, one after the other.
Run a BigQuery query job
In BigQuery, you can run an interactive (on-demand) query job. For more information, see Running interactive and batch query jobs.
Console
In the Google Cloud console, go to the BigQuery page.
Enter the following BigQuery SQL query in the Query editor text area:
SELECT TITLE, SUM(views) FROM `bigquery-samples.wikipedia_pageviews.201207h` GROUP BY TITLE ORDER BY SUM(views) DESC LIMIT 100
Click Run.
bq
In your terminal, enter the following bq query command to run an interactive query using standard SQL syntax:
bq query \
--use_legacy_sql=false \
'SELECT
  TITLE, SUM(views)
FROM
  `bigquery-samples.wikipedia_pageviews.201207h`
GROUP BY
  TITLE
ORDER BY
  SUM(views) DESC
LIMIT 100'
This runs a query that returns the top 100 Wikipedia titles with the most views in a specific month and writes the output to a temporary table.
Note how long the query takes to run.
Deploy a workflow that runs multiple queries serially
A workflow definition is made up of a series of steps described using the Workflows syntax. After creating a workflow, you deploy it to make it available for execution. The deploy step also validates that the source file can be executed.
The following workflow defines a list of five tables to run a query against using the Workflows BigQuery connector. The queries are run serially, one after the other, and the most viewed titles from each table are saved to a results map.
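A minimal sketch of such a workflow, assuming the googleapis.bigquery.v2.jobs.query connector and the same bigquery-samples.wikipedia_pageviews tables queried above, might look like the following (step and variable names such as runQuery and queryResult are illustrative, not the exact sample source):

main:
    steps:
    - init:
        assign:
            # Map that will hold the top title per table, keyed by table name
            - results: {}
            - tables: ["201201h", "201202h", "201203h", "201204h", "201205h"]
    - runQueries:
        for:
            value: table
            in: ${tables}
            steps:
            - runQuery:
                # Run the query for this table with the BigQuery connector
                call: googleapis.bigquery.v2.jobs.query
                args:
                    projectId: ${sys.get_env("GOOGLE_CLOUD_PROJECT_ID")}
                    body:
                        useLegacySql: false
                        query: ${"SELECT TITLE, SUM(views) FROM `bigquery-samples.wikipedia_pageviews." + table + "` GROUP BY TITLE ORDER BY SUM(views) DESC LIMIT 100"}
                result: queryResult
            - saveResult:
                # Keep the most viewed title from this table (first row of the ordered result)
                assign:
                    - results[table]: {}
                    - results[table].title: ${queryResult.rows[0].f[0].v}
                    - results[table].views: ${queryResult.rows[0].f[1].v}
    - returnResults:
        return: ${results}

Each loop iteration waits for its query to finish before the next table is processed, which is why the total running time grows roughly linearly with the number of tables.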
Console
In the Google Cloud console, go to the Workflows page:
Click Create.
Enter a name for the new workflow, such as workflow-serial-bqjobs.
Choose an appropriate region; for example, us-central1.
Select the service account you previously created.
You should have already granted both the BigQuery > BigQuery Job User and Logging > Logs Writer IAM roles to the service account.
Click Next.
In the workflow editor, enter your workflow definition (see the serial workflow sketch above):
Click Deploy.
gcloud
Open a terminal and create a source code file for your workflow:
touch workflow-serial-bqjobs.yaml
Copy your workflow definition (see the serial workflow sketch above) into the source code file:
Deploy the workflow by entering the following command:
gcloud workflows deploy workflow-serial-bqjobs \
    --source=workflow-serial-bqjobs.yaml \
    --service-account=MY_SERVICE_ACCOUNT@MY_PROJECT.iam.gserviceaccount.com
Replace MY_SERVICE_ACCOUNT@MY_PROJECT.iam.gserviceaccount.com with the email of the service account you previously created.
You should have already granted both the roles/bigquery.jobUser and roles/logging.logWriter IAM roles to the service account.
Execute the workflow and run multiple queries serially
Executing a workflow runs the current workflow definition associated with the workflow.
Console
In the Google Cloud console, go to the Workflows page:
On the Workflows page, select the workflow-serial-bqjobs workflow to go to its details page.
On the Workflow Details page, click Execute.
Click Execute again.
View the results of the workflow in the Output pane.
gcloud
Open a terminal.
Execute the workflow:
gcloud workflows run workflow-serial-bqjobs
The workflow execution should take approximately one minute, or roughly five times the running time of the single query you ran earlier. The result includes an entry for each table and looks similar to the following:
{
  "201201h": {
    "title": "Special:Search",
    "views": "14591339"
  },
  "201202h": {
    "title": "Special:Search",
    "views": "132765420"
  },
  "201203h": {
    "title": "Special:Search",
    "views": "123316818"
  },
  "201204h": {
    "title": "Special:Search",
    "views": "116830614"
  },
  "201205h": {
    "title": "Special:Search",
    "views": "131357063"
  }
}
Deploy and execute a workflow that runs multiple queries in parallel
Instead of running five queries sequentially, you can run the queries in parallel by making a few changes:
- runQueries:
    parallel:
        shared: [results]
        for:
            value: table
            in: ${tables}
- A parallel step allows each iteration of the for loop to run in parallel.
- The results variable is declared as shared, which makes it writable by every branch so that the result of each branch can be appended to it. A sketch of the resulting runQueries step follows this list.
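Under the same assumptions as the serial sketch, the parallel version of the runQueries step might look like this; the init and return steps are unchanged:

- runQueries:
    parallel:
        # The shared declaration lets every branch write into the results map
        shared: [results]
        for:
            value: table
            in: ${tables}
            steps:
            - runQuery:
                call: googleapis.bigquery.v2.jobs.query
                args:
                    projectId: ${sys.get_env("GOOGLE_CLOUD_PROJECT_ID")}
                    body:
                        useLegacySql: false
                        query: ${"SELECT TITLE, SUM(views) FROM `bigquery-samples.wikipedia_pageviews." + table + "` GROUP BY TITLE ORDER BY SUM(views) DESC LIMIT 100"}
                result: queryResult
            - saveResult:
                # queryResult stays local to the branch; only results is shared
                assign:
                    - results[table]: {}
                    - results[table].title: ${queryResult.rows[0].f[0].v}
                    - results[table].views: ${queryResult.rows[0].f[1].v}

Because each iteration now runs in its own branch, the five queries execute concurrently, so the overall execution time is roughly that of the slowest single query rather than the sum of all five.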
Console
In the Google Cloud console, go to the Workflows page:
Click Create.
Enter a name for the new workflow, such as workflow-parallel-bqjobs.
Choose an appropriate region; for example, us-central1.
Select the service account you previously created.
Click Next.
In the workflow editor, enter your parallel workflow definition (see the sketch above):
Click Deploy.
On the Workflow Details page, click Execute.
Click Execute again.
View the results of the workflow in the Output pane.
gcloud
Open a terminal and create a source code file for your workflow:
touch workflow-parallel-bqjobs.yaml
Copy your parallel workflow definition (see the sketch above) into the source code file:
Deploy the workflow by entering the following command:
gcloud workflows deploy workflow-parallel-bqjobs \
    --source=workflow-parallel-bqjobs.yaml \
    --service-account=MY_SERVICE_ACCOUNT@MY_PROJECT.iam.gserviceaccount.com
Replace MY_SERVICE_ACCOUNT@MY_PROJECT.iam.gserviceaccount.com with the email of the service account you previously created.
Execute the workflow:
gcloud workflows run workflow-parallel-bqjobs
The result will be similar to the previous output, but the workflow execution should take approximately twenty seconds or less!