Google recently released the ability to export your Search Console data to BigQuery. Totally awesome! Now we no longer have to use the API or third-party ETL tools.
In this post I’ll give you a script to create and prepare your Cloud Project for this.
But first
One caveat: you can only add 1 (one) Search Console property data export per Cloud project.
If you only have one property: no problem, just plug the data into your main Cloud project and you’ll be fine.
If you have multiple properties, a best practice is to create a separate Google Cloud project for every Search Console property.
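If you go that route, a quick loop in Cloud Shell can set up the projects in one go. A minimal sketch, assuming some made-up property slugs and a placeholder organisation ID (replace both with your own values):
# one Cloud project per Search Console property (the slugs below are just examples)
export ORG_ID=1234567890   # replace with your organisation ID
for PROPERTY in example-com shop-example-com blog-example-com ; do
  gcloud projects create gsc-export-$PROPERTY --organization=$ORG_ID
done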
Steps to prepare the cloud project via a script
The export needs:
- a project
- the BigQuery API enabled
- permissions granted to Google’s Search Console export service account
Simple enough if you follow the steps in the support article.
Even simpler if you use the Cloud Shell
Open the Cloud Shell via the button in the Cloud Console
In the shell that appears, paste the script below (line by line)
# REPLACE these with your own project name and organisation ID
export PROJECT_ID=ga4-nl-gsc
export ORG_ID=1057498872770

# create project
gcloud projects create $PROJECT_ID --organization=$ORG_ID

# set project
gcloud config set project $PROJECT_ID

# enable bq api
gcloud services enable bigquery.googleapis.com

# grant the Search Console export service account access to BigQuery
for ROLE in bigquery.jobUser bigquery.dataEditor ; do
  gcloud projects add-iam-policy-binding $PROJECT_ID \
    --member=serviceAccount:search-console-data-export@system.gserviceaccount.com \
    --role=roles/$ROLE
done
Steps for the script
- edit the first lines of the script
- replace PROJECT_ID with the ID of your new project (or an existing one)
- replace ORG_ID with the ID of your organisation
- if you already have a project, skip the “gcloud projects create” line
- paste in the rest of the script to enable the API and grant access to the service account
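If you want to double-check that the script did what it should (or inspect a project you already had), something along these lines works; the second command uses the standard flatten/filter pattern to list the roles bound to the export service account:
# is the BigQuery API enabled?
gcloud services list --enabled | grep bigquery

# which roles does the Search Console export service account have?
gcloud projects get-iam-policy $PROJECT_ID \
  --flatten="bindings[].members" \
  --filter="bindings.members:search-console-data-export@system.gserviceaccount.com" \
  --format="value(bindings.role)"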
Add a Billing account to the project
You will need a billing account associated with the project. You can do that via the interface, or if you’re feeling l33t:
# still in the same cloud shell..
# grab the ID of the first billing account (skip the ACCOUNT_ID header row)
export BILLING_ACCOUNT=$(gcloud alpha billing accounts list | grep -v ^ACCOUNT_ID | head -n 1 | cut -f1 -d' ')
if test -z "$BILLING_ACCOUNT" ; then
  echo "no billing account found"
else
  gcloud alpha billing projects link $PROJECT_ID --billing-account $BILLING_ACCOUNT
fi
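To confirm the link worked, you can describe the project’s billing info; the output should show billingEnabled: true (the same subcommand also exists on the beta track if alpha complains):
# check that the project is now linked to a billing account
gcloud alpha billing projects describe $PROJECT_ID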
Steps after the script
- go to https://search.google.com/search-console/settings/bulk-data-export
- choose your property
- Fill in the project ID and click Continue
Then.. wait up to 48 hours, and the data will start rolling in.
You can run gcloud alpha bq datasets list to check whether the dataset was created successfully.
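Once the dataset is there, you can poke at it with the bq CLI. A small sketch, assuming the export used the default dataset name searchconsole and the documented searchdata_site_impression table:
# list the exported tables
bq ls searchconsole

# daily clicks and impressions, most recent days first
bq query --use_legacy_sql=false '
  SELECT data_date, SUM(clicks) AS clicks, SUM(impressions) AS impressions
  FROM searchconsole.searchdata_site_impression
  GROUP BY data_date
  ORDER BY data_date DESC
  LIMIT 7'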
Don’t forget
When the data export setup is done, read the pages in the help center to know what to do next, how to troubleshoot errors, and how to set an expiry for the data (you know.. all this data storage is going to add up).
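On that last point: the exported tables are partitioned by date, so a partition expiration is one way to cap storage. A sketch, assuming the default dataset and table names and roughly 16 months of retention:
# keep roughly 16 months of partitions (the value is in seconds)
EXPIRATION_SECONDS=$((16 * 30 * 24 * 3600))
for TABLE in searchdata_site_impression searchdata_url_impression ; do
  bq update --time_partitioning_expiration $EXPIRATION_SECONDS searchconsole.$TABLE
done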
Thanks for reading!
Let me know via a comment if you want to know more, and share the article if you like it.