Build Your Very Own Google Search Console Data Pipeline using Google Search Console's Bulk Export to BigQuery.
It's the first tool I've seen built on top of Google's Bulk Export service that lets you customize how you classify your data: brand vs. non-brand, purchase funnel stages, question terms, low-hanging fruit, and URL segments.
Did we mention the Google Looker Studio report is super fast too?
Pipeline Setup Steps | Looker Studio Report Setup | FAQs | Cost Calculator | Using the Looker Studio Report
Before we dig in, please remember:
This is necessary to make sure you're logged into the correct Google account for the steps below.
This tool allows you to set up a number of configuration variables for each pipeline including:
This is necessary to build the BigQuery table that stores all the configuration values for you.
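To give a rough picture of what that configuration table holds, here is a hypothetical sketch only — the tool creates the real table for you, and its actual name and columns may differ from the ones below:

```sql
-- Hypothetical sketch of a pipeline configuration table. The tool builds
-- the real table for you; every identifier here is illustrative.
CREATE TABLE IF NOT EXISTS `your-project.your-dataset.pipeline_config` (
  config_key   STRING,  -- e.g. 'brand_terms', 'funnel_stage', 'url_segment'
  config_value STRING   -- e.g. a brand term, a question word, or a URL path pattern
);
```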
Use the Help tab to guide you in this process
Set up the values in the Config Sheet tab
Go to the Bulk Export Settings page inside Google Search Console
Select the property you used to set up the export from the dropdown in the top left of the window.
This will show you the Cloud Project ID and Dataset ID.
Code will appear after you've added both your project and dataset IDs above.
Go to Google BigQuery to set up the pipeline functions
This will add what are called BigQuery procedures to your Google Cloud Project.
This will also build out the views and tables necessary for the pipeline to run.
These blocks of code are what run your data pipeline every day.
Go to the Scheduled Queries tool.
Copy the code from below & paste it into the Scheduled Query editor.
A new pane will slide in from the left.
Change the schedule frequency from every 1 hour to every 8 hours.
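If you're curious what the scheduled query actually does, its body is essentially a call to the transform routine described in the steps below — a sketch, with placeholder project and dataset IDs you'd swap for your own:

```sql
-- Sketch of the scheduled query body: it invokes the procedure that
-- transforms each day's Bulk Export data. Replace the placeholders
-- with your own Cloud Project ID and Dataset ID.
CALL `your-project.your-dataset.transform_search_data`();
```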
Open the searchconsole dataset
Open routines
Click on transform_search_data routine
Click on the Invoke Stored Procedure button.
Click RUN
When you see the new SQL window pop up with the routine inside it:
CALL `your-project.your-dataset.transform_search_data`();
Please note that we'll modify the name to make it work with Looker Studio naming rules.
The link to your report will appear after you've named it.
When the Looker Studio report opens, click Edit and Share.
A new popup will appear.
No worries! We built a procedure inside your BigQuery dataset that we'll use to reclassify the data with your new format. Here's how to use it:
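Invoking it looks like calling any other routine. The procedure name below is purely illustrative — use the actual name shown under Routines in your searchconsole dataset:

```sql
-- Illustrative only: call the reclassification procedure from a SQL window.
-- The real procedure name appears under Routines in your dataset.
CALL `your-project.your-dataset.reclassify_search_data`();
```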
Get your daily growth size by dividing the searchdata_url_impression table's size by its number of partitions.
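As a minimal sketch of that arithmetic — the table size and partition count below are placeholder numbers; read the real figures from the table's Details pane in BigQuery:

```python
def daily_growth_gb(table_size_gb: float, partition_count: int) -> float:
    """Average data added per day = total table size / number of daily partitions."""
    return table_size_gb / partition_count

# Placeholder example: a 9 GB table with 180 daily partitions
# grows by about 0.05 GB (~50 MB) per day.
print(daily_growth_gb(9.0, 180))
```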
Check out the goldilocks version of this tool from Branch Tools, which costs $30/month.
It has more views of the data and a lower cost of ownership.
Currently Google charges around $0.02 / GB to store data in the US.
Data that's been stored for > 90 days costs $0.01 / GB to store in the US.
Currently Google charges $6.25 / TB (1,000 GB) to query data in the US.
Each day Google adds more data to the pipeline. We've seen medium-sized sites gain around 5 MB of data per day, and larger sites much more.
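To put those prices together, here's a back-of-the-envelope sketch — the growth rate, stored volumes, and query volume below are illustrative assumptions, not measurements from any real property:

```python
# Back-of-the-envelope monthly cost using the US prices quoted above:
# $0.02/GB for active storage, $0.01/GB for data stored > 90 days,
# and $6.25/TB scanned by queries.
ACTIVE_STORAGE_PER_GB = 0.02
LONGTERM_STORAGE_PER_GB = 0.01
QUERY_PER_TB = 6.25

def monthly_cost_usd(active_gb: float, longterm_gb: float, tb_queried: float) -> float:
    """Monthly cost = active storage + long-term storage + query scanning."""
    return (active_gb * ACTIVE_STORAGE_PER_GB
            + longterm_gb * LONGTERM_STORAGE_PER_GB
            + tb_queried * QUERY_PER_TB)

# Assumed example: a site adding 5 MB/day holds ~0.45 GB of active data
# (90 days * 0.005 GB/day), plus 1 GB of older data, querying 0.1 TB/month.
print(round(monthly_cost_usd(0.45, 1.0, 0.1), 3))
```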
There are 9 Pages in the tool:
A perfect page for high-level analysis of your website. Detailed Instructions
A page that allows you to see how much of your data is anonymous by day.
A perfect page to help you add FAQ content to your website. Detailed Instructions
A powerhouse page for exploring Topics across pages. Detailed Instructions
A perfect page for optimizing individual pieces of content. Detailed Instructions
A great tool for exploring big changes in performance. See what directories, pages, and queries are contributing to big changes. Detailed Instructions
Get deep insights into cannibalization on your site. Detailed Instructions
Explore performance by directory. Detailed Instructions
Track performance by URL segments that you configure according to what matters to you. Detailed Instructions
Analytics
Content
Data Science
General SEO
Local SEO
Reading List Resources
Technical SEO
Building friendships
Kindness
Giving
Elevating others
Creating Signal
Discussing ideas respectfully
Diminishing others
Gatekeeping
Taking without giving back
Spamming others
Arguing
Selling links and guest posts