Chapter 19: Automate Refreshing SEO Keywords Performance in Google Sheets Using Google Search Console and Easy2Digital APIs

SEO keyword insight is valuable for any webmaster. Tracking the average position of your existing ranked keywords month by month shows you how your content marketing strategy is performing. What is more, new keywords popping up in Search Console point you toward fresh content angles and long-tail keywords to target. None of this is controversial; the real question is how to organize and automate the process that grabs these SEO insights. That is the value you can gain from this piece.

Google SEO keyword query data is currently only available in Google Search Console, unless you pay for a SaaS tool such as SEMrush or Moz. Google Analytics stopped showing organic keywords long ago and replaced them with "(not provided)", even though keyword insight is hugely important for content marketing and website optimization. The question, then, is how to automatically push these keyword queries into the Google Sheets dashboard you have created and customized.

In this chapter, I will walk you through how to leverage Python, the Google Search Console API, and Crontab to automatically pull the latest keyword queries and refresh the dashboard. By the end of this article, you will have the techniques to build your own version of the application. Then you can simply open the dashboard and enjoy the fresh data.

Table of Contents: Google Search Console API Integration Using Python

Required Modules in Python Script

We will use Google APIs that connect with Google Search Console and Google Drive, so oauth2 is necessary. Pandas is needed as well, because we use it to frame the fetched data. Last but not least, since the data is uploaded to Google Sheets, I recommend using gspread together with oauth2client to keep things simple.
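A minimal import sketch, assuming the packages are installed with pip (google-api-python-client, oauth2client, gspread, pandas):

import httplib2
import pandas as pd
import gspread
from googleapiclient.discovery import build
from oauth2client.service_account import ServiceAccountCredentials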

Google Search Console API

First things first, go to the Google developer console and enable both the Google Search Console API and the Google Drive API. Many people forget to activate the Google Drive API, so please double-check it in your Google API library.

Then, create a new credential for this project in the Google Cloud Platform, create a new key under that credential, and download the JSON key file. You will use this file in a moment.

Last but not least, we set the scopes in the code and load the JSON key file when building the service objects.
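A minimal sketch of that setup, assuming the downloaded key file is saved as client_secrets.json (a placeholder name) next to the script. The classic client library exposes Search Console as 'webmasters' version 'v3'; newer releases also publish it as 'searchconsole' v1.

SCOPES = [
    'https://www.googleapis.com/auth/webmasters.readonly',  # Search Console, read-only
    'https://www.googleapis.com/auth/spreadsheets',          # Google Sheets
    'https://www.googleapis.com/auth/drive',                 # Google Drive, needed by gspread
]

# Load the service account key and build the Search Console service
credentials = ServiceAccountCredentials.from_json_keyfile_name('client_secrets.json', SCOPES)
service = build('webmasters', 'v3', http=credentials.authorize(httplib2.Http()))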

We also need to add the gspread section of the code; a minimal authorization sketch follows the chapter link below. For more details, please check out the article I released before.

Chapter 17: Amazon Price Tracker, Get the Up-to-date Product Market Value Using ASIN, Oauth2Client, and Google Sheet
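For reference, a minimal gspread authorization sketch that reuses the credentials object built above; the spreadsheet name 'SEO Keyword Tracker' is a placeholder for your own sheet:

gc = gspread.authorize(credentials)
worksheet = gc.open('SEO Keyword Tracker').sheet1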

Available JSON Data Feed

Through the Google Search Console API, you can grab more data than the platform report lets you download. On top of that, the process can be fully automated, and you get deeper keyword insight.

In terms of the data available via the API, you can set startDate, endDate, and the dimensions (query, page, and device). Because Google restricts how many rows each request returns, you can also set rowLimit: for example, you can fetch 10 keywords or 2,000 keywords per request.
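As an illustration, a request body could look like this; the dates, dimensions, and rowLimit are example values to adjust to your own reporting window:

request = {
    'startDate': '2024-01-01',
    'endDate': '2024-01-31',
    'dimensions': ['query'],  # ['query', 'page'] or ['device'] also work
    'rowLimit': 2000          # the API caps a single request at 25,000 rows
}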

Search Console API Methods and Parameters

There are three types of API resources: search analytics, sitemaps, and sites. In this Python script, we will use search analytics.
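All three resources hang off the service object built earlier; a quick sketch for orientation (siteUrl is your verified property):

analytics = service.searchanalytics()   # performance queries, used in this script
sitemap_list = service.sitemaps().list(siteUrl='https://www.easy2digital.com/').execute()   # submitted sitemaps
site_list = service.sites().list().execute()   # verified properties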

Frankly, there are not many methods to learn. It is essentially searchanalytics() and query(). In the query, we fill in our website URL as the siteUrl parameter, and the data we aim to fetch goes into the body parameter.

response = service.searchanalytics().query(siteUrl='https://www.easy2digital.com/', body=request).execute()

After running the script (Command + B in an editor such as Sublime Text), the JSON-formatted data comes up in front of us. From the JSON response, you can see which metrics are available: clicks, impressions, ctr, position, and so on.

Create the Loop to Fetch and Save Data in Google Sheets

As with the YouTube video performance and Shopify product data covered previously, we need a loop to grab all the keyword rows, up to the amount we set in rowLimit above.

results = []  # one dict per keyword row

for row in response['rows']:
    seoData = {}

    # Map each requested dimension (e.g. 'query') to its value in row['keys']
    for i in range(len(request['dimensions'])):
        seoData[request['dimensions'][i]] = row['keys'][i]

    # Metric columns returned by the Search Analytics query
    seoData['clicks'] = row['clicks']
    seoData['impressions'] = row['impressions']
    seoData['ctr'] = round(row['ctr'], 2)
    seoData['position'] = round(row['position'], 2)
    results.append(seoData)

As with grabbing Amazon product data, we use pandas to frame the fetched results and upload them to Google Sheets with gspread. For more details, please check out Python Tutorial 17.
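A sketch of that step, assuming the results list built in the loop above and the worksheet opened in the gspread section earlier:

df = pd.DataFrame(results)

# Replace the previous run's rows with the fresh keyword data (header + values)
worksheet.clear()
worksheet.append_rows([df.columns.values.tolist()] + df.values.tolist())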

Automate the Update Using Crontab

In Python Tutorial 18, I talked about how to use crontab to automate refreshing the Amazon price tracker. Refreshing the SEO keyword queries and position performance works the same way; you only need to change the schedule and the script path.
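As an illustration, the crontab entry could look like the line below (edit your crontab with crontab -e); the Monday 09:00 schedule and the script path are placeholders to replace with your own:

0 9 * * 1 /usr/bin/python3 /path/to/gsc_keyword_tracker.py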

Then you can simply open your SEO performance tracker and check the up-to-date performance. I can cover how to create a Google Sheets dashboard if you are interested; please comment below and let me know.

For more details, please check out Python Tutorial 18:

Chapter 18: Utilize Macbook Crontab to Automate Running Amazon Competitor Price Tracker and Updating P&L Calculator and Product Market Value

Full Python Script of Google Search Console API Scraper

If you would like the full version of the Python script of the Google Search Console API scraper, please subscribe to our newsletter and add the message "Chapter 19". We will send the script to your mailbox right away.

Contact us

I hope you enjoy reading Chapter 19: Automate Refreshing SEO Google Sheets Dashboard with New and Existing Keywords Performance from Google Search Console. If you did, please support us by doing one of the things listed below, because it always helps out our channel.

  • Support and donate to our channel through PayPal (paypal.me/Easy2digital)
  • Subscribe to the Easy2Digital YouTube channel and turn on the notification bell.
  • Follow and like the Easy2Digital Facebook page.
  • Share the article on your social network with the hashtag #easy2digital
  • Buy products with Easy2Digital 10% OFF Discount code (Easy2DigitalNewBuyers2021)
  • Sign up for our weekly newsletter to receive Easy2Digital's latest articles, videos, and discount codes
  • Subscribe to our monthly membership through Patreon to enjoy exclusive benefits (www.patreon.com/louisludigital)

FAQ:

Q1: What is the Google Search Console API?

A: The Google Search Console API is a service that allows users to access and interact with data from their websites in the Google Search Console.

Q2: What can I do with the Google Search Console API?

A: With the Google Search Console API, you can perform various tasks such as retrieving search performance data, inspecting how specific URLs are indexed, and managing sitemaps.

Q3: How can I access the Google Search Console API?

A: To access the Google Search Console API, you need to create a project in the Google Developers Console, enable the Search Console API, and obtain an API key or OAuth 2.0 credentials.

Q4: What are the benefits of using the Google Search Console API?

A: Using the Google Search Console API allows you to automate processes, retrieve more detailed data, and integrate Search Console data with other tools and systems.

Q5: Can I use the Google Search Console API for multiple websites?

A: Yes, you can use the Google Search Console API for multiple websites by adding them as properties to your project in the Google Developers Console.

Q6: What is search performance data in the Google Search Console API?

A: Search performance data includes information about how your website performs in Google Search, such as the number of clicks, impressions, average position, and click-through rate for specific queries and pages.

Q7: How can I submit URLs for crawling using the Google Search Console API?

A: The Search Console API does not offer a direct "submit URL for crawling" call. The usual approaches are to submit or resubmit a sitemap through the Sitemaps endpoint, or to request indexing of an individual URL in the Search Console UI; the separate Indexing API covers a limited set of content types.

Q8: Can I get information about indexed pages with the Google Search Console API?

A: Yes, you can use the URL Inspection endpoint to get information about indexed pages. This includes details like the index verdict, last crawl date, and the Google-selected canonical URL.

Q9: What is a sitemap in the context of the Google Search Console API?

A: A sitemap is a file that lists the URLs of your website and provides additional information about them. Using the Google Search Console API, you can manage sitemaps by submitting, removing, or retrieving sitemap information.

Q10: Is the Google Search Console API free to use?

A: Yes, the Google Search Console API is free to use. However, there might be limitations on the number of requests you can make per day or per minute, depending on your API usage.

Google Product Deals Recommendation

Google Cloud: Building No-Code Apps with AppSheet: Automation

Price: USD49.00

Learn to recognize the need for business process automation in your organization. This course will teach you how to identify areas where automation can improve efficiency and productivity.

More product options from the Google collection. 

SAVE UP TO 40% and EXPLORE MORE!

Google API Endpoint Recommendation

Google Shopping Product Scraper API

Price: US$18

The Google Shopping SERP scraper crawls product information from the Google Shopping channel. The API lets you filter by country domain, user location, and language, and scrape product information using a keyword query. The scraped dataset includes the product name, pricing, shipping fees, brand name, product page URL, and more.

More API options from the Google collection. 

SAVE UP TO 50% and EXPLORE MORE!