In the previous Python Tutorial, we covered how to scrape more than 50 videos from a YouTube search query keyword and grab each video's performance metrics, such as views, comments, and likes. But that is not the limit of automation. Say you aim to research and filter YouTubers and automate the collaboration invitation process: at minimum, the fetched list of YouTubers should be saved and managed in a datasheet on a cloud drive instead of in a CSV file, so it can be set up once and easily integrated with other platforms.

So in this Python Tutorial, I will continue with the Python script from Chapter 6 and walk you through how to create a robot service account and leverage the Google Sheets API to save all fetched data to a Google Sheet from your web scraping Python script. By the end of this Python Tutorial, you will know which modules to set up, and you can simply watch a spreadsheet automatically list all the videos in a preset format.

Create a Robot Service Account in Google API Console

Like the YouTube API, the Google Sheets API is of course part of the Google API family. So first things first, we need to create a credential, and this time it is a service account.

Basically, this step serves to name the robot account, grant it an access level (normally editor level), and download a JSON key file that we will use later in the script. The service account role section can be ignored; it's unnecessary to fill in for this Python Tutorial.
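For reference, the downloaded key file is a JSON document. A trimmed sketch of its typical fields looks like the following; all values here are placeholders, not real credentials. The client_email is the address you will share the Google Sheet with in the next step.

```json
{
  "type": "service_account",
  "project_id": "your-project-id",
  "private_key_id": "…",
  "private_key": "-----BEGIN PRIVATE KEY-----\n…\n-----END PRIVATE KEY-----\n",
  "client_email": "robot-name@your-project-id.iam.gserviceaccount.com",
  "client_id": "…",
  "token_uri": "https://oauth2.googleapis.com/token"
}
```

Keep this file private; anyone holding it can act as the robot account.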

Add the Robot Account to a Google Sheet

After the account is created, we go to Google Drive and create a new Google Sheet. The free version of the drive is fine. Then, via the Share button, we add the robot account's email address and set it as an editor.

Install the Google Client Library on Your MacBook

Again, whatever API you are going to adopt in your Python script, remember to check the API documentation, which normally includes the required installation commands, script samples, and so on. It's the same here: we can go to the Google Sheets API documentation and check out the Python section.

The steps on Mac and Windows are almost the same, so I'll take the MacBook as an example. We go to the terminal and type in the command below. The reason I type pip3 is that my MacBook's default Python version is 2.7, and I didn't link the two together. Your setup may of course differ from mine.

pip3 install --upgrade google-api-python-client google-auth-httplib2 google-auth-oauthlib

After a while, the installation is done.

Then check the installed packages by typing pip3 list; the results show that they're in place.

Import the Google API module

As with beautifulsoup, requests, CSV writing, etc., we need to import the modules that include the functions we need for this script's features. The API documentation provides a sample for reference, but the sample doesn't fully fit our script's purpose. So, based on our purpose, I would keep the module that lets us call build() and create the Google Sheets API object, and remove the rest.

from __future__ import print_function
import pickle
import os.path
from googleapiclient.discovery import build
from google_auth_oauthlib.flow import InstalledAppFlow
from google.auth.transport.requests import Request

Use OAuth 2.0 for Server-to-Server Applications

The Google OAuth 2.0 system supports server-to-server interactions, such as those between a web application and a Google service. We just created a service account, which is an account that belongs to the application instead of to an individual end user. Your application calls Google APIs on behalf of the service account, so users aren't directly involved. This scenario is sometimes called "two-legged OAuth," or "2LO." It is, in fact, a robot account; we will dive into more detail in coming chapters, such as the WeChat chatbot.

Below are the required module import and variables, placed above the main lines of code.

from google.oauth2 import service_account

SCOPES = ['https://www.googleapis.com/auth/spreadsheets']

SERVICE_ACCOUNT_FILE = '/path/to/service.json'

credentials = service_account.Credentials.from_service_account_file(
        SERVICE_ACCOUNT_FILE, scopes=SCOPES)


But we need to modify a few places. First of all, we need to select the scope of the API call; we can refer to the Google Sheets API scopes list, and here we need the read-and-write scope.
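For reference, these are the two Sheets scope strings from Google's OAuth scopes list; the read-and-write case this tutorial describes uses the first one, and the read-only variant is shown for comparison.

```python
# Full read/write access to the account's spreadsheets (what this tutorial needs)
SCOPES = ["https://www.googleapis.com/auth/spreadsheets"]

# Read-only alternative, for scripts that never write back
READONLY_SCOPE = "https://www.googleapis.com/auth/spreadsheets.readonly"
```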

Then, we need to update the path to the service account JSON file we downloaded when creating the account in the Google API console. This tells Python to use the robot account's identity to access and edit the Google Sheet data.

Also, I would suggest renaming the variable credentials to creds, which makes it easier to distinguish this credential from those in any other API objects set up by default.

Last but not least, we need to add creds = None before the assignment, which ensures creds starts out without any value.

Call the Sheets API

In the API documentation's script sample, three lines of code call the Sheets API, and we can copy and paste them into our Python script. The only thing we need to update is the Google Sheet ID.

SAMPLE_SPREADSHEET_ID = 'xxxxxxxxxxxx………'
service = build('sheets', 'v4', credentials=creds)
sheet = service.spreadsheets()


As you might remember, we created a Google Sheet and authorized the robot account at the editor level. So we go to that spreadsheet, copy the sheet ID from the page URL, and paste it in as the SAMPLE_SPREADSHEET_ID value.
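The spreadsheet ID is the long token between /d/ and the next slash in the sheet's URL. If you'd rather not copy it by hand, a small helper can pull it out; the URL below is a made-up example, not a real sheet.

```python
def extract_spreadsheet_id(sheet_url: str) -> str:
    """Return the ID between '/d/' and the following '/' in a Sheets URL."""
    marker = "/d/"
    start = sheet_url.index(marker) + len(marker)
    end = sheet_url.find("/", start)
    return sheet_url[start:] if end == -1 else sheet_url[start:end]

# Hypothetical URL for illustration only
url = "https://docs.google.com/spreadsheets/d/1AbCdEfGhIjKlMnOp/edit#gid=0"
SAMPLE_SPREADSHEET_ID = extract_spreadsheet_id(url)
```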


Append the Video Data Values to the Google Sheet

So now things are almost ready. What we need to do is feed the scraped video data into the Google Sheet.

First of all, as you might remember from the previous chapter on CSV file writing, we created a line of code after the data-fetching section to format the data into columns. Similar to the CSV writer method, we also need to create a variable here. The difference is that there is no object and method to call, so we create it ourselves, say a variable named video50_scrape.

video50_scrape = [[channel_title, video_title, yt_link, vid_view, vid_like, vid_dislike, vid_favorite, vid_comment]]
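If your scraper collects the videos into a list of dicts first, the two-dimensional values list that the Sheets API expects (one inner list per row) can be built in one comprehension. The dict keys and sample data below are made up for illustration; they simply mirror the variable names above.

```python
# Each scraped video as a dict; keys mirror the variables above (illustrative data)
videos = [
    {"channel_title": "Channel A", "video_title": "Video 1",
     "yt_link": "https://youtu.be/aaa", "vid_view": "1000",
     "vid_like": "50", "vid_dislike": "2", "vid_favorite": "0",
     "vid_comment": "10"},
]

FIELDS = ["channel_title", "video_title", "yt_link", "vid_view",
          "vid_like", "vid_dislike", "vid_favorite", "vid_comment"]

# One inner list per spreadsheet row, columns in FIELDS order
video50_scrape = [[v[f] for f in FIELDS] for v in videos]
```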

Then, we go to the Google Sheets API reference and check which parameters and values we need to pass in to append the data to the Google Sheet.


As we can see in spreadsheets.values.append, we must pass in:

  • spreadsheetId and range
  • valueInputOption, because the data is entered by our script rather than typed in by a user
  • insertDataOption, set to insert rows so the new video data doesn't overwrite the previous data
  • Last but not least, body, because we have formatted the data by column, and this parameter makes sure the data shows up in the sheet as we expect

So now we create two variables, updatesheet_request and updatesheet_response: one builds the append request, the other executes it. Splitting them this way makes the logic easier for other people to follow.

updatesheet_request = sheet.values().append(spreadsheetId=SAMPLE_SPREADSHEET_ID, range="Youtubers_Ring_Spotlight_Camera_Query!A2", valueInputOption="USER_ENTERED", insertDataOption="INSERT_ROWS", body={"values": video50_scrape})

updatesheet_response = updatesheet_request.execute()
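The executed append request returns a small JSON summary. The sample dict below is only illustrative of the shape such a response can take (real values depend on your sheet and data), but it shows how you could confirm how many rows landed:

```python
# Illustrative shape of a spreadsheets.values.append response;
# the values here are made up, not from a real API call
sample_response = {
    "spreadsheetId": "1AbCdEfGhIjKlMnOp",
    "updates": {
        "updatedRange": "Youtubers_Ring_Spotlight_Camera_Query!A2:H51",
        "updatedRows": 50,
        "updatedColumns": 8,
        "updatedCells": 400,
    },
}

rows_added = sample_response["updates"]["updatedRows"]
print(f"Appended {rows_added} rows")
```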

Google Sheet API Usage Limit

Now we can run the script (for example, Command + B if you build in Sublime Text), and you should be happy to see all the fetched video data automatically appear in the Google Sheet. From then on, you just need to change the q value and the video data will keep updating. If necessary, you can add one more column via Python showing which keyword query you used. I'll leave that task to you; I'm sure you can already do it.


Lastly, the Google Sheets API has usage limits, with a limit of 500 requests per 100 seconds, so you might need to control the frequency of execution, or you can refer to my upcoming articles on how to handle API limits. I'll discuss this part more in the work automation collection.
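One minimal way to stay under a requests-per-window quota is to pace calls with a fixed minimum interval: 100 seconds / 500 requests = 0.2 seconds per request. The sketch below is a generic pacing helper, not a Google-specific API; you would call limiter.wait() before each execute().

```python
import time

class RateLimiter:
    """Pace calls so they are at least `min_interval` seconds apart."""

    def __init__(self, min_interval: float):
        self.min_interval = min_interval
        self._last_call = 0.0

    def wait(self) -> None:
        # Sleep just long enough to keep calls `min_interval` apart
        now = time.monotonic()
        delta = now - self._last_call
        if delta < self.min_interval:
            time.sleep(self.min_interval - delta)
        self._last_call = time.monotonic()

# 500 requests per 100 seconds -> at most one call every 0.2 s
limiter = RateLimiter(min_interval=0.2)
# Before each API call: limiter.wait(); then updatesheet_request.execute()
```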


So easy, right? I hope you enjoyed reading Python Tutorial for Digital Marketers 7: Save Web Scraping Data in Google Sheet via API. If you did, please support us by doing one of the things listed below, because it always helps out our channel.

  • Support my channel through PayPal
  • Subscribe to my channel and turn on the notification bell: Easy2Digital YouTube channel
  • Follow and like my page: Easy2Digital Facebook page
  • Share the article to your social network with the hashtag #easy2digital
  • Buy products with the Easy2Digital 10% OFF discount code (Easy2DigitalNewBuyers2021)
  • Sign up for our weekly newsletter to receive Easy2Digital's latest articles, videos, and discount codes on Buyfromlo products and digital software
  • Subscribe to our monthly membership through Patreon to enjoy exclusive benefits

Youtuber Research and Recruitment Service

If you are looking for a freelancer to help you with YouTube influencer research and recruitment, I can help with this, and there are 3 service and quotation (US$) plans below that you can select from. For details, please visit here and contact me.

By Louis Lu

Growth Hacker & Digital Marketer, with a proven record of over 11 years of experience in 20+ Asian markets, and 25,000+ connections on LinkedIn
