
Python Tutorial for Digital Marketers 8: One Script to Scrape Competitor Shopify Web Product Data

Previously, I talked about how to store all fetched data in a Google Sheet on your Google Drive, so you are no longer limited to your local computer or laptop to open the data, and your partners can access it for projects anytime. When it comes to digital and eCommerce projects, Shopify must be one of the most popular topics among eCommerce sellers. If you don’t believe me, just check out the Shopify stock price and ask the people around you how many sellers are benefiting significantly from Shopify, thanks to its easy-to-use and super agile features. The fragmented eCommerce era has arrived, and creating an online store is now as easy as creating a blog. So in this Python Tutorial, I’ll walk you through how to create a Python script that scrapes any Shopify website’s product data just by changing the website URL. By the end of this piece, you will know how to find the Shopify storefront API, read the Shopify JSON, and write the Python code for specific product information, such as title, price, compare-at price,…
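For a quick taste of what that post covers, here is a minimal sketch, not the full tutorial script: most Shopify storefronts expose a public /products.json endpoint, and the store URL below is a hypothetical placeholder you would swap for the competitor’s domain.

```python
import requests

STORE_URL = "https://example-store.myshopify.com"  # hypothetical store domain

# /products.json is the public storefront feed; limit caps products per page
resp = requests.get(f"{STORE_URL}/products.json", params={"limit": 250}, timeout=30)
resp.raise_for_status()

for product in resp.json().get("products", []):
    for variant in product.get("variants", []):
        # each variant carries its own price and compare-at price
        print(product["title"], variant["price"], variant.get("compare_at_price"))
```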

Python Tutorial for Digital Marketers 7: Save Web Scraping Data in Google Sheet via API

In the previous Python Tutorial, we talked about how to scrape more than 50 videos from a YouTube search query keyword and also grab the performance of each video, such as views, comments, likes, etc. However, that’s not the end of automation’s power, say you aim to research and filter YouTubers and automate the collaboration invitation process. At the very least, the fetched list of YouTubers should be saved and managed in a spreadsheet on a cloud drive instead of in a CSV file, so it can be set up and easily integrated with other platforms. So in this Python Tutorial, I will continue to use the Python script from Python Tutorial Chapter 6 and walk you through how to create a robot user account and leverage the Google Sheets API to save all fetched data in a Google Sheet from your web scraping Python script. By the end of this Python Tutorial, you will know which modules you need to set up, and you can simply watch a spreadsheet automatically list all the videos in a preset format.
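As a rough sketch of the Google Sheets step with the gspread library, assuming you have already created a service account (the “robot user”), downloaded its JSON key, and shared the target sheet with that account’s email; the file name, sheet name, and rows below are placeholders.

```python
import gspread

# authenticate with the service account's JSON key (hypothetical file name)
gc = gspread.service_account(filename="service_account.json")

# open a sheet already shared with the robot user (hypothetical sheet name)
worksheet = gc.open("YouTube Research").sheet1

# append a header row and one dummy data row
worksheet.append_row(["Channel", "Video Title", "Views", "Likes", "Comments"])
worksheet.append_row(["Some Channel", "Some Video", 12345, 678, 90])
```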

Python Tutorial for Digital Marketers 6: Scrape View, Comment, Like Data of More than 50 Videos from the Youtube Search List

In the previous Python Tutorial, I walked you through how to create a YouTube API key and scrape videos by search query via Python programming. Basically, the beauty of YouTube video scraping is that you don’t need to pay any fees to research potential YouTubers for your brand or eCommerce store product reviews, which is super efficient. That said, to filter out the better YouTubers, it’s not sufficient to select candidates just from those fetched videos; we also need to look into the video data, such as views, comments, likes, dislikes, etc., which helps us build a better candidate list. Also, by default the YouTube Data API v3 returns at most the top 50 videos per request, which I would say might not be enough, because you might miss some second-tier YouTubers whose videos rank in lower positions but whose content is pretty good and engaging. As I haven’t talked about how to fetch that data yet, in this piece I’ll continue to use the previous YouTube video scraping Python script,…
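Here is a hedged sketch of the two ideas behind that post: paging past the 50-result limit with nextPageToken, then pulling per-video statistics. The API key and the search query are placeholders, not the post’s actual values.

```python
from googleapiclient.discovery import build

API_KEY = "YOUR_API_KEY"  # hypothetical key
youtube = build("youtube", "v3", developerKey=API_KEY)

video_ids = []
page_token = None
while len(video_ids) < 150:  # e.g. three pages of 50 results each
    params = {"q": "smart doorbell review", "part": "id",
              "type": "video", "maxResults": 50}
    if page_token:
        params["pageToken"] = page_token
    search = youtube.search().list(**params).execute()
    video_ids += [item["id"]["videoId"] for item in search["items"]]
    page_token = search.get("nextPageToken")
    if not page_token:
        break

# videos().list accepts up to 50 comma-separated IDs per call
for i in range(0, len(video_ids), 50):
    stats = youtube.videos().list(
        part="statistics", id=",".join(video_ids[i:i + 50])
    ).execute()
    for item in stats["items"]:
        s = item["statistics"]
        print(item["id"], s.get("viewCount"), s.get("likeCount"), s.get("commentCount"))
```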

Python Tutorial for Digital Marketers 5: Create Youtube API & Scrape Youtube Videos

Previously we took the Ring.com website as an example and walked you through how to specify the Ring product price data to scrape and save the bulk of the data on a local drive. Basically, that script can scrape most general websites built on a CMS, such as WordPress, Shopify, etc. If you want a copy of the Python script file, please contact us. That being said, not all websites are indexable and crawlable, because the data feeds of some websites are only accessible via API, such as YouTube, Facebook, Amazon, etc. Frankly, if you just want to scrape these sites’ data by URL, I suggest you leverage Google Sheets IMPORTXML instead of Python, because it’s much easier (time is money). On the other hand, if automation and bulk data scraping are part of your regular work, for example, if you are a social media marketer who needs to recruit influencers or KOLs, the following series of articles can help relieve your stress and repetitive manual work. We would start with YouTube video scraping…
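To give a feel for the kind of call this post builds up to, here is a minimal sketch of a YouTube keyword search with the Data API; the API key and query are hypothetical placeholders.

```python
from googleapiclient.discovery import build

# hypothetical API key created in the Google Cloud console
youtube = build("youtube", "v3", developerKey="YOUR_API_KEY")

response = youtube.search().list(
    q="home security camera review",  # example search query
    part="snippet", type="video", maxResults=25,
).execute()

for item in response["items"]:
    print(item["snippet"]["title"], "https://youtu.be/" + item["id"]["videoId"])
```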

Python Tutorial for Digital Marketers 4: How to Specify Web Data to Scrape

As mentioned in the last chapter, “How to Write, Parse, Read CSV Files with Scraped Data”, we would discuss how to specify the web data to scrape, because this is one of the key reasons a digital marketer learns Python in the first place. So in this Python Tutorial for Digital Marketers 4, I’ll walk you through the basic concepts and the BeautifulSoup and Requests methods you need to know to specify web data and scrape it. It helps if you understand how to read HTML, CSS, and JavaScript in this part, but it’s totally okay if you don’t yet, because the purpose is to find where the data is located and learn some methods to scrape specific data for digital marketing purposes. During the lesson, I’ll take Ring.com as an example to write code and scrape all the latest offers and pricing. By the end of the lesson, you will be able to identify where your expected data is located on a target page and scrape it all in minutes.
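A simplified sketch of that specify-and-scrape pattern is below; the URL is only an example and the CSS classes are hypothetical stand-ins, not Ring.com’s real markup.

```python
import requests
from bs4 import BeautifulSoup

# example offers page URL; replace with the page you inspected
html = requests.get("https://ring.com/collections/offers", timeout=30).text
soup = BeautifulSoup(html, "lxml")

# select the elements that hold the data you specified in the browser inspector
for card in soup.find_all("div", class_="product-card"):        # hypothetical class
    name = card.find("h3", class_="product-card__title")        # hypothetical class
    price = card.find("span", class_="product-card__price")     # hypothetical class
    if name and price:
        print(name.get_text(strip=True), price.get_text(strip=True))
```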

Python Tutorial for Digital Marketers 3: How to Write, Parse, Read CSV Files with Scraped Data

In the previous Python Tutorial for Digital Marketers 2, we talked about how to install beautifulsoup4, requests, lxml, html5lib, and Sublime Text, and then scrape web data with them. But the data is not yet saved in a file or a database, so it isn’t convenient to use for your business and daily operations. So in this Python Tutorial, we would talk about how to write Python scripts to parse the data and save it into local CSV files, and then read those CSV files back in a Python environment. By the end of this Python Tutorial, you will know which CSV read, parse, and write methods you can use to open and save CSV files in a readable format, although we are not going to deep dive into writing scripts for specific scraping methods, which we would cover in the next chapter of the Python Tutorial.
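Here is a bare-bones sketch of the write-and-read round trip with Python’s built-in csv module; the rows are dummy data standing in for scraped results, and the file name is arbitrary.

```python
import csv

rows = [
    {"product": "Video Doorbell", "price": "99.99"},
    {"product": "Spotlight Cam", "price": "199.99"},
]

# write the scraped rows to a local CSV file
with open("offers.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=["product", "price"])
    writer.writeheader()
    writer.writerows(rows)

# read the CSV file back into Python
with open("offers.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        print(row["product"], row["price"])
```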

Python Tutorial for Digital Marketers 2: Web Scraping with BeautifulSoup, Requests, Sublime Text

In the previous Python Tutorial for Digital Marketers 1, we discussed how a digital marketer can benefit from Python’s superpowers, why she or he needs it, and how to install and set up the latest Python version for macOS. As you might be aware, one of the most essential benefits Python offers digital marketers is scraping web data and updating that data automatically. So in this article, I’ll talk about how to set up an environment to write Python scripts for scraping a target website’s data. This article doesn’t go into detail on Python methods, code writing, or feeding the data into a spreadsheet or database; I’ll release other articles and videos to walk through those. The purpose of this article is to give you the big picture of which components are necessary and how they work together. By the end of this article, you will know how to install beautifulsoup4, requests, lxml, html5lib, and Sublime Text, and how to scrape web data with them.
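As a quick sanity check that the toolchain described in that post is installed (pip install beautifulsoup4 requests lxml html5lib), a snippet like the following should run without errors; example.com is just a safe placeholder page to fetch.

```python
import requests
from bs4 import BeautifulSoup

html = requests.get("https://example.com", timeout=30).text
soup = BeautifulSoup(html, "lxml")      # or "html5lib" for a more lenient parser

print(soup.title.get_text())            # page title
print(len(soup.find_all("a")), "links found")
```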

Python Tutorial for Digital Marketers 1: Install and Set up Python for Mac

Digital marketing has become sophisticated and data-oriented. Modern digital marketing strategies and ongoing performance optimization are highly influenced by first-hand external data and deeper in-house data analytics, no matter what marketing position you are in or what business model you’re running. There are too many application scenarios to illustrate every benefit of Python for digital marketers, but the main goal of a successful digital marketing strategy is to achieve a greater marketing return on investment (ROI), faster, which is impossible without technologies like Python to automate the process.

A Practical Guide to Set Up Shopify Google Analytics for Collecting Transaction Data Integrated with Your Profit Calculator

Every coin has two sides. Shopify provides store owners with ready-to-use templates and eCommerce-friendly URL structures that hugely reduce the effort of creating an eCommerce website and increase marketing and operational efficiency. However, because of this fixed framework and structure, Shopify doesn’t make it easy for owners to understand granular marketing performance, such as transaction data for each SKU across diversified dimensions, or to learn which web content engages visitors and better converts traffic into sales. All of this is critical for optimizing each SKU’s conversion rate and deciding whether an SKU should be continued based on its profit performance.

Web Scraping with Google Sheets ImportXML to Automatically Collect Product Price Info

I am always on the lookout for a unique angle that uses freely available or potentially scrapable data sources. It’s also genuinely frustrating to spend hours upon hours learning Python, writing simple web scraper applications, and automating the scraping, only to discover in the end that the data isn’t accessible, interesting, or differentiated enough from what’s already out there. If you just want to automate updating your eCommerce business’s profit calculator, thankfully there is an easier way to collect data from the web without spending that many hours: the Google Sheets IMPORTXML function.
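As a hedged illustration of the formula pattern the post covers, a single cell can pull a price straight from a product page; both the URL and the XPath below are hypothetical placeholders you would replace with your target page and the element you inspect there.

```
=IMPORTXML("https://www.example.com/products/some-product", "//span[@class='price']")
```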