Data Intelligence


Chapter 10 – Build a Shopify Bot to Scrape Store Product Data at Scale Using Easy2Digital APIs

In previous chapters, we discussed how to scrape HTML information from websites and Shopify product information via the JSON API. On most websites and platforms, however, articles, products, and other content span more than one page. This is called pagination, for example, page 1, previous page, or next page, and the scripts covered so far only scrape a single URL.

In this article, I'll walk you through how to scrape paginated content using Python, via either the website HTML or the JSON API, so that you can collect all of the target information. By the end of this article, you will have mastered the Pandas library and some new methods, and you will be able to customize the script for your own business needs.
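As a taste of what the chapter covers, here is a minimal pagination sketch against Shopify's public products.json endpoint, which accepts limit and page query parameters on most stores; the store URL below is a placeholder.

```python
import requests

def scrape_all_products(store_url):
    """Page through a Shopify store's public products.json endpoint."""
    products, page = [], 1
    while True:
        resp = requests.get(
            f"{store_url}/products.json",
            params={"limit": 250, "page": page},
            timeout=10,
        )
        resp.raise_for_status()
        batch = resp.json().get("products", [])
        if not batch:  # an empty page means we have reached the end
            break
        products.extend(batch)
        page += 1
    return products

# Hypothetical store URL for illustration
all_products = scrape_all_products("https://example-store.myshopify.com")
print(len(all_products))
```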

Chapter 9: Big Picture Matters, Get Data from Google Trends Using Easy2Digital API

In the previous Python Tutorial for digital marketers, I talked about leveraging Shopify APIs to scrape competitors' product feeds and monitor trending products and pricing, so that you can adjust tactics and keep your business cutting-edge in the marketplaces where you both sell.

The product feed alone, however, is not sufficient to understand market demand, because you might be anchoring on sellers who offer low-demand products, or whose product trend has been declining. Then, unfortunately, you test along with these sellers and lose money in the end, because you followed them without analyzing the bigger picture.

The big picture is critical on the business battleground, and end-consumer search trends are a key signal of how demand is moving out there and which topics people are looking for.

In this Python Tutorial, I'll walk through how to pull search data from the Google Trends API via Pytrends, so that you can integrate the data with your in-house database and identify opportunities. By the end of this Python Tutorial, you will know how to install Pytrends and the necessary modules, which API methods and parameters you can leverage to scrape the available data, and how to customize the datasheet for your actual needs.
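For a quick preview, here is a minimal Pytrends sketch; the keyword, geo, and timeframe are illustrative values you would swap for your own.

```python
from pytrends.request import TrendReq

# Connect to Google Trends (hl = host language, tz = timezone offset in minutes)
pytrends = TrendReq(hl="en-US", tz=360)

# Build a payload for up to five keywords over the last 12 months
pytrends.build_payload(["wireless earbuds"], timeframe="today 12-m", geo="US")

# interest_over_time() returns a pandas DataFrame indexed by date
df = pytrends.interest_over_time()
print(df.head())
```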

Chapter 7: Manipulate Data in Google Sheets Using Easy2Digital APIs and Google Sheets Key

In the previous Python Tutorial, we talked about how to scrape more than 50 videos from a YouTube search query, along with the performance of each video, such as views, comments, and likes. However, that's not the limit of automation: suppose you aim to research and filter YouTubers and automate the collaboration invitation process. At a minimum, the fetched list of YouTubers should be saved and managed in a datasheet on a cloud drive rather than in a local CSV file, so it can be set up and easily integrated with other platforms.

So in this Python Tutorial, I will continue with the script from Chapter 6 and walk you through how to create a robot user account and leverage the Google Sheets API to save all fetched data to a Google Sheet from your web scraping Python script. By the end of this Python Tutorial, you will know which modules to set up, and you'll experience watching a spreadsheet automatically list all videos in a preset format.
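As a flavor of the workflow, here is a minimal sketch using the gspread library with a service-account (robot) credential; the credential filename, spreadsheet name, and rows are placeholders.

```python
import gspread

# Authenticate with the service-account ("robot") JSON key downloaded
# from Google Cloud; the sheet must be shared with that account's email.
gc = gspread.service_account(filename="service_account.json")

sheet = gc.open("YouTube Research").sheet1  # hypothetical spreadsheet name

# Write a header row plus one row per fetched video
sheet.append_rows([
    ["Title", "Views", "Likes", "Comments"],
    ["Sample video", 120000, 4300, 210],
])
```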

Chapter 8: Build a Shopify Scraper to Fetch Competitor Webshop Product Data Using Easy2Digital APIs

Previously, I talked about how to store all fetched data in a Google Sheet on your Google Drive, so you are no longer limited to your local computer or laptop to open the data, and your partners can access it for projects at any time.

When it comes to digital and eCommerce projects, Shopify must be one of the most popular topics among eCommerce sellers. If you don't believe me, just check the Shopify stock price and ask around how many sellers are significantly benefiting from Shopify, thanks to its easy-to-use and super agile features. The fragmented eCommerce era has arrived: creating an online store is as easy as creating a blog.

So in this Python Tutorial, I'll walk you through how to create a Python script that scrapes any Shopify website's product data just by changing the website URL. By the end of this piece, you will know how to find the Shopify storefront API and read Shopify JSON, and how to write Python code to extract specific product information, such as title, price, and compare-at price, as long as it is included in the Shopify product JSON.
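For illustration, here is a minimal sketch of reading that JSON with requests; the store URL is a placeholder, and the fields shown (title, price, compare_at_price) follow Shopify's standard products.json structure.

```python
import requests

# Hypothetical Shopify storefront; swap in the store you want to analyze
resp = requests.get("https://example-store.myshopify.com/products.json",
                    params={"limit": 250}, timeout=10)
resp.raise_for_status()

for product in resp.json()["products"]:
    variant = product["variants"][0]  # first variant carries the base price
    print(product["title"], variant["price"], variant["compare_at_price"])
```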

Chapter 4: Create a Website Bot to Scrape Specific Website Data Using BeautifulSoup

As mentioned in the last chapter, "How to Write, Parse, Read CSV Files with Scraped Data", we would next discuss how to target specific web data to scrape, because this is one of the key reasons digital marketers like to learn Python.

So in this Python tutorial for digital marketers 4, I'll walk you through the basic concepts and the BeautifulSoup and Requests methods you need to target and scrape specific web data. It helps if you can read HTML, CSS, and JavaScript, but it's totally okay if you can't yet, because the goal here is to find where the data is located and learn some methods to scrape it for digital marketing purposes.

During the lesson, I'll take Ring.com as an example, writing code to scrape all the latest offers and pricing. By the end of the lesson, you will be able to identify where your target data is located on a page and scrape all of it in minutes.
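As a preview of the pattern, here is a minimal sketch with Requests and BeautifulSoup; the CSS class names are hypothetical placeholders, since Ring.com's real markup must be inspected in the browser first.

```python
import requests
from bs4 import BeautifulSoup

resp = requests.get("https://ring.com", timeout=10,
                    headers={"User-Agent": "Mozilla/5.0"})
soup = BeautifulSoup(resp.text, "lxml")

# The class names below are illustrative placeholders; inspect the live
# page with your browser's developer tools to find the real selectors.
for card in soup.find_all("div", class_="product-card"):
    name = card.find("h3")
    price = card.find("span", class_="price")
    if name and price:
        print(name.get_text(strip=True), price.get_text(strip=True))
```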

Chapter 3: Utilise CSV Module to Write, Parse, Read CSV Files to Manage Scraped Data

In the previous Python Tutorial for digital marketers 2, we talked about how to install beautifulsoup4, requests, lxml, html5lib, and Sublime Text, and how to scrape web data with them. But the data was not yet saved to a file or a database, which makes it inconvenient to use for your business purposes and daily operations.

So in this Python Tutorial, we'll talk about how to write Python scripts that parse and save the data into local CSV files, and how to read those CSV files back in a Python environment.

By the end of this Python Tutorial, you will know which CSV read, parse, and write methods you can use to open and save CSV files in a readable format, although we are not going to deep-dive into specific scraping scripts, which we'll cover in the next chapter.
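For a quick preview, here is a minimal sketch using Python's built-in csv module to write scraped rows to a local file and read them back; the rows are sample data.

```python
import csv

rows = [["title", "price"], ["Video Doorbell", "99.99"]]

# Write scraped rows to a local CSV file
with open("products.csv", "w", newline="", encoding="utf-8") as f:
    csv.writer(f).writerows(rows)

# Read (parse) the same file back inside Python
with open("products.csv", newline="", encoding="utf-8") as f:
    for row in csv.reader(f):
        print(row)
```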

Chapter 1 – Ways to Install and Set up Python Environment in MacBook OS

Digital marketing has become sophisticated and data-oriented. Modern digital marketing strategies and ongoing performance optimization are heavily influenced by first-hand external data and deeper in-house data analytics, no matter what marketing position you hold or what business model you run.

There are too many application scenarios to list all the benefits of Python for digital marketers, but the main goal of a successful digital marketing strategy is to achieve a greater marketing return on investment (ROI), faster, which is impossible without technologies like Python to automate the process.

Chapter 2: Web Scraping with BeautifulSoup, Requests, Python

In the previous Python tutorial for digital marketers 1, we discussed what a digital marketer can gain from Python's superpowers, why she or he needs them, and how to install and set up the latest Python version on macOS. As you might be aware, one of the most valuable Python capabilities for digital marketers is scraping web data and updating it automatically.

So in this article, I'll talk about how to set up an environment for writing Python scripts that scrape target website data. This article doesn't go into detail on Python methods, code writing, or feeding the data to a spreadsheet or database; I'll release other articles and videos to walk through those. The purpose here is to give you the big picture of which components are necessary and how they work together.

By the end of this article, you will have mastered the installation of beautifulsoup4, requests, lxml, html5lib, and Sublime Text, and how to scrape web data with them.
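As a quick sanity check of that setup, here is a minimal sketch that parses the same page with both installed parsers; example.com stands in for your target site.

```python
import requests
from bs4 import BeautifulSoup

html = requests.get("https://example.com", timeout=10).text

# beautifulsoup4 can delegate parsing to either lxml (fast) or
# html5lib (most browser-like); both were installed above.
for parser in ("lxml", "html5lib"):
    soup = BeautifulSoup(html, parser)
    print(parser, "->", soup.title.get_text(strip=True))
```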

Chapter 83 – Ultimate Guide to Google Cloud Storage CRUD Using Python

File storage, such as videos and images, probably occupies the largest share of most brand servers. Furthermore, how well cloud storage interacts with business applications is one of the most important criteria for judging whether a cloud storage service is excellent.

Google Cloud Storage pricing per GB is comparatively inexpensive, and its integration with applications is flexible and friendly. Thus, in this piece, I'll walk through Google Cloud Storage CRUD operations. By the end of this piece, you can refer to and apply these methods when connecting your applications to Google Cloud Storage.
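For a quick preview, here is a minimal CRUD sketch with the official google-cloud-storage client; the bucket and object names are placeholders, and authentication is assumed to come from a service-account key set via GOOGLE_APPLICATION_CREDENTIALS.

```python
from google.cloud import storage  # pip install google-cloud-storage

client = storage.Client()  # picks up GOOGLE_APPLICATION_CREDENTIALS by default
bucket = client.bucket("my-demo-bucket")  # hypothetical bucket name

blob = bucket.blob("images/banner.png")

blob.upload_from_filename("banner.png")   # Create
data = blob.download_as_bytes()           # Read
print(len(data), "bytes downloaded")

blob.metadata = {"campaign": "spring"}    # Update (object metadata)
blob.patch()

blob.delete()                             # Delete
```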

Python Web Application: Multi Calculators – NPV, P&L, CLV

Web applications are easy to integrate with one another; here, we combine different financial and marketing calculators. Users can select the NPV, CLV, or P&L calculator, or generate scraped data by selecting the eCommerce or social bots, all in one interface. Transformers don't exist only in animation: you can leverage Python and Flask to empower your web applications with more and more features while reducing the steps it takes to get what you want.
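As one example of the calculator logic behind such an app, here is a minimal NPV sketch; the discount rate and cash flows are hypothetical.

```python
def npv(rate, cashflows):
    """Net present value: each cash flow discounted back to today.
    cashflows[0] is the upfront investment (usually negative)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Hypothetical project: invest 1000 today, receive 400 a year for 3 years
print(round(npv(0.08, [-1000, 400, 400, 400]), 2))  # ~30.84
```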

Python Software & Application – How to Convert a Python Script Into Exe.

Python software and applications in the areas of finance, automated data collection, content ideas, and business development are very powerful. I believe anyone would get addicted to the magic of saving time and the momentum it gives you. From day one, I have shared a bunch of raw scripts you can run in the Sublime editor, but that's just the beginning. Believe it or not, you do want an app icon to click on your Mac, or a shareable application, and that is where the value of this article lies.
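For a rough idea of the packaging step, here is a minimal sketch that drives PyInstaller from Python; the script and icon filenames are placeholders, and PyInstaller is assumed to be installed via pip.

```python
# A minimal sketch, assuming PyInstaller is installed (pip install pyinstaller)
# and that "my_bot.py" is the script you want to package.
import PyInstaller.__main__

PyInstaller.__main__.run([
    "my_bot.py",
    "--onefile",             # bundle everything into a single executable
    "--windowed",            # no terminal window (useful for GUI apps)
    "--icon=app_icon.icns",  # hypothetical icon file for the Mac app
])
```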

Chapter 81 – Convert Python Web App to Images & Push Them Live using Docker

To deploy and run web apps more resiliently, flexibly, and cost-efficiently, app containerization is one of the most popular approaches worldwide. In this article, I'll walk through how to leverage Docker to convert a Python Flask web app into images and push them to Docker Hub. By the end of this piece, you will be able to deploy live images on Docker Hub that can then be connected to the cloud.

Chapter 80 – LangChain Chat AI Model Applied in AI Web App & DApp for Beginners

Standalone AI models and apps, like OpenAI and Google Vertex AI PaLM, are just a starting point. From my perspective, applying diverse models to streamline a workflow and boost productivity in each specialized area is the key step to really leveraging their beauty and strengths.

In this piece, I'll walk briefly through LangChain chat AI models for beginners interested in building their own chain models in marketing, for the purpose of boosting productivity and creating continuous hits in your game.
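For a first taste, here is a minimal chat-chain sketch using the LangChain expression syntax; it assumes the langchain-openai package is installed and an OPENAI_API_KEY environment variable is set, and the prompt text is illustrative.

```python
from langchain_openai import ChatOpenAI           # pip install langchain-openai
from langchain_core.prompts import ChatPromptTemplate

llm = ChatOpenAI(model="gpt-3.5-turbo", temperature=0.7)

prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a marketing copywriter."),
    ("human", "Write three ad headlines for {product}."),
])

# Pipe the prompt into the model to form a reusable chain
chain = prompt | llm
print(chain.invoke({"product": "wireless earbuds"}).content)
```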

Chapter 79 – Leverage Flask Session Cookies Reducing Server side Resource Using Python

Optimizing server-side cost is a perennial discussion, and increasing loading speed is just as indispensable. Both are incredibly important in any Web3, web app, or AI project. I assume you are a fan of Python and love making apps with its seasoning, so this piece should suit your taste. I'll walk briefly through up-to-date scripts that display content using Python and Flask sessions.
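For a quick preview, here is a minimal Flask sketch that keeps a per-visitor counter in a signed session cookie rather than in server memory; the secret key is a placeholder you would replace with a random value.

```python
from flask import Flask, session

app = Flask(__name__)
app.secret_key = "replace-with-a-random-secret"  # signs the session cookie

@app.route("/visit")
def visit():
    # The counter lives in a signed cookie on the client, so the server
    # keeps no per-user state in memory or on disk.
    session["visits"] = session.get("visits", 0) + 1
    return f"You have visited {session['visits']} times."

if __name__ == "__main__":
    app.run(debug=True)
```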

Chapter 68 – Build a Keyword Extractor Using Easy2Digital APIs

Keyword extraction from website URLs helps you quickly learn about a new brand from scratch rather than reading through all of its content. There are many tools out there; nevertheless, subscriptions are expensive, and the free tools are not user-friendly, notably in not providing APIs you can integrate with your business dashboard.

In this article, I'll try using the Easy2Digital APIs, namely the brand footprint scraper and the Google SERP scraper, to build a keyword extractor.
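The Easy2Digital endpoints themselves are covered in the article; as a generic stand-in for the idea, here is a minimal keyword-frequency sketch built with Requests and BeautifulSoup (not the Easy2Digital API), with an illustrative stop-word list.

```python
import re
from collections import Counter

import requests
from bs4 import BeautifulSoup

# Tiny illustrative stop-word list; a real extractor would use a fuller one
STOPWORDS = {"the", "a", "an", "and", "or", "to", "of", "for", "in", "on", "with"}

def extract_keywords(url, top_n=10):
    """Return the most frequent non-stop-words on a page."""
    html = requests.get(url, timeout=10).text
    text = BeautifulSoup(html, "lxml").get_text(" ", strip=True).lower()
    words = [w for w in re.findall(r"[a-z]{3,}", text) if w not in STOPWORDS]
    return Counter(words).most_common(top_n)

print(extract_keywords("https://example.com"))
```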

Chapter 61 – Financial Statement Scraper for Common-Sizing Comparison Using Easy2Digital APIs

Financial statements give investors and marketers the raw data to calculate and compare the underlying value of companies. Notably, revenue growth, capital expenses, and year-over-year debt growth directly imply whether a company is in good shape: market share may be shrinking, the cost of debt might rise, or the company may be a promising investment because of its aggressive spending on new product development.

In this article, I'm going to walk through how to build a financial statement bot using the Easy2Digital API and Python. By the end of this piece, you can use the Python script and the Easy2Digital API to obtain specific listed companies' financial statement data, or leverage it to automate your dashboard updates.
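The Easy2Digital API calls are covered in the article body; to illustrate the common-sizing step itself, here is a minimal pandas sketch with hypothetical figures.

```python
import pandas as pd

# Hypothetical income-statement lines, in millions, for two companies
statements = pd.DataFrame(
    {"Company A": [1200, 480, 300], "Company B": [800, 400, 120]},
    index=["Revenue", "COGS", "Operating income"],
)

# Common-sizing: express every line as a percentage of revenue, which
# makes companies of different sizes directly comparable.
common_size = statements.div(statements.loc["Revenue"]) * 100
print(common_size.round(1))
```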