Automation

Chapter 86 – Tips to Create AMP Pages for Web App using Python, HTML, CSS, JS

AMP pages carry less weight now that Google has sunset the AMP icon in SERPs and the automatic redirect of mobile search traffic to the AMP version of a target page. However, AMP is still one of the most important formats for increasing exposure and growing traffic from the Search and Discover channels, as you may have noticed in the Search Console panel.

So in this piece, I'll briefly walk through how to create AMP pages for your web app's mobile users, using Python as the example. By the end of this piece, you will understand the core components and be able to start your own projects using the modules covered in this chapter.
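For orientation, every valid AMP page starts from the same fixed HTML skeleton. The sketch below is a minimal example (the canonical URL and page content are placeholders); the long required amp-boilerplate style and noscript blocks are elided here and should be copied verbatim from amp.dev, since AMP validation checks them exactly:

```html
<!doctype html>
<html ⚡ lang="en">
  <head>
    <meta charset="utf-8">
    <!-- The AMP runtime script is mandatory -->
    <script async src="https://cdn.ampproject.org/v0.js"></script>
    <title>Product Page</title>
    <!-- Point the canonical link at the regular (non-AMP) version of the page -->
    <link rel="canonical" href="https://example.com/product.html">
    <meta name="viewport" content="width=device-width,minimum-scale=1,initial-scale=1">
    <!-- Required amp-boilerplate <style> and <noscript> blocks go here;
         copy them verbatim from amp.dev -->
  </head>
  <body>
    <h1>Hello AMP</h1>
  </body>
</html>
```

In the chapter, Python's job is to generate pages like this in bulk from your app's data.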

Chapter 87 – Interact with Google Big Query ML Pre-trained Model Dataset Using Python

Pre-trained machine learning models are growing in popularity as more and more LLMs launch on the market, and more users adopt them to boost work efficiency. As this trend continues, demand for customized machine learning models is likely to grow as well. This article therefore briefly walks through how an AI app can interact with a large dataset using Python, in order to provide pre-trained model functions to users.
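To give a flavor of the BigQuery ML side, model training and prediction both happen in SQL (the dataset, table, and column names below are hypothetical):

```sql
-- Train a model directly inside BigQuery
CREATE OR REPLACE MODEL `my_dataset.sales_model`
OPTIONS (model_type = 'linear_reg', input_label_cols = ['revenue']) AS
SELECT ad_spend, sessions, revenue
FROM `my_dataset.weekly_sales`;

-- Serve predictions from the trained model
SELECT *
FROM ML.PREDICT(MODEL `my_dataset.sales_model`,
                (SELECT ad_spend, sessions FROM `my_dataset.new_weeks`));
```

The Python side of the chapter then submits statements like these through the BigQuery client and consumes the prediction rows.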

Chapter 88 – Guide to Tokenizing AI Prompts to Save Cost & Generate Better Results

Unlike search engines, which provide information resources for free, Generative AI APIs are paid services, although pricing varies by provider. Some are inexpensive; others charge considerably more. How much you pay also depends on how well you optimize your prompts.

In this article, I'll walk through an approach of tokenizing context materials to save cost and deliver better results with any Generative AI API. There are two sections: first, a video walking through the whole process; second, an elaboration of the modules, the APIs, and the experimental comparison results between tokenized and non-tokenized context.
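The real tokenizer depends on the model (OpenAI models use a BPE tokenizer such as tiktoken, for example), but the cost logic can be sketched with a crude whitespace token count and a hypothetical per-token price:

```python
def estimate_tokens(text: str) -> int:
    """Crude token estimate: real APIs use a BPE tokenizer such as tiktoken,
    which usually yields more tokens than a whitespace split."""
    return len(text.split())

def estimate_cost(prompt: str, price_per_1k_tokens: float = 0.002) -> float:
    """Estimate the prompt cost in USD for a hypothetical per-1k-token price."""
    return estimate_tokens(prompt) / 1000 * price_per_1k_tokens

full_context = "long background document " * 500      # untrimmed context
trimmed_context = "long background document " * 50    # trimmed, tokenized context

# Trimming the context to the relevant chunks cuts the bill proportionally.
print(estimate_cost(full_context) > estimate_cost(trimmed_context))  # True
```

Since billing is per token on both input and output, the article's experiments compare exactly this: the same question asked with full versus tokenized context.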

Chapter 61 – Financial Statement Scraper for Common-Sizing Comparison Using Easy2Digital APIs

Financial statements give investors and marketers the raw data to calculate and compare the underlying value of companies. Notably, year-over-year revenue growth, capital expenses, and debt growth directly indicate whether a company is in good shape: market share may be shrinking, the cost of debt may be rising, or the company may be a promising investment because of its current aggressive spending on new product development.

In this article, I'm going to work through how to build a financial statement bot using the Easy2Digital API and Python. By the end of this piece, you will be able to use the Python script and Easy2Digital API to obtain specific listed companies' financial statement data, or leverage it to automate your dashboard updates.
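The common-sizing step itself is simple arithmetic: each income-statement line is divided by revenue so that companies of different scale become comparable. A minimal sketch with made-up figures:

```python
def common_size(statement: dict) -> dict:
    """Express each income-statement line as a percentage of revenue."""
    revenue = statement["revenue"]
    return {line: round(value / revenue * 100, 1) for line, value in statement.items()}

# Hypothetical figures for two companies of very different scale
company_a = {"revenue": 5_000_000, "cogs": 3_000_000, "net_income": 500_000}
company_b = {"revenue": 800_000, "cogs": 560_000, "net_income": 40_000}

print(common_size(company_a))  # {'revenue': 100.0, 'cogs': 60.0, 'net_income': 10.0}
print(common_size(company_b))  # {'revenue': 100.0, 'cogs': 70.0, 'net_income': 5.0}
```

Despite being six times larger, company A is the healthier business here: a lower cost ratio and a higher margin, which is exactly what common-sizing makes visible.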

Chapter 63 – Company Financial News Scraper for the Top-down Analysis Using Easy2Digital News API

Up-to-date industry and company news helps you keep pace with the stocks you've invested in. Nevertheless, checking companies one by one outside your investment dashboard is time-consuming. I daresay you'd be excited if your stock dashboard could automatically refresh company news every day using a financial news scraper, so that you could always make decisions alongside top-down industry and company news.

In this article, I'm going to introduce the Easy2Digital financial news API 1.0 with a sample API script. With this API, you can build a financial news scraper for your specific collection of company symbols and integrate it with your stock monitor.

Chapter 68 – Build a Keyword Extractor Using Easy2Digital APIs

Keyword extraction from website URLs helps you quickly learn about a new brand from scratch rather than reading through all of its information. There are many tools out there; nevertheless, subscriptions are expensive, and the free tools are not user-friendly, notably failing to provide APIs you can integrate with your business dashboard.

In this article, I'll use two Easy2Digital APIs – the brand footprint scraper and the Google SERP scraper – to build a keyword extractor.

Chapter 69 – Build A Flask User Login System Using PyMongo

In this article, I'll walk through how to use one of the MongoDB Python modules – PyMongo – to build a Flask user login system. The integration and development logic is similar to building one with SQLAlchemy; however, some configuration and setup details are unique to MongoDB.

If you'd like to use a non-SQL database to build your Flask user login system and manage your customer database, this piece will be your cup of tea.

Chapter 70 – Build a Discord Bot Using Python, Hikari, Lightbulb, MongoDB

Discord is a social media platform where large numbers of people interact in communities. Currently, it's the platform of choice for NFT projects, as they're community-oriented. Furthermore, the platform facilitates NFT trading and brand NFT marketing through its open communication features and integrations with crypto wallets, such as Collab.Land and MetaMask. Last but not least, it also supports NFT gating in private communities.

In this article, I'll walk through how to build a Discord bot that further improves communication and navigation efficiency, using Python, MongoDB, Hikari, and Lightbulb.

JSON vs YAML: Which Data Serialization Format Is Better?

No developer, programmer, or even marketer will find the JSON data type unfamiliar. It's one of the most popular and powerful data serialization languages. In fact, there is an alternative called YAML, which anyone familiar with the Google Ads API will recognize. In this Python knowledge hub, I'll elaborate on their respective pros and cons, and how you can better leverage them as a developer and marketer.
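To make the comparison concrete, here is the same record in both formats. Parsing the YAML side would require the third-party PyYAML package, so only the JSON side is exercised with the standard library:

```python
import json

record = {"campaign": "summer_sale", "budget": 120.5, "channels": ["search", "discover"]}

# JSON: strict syntax, universal support, ships in Python's stdlib
as_json = json.dumps(record, indent=2)
print(as_json)

# The same data as YAML: indentation-based and comment-friendly,
# but parsing it needs the third-party PyYAML package
as_yaml = """\
campaign: summer_sale   # comments are allowed, unlike JSON
budget: 120.5
channels:
  - search
  - discover
"""

# JSON round-trips losslessly through the stdlib
print(json.loads(as_json) == record)  # True
```

The trade-off in one line: JSON wins on tooling ubiquity and strictness; YAML wins on human readability and comments, which is why config-heavy APIs favor it.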

Chapter 71 – Build Online Shop Product Detail Pages or PDP Using Flask, Javascript, Bootstrap

Product detail pages serve several critical purposes in business operations. Customers learn about your product and its IWOM (internet word of mouth) from PDPs, which heavily influence whether customers eventually check out. Online marketplaces and social commerce platforms like Google Shopping and TikTok Shop open API integrations for merchants to list products and generate organic traffic by connecting a PDP data feed. Marketers work to optimize the product conversion rate to increase ROI, as the PDP is one of the most important parts of the conversion funnel.

In this piece, I'll go through how to build product detail pages in bulk using Flask, JavaScript, and Python from a Python developer's perspective. If you are interested in building PDPs with Flask, this piece will put you in the right shoes.
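In the chapter, pages are rendered with Flask's Jinja2 templates; the bulk-generation idea can be sketched dependency-free with the standard library's string.Template, filling one HTML skeleton from a list of product records (all product data below is hypothetical):

```python
from string import Template

# One HTML skeleton reused for every product (a stand-in for a Jinja2 template)
pdp_template = Template(
    "<article><h1>$name</h1><p>Price: $$${price}</p><p>$description</p></article>"
)

products = [
    {"name": "Espresso Maker", "price": "89.00", "description": "Compact 15-bar pump."},
    {"name": "Milk Frother", "price": "19.50", "description": "Rechargeable handheld."},
]

# Generate one PDP fragment per product record
pages = [pdp_template.substitute(p) for p in products]
print(pages[0])
```

A Flask route does the same thing per request: it looks up one product record and substitutes it into the template, so a single template serves the whole catalog.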

Chapter 72 – Build a Blog Content Generator Using OpenAI GPT3 and Easy2Digital API

ChatGPT has been in the spotlight recently, even though OpenAI's GPT-3 has been available since 2020. In a way, the timing may be related to external factors: the rising cost of capital and of debt is tightening businesses' operational costs now, and that situation may last for a while. It's good timing for AI – capital is always able to smell out project opportunities.

In this article, I'll walk through how to build a blog content generator using OpenAI GPT-3 and Easy2Digital APIs to automatically generate blog content in Google Sheets.

Chapter 74 – Flask App Dynamic Sitemap XML Using MongoDB

What a chore it is to build a sitemap and update it manually time after time. Paying a recurring monthly subscription for what is really a one-off second of work isn't a smart decision either.

If you are looking for a better way – the feeling of eating better-marbled beef – this piece is here for you. This article is about developing a dynamic sitemap XML for your Flask app. Let's go!
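The generation step can be sketched with the standard library alone. In the chapter, the URL list comes out of MongoDB; here it's a hard-coded stand-in:

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls: list[str]) -> str:
    """Build a sitemap.xml document from a list of page URLs
    (in the Flask app these would be read from MongoDB)."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    return ET.tostring(urlset, encoding="unicode")

xml_doc = build_sitemap(["https://example.com/", "https://example.com/blog"])
print(xml_doc)
```

A Flask route can then serve this string at /sitemap.xml with an XML mimetype, so the sitemap always reflects the current database contents.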

Google Sheets IMPORTXML – Automatically Scrape the Web and Collect Product Price Info

I am always on the lookout for a unique angle using freely available or potentially scrapable data sources. It's frustrating to spend hours upon hours learning Python, writing simple web scraper applications, and automating web scraping, only to discover at the end that the data isn't accessible, interesting, or differentiated enough from what's already out there.

If you just want to automate updating your eCommerce profit calculator, thankfully, there is an easier way to collect data from the web without spending that many hours: the Google Sheets IMPORTXML function.
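As a concrete illustration (the URL and XPath are hypothetical – the exact XPath depends on the target page's markup), a price-scraping cell formula looks like:

```
=IMPORTXML("https://example.com/product/123", "//span[@class='price']")
```

Google Sheets fetches the page, evaluates the XPath query, and fills the cell with the matching node's text, re-fetching periodically so the price stays current.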

Python Robotic Process Automation – Def Functions, Import Custom Modules to Create a Multi-functional Bot

Robotic process automation, or RPA, is not only a technology but also a vital mindset: leveraging bots that can understand what's on a screen, perform the right keystrokes, navigate systems, identify and extract data, and carry out a wide range of defined actions. In Python, the def function is one of the key building blocks for creating a multi-functional bot that completes an entire task workflow, and that is what this article is about.
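A minimal sketch of the idea: each def encapsulates one step, and a dispatcher strings them together into a multi-functional bot (the task names and steps below are illustrative placeholders, not real scraping or persistence):

```python
def scrape(source: str) -> str:
    """Pretend-extract raw data from a source (a real bot would fetch a page)."""
    return f"raw data from {source}"

def clean(raw: str) -> str:
    """Normalize the extracted data."""
    return raw.upper()

def save(data: str) -> str:
    """Pretend-persist the result (a real bot would write to a sheet or DB)."""
    return f"saved: {data}"

# Register each def as a named capability of the bot
TASKS = {"scrape": scrape, "clean": clean, "save": save}

def run_pipeline(source: str, steps: list[str]) -> str:
    """Run a whole RPA process by chaining the selected functions in order."""
    result = source
    for step in steps:
        result = TASKS[step](result)
    return result

print(run_pipeline("example.com", ["scrape", "clean", "save"]))
# saved: RAW DATA FROM EXAMPLE.COM
```

Moving each def into its own module and importing them is the "import custom modules" half of the chapter: the dispatcher stays the same while capabilities grow.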

Chapter 75 – CRUD Notion Page Content Using Notion API & Python

ChatGPT is in the spotlight recently because it turns our life and work style upside down. Likewise, I am inclined to use the OpenAI GPT-3 and GPT-3.5 APIs, as they can totally automate my life and offload repetitive work that wastes time but is critical. Using both of them is like eating a piece of buttery, richly marbled Wagyu beef.

In fact, we never have only one option. Between raw APIs and AI chatbots, Notion AI – which uses Anthropic's Claude generative AI model – can provide a semi-automated AI experience through its AI writing and API capabilities. Although it's not as powerful as GPT-3, it impresses users with a clean, lean, and straightforward experience.

In this piece, I'll walk you through how to retrieve Notion AI content from a private Notion page and update new content using the Notion API.
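To give a flavor of the update half: the Notion API appends content by POSTing a block payload to a page's children endpoint. The sketch below only assembles the request (no network call is made); the token and page ID are placeholders, and the block shape follows my understanding of the Notion API's paragraph block format:

```python
import json

NOTION_VERSION = "2022-06-28"   # value for the Notion-Version header

def paragraph_block(text: str) -> dict:
    """Build one paragraph block in Notion's rich_text format."""
    return {
        "object": "block",
        "type": "paragraph",
        "paragraph": {"rich_text": [{"type": "text", "text": {"content": text}}]},
    }

def append_request(page_id: str, token: str, texts: list[str]) -> dict:
    """Assemble the request for POST /v1/blocks/{page_id}/children."""
    return {
        "url": f"https://api.notion.com/v1/blocks/{page_id}/children",
        "headers": {
            "Authorization": f"Bearer {token}",
            "Notion-Version": NOTION_VERSION,
            "Content-Type": "application/json",
        },
        "body": json.dumps({"children": [paragraph_block(t) for t in texts]}),
    }

req = append_request("PAGE_ID", "NOTION_TOKEN", ["New line written via the API"])
print(req["url"])
```

Sending this with any HTTP client (the chapter uses Python's requests) appends the paragraph to the page; reading content is a GET against the same children endpoint.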

Chapter 79 – Leverage Flask Session Cookies Reducing Server side Resource Using Python

Optimizing server-side cost is a forever inevitable discussion, and increasing loading speed is indispensable as well. Both are incredibly important in Web3, web app, and AI projects. I assume you are a fan of Python and love making apps with its seasoning, so this piece should suit your stomach. I'll briefly walk through up-to-date scripts that display content using Python and Flask sessions.

Chapter 80 – LangChain Chat AI Model Applied in AI Web App & DApp for Beginners

Standalone AI models and apps like OpenAI and Google Vertex AI PaLM are just a starting point. From my perspective, applying diversified models to streamline a workflow and boost productivity in every specialized area is the key step to really leveraging their strengths.

In this piece, I'll briefly walk through LangChain chat AI models for beginners who are interested in building their own chain models in marketing, to boost productivity and create continuous hits in your game.

Chapter 81 – Convert Python Web App to Images & Push Them Live using Docker

To deploy and run web apps more resiliently, flexibly, and cost-efficiently, app containerization is one of the most popular approaches worldwide. In this article, I'll walk through how to leverage Docker to convert a Python Flask web app into images and push them to Docker Hub. By the end of this piece, you will be able to deploy live images on Docker Hub that can then be connected with the cloud.
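For orientation, a typical Dockerfile for a small Flask app looks like the sketch below; the file names, port, and entry-point module are assumptions about your project layout, not a prescription:

```dockerfile
# Lightweight Python base image
FROM python:3.11-slim

WORKDIR /app

# Install dependencies first so this layer is cached between builds
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the Flask application code
COPY . .

EXPOSE 5000
CMD ["python", "app.py"]
```

From there, `docker build -t youruser/yourapp .` produces the image and `docker push youruser/yourapp` publishes it to Docker Hub (substituting your own repository name).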

Chapter 2: Web Scraping with BeautifulSoup, Requests, Python

In the previous Python tutorial for digital marketers (part 1), we discussed what a digital marketer can gain from Python's superpowers, why he or she needs it, and how to install and set up the latest Python version on macOS. As you might be aware, one of the most essential Python benefits for digital marketers is scraping web data and updating it automatically.

So in this article, I'll talk about how to set up an environment for writing Python scripts that scrape a target website's data. This article doesn't go into detail on Python methods, code writing, or feeding the data into a spreadsheet or database; I'll release other articles and videos to walk through those. The purpose here is to give you the big picture of which components are necessary and how they work together.

By the end of this article, you will have mastered installing beautifulsoup4, requests, lxml, html5lib, and Sublime Text, and how to scrape web data with them.
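The chapter itself uses BeautifulSoup; to illustrate the parsing idea without any third-party installs, here is the same kind of extraction done against an inline HTML string with the standard library's html.parser (a real scraper would fetch the page with requests and parse it with BeautifulSoup instead):

```python
from html.parser import HTMLParser

class TitleScraper(HTMLParser):
    """Collect the text of every <h2 class="product"> element."""
    def __init__(self):
        super().__init__()
        self.capture = False
        self.titles = []

    def handle_starttag(self, tag, attrs):
        if tag == "h2" and ("class", "product") in attrs:
            self.capture = True

    def handle_endtag(self, tag):
        if tag == "h2":
            self.capture = False

    def handle_data(self, data):
        if self.capture:
            self.titles.append(data.strip())

page = '<div><h2 class="product">Espresso Maker</h2><h2 class="product">Milk Frother</h2></div>'
scraper = TitleScraper()
scraper.feed(page)
print(scraper.titles)  # ['Espresso Maker', 'Milk Frother']
```

BeautifulSoup wraps this event-driven machinery in a much friendlier API – the equivalent is a one-liner like `soup.find_all("h2", class_="product")` – which is why the chapter installs it.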
