Chapter 37 – Brand Info Bot for Scraping Brand Web Domains with Python and Easy2Digital APIs
An objective-oriented scraping project consists of many standalone Python bot scripts that connect and work together. One of the most useful pieces of data to scrape for potential leads is the brand web domain, since a brand's domain is usually where we first get to know it. The question is how to grab domains automatically and in bulk instead of running Google searches one by one. This article shows how to build such a bot with Python, Clearbit and SQLite3.
Chapter 46: Data Converters to convert CSV to SQL, SQL to CSV, Google Sheets to SQL
Data converters help you move data inventory between formats, such as SQL, CSV, JSON and XML, into whichever format you need. If you are looking for ways to monetize data, for example by selling contactable data like B2B prospects through an API or a SaaS, I believe this piece can help you manage your data inventory in SQL and CSV.
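As a minimal sketch of the CSV-to-SQL direction (and back), here is a round trip using only the standard library's csv and sqlite3 modules. The `prospects` table and the sample names are hypothetical stand-ins for a real data inventory:

```python
import csv
import io
import sqlite3

# Hypothetical CSV inventory; in practice this would be read from a file.
csv_data = """name,email
Alice,alice@example.com
Bob,bob@example.com
"""

conn = sqlite3.connect(":memory:")  # use a file path for a persistent DB
conn.execute("CREATE TABLE prospects (name TEXT, email TEXT)")

# CSV -> SQL: DictReader rows feed named placeholders directly
reader = csv.DictReader(io.StringIO(csv_data))
conn.executemany(
    "INSERT INTO prospects (name, email) VALUES (:name, :email)",
    reader,
)
conn.commit()

# SQL -> CSV: dump the table rows back out through csv.writer
out = io.StringIO()
writer = csv.writer(out)
writer.writerow(["name", "email"])
writer.writerows(conn.execute("SELECT name, email FROM prospects"))
print(out.getvalue())
```

Because `csv.DictReader` yields dictionaries, they can be passed straight to `executemany` with named placeholders, which keeps the converter one-pass and injection-safe.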
Chapter 48: JSON, XML Converters to CSV, SQL, Google Sheets Data into JSON, XML
In this Python tutorial, I'll walk you through how to create a script that can convert CSV, SQL or Google Sheets data into JSON or XML. The main modules used in this tutorial are json and csv.
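To give a flavour of the CSV-to-JSON and CSV-to-XML directions, here is a small standard-library sketch. The `sku`/`price` rows and the `catalog`/`item` element names are hypothetical examples, not the tutorial's exact schema:

```python
import csv
import io
import json
import xml.etree.ElementTree as ET

csv_data = "sku,price\nA100,9.99\nB200,19.5\n"  # hypothetical sample rows
rows = list(csv.DictReader(io.StringIO(csv_data)))

# CSV -> JSON: a list of row dicts serialises directly
json_text = json.dumps(rows, indent=2)

# CSV -> XML: wrap each row in an <item> element under a <catalog> root
root = ET.Element("catalog")
for row in rows:
    item = ET.SubElement(root, "item")
    for key, value in row.items():
        ET.SubElement(item, key).text = value
xml_text = ET.tostring(root, encoding="unicode")

print(json_text)
print(xml_text)
```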
Stock Daily Pricing Visualization Using yFinance, mplfinance, Pandas
In this article, I'll walk you through how to visualize a ticker's daily stock price and volume using yFinance, mplfinance and Pandas.
Pandas Pivot Table() – Transpose Data in Column Sequence to Horizontal One in Bulk Using Pandas & Python
Pandas pivot_table() is a powerful tool for manipulating data in tasks such as data visualization, data inventory management and API development. In dashboard development and data visualization, transposing data from a vertical column layout to a horizontal row layout is very common. So in this article, I'll go through how to transpose data in bulk, in seconds, using Pandas pivot_table() and Python.
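The column-to-row reshape can be sketched like this. The campaign/metric data below is a hypothetical long-format table, not the article's dataset:

```python
import pandas as pd

# Hypothetical long-format data: one row per (campaign, metric) pair
df = pd.DataFrame({
    "campaign": ["A", "A", "B", "B"],
    "metric":   ["clicks", "cost", "clicks", "cost"],
    "value":    [120, 45.0, 80, 30.0],
})

# pivot_table() turns the vertical metric column into horizontal
# columns, leaving one row per campaign
wide = df.pivot_table(index="campaign", columns="metric", values="value")
print(wide)
```

Note that pivot_table() aggregates duplicates (the default aggfunc is the mean), which is what makes it safe for bulk data where the same key can appear many times.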
Tips for Data Preprocessing Using Python and Scikit Learn
Data is the lifeblood of machine learning, but quantity alone is not enough. Proper, well-chosen data preprocessing is essential before you start developing machine learning models. So in this piece, I'll walk through three critical data preprocessing steps using Python and Scikit-learn. By the end of this piece, you can start working on your own machine learning and data analysis projects with practical tips and tricks.
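One preprocessing step worth previewing is feature standardization. As a minimal sketch using only the standard library (rather than Scikit-learn's StandardScaler, which does the same z-score idea but divides by the population standard deviation), with hypothetical age and income features:

```python
from statistics import mean, stdev

# Hypothetical raw feature values on very different scales
ages =    [23, 35, 41, 29]
incomes = [32000, 58000, 71000, 45000]

def standardize(values):
    """Z-score scaling: subtract the mean, divide by the standard deviation."""
    m, s = mean(values), stdev(values)
    return [(v - m) / s for v in values]

scaled_ages = standardize(ages)
scaled_incomes = standardize(incomes)

# Both features now share a comparable scale centred on zero,
# so neither dominates a distance-based or gradient-based model
print(scaled_ages)
print(scaled_incomes)
```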
Pandas Set_Index() and Transpose() – Convert a Column Value into a Row Using Pandas and Python
Manipulating data with Pandas is a fundamental skill used in many applications. This article shares how to convert a column into a row using set_index().T from Pandas. By the end of this piece, you will have skills that apply to data visualization, API development and parts of machine learning.
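The set_index().T idiom can be sketched in a few lines. The field/value pairs below are a hypothetical example of vertically stored data:

```python
import pandas as pd

# Hypothetical key/value pairs stored vertically, one field per row
df = pd.DataFrame({
    "field": ["name", "industry", "country"],
    "value": ["Acme", "Retail", "US"],
})

# set_index() promotes the 'field' column to the index, and .T
# transposes the frame so each field becomes its own column
row = df.set_index("field").T
print(row)
```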
Stock Portfolio Trend Visualization Using Python, matplotlib
Previously I shared how to visualise daily pricing in a candlestick chart format; here I'll walk through how to visualise a collection of stocks, a portfolio, as a time series.
Correlation Between Ticker Price and NASDAQ Price Prediction Using Python and Scikit Linearregression Model
An outcome is often influenced by a single variable, or by several variables at once, and machines make decisions based on the maths. So in this article, I'll walk through how to generate a price prediction score from the correlation between a stock ticker's price and the NASDAQ index price, using Python and the Scikit-learn LinearRegression model.
Build a Pricing Prediction Model Using Python, ScikitLearn, Linear Regression
In this piece, I'll briefly walk you through how to predict a price after considering multiple variables that might be correlated with the price change. By the end of this piece, you can apply this approach to your own business cases, using Python and Scikit-learn to generate a predicted price.
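To show the maths that sits under Scikit-learn's LinearRegression, here is a closed-form ordinary least squares fit in pure Python, using hypothetical index-level and ticker-price pairs (this is the idea behind the model, not the chapter's actual code):

```python
from statistics import mean

# Hypothetical paired observations: an index level (x) and a ticker price (y)
xs = [100, 102, 105, 107, 110]
ys = [50.0, 51.0, 52.8, 53.9, 55.5]

# Simple ordinary least squares, the model LinearRegression fits:
# slope = cov(x, y) / var(x), intercept from the means
x_bar, y_bar = mean(xs), mean(ys)
slope = (
    sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
    / sum((x - x_bar) ** 2 for x in xs)
)
intercept = y_bar - slope * x_bar

def predict(x):
    """Predicted price for a given index level."""
    return intercept + slope * x

print(slope, intercept, predict(108))
```

A handy sanity check is that an OLS line always passes through the point of means (x̄, ȳ), which is also how the intercept is derived above.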
Pandas Groupby() – Combine all Values Into One Set Shared with the Same Index Key Using Python
In this piece, I will introduce Pandas groupby() and go through how to combine values into one set sharing the same key, or column value. For example, if your Google advertising campaign name appears across different data sets, such as daily, weekly or monthly exports, this is a way to consolidate them into one set that is easy to fetch, use and apply in web application interactions.
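As a quick sketch of the consolidation, here is groupby() collapsing hypothetical campaign rows that were split across daily exports, both as summed totals and as collected lists:

```python
import pandas as pd

# Hypothetical campaign rows split across daily exports
df = pd.DataFrame({
    "campaign": ["Brand-US", "Brand-US", "Generic-UK"],
    "clicks":   [120, 95, 60],
    "cost":     [40.0, 31.5, 22.0],
})

# groupby() consolidates every row sharing the same campaign key
totals = df.groupby("campaign").sum()

# agg(list) instead keeps the individual values together in one set
grouped_clicks = df.groupby("campaign")["clicks"].agg(list)

print(totals)
print(grouped_clicks)
```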
JSON vs YAML, Which Data Serialization Is Better?
No developer, programmer or even marketer will find the JSON data type unfamiliar: it is one of the most popular data serialization languages. There is, in fact, an alternative called YAML, and anyone familiar with the Google Ads API will know it. In this Python knowledge hub, I'll explain their respective pros and cons, and how you can better leverage them as a developer and marketer.
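To put the two formats side by side, here is a hypothetical campaign config serialised as JSON with the standard library, next to its YAML equivalent (shown only as a string, since parsing YAML needs the third-party PyYAML package):

```python
import json

# A small, hypothetical campaign config serialised as JSON
config = {
    "campaign": "Brand-US",
    "daily_budget": 50.0,
    "keywords": ["shoes", "sneakers"],
}
json_text = json.dumps(config, indent=2)
print(json_text)

# The same data in YAML relies on indentation instead of braces,
# brackets and quotes, which is why many find it easier to hand-edit:
yaml_text = """\
campaign: Brand-US
daily_budget: 50.0
keywords:
  - shoes
  - sneakers
"""
```

The trade-off in brief: JSON is stricter and has a parser in every language's standard library, while YAML is more readable for humans but needs an extra dependency in Python.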
Chapter 76 – Generate the Object Feature Importance Using Scikit learn and Random Forest
The random forest algorithm has been applied across a number of industries to support better business decisions. Use cases include credit risk analysis and product recommendation for cross-selling.
In this piece, I'll briefly walk you through several methods of generating feature importance, using the classic red wine quality dataset. By the end of this chapter, you will have a basic grasp of applying random forest to your own projects and of comparing the results across methods.
Chapter 7: Manipulate Data in Google Sheets Using Easy2Digital APIs and Google Sheets Key
In the previous Python tutorial, we talked about how to scrape more than 50 videos from a YouTube search query and grab the performance of each video, such as views, comments and likes. But that's not the limit of automation power, say you aim to research and filter YouTubers and automate the collaboration invitation process. At the very least, the fetched list of YouTubers should be saved and managed in a spreadsheet on a cloud drive, rather than in a CSV file, so that it can be easily integrated with other platforms.
So in this Python tutorial, I will continue with the script from Chapter 6 and walk you through how to create a robot user account and leverage the Google Sheets API to save all fetched data to a Google Sheet from your web scraping Python script. By the end of this tutorial, you will know which modules to set up, and you can experience a spreadsheet that automatically lists all the videos in a preset format.