
Data Science

Pandas pivot_table() – Transpose Data from a Column Sequence to a Horizontal One in Bulk Using Pandas & Python

Pandas pivot_table() is a super powerful tool for developers to manipulate data for tasks such as data visualization, data inventory, and API development. In dashboard development and data visualization, transposing specific data from a column order into a row sequence is very common. So in this article, I'll go through how to transpose specific data in bulk in seconds using Pandas pivot_table() and Python.
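Below is a minimal sketch of that reshaping step; the column names ("campaign", "date", "clicks") are illustrative assumptions, not taken from the article itself.

import pandas as pd

# Column-wise (long) data: one row per campaign per date
df = pd.DataFrame({
    "campaign": ["A", "A", "B", "B"],
    "date": ["2023-01-01", "2023-01-02", "2023-01-01", "2023-01-02"],
    "clicks": [120, 90, 75, 110],
})

# Pivot so each campaign becomes one row and each date becomes its own column
wide = pd.pivot_table(df, index="campaign", columns="date", values="clicks", aggfunc="sum")
print(wide)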

Tips for Data Preprocessing Using Python and Scikit Learn

Data is the lifeblood of machine learning, but quality matters more than quantity. Proper and optimal data preprocessing is essential before you start developing machine learning models. So in this piece, I will walk through three critical data preprocessing steps using Python and Scikit-Learn. By the end of this piece, you can start working on your own machine learning and data analysis projects with practical tips and tricks.
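As a quick taste of what such preprocessing can look like, here is a small sketch using scikit-learn; the imputation-then-scaling pipeline and the toy matrix are assumptions for illustration, not necessarily the exact steps the article covers.

import numpy as np
from sklearn.impute import SimpleImputer
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Toy feature matrix with one missing value
X = np.array([[1.0, 200.0], [2.0, np.nan], [3.0, 400.0]])

# Fill missing values with the column mean, then standardize each feature
preprocess = make_pipeline(SimpleImputer(strategy="mean"), StandardScaler())
X_clean = preprocess.fit_transform(X)
print(X_clean)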

Correlation Between Ticker Price and NASDAQ Price Prediction Using Python and the Scikit-Learn LinearRegression Model

An outcome is typically determined by how one variable relates to another variable, or to multiple variables, and machines make decisions based on that math. So in this article, I will walk through how to generate a price prediction score from the correlation between a stock ticker and the NASDAQ index, using Python and the Scikit-Learn LinearRegression model.
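A minimal sketch of the idea, assuming hypothetical closing prices; in practice the series would come from a market data source rather than being hard-coded.

import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical daily closing values for the index and the ticker
nasdaq = np.array([13000, 13100, 12950, 13200, 13350]).reshape(-1, 1)
ticker = np.array([150.2, 151.0, 149.5, 152.3, 153.1])

model = LinearRegression().fit(nasdaq, ticker)

# R^2 indicates how much of the ticker's movement the index explains
print("R^2:", model.score(nasdaq, ticker))
print("Predicted ticker price at NASDAQ 13500:", model.predict([[13500]])[0])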

Pandas groupby() – Combine All Values Into One Set Sharing the Same Index Key Using Python

In this piece, I will introduce Pandas groupby() and go through how to combine values into one set that shares a common key, or column value. For example, if your Google advertising campaign name appears across different data sets, such as daily, weekly, or monthly data, here is a way to consolidate them into one set so they are easier to fetch, use, and apply in web application interactions.
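Here is a small sketch of that consolidation; the "campaign" and "clicks" columns are assumed for illustration.

import pandas as pd

# Hypothetical campaign data collected at different granularities
df = pd.DataFrame({
    "campaign": ["Brand", "Brand", "Generic", "Generic", "Brand"],
    "clicks": [120, 340, 90, 150, 60],
})

# Collect every value that shares the same campaign key into one set
combined = df.groupby("campaign")["clicks"].apply(set)
print(combined)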

JSON vs YAML: Which Data Serialization Format Is Better?

Hardly any developer, programmer, or even marketer is a stranger to JSON. It is one of the most popular data serialization languages. In fact, there is an alternative called YAML, and anyone familiar with the Google Ads API will know this data type as well. In this Python knowledge hub, I will elaborate on their respective pros and cons and how you can better leverage them as a developer and marketer.
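To make the comparison concrete, here is a small sketch that serializes the same record both ways; it assumes the third-party PyYAML package and an illustrative advertising record.

import json
import yaml  # PyYAML, a third-party package (pip install pyyaml)

record = {"campaign": "Brand", "budget": 500, "keywords": ["shoes", "boots"]}

print(json.dumps(record, indent=2))  # strict syntax: braces and quotes everywhere
print(yaml.safe_dump(record))        # indentation-based, easier for humans to edit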

Chapter 76 – Generate Feature Importance Using Scikit-Learn and Random Forest

The random forest algorithm has been applied across a number of industries, allowing them to make better business decisions. Some use cases include high-credit-risk analysis and product recommendations for cross-selling purposes.

In this piece, I will briefly walk you through several methods of generating feature importance using the classic red wine quality dataset. By the end of this chapter, you will have a basic understanding of how to apply Random Forest to your own projects and compare the results across the different methods.
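Here is a minimal sketch of one such method, the impurity-based importance built into scikit-learn; it uses scikit-learn's bundled wine dataset as a stand-in for the red wine quality data referenced above.

from sklearn.datasets import load_wine
from sklearn.ensemble import RandomForestClassifier

# Bundled wine dataset used as a stand-in for the red wine quality dataset
X, y = load_wine(return_X_y=True, as_frame=True)

model = RandomForestClassifier(n_estimators=200, random_state=42).fit(X, y)

# Impurity-based feature importance, one score per input column
ranked = sorted(zip(X.columns, model.feature_importances_), key=lambda t: t[1], reverse=True)
for name, score in ranked:
    print(f"{name}: {score:.3f}")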

Chapter 37 – Brand Info Bot for Scraping Brand Web Domains with Python and Easy2Digital APIs

An objective-oriented scraping project consists of many standalone Python bot scripts that connect and function together. One of the most useful pieces of data to scrape for potential leads is the brand's web domain, since that is usually where we first learn about a brand. The question is how to grab these domains automatically and in bulk rather than relying on manual Google searches. This article shows how to build such a bot with Python, Clearbit, and SQLite3.
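A rough sketch of the idea, assuming Clearbit's public Autocomplete endpoint for the name-to-domain lookup and a local SQLite file for storage; the endpoint URL, response shape, and table schema are assumptions rather than the article's exact implementation.

import sqlite3
import requests

def fetch_domain(brand):
    # Assumed Clearbit Autocomplete endpoint: returns a list of company suggestions
    resp = requests.get(
        "https://autocomplete.clearbit.com/v1/companies/suggest",
        params={"query": brand},
        timeout=10,
    )
    results = resp.json()
    return results[0]["domain"] if results else None

conn = sqlite3.connect("brands.db")
conn.execute("CREATE TABLE IF NOT EXISTS brands (name TEXT PRIMARY KEY, domain TEXT)")

for brand in ["Nike", "Adidas"]:
    conn.execute("INSERT OR REPLACE INTO brands VALUES (?, ?)", (brand, fetch_domain(brand)))

conn.commit()
conn.close()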

Chapter 7: Manipulate Data in Google Sheets Using Easy2Digital APIs and Google Sheets Key

In the previous Python tutorial, we talked about how to scrape more than 50 videos from a YouTube search query keyword and grab the performance of each video, such as views, comments, and likes. However, that is not the full extent of automation, say, if you aim to research and filter YouTubers and automate the collaboration invitation process. At minimum, the fetched list of YouTubers should be saved and managed in a datasheet on a cloud drive instead of in a CSV file, so that it can be set up and easily integrated with other platforms.

So in this Python tutorial, I will continue with the script from Chapter 6 and walk you through how to create a robot user account and leverage the Google Sheets API to save all fetched data in a Google Sheet from your web scraping Python script. By the end of this tutorial, you will know which modules you need to set up and can simply open a spreadsheet that automatically lists all videos in a preset format.
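For orientation, here is a small sketch of the "save to a Google Sheet" step using the third-party gspread client; the service-account key file, spreadsheet key, and example rows are placeholders, and the actual chapter relies on the Easy2Digital APIs rather than this exact client.

import gspread  # third-party Google Sheets client (pip install gspread)

# Assumes a Google service account ("robot" user) whose JSON key is stored locally
# and which has edit access to the target spreadsheet
gc = gspread.service_account(filename="service_account.json")
sheet = gc.open_by_key("YOUR_SPREADSHEET_KEY").sheet1

# Hypothetical rows fetched by the YouTube scraping script from Chapter 6
rows = [
    ["Video Title", "Views", "Likes", "Comments"],
    ["How to pivot data in pandas", 15300, 420, 37],
]
sheet.append_rows(rows)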
