
A Beginner's Guide to Machine Learning Regression Analysis

An illustrated guide to regression analysis, using examples, diagrams, animations, and cheat sheets.

Context

To understand the motivation behind regression, consider the following basic example. The scatter plot below displays the number of college graduates in the United States from 2001 to 2012.


What if someone asked you, based on the available data, how many college graduates with master's degrees there will be in 2018? The number of college graduates with master's degrees increases almost linearly with each passing year, so based on a quick visual review we can estimate the number to be between 2.0 and 2.1 million. Let's check that against the numbers. The graph below plots the same variable from 2001 to 2018. As can be seen, our estimate was within a few percent of the actual value.

Our mind solved the problem easily because it was a simple one: fitting a line to data. Regression analysis is the method of fitting a function to a collection of data points.


What is Regression Analysis and How Does It Work?

Regression analysis is the method of estimating the relationship between a dependent variable and independent variables. To put it another way, it means fitting a function from a selected family of functions to the sampled data, while accounting for error. Regression analysis is one of the most basic estimation methods in machine learning. Using regression, you fit a function to the available data and try to predict the outcome for future or hold-out data points. This function-fitting is beneficial in two ways:

  1. Within your data set, you can estimate missing data (Interpolation)

  2. Outside of your data set, you can make educated guesses about future data (Extrapolation)

Some real-world examples of regression analysis include predicting the price of a house from its features, predicting the effect of SAT/GRE scores on college admissions, predicting sales from input parameters, and predicting the weather.

Let's go back to the college graduates example from earlier.


Interpolation:

Let's pretend we only have access to sparse data, such as the number of college graduates every four years, as depicted in the scatter plot below.


For the years in between, we'd like to estimate the number of college graduates. We can do this by fitting a line to the few data points we have; this procedure is called interpolation.
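As a concrete sketch, here is how that line-fitting could look with NumPy. The year and graduate figures below are illustrative placeholders standing in for the sparse four-year samples, not the exact values from the plot.

import numpy as np

# Sparse samples: one observation every four years (illustrative values,
# in millions of master's graduates, not the exact figures from the plot).
years = np.array([2001.0, 2005.0, 2009.0])
grads = np.array([1.35, 1.52, 1.71])

# Fit a line (degree-1 polynomial) to the sparse points by least squares.
# np.polyfit returns coefficients from highest degree down: [slope, intercept].
beta1, beta0 = np.polyfit(years, grads, deg=1)

# Interpolation: estimate the missing years between the samples.
for year in range(2002, 2009):
    print(year, round(beta0 + beta1 * year, 2))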

Extrapolation:

Say we only have minimal data from 2001 to 2012 and want to forecast the number of college graduates from 2013 to 2018.

The number of college graduates with master's degrees increases almost linearly with each passing year. As a result, fitting a line to the dataset makes sense. Fitting a line to the 12 known points and then testing its predictions on the 6 future points shows that the prediction is very close.
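In code, the extrapolation step looks almost identical to interpolation; only the query points lie outside the training range. A minimal sketch, again assuming illustrative data values:

import numpy as np

# Known data: 2001-2012 (illustrative values, in millions of graduates).
train_years = np.arange(2001, 2013, dtype=float)
train_grads = np.linspace(1.34, 1.75, num=12)  # roughly linear growth

# Fit a line to the 12 known points.
beta1, beta0 = np.polyfit(train_years, train_grads, deg=1)

# Extrapolation: predict the 6 future points (2013-2018).
future_years = np.arange(2013, 2019, dtype=float)
print(beta0 + beta1 * future_years)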

In mathematical terms: given data points (x_i, y_i), i = 1, …, N, regression finds the parameters β of a function f_β from the chosen family that minimise a loss function l over the data:

\hat{\beta} = \arg\min_{\beta} \sum_{i=1}^{N} l\left( f_\beta(x_i), y_i \right)

Types of Regression Analysis

Let's take a look at some of the various approaches to regression. We can categorise regression into the following groups based on the family of functions f_β and the loss function l used.

1. Linear Regression

The aim of linear regression is to fit a hyperplane (a line for 2D data points) by minimising the mean-squared error over the data points.

In mathematical terms, linear regression solves the following problem:

\hat{\beta} = \arg\min_{\beta_0, \beta_1} \sum_{i=1}^{N} \left( y_i - (\beta_0 + \beta_1 x_i) \right)^2

So we must find the two variables, β₀ and β₁, that parameterise the linear function f_β(·). The figure below shows an example of linear regression on P = 5 data points, along with the fitted linear function with β₀ = -90.798 and β₁ = 0.046.
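Here is a minimal sketch of solving this least-squares problem directly with NumPy's linear algebra routines. The data is an illustrative stand-in for the graduates example, so the recovered coefficients will only roughly resemble the β₀ and β₁ quoted above.

import numpy as np

# Illustrative (x, y) pairs: year vs. graduates in millions.
x = np.array([2001.0, 2004.0, 2007.0, 2010.0, 2012.0])
y = np.array([1.34, 1.47, 1.60, 1.69, 1.75])

# Design matrix [1, x] so that y ≈ beta0 + beta1 * x.
X = np.column_stack([np.ones_like(x), x])

# Solve min over beta of ||y - X @ beta||^2.
(beta0, beta1), *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"beta0 = {beta0:.3f}, beta1 = {beta1:.3f}")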

2. Polynomial Regression


Linear regression assumes the relationship between the dependent (y) and independent (x) variables is linear, so it fails to fit the data points when the relationship between them is not linear. Instead of fitting a line, polynomial regression fits a polynomial of degree m to the data points. The more complex the function under consideration, the better its fitting capability (in general). In mathematical terms, polynomial regression solves the following problem:

\hat{\beta} = \arg\min_{\beta_0, \ldots, \beta_m} \sum_{i=1}^{N} \left( y_i - \sum_{j=0}^{m} \beta_j x_i^j \right)^2

So we must find the (m + 1) variables β₀, …, β_m. As can be seen, linear regression is a special case of polynomial regression, with degree m = 1.

Consider the scatter plot of the series of data points below. When we use linear regression, we get a fit that simply fails to capture the data points. However, when we use polynomial regression with degree 6, we get a much better fit, as shown below.

Image- [1] Scatter plot of data — [2] Linear regression on data — [3] Polynomial regression of degree 6

Linear regression struggled to estimate a well-fitting function because the data points did not have a linear relationship between the dependent and independent variables. Polynomial regression, on the other hand, was able to capture the non-linear relationship.
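A minimal sketch of this comparison with NumPy, using synthetic non-linear data in place of the scatter plot's actual points:

import numpy as np

# Synthetic non-linear data standing in for the scatter plot above.
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 30)
y = np.sin(x) + 0.1 * rng.standard_normal(x.size)

# Fit a line (degree 1) and a degree-6 polynomial to the same points.
linear_fit = np.polyfit(x, y, deg=1)
poly_fit = np.polyfit(x, y, deg=6)

# Compare the mean-squared error of the two fits on the data.
mse_linear = np.mean((np.polyval(linear_fit, x) - y) ** 2)
mse_poly = np.mean((np.polyval(poly_fit, x) - y) ** 2)
print(f"linear MSE: {mse_linear:.3f}, degree-6 MSE: {mse_poly:.3f}")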

3. Ridge Regression

In regression analysis, ridge regression addresses the problem of overfitting. To see what this means, consider the same example as before. When a polynomial of degree 25 is fitted to the 10 training points, the red data points are matched perfectly (centre figure below), but at the expense of the points in between (note the spike between the last two data points). Ridge regression attempts to solve this problem: by sacrificing some fit on the training points, it seeks to reduce the generalisation error.


Image- [1] Scatter plot of data — [2] Polynomial regression of degree 25 — [3] Polynomial ridge regression of degree 25

Mathematically, ridge regression changes the loss function by adding a penalty on the coefficients, and solves the following problem:

\hat{\beta} = \arg\min_{\beta} \sum_{i=1}^{N} \left( y_i - f_\beta(x_i) \right)^2 + \lambda \sum_{j=0}^{m} \beta_j^2

where λ ≥ 0 controls how strongly the coefficients are penalised: larger λ means smaller coefficients and a smoother fit.
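To make the penalty concrete, here is a minimal sketch using scikit-learn's Ridge estimator on a high-degree polynomial, with synthetic data in place of the figure's points; the alpha parameter plays the role of λ.

import numpy as np
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

# Ten synthetic training points standing in for the red points above.
rng = np.random.default_rng(1)
x = np.linspace(0, 1, 10).reshape(-1, 1)
y = np.sin(2 * np.pi * x).ravel() + 0.1 * rng.standard_normal(10)

# Plain least squares on degree-25 polynomial features tends to overfit...
plain = make_pipeline(PolynomialFeatures(degree=25), LinearRegression())
plain.fit(x, y)

# ...while the ridge penalty (alpha = lambda) shrinks the coefficients.
ridge = make_pipeline(PolynomialFeatures(degree=25), Ridge(alpha=1e-3))
ridge.fit(x, y)

# Evaluate both on a dense grid to expose spikes between training points.
grid = np.linspace(0, 1, 200).reshape(-1, 1)
print("plain max |prediction|:", float(np.abs(plain.predict(grid)).max()))
print("ridge max |prediction|:", float(np.abs(ridge.predict(grid)).max()))

Increasing alpha trades more training-set fit for smoother predictions between the training points.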