Logistic regression
Logistic regression, also known as logit regression or the logit model, is a mathematical model used in statistics to estimate (guess) the probability of an event occurring given some previous data. Logistic regression works with binary data, where either the event happens (1) or the event does not happen (0). So, given some feature x, it tries to find out whether some event y happens or not, where y can be either 0 or 1. If the event happens, y is given the value 1; if the event does not happen, y is given the value 0. For example, if y represents whether a sports team wins a match, then y will be 1 if they win the match and 0 if they do not. This is known as binomial logistic regression. There is another form of logistic regression in which the variable y can take more than two values; this form is known as multinomial logistic regression.
Logistic regression uses the logistic function to find a model that fits the data points. The function gives an 'S'-shaped curve to model the data. The curve is restricted between 0 and 1, so it is easy to apply when y is binary. Logistic regression can then model such events better than linear regression can, as it shows the probability that y is 1 for a given value of x. Logistic regression is used in statistics and machine learning to predict the probability of an outcome from previous test data.
Basics
Logistic regression is an alternative to the simpler linear regression. Linear regression tries to predict the data by finding a linear (straight line) equation to model or predict future data points. Logistic regression does not model the relationship between the two variables as a straight line. Instead, logistic regression uses the natural logarithm function to find the relationship between the variables, and uses test data to find the coefficients. The function can then predict future results by using these coefficients in the logistic equation.
Logistic regression uses the concept of odds to calculate the probability. The odds of an event are defined as the ratio of the probability that the event happens to the probability that it does not happen. For example, the probability that a sports team wins a certain match might be 0.75. The probability that the team loses would then be 1 - 0.75 = 0.25. The odds of the team winning would be 0.75/0.25 = 3. This can be stated as: the odds of the team winning are 3 to 1.[1]
The odds can be defined as:
[math]\displaystyle{ Odds = {P(y=1|x) \over 1-P(y=1|x)} }[/math]
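As a minimal sketch in Python (the function name is an illustrative choice, and 0.75 is the sports-team example from above), the odds can be computed directly from a probability:

```python
def odds(p):
    """Odds of an event with probability p: p / (1 - p)."""
    return p / (1.0 - p)

# The sports-team example from above: a 0.75 chance of winning
print(odds(0.75))  # 3.0, i.e. odds of winning are 3 to 1
```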
The natural logarithm of the odds is then taken in order to create the logistic equation. This new equation is known as the logit:
[math]\displaystyle{ Logit(P(x)) = \ln \left ( {P(y = 1|x) \over 1 - P(y = 1|x) } \right) }[/math]
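Continuing the sketch above (again, the function name is illustrative), the logit is just the natural logarithm of those odds:

```python
import math

def logit(p):
    """Natural logarithm of the odds: ln(p / (1 - p))."""
    return math.log(p / (1.0 - p))

print(logit(0.75))  # about 1.0986, which is ln(3)
```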
In logistic regression the logit of the probability is assumed to be linear with respect to x, so the logit becomes:
[math]\displaystyle{ Logit(P(x)) = a + bx }[/math]
Setting these two expressions for the logit equal to each other and taking the exponential of both sides then gives the following:
[math]\displaystyle{ {P(y = 1|x) \over 1 - P(y = 1|x)} = e^{a+bx} }[/math]
This then leads to the probability:
[math]\displaystyle{ P(y=1|x) = {e^{a+bx} \over 1 + e^{a+bx}} = {1 \over 1 + e^{-(a + bx)}} }[/math] [2]
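The two forms of this equation are the same function written two ways. A minimal Python sketch can check this numerically; the coefficient values a = -2 and b = 0.5 are illustrative assumptions, not from the text:

```python
import math

def p_form1(a, b, x):
    """P(y=1|x) written as e^(a+bx) / (1 + e^(a+bx))."""
    z = a + b * x
    return math.exp(z) / (1.0 + math.exp(z))

def p_form2(a, b, x):
    """P(y=1|x) written as the sigmoid form 1 / (1 + e^-(a+bx))."""
    z = a + b * x
    return 1.0 / (1.0 + math.exp(-z))

# Illustrative coefficients, not from the text
a, b = -2.0, 0.5
for x in [0.0, 2.0, 4.0, 8.0]:
    print(x, p_form1(a, b, x), p_form2(a, b, x))  # both forms agree
```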
This final equation is the logistic curve for logistic regression. It models the non-linear relationship between x and y with an 'S'-shaped curve for the probability that y = 1, that is, that the event occurs. In this example, a is the intercept and b is the gradient, just as in linear regression. The logit equation can then be expanded to handle multiple gradients. This gives more freedom in how the logistic curve matches the data. The product of two vectors can then be used to model more gradient values, giving the following equation:
[math]\displaystyle{ Logit(P(x)) = w_0x^0 + w_1x^1 + w_2x^2 + \cdots + w_nx^n = w^Tx }[/math]
In this equation, [math]\displaystyle{ w = [w_0, w_1, w_2, \ldots, w_n] }[/math] represents the n + 1 gradients (coefficients) of the equation. The powers of x are given by the vector [math]\displaystyle{ x = [1, x, x^2, \ldots, x^n] }[/math]. These two vectors give the new logit equation with multiple gradients. The logistic equation can then be changed to show this:
[math]\displaystyle{ P(y=1|x) = {1 \over 1 + e^{-(w^Tx)}} }[/math]
This is then a more general logistic equation allowing for more gradient values.
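A short sketch of this general form, assuming the polynomial features described above; the toy data and the plain gradient-ascent fit are illustrative assumptions, since the text does not specify how the coefficients are found:

```python
import numpy as np

def features(x, n):
    """Feature vector [1, x, x^2, ..., x^n] for one input value x."""
    return np.array([x ** k for k in range(n + 1)])

def predict(w, x):
    """P(y=1|x) = 1 / (1 + e^-(w^T x)) using the polynomial features."""
    return 1.0 / (1.0 + np.exp(-w @ features(x, len(w) - 1)))

# Toy data (illustrative): the event tends to happen for larger x
xs = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
ys = np.array([0.0, 0.0, 0.0, 1.0, 1.0, 1.0])

# Degree 1, so w = [w0, w1]: this is the a + bx case from earlier
n = 1
w = np.zeros(n + 1)
X = np.stack([features(x, n) for x in xs])

# Fit w by simple gradient ascent on the log-likelihood
# (one common fitting choice, assumed here for illustration)
for _ in range(1000):
    p = 1.0 / (1.0 + np.exp(-X @ w))
    w += 0.1 * X.T @ (ys - p)

print(w)                                       # fitted coefficients
print([round(predict(w, x), 2) for x in xs])   # fitted probabilities
```

With more coefficients (a larger n), the same code fits a more flexible 'S'-shaped curve, which is the extra freedom the general equation provides.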
Logistic Regression Media
Example graph of a logistic regression curve fitted to data. The curve shows the probability of passing an exam (binary dependent variable) versus hours studying (scalar independent variable). See § Example for worked details.
The image represents an outline of what an odds ratio looks like in writing, through a template, in addition to the test-score example in the "Example" section. In simple terms, if we hypothetically get an odds ratio of 2 to 1, we can say: "For every one-unit increase in hours studied, the odds of passing (group 1) or failing (group 0) are (expectedly) 2 to 1" (Denis, 2019).
Comparison of the logistic function with a scaled inverse probit function (i.e. the CDF of the normal distribution), comparing [math]\displaystyle{ \sigma(x) }[/math] vs. [math]\displaystyle{ \Phi(\sqrt{\frac{\pi}{8}}x) }[/math], which makes the slopes the same at the origin. This shows the heavier tails of the logistic distribution.
References
- ↑ "What is Logistic Regression? - University of Strathclyde". Archived from the original on 2015-05-08. Retrieved 2015-05-01.
- ↑ "Logistic Regression". faculty.cas.usf.edu.