Linear regression

Linear regression is a mathematical way of looking at how one quantity changes when other quantities change. A linear regression uses a dependent variable and one or more explanatory variables to create a straight line. This straight line is known as the "regression line".

Linear regression was the first of many ways of performing regression analysis. This is because models which depend linearly on their unknown parameters are easier to fit than models which are related non-linearly to their parameters. Another advantage of linear regression is that the statistical properties of the resulting estimators are easier to determine.

Linear regression has many practical uses. Most applications fall into one of two broad categories: using the fitted line to predict or forecast new values of the dependent variable, or using it to measure how strongly the dependent variable is related to the explanatory variables.
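The straight-line relationship described above can also be written as an equation. The form below is a standard textbook way of writing a linear regression model, not a formula taken from this article: y is the dependent variable, x1 through xp are the explanatory variables, the β's are the unknown parameters that the regression estimates, and ε is the error term.

\[
y = \beta_0 + \beta_1 x_1 + \beta_2 x_2 + \cdots + \beta_p x_p + \varepsilon
\]

With a single explanatory variable this reduces to y = β0 + β1 x + ε, which is the equation of a straight line with intercept β0 and slope β1.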
Linear regression models try to make the vertical distance between the line and the data points (the residuals) as small as possible.[2] This is called "fitting the line to the data". Often, linear regression models minimize the sum of the squares of the residuals (least squares), but other ways of fitting exist.[3] These include minimizing the "lack of fit" in some other norm (as in least absolute deviations regression), or minimizing a penalized version of the least squares loss function, as in ridge regression. The least squares approach can also be used to fit models that are not linear, so although the terms "least squares" and "linear model" are closely linked, they are not synonyms.
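The least squares idea in the paragraph above can be made concrete with a few lines of code. The sketch below is only an illustration and is not part of the original article: it uses Python with NumPy, and the small data set in it is invented. np.linalg.lstsq finds the intercept and slope that make the sum of squared residuals as small as possible.

# A minimal sketch of ordinary least squares fitting with NumPy.
# The data values below are invented purely for illustration.
import numpy as np

# Explanatory variable x and dependent variable y (made-up example data).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Design matrix with a column of ones so the model has an intercept.
A = np.column_stack([np.ones_like(x), x])

# lstsq returns the coefficients that minimize the sum of squared
# residuals, i.e. the vertical distances described above.
coeffs, _, _, _ = np.linalg.lstsq(A, y, rcond=None)
intercept, slope = coeffs

print(f"fitted line: y = {intercept:.2f} + {slope:.2f} * x")

# The residuals are the differences between the data and the fitted line.
predicted = A @ coeffs
print("residuals:", y - predicted)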
Usage

Economics

Linear regression is the main analytical tool in economics. For example, it is used to guess consumption spending,[4] fixed investment spending, inventory investment, purchases of a country's exports,[5] spending on imports,[5] the demand to hold liquid assets,[6] labor demand[7] and labor supply.[7]

Related pages

References