# Regression

From AMS Glossary

## regression

The statistical counterpart or analogue of the functional expression, in ordinary mathematics, of one variable in terms of others.

A random variable is seldom uniquely determined by any other variables, but it may assume a unique mean value for a prescribed set of values of any other variables. The variate *y* is statistically dependent upon other variates *x*_{1}, *x*_{2}, · · ·, *x*_{n} when it has different probability distributions for different sets of values of the *x*s. In that case its mean value, called its conditional mean, corresponding to given values of the *x*s will ordinarily be a function of the *x*s. The regression function *Y* of *y* with respect to *x*_{1}, *x*_{2}, · · ·, *x*_{n} is the functional expression, in terms of the *x*s, of the conditional mean of *y*. This is the basis of statistical estimation or prediction of *y* for known values of the *x*s.

From the definition of the regression function, we may deduce the following fundamental properties:

*E*(*y*) = *E*(*Y*),  σ^{2}(*y*) = σ^{2}(*Y*) + σ^{2}(*y* - *Y*),

where σ^{2}(*w*) denotes the variance of any variate *w*, and *E*(*w*) denotes the expected value of *w*.

The variate *y* is called the regressand, and the associated variates *x*_{1}, *x*_{2}, · · ·, *x*_{n} are called regressors; or, alternatively, *y* is called the predictand, and the *x*s are called predictors. When it is necessary to resort to an approximation *Y*′ of the true regression function *Y*, the approximating function is usually expanded as a series of terms *Y*_{1}, *Y*_{2}, · · ·, *Y*_{m}, each of which may involve one or more of the basic variates *x*_{1}, *x*_{2}, · · ·, *x*_{n}. By extension of the original definitions, the component functions *Y*_{1}, *Y*_{2}, · · ·, *Y*_{m} are then called regressors or predictors.

Various quantities associated with regression are referred to by the following technical terms: The variance σ^{2}(*y*) of the regressand is called the total variance. The quantity *y* - *Y* is variously termed the residual, the error, or the error of estimate. Its variance σ^{2}(*y* - *Y*) is called the unexplained variance, the residual variance, or the mean-square error; and its positive square root σ(*y* - *Y*) is called the residual standard deviation, the standard error of estimate, the standard error, or the root-mean-square error.
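The fundamental properties of the regression function can be checked numerically. A minimal sketch in Python (NumPy only; the simulated data and the linear form assumed for the regression function are illustrative, not part of the definition):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative data: y depends on two regressors x1, x2 plus noise.
n = 10_000
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 2.0 * x1 - 1.0 * x2 + rng.normal(scale=0.5, size=n)

# Least-squares fit of a (here linear) regression function Y of y on x1, x2.
X = np.column_stack([np.ones(n), x1, x2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
Y = X @ coef

# Fundamental properties: the regressand and the regression function share
# the same mean, and the total variance splits into explained plus
# unexplained (residual) variance.
print(np.isclose(y.mean(), Y.mean()))                # E(y) = E(Y)
print(np.isclose(y.var(), Y.var() + (y - Y).var()))  # s2(y) = s2(Y) + s2(y - Y)
```

For a least-squares fit with an intercept, both identities hold exactly (up to floating point) because the residuals have zero mean and are uncorrelated with the fitted values.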
The variance σ^{2}(*Y*) of the regression function is called the explained variance or the variance reduction; the ratio σ^{2}(*Y*)/σ^{2}(*y*) of explained to total variance is called the relative reduction, or, expressed in percent, the percent reduction.
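The relative reduction is simply the ratio of explained to total variance (in modern usage, the coefficient of determination). A short sketch, with an assumed one-regressor linear model for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative one-regressor example: slope 3, unit-variance noise,
# so the theoretical relative reduction is 9 / (9 + 1) = 0.9.
n = 50_000
x = rng.normal(size=n)
y = 3.0 * x + rng.normal(scale=1.0, size=n)

# Linear regression function Y of y on x.
X = np.column_stack([np.ones(n), x])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
Y = X @ coef

relative_reduction = Y.var() / y.var()          # explained / total variance
percent_reduction = 100.0 * relative_reduction  # the same quantity in percent
print(relative_reduction)
```

With a large sample, the printed ratio lies close to the theoretical value of 0.9.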