1) The term comes from regression toward the mean: in the usual simple linear regression model
$$y = \alpha + \beta x + \epsilon$$
if we regress $y$ on $x$, the fitted value $\hat{y}$ is fewer standard deviations from $\bar{y}$ than $x$ is from $\bar{x}$:
$$\frac{|\hat{y} - \bar{y}|}{s_y} < \frac{|x - \bar{x}|}{s_x}$$
For example, if we use the BOD data frame built into R, then:
fm <- lm(demand ~ Time, BOD)
with(BOD, all( abs(fitted(fm) - mean(demand)) / sd(demand) < abs(scale(Time))))
## [1] TRUE
For a proof see: https://en.wikipedia.org/wiki/Regression_toward_the_mean
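In brief, here is a sketch of why the inequality holds: the least-squares slope satisfies $\hat{\beta} = r\,s_y/s_x$, where $r$ is the sample correlation, and (as noted further below) $\hat{y} - \bar{y} = \hat{\beta}(x - \bar{x})$, so
$$\frac{|\hat{y} - \bar{y}|}{s_y} = |\hat{\beta}|\,\frac{|x - \bar{x}|}{s_y} = |r|\,\frac{|x - \bar{x}|}{s_x} < \frac{|x - \bar{x}|}{s_x}$$
whenever $|r| < 1$ and $x \neq \bar{x}$.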
2) The term on (as in regressing y on x) comes from the fact that the fitted values are the projection of the outcome variable onto the subspace spanned by the predictor variables (including the intercept), as further explained in many sources such as http://people.eecs.ku.edu/~jhuan/EECS940_S12/slides/linearRegression.pdf .
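To illustrate with the same BOD example (a quick numerical sketch, not part of the original answer), the fitted values from lm equal the orthogonal projection of demand onto the column space of the model matrix (intercept and Time):
fm <- lm(demand ~ Time, BOD)
X <- model.matrix(fm)                    # columns: intercept, Time
H <- X %*% solve(crossprod(X)) %*% t(X)  # hat (projection) matrix
all.equal(as.vector(H %*% BOD$demand), as.vector(fitted(fm)))
## [1] TRUE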
Note
Regarding the comment below: what the commenter states is what the answer already states above in formula form, except that the answer states it correctly. In fact, because of the equality
$$\hat{y} - \bar{y} = \hat{\beta}(x - \bar{x})$$
the dependent variable is not necessarily, on average, closer to its mean than the predictor is to its mean unless $|\hat{\beta}| < 1$. What is true is that the dependent variable is, on average, fewer standard deviations from its mean than the predictor is from its mean, as stated in the formula in the answer.
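This equality is easy to verify numerically on the BOD example above (a quick check, not in the original answer):
fm <- lm(demand ~ Time, BOD)
# fitted - mean(y) equals slope * (x - mean(x))
with(BOD, all.equal(as.vector(fitted(fm) - mean(demand)),
                    coef(fm)[["Time"]] * (Time - mean(Time))))
## [1] TRUE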
Using Galton's data, to which the comment refers (available in the UsingR package in R), I ran the regression and the slope is in fact 0.646, so the average child was closer to its mean than its parent was to its mean, but that is not the general case. The current usage of regression to the mean is based on the correct general relationship, which we showed in the answer. In the example shown in the R code above, $\hat{\beta} > 1$, so it is not true that demand is necessarily closer to the mean demand than Time is to the mean Time, and we can readily check numerically in this example that it is not always closer. It is only true if we measure closeness in standard deviations, as the inequality in the answer shows.
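Both claims can be checked with a few lines of R (a sketch, assuming the galton data frame in the UsingR package with columns child and parent):
library(UsingR)   # provides the galton data frame (child, parent)
coef(lm(child ~ parent, data = galton))[["parent"]]   # approximately 0.646
# In the BOD example the slope exceeds 1, so in raw units the fitted demand
# is not closer to mean(demand) than Time is to mean(Time):
fm <- lm(demand ~ Time, BOD)
with(BOD, all(abs(fitted(fm) - mean(demand)) < abs(Time - mean(Time))))
## [1] FALSE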