The common basis for linear as well as multiple regression is the minimization of the sum of squared errors (SSE) between the experimentally observed values and the values predicted by the model, as shown in equation 4.19:

$$\mathrm{SSE} = \sum_{i=1}^{n} \left[ y_i - f(x_i) \right]^2 \tag{4.19}$$
In this equation, $y_i$ is the observed value, and $f(x_i)$ is the predicted value based on the presumed function $f$. The function can be linear in a single variable (generally what is implied by the term linear regression), linear in multiple variables (multiple regression), or polynomial (polynomial regression). Minimization of SSE yields values of the model parameters (slope and intercept for a linear function, for example) in terms of the observed data points $(x_i, y_i)$. The least squares regression formulas are built into many software programs.
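As a minimal sketch of this minimization for the straight-line case, the following Python code (assuming NumPy is available; the data values are hypothetical, chosen only for illustration) computes the slope and intercept from the closed-form least-squares formulas, which follow from setting the partial derivatives of SSE with respect to each parameter to zero:

```python
import numpy as np

# Hypothetical observed data points (x_i, y_i), for illustration only.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

# Least-squares estimates for a linear model f(x) = b0 + b1*x,
# obtained by minimizing SSE = sum((y_i - f(x_i))**2).
# Setting dSSE/db0 = 0 and dSSE/db1 = 0 gives these closed forms:
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()

# Residual sum of squared errors at the fitted parameters.
sse = np.sum((y - (b0 + b1 * x)) ** 2)
print(f"slope = {b1:.4f}, intercept = {b0:.4f}, SSE = {sse:.4f}")

# The same fit using NumPy's built-in least-squares polynomial fit,
# one example of the software routines mentioned in the text:
b1_np, b0_np = np.polyfit(x, y, deg=1)
```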