Least-Square Multiple Regression
Multiple Regression

Multiple regression estimates an outcome (the dependent variable) that may be affected by more than one control parameter (the independent variables), or by several control parameters changed at the same time. An example is the case of two independent variables $x$ and $y$ and one dependent variable $z$ in the linear relationship

    z = a + b x + c y

For a given data set $(x_1, y_1, z_1), (x_2, y_2, z_2), \ldots, (x_n, y_n, z_n)$, where $n \geq 3$, the best-fitting curve $f(x, y) = a + b x + c y$ has the least square error, i.e.,

    \Pi = \sum_{i=1}^{n} \left[ z_i - f(x_i, y_i) \right]^2 = \sum_{i=1}^{n} \left[ z_i - (a + b x_i + c y_i) \right]^2 = \min

Please note that $a$, $b$, and $c$ are unknown coefficients, while all $x_i$, $y_i$, and $z_i$ are given. To obtain the least square error, the unknown coefficients $a$, $b$, and $c$ must yield zero first derivatives:

    \frac{\partial \Pi}{\partial a} = 2 \sum_{i=1}^{n} \left[ z_i - (a + b x_i + c y_i) \right] (-1) = 0
    \frac{\partial \Pi}{\partial b} = 2 \sum_{i=1}^{n} \left[ z_i - (a + b x_i + c y_i) \right] (-x_i) = 0
    \frac{\partial \Pi}{\partial c} = 2 \sum_{i=1}^{n} \left[ z_i - (a + b x_i + c y_i) \right] (-y_i) = 0

Expanding the above equations, we have the normal equations

    a\, n + b \sum_{i=1}^{n} x_i + c \sum_{i=1}^{n} y_i = \sum_{i=1}^{n} z_i
    a \sum_{i=1}^{n} x_i + b \sum_{i=1}^{n} x_i^2 + c \sum_{i=1}^{n} x_i y_i = \sum_{i=1}^{n} x_i z_i
    a \sum_{i=1}^{n} y_i + b \sum_{i=1}^{n} x_i y_i + c \sum_{i=1}^{n} y_i^2 = \sum_{i=1}^{n} y_i z_i

The unknown coefficients $a$, $b$, and $c$ can hence be obtained by solving the above three linear equations.
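The procedure above, accumulating the sums, forming the normal equations, and solving the resulting 3x3 linear system, can be sketched in plain Python. The data points below are hypothetical, generated exactly from $z = 1 + 2x + 3y$, so the fit should recover $a = 1$, $b = 2$, $c = 3$; the `solve3` helper (Cramer's rule) is an illustrative name, not part of any particular library.

```python
# Hypothetical data points (x_i, y_i, z_i), drawn exactly from z = 1 + 2x + 3y.
pts = [(0, 0, 1), (1, 0, 3), (0, 1, 4), (1, 1, 6), (2, 1, 8), (1, 2, 9)]

# Accumulate the sums that appear in the normal equations.
n   = len(pts)
Sx  = sum(x for x, y, z in pts)
Sy  = sum(y for x, y, z in pts)
Sz  = sum(z for x, y, z in pts)
Sxx = sum(x * x for x, y, z in pts)
Syy = sum(y * y for x, y, z in pts)
Sxy = sum(x * y for x, y, z in pts)
Sxz = sum(x * z for x, y, z in pts)
Syz = sum(y * z for x, y, z in pts)

# Normal equations written as M [a, b, c]^T = v.
M = [[n,  Sx,  Sy],
     [Sx, Sxx, Sxy],
     [Sy, Sxy, Syy]]
v = [Sz, Sxz, Syz]

def solve3(M, v):
    """Solve a 3x3 linear system by Cramer's rule."""
    def det(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
              - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
              + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))
    D = det(M)
    sols = []
    for col in range(3):
        # Replace column `col` of M with v, then take the determinant ratio.
        Mc = [row[:] for row in M]
        for r in range(3):
            Mc[r][col] = v[r]
        sols.append(det(Mc) / D)
    return sols

a, b, c = solve3(M, v)
print(a, b, c)  # recovers the coefficients of z = a + b x + c y
```

Because the sample points lie exactly on a plane, the residual sum of squares is zero and the solved coefficients match the generating plane; with noisy data the same system yields the least-squares plane instead. In practice, larger systems are better solved with a numerical library (e.g. `numpy.linalg.solve`) than with Cramer's rule, which is shown here only because it keeps the 3x3 case self-contained.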