|Curve Fitting, Regression
Field data is often accompanied by noise. Even though all control parameters (independent variables) remain constant, the resultant outcomes (dependent variables) vary. A process of quantitatively estimating the trend of the outcomes, also known as regression or curve fitting, therefore becomes necessary.
The curve fitting process fits equations of approximating curves to the raw field data. Nevertheless, for a given set of data, the fitting curves of a given type are generally NOT unique. Thus, a curve with a minimal deviation from all data points is desired. This best-fitting curve can be obtained by the method of least squares.
|The Method of Least Squares
The method of least squares assumes that the best-fit curve of a given type is the curve that has the minimal sum of the deviations squared (least square error) from a given set of data.
Suppose that the data points are $(x_1, y_1)$, $(x_2, y_2)$, ..., $(x_n, y_n)$, where $x$ is the independent variable and $y$ is the dependent variable. The fitting curve $f(x)$ has the deviation (error) $d$ from each data point, i.e., $d_1 = y_1 - f(x_1)$, $d_2 = y_2 - f(x_2)$, ..., $d_n = y_n - f(x_n)$. According to the method of least squares, the best fitting curve has the property that:

$$\Pi = d_1^2 + d_2^2 + \cdots + d_n^2 = \sum_{i=1}^{n} d_i^2 = \sum_{i=1}^{n} \left[ y_i - f(x_i) \right]^2 = \text{a minimum}$$
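The criterion above can be sketched in a few lines of code: compute the deviation of each data point from a candidate curve, square, and sum. The data points and the two candidate lines below are illustrative assumptions, chosen so that one line clearly tracks the trend better than the other.

```python
# Sketch of the least-squares criterion: the sum of squared deviations
# d_i = y_i - f(x_i) between data points and a candidate fitting curve f(x).
# The data and both candidate lines are made-up for illustration.

def sum_squared_error(f, points):
    """Sum of squared deviations d_i = y_i - f(x_i) over all data points."""
    return sum((y - f(x)) ** 2 for x, y in points)

# Noisy samples scattered around the line y = 2x + 1 (illustrative data).
data = [(0, 1.1), (1, 2.9), (2, 5.2), (3, 6.8), (4, 9.1)]

good_fit = lambda x: 2.0 * x + 1.0   # close to the underlying trend
poor_fit = lambda x: 1.0 * x + 2.0   # a worse candidate line

err_good = sum_squared_error(good_fit, data)
err_poor = sum_squared_error(poor_fit, data)
print(err_good < err_poor)  # → True: the better line has the smaller error
```

The method of least squares simply selects, among all curves of the chosen type, the one whose error sum is smallest.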
|Polynomial Least-Squares Fitting
Polynomials are among the most commonly used types of curves in regression. The applications of the method of least squares to curve fitting with polynomials are briefly discussed below.
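For the simplest polynomial, a straight line $y = a + bx$, the minimization has a well-known closed-form solution (the normal equations). The sketch below, with illustrative data, recovers the line's coefficients exactly when the points lie on it:

```python
# Least-squares fit of a degree-1 polynomial (straight line) y = a + b*x,
# using the closed-form normal-equation solution. The data is illustrative.

def fit_line(points):
    """Return (a, b) minimizing sum((y_i - (a + b*x_i))^2)."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a = (sy - b * sx) / n
    return a, b

# Points lying exactly on y = 1 + 2x, so the fit should recover a=1, b=2.
a, b = fit_line([(0, 1), (1, 3), (2, 5), (3, 7)])
print(a, b)  # → 1.0 2.0
```

Higher-degree polynomial fits follow the same pattern, with one normal equation per coefficient.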
Multiple Regression Least-Squares: Multiple regression estimates an outcome that may be affected by more than one control parameter, or cases where more than one control parameter is changed at the same time, e.g., $z = f(x, y) = a + bx + cy$. See complete derivation.
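The multiple-regression case can be sketched for two control parameters by fitting $z = a + bx + cy$: setting the partial derivatives of the squared-error sum to zero yields a 3x3 system of normal equations. The variable names and data below are illustrative assumptions, not the notation of the linked derivation.

```python
# Hedged sketch of multiple regression with two control parameters:
# fit z = a + b*x + c*y by solving the 3x3 normal equations directly.

def solve3(m, v):
    """Solve a 3x3 linear system m @ sol = v by Gaussian elimination."""
    a = [row[:] + [rhs] for row, rhs in zip(m, v)]
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(a[r][i]))  # partial pivoting
        a[i], a[p] = a[p], a[i]
        for r in range(i + 1, 3):
            f = a[r][i] / a[i][i]
            a[r] = [a[r][k] - f * a[i][k] for k in range(4)]
    sol = [0.0] * 3
    for i in (2, 1, 0):  # back-substitution
        sol[i] = (a[i][3] - sum(a[i][k] * sol[k] for k in range(i + 1, 3))) / a[i][i]
    return sol

def fit_plane(data):
    """Least-squares coefficients (a, b, c) for z = a + b*x + c*y."""
    n = len(data)
    sx = sum(x for x, _, _ in data)
    sy = sum(y for _, y, _ in data)
    sz = sum(z for _, _, z in data)
    sxx = sum(x * x for x, _, _ in data)
    syy = sum(y * y for _, y, _ in data)
    sxy = sum(x * y for x, y, _ in data)
    sxz = sum(x * z for x, _, z in data)
    syz = sum(y * z for _, y, z in data)
    # Normal equations obtained by setting the partial derivatives of
    # sum((z_i - a - b*x_i - c*y_i)^2) with respect to a, b, c to zero.
    m = [[n, sx, sy], [sx, sxx, sxy], [sy, sxy, syy]]
    v = [sz, sxz, syz]
    return solve3(m, v)

# Points generated from z = 1 + 2x + 3y (exact, so the fit recovers them).
pts = [(0, 0, 1), (1, 0, 3), (0, 1, 4), (1, 1, 6), (2, 1, 8), (1, 2, 9)]
a, b, c = fit_plane(pts)
print(round(a, 6), round(b, 6), round(c, 6))  # → 1.0 2.0 3.0
```

With more control parameters the system simply grows, one normal equation per coefficient; in practice a linear-algebra library routine would replace the hand-rolled solver.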