Curve Fitting, Regression

Field data is often accompanied by noise. Even though all control parameters (independent variables) remain constant, the resultant outcomes (dependent variables) vary. A process of quantitatively estimating the trend of the outcomes, also known as regression or curve fitting, therefore becomes necessary.

The curve fitting process fits equations of approximating curves to the raw field data. Nevertheless, for a given set of data, the fitting curves of a given type are generally NOT unique. Thus, a curve with a minimal deviation from all data points is desired. This best-fitting curve can be obtained by the method of least squares.

The Method of Least Squares

The method of least squares assumes that the best-fit curve of a given type is the curve that has the minimal sum of the deviations squared (least square error) from a given set of data.

Suppose that the data points are (x1,y1), (x2,y2), ..., (xn,yn), where x is the independent variable and y is the dependent variable. The fitting curve f(x) has a deviation (error) d from each data point, i.e., d1 = y1 - f(x1), d2 = y2 - f(x2), ..., dn = yn - f(xn). According to the method of least squares, the best fitting curve has the property that the sum of the squared deviations

    d1^2 + d2^2 + ... + dn^2 = sum over i of [yi - f(xi)]^2

is a minimum.
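As a minimal sketch of this criterion (the data points here are hypothetical, not from the article), the following compares the sum of squared deviations for two candidate fitting lines; the better fit is the one with the smaller sum:

```python
import numpy as np

# Hypothetical data points (x_i, y_i), roughly following y = 1 + 2x
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 2.9, 5.1, 7.0])

def sum_squared_error(f, x, y):
    """Sum of squared deviations d_i = y_i - f(x_i)."""
    d = y - f(x)
    return np.sum(d**2)

# Two candidate lines: the least-squares criterion prefers the smaller sum
err_close = sum_squared_error(lambda t: 1.0 + 2.0 * t, x, y)
err_far = sum_squared_error(lambda t: 0.0 + 1.0 * t, x, y)
```

Minimizing this sum over all curves of the chosen type (e.g., over all lines y = a + bx) yields the best-fit parameters.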

Polynomials Least-Squares Fitting

Polynomials are among the most commonly used types of curves in regression. Applications of the method of least squares to polynomial curve fitting are briefly discussed below; for further information on a particular fit, see the complete derivation linked at the end of each item.

The Least-Squares Line: The least-squares line method uses a straight line y=a+bx to approximate the given set of data, (x1,y1), (x2,y2), ..., (xn,yn), where n>=2. See complete derivation.
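The line's coefficients a and b have a closed-form solution from the normal equations. A sketch of that solution on hypothetical data (the data values are illustrative only):

```python
import numpy as np

# Hypothetical data, roughly following y = 1.1 + 1.95x
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 7.0, 8.8])

n = len(x)
# Normal equations for the least-squares line y = a + b x
b = (n * np.sum(x * y) - np.sum(x) * np.sum(y)) / (n * np.sum(x**2) - np.sum(x)**2)
a = (np.sum(y) - b * np.sum(x)) / n
```

The same result can be obtained with a library routine such as numpy.polyfit(x, y, 1), which solves the identical least-squares problem.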

The Least-Squares Parabola: The least-squares parabola method uses a second-degree curve y=a+bx+cx^2 to approximate the given set of data, (x1,y1), (x2,y2), ..., (xn,yn), where n>=3. See complete derivation.
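For the parabola, the three coefficients can be found by solving the least-squares problem for the design matrix with columns 1, x, and x^2. A sketch on hypothetical noise-free data (so the fit recovers the generating coefficients exactly):

```python
import numpy as np

# Hypothetical data sampled from y = 1 + 2x + 3x^2
x = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
y = 1.0 + 2.0 * x + 3.0 * x**2

# Design matrix [1, x, x^2]; lstsq minimizes the sum of squared deviations
A = np.column_stack([np.ones_like(x), x, x**2])
coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
a, b, c = coeffs  # y ≈ a + b x + c x^2
```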

The Least-Squares mth-Degree Polynomial: The least-squares mth-degree polynomial method uses a polynomial y=a0+a1x+a2x^2+...+amx^m to approximate the given set of data, (x1,y1), (x2,y2), ..., (xn,yn), where n>=m+1. See complete derivation.
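The general mth-degree case follows the same pattern: build a Vandermonde design matrix with columns 1, x, ..., x^m and solve the least-squares system. A sketch (function name and data are illustrative, not from the article):

```python
import numpy as np

def polyfit_lstsq(x, y, m):
    """Least-squares fit of an m-th degree polynomial
    y = a0 + a1 x + ... + am x^m.
    Returns coefficients [a0, ..., am]; requires n >= m + 1 points."""
    A = np.vander(x, m + 1, increasing=True)  # columns: 1, x, x^2, ..., x^m
    coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coeffs

# Hypothetical cubic data: y = 2 - x + 0.5 x^3
x = np.linspace(-1.0, 1.0, 9)
y = 2.0 - x + 0.5 * x**3
coeffs = polyfit_lstsq(x, y, 3)
```

The condition n >= m + 1 ensures the system has at least as many equations as unknown coefficients.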

Multiple Regression Least-Squares: Multiple regression estimates outcomes that may be affected by more than one control parameter, or by several control parameters changing at the same time, e.g., z=ax+by. See complete derivation.
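For the plane z = ax + by, each control parameter contributes a column to the design matrix, and the same least-squares machinery applies. A sketch on hypothetical noise-free data generated from z = 2x + 3y:

```python
import numpy as np

# Hypothetical data with two control parameters; true surface is z = 2x + 3y
x = np.array([0.0, 1.0, 2.0, 1.0, 3.0])
y = np.array([1.0, 0.0, 1.0, 2.0, 1.0])
z = 2.0 * x + 3.0 * y

# One design-matrix column per control parameter
A = np.column_stack([x, y])
(a, b), *_ = np.linalg.lstsq(A, z, rcond=None)  # z ≈ a x + b y
```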