Linear Least Squares Regression

Introduction:
To compare data to known laws, it is important to represent the data mathematically. For example, in kinetics we are often concerned with the concentration of a substance. Measuring the concentration at several different times yields a set of data that we need to represent with an equation rather than as separate points. To do this we use a process called linear least squares fitting, which gives a linear fit in the slope-intercept form (y = mx + b).

For a general linear equation, y = mx + b, it is assumed that the errors in the y-values are substantially greater than the errors in the x-values. The vertical deviation of each point $(x_i, y_i)$ from the line is

$$d_i = y_i - (mx_i + b)$$
The "best" line is the one that minimizes the sum of the squares of these deviations,

$$\sum d_i^2 = \sum \left[\, y_i - (mx_i + b) \,\right]^2 .$$

By the use of matrix algebra (determinants), the values of the slope (m) and the y-intercept (b) that minimize this sum can be calculated.
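As a bridge to the determinant formulas below, setting the partial derivatives of this sum with respect to m and b equal to zero gives the two normal equations (a standard derivation, sketched here for reference; the intermediate algebra is not part of the original page):

$$\frac{\partial}{\partial m}\sum d_i^2 = 0 \;\Longrightarrow\; m\sum x_i^2 + b\sum x_i = \sum x_i y_i$$

$$\frac{\partial}{\partial b}\sum d_i^2 = 0 \;\Longrightarrow\; m\sum x_i + nb = \sum y_i$$

Solving this 2 × 2 linear system with Cramer's rule leads directly to the determinant expressions that follow.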

A short review of determinants: for a 2 × 2 matrix, the determinant is

$$\begin{vmatrix} a & b \\ c & d \end{vmatrix} = ad - bc .$$
Now, the values of m, b, and the determinant D can be determined from these matrices:

$$D = \begin{vmatrix} \sum x_i^2 & \sum x_i \\ \sum x_i & n \end{vmatrix} = n\sum x_i^2 - \left(\sum x_i\right)^2$$

$$m = \frac{1}{D}\begin{vmatrix} \sum x_i y_i & \sum x_i \\ \sum y_i & n \end{vmatrix} = \frac{n\sum x_i y_i - \sum x_i \sum y_i}{D}$$

$$b = \frac{1}{D}\begin{vmatrix} \sum x_i^2 & \sum x_i y_i \\ \sum x_i & \sum y_i \end{vmatrix} = \frac{\sum x_i^2 \sum y_i - \sum x_i \sum x_i y_i}{D}$$

where n is the number of data points.
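To illustrate these formulas, here is a minimal Python sketch; the function name least_squares_fit and the sample data are illustrative, not part of the original page.

```python
# A minimal sketch of the determinant (Cramer's rule) formulas above.

def least_squares_fit(xs, ys):
    """Return the slope m and intercept b of the best-fit line y = m*x + b."""
    n = len(xs)
    sum_x = sum(xs)
    sum_y = sum(ys)
    sum_xx = sum(x * x for x in xs)
    sum_xy = sum(x * y for x, y in zip(xs, ys))

    # D = | sum_xx  sum_x |
    #     | sum_x   n     |
    D = n * sum_xx - sum_x ** 2

    # m = | sum_xy  sum_x | / D        b = | sum_xx  sum_xy | / D
    #     | sum_y   n     |                | sum_x   sum_y  |
    m = (n * sum_xy - sum_x * sum_y) / D
    b = (sum_xx * sum_y - sum_x * sum_xy) / D
    return m, b


# Example: points lying exactly on y = 2x + 1
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]
m, b = least_squares_fit(xs, ys)
print(f"m = {m:.4f}, b = {b:.4f}")  # m = 2.0000, b = 1.0000
```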

Linear Least Squares Regression


Enter the number of data points you have, and then enter the data in the space provided. Enter the data as x,y pairs, with a single comma (no spaces) between the x and y values of each point. Separate each x,y pair with a tab or return. Do not use parentheses. (A sketch of how this format can be parsed follows the list of inputs below.)

Number of data points:
Data:
Graph size (from 0 to 1):
Number of decimal places:
Show y-intercept on graph
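For readers reproducing the calculator outside this page, here is a hedged sketch of how the x,y input format described above might be parsed in Python; the function name parse_points and the sample input are assumptions, not part of the original page.

```python
# A sketch of parsing whitespace-separated "x,y" pairs into two lists of floats.

def parse_points(raw_text):
    """Parse "x,y" pairs separated by tabs, spaces, or returns."""
    xs, ys = [], []
    for token in raw_text.split():        # tabs, spaces, and newlines all separate pairs
        x_str, y_str = token.split(",")   # exactly one comma per pair, no parentheses
        xs.append(float(x_str))
        ys.append(float(y_str))
    return xs, ys


# Example: four points entered one pair per line
raw = """0,1
1,3
2,5
3,7"""
xs, ys = parse_points(raw)
print(xs, ys)  # [0.0, 1.0, 2.0, 3.0] [1.0, 3.0, 5.0, 7.0]
```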