The Linear Regression App allows you to input your data points and make predictions based on linear regression. The app uses linear algebra (QR factorization and the Gram-Schmidt process) to compute the line of best fit and make predictions.
- Input Data Points: Users can input their own data points directly into the app.
- Graphing Interface: Visualize your data points and the resulting regression line.
- Background Customization: Customize the background of your graph with either a color or an image.
- AI Model Creation: Create and save your AI models for future predictions.
- Construct Matrix $A$ and Vector $b$: These represent the system of equations for the points.
- Projection and Gram-Schmidt: The projection function helps in computing orthogonal vectors. The Gram-Schmidt process orthogonalizes these vectors to form the matrix $Q$.
- Compute $R$: By multiplying $Q^T$ and $A$, we get $R$.
- Solve for Slope and Intercept: Finally, by solving the system $Rx = Q^T b$, we find the slope and intercept of the best fit line.
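For readers who want to see the whole pipeline before the step-by-step walkthrough, here is a minimal end-to-end sketch of the four steps above in Python with NumPy. It is only an illustration (not the app's actual source); the example points are the ones used in the walkthrough below.

```python
import numpy as np

# Example points used in the walkthrough below
points = [(0, 0), (1, 0), (1, 2)]

# Step 1: build A (one row [x, 1] per point) and vector b (the y-values)
A = np.array([[x, 1.0] for x, _ in points])
b = np.array([float(y) for _, y in points])

# Steps 2-3: QR factorization (NumPy's built-in; the walkthrough below does it by hand)
Q, R = np.linalg.qr(A)

# Step 4: solve R x = Q^T b for x = [slope, intercept]
slope, intercept = np.linalg.solve(R, Q.T @ b)
print(slope, intercept)  # approximately 1.0 and 0.0, i.e. y = x
```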
Start with the points you want to find the line of best fit for. Example points: $(0, 0)$, $(1, 0)$, and $(1, 2)$.

Create matrix $A$ and vector $b$ by writing the slope-intercept equation $y = m(x) + b(1)$ for each point.

Note: the (1) written next to $b$ in the slope-intercept equation is just to show a coefficient of 1; $b(1)$ simply equals $b$, but it helps with visualizing where $A$ comes from.
For the point $(0, 0)$:
- Equation: $(0) = m(0) + b(1)$

For the point $(1, 0)$:
- Equation: $(0) = m(1) + b(1)$

For the point $(1, 2)$:
- Equation: $(2) = m(1) + b(1)$
Matrix $A$ (the coefficients of $m$ and $b$ from each equation):

$A = \begin{bmatrix} 0 & 1 \\ 1 & 1 \\ 1 & 1 \end{bmatrix}$

Vector $b$ (the $y$-values):

$b = \begin{bmatrix} 0 \\ 0 \\ 2 \end{bmatrix}$
We represent this system in matrix form as $A\mathbf{x} = b$, where $\mathbf{x} = \begin{bmatrix} m \\ b \end{bmatrix}$.

To find the line of best fit, we need to determine the values of vector $\mathbf{x}$ (the slope $m$ and the intercept $b$).
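If you want to build $A$ and $b$ programmatically for any list of points, a small sketch (illustrative only, using NumPy) could look like this:

```python
import numpy as np

def build_system(points):
    """Build A (rows of [x, 1]) and b (the y-values) from (x, y) points."""
    A = np.array([[x, 1.0] for x, _ in points])
    b = np.array([float(y) for _, y in points])
    return A, b

A, b = build_system([(0, 0), (1, 0), (1, 2)])
print(A)  # [[0. 1.] [1. 1.] [1. 1.]]
print(b)  # [0. 0. 2.]
```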
We use QR factorization to convert matrix $A$ into the product of two matrices, $Q$ and $R$, so that $A = QR$.

Understand that you can look at the set of columns of a matrix as a set of vectors, since a vector is really just a single-column matrix.
- $Q$ is an orthonormal matrix.
  - What makes an orthonormal matrix?
    - The set of columns (vectors) are orthogonal (they point in directions perpendicular to each other). Look up pictures of orthogonal vectors if confused.
    - The set of columns (vectors) are normalized (unit length, or a length of 1).
      - For example, if $\mathbf{v} = \begin{bmatrix} a \\ b \end{bmatrix}$, then its length is $\sqrt{a^2 + b^2}$, which must equal 1 for the vector to be considered normalized. So to normalize any vector, just divide its entries by its length.
  - Example of an orthonormal matrix: $\begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}$. This is the identity matrix and is made up of ones on the diagonal. It is also orthonormal. In other words, the columns are both unit vectors and perpendicular to each other.
  - Property of an orthonormal matrix: $Q^T = Q^{-1}$ (Q transpose = Q inverse).
    - Don't forget this property of orthonormal matrices; we will use it later.
  - More info:
    - The transpose of a matrix is basically just the matrix flipped so that all the columns become the rows and the rows become the columns.
    - Here is an example: $\begin{bmatrix} 1 & 2 \\ 3 & 4 \\ 5 & 6 \end{bmatrix}^T = \begin{bmatrix} 1 & 3 & 5 \\ 2 & 4 & 6 \end{bmatrix}$
- $R$ is an upper triangular matrix.
  - What is an upper triangular matrix?
    - All non-zero entries are on or above the main diagonal (everything below the diagonal is zero).
    - Example of an upper triangular matrix: $\begin{bmatrix} 1 & 2 & 3 \\ 0 & 4 & 5 \\ 0 & 0 & 6 \end{bmatrix}$
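A quick way to convince yourself of both properties is to factor the example matrix $A$ numerically and check them. This is an illustrative sketch using NumPy's built-in QR, not the app's code:

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [1.0, 1.0],
              [1.0, 1.0]])

Q, R = np.linalg.qr(A)

# The columns of Q are orthonormal, so Q^T Q is the identity matrix.
# (For a square Q this is exactly the Q^T = Q^(-1) property.)
print(np.allclose(Q.T @ Q, np.eye(2)))  # True

# R is upper triangular: every entry below the main diagonal is zero.
print(np.allclose(R, np.triu(R)))       # True
```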
We will need to understand how vector projection works in order to continue, so let's project vector $\mathbf{v} = [1, 1]$ onto vector $\mathbf{u} = [2, 0]$.

To find the projection, you can visualize it or use the projection equation:

$\operatorname{proj}_{\mathbf{u}}(\mathbf{v}) = \frac{\mathbf{v} \cdot \mathbf{u}}{\mathbf{u} \cdot \mathbf{u}}\,\mathbf{u}$

- Dot Product Calculation: $\mathbf{v} \cdot \mathbf{u} = (1)(2) + (1)(0) = 2$ and $\mathbf{u} \cdot \mathbf{u} = (2)(2) + (0)(0) = 4$
- Projection Calculation: $\operatorname{proj}_{\mathbf{u}}(\mathbf{v}) = \frac{2}{4}[2, 0] = [1, 0]$
To understand this visually, imagine the two vectors were real rods, and you shined a light directly above them. The shadow that the $[1, 1]$ rod casts onto the $[2, 0]$ rod is the projection, $[1, 0]$.
Now, the reason we use projection to get orthogonal vectors is that if you take a vector and subtract its projection onto another vector, you are left with the perpendicular part of that vector.

For example, with the vectors $[1,1]$ and $[2,0]$:
- Subtracting its projection $[1,0]$ from $[1,1]$ leaves you with $[0,1]$.
- The vectors $[0,1]$ and $[2,0]$ are perpendicular.
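Here is the same projection example in code, a small illustrative sketch using NumPy:

```python
import numpy as np

def project(v, u):
    """Projection of v onto u: (v . u / u . u) * u."""
    return (np.dot(v, u) / np.dot(u, u)) * u

v = np.array([1.0, 1.0])
u = np.array([2.0, 0.0])

p = project(v, u)
print(p)                 # [1. 0.]  -> the "shadow" of v on u
print(v - p)             # [0. 1.]  -> the part of v perpendicular to u
print(np.dot(v - p, u))  # 0.0      -> dot product of perpendicular vectors is zero
```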
The Gram-Schmidt process converts the columns of $A$ into a set of orthogonal (perpendicular) vectors.

- Start with matrix $A$:

$A = \begin{bmatrix} 0 & 1 \\ 1 & 1 \\ 1 & 1 \end{bmatrix}$

Create vectors from $A$'s columns:

$\mathbf{v}_1 = \begin{bmatrix} 0 \\ 1 \\ 1 \end{bmatrix}, \quad \mathbf{v}_2 = \begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix}$

- Finding $\mathbf{u}_1$: The first vector just equals the first original vector.

$\mathbf{u}_1 = \mathbf{v}_1 = \begin{bmatrix} 0 \\ 1 \\ 1 \end{bmatrix}$

- Finding $\mathbf{u}_2$: Now we subtract the projection of $\mathbf{v}_2$ onto $\mathbf{u}_1$, so we are just left with the perpendicular part.

$\mathbf{u}_2 = \mathbf{v}_2 - \operatorname{proj}_{\mathbf{u}_1}(\mathbf{v}_2) = \begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix} - \frac{2}{2}\begin{bmatrix} 0 \\ 1 \\ 1 \end{bmatrix} = \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix}$
Now that we have an orthogonal (perpendicular) set of vectors, we need to normalize them to make their length equal to 1. This will make the vectors both orthogonal and normalized, or orthonormal. Fortunately, $\mathbf{u}_2 = \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix}$ already has a length of 1, so only $\mathbf{u}_1$ needs to be normalized.

- Normalize $\mathbf{u}_1$:

$\mathbf{q}_1 = \frac{\mathbf{u}_1}{\|\mathbf{u}_1\|} = \frac{1}{\sqrt{2}}\begin{bmatrix} 0 \\ 1 \\ 1 \end{bmatrix} = \begin{bmatrix} 0 \\ 1/\sqrt{2} \\ 1/\sqrt{2} \end{bmatrix}$

Put the normalized vectors together as the columns of $Q$.

- Matrix $Q$:

$Q = \begin{bmatrix} 0 & 1 \\ 1/\sqrt{2} & 0 \\ 1/\sqrt{2} & 0 \end{bmatrix}$
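The same Gram-Schmidt steps can be written as a short function. This is an illustrative sketch with NumPy (classical Gram-Schmidt), not necessarily how the app implements it:

```python
import numpy as np

def gram_schmidt(A):
    """Return Q: the columns of A orthogonalized (via projections) and normalized."""
    A = np.asarray(A, dtype=float)
    Q = np.zeros_like(A)
    for j in range(A.shape[1]):
        u = A[:, j].copy()
        for k in range(j):
            # subtract the projection of this column onto each earlier q
            u -= np.dot(A[:, j], Q[:, k]) * Q[:, k]
        Q[:, j] = u / np.linalg.norm(u)  # normalize to unit length
    return Q

A = [[0, 1],
     [1, 1],
     [1, 1]]
Q = gram_schmidt(A)
print(Q)  # columns are [0, 1/sqrt(2), 1/sqrt(2)] and [1, 0, 0]
```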
This part is much easier once you know $Q$. Since you know $A = QR$, you can solve for $R$:
- Multiply both sides by $Q^{-1}$: $Q^{-1}A = Q^{-1}QR$
- $Q^{-1}Q$ basically cancels itself out, so the right side just equals $R$: $Q^{-1}A = R$
- Use the property of orthonormal matrices ($Q^T = Q^{-1}$): $R = Q^T A$

Knowing:

$Q^T = \begin{bmatrix} 0 & \frac{1}{\sqrt{2}} & \frac{1}{\sqrt{2}} \\ 1 & 0 & 0 \end{bmatrix}, \quad A = \begin{bmatrix} 0 & 1 \\ 1 & 1 \\ 1 & 1 \end{bmatrix}$

Compute $R = Q^T A$:

$R = \begin{bmatrix} 0 & \frac{1}{\sqrt{2}} & \frac{1}{\sqrt{2}} \\ 1 & 0 & 0 \end{bmatrix} \begin{bmatrix} 0 & 1 \\ 1 & 1 \\ 1 & 1 \end{bmatrix} = \begin{bmatrix} \sqrt{2} & \sqrt{2} \\ 0 & 1 \end{bmatrix}$
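In code, computing $R$ is a single matrix multiplication. This sketch hard-codes the $Q$ found above (illustrative only):

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [1.0, 1.0],
              [1.0, 1.0]])

s = 1 / np.sqrt(2)
Q = np.array([[0.0, 1.0],
              [s,   0.0],
              [s,   0.0]])  # the Q built by Gram-Schmidt above

R = Q.T @ A                 # R = Q^T A
print(R)                    # [[sqrt(2), sqrt(2)], [0, 1]] -- upper triangular
```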
So go back to the equation from the beginning, $A\mathbf{x} = b$. Since $A = QR$, this is $QR\mathbf{x} = b$.
- Multiply both sides by $Q^T$: since $Q^TQ = I$, this leaves $R\mathbf{x} = Q^T b$
- Multiply by $R^{-1}$: $\mathbf{x} = R^{-1}Q^T b$

Solve for $\mathbf{x}$:

Next, get $R^{-1}$ (you can find the inverse of a matrix with RREF, Row Reduced Echelon Form):

$R^{-1} = \begin{bmatrix} \frac{1}{\sqrt{2}} & -1 \\ 0 & 1 \end{bmatrix}$

Compute $\mathbf{x} = R^{-1}Q^T b$:

$\mathbf{x} = \begin{bmatrix} \frac{1}{\sqrt{2}} & -1 \\ 0 & 1 \end{bmatrix} \begin{bmatrix} 0 & \frac{1}{\sqrt{2}} & \frac{1}{\sqrt{2}} \\ 1 & 0 & 0 \end{bmatrix} \begin{bmatrix} 0 \\ 0 \\ 2 \end{bmatrix} = \begin{bmatrix} \frac{1}{\sqrt{2}} & -1 \\ 0 & 1 \end{bmatrix} \begin{bmatrix} \sqrt{2} \\ 0 \end{bmatrix} = \begin{bmatrix} 1 \\ 0 \end{bmatrix}$

So the slope is $m = 1$ and the intercept is $b = 0$.
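The last step in code: since $R$ is upper triangular, solving $R\mathbf{x} = Q^T b$ is easy (NumPy's solver is used here for brevity; this is an illustrative sketch):

```python
import numpy as np

s = 1 / np.sqrt(2)
Q = np.array([[0.0, 1.0],
              [s,   0.0],
              [s,   0.0]])
R = np.array([[np.sqrt(2), np.sqrt(2)],
              [0.0,        1.0]])
b = np.array([0.0, 0.0, 2.0])

rhs = Q.T @ b                # Q^T b = [sqrt(2), 0]
x = np.linalg.solve(R, rhs)  # same as x = R^(-1) Q^T b
slope, intercept = x
print(slope, intercept)      # approximately 1.0 and 0.0
```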
The line of best fit is:

$y = (1)x + (0)$

The final answer is $y = x$.
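As a sanity check (not part of the app), the same line comes out of NumPy's built-in least-squares fit:

```python
import numpy as np

xs = np.array([0.0, 1.0, 1.0])
ys = np.array([0.0, 0.0, 2.0])

slope, intercept = np.polyfit(xs, ys, 1)  # degree-1 least-squares fit
print(slope, intercept)                   # approximately 1.0 and 0.0 -> y = x
```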
