Understand which is the best method to use to compute an orthogonal projection in a given situation.
Recipes: an orthonormal set from an orthogonal set, Projection Formula, $\mathcal{B}$-coordinates when $\mathcal{B}$ is an orthogonal set, Gram–Schmidt process.
Vocabulary words: orthogonal set, orthonormal set.
In this section, we give a formula for orthogonal projection that is considerably simpler than the one in Section 6.3, in that it does not require row reduction or matrix inversion. However, this formula, called the Projection Formula, only works in the presence of an orthogonal basis. We will also present the Gram–Schmidt process for turning an arbitrary basis into an orthogonal one.
Subsection 6.4.1 Orthogonal Sets and the Projection Formula
Computations involving projections tend to be much easier in the presence of an orthogonal set of vectors.
Definition
A set of nonzero vectors $\{u_1, u_2, \ldots, u_m\}$ is called orthogonal if $u_i \cdot u_j = 0$ whenever $i \neq j$. It is orthonormal if it is orthogonal, and in addition $u_i \cdot u_i = 1$ for all $i = 1, 2, \ldots, m$.
In other words, a set of vectors is orthogonal if different vectors in the set are perpendicular to each other. An orthonormal set is an orthogonal set of unit vectors.
Example
The standard coordinate vectors $e_1, e_2, \ldots, e_n$ in $\mathbb{R}^n$ always form an orthonormal set. For instance, in $\mathbb{R}^3$ we check that
$$e_1 \cdot e_2 = e_1 \cdot e_3 = e_2 \cdot e_3 = 0 \qquad\text{and}\qquad e_1 \cdot e_1 = e_2 \cdot e_2 = e_3 \cdot e_3 = 1.$$
It is easy to produce an orthonormal set of vectors from an orthogonal one: simply replace each vector with the unit vector pointing in the same direction.
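As a quick numerical sketch of this normalization step (the orthogonal set `u1, u2` below is a made-up example, not one from the text):

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    """Replace v with the unit vector in the same direction: v / ||v||."""
    length = math.sqrt(dot(v, v))
    return [c / length for c in v]

# A hypothetical orthogonal set in R^3 (check: u1 . u2 = 1 - 1 + 0 = 0):
u1 = [1.0, 1.0, 0.0]
u2 = [1.0, -1.0, 0.0]

e1, e2 = normalize(u1), normalize(u2)
print(abs(dot(e1, e1) - 1.0) < 1e-12)  # True: e1 now has unit length
print(abs(dot(e1, e2)) < 1e-12)        # True: the set is still orthogonal
```

Normalizing changes only the lengths of the vectors, not their directions, so orthogonality is preserved and the result is an orthonormal set.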
Suppose that $\{u_1, u_2, \ldots, u_m\}$ is an orthogonal basis for a subspace $W$, and let $L_i = \operatorname{Span}\{u_i\}$ for each $i = 1, 2, \ldots, m$. Then we see that for any vector $x$, we have
$$x_W = x_{L_1} + x_{L_2} + \cdots + x_{L_m} = \frac{x\cdot u_1}{u_1\cdot u_1}\,u_1 + \frac{x\cdot u_2}{u_2\cdot u_2}\,u_2 + \cdots + \frac{x\cdot u_m}{u_m\cdot u_m}\,u_m.$$
In other words, for an orthogonal basis, the projection of $x$ onto $W$ is the sum of the projections onto the lines spanned by the basis vectors. In this sense, projection onto a line is the most important example of an orthogonal projection.
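A minimal sketch of the Projection Formula in Python, assuming an orthogonal basis is given (the basis and the vector below are illustrative choices, not from the text):

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def project_onto_line(x, u):
    """x_L for the line L = Span{u}: (x.u / u.u) u."""
    c = dot(x, u) / dot(u, u)
    return [c * coord for coord in u]

def project_onto_subspace(x, basis):
    """Projection Formula: x_W is the sum of the projections onto the
    lines spanned by the basis vectors. Valid only when `basis` is an
    *orthogonal* basis of the subspace."""
    p = [0.0] * len(x)
    for u in basis:
        p = [pi + qi for pi, qi in zip(p, project_onto_line(x, u))]
    return p

# Orthogonal basis of the xy-plane in R^3 (illustrative):
u1, u2 = [1.0, 1.0, 0.0], [1.0, -1.0, 0.0]
x = [3.0, 4.0, 5.0]
print(project_onto_subspace(x, [u1, u2]))  # [3.0, 4.0, 0.0]
```

Note that no linear system is solved anywhere: each term of the sum needs only two dot products, which is exactly why the formula is simpler than the method of Section 6.3.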
Subsection 6.4.2 The Gram–Schmidt Process
We saw in the previous subsection that orthogonal projections and $\mathcal{B}$-coordinates are much easier to compute in the presence of an orthogonal basis for a subspace. In this subsection, we give a method, called the Gram–Schmidt Process, for computing an orthogonal basis of a subspace.
Theorem (The Gram–Schmidt Process)
Let $\{v_1, v_2, \ldots, v_m\}$ be a basis for a subspace $W$ of $\mathbb{R}^n$. Define:
$$u_1 = v_1$$
$$u_2 = v_2 - \frac{v_2\cdot u_1}{u_1\cdot u_1}\,u_1$$
$$u_3 = v_3 - \frac{v_3\cdot u_1}{u_1\cdot u_1}\,u_1 - \frac{v_3\cdot u_2}{u_2\cdot u_2}\,u_2$$
$$\vdots$$
$$u_m = v_m - \frac{v_m\cdot u_1}{u_1\cdot u_1}\,u_1 - \frac{v_m\cdot u_2}{u_2\cdot u_2}\,u_2 - \cdots - \frac{v_m\cdot u_{m-1}}{u_{m-1}\cdot u_{m-1}}\,u_{m-1}.$$
Then $\{u_1, u_2, \ldots, u_m\}$ is an orthogonal basis for $W$, and for every $i = 1, 2, \ldots, m$, we have $\operatorname{Span}\{u_1, u_2, \ldots, u_i\} = \operatorname{Span}\{v_1, v_2, \ldots, v_i\}$.
Proof
First we claim that each $u_i$ is in $W$, and in fact that $u_i$ is in $\operatorname{Span}\{v_1, v_2, \ldots, v_i\}$. Clearly $u_1 = v_1$ is in $\operatorname{Span}\{v_1\}$. Then $u_2$ is a linear combination of $v_2$ and $u_1$, which are both in $\operatorname{Span}\{v_1, v_2\}$, so $u_2$ is in $\operatorname{Span}\{v_1, v_2\}$ as well. Similarly, $u_3$ is a linear combination of $v_3$, $u_1$, and $u_2$, which are all in $\operatorname{Span}\{v_1, v_2, v_3\}$, so $u_3$ is in $\operatorname{Span}\{v_1, v_2, v_3\}$. Continuing in this way, we see that each $u_i$ is in $\operatorname{Span}\{v_1, v_2, \ldots, v_i\}$.
Now we claim that $\{u_1, u_2, \ldots, u_m\}$ is an orthogonal set. Let $1 \leq i < j \leq m$. Then $u_j = v_j - (v_j)_{W_{j-1}}$, where $W_{j-1} = \operatorname{Span}\{u_1, u_2, \ldots, u_{j-1}\}$, so by definition $u_j$ is orthogonal to every vector in $W_{j-1}$. In particular, $u_j$ is orthogonal to $u_i$.
We still have to prove that each $u_i$ is nonzero. Clearly $u_1 = v_1 \neq 0$. Suppose that $u_i = 0$. Then $v_i = (v_i)_{W_{i-1}}$, which means that $v_i$ is in $W_{i-1} = \operatorname{Span}\{u_1, u_2, \ldots, u_{i-1}\}$. But each $u_j$ for $j < i$ is in $\operatorname{Span}\{v_1, v_2, \ldots, v_{i-1}\}$ by the first paragraph, so $v_i$ is in $\operatorname{Span}\{v_1, v_2, \ldots, v_{i-1}\}$. This contradicts the increasing span criterion in Section 2.5; therefore, $u_i$ must be nonzero.
The previous two paragraphs justify the use of the projection formula in the equalities
$$(v_i)_{W_{i-1}} = \frac{v_i\cdot u_1}{u_1\cdot u_1}\,u_1 + \frac{v_i\cdot u_2}{u_2\cdot u_2}\,u_2 + \cdots + \frac{v_i\cdot u_{i-1}}{u_{i-1}\cdot u_{i-1}}\,u_{i-1}$$
in the statement of the theorem.
Since $\{u_1, u_2, \ldots, u_m\}$ is an orthogonal set, it is linearly independent. Thus it is a set of $m$ linearly independent vectors in $W$, so it is a basis for $W$ by the basis theorem in Section 2.7. Similarly, for every $i$, we saw that the set $\{u_1, u_2, \ldots, u_i\}$ is contained in the $i$-dimensional subspace $\operatorname{Span}\{v_1, v_2, \ldots, v_i\}$, so $\{u_1, u_2, \ldots, u_i\}$ is an orthogonal basis for $\operatorname{Span}\{v_1, v_2, \ldots, v_i\}$.
You can use the Gram–Schmidt process to produce an orthogonal basis from any spanning set: if some $u_i = 0$, just throw away $u_i$ and $v_i$, and continue.
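The process, including the throw-away rule for spanning sets, can be sketched in Python as follows (the spanning set at the bottom is an illustrative example; the tolerance `tol` is a numerical stand-in for the exact test $u_i = 0$):

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def gram_schmidt(vectors, tol=1e-12):
    """Gram-Schmidt: u_i = v_i minus its projection onto Span{u_1, ..., u_{i-1}}.
    Per the remark above, a v_i whose u_i comes out (numerically) zero is
    dependent on the earlier vectors and is thrown away, so this also
    accepts an arbitrary spanning set."""
    basis = []
    for v in vectors:
        u = list(v)
        for b in basis:
            c = dot(v, b) / dot(b, b)   # coefficient of the projection of v onto Span{b}
            u = [ui - c * bi for ui, bi in zip(u, b)]
        if dot(u, u) > tol:             # keep u only if it is (numerically) nonzero
            basis.append(u)
    return basis

# Illustrative spanning set of a plane in R^3; v3 = v1 + v2 is redundant.
v1, v2, v3 = [1.0, 1.0, 1.0], [1.0, 1.0, 0.0], [2.0, 2.0, 1.0]
B = gram_schmidt([v1, v2, v3])
print(len(B))                       # 2 -- the dependent vector was discarded
print(abs(dot(B[0], B[1])) < 1e-9)  # True -- the output vectors are orthogonal
```

In exact arithmetic the dependent vector produces $u_3 = 0$ exactly; in floating point it produces a vector of negligible length, which is why the comparison against a small tolerance stands in for the zero test.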
Subsection 6.4.3 Two Methods to Compute the Projection
We have now presented two methods for computing the orthogonal projection of a vector: the theorem in Section 6.3, which involves row reduction, and the Projection Formula, which requires an orthogonal basis. Here are some guidelines for which to use in a given situation.
If you already have an orthogonal basis, it is almost always easier to use the projection formula. This often happens in the sciences.
If you are going to have to compute the projections of many vectors onto the same subspace, it is worth your time to run Gram–Schmidt to produce an orthogonal basis, so that you can use the projection formula.
If you only have to project one or a few vectors onto a subspace, it is faster to use the theorem in Section 6.3. This is the method we will follow in Section 6.5.
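For comparison, the Section 6.3 approach amounts to solving $A^T A c = A^T x$, where the columns of $A$ are the (not necessarily orthogonal) basis vectors, and then forming $x_W = Ac$. A sketch for a two-vector basis, using Cramer's rule on the resulting $2\times 2$ system in place of row reduction (the vectors are illustrative):

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def project_via_normal_equations(x, v1, v2):
    """Section 6.3 style: x_W = A c, where A = [v1 v2] and c solves
    A^T A c = A^T x. The basis need NOT be orthogonal. For this 2x2
    system, Cramer's rule stands in for row reduction."""
    a, b, d = dot(v1, v1), dot(v1, v2), dot(v2, v2)   # entries of A^T A
    p, q = dot(v1, x), dot(v2, x)                     # entries of A^T x
    det = a * d - b * b
    c1 = (p * d - b * q) / det
    c2 = (a * q - b * p) / det
    return [c1 * s + c2 * t for s, t in zip(v1, v2)]  # x_W = c1*v1 + c2*v2

# Non-orthogonal basis of the xy-plane in R^3 (illustrative):
v1, v2 = [1.0, 1.0, 0.0], [1.0, 0.0, 0.0]
x = [3.0, 4.0, 5.0]
print(project_via_normal_equations(x, v1, v2))  # [3.0, 4.0, 0.0]
```

The trade-off is visible here: this method solves a linear system for every vector projected, while the Projection Formula pays for Gram–Schmidt once and then projects each vector with dot products alone.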