Consider an orthogonal family of non-zero vectors F = {u1, . . . , um}, m ≤ n, ui ≠ 0V ∀i = 1, . . . , m.
The vector subspace of V generated by all linear combinations of the vectors of F shall be written Span(F):

Span(F) = {α1u1 + · · · + αmum : α1, . . . , αm scalars}.

We set S = Span(F).
The orthogonal projection operator or orthogonal projector of a vector v ∈ V onto S is defined as the following application, which is obviously linear:

PS : V → S, v ↦ PS(v) = (〈v, u1〉/‖u1‖²) u1 + · · · + (〈v, um〉/‖um‖²) um.
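To make the definition concrete, here is a minimal numerical sketch in Python with NumPy, restricted to ℝⁿ for simplicity; the function name proj_span and the example vectors are illustrative assumptions, not notation from the book. It computes PS(v) as the sum of the components (〈v, ui〉/‖ui‖²) ui over an orthogonal family.

```python
import numpy as np

def proj_span(v, orthogonal_family):
    """Orthogonal projection of v onto Span(u_1, ..., u_m).

    orthogonal_family: mutually orthogonal, non-zero vectors u_i (1-D arrays).
    Each term is <v, u_i> / ||u_i||^2 * u_i, as in the formula above.
    """
    return sum((np.dot(v, u) / np.dot(u, u)) * u for u in orthogonal_family)

# Example: project v in R^3 onto the plane spanned by two orthogonal vectors.
u1 = np.array([1.0, 1.0, 0.0])
u2 = np.array([1.0, -1.0, 0.0])      # <u1, u2> = 0
v = np.array([2.0, 3.0, 5.0])
p = proj_span(v, [u1, u2])           # -> array([2., 3., 0.])
```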
Theorem 1.12 shows that the orthogonal projection defined above retains all of the properties of the orthogonal projection demonstrated for ℝ².
THEOREM 1.12.– Using the same notation as before, we have:
1) if s ∈ S then PS(s) = s, i.e. the action of PS on the vectors in S is the identity;
2) ∀v ∈ V and s ∈ S, the residual vector of the projection, i.e. v − PS(v), is ⊥ to S:

〈v − PS(v), s〉 = 0;
3) ∀v ∈ V and s ∈ S: ‖v − PS(v)‖ ≤ ‖v − s‖ and the equality holds if and only if s = PS(v). We write:

PS(v) = arg min_{s ∈ S} ‖v − s‖.
PROOF.–
1) Let s ∈ S, i.e. s = α1u1 + · · · + αmum for suitable scalars α1, . . . , αm. Since 〈s, ui〉 = αi‖ui‖² for every i, it follows that PS(s) = (〈s, u1〉/‖u1‖²) u1 + · · · + (〈s, um〉/‖um‖²) um = α1u1 + · · · + αmum = s.
2) Consider the inner product of PS(v) and a fixed vector uj, j ∈ {1, . . . , m}:

〈PS(v), uj〉 = (〈v, u1〉/‖u1‖²)〈u1, uj〉 + · · · + (〈v, um〉/‖um‖²)〈um, uj〉 = (〈v, uj〉/‖uj‖²)‖uj‖² = 〈v, uj〉,

hence:

〈v − PS(v), uj〉 = 〈v, uj〉 − 〈PS(v), uj〉 = 0 ∀j = 1, . . . , m.

Lemma 1.1 then guarantees that v − PS(v), being orthogonal to every vector of the family F, is orthogonal to all of S = Span(F), i.e. v − PS(v) ⊥ S.
3) It is helpful to rewrite the difference v − s as v − PS(v) + PS(v) − s. From property 2, v − PS(v) ⊥ S, while PS(v), s ∈ S, so PS(v) − s ∈ S. Hence (v − PS(v)) ⊥ (PS(v) − s). The generalized Pythagorean theorem implies that:

‖v − s‖² = ‖v − PS(v)‖² + ‖PS(v) − s‖² ≥ ‖v − PS(v)‖²,

hence ‖v − s‖ ≥ ‖v − PS(v)‖ ∀v ∈ V, s ∈ S.
Evidently, ‖PS(v) − s‖² = 0 if and only if s = PS(v), and in this case ‖v − s‖² = ‖v − PS(v)‖².□
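As a quick numerical sanity check (not part of the book's argument), the three properties of Theorem 1.12 can be verified on a small example in Python/NumPy; the orthogonal family, the helper name proj and the tolerances below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# An orthogonal family of non-zero vectors in R^5 (illustrative choice).
u1 = np.array([2.0, 0.0, 0.0, 0.0, 0.0])
u2 = np.array([0.0, 1.0, 1.0, 0.0, 0.0])          # <u1, u2> = 0
family = [u1, u2]

def proj(v):
    # P_S(v) = sum_i <v, u_i> / ||u_i||^2 * u_i for the orthogonal family above.
    return sum((v @ u) / (u @ u) * u for u in family)

v = rng.standard_normal(5)

# 1) P_S acts as the identity on S.
s = 0.7 * u1 - 1.3 * u2
assert np.allclose(proj(s), s)

# 2) The residual v - P_S(v) is orthogonal to S (i.e. to every u_i).
r = v - proj(v)
assert all(abs(r @ u) < 1e-10 for u in family)

# 3) P_S(v) is the element of S closest to v.
for _ in range(100):
    a, b = rng.standard_normal(2)
    s = a * u1 + b * u2
    assert np.linalg.norm(v - proj(v)) <= np.linalg.norm(v - s) + 1e-10
```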
The theorem demonstrated above tells us that the vector in the vector subspace S ⊆ V which is the most “similar” to v ∈ V (in the sense of the norm induced by the inner product) is given by the orthogonal projection. The generalization of this result to infinite-dimensional Hilbert spaces will be discussed in Chapter 5.
As already seen for the projection operator in ℝ² and ℝ³, the non-negative scalar quantity |〈v, ui〉|/‖ui‖ measures the magnitude of the component of v in the direction of ui, i.e. how much the direction ui contributes to the reconstruction of v.
This observation is crucial to understanding the significance of the Fourier decomposition, which will be examined in both discrete and continuous contexts in the following chapters.
Finally, note that the seemingly trivial equation v = (v − s) + s is, in fact, far more meaningful than it first appears when s = PS(v) ∈ S: in this case, we know that v − s and s are orthogonal.
The decomposition of a vector as the sum of a component belonging to a subspace S and a component belonging to its orthogonal complement is known as the orthogonal projection theorem.
This decomposition is unique, and its generalization to infinite dimensions, alongside its consequences for the geometric structure of Hilbert spaces, will be examined in detail in Chapter 5.
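The following short Python/NumPy sketch illustrates this decomposition numerically for a two-dimensional subspace of ℝ⁴; the orthonormal vectors chosen here are illustrative assumptions. The vector v splits as PS(v) plus an orthogonal residual, and the generalized Pythagorean theorem relates the three norms.

```python
import numpy as np

rng = np.random.default_rng(1)

# An orthonormal family spanning a plane S in R^4 (illustrative choice).
u1 = np.array([1.0, 1.0, 0.0, 0.0]) / np.sqrt(2)
u2 = np.array([0.0, 0.0, 1.0, 0.0])

v = rng.standard_normal(4)

# Orthogonal decomposition v = p + r, with p = P_S(v) in S and r orthogonal to S.
p = (v @ u1) * u1 + (v @ u2) * u2      # the projector simplifies for unit vectors
r = v - p

assert np.allclose(p + r, v)           # the decomposition itself
assert abs(p @ r) < 1e-10              # the two components are orthogonal
# Generalized Pythagorean theorem: ||v||^2 = ||P_S(v)||^2 + ||v - P_S(v)||^2.
assert np.isclose(v @ v, p @ p + r @ r)
```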
1.7. Existence of an orthonormal basis: the Gram-Schmidt process
As we have seen, projection and decomposition laws are much simpler when an orthonormal basis is available.
Theorem 1.13 states that in a finite-dimensional inner product space, an orthonormal basis can always be constructed from a free family of generators.
THEOREM 1.13.– (The iterative Gram-Schmidt process) If (v1, . . . , vn), n ≤ +∞, is a basis of (V, 〈, 〉), then an orthonormal basis of (V, 〈, 〉) can be obtained from (v1, . . . , vn).
PROOF.– This proof is constructive in that it provides the method used to construct an orthonormal basis from any arbitrary basis.
– Step 1: normalization of v1:

u1 = v1/‖v1‖.
– Step 2, illustrated in Figure 1.5: v2 is projected in the direction of u1, that is, we consider 〈v2, u1〉u1. We know from Theorem 1.12 that the vector difference v2 − 〈v2, u1〉u1 is orthogonal to u1. The result is then normalized:

u2 = (v2 − 〈v2, u1〉u1) / ‖v2 − 〈v2, u1〉u1‖.

(A sketch of the full iterative process is given below.)
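The remaining steps iterate the same construction: each vk is stripped of its projections onto the directions u1, . . . , uk−1 already built, and the residual is normalized. The following Python/NumPy sketch implements this; the function name gram_schmidt and the ℝ³ example basis are illustrative assumptions, not taken from the book.

```python
import numpy as np

def gram_schmidt(basis):
    """Iterative Gram-Schmidt process on a basis (v_1, ..., v_n) of R^d.

    Returns an orthonormal family (u_1, ..., u_n) with the same span.
    """
    ortho = []
    for v in basis:
        # Remove the components of v along the directions already constructed
        # (the projection step), ...
        w = v - sum((v @ u) * u for u in ortho)
        norm = np.linalg.norm(w)
        if norm < 1e-12:
            raise ValueError("the input vectors are linearly dependent")
        # ... then normalize the residual (the normalization step).
        ortho.append(w / norm)
    return ortho

# Example: an orthonormal basis of R^3 built from a non-orthogonal basis.
vs = [np.array([1.0, 1.0, 0.0]),
      np.array([1.0, 0.0, 1.0]),
      np.array([0.0, 1.0, 1.0])]
us = gram_schmidt(vs)
gram = np.array([[a @ b for b in us] for a in us])
assert np.allclose(gram, np.eye(3))    # <u_i, u_j> = delta_ij
```

The final assertion checks orthonormality by verifying that the matrix of inner products 〈ui, uj〉 is the identity.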