MATH 524 (University of California) — Operators on Complex Vector Spaces: Questions

User Generated — ongzna03 — Mathematics — MATH 524 — University of California Berkeley
Description

This question deals with the "Laguerre Polynomials," which are orthogonal with respect to the inner product

  ⟨f, g⟩ = ∫₀^∞ f(x) g(x) e^(−x) dx.

Use the fact that for all integers p, q ≥ 0,

  ⟨x^p, x^q⟩ = (p + q)!  ("p plus q factorial"),

to derive the first 4 (order 0, 1, 2, 3) orthonormal Laguerre Polynomials, starting from elements of the standard polynomial basis {1, x, x², x³}.
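Since ⟨x^p, x^q⟩ = (p + q)! determines the inner product of any two polynomials by bilinearity, the Gram–Schmidt computation can be checked mechanically in exact rational arithmetic. A minimal sketch (not part of the original question; each orthonormal polynomial is only determined up to a sign, and this run produces ±Lₙ for the classical Laguerre polynomials L₀ = 1, L₁ = 1 − x, L₂ = 1 − 2x + x²/2, L₃ = 1 − 3x + 3x²/2 − x³/6):

```python
from fractions import Fraction
from math import factorial, isqrt

def inner(p, q):
    # <x^i, x^j> = (i + j)!, extended bilinearly; a polynomial is a
    # coefficient list with p[i] = coefficient of x^i
    return sum(Fraction(a) * Fraction(b) * factorial(i + j)
               for i, a in enumerate(p) for j, b in enumerate(q))

def gram_schmidt(vectors):
    ortho = []
    for v in vectors:
        v = [Fraction(c) for c in v]
        for e in ortho:
            c = inner(v, e)  # e is already normalized
            e_pad = e + [Fraction(0)] * (len(v) - len(e))
            v = [a - c * b for a, b in zip(v, e_pad)]
        n2 = inner(v, v)     # a perfect square of a rational for this basis
        n = Fraction(isqrt(n2.numerator), isqrt(n2.denominator))
        ortho.append([c / n for c in v])
    return ortho

# standard polynomial basis 1, x, x^2, x^3
L = gram_schmidt([[1], [0, 1], [0, 0, 1], [0, 0, 0, 1]])
for p in L:
    print(p)
```

The orthonormality of the output can be confirmed by evaluating `inner` on every pair.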

Let T ∈ L(C⁷) be defined by

  T(z1, z2, z3, z4, z5, z6, z7) = (πz1 + z2 + z3 + z4, πz2 + z3 + z4, πz3 + z4, πz4, √7 z5 + z6 + z7, √7 z6 + z7, √7 z7).

Let Bs(C⁷) = {e1, e2, e3, e4, e5, e6, e7} be the standard basis of C⁷.

(a) (25 pts.) Find M(T, Bs(C⁷)).

(b) (25 pts.) Find the eigenvalues {λk}, k = 1, …, ?
(c) For each eigenvalue λk:
  i. (30 pts.) Find the eigenspace E(λk, T).
  ii. (30 pts.) Find the generalized eigenspace G(λk, T).
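One way to sanity-check answers here: M(T, Bs(C⁷)) is upper triangular, so the candidate eigenvalues sit on the diagonal (π with multiplicity 4, √7 with multiplicity 3), and on each diagonal block T − λI reduces to an integer matrix with 1's strictly above the diagonal. A sketch (illustration only, not a full solution) verifying that these blocks are nilpotent of the expected index, and reading off the eigenspace dimensions from ranks:

```python
from fractions import Fraction

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def mat_pow(A, k):
    R = [[int(i == j) for j in range(len(A))] for i in range(len(A))]
    for _ in range(k):
        R = matmul(R, A)
    return R

def rank(A):
    # Gaussian elimination over the rationals
    M = [[Fraction(x) for x in row] for row in A]
    r = 0
    for c in range(len(M[0])):
        piv = next((i for i in range(r, len(M)) if M[i][c] != 0), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        for i in range(r + 1, len(M)):
            f = M[i][c] / M[r][c]
            M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

# On the pi-block, T - pi*I acts as N1; on the sqrt(7)-block, T - sqrt(7)*I acts as N2.
N1 = [[0, 1, 1, 1], [0, 0, 1, 1], [0, 0, 0, 1], [0, 0, 0, 0]]
N2 = [[0, 1, 1], [0, 0, 1], [0, 0, 0]]

print(mat_pow(N1, 4))  # zero matrix: (T - pi I)^4 annihilates the pi-block
print(4 - rank(N1))    # dim E(pi, T) = 1 (within the block)
print(3 - rank(N2))    # dim E(sqrt(7), T) = 1 (within the block)
```

Since the other diagonal block of T − πI (resp. T − √7 I) is invertible, these block computations already give the eigenspace dimensions on all of C⁷.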

Unformatted Attachment Preview

Math 524: Linear Algebra — Notes #8: Operators on Complex Vector Spaces
Peter Blomgren ⟨blomgren.peter@gmail.com⟩
Department of Mathematics and Statistics, Dynamical Systems Group, Computational Sciences Research Center, San Diego State University, San Diego, CA 92182-7720 — http://terminus.sdsu.edu/
Spring 2021 (Revised: January 18, 2021)

Outline
1. Student Learning Targets, and Objectives (SLOs: Operators on Complex Vector Spaces)
2. Generalized Eigenvectors and Nilpotent Operators (Null Spaces of Powers of an Operator; Generalized Eigenvectors; Nilpotent Operators)
3. Decomposition of an Operator (Description of Operators on Complex Vector Spaces; Multiplicity of an Eigenvalue; Block Diagonal Matrices; Square Roots)
4. Characteristic and Minimal Polynomials (The Cayley–Hamilton Theorem; The Minimal Polynomial)
5. Jordan Form (Jordan "Normal" / "Canonical" Form)
6. Problems, Homework, and Supplements (Suggested Problems; Assigned Homework; Supplements)

Student Learning Targets, and Objectives (1 of 2)

Target: Generalized Eigenvectors and Nilpotent Operators
- Objective: Be able to identify generalized eigenspaces G(λ, T)
- Objective: Be able to identify a nilpotent operator, N, by the dimension of its generalized eigenspace G(0, N)
- Objective: Be able to construct a basis so that the matrix of a nilpotent operator is upper triangular with respect to that basis

Target: Decomposition of an Operator
- Objective: Be able to decompose all operators on complex vector spaces V into direct sums of invariant generalized eigenspaces
- Objective: Be able to identify a Block Diagonal Matrix
Student Learning Targets, and Objectives (2 of 2)

Target: Characteristic Polynomial and the Cayley–Hamilton Theorem
- Objective: Be able to state the properties of the Characteristic Polynomial and its relation to the eigenvalues of an operator
- Objective: Be able to state the properties of the Minimal Polynomial and its relation to the eigenvalues of an operator
- Objective: Be able to derive the Characteristic and Minimal Polynomials for an operator

Target: Jordan Form
- Objective: Be able to identify the Jordan chains, and use them to construct a Jordan basis for an operator
- Objective: Be able to identify the Jordan Normal Form for an operator

Introduction

We return to the issue of describing an operator in terms of its eigenspaces. In particular, we address the issue of non-diagonalizability.

Rewind (Sum of Eigenspaces is a Direct Sum [Notes#5]): Suppose V is finite-dimensional and T ∈ L(V). Suppose also that λ1, …, λm are distinct eigenvalues of T. Then E(λ1, T) + ⋯ + E(λm, T) is a direct sum. Furthermore,
  dim(E(λ1, T)) + ⋯ + dim(E(λm, T)) ≤ dim(V).

Rewind (Conditions Equivalent to Diagonalizability [Notes#5]): Suppose V is finite-dimensional and T ∈ L(V). Let λ1, …, λm denote the distinct eigenvalues of T. Then the following are equivalent:
(a) T is diagonalizable.
(b) V has a basis consisting of eigenvectors of T.
(c) ∃ 1-D subspaces U1, …, Un of V, each invariant under T, such that V = U1 ⊕ ⋯ ⊕ Un. (There may be more than one U per eigenvalue!)
(d) V = E(λ1, T) ⊕ ⋯ ⊕ E(λm, T).
(e) dim(V) = dim(E(λ1, T)) + ⋯ + dim(E(λm, T)).

null(T^k), for T ∈ L(V) — "Building the Toolbox"

We (temporarily) "discard" our inner products, and return to the simplicity of vector spaces. We look at the behavior of powers of operators, T^k; first we look at the associated null spaces — (generalized) eigenspaces.

Theorem (Sequence of Increasing Null Spaces): Suppose T ∈ L(V). Then
  {0} = null(T^0) ⊂ null(T^1) ⊂ ⋯ ⊂ null(T^k) ⊂ null(T^(k+1)) ⊂ ⋯

Proof (Sequence of Increasing Null Spaces): Suppose T ∈ L(V), let k ≥ 0 and v ∈ null(T^k). Then T^k(v) = 0, and T^(k+1)(v) = T(T^k(v)) = T(0) = 0, so that v ∈ null(T^(k+1)); thus null(T^k) ⊂ null(T^(k+1)). ∎

Theorem (Equality in the Sequence of Null Spaces): Suppose T ∈ L(V), and let m ≥ 0 be such that null(T^m) = null(T^(m+1)). Then
  null(T^m) = null(T^(m+1)) = null(T^(m+2)) = ⋯

Proof (Equality in the Sequence of Null Spaces): Let m, k ≥ 0. From the previous result we already have null(T^(k+m)) ⊂ null(T^(k+m+1)); to show equality we need to show null(T^(k+m+1)) ⊂ null(T^(k+m)). Let v ∈ null(T^(k+m+1)); then
  T^(m+1)(T^k(v)) = T^(k+m+1)(v) = 0,
thus T^k(v) ∈ null(T^(m+1)) = null(T^m) ⇒ T^(m+k)(v) = T^m(T^k(v)) = 0 ⇒ v ∈ null(T^(k+m)). ∎
Theorem (Null Spaces Stop Growing): Suppose T ∈ L(V), and let n = dim(V). Then
  null(T^n) = null(T^(n+1)) = ⋯

Proof (Null Spaces Stop Growing): By contradiction: if the theorem is false, then
  {0} = null(T^0) ⊊ null(T^1) ⊊ ⋯ ⊊ null(T^n) ⊊ null(T^(n+1)).
The strict inclusions mean
  0 = dim(null(T^0)) < dim(null(T^1)) < ⋯ < dim(null(T^n)) < dim(null(T^(n+1))),
so that dim(null(T^(n+1))) ≥ n + 1. But dim(V) = n — a contradiction. ∎

Direct Sums of null and range

It is generally true that V ≠ null(T) ⊕ range(T); e.g. recall examples where null(T) = range(T).

Theorem (V = null(T^n) ⊕ range(T^n); n = dim(V)): Suppose T ∈ L(V), n = dim(V). Then
  V = null(T^n) ⊕ range(T^n).

Proof, part (1): We show null(T^n) ∩ range(T^n) = {0}. Let v ∈ null(T^n) ∩ range(T^n); then T^n(v) = 0, and ∃ u ∈ V: v = T^n(u). Then 0 = T^n(v) = T^(2n)(u); using the previous result, null(T^n) = null(T^(2n)), we must have T^n(u) = 0, hence v = 0.
Proof, part (2): Since null(T^n) ∩ range(T^n) = {0}, by [Direct Sum of Two Subspaces (Notes#1)] null(T^n) + range(T^n) is a direct sum; and
  dim(null(T^n) ⊕ range(T^n)) = dim(null(T^n)) + dim(range(T^n))   [A Sum is a Direct Sum if and only if Dimensions Add Up (Notes#3.2)]
                              = dim(V)   [Fundamental Theorem of Linear Maps (Notes#3.1)]
Therefore V = null(T^n) ⊕ range(T^n). ∎

Example: Consider T ∈ L(F³) defined by T(z1, z2, z3) = (4z2, 0, 5z3):
  range(T) = {(w1, 0, w2) : w1, w2 ∈ F},  null(T) = {(w, 0, 0) : w ∈ F}
  T²(z1, z2, z3) = (0, 0, 25z3),  T³(z1, z2, z3) = (0, 0, 125z3)
  range(T^{2,3}) = {(0, 0, w) : w ∈ F},  null(T^{2,3}) = {(w1, w2, 0) : w1, w2 ∈ F}
Clearly F³ = range(T^{2,3}) ⊕ null(T^{2,3}) — the theorem guarantees the result for n = 3, but here it happens sooner (n = 2).

Generalized Eigenvectors

As we have seen, some operators do not have enough eigenvectors to lead to a good description (diagonalization).
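The grow-then-stop behavior of null(T^k) in the example above can be checked directly; a sketch computing dim null(T^k) = 3 − rank(T^k) for T(z1, z2, z3) = (4z2, 0, 5z3):

```python
from fractions import Fraction

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def rank(A):
    # Gaussian elimination over the rationals
    M = [[Fraction(x) for x in row] for row in A]
    r = 0
    for c in range(len(M[0])):
        piv = next((i for i in range(r, len(M)) if M[i][c] != 0), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        for i in range(r + 1, len(M)):
            f = M[i][c] / M[r][c]
            M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

T = [[0, 4, 0],   # T(z1, z2, z3) = (4 z2, 0, 5 z3)
     [0, 0, 0],
     [0, 0, 5]]

P, dims = T, []
for k in (1, 2, 3):
    dims.append(3 - rank(P))   # dim null(T^k)
    P = matmul(P, T)
print(dims)  # [1, 2, 2] -- growth stops at k = 2, consistent with the theorem
```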
We now introduce a remedy — generalized eigenvectors, which will aid in the description of the structure of operators.

For normal (C: T*T = TT*) and self-adjoint (R: T = T*) operators we are guaranteed eigenspace decompositions
  V = E(λ1, T) ⊕ ⋯ ⊕ E(λm, T)
thanks to the [C/R Spectral Theorems (Notes#7.1)]. [Schur's Theorem (Notes#6)] allows for an upper triangular matrix M(T) for every operator, but does not give a direct-sum decomposition of the space.

Generalized Eigenvectors and Eigenspaces

Definition (Generalized Eigenvector): Suppose T ∈ L(V) and λ is an eigenvalue of T. A vector v ∈ V is called a generalized eigenvector of T corresponding to λ if v ≠ 0 and
  (T − λI)^k(v) = 0 for some k ≥ 0.

Definition (Generalized Eigenspace, G(λ, T)): Suppose T ∈ L(V) and λ is an eigenvalue of T. The generalized eigenspace of T corresponding to λ, denoted G(λ, T), is defined to be the set of all generalized eigenvectors of T corresponding to λ, along with the 0 vector.

Since we get the standard eigenspace when k = 1, it is always true that E(λ, T) ⊂ G(λ, T); that is, "eigenvectors are also generalized eigenvectors." The next result answers the question "what value of k should we pick?"

Theorem (Description of Generalized Eigenspace): Suppose T ∈ L(V) and λ ∈ F is an eigenvalue of T.
Then G(λ, T) = null((T − λI)^dim(V)).

Proof (Description of Generalized Eigenspace):
(1) Suppose v ∈ null((T − λI)^dim(V)); then by definition v ∈ G(λ, T), so null((T − λI)^dim(V)) ⊂ G(λ, T).
(2) Suppose v ∈ G(λ, T); then ∃ k ≥ 0: v ∈ null((T − λI)^k). Applying [Sequence of Increasing Null Spaces] and [Null Spaces Stop Growing] to (T − λI) shows v ∈ null((T − λI)^dim(V)), so that G(λ, T) ⊂ null((T − λI)^dim(V)). ∎

Example: We revisit the T ∈ L(F³) defined by T(z1, z2, z3) = (4z2, 0, 5z3):
  null(T) = {(w, 0, 0) : w ∈ F} = E(0, T)
  null(T³) = {(w1, w2, 0) : w1, w2 ∈ F} = G(0, T)
Since dim(null(T)) > 0, λ = 0 is an eigenvalue; the other eigenvalue is λ = 5:
  E(0, T) = {(w, 0, 0) : w ∈ F},  E(5, T) = {(0, 0, w) : w ∈ F}
and (T − 5I)(z1, z2, z3) = (4z2 − 5z1, −5z2, 0), so
  (T − 5I)²(z1, z2, z3) = (25z1 − 40z2, 25z2, 0)
  (T − 5I)³(z1, z2, z3) = (300z2 − 125z1, −125z2, 0)
  null((T − 5I)³) = {(0, 0, w) : w ∈ F}
  G(0, T) = {(w1, w2, 0) : w1, w2 ∈ F},  G(5, T) = {(0, 0, w) : w ∈ F}
  F³ = G(0, T) ⊕ G(5, T)
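The computation of G(5, T) above can be double-checked in integer arithmetic; a sketch forming (T − 5I)³ and reading off dim null((T − 5I)³) = 3 − rank:

```python
from fractions import Fraction

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def rank(A):
    # Gaussian elimination over the rationals
    M = [[Fraction(x) for x in row] for row in A]
    r = 0
    for c in range(len(M[0])):
        piv = next((i for i in range(r, len(M)) if M[i][c] != 0), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        for i in range(r + 1, len(M)):
            f = M[i][c] / M[r][c]
            M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

A = [[-5, 4, 0],   # T - 5I, for T(z1, z2, z3) = (4 z2, 0, 5 z3)
     [0, -5, 0],
     [0,  0, 0]]
A3 = matmul(matmul(A, A), A)
print(A3)            # [[-125, 300, 0], [0, -125, 0], [0, 0, 0]]
print(3 - rank(A3))  # dim null((T - 5I)^3) = dim G(5, T) = 1
```

The entries of A3 match the slide's formula (T − 5I)³(z1, z2, z3) = (300z2 − 125z1, −125z2, 0).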
Linearly Independent Generalized Eigenvectors

Rewind (Linearly Independent Eigenvectors [Notes#5]): Let T ∈ L(V). Suppose λ1, …, λm are distinct eigenvalues of T, and v1, …, vm are the corresponding eigenvectors; then v1, …, vm is linearly independent.

Theorem (Linearly Independent Generalized Eigenvectors): Let T ∈ L(V). Suppose λ1, …, λm are distinct eigenvalues of T, and v1, …, vm are the corresponding generalized eigenvectors; then v1, …, vm is linearly independent.

Proof (Linearly Independent Generalized Eigenvectors): Suppose a1, …, am ∈ C and vi ∈ G(λi, T) are such that
  0 = a1 v1 + ⋯ + am vm.   (i)
Let k be the largest non-negative integer such that (T − λ1 I)^k v1 ≠ 0, and let w = (T − λ1 I)^k v1:
  (T − λ1 I) w = (T − λ1 I)^(k+1) v1 = 0,
so T(w) = λ1 w. Now, (T − λI) w = (λ1 − λ) w ∀ λ ∈ F, and
  (T − λI)^n w = (λ1 − λ)^n w   ∀ λ ∈ F, n = dim(V).   (ii)
We apply the operator (note: the factors commute)
  (T − λ1 I)^k (T − λ2 I)^n ⋯ (T − λm I)^n
to (i).
Proof (Linearly Independent Generalized Eigenvectors), continued: We get
  0 = a1 (T − λ1 I)^k (T − λ2 I)^n ⋯ (T − λm I)^n v1   [Description of Generalized Eigenspaces]
    = a1 (T − λ2 I)^n ⋯ (T − λm I)^n w                 [w = (T − λ1 I)^k v1]
    = a1 (λ1 − λ2)^n ⋯ (λ1 − λm)^n w                   [(ii)]
This forces a1 = 0. We can now repeat the argument and show that a2 = a3 = ⋯ = am = 0, which shows that v1, …, vm is linearly independent. ∎

Nilpotent Operators

Definition (Nilpotent): An operator N ∈ L(V) is called nilpotent if N^k = 0 for some k.

Example (Some Nilpotent Operators):
- Operators with null(N) = range(N), e.g. N ∈ L(F⁴) with
    N(z1, z2, z3, z4) = (z3, z4, 0, 0),  N²(z1, z2, z3, z4) = (0, 0, 0, 0).
- Shift operators, e.g.
    N(z1, …, zn) = (0, z1, …, z(n−1)),  N²(z1, …, zn) = (0, 0, z1, …, z(n−2)),  N^n(z1, …, zn) = (0, 0, …, 0).
- D ∈ L(P_m(F)) defined by D(p) = p′, since D^(m+1)(p) = 0.

Theorem (Nilpotent Operator Raised to Dimension of Domain is 0): Suppose N ∈ L(V) is nilpotent. Then N^dim(V) = 0.
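All three example operators can be checked by iterating them on a generic input; a small sketch with n = 4 and m = 3:

```python
def N_swap(z):   # N(z1, z2, z3, z4) = (z3, z4, 0, 0): here null(N) = range(N)
    return (z[2], z[3], 0, 0)

def N_shift(z):  # shift operator: N(z1, ..., zn) = (0, z1, ..., z_{n-1})
    return (0,) + z[:-1]

def D(p):        # differentiation on P_3(F); p[i] = coefficient of x^i
    return [i * c for i, c in enumerate(p)][1:] + [0]

z = (1, 2, 3, 4)
assert N_swap(N_swap(z)) == (0, 0, 0, 0)   # N^2 = 0

w = z
for _ in range(4):                          # N^n = 0 with n = 4
    w = N_shift(w)
assert w == (0, 0, 0, 0)

p = [1, 1, 1, 1]                            # 1 + x + x^2 + x^3
for _ in range(4):                          # D^{m+1} = 0 with m = 3
    p = D(p)
assert p == [0, 0, 0, 0]
print("all three example operators are nilpotent")
```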
Proof (Nilpotent Operator Raised to Dimension of Domain is 0): Since N is nilpotent, G(0, N) = V. [Description of Generalized Eigenspace] implies N^dim(V) = 0. ∎

Comment (Why Are We Here???): Given T ∈ L(V), we want to find a basis B(V) of V such that M(T; B(V)) is as simple as possible, meaning that the matrix contains many 0's. Nilpotent operators will help us in this pursuit.

Matrix of a Nilpotent Operator

Theorem (Matrix of a Nilpotent Operator): Suppose N ∈ L(V) is nilpotent. Then ∃ a basis B(V) so that
  M(N; B(V)) = [ 0 ∗ ⋯ ∗ ; ⋱ ⋱ ⋮ ; ⋱ ∗ ; 0 ⋯ 0 ]
(0's on and below the diagonal, arbitrary entries ∗ above); that is, M(N; B(V)) is strictly upper triangular.
Proof (Matrix of a Nilpotent Operator):
(1) Let B1(null(N)) be a basis for null(N).
(2) Let B2(null(N²)) be an extension of B1(null(N)) to a basis for null(N²).
⋮
(k+1) Let B(k+1)(null(N^(k+1))) be an extension of Bk(null(N^k)) to a basis for null(N^(k+1)).
STOP when B(k+1)(null(N^(k+1))) is a basis for V; [Nilpotent Operator Raised to Dimension of Domain is 0] guarantees this will happen.

Now, consider this basis B(V) = v1, …, vn and the matrix M(N; B(V)):
(i) The first dim(null(N)) columns, corresponding to B1(null(N)), are all zeros (since those vectors are a basis for null(N)).
(ii) The next dim(null(N²)) − dim(null(N)) columns correspond to the extension of B1(null(N)) to a basis for null(N²); any of these vectors vℓ ∈ null(N²), so N(vℓ) ∈ null(N); this means
  N(vℓ) = a1 v1 + ⋯ + a_{dim(null(N))} v_{dim(null(N))};
since ℓ > dim(null(N)), only entries strictly above the diagonal are non-zero.
(k) As we process the dim(null(N^(k+1))) − dim(null(N^k)) columns corresponding to the extension of Bk(null(N^k)) to a basis for null(N^(k+1)): any vector in that block vℓ ∈ null(N^(k+1)), so N(vℓ) ∈ null(N^k), which, as above, forces all diagonal and sub-diagonal entries in M(N) to be zeros. ∎

Matrix of a Nilpotent Operator — Examples

Example (Revisited from [Slide 20]):
- Given N(z1, z2, z3, z4) = (z3, z4, 0, 0), we have
    M(N; Bstd.coord) = [ 0 0 1 0 ; 0 0 0 1 ; 0 0 0 0 ; 0 0 0 0 ]   — strictly upper triangular. ∎
- For N(z1, …, zn) = (0, z1, …, z(n−1)), M(N; Bstd.coord) has 1's on the subdiagonal and 0's elsewhere — not quite what we want.
Example (Revisited from [Slide 20]):
- For N(z1, …, zn) = (0, z1, …, z(n−1)), with respect to the standard basis in reverse order, Bstd.coord.reverse.order = {en, e(n−1), …, e1}, M(N) has 1's on the superdiagonal and 0's elsewhere — strictly upper triangular. ∎
- For D ∈ L(P_m(F)) defined by D(p) = p′, with respect to the standard polynomial basis {1, x, x², …, x^m}, M(D; Bstd.poly) has the entries 1, 2, …, m on the superdiagonal (since D(x^k) = k x^(k−1)) and 0's elsewhere — strictly upper triangular. ∎

⟨⟨⟨ Live Math ⟩⟩⟩ e.g. 8A-{3, 4, 5}

Live Math :: Covid-19 Version — 8A-5

8A-5: Suppose T ∈ L(V), m is a positive integer, and v ∈ V is such that T^(m−1)(v) ≠ 0 but T^m(v) = 0.
Prove that v, T(v), T²(v), …, T^(m−1)(v) is linearly independent.

Step "0": Suppose a0, a1, …, a(m−1) ∈ F are such that
  a0 v + a1 T(v) + a2 T²(v) + ⋯ + a(m−1) T^(m−1)(v) = 0.   (8A-5.i)
Since T^(m−1)(v) ≠ 0, applying T^(m−1) to (8A-5.i) we get a0 T^(m−1)(v) = 0; this implies a0 = 0.

Step "1": We now have a1, …, a(m−1) ∈ F such that
  a1 T(v) + a2 T²(v) + ⋯ + a(m−1) T^(m−1)(v) = 0.   (8A-5.ii)
Applying T^(m−2) to (8A-5.ii) we get a1 T^(m−1)(v) = 0; this implies a1 = 0.

Step "k": Keep turning "the crank," and we get a0 = a1 = ⋯ = a(m−1) = 0, which means that v, T(v), T²(v), …, T^(m−1)(v) is linearly independent. ∎
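The statement of 8A-5 can be sanity-checked with the shift operator from the nilpotent examples: on F⁴ with v = e1 and m = 4, the vectors v, N(v), N²(v), N³(v) turn out to be e1, …, e4, which are visibly linearly independent. A sketch:

```python
def N_shift(z):  # shift operator from the notes: N(z1, ..., zn) = (0, z1, ..., z_{n-1})
    return (0,) + z[:-1]

v = (1, 0, 0, 0)          # N^3(v) != 0 but N^4(v) = 0, so m = 4
vecs = [v]
for _ in range(3):
    vecs.append(N_shift(vecs[-1]))

print(vecs)  # the standard basis vectors e1, e2, e3, e4
assert vecs == [(1, 0, 0, 0), (0, 1, 0, 0), (0, 0, 1, 0), (0, 0, 0, 1)]
assert N_shift(vecs[-1]) == (0, 0, 0, 0)   # one more application gives 0
```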
Invariance of the Null Space and Range of p(T)

We now put our new pieces together and show that every operator on a finite-dimensional complex vector space has enough generalized eigenvectors to provide a decomposition. We need some "glue" for the proof of the main result:

Theorem (The Null Space and Range of p(T) are Invariant Under T): Suppose T ∈ L(V) and p ∈ P(F). Then null(p(T)) and range(p(T)) are invariant under T.

Proof (The Null Space and Range of p(T) are Invariant Under T):
(1) Suppose v ∈ null(p(T)); then p(T)(v) = 0, hence
  p(T)(T(v)) = T(p(T)(v)) = T(0) = 0,
so T(v) ∈ null(p(T)).
(2) Suppose v ∈ range(p(T)); then ∃ u ∈ V: v = p(T)(u), hence
  T(v) = T(p(T)(u)) = p(T)(T(u)),
so T(v) ∈ range(p(T)). ∎

Description of Operators on Complex Vector Spaces

Theorem (Description of Operators on Complex Vector Spaces): Suppose V is a complex vector space and T ∈ L(V). Let λ1, …, λm be the distinct eigenvalues of T. Then
(a) V = G(λ1, T) ⊕ ⋯ ⊕ G(λm, T);
(b) each G(λk, T) is invariant under T;
(c) each (T − λk I)|G(λk, T) is nilpotent.

If we "trade in" our eigenspaces E(λk, T) for generalized eigenspaces G(λk, T), we can decompose all operators into a direct sum of invariant subspaces! This is a fairly big deal.

Proof (Description of Operators on Complex Vector Spaces): Let n = dim(V).
(b, c): By [Description of Generalized Eigenspace], G(λk, T) = null((T − λk I)^n); applying [The Null Space and Range of p(T) are Invariant Under T] with p(z) = (z − λk)^n, we get (b), invariance. (c), nilpotency, follows directly from the definitions.
(a): If n = 1, (a) is trivially true [Induction–Base]. Let n > 1, and assume [Induction–Hypothesis]: (a) holds for all W with dim(W) < n. Since V is a complex vector space, T has an eigenvalue [Existence of Eigenvalues (Notes#5)]. Applying [V = null(T^n) ⊕ range(T^n); n = dim(V)] to (T − λ1 I) shows
  V = G(λ1, T) ⊕ U,   (a*)
where U = range((T − λ1 I)^n).

Using [The Null Space and Range of p(T) are Invariant Under T] with p(z) = (z − λ1)^n, we see that U is invariant under T. Since G(λ1, T) ≠ {0}, dim(U) < n, and we can apply the inductive hypothesis to T|U. All generalized eigenvectors corresponding to λ1 are in G(λ1, T); hence the eigenvalues of T|U are among {λ2, …, λm}, and λ1 is not among them. We can now write
  U = G(λ2, T|U) ⊕ ⋯ ⊕ G(λm, T|U).
Showing that G(λk, T|U) = G(λk, T) completes the proof. Let k ∈ {2, …, m}; the inclusion G(λk, T|U) ⊂ G(λk, T) is clear. To show the other direction, let v ∈ G(λk, T). By (a*) we can write v = v1 + u, where v1 ∈ G(λ1, T) and u ∈ U. We need: v1 = 0 and u ∈ G(λk, T|U).
By the inductive hypothesis, we can write u = v2 + ⋯ + vm, where each vℓ ∈ G(λℓ, T|U) ⊂ G(λℓ, T). We have
  v = v1 + v2 + ⋯ + vm.
Since generalized eigenvectors corresponding to distinct eigenvalues are linearly independent [Linearly Independent Generalized Eigenvectors], this expression forces vℓ = 0 ∀ ℓ ≠ k. Since k ∈ {2, …, m}, we must have v1 = 0; thus v = u ∈ U, and we can conclude v ∈ G(λk, T|U). ∎

A Basis of Generalized Eigenvectors

The following looks like an afterthought, but it completes the story (so far)…

Theorem (A Basis of Generalized Eigenvectors): Suppose V is a complex vector space and T ∈ L(V). Then there is a basis of V consisting of generalized eigenvectors of T.

Proof (A Basis of Generalized Eigenvectors): Choose a basis of each G(λk, T) in [Description of Operators on Complex Vector Spaces]. Put all these bases together to form a basis of V consisting of generalized eigenvectors of T. ∎
Multiplicity

Definition (Multiplicity): Suppose T ∈ L(V). The multiplicity of an eigenvalue λ of T is defined to be the dimension of the corresponding generalized eigenspace G(λ, T); i.e. the multiplicity of an eigenvalue λ of T equals
  dim(null((T − λI)^dim(V))).

Comment (Multiplicity: Math 254 vs. Math 524):
  "algebraic multiplicity" (Math 254) = dim(null((T − λI)^dim(V))) = dim(G(λ, T)) (Math 524)
  "geometric multiplicity" (Math 254) = dim(null(T − λI)) = dim(E(λ, T)) (Math 524)

Example: Let T ∈ L(C³) be defined by T(z1, z2, z3) = (6z1 + 3z2 + 4z3, 6z2 + 2z3, 7z3); with respect to the standard basis:
  M(T) = [ 6 3 4 ; 0 6 2 ; 0 0 7 ],  λ(T) = {6, 7}.
  E(6, T) = span((1, 0, 0)),  E(7, T) = span((10, 2, 1))
  G(6, T) = span((1, 0, 0), (0, 1, 0)),  G(7, T) = span((10, 2, 1))
  C³ = G(6, T) ⊕ G(7, T)
B = {(1, 0, 0), (0, 1, 0), (10, 2, 1)} is a basis of C³ consisting of generalized eigenvectors of T.
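The multiplicities in this example can be confirmed from null-space dimensions of powers of (T − λI); a sketch computing dim E(λ, T) and dim G(λ, T) = dim null((T − λI)³) for λ = 6, 7:

```python
from fractions import Fraction

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def rank(A):
    # Gaussian elimination over the rationals
    M = [[Fraction(x) for x in row] for row in A]
    r = 0
    for c in range(len(M[0])):
        piv = next((i for i in range(r, len(M)) if M[i][c] != 0), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        for i in range(r + 1, len(M)):
            f = M[i][c] / M[r][c]
            M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

def cube(A):
    return matmul(matmul(A, A), A)

M6 = [[0, 3, 4], [0, 0, 2], [0, 0, 1]]    # T - 6I in the standard basis
M7 = [[-1, 3, 4], [0, -1, 2], [0, 0, 0]]  # T - 7I

print(3 - rank(M6), 3 - rank(cube(M6)))  # dim E(6,T) = 1, dim G(6,T) = 2
print(3 - rank(M7), 3 - rank(cube(M7)))  # dim E(7,T) = 1, dim G(7,T) = 1
```

The generalized multiplicities 2 + 1 add up to dim(C³) = 3, matching the direct-sum decomposition C³ = G(6, T) ⊕ G(7, T).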
Sum of the Multiplicities Equals dim(V)

Theorem (Sum of the Multiplicities Equals dim(V))
Suppose V is a complex vector space and T ∈ L(V). Then the sum of the
multiplicities of all the eigenvalues of T equals dim(V).

Proof (Sum of the Multiplicities Equals dim(V))
[Description of Operators on Complex Vector Spaces] and [A Sum is a Direct
Sum if and only if Dimensions Add Up (Notes#3.2)]. ∎

Comment (Multiplicity Without Determinants)
It is worth noting that our definition of multiplicity does not require
determinants. Also, we do not need two "types" of multiplicity.

Block Diagonal Matrices

We introduce a bit more language for the discussion of matrix forms:

Definition (Block Diagonal Matrix)
A block diagonal matrix is a square matrix of the form

  A = [ A1        0
            ⋱
        0        Am ],

where A1, . . . , Am are square matrices along the diagonal; all other
entries are 0.
Block Diagonal Matrix

Example (Block Diagonal Matrix)
Let

  A1 = [ 4 ],   A2 = [ 2 1    A3 = [ 1 1 0
                       0 2 ],        0 1 1
                                     0 0 1 ];

then A = diag(A1, A2, A3) is a block diagonal matrix:

  A = [ 4 0 0 0 0 0
        0 2 1 0 0 0
        0 0 2 0 0 0
        0 0 0 1 1 0
        0 0 0 0 1 1
        0 0 0 0 0 1 ]

Block Diagonal Matrix with Upper-Triangular Blocks

Theorem (Block Diagonal Matrix with Upper-Triangular Blocks)
Suppose V is a complex vector space and T ∈ L(V). Let λ1, . . . , λm be the
distinct eigenvalues of T, with multiplicities d1, . . . , dm. Then there is
a basis of V with respect to which T has a block diagonal matrix of the form
A = diag(A1, . . . , Am), where each Ak is a (dk × dk) upper-triangular
matrix of the form

  Ak = [ λk       ∗
             ⋱
         0       λk ]

Upper-Triangular Matrix vs. Block Diagonal Matrix with Upper-Triangular Blocks

Rewind (Schur's Theorem [Notes#6])
Suppose V is a finite-dimensional complex vector space and T ∈ L(V).
Then T has an upper-triangular matrix with respect to some orthonormal basis
of V.

Comment
Here, we are trading away the orthonormal basis, and we are getting more
zeros in the matrix. This is useful for theoretical purposes, but not always
a good idea in practical computations. For computational stability and
accuracy [Math 543], orthonormal bases are very desirable.

Block Diagonal Matrix with Upper-Triangular Blocks

Proof (Block Diagonal Matrix with Upper-Triangular Blocks)
Each (T − λk I)|G(λk,T) is nilpotent [Description of Operators on Complex
Vector Spaces]. For each k, we select a basis Bk of G(λk, T) (which is a
vector space with dim(G(λk, T)) = dk) such that M((T − λk I)|G(λk,T); Bk) is
strictly upper triangular; thus

  M(T|G(λk,T); Bk) = M((T − λk I)|G(λk,T) + λk I|G(λk,T))

is upper triangular with λk repeated on the diagonal. Collecting the bases
Bk(G(λk, T)), k = 1, . . . , m, gives a basis B(V), and M(T; B(V)) has the
desired structure. ∎

Note: the example matrix on [Slide 42] is in this form. For an operator T on
a 6-dimensional vector space with M(T) as in the example, the eigenvalues are
{4, 2, 1} with corresponding multiplicities {1, 2, 3}. Additionally, the
matrices on [Slides 25–27] are in this form.
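The matrix A = diag(A1, A2, A3) from the example can be assembled mechanically.
A small sketch (Python/NumPy, not part of the notes; `block_diag` is a
hand-rolled helper here, not a library routine):

```python
import numpy as np

def block_diag(*blocks):
    # place each square block along the diagonal; all other entries are 0
    n = sum(b.shape[0] for b in blocks)
    A = np.zeros((n, n))
    i = 0
    for b in blocks:
        k = b.shape[0]
        A[i:i+k, i:i+k] = b
        i += k
    return A

A1 = np.array([[4.0]])
A2 = np.array([[2.0, 1.0],
               [0.0, 2.0]])
A3 = np.array([[1.0, 1.0, 0.0],
               [0.0, 1.0, 1.0],
               [0.0, 0.0, 1.0]])
A = block_diag(A1, A2, A3)   # 6x6, blocks of sizes 1, 2, 3
print(A)
```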
Block Diagonal Matrix with Upper-Triangular Blocks

Example (Revisited from [Slide 39])
Let T ∈ L(C³) be defined by T(z1, z2, z3) = (6z1 + 3z2 + 4z3, 6z2 + 2z3, 7z3);
with respect to the standard basis the matrix is not in the desired form:

  M(T) = [ 6 3 4
           0 6 2
           0 0 7 ],   λ(T) = {6, 7}.

However, G(6, T) = span((1, 0, 0), (0, 1, 0)) and G(7, T) = span((10, 2, 1)),
so that B = {(1, 0, 0), (0, 1, 0), (10, 2, 1)}, and

  M(T; B) = [ 6 3 0
              0 6 0
              0 0 7 ]

Square Roots

Some, but not all, operators have square roots. At this point, we know that
positive operators have positive square roots [Characterization of Positive
Operators (Notes#7.2)]. To that we add:

Theorem (Identity Plus Nilpotent has a Square Root)
Suppose N ∈ L(V) is nilpotent; then (I + N) has a square root.

Proof (Identity Plus Nilpotent has a Square Root)
We use the Taylor series for √(1 + x) as motivation:

  √(1 + x) = 1 + a1 x + a2 x² + · · ·

For our purpose, the values of the coefficients are not (yet) important;
since N ∈ L(V) is nilpotent, N^m = 0 for some value of m, and we seek a
square root of the form

  √(I + N) = I + a1 N + a2 N² + · · · + a_{m−1} N^{m−1}
Square Roots

Proof (Identity Plus Nilpotent has a Square Root, continued)
We select the coefficients a1, . . . , a_{m−1} so that

  I + N = (I + a1 N + a2 N² + · · · + a_{m−1} N^{m−1})².

Given enough patience, we can figure out what the coefficient values should
be; but all we need here is that they exist. ∎

We can now use this result to guarantee that all invertible operators
(over C) have square roots...

Note that this result does not hold over R; e.g. T(x) = −x, x ∈ R, does not
have a square root.

Theorem (Over C, Invertible Operators Have Square Roots)
Suppose V is a complex vector space and T ∈ L(V) is invertible. Then T has a
square root.

Proof (Over C, Invertible Operators Have Square Roots)
Let λ1, . . . , λm be the distinct eigenvalues of T. For each k there exists
a nilpotent Nk ∈ L(G(λk, T)) such that T|G(λk,T) = λk I + Nk [Description of
Operators on Complex Vector Spaces]. Since T is invertible, λk ≠ 0, so we can
write

  T|G(λk,T) = λk ( I + Nk/λk ),   k = 1, . . . , m.
Square Roots

Proof (Over C, Invertible Operators Have Square Roots, continued)
The scaled operators Nk/λk are nilpotent, so each (I + Nk/λk) has a square
root [Identity Plus Nilpotent has a Square Root]; then
Rk = √(λk) √(I + Nk/λk) is a square root of T|G(λk,T).

Any v ∈ V can be written uniquely in the form v = u1 + · · · + um with
uk ∈ G(λk, T) [Description of Operators on Complex Vector Spaces]. Now
define R ∈ L(V) by

  R(v) = R1(u1) + · · · + Rm(um);

since Rℓ(uℓ) ∈ G(λℓ, T) for all uℓ ∈ G(λℓ, T),

  R²(v) = R1²(u1) + · · · + Rm²(um)
        = T|G(λ1,T)(u1) + · · · + T|G(λm,T)(um)
        = T(v). ∎

⟨⟨⟨ Live Math ⟩⟩⟩  e.g. 8B-{3, 4, 5}

Live Math :: Covid-19 Version — 8B-4

8B-4: Suppose T ∈ L(V), dim(V) = n, and null(T^{n−2}) ≠ null(T^{n−1}). Show
that T has at most two distinct eigenvalues.
Since null(T^{n−2}) ≠ null(T^{n−1}),

  {0} = null(T⁰) ⊊ null(T¹) ⊊ · · · ⊊ null(T^{n−2}) ⊊ null(T^{n−1}).

Therefore

  dim(null(T^{n−1})) ≥ (n − 1)  ⇔  dim(G(0, T)) ≥ (n − 1).

Also, we know (λj ≠ 0)

  V = G(0, T) ⊕ [ G(λ1, T) ⊕ · · · ⊕ G(λm, T) ] = G(0, T) ⊕ W,

where dim(V) = n, dim(G(0, T)) ≥ (n − 1), and hence dim(W) ≤ 1. Thus at most
one of the generalized eigenspaces G(λj, T) with λj ≠ 0 is nontrivial, and T
has at most two distinct eigenvalues.

Characteristic and Minimal Polynomials — The Cayley–Hamilton Theorem

Characteristic Polynomial

Keep in mind: all the polynomial action here is over F = C.

Definition (Characteristic Polynomial)
Suppose V is a complex vector space and T ∈ L(V). Let λ1, . . . , λm denote
the distinct eigenvalues of T, with multiplicities d1, . . . , dm. The
polynomial

  pT(z) = (z − λ1)^{d1} · · · (z − λm)^{dm}

is called the characteristic polynomial of T.

Comment
Again, we have defined something familiar from [Math 254] without the use of
the determinant.

Characteristic Polynomial

Example (Characteristic Polynomials)
The characteristic polynomials associated with previous examples:
  [Slide 39]: p(z) = (z − 6)²(z − 7)¹
  [Slide 42]: p(z) = (z − 4)¹(z − 2)²(z − 1)³

Theorem (Degree and Zeros of Characteristic Polynomial)
Suppose V is a complex vector space and T ∈ L(V). Then
  (a) the characteristic polynomial pT(z) of T has degree dim(V);
  (b) the zeros of pT(z) are the eigenvalues of T.

Proof (Degree and Zeros of Characteristic Polynomial)
(a) follows from [Sum of the Multiplicities Equals dim(V)], and (b) from the
definition of the characteristic polynomial. ∎
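For the [Slide 39] example, the characteristic polynomial can be built
directly from the definition: find each multiplicity as
dim null((M − λI)^dim(V)), then multiply out the factors. A sketch
(Python/NumPy, not part of the notes; `multiplicity` is a hypothetical
helper name):

```python
import numpy as np

M = np.array([[6.0, 3.0, 4.0],
              [0.0, 6.0, 2.0],
              [0.0, 0.0, 7.0]])
n = M.shape[0]

def multiplicity(lam):
    # dim G(lam, T) = dim null((M - lam I)^n) = n - rank((M - lam I)^n)
    A = np.linalg.matrix_power(M - lam * np.eye(n), n)
    return n - np.linalg.matrix_rank(A)

# roots of p_T, each eigenvalue repeated according to its multiplicity
roots = [lam for lam in (6.0, 7.0) for _ in range(int(multiplicity(lam)))]
coeffs = np.poly(roots)   # coefficients of (z - 6)^2 (z - 7) = z^3 - 19z^2 + 120z - 252
print(coeffs)
```

Part (a) of the theorem appears as len(roots) == dim(V) = 3.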
Cayley–Hamilton Theorem

Theorem (Cayley–Hamilton Theorem)
Suppose V is a complex vector space and T ∈ L(V). Let pT(z) denote the
characteristic polynomial of T. Then pT(T) = 0.

Comment (Cayley–Hamilton Theorem over R)
The Cayley–Hamilton Theorem also holds for real vector spaces.

Comment (Importance of the Cayley–Hamilton Theorem)
The Cayley–Hamilton Theorem is one of the key structural theorems in linear
algebra. For one thing, it gives us the "license" to find eigenvalues using
the characteristic polynomial.

Proof (Cayley–Hamilton Theorem)
Let λ1, . . . , λm be the distinct eigenvalues of the operator T, and let
d1, . . . , dm be the dimensions of the corresponding generalized eigenspaces
G(λ1, T), . . . , G(λm, T). For each k = 1, . . . , m, we know that
(T − λk I)|G(λk,T) is nilpotent. Thus we have (T − λk I)^{dk}|G(λk,T) = 0
[Nilpotent Operator Raised to Dimension of Domain is 0].

Every vector in V is a sum of vectors in G(λ1, T), . . . , G(λm, T)
[Description of Operators on Complex Vector Spaces]; i.e. for every v ∈ V
there exist vℓ ∈ G(λℓ, T) such that v = v1 + · · · + vm.

To prove that pT(T) = 0 (⇔ pT(T)v = 0 for all v ∈ V), we need only show
that pT(T)|G(λk,T) = 0, k = 1, . . . , m.
Cayley–Hamilton Theorem

Proof (Cayley–Hamilton Theorem, continued)
Fix k ∈ {1, . . . , m}. We have

  pT(T) = (T − λ1 I)^{d1} · · · (T − λm I)^{dm}.

The operators on the right side of the equation above all commute, so we can
move the factor (T − λk I)^{dk} to be the last term in the expression on the
right. Since (T − λk I)^{dk}|G(λk,T) = 0, we conclude that
pT(T)|G(λk,T) = 0. ∎

Monic Polynomial

Here, we introduce an alternative polynomial which can be used to identify
eigenvalues. First, we need some language and notation (as usual!)

Definition (Monic Polynomial)
A monic polynomial is a polynomial whose highest-degree coefficient equals 1.

Example
  Monic:     p(z) = z⁴⁰⁷ − πz¹⁰³ + √7
  Not monic: q(z) = (1 + ε)z² + 1,  ε > 0

Minimal Polynomial

Theorem (Minimal Polynomial)
Suppose T ∈ L(V). Then there is a unique monic polynomial p of smallest
degree such that p(T) = 0.

Proof (Minimal Polynomial)
Let n = dim(V); then the list (of length n² + 1)

  I, T, T², . . . , T^{n²}

is not linearly independent in L(V), since dim(L(V)) = n². Let m be the
smallest positive integer such that the list

  I, T, T², . . . , T^m                                    (i)

is linearly dependent.
[Linear Dependence (Notes#2)] implies that one of the operators in the list
above is a linear combination of the previous ones.

Proof (Minimal Polynomial, continued)
The choice of m means that T^m is a linear combination of
I, T, T², . . . , T^{m−1}; hence there exist a0, a1, . . . , a_{m−1} ∈ F such
that

  a0 I + a1 T + a2 T² + · · · + a_{m−1} T^{m−1} + T^m = 0.   (ii)

We use the coefficients to define a monic polynomial p ∈ P(F) by

  p(z) = a0 + a1 z + a2 z² + · · · + a_{m−1} z^{m−1} + z^m.

By (ii), p(T) = 0. That takes care of existence.

To show uniqueness, note that the choice of m implies that no monic
polynomial q ∈ P(F) with degree smaller than m can satisfy q(T) = 0. Suppose
q ∈ P(F) is monic with degree m and q(T) = 0. Then (p − q)(T) = 0 and
deg(p − q) < m. The choice of m now implies that (p − q) is the zero
polynomial ⇔ q = p, completing the proof. ∎

Minimal Polynomial of an Operator T

Definition (Minimal Polynomial (of an operator T))
Suppose T ∈ L(V). Then the minimal polynomial of T is the unique monic
polynomial p of smallest degree such that p(T) = 0.

The proof of the last theorem shows that the degree of the minimal polynomial
of each operator on V is at most (dim(V))². The [Cayley–Hamilton Theorem]
tells us that if V is a complex vector space, then the minimal polynomial of
each operator on V has degree at most dim(V). This improvement
(dim(V))² → dim(V) also holds on real vector spaces.
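The Cayley–Hamilton Theorem can be checked directly on the [Slide 39]
example, where pT(z) = (z − 6)²(z − 7); evaluating at M(T) must give the
zero matrix. A sketch (Python/NumPy, not part of the notes):

```python
import numpy as np

M = np.array([[6.0, 3.0, 4.0],
              [0.0, 6.0, 2.0],
              [0.0, 0.0, 7.0]])
I = np.eye(3)

# p_T(M) = (M - 6I)^2 (M - 7I); Cayley-Hamilton says this is 0
P = (M - 6*I) @ (M - 6*I) @ (M - 7*I)
print(np.allclose(P, 0))   # True
```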
Finding the Minimal Polynomial — Take#1

"Guaranteed"* to Work, Labor Intensive: Given the matrix M(T) (with respect
to some basis) of an operator T ∈ L(V), the minimal polynomial of T can be
identified as follows. Consider the system of dim(V)² linear equations** (one
for each matrix entry)

  a0 M(I) + a1 M(T) + · · · + a_{m−1} M(T)^{m−1} = −M(T)^m        (i)

for successive values of m = 1, . . . , dim(V)², until there is a solution
a0, . . . , a_{m−1}; the minimal polynomial is then given by

  p(z) = a0 + a1 z + · · · + a_{m−1} z^{m−1} + z^m.

  *  Requires an iPhone XIX with infinite precision processing capabilities.
  ** The linear systems are of the form A x = b, where A ∈ F^{dim(V)² × m},
     b ∈ F^{dim(V)²}, and the solution vector is x = (a0, a1, . . . , a_{m−1}) ∈ F^m.

Finding the Minimal Polynomial — Take#2

Works "Almost Always"*, Less Labor Intensive: Given the matrix M(T), the
minimal polynomial of T can, with probability 1, be identified as follows.
Pick a random vector v ∈ F^{dim(V)}, and consider the system of dim(V)
linear equations (one for each vector entry)

  a0 M(I)v + a1 M(T)v + · · · + a_{m−1} M(T)^{m−1} v = −M(T)^m v    (i)

for successive values of m = 1, . . . , dim(V), until there is a solution
a0, . . .
, a_{m−1}; the minimal polynomial is then given by

  p(z) = a0 + a1 z + · · · + a_{m−1} z^{m−1} + z^m.

  * The random v ∈ F^{dim(V)} must be such that v = u1 + · · · + um with
    uℓ ≠ 0, where F^{dim(V)} = G(λ1, M(T)) ⊕ · · · ⊕ G(λm, M(T)) and
    uℓ ∈ G(λℓ, M(T)); and it still requires an iPhone XIX with infinite
    precision processing capabilities. See [Math 543] for a discussion of
    finite precision computing.

Finding the Minimal Polynomial — Computation Example

Let T ∈ L(C⁵), with M(T) wrt the standard basis:

  M(T) = [ 0 0 0 0 −3
           1 0 0 0  6
           0 1 0 0  0
           0 0 1 0  0
           0 0 0 1  0 ]

Let's try the "random" vector v = (1, 1, 1, 1, 1); we construct A ∈ C^{5×6}
by letting the kth column ak be M(T)^{k−1} v:

  A = [ 1 −3 −3 −3 −3 −21
        1  7  3  3  3  39
        1  1  7  3  3   3
        1  1  1  7  3   3
        1  1  1  1  7   3 ]

We are looking for a solution to a linear system; we put [Math 254] to good
use, and compute

  rref(A) = [ 1 0 0 0 0 −3
              0 1 0 0 0  6
              0 0 1 0 0  0
              0 0 0 1 0  0
              0 0 0 0 1  0 ]

Since the 6th column is linearly dependent on the others (it does not have a
leading one), we identify the linear relation

  −3a1 + 6a2 − a6 = 0,

which yields the minimal polynomial p(z) = 3 − 6z + z⁵.
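Take#2 on this 5 × 5 example can be sketched in a few lines (Python/NumPy,
not part of the notes). Rather than looping over m, the sketch solves
directly at m = 5 = dim(V), where a solution is guaranteed by the
Cayley–Hamilton Theorem:

```python
import numpy as np

M = np.array([[0., 0., 0., 0., -3.],
              [1., 0., 0., 0.,  6.],
              [0., 1., 0., 0.,  0.],
              [0., 0., 1., 0.,  0.],
              [0., 0., 0., 1.,  0.]])
v = np.ones(5)

cols = [v]
for _ in range(5):
    cols.append(M @ cols[-1])       # next column: M^k v

A = np.column_stack(cols[:-1])      # columns v, Mv, ..., M^4 v
b = -cols[-1]                       # right-hand side -M^5 v
a, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.round(a, 6))               # coefficients a0, ..., a4
```

The solution is a = (3, −6, 0, 0, 0), recovering p(z) = 3 − 6z + z⁵.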
Finding the Minimal Polynomial — Computation Example

Finally, "we"* compute p(M(T)) = 3I − 6M(T) + M(T)⁵:

    [ 1 0 0 0 0       [ 0 0 0 0 −3      [ −3  0  0  0 −18
      0 1 0 0 0         1 0 0 0  6         6 −3  0  0  36
  3   0 0 1 0 0  − 6    0 1 0 0  0   +     0  6 −3  0   0   = 0,
      0 0 0 1 0         0 0 1 0  0         0  0  6 −3   0
      0 0 0 0 1 ]       0 0 0 1  0 ]       0  0  0  6  −3 ]

which shows that we indeed have the minimal polynomial.

  * We = I + my computer.

Multiple of the Minimal Polynomial

Theorem (q(T) = 0 ⇔ q is a Multiple of the Minimal Polynomial)
Suppose T ∈ L(V) and q ∈ P(F). Then q(T) = 0 if and only if q is a
polynomial multiple of the minimal polynomial of T.

Proof (q(T) = 0 ⇔ q is a Multiple of the Minimal Polynomial)
Let p be the minimal polynomial of T.

(⇐) Suppose q is a polynomial multiple of p. Then there exists s ∈ P(F) such
that q = ps, and

  q(T) = p(T)s(T) = 0 · s(T) = 0.
Multiple of the Minimal Polynomial

Proof (q(T) = 0 ⇔ q is a Multiple of the Minimal Polynomial, continued)
(⇒) Suppose q(T) = 0. By the [Division Algorithm for Polynomials (Notes#4)],
there exist s, r ∈ P(F) such that

  q = ps + r                                               (i)

with deg(r) < deg(p); therefore

  0 = q(T) = p(T)s(T) + r(T) = r(T),

since p is the minimal polynomial. Hence r(T) = 0 with deg(r) < deg(p),
which forces r ≡ 0, and q = ps. ∎

Characteristic Polynomial and Minimal Polynomial

Theorem (Characteristic Polynomial is a Multiple of Minimal Polynomial)
Suppose F = C and T ∈ L(V). Then the characteristic polynomial of T is a
polynomial multiple of the minimal polynomial of T.

Note: We have not yet defined the characteristic polynomial when F = R, but
once we do, the above theorem will apply.

Proof (Characteristic Polynomial is a Multiple of Minimal Polynomial)
By the [Cayley–Hamilton Theorem], pT_char(T) = 0; and [q(T) = 0 ⇔ q is a
Multiple of the Minimal Polynomial] then shows pT_char = s · pT_min for some
s ∈ P(F). ∎

The Minimal Polynomial → Eigenvalues

Theorem (Eigenvalues are the Zeros of the Minimal Polynomial)
Let T ∈ L(V). Then the zeros of the minimal polynomial of T are the
eigenvalues of T.

Proof (Eigenvalues are the Zeros of the Minimal Polynomial)
Let p(z) = a0 + a1 z + · · · + a_{m−1} z^{m−1} + z^m be the minimal
polynomial of T. Suppose λ ∈ F is a zero of p. Then p can be written in the
form

  p(z) = (z − λ)q(z),

where q is a monic polynomial with coefficients in F [Each Zero of a
Polynomial Corresponds to a Degree-1 Factor (Notes#4)].
The Minimal Polynomial → Eigenvalues

Proof (Eigenvalues are the Zeros of the Minimal Polynomial, continued)
Since p(T) = 0, we have 0 = (T − λI)q(T)(v) for all v ∈ V. Since
deg(q) < deg(p), there exists v ∈ V with q(T)(v) ≠ 0; therefore λ must be an
eigenvalue of T.

Now, suppose λ ∈ F is an eigenvalue of T; then there exists v ∈ V, v ≠ 0,
with T^j(v) = λ^j v, j = 1, 2, . . . . Now,

  0 = p(T)(v) = (a0 I + a1 T + · · · + a_{m−1} T^{m−1} + T^m)(v)
              = (a0 + a1 λ + · · · + a_{m−1} λ^{m−1} + λ^m)v
              = p(λ)v,

so p(λ) = 0. ∎

The Minimal Polynomial ↔ Eigenvalues

Example (Re-revisited [Slides 39, 46])
Let T ∈ L(C³) be defined by T(z1, z2, z3) = (6z1 + 3z2 + 4z3, 6z2 + 2z3, 7z3)
wrt the standard basis:

  M(T) = [ 6 3 4
           0 6 2
           0 0 7 ],   λ(T) = {6, 7}.

  G(6, T) = span((1, 0, 0), (0, 1, 0)),  dim(G(6, T)) = 2
  G(7, T) = span((10, 2, 1)),            dim(G(7, T)) = 1

The characteristic polynomial is pT(z) = (z − 6)²(z − 7); the minimal
polynomial is either (z − 6)²(z − 7) or (z − 6)(z − 7). Since

  (M(T) − 6I)(M(T) − 7I) = [ 0 −3 6      (M(T) − 6I)²(M(T) − 7I) = [ 0 0 0
                             0  0 0                                  0 0 0
                             0  0 0 ],                               0 0 0 ],

it follows that the minimal polynomial of T is pT_min(z) = (z − 6)²(z − 7).
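The candidate test in this example is easy to automate: the minimal
polynomial divides the characteristic polynomial and has the same zeros, so
we try the candidates in increasing degree until one annihilates M(T). A
sketch (Python/NumPy, not part of the notes):

```python
import numpy as np

M = np.array([[6., 3., 4.],
              [0., 6., 2.],
              [0., 0., 7.]])
I = np.eye(3)

low  = (M - 6*I) @ (M - 7*I)               # candidate (z - 6)(z - 7)
high = (M - 6*I) @ (M - 6*I) @ (M - 7*I)   # candidate (z - 6)^2 (z - 7)

print(np.allclose(low, 0), np.allclose(high, 0))   # False True
```

Since the lower-degree candidate does not vanish, the minimal polynomial is
(z − 6)²(z − 7).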
The Minimal Polynomial ↔ Eigenvalues

Example (Modified)
Let T ∈ L(C³) be defined by T(z1, z2, z3) = (6z1, 6z2, 7z3) wrt the standard
basis:

  M(T) = [ 6 0 0
           0 6 0
           0 0 7 ],   λ(T) = {6, 7}.

  G(6, T) = span((1, 0, 0), (0, 1, 0)),  dim(G(6, T)) = 2
  G(7, T) = span((0, 0, 1)),             dim(G(7, T)) = 1

The characteristic polynomial is pT(z) = (z − 6)²(z − 7); the minimal
polynomial is either (z − 6)²(z − 7) or (z − 6)(z − 7). Since

  (M(T) − 6I)(M(T) − 7I) = [ 0 0 0
                             0 0 0
                             0 0 0 ],

it follows that the minimal polynomial of T is pT_min(z) = (z − 6)(z − 7).

⟨⟨⟨ Live Math ⟩⟩⟩  e.g. 8C-{3, 4, 5}

Live Math :: Covid-19 Version — 8C-4

8C-4: Give an example of T ∈ L(C⁴) whose characteristic polynomial is
p(z) = (z − 1)(z − 5)³ and whose minimal polynomial is
q(z) = (z − 1)(z − 5)².

Any T with M(T) ∈ R⁴ˣ⁴ upper triangular with diagonal (1, 5, 5, 5) will have
p(z) = (z − 1)(z − 5)³.

Diagonal M(T)?  We try

  M(T) = diag(1, 5, 5, 5)  ⇔  T(z1, z2, z3, z4) = (z1, 5z2, 5z3, 5z4);

  q(z) ∈ { (z − 1)(z − 5), (z − 1)(z − 5)², (z − 1)(z − 5)³ }.
Live Math :: Covid-19 Version — 8C-4, continued

  (M(T) − I₄)(M(T) − 5I₄) = 0   ⇒   q(z) = (z − 1)(z − 5).

Upper Triangular M(T)?  We try

  M(T) = [ 1 0 0 0
           0 5 1 0
           0 0 5 0
           0 0 0 5 ]  ⇔  T(z1, z2, z3, z4) = (z1, 5z2 + z3, 5z3, 5z4);

  (M(T) − I₄)(M(T) − 5I₄) = [ 0 0 0 0
                              0 0 4 0      ≠ 0  ⇒  q(z) ≠ (z − 1)(z − 5),
                              0 0 0 0
                              0 0 0 0 ]

  (M(T) − I₄)(M(T) − 5I₄)² = 0   ⇒   q(z) = (z − 1)(z − 5)².

Jordan Form — Jordan "Normal" / "Canonical" Form

At this point we know that if V is a complex vector space, then for every
T ∈ L(V) there is a basis of V with respect to which T has a [Block Diagonal
Matrix with Upper-Triangular Blocks (Slide 43)].

Now, we are chasing more zeros: the goal is a basis of V wrt which the matrix
of T contains 0's everywhere except possibly on the diagonal (the
eigenvalues) and the first super-diagonal (where we allow 1's or 0's). We use
nilpotent operators to get us there...
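The 8C-4 computation above is a one-liner to verify numerically. A sketch
(Python/NumPy, not part of the notes), using the upper-triangular M(T) with
the single super-diagonal 1:

```python
import numpy as np

M = np.array([[1., 0., 0., 0.],
              [0., 5., 1., 0.],
              [0., 0., 5., 0.],
              [0., 0., 0., 5.]])
I = np.eye(4)

q1 = (M - I) @ (M - 5*I)               # (z - 1)(z - 5) evaluated at M
q2 = (M - I) @ (M - 5*I) @ (M - 5*I)   # (z - 1)(z - 5)^2 evaluated at M

print(np.allclose(q1, 0), np.allclose(q2, 0))   # False True
```

So this M(T) has minimal polynomial (z − 1)(z − 5)², as required.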
Jordan "Normal" / "Canonical" Form — Examples

Example (Compare with [Slides 25–26])
Once again we consider a shift operator in F⁴: N(z1, . . . , z4) =
(0, z1, . . . , z3); its action on v = (1, 0, 0, 0) generates a basis

  B(F⁴) = { N³(v), N²(v), N(v), v } = { e4, e3, e2, e1 },

and

  M(N, B(F⁴)) = [ 0 1 0 0
                  0 0 1 0
                  0 0 0 1
                  0 0 0 0 ]

Definition (Jordan Chain — Generator / Lead Vector; adapted from [Wikipedia])
Given an eigenvalue λ, its corresponding Jordan block gives rise to a Jordan
chain. The generator, or lead vector, vr of the chain is a generalized
eigenvector such that (A − λI)^r vr = 0, where r is the size of the Jordan
block. The vector v1 = (A − λI)^{r−1} vr is an eigenvector corresponding to
λ. In general, v_{i−1} = (A − λI)vi. The lead vector generates the chain via
repeated multiplication by (A − λI); B = {v1, · · · , vr} is a basis for the
Jordan block.
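The F⁴ shift example can be reproduced numerically: build the Jordan-chain
basis { N³(v), N²(v), N(v), v } and change basis to see the
single-super-diagonal form. A sketch (Python/NumPy, not part of the notes):

```python
import numpy as np

# right-shift: N(z1, z2, z3, z4) = (0, z1, z2, z3), so N e_k = e_{k+1}
N = np.zeros((4, 4))
for i in range(1, 4):
    N[i, i - 1] = 1.0

v = np.array([1.0, 0.0, 0.0, 0.0])
chain = [np.linalg.matrix_power(N, k) @ v for k in (3, 2, 1, 0)]  # N^3 v, ..., v
B = np.column_stack(chain)        # change-of-basis matrix; columns e4, e3, e2, e1

J = np.linalg.inv(B) @ N @ B      # matrix of N with respect to the chain basis
print(J)                          # ones on the first super-diagonal, zeros elsewhere
```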
By the previous example, the lead vectors w_{F³,1} = (1, 0, 0), w_{F²,1} = (1, 0), and w_{F¹,1} = (1) will generate bases B(F³), B(F²), and B(F¹) so that

  M(N_{F³}, B(F³)) = [ 0 1 0 ]    M(N_{F²}, B(F²)) = [ 0 1 ]    M(N_{F¹}, B(F¹)) = [ 0 ].
                     [ 0 0 1 ]                       [ 0 0 ]
                     [ 0 0 0 ]

Operators on Complex Vector Spaces — (80/99)

Examples

Example
If we “translate” all that back to N ∈ L(F⁶): N(z₁, …, z₆) = (0, z₁, z₂, 0, z₄, 0), we have 3 lead vectors:

  w₃ = (1, 0, 0, 0, 0, 0),  w₂ = (0, 0, 0, 1, 0, 0),  w₁ = (0, 0, 0, 0, 0, 1).

  B(F⁶) = { N²(w₃), N(w₃), w₃, N(w₂), w₂, w₁ }
        = { (0,0,1,0,0,0), (0,1,0,0,0,0), (1,0,0,0,0,0), (0,0,0,0,1,0), (0,0,0,1,0,0), (0,0,0,0,0,1) },

so that M(N, B(F⁶)) is block diagonal with shift blocks of sizes 3, 2, and 1:

  M(N, B(F⁶)) = [ 0 1 0 0 0 0 ]
                [ 0 0 1 0 0 0 ]
                [ 0 0 0 0 0 0 ]
                [ 0 0 0 0 1 0 ]
                [ 0 0 0 0 0 0 ]
                [ 0 0 0 0 0 0 ]

Operators on Complex Vector Spaces — (81/99)

Basis Corresponding to a Nilpotent Operator

This theorem formalizes what we have demonstrated in the examples:

Theorem (Basis Corresponding to a Nilpotent Operator)
Suppose N ∈ L(V) is nilpotent. Then there exist vectors v₁, …, v_n ∈ V and nonnegative integers m₁, …, m_n such that
(a) N^{m₁}(v₁), …, N(v₁), v₁, …, N^{m_n}(v_n), …, N(v_n), v_n is a basis of V;
(b) N^{m₁+1}(v₁) = ⋯ = N^{m_n+1}(v_n) = 0.

Proof (Basis Corresponding to a Nilpotent Operator)
[Induction–Base] If dim(V) = 1, the only nilpotent operator is 0; let v ≠ 0, and m₁ = 0.
[Induction–Hypothesis] Assume n = dim(V) > 1 and that the theorem holds on all spaces of smaller dimension.

Operators on Complex Vector Spaces — (82/99)

Because N is nilpotent, N is not injective ⇒ not surjective [For L(V): Injectivity ⇔ Surjectivity in Finite Dimensions (Notes#3.2)], and hence dim(range(N)) < dim(V). Thus we can apply our inductive hypothesis to N|_range(N) ∈ L(range(N)).

By [Induction–Hypothesis] applied to N|_range(N) there exist vectors v₁, …, v_n ∈ range(N) and nonnegative integers m₁, …, m_n such that

  N^{m₁}(v₁), …, N(v₁), v₁, …, N^{m_n}(v_n), …, N(v_n), v_n    (i)

is a basis of range(N), and N^{m₁+1}(v₁) = ⋯ = N^{m_n+1}(v_n) = 0.

Operators on Complex Vector Spaces — (83/99)

Since (∀ℓ) v_ℓ ∈ range(N), ∃u_ℓ ∈ V : v_ℓ = N(u_ℓ); thus N^{k+1}(u_ℓ) = N^k(v_ℓ). We use this to rewrite and augment (i):

  N^{m₁+1}(u₁), …, N(u₁), u₁, …, N^{m_n+1}(u_n), …, N(u_n), u_n    (ii)

We need to verify that this is a list of linearly independent vectors. Assume some linear combination of the vectors in (ii) equals zero; apply N to that linear combination; this yields a linear combination of the vectors in (i) equal to zero. Since those vectors are linearly independent, the coefficients multiplying the vectors in (i) must be zero. What remains is a linear combination of

  { N^{m₁+1}(u₁), …, N^{m_n+1}(u_n) } = { N^{m₁}(v₁), …, N^{m_n}(v_n) },

which is a subset of (i); hence those coefficients are also zero. ⇒ The list in (ii) is linearly independent.

Operators on Complex Vector Spaces — (84/99)

Next, we extend (ii) to a basis of V [Linearly Independent List Extends to a Basis (Notes#2)] (we need coverage for null(N) ∩ range(N)^⊥):

  N^{m₁+1}(u₁), …, N(u₁), u₁, …, N^{m_n+1}(u_n), …, N(u_n), u_n, w₁, …, w_p    (iii)

Each N(w_k) ∈ range(N) ⇒ N(w_k) ∈ span(i) = span(N(ii)). We can find x_ℓ ∈ span(ii) so that N(w_ℓ) = N(x_ℓ); let u_{n+ℓ} = w_ℓ − x_ℓ ≠ 0. By construction N(u_{n+ℓ}) = 0, and

  N^{m₁+1}(u₁), …, N(u₁), u₁, …, N^{m_n+1}(u_n), …, N(u_n), u_n, u_{n+1}, …, u_{n+p}    (iv)

spans V, because its span contains each x_ℓ and each u_{n+ℓ}, and hence each w_ℓ, and (iii) spans V. (iv) has the same length as (iii), so we have a basis with the desired properties. √

Operators on Complex Vector Spaces — (85/99)

Jordan Basis / Jordan Form

Definition (Jordan Basis)
Suppose T ∈ L(V). A basis of V is called a Jordan basis, J(V), for T if wrt this basis T has a block diagonal matrix

  M(T; J(V)) = [ A₁       0  ]
               [    ⋱        ]
               [ 0        A_p ]

where each block A_k is upper-triangular, with diagonal entries λ_k and first super-diagonal entries all ones:

  A_k = [ λ_k  1        0   ]
        [     ⋱    ⋱        ]
        [          ⋱    1   ]
        [ 0            λ_k  ]

Theorem (Jordan Form)
Suppose V is a complex vector space. If T ∈ L(V), then there is a basis of V that is a Jordan basis for T.
Operators on Complex Vector Spaces — (86/99)

Proof (Jordan Form)
First consider a nilpotent operator N ∈ L(V), and the vectors v₁, …, v_n ∈ V given by [Basis Corresponding to a Nilpotent Operator]. For each k, N sends the first vector in the list N^{m_k}(v_k), …, N(v_k), v_k to 0, and “left-shifts” the other vectors in the list. That is, [Basis Corresponding to a Nilpotent Operator] gives a basis of V wrt which N has a block diagonal matrix, where each matrix on the diagonal has the form

  [ 0  1       0 ]
  [    ⋱   ⋱    ]
  [        ⋱  1 ]
  [ 0         0 ]

Thus the theorem holds for nilpotent operators...

Operators on Complex Vector Spaces — (87/99)

Proof (Jordan Form)
Now suppose T ∈ L(V). Let λ₁, …, λ_m be the distinct eigenvalues of T. We have the generalized eigenspace decomposition

  V = G(λ₁, T) ⊕ ⋯ ⊕ G(λ_m, T),

where each (T − λ_k I)|_G(λ_k,T) is nilpotent [Description of Operators on Complex Vector Spaces]. Thus some basis of each G(λ_k, T) is a Jordan basis for (T − λ_k I)|_G(λ_k,T). Put these bases together to get a basis of V that is a Jordan basis for T. √
Operators on Complex Vector Spaces — (88/99)

Example (Jordan Form)
Consider

  A = [ 177  548  271  −548  −356 ]
      [  19   63   14   −79   −23 ]
      [   8   24   17   −20   −20 ]
      [  42  132   55  −141   −76 ]
      [  56  176   80  −184  −105 ]

We try to find the minimal polynomial; we “randomly” select v = (1, 0, 0, 0, 0), and form B = [ v  Av  A²v  A³v  A⁴v  A⁵v ]:

  B = [ 1  177   957  4245  16761  62457 ]
      [ 0   19    66   273    996   3567 ]
      [ 0    8    48   216    864   3240 ]
      [ 0   42   204   894   3480  12882 ]
      [ 0   56   288  1272   4992  18552 ]

We need to find the first column which is linearly dependent on the previous ones; hence, we row-reduce:

Operators on Complex Vector Spaces — (89/99)

Example (Jordan Form)

  rref(B) = [ 1  0  0  −9   −45  −198 ]
            [ 0  1  0  −3   −24  −111 ]
            [ 0  0  1   5    22    86 ]
            [ 0  0  0   0     0     0 ]
            [ 0  0  0   0     0     0 ]

Hence, −9v − 3Av + 5A²v = A³v, and we have a candidate for the minimal polynomial:

  p(z) = z³ − 5z² + 3z + 9 = (z + 1)(z − 3)².

The way we have done it, starting the problem from a “random” vector, we are NOT guaranteed that this is the minimal polynomial. However, applying the polynomial to the full original matrix will give us the answer; in this case, indeed, A³ − 5A² + 3A + 9I₅ = 0. If the test had failed, p(z) would have been one of the factors in the minimal polynomial (so the work would not have been completely wasted); another “random” guess (not a linear combination of the vectors in B) would be needed to identify the remaining factors.
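The Krylov computation above can be reproduced with numpy (outside tooling, not part of the notes); the integer arithmetic is exact here.

```python
import numpy as np

A = np.array([[177, 548, 271, -548, -356],
              [ 19,  63,  14,  -79,  -23],
              [  8,  24,  17,  -20,  -20],
              [ 42, 132,  55, -141,  -76],
              [ 56, 176,  80, -184, -105]])

# Krylov matrix B = [v, Av, A^2 v, ...] from the "random" start v = e1
v = np.array([1, 0, 0, 0, 0])
B = np.column_stack([np.linalg.matrix_power(A, k) @ v for k in range(6)])

# the dependency found by row reduction: A^3 v = -9 v - 3 Av + 5 A^2 v,
# i.e. the candidate p(z) = z^3 - 5 z^2 + 3 z + 9 = (z + 1)(z - 3)^2
lhs = np.linalg.matrix_power(A, 3) @ v
rhs = -9 * v - 3 * (A @ v) + 5 * (A @ A @ v)

# confirm on the full matrix: p(A) = 0
pA = (np.linalg.matrix_power(A, 3) - 5 * np.linalg.matrix_power(A, 2)
      + 3 * A + 9 * np.eye(5, dtype=int))
```

Since p(A) = 0 and p has degree 3, p is the minimal polynomial, consistent with the eigenvalues −1 and 3 found next.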
Operators on Complex Vector Spaces — (90/99)

Example (Jordan Form)
First we compute the eigenspaces

  E(−1, A) = null(A + 1I₅) = span{ (2, 1, 0, 1, 1) },
  E(3, A)  = null(A − 3I₅) = span{ (−19, 14, −6, 5, 0), (24, −7, 4, 0, 4) }.

Clearly, these 3 vectors cannot span C⁵; we need generalized eigenspaces...

  null((A + 1I₅)²) = null(A + 1I₅)  ⇒  G(−1, A) = E(−1, A)

  null((A − 3I₅)²) = span{ (−8, 3, 0, 0, 0), (0, 0, 1, 0, 0), (11, 0, 0, 3, 0), (4, 0, 0, 0, 3) }

Whereas, technically, G(3, A) = null((A − 3I₅)⁵), there can be no growth beyond this point (including G(−1, A) we already have 5 vectors).

Operators on Complex Vector Spaces — (91/99)

For each eigenvalue λ_ℓ, the number of Jordan blocks of size k or larger equals dim(null((A − λ_ℓ I)^k)) − dim(null((A − λ_ℓ I)^{k−1})); hence the differences
• dim(null((A − 3I₅)²)) − dim(null((A − 3I₅)¹)) = 4 − 2 = 2 tells us that we have 2 blocks of size 2 or larger; and
• dim(null((A − 3I₅)³)) − dim(null((A − 3I₅)²)) = 4 − 4 = 0 tells us that we have 0 blocks of size 3 or larger.

At this point we know the Jordan form of A:

  [ −1             ]
  [     3  1       ]
  [     0  3       ]
  [           3  1 ]
  [           0  3 ]

What remains is figuring out the basis B(V) which gets us there.

Operators on Complex Vector Spaces — (92/99)

Example (Jordan Form)
We apply (A − 3I₅)^k, k = 0, 1,
to each vector in G(3, A) to form 4 Jordan chains (in each chain, the first vector is (A − 3I₅) applied to the lead vector, i.e. an eigenvector):

  { (252, 28, 8, 60, 80), (−8, 3, 0, 0, 0) },
  { (271, 14, 14, 55, 80), (0, 0, 1, 0, 0) },
  { (270, −28, 28, 30, 64), (11, 0, 0, 3, 0) },
  { (−372, 7, −28, −60, −100), (4, 0, 0, 0, 3) }.

We form a basis using the vectors from 2 of the chains, and E(−1, A), e.g.

  B(V) = { (2, 1, 0, 1, 1), (252, 28, 8, 60, 80), (−8, 3, 0, 0, 0), (271, 14, 14, 55, 80), (0, 0, 1, 0, 0) }.

Operators on Complex Vector Spaces — (93/99)

Example (Jordan Form)
... and we arrive: with A as above and the basis B(V) just constructed,

  M(A, B(V)) = [ −1  0  0  0  0 ]
               [  0  3  1  0  0 ]
               [  0  0  3  0  0 ]
               [  0  0  0  3  1 ]
               [  0  0  0  0  3 ]

Operators on Complex Vector Spaces — (94/99)

⟨⟨⟨ Live Math ⟩⟩⟩  e.g. 8D-{3, 4, 5}

Operators on Complex Vector Spaces — (95/99)

Suggested Problems
8.A—1, 2, 3, 4, 5
8.B—1, 2, 3, 4, 5
8.C—1, 2, 3, 4, 5
8.D—1, 2, 3, 4, 5
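The worked Jordan-form example for the 5×5 matrix A can be cross-checked symbolically with sympy (an outside tool, not used in the notes); `jordan_form` orders the blocks its own way, so the test only checks the block data, not the ordering.

```python
import sympy as sp

A = sp.Matrix([[177, 548, 271, -548, -356],
               [ 19,  63,  14,  -79,  -23],
               [  8,  24,  17,  -20,  -20],
               [ 42, 132,  55, -141,  -76],
               [ 56, 176,  80, -184, -105]])

P, J = A.jordan_form()                 # exact: A = P J P^{-1}

# null-space growth for lambda = 3: dimension 2, then 4, then stable
d1 = len((A - 3 * sp.eye(5)).nullspace())
d2 = len(((A - 3 * sp.eye(5))**2).nullspace())
```

The diagonal of J is the multiset {−1, 3, 3, 3, 3}, and the two super-diagonal ones correspond to the two size-2 blocks counted above.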
Operators on Complex Vector Spaces — (96/99)

Assigned Homework
HW#8, Not Due. Strongly Suggested Problems:
8.A—1, 2
8.B—1, 2
8.C—1, 2
8.D—1, 2
Expect variants on the take-home and in-class finals.

Operators on Complex Vector Spaces — (97/99)

Supplements
[Figure: Explicit References to Previous Theorems or Definitions (with count); graph over notes sections 1, 2, 3-1, 3-2, 4, 5, 6, 7-1, 7-2, 8]

Operators on Complex Vector Spaces — (98/99)

[Figure: Explicit References to Previous Theorems or Definitions; graph over notes sections 1, 2, 3-1, 3-2, 4, 5, 6, 7-1, 7-2, 8]

Operators on Complex Vector Spaces — (99/99)

Math 524: Linear Algebra — Notes #7.2 — Operators on Inner Product Spaces
Peter Blomgren ⟨blomgren.peter@gmail.com⟩
Department of Mathematics and Statistics, Dynamical Systems Group, Computational Sciences Research Center
San Diego State University, San Diego, CA 92182-7720, http://terminus.sdsu.edu/
Spring 2021 (Revised: January 20, 2021)

Operators on Inner Product Spaces — (1/56)

Outline
1 Student Learning Targets, and Objectives — SLOs: Operators on Inner Product Spaces
2 Positive Operators and Isometries — Positive Operators; Isometries
3 Polar Decomposition and Singular Value Decomposition — Polar Decomposition; Singular Value Decomposition
4 Problems, Homework, and Supplements — Suggested Problems; Assigned Homework; Supplements
Operators on Inner Product Spaces — (2/56)

Student Learning Targets, and Objectives

Target: Positive Operators
  Objective: Be able to characterize Positive Operators, and in particular construct the Unique Positive Square Root Operator.
Target: Isometries
  Objective: Be able to state the definition of, and characterize, Isometries.
Target: Polar Decomposition
  Objective: Be able to abstractly construct* the Polar Decomposition of an Operator, through identification of the appropriate Isometry and Positive Operator.
Target: Singular Value Decomposition
  Objective: Be able to abstractly construct* the Singular Value Decomposition of an Operator, by identifying the Singular Values and Orthonormal Bases.

* Generally, practical constructions must be addressed with computational tools from [Math 543].

Operators on Inner Product Spaces — (3/56)

Positive Operators

Definition (Positive Operator)
An operator T ∈ L(V) is called positive if T is self-adjoint and ⟨T(v), v⟩ ≥ 0 ∀v ∈ V.

If V is a complex vector space, then the requirement that T is self-adjoint can be dropped from the definition above:

Rewind (Over C, ⟨T(v), v⟩ ∈ R ∀v ∈ V Only for Self-Adjoint Operators [Notes#7.1])
Suppose V is a complex inner product space and T ∈ L(V). Then T is self-adjoint if and only if ⟨T(v), v⟩ ∈ R ∀v ∈ V.
Operators on Inner Product Spaces — (4/56)

Positive Operators

Example (Positive Operators)
• If U is a subspace of V, then the orthogonal projections P_U and P_{U⊥} are positive operators.
• If T ∈ L(V) is self-adjoint and b, c ∈ R are such that b² < 4c, then (T² + bT + cI) is a positive operator, as shown by the proof of [Invertible Quadratic (Operator) Expressions (Notes#7.1)].

Rewind (Invertible Quadratic (Operator) Expressions [Notes#7.1])
Suppose T ∈ L(V) is self-adjoint, and b, c ∈ R : b² < 4c; then T² + bT + cI is invertible.

Operators on Inner Product Spaces — (5/56)

Square Root

Definition (Square Root)
An operator R is called a square root of an operator T if R² = T.

Example (Square Root)
If T ∈ L(F³) is defined by T(z₁, z₂, z₃) = (z₃, 0, 0), then the operator R ∈ L(F³) defined by R(z₁, z₂, z₃) = (z₂, z₃, 0) is a square root of T:

  R²(z₁, z₂, z₃) = R(z₂, z₃, 0) = (z₃, 0, 0) = T(z₁, z₂, z₃)

Example (n-th Roots?)
If T ∈ L(F^{n+1}) is defined by T(z₁, …, z_{n+1}) = (z_{n+1}, 0, …, 0), then the operator R ∈ L(F^{n+1}) defined by R(z₁, …, z_{n+1}) = (z₂, z₃, …, z_{n+1}, 0) is an n-th root of T:

  Rⁿ(z₁, …, z_{n+1}) = R^{n−1}(z₂, z₃, …, z_{n+1}, 0) = R^{n−2}(z₃, z₄, …, z_{n+1}, 0, 0) = ⋯ = (z_{n+1}, 0, …, 0) = T(z₁, …, z_{n+1})
Operators on Inner Product Spaces — (6/56)

“Positive” vs “Non-Negative” vs “Semi-Positive”

Comment (“Positive” vs “Non-Negative” vs “Semi-Positive”)
The positive operators correspond to the numbers [0, ∞), so a more precise terminology would use the term non-negative instead of positive. However, operator-theorists consistently call these the positive operators. Restricted to the Matrix-Vector “universe” we tend to talk about (strictly) Positive Definite and Positive Semi-Definite Matrices (“Matrix-Operators,” if you want).

Operators on Inner Product Spaces — (7/56)

Characterization of Positive Operators

Theorem (Characterization of Positive Operators)
Let T ∈ L(V); then the following are equivalent:
(a) T is positive;
(b) T is self-adjoint and all the eigenvalues of T are non-negative;
(c) T has a positive square root;
(d) T has a self-adjoint square root;
(e) there exists an operator R ∈ L(V) such that T = R*R.

Matrices: Cholesky factorization; or “Hermitian LU-factorization.”

Operators on Inner Product Spaces — (8/56)

Proof (Characterization of Positive Operators)
(a)⇒(b) T is positive (⟨T(v), v⟩ ≥ 0, and by (R): definition, or (C): [Notes#7.1], we also have T = T*); suppose λ is an eigenvalue of T and v the corresponding eigenvector; then

  0 ≤ ⟨T(v), v⟩ = ⟨λv, v⟩ = λ⟨v, v⟩  ⇒  λ ∈ [0, ∞).  ⇒ (b)
Operators on Inner Product Spaces — (9/56)

Proof (Characterization of Positive Operators)
(b)⇒(c) T is self-adjoint (T = T*) and λ(T) ∈ [0, ∞). By [Complex Spectral Theorem (Notes#7.1)] or [Real Spectral Theorem (Notes#7.1)], there is an orthonormal basis v₁, …, v_n of V consisting of eigenvectors of T; let λ_k : T(v_k) = λ_k v_k; thus λ_k ∈ [0, ∞). Let R ∈ L(V) be such that

  R(v_k) = √λ_k v_k,  k = 1, …, n.

R is a positive operator, and R²(v_k) = λ_k v_k = T(v_k), k = 1, …, n; i.e. R² = T. Thus R is a positive square root of T. ⇒ (c)

Operators on Inner Product Spaces — (10/56)

(c)⇒(d) By definition, every positive operator is self-adjoint. ⇒ (d)
(d)⇒(e) Assume ∃R ∈ L(V) so that R = R* and R² = T; then T = R*R. ⇒ (e)
(e)⇒(a) Suppose ∃R ∈ L(V) : T = R*R; then T* = (R*R)* = R*(R*)* = R*R = T (which makes T self-adjoint). Also,

  ⟨T(v), v⟩ = ⟨(R*R)(v), v⟩ = ⟨R(v), R(v)⟩ ≥ 0  ∀v ∈ V,

hence T is positive. ⇒ (a)
We now have (a)⇒(b)⇒(c)⇒(d)⇒(e)⇒(a). √

Operators on Inner Product Spaces — (11/56)

Uniqueness of the Square Root

Theorem (Each Positive Operator Has Only One Positive Square Root)
Every positive operator on V has a unique positive square root.

Comment (“Positive Operators Act Like Real Numbers”)
Each non-negative number has a unique non-negative square root. Again, positive operators have “real” properties.

Comment (What is Unique?)
A positive operator can have infinitely many square roots; only one of them can be positive.

Operators on Inner Product Spaces — (12/56)

Proof (Each Positive Operator Has Only One Positive Square Root)
Suppose T ∈ L(V) is positive; let t ∈ V be an eigenvector, with λ(T) ≥ 0: T(t) = λ(T) t. Let R be a positive square root of T.

Note: We show R(t) = √λ(T) t ⇒ the action of R on the eigenvectors of T is uniquely determined. Since there is a basis of V consisting of eigenvectors of T [C/R Spectral Theorem (Notes#7.1)], this implies that R is uniquely determined.

To show that R(t) = √λ(T) t, we use the fact that [C/R Spectral Theorem (Notes#7.1)] guarantees an orthonormal basis r₁, …, r_n of V consisting of eigenvectors of R. Since R is a positive operator, λ(R) ≥ 0 ⇒ ∃λ₁^(R), …, λ_n^(R) ≥ 0 such that R(r_k) = λ_k^(R) r_k for k = 1, …, n.

Operators on Inner Product Spaces — (13/56)

Since r₁, …, r_n is a basis of V, we can write t = a₁r₁ + ⋯ + a_n r_n for a₁, …, a_n ∈ F; thus

  R(t)  = a₁λ₁^(R) r₁ + ⋯ + a_n λ_n^(R) r_n
  R²(t) = a₁(λ₁^(R))² r₁ + ⋯ + a_n(λ_n^(R))² r_n

But R² = T (by assumption, R is a positive square root of T), and T(t) = λ(T) t; therefore the above implies

  a₁ λ(T) r₁ + ⋯ + a_n λ(T) r_n = a₁(λ₁^(R))² r₁ + ⋯ + a_n(λ_n^(R))² r_n
  ⇒ a_j (λ(T) − (λ_j^(R))²) = 0,  j = 1, …, n  (either a_j = 0, or λ_j^(R) = √λ(T)).

Hence,

  t = Σ_{j: a_j ≠ 0} a_j r_j  ⇒  R(t) = Σ_{j: a_j ≠ 0} a_j √λ(T) r_j = √λ(T) t,

which is what we needed to show.
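The (b)⇒(c) construction of the positive square root (R v_k = √λ_k v_k on an orthonormal eigenbasis of T) can be sketched numerically; the matrix T below is an arbitrary example of the form C*C, not one from the notes.

```python
import numpy as np

rng = np.random.default_rng(0)
C = rng.standard_normal((4, 4))
T = C.T @ C                       # T = C* C is positive (characterization (e))

# spectral construction: eigh returns real eigenvalues and an orthonormal eigenbasis
lam, V = np.linalg.eigh(T)
# R v_k = sqrt(lam_k) v_k  (clip guards against tiny negative round-off)
R = V @ np.diag(np.sqrt(np.clip(lam, 0.0, None))) @ V.T
```

R is self-adjoint with nonnegative eigenvalues and R² = T; by the uniqueness theorem it is the only positive square root of T, even though T has many non-positive square roots.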
√

Operators on Inner Product Spaces — (14/56)

Isometries — Norm-Preserving Operators

Definition (Isometry)
An operator S ∈ L(V) is called an isometry if ‖S(v)‖ = ‖v‖ ∀v ∈ V. “An operator is an isometry if it preserves norms.”

Rewind (Orthogonal Transformations [Math-254 (Notes#5.3)])
A linear transformation T : Rⁿ → Rⁿ is called orthogonal if it preserves the length of vectors: ‖T(x⃗)‖ = ‖x⃗‖ ∀x⃗ ∈ Rⁿ. If T(x⃗) = Ax⃗ is an orthogonal transformation, we say that A is an orthogonal (or unitary, when it has complex entries) matrix.

Operators on Inner Product Spaces — (15/56)

Example
Suppose λ₁, …, λ_n are scalars with |λ_k| = 1, and S ∈ L(V) satisfies S(s_j) = λ_j s_j for some orthonormal basis s₁, …, s_n of V. We demonstrate that S is an isometry. Let v ∈ V; then¹

  v = ⟨v, s₁⟩s₁ + ⋯ + ⟨v, s_n⟩s_n,
  ‖v‖² = |⟨v, s₁⟩|² + ⋯ + |⟨v, s_n⟩|²,
  S(v) = ⟨v, s₁⟩S(s₁) + ⋯ + ⟨v, s_n⟩S(s_n) = λ₁⟨v, s₁⟩s₁ + ⋯ + λ_n⟨v, s_n⟩s_n,
  ‖S(v)‖² = |λ₁|²|⟨v, s₁⟩|² + ⋯ + |λ_n|²|⟨v, s_n⟩|² = |⟨v, s₁⟩|² + ⋯ + |⟨v, s_n⟩|² = ‖v‖².

¹ [Writing a Vector as a Linear Combination of Orthonormal Basis (Notes#6)]

Operators on Inner Product Spaces — (16/56)

Characterization of Isometries

Theorem (Characterization of Isometries)
Suppose S ∈ L(V); then the following are equivalent:
(a) S is an isometry;
(b) ⟨S(u), S(v)⟩ = ⟨u, v⟩ ∀u, v ∈ V;
(c) S(u₁), …, S(u_n) is orthonormal for every orthonormal list of vectors u₁, …, u_n in V;
(d) there exists an orthonormal list of vectors u₁, …, u_n of V such that S(u₁), …, S(u_n) is orthonormal;
(e) S*S = I;
(f) SS* = I;
(g) S* is an isometry;
(h) S is invertible and S⁻¹ = S*.

Operators on Inner Product Spaces — (17/56)

Some Help for the Proof

Theorem (The Inner Product on a Real Inner Product Space)
Suppose V is a real inner product space; then ∀u, v ∈ V,

  ⟨u, v⟩ = (‖u + v‖² − ‖u − v‖²) / 4.

Theorem (The Inner Product on a Complex Inner Product Space)
Suppose V is a complex inner product space; then ∀u, v ∈ V,

  ⟨u, v⟩ = (‖u + v‖² − ‖u − v‖² + i‖u + iv‖² − i‖u − iv‖²) / 4.

The proofs for these identities are by “direct computation” (very similar to what we did in [Notes#7.1]). The bottom line is that we can express the inner product in terms of the norm.

Operators on Inner Product Spaces — (18/56)

Proof (Characterization of Isometries)
(a)⇒(b) Suppose S is an isometry; the “help theorems” show that inner products can be computed from norms. Since S preserves norms, S preserves inner products. ⇒ (b)
(b)⇒(c) Assume S preserves inner products, and let u₁, …, u_n be an orthonormal list of vectors in V; S(u₁), …, S(u_n) must be an orthonormal list of vectors, since ⟨S(u_i), S(u_j)⟩ = ⟨u_i, u_j⟩ = δ_ij. ⇒ (c)
(c)⇒(d) √
Operators on Inner Product Spaces — (19/56)

Proof (Characterization of Isometries)
(d)⇒(e) Let u₁, …, u_n be an orthonormal basis of V such that S(u₁), …, S(u_n) is orthonormal. Thus

  ⟨S*S(u_j), u_k⟩ = ⟨S(u_j), S(u_k)⟩ = ⟨u_j, u_k⟩.

All v, w ∈ V can be written as unique linear combinations of u₁, …, u_n; therefore ⟨S*S(v), w⟩ = ⟨v, w⟩, so S*S = I. ⇒ (e)
(e)⇒(f) S*S = I ⇒ { S*(SS*) = S*, (SS*)S = S } ⇒ SS* = I. ⇒ (f)
(f)⇒(g) SS* = I; let v ∈ V; then

  ‖S*(v)‖² = ⟨S*(v), S*(v)⟩ = ⟨SS*(v), v⟩ = ⟨v, v⟩ = ‖v‖²,

so S* is an isometry. ⇒ (g)

Operators on Inner Product Spaces — (20/56)

Proof (Characterization of Isometries)
(g)⇒(h) S* is an isometry. We can apply the previously shown parts of the theorem, in particular (a)⇒(e) and (a)⇒(f), to S* (with (S*)* = S). This gives S*S = SS* = I, which means that S is invertible, and S⁻¹ = S*. ⇒ (h)
(h)⇒(a) S is invertible, and S⁻¹ = S*; let v ∈ V; then

  ‖S(v)‖² = ⟨S(v), S(v)⟩ = ⟨(S*S)(v), v⟩ = ⟨v, v⟩ = ‖v‖²,

that is, S is an isometry. ⇒ (a)
We now have (a)⇒(b)⇒(c)⇒(d)⇒(e)⇒(f)⇒(g)⇒(h)⇒(a). √

Operators on Inner Product Spaces — (21/56)

Description of Isometries when F = C

Theorem (Description of Isometries when F = C)
Suppose V is a complex inner product space and S ∈ L(V).
Then the following are equivalent:
(a) S is an isometry;
(b) there is an orthonormal basis of V consisting of eigenvectors of S whose corresponding eigenvalues all have absolute value 1.

Proof (Description of Isometries when F = C)
The example on slide 16 shows (b)⇒(a). To show (a)⇒(b), we assume S is an isometry and use [Complex Spectral Theorem (Notes#7.1)] to guarantee an orthonormal basis s₁, …, s_n of V consisting of eigenvectors of S. Let λ₁, …, λ_n be the corresponding eigenvalues. Then

  |λ_j| = ‖λ_j s_j‖ = ‖S(s_j)‖ = ‖s_j‖ = 1,

that is, |λ_j| = 1, j = 1, …, n. √

Upcoming: [Description of Isometries when F = R (Notes#7.2–Preview)].

Operators on Inner Product Spaces — (22/56)

“Preview”

Preview (Description of Isometries when F = R)
Suppose V is a real inner product space and S ∈ L(V). Then the following are equivalent:
(a) S is an isometry
(b) There is an orthon...
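Several of the equivalences in [Characterization of Isometries] can be spot-checked numerically for a matrix with orthonormal columns; Q below is an arbitrary example obtained from a QR factorization, not a matrix from the notes.

```python
import numpy as np

rng = np.random.default_rng(1)
# QR factorization of a random matrix: Q has orthonormal columns (real "unitary")
Q, _ = np.linalg.qr(rng.standard_normal((5, 5)))

v = rng.standard_normal(5)
# (a) norm preservation  <->  (e) Q* Q = I  <->  (h) Q^{-1} = Q*
norm_preserved = np.allclose(np.linalg.norm(Q @ v), np.linalg.norm(v))
```

For complex matrices the same checks apply with the conjugate transpose in place of `Q.T`.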

Explanation & Answer

View attached explanation and answer. Let me know if you have any questions. Here are my answers to your assignment. Both documents are the same - one is in MS Word format, and the other in PDF - for your convenience.



1. 〈f, g〉 = ∫₀^∞ f(x)g(x)e⁻ˣ dx; ∀ integers p, q ≥ 0: 〈xᵖ, x^q〉 = (p + q)!.
Standard polynomial basis {1, x, x², x³}.

Let u₀ = 1.

u₁ = x − (〈x, 1〉/‖1‖²)·1 = x − (〈x¹, x⁰〉/〈x⁰, x⁰〉)·1 = x − (1 + 0)!/(0 + 0)! = x − 1.

u₂ = x² − (〈x², 1〉/‖1‖²)·1 − (〈x², x − 1〉/‖x − 1‖²)·(x − 1)
  a. 〈x², 1〉 = (2 + 0)! = 2
  b. ‖1‖² = 1
  c. 〈x², x − 1〉 = ∫₀^∞ x²(x − 1)e⁻ˣ dx = ∫₀^∞ x³e⁻ˣ dx − ∫₀^∞ x²e⁻ˣ dx = 6 − 2 = 4
  d. ‖x − 1‖² = 〈x − 1, x − 1〉 = ∫₀^∞ (x² − 2x + 1)e⁻ˣ dx = 2 − 2(1) + 1 = 1

u₂ = x² − (2/1)·1 − (4/1)·(x − 1) = x² − 2 − 4x + 4 = x² − 4x + 2.

u₃ = x³ − (〈x³, 1〉/‖1‖²)·1 − (〈x³, x − 1〉/‖x − 1‖²)·(x − 1) − (〈x³, x² − 4x + 2〉/‖x² − 4x + 2‖²)·(x² − 4x + 2)
  a. 〈x³, 1〉 = (3 + 0)! = 3! = 6
  b. ‖1‖² = 1
  c. 〈x³, x − 1〉 = ∫₀^∞ x³(x − 1)e⁻ˣ dx = ∫₀^∞ x⁴e⁻ˣ dx − ∫₀^∞ x³e⁻ˣ dx = 24 − 6 = 18
  d. ‖x − 1‖² = 1
  e. 〈x³, x² − 4x + 2〉 = ∫₀^∞ x³(x² − 4x + 2)e⁻ˣ dx = ∫₀^∞ x⁵e⁻ˣ dx − 4∫₀^∞ x⁴e⁻ˣ dx + 2∫₀^∞ x³e⁻ˣ dx = 120 − 96 + 12 = 36
  f. ‖x² − 4x + 2‖² = 〈x² − 4x + 2, x² − 4x + 2〉 = ∫₀^∞ (x⁴ − 8x³ + 20x² − 16x + 4)e⁻ˣ dx = 24 − 8(6) + 20(2) − 16(1) + 4(1) = 4

u₃ = x³ − (6/1)·1 − (18/1)·(x − 1) − (36/4)·(x² − 4x + 2) = x³ − 9x² + 18x − 6.

Normalizing: since ‖u₀‖ = 0!, ‖u₁‖ = 1!, ‖u₂‖ = √4 = 2!, ‖u₃‖ = √36 = 3!, dividing u_n by n! yields orthonormal polynomials, and the overall sign (−1)ⁿ is the conventional choice:

  L₀ = ((−1)⁰/0!)·u₀ = 1
  L₁ = ((−1)¹/1!)·u₁ = (−1)(x − 1) = 1 − x
  L₂ = ((−1)²/2!)·u₂ = (1/2)(x² − 4x + 2) = x²/2 − 2x + 1
  L₃ = ((−1)³/3!)·u₃ = −(1/6)(x³ − 9x² + 18x − 6) = −x³/6 + 3x²/2 − 3x + 1

∴ L = { 1, 1 − x, x²/2 − 2x + 1, −x³/6 + 3x²/2 − 3x + 1 }
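The Gram–Schmidt computation above can be reproduced symbolically; the sketch below uses sympy (an outside tool, not required by the assignment) with the stated inner product.

```python
import sympy as sp

x = sp.symbols('x')

def ip(f, g):
    # <f, g> = integral_0^oo f(x) g(x) e^{-x} dx
    return sp.integrate(f * g * sp.exp(-x), (x, 0, sp.oo))

# Gram-Schmidt on the standard basis {1, x, x^2, x^3} (unnormalized u_n)
ortho = []
for p in [sp.Integer(1), x, x**2, x**3]:
    u = p - sum(ip(p, q) / ip(q, q) * q for q in ortho)
    ortho.append(sp.expand(u))

# normalize; since ||u_n|| = n!, dividing by sqrt(<u_n, u_n>) and attaching
# the conventional sign (-1)^n gives the orthonormal Laguerre polynomials
laguerre = [sp.expand((-1)**n * u / sp.sqrt(ip(u, u)))
            for n, u in enumerate(ortho)]
```

The factorial identity 〈xᵖ, x^q〉 = (p + q)! is recovered automatically here, since sympy evaluates each Gamma-function integral exactly.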

2. Given that ℬ_s(ℂ⁷) = {e₁, e₂, e₃, e₄, e₅, e₆, e₇} is the standard basis of ℂ⁷, and

  T(z₁, …, z₇) = (πz₁ + z₂ + z₃ + z₄, πz₂ + z₃ + z₄, πz₃ + z₄, πz₄, √7z₅ + z₆ + z₇, √7z₆ + z₇, √7z₇).

a. For M(T, ℬ_s(ℂ⁷)), the columns are the images of the basis vectors:

  T(e₁) = (π, 0, 0, 0, 0, 0, 0)
  T(e₂) = (1, π, 0, 0, 0, 0, 0)
  T(e₃) = (1, 1, π, 0, 0, 0, 0)
  T(e₄) = (1, 1, 1, π, 0, 0, 0)
  T(e₅) = (0, 0, 0, 0, √7, 0, 0)
  T(e₆) = (0, 0, 0, 0, 1, √7, 0)
  T(e₇) = (0, 0, 0, 0, 1, 1, √7)

The solution will be:

  M(T) = [ π  1  1  1  0   0   0  ]
         [ 0  π  1  1  0   0   0  ]
         [ 0  0  π  1  0   0   0  ]
         [ 0  0  0  π  0   0   0  ]
         [ 0  0  0  0  √7  1   1  ]
         [ 0  0  0  0  0   √7  1  ]
         [ 0  0  0  0  0   0   √7 ]

b. M(T) is upper triangular, so the eigenvalues are the diagonal entries: λ₁ = π (algebraic multiplicity 4) and λ₂ = √7 (algebraic multiplicity 3).

c. For each eigenvalue, λ_k:

i. Find the eigenspace.
For λ = π, we find the solution set of (M(T) − πI₇)x = 0:

  [ 0  1  1  1  0     0     0    ]   [x₁]   [0]
  [ 0  0  1  1  0     0     0    ]   [x₂]   [0]
  [ 0  0  0  1  0     0     0    ]   [x₃]   [0]
  [ 0  0  0  0  0     0     0    ] · [x₄] = [0]
  [ 0  0  0  0  √7−π  1     1    ]   [x₅]   [0]
  [ 0  0  0  0  0     √7−π  1    ]   [x₆]   [0]
  [ 0  0  0  0  0     0     √7−π ]   [x₇]   [0]

Back-substitution in the first block gives x₄ = 0, then x₃ = 0, then x₂ = 0; and since √7 − π ≠ 0, the second block forces x₇ = x₆ = x₅ = 0, while x₁ is free. Hence

  E(π, T) = span{ e₁ }.

By the same argument applied to (M(T) − √7 I₇)x = 0 (now the first block is invertible and the second block is strictly upper triangular), x₁ = x₂ = x₃ = x₄ = 0 and x₆ = x₇ = 0, with x₅ free:

  E(√7, T) = span{ e₅ }.

ii. Find the generalized eigenspace.
M(T) − πI₇ is block diagonal: its upper 4×4 block is strictly upper triangular, hence nilpotent (its 4th power is zero), and its lower 3×3 block is invertible; therefore

  G(π, T) = null((T − πI)⁷) = span{ e₁, e₂, e₃, e₄ }.

Likewise, the lower 3×3 block of M(T) − √7 I₇ is nilpotent and the upper 4×4 block is invertible, so

  G(√7, T) = null((T − √7 I)⁷) = span{ e₅, e₆, e₇ }.

As a check, dim G(π, T) + dim G(√7, T) = 4 + 3 = 7 = dim ℂ⁷.
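The eigenstructure of T can be verified exactly with sympy (an outside tool, not required by the assignment), using the coefficients π and √7 from the problem statement.

```python
import sympy as sp

pi, r7 = sp.pi, sp.sqrt(7)

# M(T) in the standard basis of C^7; columns are T(e_j)
M = sp.Matrix([
    [pi,  1,  1,  1,  0,  0,  0],
    [0,  pi,  1,  1,  0,  0,  0],
    [0,   0, pi,  1,  0,  0,  0],
    [0,   0,  0, pi,  0,  0,  0],
    [0,   0,  0,  0, r7,  1,  1],
    [0,   0,  0,  0,  0, r7,  1],
    [0,   0,  0,  0,  0,  0, r7],
])

# eigenspaces E(lambda, T) = null(M - lambda I)
E_pi = (M - pi * sp.eye(7)).nullspace()
E_r7 = (M - r7 * sp.eye(7)).nullspace()

# generalized eigenspaces G(lambda, T) = null((M - lambda I)^7)
G_pi = ((M - pi * sp.eye(7))**7).nullspace()
G_r7 = ((M - r7 * sp.eye(7))**7).nullspace()
```

Each eigenspace is one-dimensional while the generalized eigenspaces have dimensions 4 and 3, so M(T) is not diagonalizable: each eigenvalue carries a single Jordan chain.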

