In mathematics, an invariant subspace of a linear mapping T : V → V, i.e. from some vector space V to itself, is a subspace W of V that is preserved by T. More generally, an invariant subspace for a collection of linear mappings is a subspace preserved by each mapping individually.
Consider a vector space [math]\displaystyle{ V }[/math] and a linear map [math]\displaystyle{ T: V \to V. }[/math] A subspace [math]\displaystyle{ W \subset V }[/math] is called an invariant subspace for [math]\displaystyle{ T }[/math], or equivalently, T-invariant, if T transforms any vector [math]\displaystyle{ \mathbf{v} \in W }[/math] back into W. In formulas, this can be written [math]\displaystyle{ \mathbf{v} \in W \implies T(\mathbf{v}) \in W }[/math] or[1] [math]\displaystyle{ TW\subseteq W\text{.} }[/math]
In this case, T restricts to an endomorphism of W:[2][math]\displaystyle{ T|_W : W \to W\text{;}\quad T|_W(\mathbf{x}) = T(\mathbf{x})\text{.} }[/math]
The existence of an invariant subspace also has a matrix formulation. Pick a basis C for W and complete it to a basis B of V. With respect to B, the operator T has the form [math]\displaystyle{ T = \begin{bmatrix} T|_W & T_{12} \\ 0 & T_{22} \end{bmatrix} }[/math] for some T12 and T22.
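For illustration, this block structure can be checked numerically. The following sketch (the matrix, the subspace W and the completed basis are arbitrary choices made for the example) builds an operator on R^3 that preserves a plane W and recovers the block upper-triangular form by a change of basis:

```python
import numpy as np

# A hypothetical example: an operator T on R^3 that preserves the plane
# W = span{w1, w2}. Completing {w1, w2} to a basis B of R^3 and conjugating
# exhibits the block upper-triangular form described above.
w1 = np.array([1.0, 1.0, 0.0])
w2 = np.array([0.0, 1.0, 1.0])
b3 = np.array([1.0, 0.0, 0.0])                 # completes {w1, w2} to a basis B
B = np.column_stack([w1, w2, b3])

# Prescribe T in the basis B with a zero lower-left block, so that W is invariant.
M = np.array([[2.0, 1.0, 5.0],
              [0.0, 3.0, 7.0],
              [0.0, 0.0, 4.0]])
T = B @ M @ np.linalg.inv(B)

# T maps w1 (and likewise w2) back into W: it has coordinates in {w1, w2}.
coords = np.linalg.lstsq(B[:, :2], T @ w1, rcond=None)[0]
print(np.allclose(B[:, :2] @ coords, T @ w1))  # True: T(w1) lies in W

# In the basis B, the matrix of T has the block form [[T|_W, T12], [0, T22]].
print(np.round(np.linalg.inv(B) @ T @ B, 10))  # recovers M, zero lower-left block
```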
Any linear map [math]\displaystyle{ T : V \to V }[/math] admits two obvious invariant subspaces: the whole space [math]\displaystyle{ V }[/math] itself and the zero subspace [math]\displaystyle{ \{0\} }[/math]. These are the trivial invariant subspaces. Certain linear operators have no non-trivial invariant subspace: for instance, rotation of a two-dimensional real vector space by an angle that is not a multiple of π. However, the axis of a rotation in three dimensions is always an invariant subspace.
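A quick numerical check illustrates the contrast (the rotation angle below is an arbitrary choice): a plane rotation has no real eigenvalue, hence no invariant line, while a rotation of three-dimensional space fixes its axis.

```python
import numpy as np

theta = 0.7  # an arbitrary angle that is not a multiple of pi

# Rotation of the real plane: the eigenvalues are exp(+-i*theta), so there is
# no real eigenvector and therefore no invariant line.
R2 = np.array([[np.cos(theta), -np.sin(theta)],
               [np.sin(theta),  np.cos(theta)]])
print(np.linalg.eigvals(R2))          # complex conjugate pair, no real eigenvalue

# Rotation about the z-axis in R^3: the axis span{e3} is an invariant subspace.
R3 = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0,            0.0,           1.0]])
e3 = np.array([0.0, 0.0, 1.0])
print(np.allclose(R3 @ e3, e3))       # True: the axis is fixed pointwise
```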
If U is a 1-dimensional invariant subspace for the operator T, then for every vector v ∈ U the vectors v and Tv must be linearly dependent. Thus [math]\displaystyle{ \forall\mathbf{v}\in U\;\exists\alpha\in\mathbb{R}: T\mathbf{v}=\alpha\mathbf{v}\text{.} }[/math] In fact, the scalar α does not depend on v.
The equation above formulates an eigenvalue problem. Any eigenvector for T spans a 1-dimensional invariant subspace, and vice versa. In particular, a nonzero invariant vector (i.e. a fixed point of T) spans an invariant subspace of dimension 1.
As a consequence of the fundamental theorem of algebra, every linear operator on a nonzero finite-dimensional complex vector space has an eigenvector. Therefore, every such linear operator in dimension at least two has a non-trivial invariant subspace.
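Continuing the rotation example, but now over the complex numbers (an illustrative choice), a numerical eigensolver produces an eigenvector, and the complex line it spans is an invariant subspace:

```python
import numpy as np

theta = 0.7
R2 = np.array([[np.cos(theta), -np.sin(theta)],
               [np.sin(theta),  np.cos(theta)]])

# Over C the rotation does have eigenvectors; each one spans a 1-dimensional
# invariant subspace, in line with the fundamental theorem of algebra.
eigvals, eigvecs = np.linalg.eig(R2.astype(complex))
v = eigvecs[:, 0]
print(np.allclose(R2 @ v, eigvals[0] * v))   # True: T(v) is a multiple of v
```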
Determining whether a given subspace W is invariant under T is ostensibly a problem of a geometric nature. A matrix representation allows one to phrase the problem algebraically.
Write V as the direct sum W⊕W′; a suitable W′ can always be chosen by extending a basis of W. The projection operator P onto W along W′ has matrix representation [math]\displaystyle{ P = \begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix} : \begin{matrix} W \\ \oplus \\ W' \end{matrix} \rightarrow \begin{matrix} W \\ \oplus \\ W' \end{matrix} \;. }[/math]
A straightforward calculation shows that W is T-invariant if and only if PTP = TP.
If 1 is the identity operator, then 1-P is projection onto W′. The equation TP = PT holds if and only if both ran P and ran(1 − P) are invariant under T. In that case, T has matrix representation [math]\displaystyle{ T = \begin{bmatrix} T_{11} & 0 \\ 0 & T_{22} \end{bmatrix} : \begin{matrix} \operatorname{Ran}P \\ \oplus \\ \operatorname{Ran}(1-P) \end{matrix} \rightarrow \begin{matrix} \operatorname{Ran}P \\ \oplus \\ \operatorname{Ran}(1-P) \end{matrix} \;. }[/math]
Colloquially, a projection that commutes with T "diagonalizes" T.
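Both criteria can be verified numerically. The sketch below (reusing a hypothetical T, W and complement W′ of the same flavour as the earlier example) confirms that PTP = TP holds because W is T-invariant, while PT = TP fails because W′ is not:

```python
import numpy as np

# A hypothetical example: W = span{w1, w2}, complement W' = span{b3},
# and an operator T that maps W into W (but not W' into W').
w1, w2, b3 = np.array([1., 1., 0.]), np.array([0., 1., 1.]), np.array([1., 0., 0.])
B = np.column_stack([w1, w2, b3])
M = np.array([[2., 1., 5.],
              [0., 3., 7.],
              [0., 0., 4.]])          # block upper-triangular in the basis B
T = B @ M @ np.linalg.inv(B)

# P projects onto W along W': identity on W, zero on W'.
P = B @ np.diag([1., 1., 0.]) @ np.linalg.inv(B)

print(np.allclose(P @ T @ P, T @ P))  # True:  W is T-invariant
print(np.allclose(P @ T, T @ P))      # False: W' is not T-invariant (T12 != 0)
```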
As the above examples indicate, the invariant subspaces of a given linear transformation T shed light on the structure of T. When V is a finite-dimensional vector space over an algebraically closed field, linear transformations acting on V are characterized (up to similarity) by the Jordan canonical form, which decomposes V into invariant subspaces of T. Many fundamental questions regarding T can be translated to questions about invariant subspaces of T.
The set of T-invariant subspaces of V is sometimes called the invariant-subspace lattice of T and written Lat(T). As the name suggests, it is a (modular) lattice, with meets and joins given by (respectively) set intersection and linear span. A minimal element in Lat(T) is said to be a minimal invariant subspace.
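As a standard illustration, if [math]\displaystyle{ T }[/math] is the nilpotent operator on [math]\displaystyle{ \mathbb{C}^3 }[/math] determined by [math]\displaystyle{ Te_1 = 0,\; Te_2 = e_1,\; Te_3 = e_2 }[/math] (a single Jordan block), then Lat(T) is the chain [math]\displaystyle{ \{0\} \subset \operatorname{span}\{e_1\} \subset \operatorname{span}\{e_1, e_2\} \subset \mathbb{C}^3\text{,} }[/math] so the only non-trivial invariant subspaces of T are the two middle terms.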
In the study of infinite-dimensional operators, Lat(T) is sometimes restricted to only the closed invariant subspaces.
Given a collection T of operators, a subspace is called T-invariant if it is invariant under each T ∈ T.
As in the single-operator case, the invariant-subspace lattice of T, written Lat(T), is the set of all T-invariant subspaces, and bears the same meet and join operations. Set-theoretically, it is the intersection [math]\displaystyle{ \mathrm{Lat}(\mathcal{T})=\bigcap_{T\in\mathcal{T}}{\mathrm{Lat}(T)}\text{.} }[/math]
Let End(V) be the set of all linear operators on V. Then Lat(End(V)) = {0, V}: for any nonzero proper subspace W, some operator sends a vector of W outside W, so no non-trivial subspace is invariant under every operator.
Given a representation of a group G on a vector space V, we have a linear transformation T(g) : V → V for every element g of G. If a subspace W of V is invariant with respect to all these transformations, then it is a subrepresentation and the group G acts on W in a natural way. The same construction applies to representations of an algebra.
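As a small illustration (the group and the representation below are arbitrary choices), take the two-element group acting on R^2 by swapping the coordinates; the lines spanned by (1, 1) and (1, −1) are invariant under both group elements and are therefore subrepresentations:

```python
import numpy as np

# Representation of the two-element group {e, s} on R^2, where s swaps the
# coordinates. Both group elements preserve span{(1, 1)} and span{(1, -1)},
# so each line is a subrepresentation (the trivial and the sign representation).
rho_e = np.eye(2)
rho_s = np.array([[0.0, 1.0],
                  [1.0, 0.0]])

for v in (np.array([1.0, 1.0]), np.array([1.0, -1.0])):
    for g in (rho_e, rho_s):
        gv = g @ v
        # g(v) stays in span{v} exactly when v and g(v) are linearly dependent.
        print(np.isclose(np.linalg.det(np.column_stack([v, gv])), 0.0))  # True
```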
As another example, let T ∈ End(V) and Σ be the algebra generated by {1, T}, where 1 is the identity operator. Then Lat(T) = Lat(Σ), since a subspace invariant under T is automatically invariant under every polynomial in T.
Just as the fundamental theorem of algebra ensures that every linear transformation acting on a finite-dimensional complex vector space has a nontrivial invariant subspace, the fundamental theorem of noncommutative algebra asserts that Lat(Σ) contains nontrivial elements for certain Σ.
Theorem (Burnside) — Assume V is a complex vector space of finite dimension. For every proper subalgebra Σ of End(V), Lat(Σ) contains a nontrivial element.
One consequence is that every commuting family in L(V) can be simultaneously upper-triangularized. To see this, note that an upper-triangular matrix representation corresponds to a flag of invariant subspaces, that a commuting family generates a commuting algebra, and that End(V) is not commutative when dim(V) ≥ 2.
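The following sketch illustrates a special case of this consequence rather than a general algorithm: for a commuting family consisting of polynomials in a single random matrix, the complex Schur basis of that matrix upper-triangularizes every member simultaneously (the matrices and the use of scipy.linalg.schur are illustrative choices).

```python
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(0)
T = rng.standard_normal((4, 4))

# A commuting family: polynomials in the same operator T commute with each other.
A = T @ T + 3.0 * T
B = 2.0 * T - np.eye(4)

# Complex Schur decomposition T = Z U Z*, with U upper triangular and Z unitary.
U, Z = schur(T.astype(complex), output='complex')

def is_upper_triangular(M, tol=1e-10):
    return np.all(np.abs(np.tril(M, k=-1)) < tol)

# The same unitary change of basis upper-triangularizes every member of the
# family, reflecting a common flag of invariant subspaces.
print(is_upper_triangular(Z.conj().T @ T @ Z))
print(is_upper_triangular(Z.conj().T @ A @ Z))
print(is_upper_triangular(Z.conj().T @ B @ Z))
```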
If A is an algebra, one can define a left regular representation Φ of A on itself: the map Φ given by Φ(a)b = ab is a homomorphism from A to L(A), the algebra of linear transformations on A.
The invariant subspaces of Φ are precisely the left ideals of A. A left ideal M of A gives a subrepresentation of A on M.
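A small computational sketch of this correspondence (the choice A = M_2(R) and the particular left ideal are illustrative): identifying A with R^4 by stacking columns turns left multiplication Φ(a) into the Kronecker product 1 ⊗ a, and the left ideal of matrices whose second column vanishes is then visibly an invariant subspace.

```python
import numpy as np

# Identify the algebra A = M_2(R) with R^4 via column-major vectorization,
# so that vec(a @ b) = kron(I, a) @ vec(b); Phi(a) = kron(I, a) is the left
# regular representation in these coordinates.
def vec(m):
    return m.flatten(order='F')

def Phi(a):
    return np.kron(np.eye(2), a)

rng = np.random.default_rng(1)
a = rng.standard_normal((2, 2))

# The left ideal M = {m : second column of m is zero} corresponds to vectors
# whose last two coordinates vanish; Phi(a) is block diagonal, so it maps this
# subspace into itself, illustrating "invariant subspaces = left ideals".
m = np.array([[1.0, 0.0],
              [2.0, 0.0]])             # an element of the left ideal M
image = Phi(a) @ vec(m)
print(np.allclose(image[2:], 0.0))     # True: Phi(a) keeps vec(m) inside M
print(np.allclose(image, vec(a @ m)))  # True: Phi(a) really is left multiplication
```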
If M is a left ideal of A, then the left regular representation Φ descends to a representation Φ' on the quotient vector space A/M. If [b] denotes an equivalence class in A/M, then Φ'(a)[b] = [ab]. The kernel of the representation Φ' is the set {a ∈ A | ab ∈ M for all b}.
The representation Φ' is irreducible if and only if M is a maximal left ideal, since a subspace V ⊂ A/M is invariant under {Φ'(a) | a ∈ A} if and only if its preimage under the quotient map, V + M, is a left ideal in A.
The invariant subspace problem concerns the case where V is a separable Hilbert space over the complex numbers, of dimension > 1, and T is a bounded operator. The problem is to decide whether every such T has a non-trivial, closed, invariant subspace. As of 2021, the problem remains unsolved.
In the more general case where V is assumed to be a Banach space, there is an example of an operator without a non-trivial closed invariant subspace due to Per Enflo (1976). A concrete example of such an operator was produced in 1985 by Charles Read.
Related to invariant subspaces are so-called almost-invariant-halfspaces (AIHS's). A closed subspace [math]\displaystyle{ Y }[/math] of a Banach space [math]\displaystyle{ X }[/math] is said to be almost-invariant under an operator [math]\displaystyle{ T \in \mathcal{B}(X) }[/math] if [math]\displaystyle{ TY \subseteq Y+E }[/math] for some finite-dimensional subspace [math]\displaystyle{ E }[/math]; equivalently, [math]\displaystyle{ Y }[/math] is almost-invariant under [math]\displaystyle{ T }[/math] if there is a finite-rank operator [math]\displaystyle{ F \in \mathcal{B}(X) }[/math] such that [math]\displaystyle{ (T+F)Y \subseteq Y }[/math], i.e. if [math]\displaystyle{ Y }[/math] is invariant (in the usual sense) under [math]\displaystyle{ T+F }[/math]. In this case, the minimum possible dimension of [math]\displaystyle{ E }[/math] (or rank of [math]\displaystyle{ F }[/math]) is called the defect.
Clearly, every finite-dimensional and finite-codimensional subspace is almost-invariant under every operator. Thus, to make things nontrivial, we say that [math]\displaystyle{ Y }[/math] is a halfspace whenever it is a closed subspace with infinite dimension and infinite codimension.
The AIHS problem asks whether every operator admits an AIHS. In the complex setting it has already been solved; that is, if [math]\displaystyle{ X }[/math] is a complex infinite-dimensional Banach space and [math]\displaystyle{ T \in \mathcal{B}(X) }[/math] then [math]\displaystyle{ T }[/math] admits an AIHS of defect at most 1. It is not currently known whether the same holds if [math]\displaystyle{ X }[/math] is a real Banach space. However, some partial results have been established: for instance, any self-adjoint operator on an infinite-dimensional real Hilbert space admits an AIHS, as does any strictly singular (or compact) operator acting on a real infinite-dimensional reflexive space.
Roman, Stephen (2008). Advanced Linear Algebra. Graduate Texts in Mathematics (Third ed.). Springer. ISBN 978-0-387-72828-5.
Original source: https://en.wikipedia.org/wiki/Invariant_subspace