Shift matrix
In mathematics, a shift matrix is a binary matrix with ones only on the superdiagonal or subdiagonal and zeroes elsewhere. A shift matrix U with ones on the superdiagonal is an upper shift matrix; a shift matrix L with ones on the subdiagonal is a lower shift matrix. The (i, j)-th components of U and L are
- {\displaystyle U_{ij}=\delta _{i+1,j},\quad L_{ij}=\delta _{i,j+1},}
where {\displaystyle \delta _{ij}} is the Kronecker delta symbol.
For example, the 5 × 5 shift matrices are
- {\displaystyle U_{5}={\begin{pmatrix}0&1&0&0&0\0円&0&1&0&0\0円&0&0&1&0\0円&0&0&0&1\0円&0&0&0&0\end{pmatrix}}\quad L_{5}={\begin{pmatrix}0&0&0&0&0\1円&0&0&0&0\0円&1&0&0&0\0円&0&1&0&0\0円&0&0&1&0\end{pmatrix}}.}
Clearly, the transpose of a lower shift matrix is an upper shift matrix and vice versa.
As a linear transformation, a lower shift matrix shifts the components of a column vector one position down, with a zero appearing in the first position. An upper shift matrix shifts the components of a column vector one position up, with a zero appearing in the last position.[1]
Premultiplying a matrix A by a lower shift matrix shifts the rows of A down by one position, with a row of zeroes appearing at the top. Postmultiplying by a lower shift matrix shifts the columns of A left by one position, with a column of zeroes appearing on the right. The corresponding operations with an upper shift matrix shift in the opposite directions.
Clearly all finite-dimensional shift matrices are nilpotent; an n × n shift matrix S becomes the zero matrix when raised to the power of its dimension n.
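As an illustration of these actions, the following is a minimal NumPy sketch (the dimension n = 5, the sample vector, and the sample matrix are arbitrary choices for this example) that constructs U and L, shifts a vector and a matrix, and confirms nilpotency:

```python
import numpy as np

n = 5
U = np.eye(n, k=1)    # ones on the superdiagonal: upper shift matrix
L = np.eye(n, k=-1)   # ones on the subdiagonal: lower shift matrix

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
print(L @ x)          # [0. 1. 2. 3. 4.] -- shifted down, zero in the first position
print(U @ x)          # [2. 3. 4. 5. 0.] -- shifted up, zero in the last position

A = np.arange(1.0, 26.0).reshape(n, n)
print(L @ A)          # rows of A shifted down, zero row at the top
print(A @ L)          # columns of A shifted left, zero column on the right

print(np.linalg.matrix_power(U, n))   # the zero matrix: U^n = 0
```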
Shift matrices act on shift spaces. The infinite-dimensional shift matrices are particularly important for the study of ergodic systems. Important examples of infinite-dimensional shifts are the Bernoulli shift, which acts as a shift on Cantor space, and the Gauss map, which acts as a shift on the space of continued fractions (that is, on Baire space).
Properties
Let L and U be the n × n lower and upper shift matrices, respectively. The following properties hold for both U and L. Let us therefore only list the properties for U:
- det(U) = 0
- tr(U) = 0
- rank(U) = n − 1
- The characteristic polynomial of U is
- {\displaystyle p_{U}(\lambda )=(-1)^{n}\lambda ^{n}.}
- U^n = 0. This follows from the previous property by the Cayley–Hamilton theorem.
- The permanent of U is 0.
The following properties show how U and L are related:
- L^T = U; U^T = L
- The null spaces of U and L are
- {\displaystyle N(U)=\operatorname {span} \left\{(1,0,\ldots ,0)^{\mathsf {T}}\right\},}
- {\displaystyle N(L)=\operatorname {span} \left\{(0,\ldots ,0,1)^{\mathsf {T}}\right\}.}
- The spectrum of U and L is {\displaystyle \{0\}}. The algebraic multiplicity of 0 is n, and its geometric multiplicity is 1. From the expressions for the null spaces, it follows that (up to a scaling) the only eigenvector for U is {\displaystyle (1,0,\ldots ,0)^{\mathsf {T}}}, and the only eigenvector for L is {\displaystyle (0,\ldots ,0,1)^{\mathsf {T}}}.
- For LU and UL we have
- {\displaystyle UL=I-\operatorname {diag} (0,\ldots ,0,1),}
- {\displaystyle LU=I-\operatorname {diag} (1,0,\ldots ,0).}
These matrices are idempotent and symmetric, and have the same rank as U and L.
- L^{n−a}U^{n−a} + U^{a}L^{a} = U^{n−a}L^{n−a} + L^{a}U^{a} = I (the identity matrix), for any integer a between 0 and n inclusive (see the numerical sketch below).
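These relations are straightforward to verify numerically; the following is a minimal NumPy check (the dimension n = 5 and the split point a = 2 are arbitrary choices):

```python
import numpy as np

n, a = 5, 2                       # arbitrary dimension and split point, 0 <= a <= n
U = np.eye(n, k=1)
L = np.eye(n, k=-1)
I = np.eye(n)
mp = np.linalg.matrix_power

assert np.array_equal(U @ L, I - np.diag([0.0] * (n - 1) + [1.0]))
assert np.array_equal(L @ U, I - np.diag([1.0] + [0.0] * (n - 1)))

# Powers of U and L partition the identity:
assert np.array_equal(mp(L, n - a) @ mp(U, n - a) + mp(U, a) @ mp(L, a), I)
assert np.array_equal(mp(U, n - a) @ mp(L, n - a) + mp(L, a) @ mp(U, a), I)
```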
If N is any nilpotent matrix, then N is similar to a block diagonal matrix of the form
- {\displaystyle {\begin{pmatrix}S_{1}&0&\ldots &0\0円&S_{2}&\ldots &0\\\vdots &\vdots &\ddots &\vdots \0円&0&\ldots &S_{r}\end{pmatrix}}}
where each of the blocks S1, S2, ..., Sr is a shift matrix (possibly of different sizes).[2] [3]
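Since every eigenvalue of a nilpotent matrix is zero, each block in this decomposition is a Jordan block for the eigenvalue 0, i.e. a shift matrix. A minimal SymPy sketch (the particular matrix N below is an arbitrary nilpotent example) illustrates the statement:

```python
import sympy as sp

# An arbitrary nilpotent example: strictly upper triangular, so N**3 = 0.
N = sp.Matrix([[0, 1, 2],
               [0, 0, 3],
               [0, 0, 0]])

P, J = N.jordan_form()      # similarity transform: N = P * J * P**(-1)
print(J)                    # block diagonal with shift-matrix blocks
assert P * J * P.inv() == N
```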
Examples
Consider the 5 × 5 lower shift matrix S and the matrix A given by
- {\displaystyle S={\begin{pmatrix}0&0&0&0&0\1円&0&0&0&0\0円&1&0&0&0\0円&0&1&0&0\0円&0&0&1&0\end{pmatrix}};\quad A={\begin{pmatrix}1&1&1&1&1\1円&2&2&2&1\1円&2&3&2&1\1円&2&2&2&1\1円&1&1&1&1\end{pmatrix}}.}
Then,
- {\displaystyle SA={\begin{pmatrix}0&0&0&0&0\1円&1&1&1&1\1円&2&2&2&1\1円&2&3&2&1\1円&2&2&2&1\end{pmatrix}};\quad AS={\begin{pmatrix}1&1&1&1&0\2円&2&2&1&0\2円&3&2&1&0\2円&2&2&1&0\1円&1&1&1&0\end{pmatrix}}.}
There are many possible combinations of such shifts. For example, {\displaystyle S^{\mathsf {T}}AS} is equal to the matrix A shifted up and left along the main diagonal.
- {\displaystyle S^{\mathsf {T}}AS={\begin{pmatrix}2&2&2&1&0\2円&3&2&1&0\2円&2&2&1&0\1円&1&1&1&0\0円&0&0&0&0\end{pmatrix}}.}
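The example can be reproduced numerically; the following is a short NumPy sketch using the same S and A:

```python
import numpy as np

S = np.eye(5, k=-1)                     # the 5 x 5 lower shift matrix above
A = np.array([[1, 1, 1, 1, 1],
              [1, 2, 2, 2, 1],
              [1, 2, 3, 2, 1],
              [1, 2, 2, 2, 1],
              [1, 1, 1, 1, 1]], dtype=float)

print(S @ A)         # A shifted down one row
print(A @ S)         # A shifted left one column
print(S.T @ A @ S)   # A shifted up and left along the main diagonal
```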
Notes
1. Beauregard & Fraleigh (1973, p. 312)
2. Beauregard & Fraleigh (1973, pp. 312–313)
3. Herstein (1964, p. 250)
References
- Beauregard, Raymond A.; Fraleigh, John B. (1973), A First Course In Linear Algebra: with Optional Introduction to Groups, Rings, and Fields, Boston: Houghton Mifflin, ISBN 0-395-14017-X, OCLC 1019797576
- Herstein, I. N. (1964), Topics In Algebra, Waltham: Blaisdell Publishing, ISBN 978-1-114-54101-6, OCLC 1419919702, OL 5885700M