
Hilbert matrix

Square matrix with entries H_{ij} = 1/(i + j − 1)

In linear algebra, a Hilbert matrix, introduced by Hilbert (1894), is a square matrix with entries being the unit fractions

H_{ij} = \frac{1}{i + j - 1}.

For example, this is the 5 × 5 Hilbert matrix:

H = \begin{bmatrix}
1 & \frac{1}{2} & \frac{1}{3} & \frac{1}{4} & \frac{1}{5} \\
\frac{1}{2} & \frac{1}{3} & \frac{1}{4} & \frac{1}{5} & \frac{1}{6} \\
\frac{1}{3} & \frac{1}{4} & \frac{1}{5} & \frac{1}{6} & \frac{1}{7} \\
\frac{1}{4} & \frac{1}{5} & \frac{1}{6} & \frac{1}{7} & \frac{1}{8} \\
\frac{1}{5} & \frac{1}{6} & \frac{1}{7} & \frac{1}{8} & \frac{1}{9}
\end{bmatrix}.

The entries can also be defined by the integral

H_{ij} = \int_0^1 x^{i+j-2} \, dx,

that is, as a Gramian matrix for powers of x. It arises in the least squares approximation of arbitrary functions by polynomials.
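
As a quick illustration of the Gramian characterization, the following sketch (assuming NumPy and SciPy are available) numerically integrates the monomial products over [0, 1] and compares the result with the closed-form entries; the variable names are illustrative only.

```python
import numpy as np
from scipy.integrate import quad
from scipy.linalg import hilbert

# Compare the integral (Gramian) definition of the entries with the
# closed form 1/(i + j - 1). With 0-based indices the exponent i + j - 2
# becomes simply i + j.
n = 5
H = hilbert(n) # H[i, j] = 1/(i + j + 1) for 0-based i, j
G = np.array([[quad(lambda x, p=i + j: x**p, 0.0, 1.0)[0]
 for j in range(n)]
 for i in range(n)])
print(np.allclose(G, H)) # True
```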

The Hilbert matrices are canonical examples of ill-conditioned matrices, being notoriously difficult to use in numerical computation. For example, the 2-norm condition number of the matrix above is about 4.8 × 10^5.
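
The ill-conditioning is easy to observe numerically; a minimal sketch, assuming NumPy and SciPy:

```python
import numpy as np
from scipy.linalg import hilbert

# 2-norm condition numbers of Hilbert matrices. The 5 x 5 case is about
# 4.8e5, and the growth with n is roughly exponential. For larger n the
# floating-point estimate itself becomes unreliable, precisely because
# the matrix is so ill-conditioned.
for n in (5, 8, 10, 13):
 H = hilbert(n)
 print(n, np.linalg.cond(H)) # np.linalg.cond defaults to the 2-norm
```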

Historical note


Hilbert (1894) introduced the Hilbert matrix to study the following question in approximation theory: "Assume that I = [a, b] is a real interval. Is it then possible to find a non-zero polynomial P with integer coefficients, such that the integral

\int_a^b P(x)^2 \, dx

is smaller than any given bound ε > 0, taken arbitrarily small?" To answer this question, Hilbert derives an exact formula for the determinant of the Hilbert matrices and investigates their asymptotics. He concludes that the answer to his question is positive if the length b − a of the interval is smaller than 4.

Properties


The Hilbert matrix is symmetric and positive definite. The Hilbert matrix is also totally positive (meaning that the determinant of every square submatrix is positive).

The Hilbert matrix is an example of a Hankel matrix. It is also a specific example of a Cauchy matrix.
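
These structural properties can be spot-checked numerically. The sketch below (again assuming NumPy and SciPy) verifies symmetry, verifies positive definiteness by attempting a Cholesky factorization, and checks the constant anti-diagonals that make the matrix a Hankel matrix.

```python
import numpy as np
from scipy.linalg import hilbert, cholesky

n = 6
H = hilbert(n)

print(np.allclose(H, H.T)) # symmetric
cholesky(H) # raises LinAlgError unless H is positive definite
# Hankel structure: each entry depends only on i + j, so every
# anti-diagonal is constant.
print(all(np.isclose(H[i + 1, j - 1], H[i, j])
 for i in range(n - 1) for j in range(1, n)))
```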

The determinant can be expressed in closed form, as a special case of the Cauchy determinant. The determinant of the n × n Hilbert matrix is

\det(H) = \frac{c_n^4}{c_{2n}},

where

c_n = \prod_{i=1}^{n-1} i^{n-i} = \prod_{i=1}^{n-1} i!.

Hilbert already mentioned the curious fact that the determinant of the Hilbert matrix is the reciprocal of an integer (sequence A005249 in the OEIS), which also follows from the identity

\frac{1}{\det(H)} = \frac{c_{2n}}{c_n^4} = n! \cdot \prod_{i=1}^{2n-1} \binom{i}{\lfloor i/2 \rfloor}.
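
Both the closed form and the reciprocal-integer identity can be confirmed with exact rational arithmetic. The following self-contained sketch in plain Python (the helper names are illustrative, not from any library) compares them with a determinant computed by fraction-exact Gaussian elimination.

```python
from fractions import Fraction
from math import comb, factorial

def hilbert_exact(n):
 # n x n Hilbert matrix with exact rational entries
 return [[Fraction(1, i + j + 1) for j in range(n)] for i in range(n)]

def det_exact(m):
 # determinant via fraction-exact Gaussian elimination; no pivoting is
 # needed because the Hilbert matrix is positive definite
 a = [row[:] for row in m]
 det = Fraction(1)
 for k in range(len(a)):
 det *= a[k][k]
 for i in range(k + 1, len(a)):
 f = a[i][k] / a[k][k]
 for j in range(k, len(a)):
 a[i][j] -= f * a[k][j]
 return det

def c(n):
 # c_n = 1! * 2! * ... * (n - 1)!
 out = 1
 for i in range(1, n):
 out *= factorial(i)
 return out

n = 5
closed_form = Fraction(c(n) ** 4, c(2 * n))
reciprocal = factorial(n)
for i in range(1, 2 * n):
 reciprocal *= comb(i, i // 2)
print(det_exact(hilbert_exact(n)) == closed_form == Fraction(1, reciprocal)) # True
```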

Using Stirling's approximation of the factorial, one can establish the following asymptotic result:

\det(H) \sim a_n \, n^{-1/4} \, (2\pi)^n \, 4^{-n^2},

where a_n converges to the constant e^{1/4} \, 2^{1/12} \, A^{-3} \approx 0.6450 as n \to \infty, with A the Glaisher–Kinkelin constant.
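
The asymptotic formula can be checked by computing det(H) exactly from the closed form and dividing out the predicted factor n^{-1/4} (2π)^n 4^{-n^2}; the ratio should drift toward roughly 0.6450. A sketch working in log space to avoid underflow:

```python
import math

def log_c(n):
 # log of c_n = 1! * 2! * ... * (n - 1)!
 return sum(math.lgamma(i + 1) for i in range(1, n))

for n in (5, 10, 20, 40):
 log_det = 4 * log_c(n) - log_c(2 * n) # log det(H_n) from the closed form
 log_pred = (-0.25 * math.log(n) + n * math.log(2 * math.pi)
 - n * n * math.log(4))
 print(n, math.exp(log_det - log_pred)) # approaches about 0.6450
```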

The inverse of the Hilbert matrix can be expressed in closed form using binomial coefficients; its entries are

(H^{-1})_{ij} = (-1)^{i+j} (i + j - 1) \binom{n+i-1}{n-j} \binom{n+j-1}{n-i} \binom{i+j-2}{i-1}^{2},

where n is the order of the matrix.[1] It follows that the entries of the inverse matrix are all integers, and that the signs form a checkerboard pattern, being positive on the principal diagonal. For example,

\begin{bmatrix}
1 & \frac{1}{2} & \frac{1}{3} & \frac{1}{4} & \frac{1}{5} \\
\frac{1}{2} & \frac{1}{3} & \frac{1}{4} & \frac{1}{5} & \frac{1}{6} \\
\frac{1}{3} & \frac{1}{4} & \frac{1}{5} & \frac{1}{6} & \frac{1}{7} \\
\frac{1}{4} & \frac{1}{5} & \frac{1}{6} & \frac{1}{7} & \frac{1}{8} \\
\frac{1}{5} & \frac{1}{6} & \frac{1}{7} & \frac{1}{8} & \frac{1}{9}
\end{bmatrix}^{-1}
=
\begin{bmatrix}
25 & -300 & 1050 & -1400 & 630 \\
-300 & 4800 & -18900 & 26880 & -12600 \\
1050 & -18900 & 79380 & -117600 & 56700 \\
-1400 & 26880 & -117600 & 179200 & -88200 \\
630 & -12600 & 56700 & -88200 & 44100
\end{bmatrix}.
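
The binomial formula for the inverse is easy to evaluate directly; the sketch below (using 1-based indices i, j as in the formula) reproduces the integer matrix above and agrees with scipy.linalg.invhilbert, which can return the exact integer inverse.

```python
import numpy as np
from math import comb
from scipy.linalg import invhilbert

def inv_hilbert_entry(i, j, n):
 # (H^{-1})_{ij} with 1-based indices i, j
 return ((-1) ** (i + j) * (i + j - 1)
 * comb(n + i - 1, n - j) * comb(n + j - 1, n - i)
 * comb(i + j - 2, i - 1) ** 2)

n = 5
A = np.array([[inv_hilbert_entry(i, j, n)
 for j in range(1, n + 1)] for i in range(1, n + 1)])
print(np.array_equal(A, invhilbert(n, exact=True))) # True
```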

The condition number of the n × n Hilbert matrix grows as O\left((1 + \sqrt{2})^{4n} / \sqrt{n}\right).

Applications


The method of moments applied to polynomial distributions results in a Hankel matrix, which in the special case of approximating a probability distribution on the interval [0, 1] results in a Hilbert matrix. This matrix needs to be inverted to obtain the weight parameters of the polynomial distribution approximation.[2]
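
As a minimal illustration of this use (a simplified sketch, not the full procedure of the cited paper): if a density on [0, 1] is approximated by a polynomial f(x) ≈ c_0 + c_1 x + ... + c_{n−1} x^{n−1}, its moments m_k = ∫_0^1 x^k f(x) dx satisfy H c = m with H the n × n Hilbert matrix, so the coefficients are recovered by solving, i.e. inverting, that system. The Beta(2, 2) density 6x(1 − x) is chosen here because it is itself a polynomial, so the recovery is exact up to round-off.

```python
import numpy as np
from scipy.linalg import hilbert

# Hypothetical example: recover the polynomial density of a Beta(2, 2)
# distribution, f(x) = 6 x (1 - x), from its moments m_k = E[X^k].
n = 3
m = np.array([6.0 / ((k + 2) * (k + 3)) for k in range(n)]) # exact moments
H = hilbert(n) # H[k, j] = 1/(k + j + 1) = integral of x^(k + j) over [0, 1]
coeffs = np.linalg.solve(H, m) # coefficients of f(x) = sum_j c_j x^j
print(np.round(coeffs, 6)) # approximately [0, 6, -6]
```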

References

  1. ^ Choi, Man-Duen (1983). "Tricks or Treats with the Hilbert Matrix". The American Mathematical Monthly. 90 (5): 301–312. doi:10.2307/2975779. JSTOR 2975779.
 2. ^ Munkhammar, Joakim; Mattsson, Lars; Rydén, Jesper (2017). "Polynomial probability distribution estimation using the method of moments". PLOS ONE. 12 (4): e0174573. Bibcode:2017PLoSO..1274573M. doi:10.1371/journal.pone.0174573. PMC 5386244. PMID 28394949.
