As we saw in Exploiting matrix structure, symmetry can simplify the LU factorization into the symmetric form $A=LDL^T$. Important specializations occur as well for the eigenvalue and singular value factorizations. In this section we stay with complex-valued matrices, so we are interested in the case when $A^*=A$, i.e., $A$ is hermitian. However, we often loosely speak of symmetry to mean this property even in the complex case. All of the statements in this section easily specialize to the real case.
If $A^*=A$ and $A=USV^*$ is an SVD, then taking the conjugate transpose gives $A=A^*=VSU^*$, which is also an SVD, and it’s tempting to conclude that $U=V$. Happily, this is nearly true. The following theorem is typically proved in an advanced linear algebra course.
Another way to state the result of this theorem is that a hermitian matrix has real eigenvalues and a complete set of orthonormal eigenvectors—that is, it is normal. Because hermitian matrices are normal, their eigenvalue condition number is guaranteed to be 1 by Theorem 7.2.3.
The converse of Theorem 7.4.1 is also true: every normal matrix with real eigenvalues is hermitian. This was illustrated in Demo 7.2.3.
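As a quick numerical check of these facts, the following sketch (in Julia, using only the standard LinearAlgebra library; the random matrix is an arbitrary choice) builds a hermitian matrix and confirms that its eigenvalues are real and its eigenvectors orthonormal:

```julia
using LinearAlgebra

B = randn(5, 5) + 1im * randn(5, 5)
A = (B + B') / 2                        # hermitian by construction: A' == A

λ, V = eigen(A)
println(λ)                              # the eigenvalues come out real
@show opnorm(V' * V - I)                # ≈ 0, so the columns of V are orthonormal
@show opnorm(A - V * Diagonal(λ) * V')  # ≈ 0: A is reconstructed from its EVD
```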
Recall that for a matrix $A$ and compatible vector $x$, the quadratic form $x^*Ax$ is a scalar. The Rayleigh quotient is the quadratic form normalized by the squared length of $x$:

$$R_A(x) = \frac{x^* A x}{x^* x}.$$

If $v$ is an eigenvector such that $Av=\lambda v$, then one easily calculates that $R_A(v)=\lambda$. That is, the Rayleigh quotient maps an eigenvector to its associated eigenvalue.
If $A^*=A$, then the Rayleigh quotient has another interesting property: $\nabla R_A(v)=0$ if $v$ is an eigenvector. By a multidimensional Taylor series, then,

$$R_A(v+\epsilon z) = R_A(v) + 0 + O(\epsilon^2) = \lambda + O(\epsilon^2)$$

as $\epsilon\to 0$. The conclusion is that a good estimate of an eigenvector becomes an even better estimate of the associated eigenvalue.
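To see this effect concretely, here is a short sketch (the helper `rayleigh` is our own name, and the matrix is arbitrary): perturbing an exact eigenvector by ever-smaller amounts makes the eigenvalue error fall off roughly like $\epsilon^2$.

```julia
using LinearAlgebra

rayleigh(A, x) = (x' * A * x) / (x' * x)

B = randn(5, 5)
A = (B + B') / 2                 # a random symmetric matrix
λ, V = eigen(A)
v = V[:, 3]                      # an exact eigenvector with eigenvalue λ[3]
z = normalize(randn(5))          # a fixed random perturbation direction

for ϵ in [1e-2, 1e-3, 1e-4]
    err = abs(rayleigh(A, v + ϵ * z) - λ[3])
    println("ϵ = $ϵ:  error = $err")    # shrinks by ~100x per step, i.e., O(ϵ^2)
end
```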
7.4.3 Definite, semidefinite, and indefinite matrices
In the real case, we called a symmetric matrix $A$ symmetric positive definite (SPD) if $x^TAx>0$ for all nonzero vectors $x$. In the complex case the relevant property is hermitian positive definite (HPD), meaning that $A^*=A$ and $x^*Ax>0$ for all nonzero complex vectors $x$. Putting this property together with the Rayleigh quotient leads to the following.
According to Theorem 7.4.3, for an HPD matrix, the EVD $A=VDV^*$ meets all the requirements of the SVD, provided the eigenvalues are arranged in decreasing order.
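The following sketch illustrates the claim (the HPD matrix here is an arbitrary construction): once the eigenvalues are reordered to be decreasing, they match the singular values, and the EVD factors reassemble $A$ exactly as an SVD would.

```julia
using LinearAlgebra

B = randn(4, 4)
A = B' * B + I                          # hermitian positive definite by construction

λ, V = eigen(A)                         # eigenvalues returned in increasing order
λ = reverse(λ); V = reverse(V, dims=2)  # reorder to decreasing, as the SVD requires
@show λ ≈ svdvals(A)                    # true: the eigenvalues are the singular values
@show opnorm(A - V * Diagonal(λ) * V')  # ≈ 0: V * Diagonal(λ) * V' is a valid SVD
```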
A hermitian matrix with all negative eigenvalues is called negative definite, and one with eigenvalues of both signs is indefinite. Finally, if one or more eigenvalues are zero and the rest have one sign, the matrix is positive or negative semidefinite.
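Accordingly, a hermitian matrix can be classified numerically by the signs of its eigenvalues. Here is a minimal sketch (the helper `definiteness` and its tolerance are our own choices, since roundoff can blur an exact zero eigenvalue):

```julia
using LinearAlgebra

function definiteness(A; tol = 1e-12)
    λ = eigvals(Hermitian(A))           # real eigenvalues of a hermitian matrix
    if all(λ .> tol)
        "positive definite"
    elseif all(λ .< -tol)
        "negative definite"
    elseif all(λ .>= -tol)
        "positive semidefinite"
    elseif all(λ .<= tol)
        "negative semidefinite"
    else
        "indefinite"
    end
end

@show definiteness([2 1; 1 2])    # positive definite (eigenvalues 1 and 3)
@show definiteness([1 0; 0 -3])   # indefinite (eigenvalues of both signs)
@show definiteness([1 1; 1 1])    # positive semidefinite (eigenvalues 0 and 2)
```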
✍ Each line below is an EVD for a hermitian matrix. State whether the matrix is definite, indefinite, or semidefinite. Then state whether the given factorization is also an SVD, and if it is not, modify it to find an SVD.
⌨ The range of the function $R_A(x)$ is a subset of the complex plane known as the field of values of the matrix $A$. Use 500 random vectors to plot points in the field of values of

$$A = \begin{bmatrix} 1 & 0 & -2 \\ 0 & 2 & 0 \\ -2 & 0 & 1 \end{bmatrix}.$$

Then compute its eigenvalues and guess what the exact field of values is.
(Answer: it's the interval $[-1,3]$, between the extreme eigenvalues.)
✍ Let $A = \begin{bmatrix} 3 & -2 \\ -2 & 0 \end{bmatrix}$.
(a) Write out $R_A(x)$ explicitly as a function of $x_1$ and $x_2$.
(b) Find $R_A(x)$ for $x_1=1$, $x_2=2$.
(c) Find the gradient vector $\nabla R_A(x)$.
(d) Show that the gradient vector is zero when $x_1=1$, $x_2=2$.
✍ A skew-hermitian matrix is one that satisfies $A^*=-A$. Show that if $A$ is skew-hermitian, then $R_A$ is imaginary-valued.
⌨ Thanks largely to Theorem 7.4.1, the eigenvalue problem for symmetric/hermitian matrices is easier than for general matrices.
(a) Let $A$ be a $1000\times 1000$ random real matrix, and let $S=A+A^T$. Using `@elapsed`, time the `eigvals` function for $A$ and then for $S$. You should find that the computation for $S$ is around an order of magnitude faster.
(b) Perform the experiment from part (a) on $n\times n$ matrices for $n=200,300,\ldots,1600$. Plot the running times as a function of $n$ for both matrices on a single log-log plot. Is the ratio of running times roughly constant, or does it grow with $n$?