Let \(A\) and \(B\) be symmetric \(n \times n\) matrices whose eigenvalues are all positive. Show that the eigenvalues of \(A + B\) are all positive.

Short Answer

Expert verified

It is proved that the eigenvalues of \(A + B\) are all positive.

Step by step solution

01

Symmetric Matrices and Quadratic Forms

When a symmetric matrix \(A\) is orthogonally diagonalized as \(A = PD{P^{ - 1}} = PD{P^T}\), where \(P\) is an orthogonal matrix and \(D = {\rm{diag}}\left( {{\lambda _1}, \ldots ,{\lambda _n}} \right)\) holds the eigenvalues of \(A\), the change of variable \(x = Py\) gives:

\(\begin{aligned}{x^T}Ax &= {y^T}Dy = {\lambda _1}y_1^2 + {\lambda _2}y_2^2 + \cdots + {\lambda _n}y_n^2 && \left\{ {{\rm{where }}x = Py} \right\}\\\left\| x \right\| &= \left\| {Py} \right\| = \left\| y \right\| && \left\{ {{\rm{for all }}y \in {\mathbb{R}^n}} \right\}\end{aligned}\)
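As a quick illustration (this particular matrix is not part of the exercise, only an example of the identity above), take

\(A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix} = PD{P^T},\qquad P = \frac{1}{{\sqrt 2 }}\begin{pmatrix} 1 & 1 \\ 1 & { - 1} \end{pmatrix},\qquad D = \begin{pmatrix} 3 & 0 \\ 0 & 1 \end{pmatrix}.\)

With \(x = Py\), the quadratic form becomes \({x^T}Ax = 3y_1^2 + y_2^2\), which is positive for every \(x \ne 0\) because the eigenvalues \(3\) and \(1\) are positive.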

02

Show that the eigenvalues of \(A + B\) are all positive

As per the question, the symmetric \(n \times n\) matrices \(A\) and \(B\) have only positive eigenvalues. By Step 01, for every nonzero \(x\) we have \(y = {P^{ - 1}}x \ne 0\) and \({x^T}Ax = {\lambda _1}y_1^2 + \cdots + {\lambda _n}y_n^2 > 0\), since every \({\lambda _i} > 0\); the same argument applied to \(B\) gives \({x^T}Bx > 0\).

So, we have:

\(\begin{aligned}&{x^T}Ax > 0{\rm{ and }}{x^T}Bx > 0\\&{x^T}Ax + {x^T}Bx > 0\\&{x^T}\left( {A + B} \right)x > 0\end{aligned}\)

for every \(x \ne 0\).

Thus \(A + B\) is a symmetric matrix whose quadratic form is positive for every nonzero \(x\); that is, \(A + B\) is positive definite. In particular, if \(\lambda \) is any eigenvalue of \(A + B\) with unit eigenvector \(v\), then \(\lambda = {v^T}\left( {A + B} \right)v > 0\).

Hence proved: the eigenvalues of \(A + B\) are all positive.
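As an optional numerical sanity check (not part of the proof; the matrices below are randomly generated examples, not taken from the exercise), the result can be verified with NumPy:

```python
import numpy as np

# Build two symmetric matrices with positive eigenvalues and
# confirm that A + B also has only positive eigenvalues.
rng = np.random.default_rng(0)
n = 4

def random_spd(n):
    """Return a random symmetric matrix with positive eigenvalues."""
    M = rng.standard_normal((n, n))
    return M @ M.T + n * np.eye(n)   # M M^T is symmetric PSD; adding nI makes it positive definite

A = random_spd(n)
B = random_spd(n)

eig_A = np.linalg.eigvalsh(A)        # eigvalsh: eigenvalues of a symmetric matrix
eig_B = np.linalg.eigvalsh(B)
eig_sum = np.linalg.eigvalsh(A + B)

print("eigenvalues of A    :", eig_A)
print("eigenvalues of B    :", eig_B)
print("eigenvalues of A + B:", eig_sum)
assert (eig_A > 0).all() and (eig_B > 0).all() and (eig_sum > 0).all()
```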


