In Exercises 13–20, find an invertible matrix \(P\) and a matrix \(C\) of the form \(\left( {\begin{array}{*{20}{r}}a&{ - b}\\b&a\end{array}} \right)\) such that the given matrix has the form \(A = PC{P^{ - 1}}\). For Exercises 13–16, use information from Exercises 1–4.

18. \(\left( {\begin{array}{*{20}{r}}1&{ - 1}\\{.4}&{.6}\end{array}} \right)\)

Short Answer

The required matrices are \(P = \left( {\begin{array}{*{20}{r}}1&{ - 3}\\2&0\end{array}} \right)\) and \(C = \left( {\begin{array}{*{20}{r}}{.8}&{ - .6}\\{.6}&{.8}\end{array}} \right)\).

Step by step solution

Step 1: Finding the matrices \(P\) and \(C\)

Let \(A\) be a real \(2 \times 2\) matrix with complex eigenvalues \(a \pm bi\), where \(b \ne 0\), and let \(v\) be an eigenvector corresponding to the eigenvalue \(a - bi\). Then the matrix \(P\) is given by \(P = \left( {\begin{array}{*{20}{c}}{{\mathop{\rm Re}\nolimits} v}&{{\mathop{\rm Im}\nolimits} v}\end{array}} \right)\).

The matrix \(C\) is then obtained as \(C = {P^{ - 1}}AP\), and it has the required form \(\left( {\begin{array}{*{20}{r}}a&{ - b}\\b&a\end{array}} \right)\).
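This recipe translates directly into a short numerical check. The sketch below is an optional aside, not part of the textbook solution; the helper name real_block_form is just a placeholder. It builds \(P\) and \(C\) from whatever eigenvector NumPy returns for \(a - bi\); \(P\) is not unique, but \(C\) always comes out in the required form.

```python
import numpy as np

def real_block_form(A):
    """Sketch: for a real 2x2 matrix A with eigenvalues a +/- bi (b != 0),
    return P = [Re v  Im v] and C = P^{-1} A P."""
    eigvals, eigvecs = np.linalg.eig(A)
    i = int(np.argmin(eigvals.imag))      # pick the eigenvalue a - bi (negative imaginary part)
    v = eigvecs[:, i]                     # an eigenvector for a - bi
    P = np.column_stack([v.real, v.imag])
    C = np.linalg.inv(P) @ A @ P
    return P, C

A = np.array([[1.0, -1.0],
              [0.4, 0.6]])
P, C = real_block_form(A)
print(np.round(C, 10))   # ~[[0.8, -0.6], [0.6, 0.8]], i.e. a = .8, b = .6
```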

Step 2: Find the invertible matrix \(P\)

Given that \(A = \left( {\begin{array}{*{20}{r}}1&{ - 1}\\{.4}&{.6}\end{array}} \right)\).

Then,

\(\left( {A - \lambda {I_2}} \right) = \left( {\begin{array}{*{20}{c}}{1 - \lambda }&{ - 1}\\{.4}&{.6 - \lambda }\end{array}} \right)\)

The characteristic equation of \(A\) is

\(\begin{array}{c}\det \left( {A - \lambda {I_2}} \right) = 0\\(1 - \lambda )(.6 - \lambda ) + .4 = 0\\{\lambda ^2} - 1.6\lambda + .6 + .4 = 0\\{\lambda ^2} - 1.6\lambda + 1 = 0\end{array}\)

Solving by the quadratic formula (the discriminant is \({\left( {1.6} \right)^2} - 4 = - 1.44\)), we get

\(\begin{array}{c}{\lambda ^2} - 1.6\lambda + 1 = 0\\\left( {\lambda - \left( {.8 + .6i} \right)} \right)\left( {\lambda - \left( {.8 - .6i} \right)} \right) = 0\end{array}\).

This implies that the roots of \({\lambda ^2} - 1.6\lambda + 1 = 0\) are \(.8 \pm .6i\).

Hence, the eigenvalues of \(A\) are \(.8 \pm .6i\).
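As an optional numerical aside (not required by the exercise), the roots of the characteristic polynomial can be confirmed with NumPy:

```python
import numpy as np

# Roots of the characteristic polynomial lambda^2 - 1.6*lambda + 1 ...
print(np.roots([1, -1.6, 1]))                                # [0.8+0.6j 0.8-0.6j]
# ... agree with the eigenvalues computed directly from A.
print(np.linalg.eigvals(np.array([[1.0, -1.0], [0.4, 0.6]])))
```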

Now let \({X_1} = \left( {\begin{array}{*{20}{l}}{{x_1}}\\{{x_2}}\end{array}} \right)\) be an eigenvector corresponding to the eigenvalue \(.8 - .6i\).

Therefore \(A{X_1} = \left( {.8 - .6i} \right){X_1}\), which gives

\(\begin{array}{c}\left( {A - \left( {.8 - .6i} \right)I} \right){X_1} = 0\\\left( {\begin{array}{*{20}{c}}{.2 + .6i}&{ - 1}\\{.4}&{ - .2 + .6i}\end{array}} \right)\left( {\begin{array}{*{20}{l}}{{x_1}}\\{{x_2}}\end{array}} \right) = \left( {\begin{array}{*{20}{l}}0\\0\end{array}} \right)\end{array}\)

From the second row of this system,

\(\begin{array}{c}.4{x_1} + \left( { - .2 + .6i} \right){x_2} = 0\\{\rm{ (}}{\rm{.4) }}{x_1} = \left( {.2 - .6i} \right){x_2}\\{x_1} = \frac{{1 - 3i}}{2}{x_2}\end{array}\)

Here \({x_2}\) is a free variable, so choose \({x_2} = 2\) to clear the fraction.

Step 3: Find the matrices \(P\) and \(C\)

Therefore \(\left( {\begin{array}{*{20}{c}}{1 - 3i}\\2\end{array}} \right)\) is an eigenvector corresponding to the eigenvalue \(.8 - .6i\).
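As a quick sanity check (not part of the original solution), one can verify numerically that this vector satisfies \(A{X_1} = \left( {.8 - .6i} \right){X_1}\):

```python
import numpy as np

A = np.array([[1.0, -1.0],
              [0.4, 0.6]])
x1 = np.array([1 - 3j, 2])            # the eigenvector found above
lam = 0.8 - 0.6j                      # its eigenvalue
print(np.allclose(A @ x1, lam * x1))  # True
```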

Then we get,

\(P = \left( {\begin{array}{*{20}{l}}{{\mathop{\rm Re}\nolimits} {X_1}}&{{\mathop{\rm Im}\nolimits} {X_1}}\end{array}} \right) \Rightarrow P = \left( {\begin{array}{*{20}{c}}1&{ - 3}\\2&0\end{array}} \right)\)

And then,

\(\begin{array}{c}C = {P^{ - 1}}AP\\ = \frac{1}{6}\left( {\begin{array}{*{20}{r}}0&3\\{ - 2}&1\end{array}} \right)\left( {\begin{array}{*{20}{c}}1&{ - 1}\\{.4}&{.6}\end{array}} \right)\left( {\begin{array}{*{20}{l}}1&{ - 3}\\2&0\end{array}} \right)\\ = \left( {\begin{array}{*{20}{r}}{.8}&{ - .6}\\{.6}&{.8}\end{array}} \right)\end{array}\)

Thus, the required matrices are \(P = \left( {\begin{array}{*{20}{c}}1&{ - 3}\\2&0\end{array}} \right)\quad {\rm{and}}\;C = \left( {\begin{array}{*{20}{r}}{.8}&{ - .6}\\{.6}&{.8}\end{array}} \right)\).
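The factorization can also be verified numerically. The sketch below (an optional check, not part of the textbook solution) recomputes \(C = {P^{ - 1}}AP\) and confirms that \(A = PC{P^{ - 1}}\):

```python
import numpy as np

A = np.array([[1.0, -1.0],
              [0.4, 0.6]])
P = np.array([[1.0, -3.0],
              [2.0,  0.0]])
C = np.linalg.inv(P) @ A @ P
print(np.round(C, 10))                            # [[0.8, -0.6], [0.6, 0.8]]
print(np.allclose(P @ C @ np.linalg.inv(P), A))   # True: A = P C P^{-1}
```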


Most popular questions from this chapter

Question: Let \(A = \left( {\begin{array}{*{20}{c}}{.5}&{.2}&{.3}\\{.3}&{.8}&{.3}\\{.2}&0&{.4}\end{array}} \right)\), \({{\rm{v}}_1} = \left( {\begin{array}{*{20}{c}}{.3}\\{.6}\\{.1}\end{array}} \right)\), \({{\rm{v}}_2} = \left( {\begin{array}{*{20}{c}}1\\{ - 3}\\2\end{array}} \right)\), \({{\rm{v}}_3} = \left( {\begin{array}{*{20}{c}}{ - 1}\\0\\1\end{array}} \right)\) and \({\rm{w}} = \left( {\begin{array}{*{20}{c}}1\\1\\1\end{array}} \right)\).

  1. Show that \({{\rm{v}}_1}\), \({{\rm{v}}_2}\), and \({{\rm{v}}_3}\) are eigenvectors of \(A\). (Note: \(A\) is the stochastic matrix studied in Example 3 of Section 4.9.)
  2. Let \({{\rm{x}}_0}\) be any vector in \({\mathbb{R}^3}\) with non-negative entries whose sum is 1. (In section 4.9, \({{\rm{x}}_0}\) was called a probability vector.) Explain why there are constants \({c_1}\), \({c_2}\), and \({c_3}\) such that \({{\rm{x}}_0} = {c_1}{{\rm{v}}_1} + {c_2}{{\rm{v}}_2} + {c_3}{{\rm{v}}_3}\). Compute \({{\rm{w}}^T}{{\rm{x}}_0}\), and deduce that \({c_1} = 1\).
  3. For \(k = 1,2, \ldots ,\) define \({{\rm{x}}_k} = {A^k}{{\rm{x}}_0}\), with \({{\rm{x}}_0}\) as in part (b). Show that \({{\rm{x}}_k} \to {{\rm{v}}_1}\) as \(k\) increases. (A numerical sketch of these checks follows this list.)
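A brief numerical sketch of these checks (an illustration only, not a worked solution): it confirms the three eigenpairs of \(A\) and that \({{\rm{w}}^T}{{\rm{v}}_2} = {{\rm{w}}^T}{{\rm{v}}_3} = 0\) while \({{\rm{w}}^T}{{\rm{v}}_1} = 1\), which is why \({{\rm{w}}^T}{{\rm{x}}_0} = {c_1}\).

```python
import numpy as np

A = np.array([[0.5, 0.2, 0.3],
              [0.3, 0.8, 0.3],
              [0.2, 0.0, 0.4]])
v1 = np.array([0.3, 0.6, 0.1])
v2 = np.array([1.0, -3.0, 2.0])
v3 = np.array([-1.0, 0.0, 1.0])
# Part 1: v1, v2, v3 are eigenvectors of A, with eigenvalues 1, 0.5 and 0.2.
for lam, v in [(1.0, v1), (0.5, v2), (0.2, v3)]:
    print(lam, np.allclose(A @ v, lam * v))        # all True
# Part 2: w^T v1 = 1 while w^T v2 = w^T v3 = 0, so w^T x0 = c1.
w = np.ones(3)
print(np.round([w @ v1, w @ v2, w @ v3], 12))      # [1. 0. 0.]
```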

Question: Let \(A = \left( {\begin{array}{*{20}{c}}{.6}&{.3}\\{.4}&{.7}\end{array}} \right)\), \({v_1} = \left( {\begin{array}{*{20}{c}}{3/7}\\{4/7}\end{array}} \right)\), \({x_0} = \left( {\begin{array}{*{20}{c}}{.5}\\{.5}\end{array}} \right)\). (Note: \(A\) is the stochastic matrix studied in Example 5 of Section 4.9.)

  1. Find a basis for \({\mathbb{R}^2}\) consisting of \({{\rm{v}}_1}\) and another eigenvector \({{\rm{v}}_2}\) of \(A\).
  2. Verify that \({{\rm{x}}_0}\) may be written in the form \({{\rm{x}}_0} = {{\rm{v}}_1} + c{{\rm{v}}_2}\).
  3. For \(k = 1,2, \ldots \), define \({x_k} = {A^k}{x_0}\). Compute \({x_1}\) and \({x_2}\), and write a formula for \({x_k}\). Then show that \({{\bf{x}}_k} \to {{\bf{v}}_1}\) as \(k\) increases. (A numerical sketch of this convergence follows this list.)
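A minimal numerical sketch of part 3 (an illustration only, not a worked solution): iterating \({{\rm{x}}_{k + 1}} = A{{\rm{x}}_k}\) from \({{\rm{x}}_0}\) converges rapidly to \({{\rm{v}}_1} = \left( {3/7,4/7} \right)\).

```python
import numpy as np

A = np.array([[0.6, 0.3],
              [0.4, 0.7]])
x = np.array([0.5, 0.5])             # x0
for _ in range(20):                  # compute x_20 = A^20 x0
    x = A @ x
print(np.round(x, 6))                # [0.428571 0.571429]
print(np.round([3 / 7, 4 / 7], 6))   # v1 = [0.428571 0.571429]
```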

Exercises 19–23 concern the polynomial \(p\left( t \right) = {a_0} + {a_1}t + ... + {a_{n - 1}}{t^{n - 1}} + {t^n}\) and the \(n \times n\) matrix \({C_p}\) called the companion matrix of \(p\): \({C_p} = \left( {\begin{array}{*{20}{c}}0&1&0& \cdots &0\\0&0&1&{}&0\\ \vdots &{}&{}&{}& \vdots \\0&0&0&{}&1\\{ - {a_0}}&{ - {a_1}}&{ - {a_2}}& \cdots &{ - {a_{n - 1}}}\end{array}} \right)\).

22. Let \(p\left( t \right) = {a_{\bf{0}}} + {a_{\bf{1}}}t + {a_{\bf{2}}}{t^{\bf{2}}} + {t^{\bf{3}}}\), and let \(\lambda \) be a zero of \(p\).

  1. Write the companion matrix for \(p\).
  2. Explain why \({\lambda ^{\bf{3}}} = - {a_{\bf{0}}} - {a_{\bf{1}}}\lambda - {a_{\bf{2}}}{\lambda ^{\bf{2}}}\), and show that \(\left( {{\bf{1}},\lambda ,{\lambda ^2}} \right)\) is an eigenvector of the companion matrix for \(p\).

23. Let \(p\) be the polynomial in Exercise \({\bf{22}}\), and suppose the equation \(p\left( t \right) = {\bf{0}}\) has distinct roots \({\lambda _{\bf{1}}},{\lambda _{\bf{2}}},{\lambda _{\bf{3}}}\). Let \(V\) be the Vandermonde matrix

\(V = \left( {\begin{array}{*{20}{c}}1&1&1\\{{\lambda _1}}&{{\lambda _2}}&{{\lambda _3}}\\{\lambda _1^2}&{\lambda _2^2}&{\lambda _3^2}\end{array}} \right)\)

(The transpose of \(V\) was considered in Supplementary Exercise \({\bf{11}}\) in Chapter \({\bf{2}}\).) Use Exercise \({\bf{22}}\) and a theorem from this chapter to deduce that \(V\) is invertible (but do not compute \({V^{{\bf{ - 1}}}}\)). Then explain why \({V^{{\bf{ - 1}}}}{C_p}V\) is a diagonal matrix.
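A small numerical sketch of this conclusion, using an assumed concrete polynomial \(p\left( t \right) = {t^3} - 6{t^2} + 11t - 6\) with roots \(1,2,3\) (my own example, not taken from the text): the columns of \(V\) are the eigenvectors \(\left( {1,\lambda ,{\lambda ^2}} \right)\) from Exercise 22, so \({V^{ - 1}}{C_p}V\) is diagonal.

```python
import numpy as np

# Assumed example: p(t) = t^3 - 6 t^2 + 11 t - 6 = (t-1)(t-2)(t-3),
# so a0 = -6, a1 = 11, a2 = -6 and the roots are 1, 2, 3.
a0, a1, a2 = -6.0, 11.0, -6.0
Cp = np.array([[0.0, 1.0, 0.0],
               [0.0, 0.0, 1.0],
               [-a0, -a1, -a2]])             # companion matrix of p
roots = np.array([1.0, 2.0, 3.0])
V = np.vander(roots, 3, increasing=True).T   # columns (1, lam, lam^2) are eigenvectors of Cp
print(np.round(np.linalg.inv(V) @ Cp @ V, 10))   # diag(1, 2, 3)
```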

Question: Is \(\left( {\begin{array}{*{20}{c}}{ - 1 + \sqrt 2 }\\1\end{array}} \right)\) an eigenvector of \(\left( {\begin{array}{*{20}{c}}2&1\\1&4\end{array}} \right)\)? If so, find the eigenvalue.
