Prove Theorem 2(d). (Hint: The \((i,j)\)-entry in \((rA)B\) is \((r{a_{i1}}){b_{1j}} + \cdots + (r{a_{in}}){b_{nj}}\).)

Short Answer

Theorem 2(d) is proved.

Step by step solution

01

Statement of Theorem 2

Theorem 2 states: let A be an \(m \times n\) matrix, and let B and C have sizes for which the indicated sums and products are defined. Then:

  1. \(A\left( {BC} \right) = \left( {AB} \right)C\) (associative law of multiplication)
  2. \(A\left( {B + C} \right) = AB + AC\) (left distributive law)
  3. \(\left( {B + C} \right)A = BA + CA\) (right distributive law)
  4. \(r\left( {AB} \right) = \left( {rA} \right)B = A\left( {rB} \right)\) (for any scalar \(r\))
02

Prove Theorem 2(d)

The \((i,j)\)-entries of \(r(AB)\), \((rA)B\), and \(A(rB)\) are all equal, because a scalar multiplying a finite sum distributes over its terms:

\(r\sum\limits_{k = 1}^n {{a_{ik}}{b_{kj}}} = \sum\limits_{k = 1}^n {\left( {r{a_{ik}}} \right){b_{kj}}} = \sum\limits_{k = 1}^n {{a_{ik}}\left( {r{b_{kj}}} \right)} \).

Since corresponding entries agree for every \(i\) and \(j\), the three matrices are equal. Hence, Theorem 2(d) is proved.
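As a quick numerical sanity check of the entrywise identity above, the following sketch multiplies small matrices in pure Python and confirms that \(r(AB)\), \((rA)B\), and \(A(rB)\) coincide. The helper names (`matmul`, `scale`) and the sample matrices are illustrative choices, not from the text.

```python
def matmul(A, B):
    """Multiply an m x n matrix A by an n x p matrix B (lists of rows)."""
    n, p = len(B), len(B[0])
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(p)]
            for i in range(len(A))]

def scale(r, A):
    """Multiply every entry of A by the scalar r."""
    return [[r * x for x in row] for row in A]

A = [[1, 2, 3], [4, 5, 6]]        # 2 x 3
B = [[7, 8], [9, 10], [11, 12]]   # 3 x 2
r = 5

# Theorem 2(d): all three products agree entry by entry.
assert scale(r, matmul(A, B)) == matmul(scale(r, A), B) == matmul(A, scale(r, B))
```

A check like this does not replace the proof, but it makes the entrywise argument concrete: the assertion compares every \((i,j)\)-entry of the three matrices at once.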


Most popular questions from this chapter

Exercises 15 and 16 concern arbitrary matrices A, B, and C for which the indicated sums and products are defined. Mark each statement True or False. Justify each answer.

15. a. If A and B are \(2 \times 2\) with columns \({\mathbf{a}_1},{\mathbf{a}_2}\) and \({\mathbf{b}_1},{\mathbf{b}_2}\), respectively, then \(AB = \begin{pmatrix} {\mathbf{a}_1}{\mathbf{b}_1} & {\mathbf{a}_2}{\mathbf{b}_2} \end{pmatrix}\).

b. Each column of AB is a linear combination of the columns of B using weights from the corresponding column of A.

c. \(AB + AC = A\left( {B + C} \right)\)

d. \({A^T} + {B^T} = {\left( {A + B} \right)^T}\)

e. The transpose of a product of matrices equals the product of their transposes in the same order.

Let \({\mathbf{r}_1}, \ldots ,{\mathbf{r}_p}\) be vectors in \({\mathbb{R}^n}\), and let Q be an \(m \times n\) matrix. Write the matrix \(\begin{pmatrix} Q{\mathbf{r}_1} & \cdots & Q{\mathbf{r}_p} \end{pmatrix}\) as a product of two matrices (neither of which is an identity matrix).

Let A be an invertible \(n \times n\) matrix, and let \(B\) be an \(n \times p\) matrix. Explain why \({A^{ - 1}}B\) can be computed by row reduction: if \(\begin{pmatrix} A & B \end{pmatrix} \sim \cdots \sim \begin{pmatrix} I & X \end{pmatrix}\), then \(X = {A^{ - 1}}B\).

If A is larger than \(2 \times 2\), then row reduction of \(\begin{pmatrix} A & B \end{pmatrix}\) is much faster than computing both \({A^{ - 1}}\) and \({A^{ - 1}}B\).
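The row-reduction idea in the preceding exercise can be sketched in a few lines of pure Python: Gauss-Jordan reduce the augmented matrix \([A \;\; B]\) to \([I \;\; X]\) and read off \(X = A^{-1}B\). The function name `solve_AinvB` and the sample matrices are illustrative assumptions; a minimal sketch with partial pivoting, not a production solver.

```python
def solve_AinvB(A, B):
    """Gauss-Jordan reduce [A | B] (A invertible n x n, B n x p) to [I | X].

    Returns X = A^{-1} B without ever forming A^{-1} explicitly.
    """
    n = len(A)
    # Build the augmented matrix [A | B] with float entries.
    M = [list(map(float, A[i])) + list(map(float, B[i])) for i in range(n)]
    for col in range(n):
        # Partial pivoting: pick the row with the largest entry in this column.
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        # Scale the pivot row so the pivot entry becomes 1.
        pivval = M[col][col]
        M[col] = [x / pivval for x in M[col]]
        # Eliminate this column from every other row.
        for r in range(n):
            if r != col and M[r][col] != 0.0:
                factor = M[r][col]
                M[r] = [a - factor * b for a, b in zip(M[r], M[col])]
    # The right-hand block is X.
    return [row[n:] for row in M]

A = [[2, 1], [1, 3]]
B = [[5], [10]]
X = solve_AinvB(A, B)   # A times X reproduces B
```

Avoiding the explicit inverse is exactly the speedup the exercise alludes to: one reduction of \([A \;\; B]\) replaces inverting A and then multiplying.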

Suppose the transfer function W(s) in Exercise 19 is invertible for some s. It can be shown that the inverse transfer function \(W{\left( s \right)^{ - 1}}\), which transforms outputs into inputs, is the Schur complement of \(A - BC - s{I_n}\) for the matrix below. Find the Schur complement. See Exercise 15.

\(\begin{bmatrix} A - BC - s{I_n} & B \\ -C & {I_m} \end{bmatrix}\)

Suppose the second column of B is all zeros. What can you say about the second column of AB?
