\section{Matrices}

\definition{}
A \textit{matrix} is a two-dimensional array of numbers: \\
$$
A =
\begin{bmatrix}
1 & 2 & 3 \\
4 & 5 & 6
\end{bmatrix}
$$
The above matrix has two rows and three columns. It is thus a $2 \times 3$ matrix.

\problem{}
Draw a $3 \times 2$ matrix.

\vfill

\definition{Matrices as Transformations}
We can define the \say{product\footnotemark{}} of a matrix $A$ and a vector $v$:
\footnotetext{This is an uncommon word to use in this context. You will soon see why.}
$$
Av =
\begin{bmatrix}
1 & 2 & 3 \\
4 & 5 & 6
\end{bmatrix}
\begin{bmatrix}
a \\ b \\ c
\end{bmatrix}
=
\begin{bmatrix}
1a + 2b + 3c \\
4a + 5b + 6c
\end{bmatrix}
$$
Look closely. Each element of the resulting $2 \times 1$ matrix is the dot product of a row of $A$ with $v$:

$$
Av =
\begin{bmatrix}
\text{---} r_1 \text{---} \\
\text{---} r_2 \text{---}
\end{bmatrix}
\begin{bmatrix}
| \\
v \\
|
\end{bmatrix}
=
\begin{bmatrix}
r_1 v \\
r_2 v
\end{bmatrix}
$$

Naturally, a vector can only be multiplied by a matrix if the number of rows in the vector equals the number of columns in the matrix.
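
For example, here is one such product worked out in full (the entries below are chosen arbitrarily, not taken from the problems in this section):
$$
\begin{bmatrix}
1 & 0 \\
2 & 3
\end{bmatrix}
\begin{bmatrix}
4 \\ 5
\end{bmatrix}
=
\begin{bmatrix}
1 \cdot 4 + 0 \cdot 5 \\
2 \cdot 4 + 3 \cdot 5
\end{bmatrix}
=
\begin{bmatrix}
4 \\ 23
\end{bmatrix}
$$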

\problem{}
Compute the following \say{product}:

$$
\begin{bmatrix}
2 & 9 \\
7 & 5 \\
3 & 4
\end{bmatrix}
\begin{bmatrix}
5 \\ 3
\end{bmatrix}
$$

\vfill
\pagebreak


\generic{Remark:}
It is a bit more interesting to think of matrix-vector multiplication in the following way: \\

\begin{minipage}[t]{0.48\textwidth}\vspace{0pt}
\begin{center}
The problem:
\vspace{2mm}

$$
\begin{bmatrix}
2 & 9 \\
7 & 5 \\
3 & 4
\end{bmatrix}
\begin{bmatrix}
5 \\ 3
\end{bmatrix}
=
\begin{bmatrix}
37 \\ 50 \\ 27
\end{bmatrix}
$$
\end{center}
\end{minipage}%
\hfill
\begin{minipage}[t]{0.48\textwidth}\vspace{0pt}
\begin{center}
Top-input, right-output:
\vspace{2mm}

\begin{tikzpicture}[>=stealth,thick,baseline]
\matrix [
matrix of math nodes,
left delimiter={[},
right delimiter={]}
] (A) {
2 & 9 \\
7 & 5 \\
3 & 4 \\
};

\node[
fit=(A-1-1)(A-1-1),
inner xsep=0mm,inner ysep=3mm,
label=above:5
] (L) {};
\draw[->, gray] (L.north) -- ([yshift=0mm]A-1-1.north);

\node[
fit=(A-1-2)(A-1-2),
inner xsep=0mm,inner ysep=3mm,
label=above:3
] (R) {};
\draw[->, gray] (R.north) -- ([yshift=0mm]A-1-2.north);

\node[
fit=(A-1-2)(A-1-2),
inner xsep=8mm,inner ysep=0mm,
label=right:{$10 + 27 = 37$}
] (Y) {};
\draw[->, gray] ([xshift=3mm]A-1-2.east) -- (Y);

\node[
fit=(A-2-2)(A-2-2),
inner xsep=8mm,inner ysep=0mm,
label=right:{$35 + 15 = 50$}
] (H) {};
\draw[->, gray] ([xshift=3mm]A-2-2.east) -- (H);

\node[
fit=(A-3-2)(A-3-2),
inner xsep=8mm,inner ysep=0mm,
label=right:{$15 + 12 = 27$}
] (N) {};
\draw[->, gray] ([xshift=3mm]A-3-2.east) -- (N);
\end{tikzpicture}
\end{center}
\end{minipage}%

\vspace{2mm}

Be aware that this is only a model for intuition. \\
Make sure you understand the dot product definition on the previous page.

\vspace{5mm}

\theorem{}<thebigtheorem>
Any linear map $T: \mathbb{R}^n \to \mathbb{R}^m$ can be written as an $m \times n$ matrix. \\
Conversely, every $m \times n$ matrix represents a linear map $T: \mathbb{R}^n \to \mathbb{R}^m$. \\

\vspace{2mm}

In other words, \textbf{matrices are linear transformations}. \\
If you learn only one thing today, this should be it.
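
For example (one concrete map, chosen arbitrarily to illustrate the statement): the map $T: \mathbb{R}^2 \to \mathbb{R}^3$ given by $T(x, y) = (x + y,\ 2x,\ 3y)$ is linear, and it corresponds to the $3 \times 2$ matrix
$$
\begin{bmatrix}
1 & 1 \\
2 & 0 \\
0 & 3
\end{bmatrix}
$$
in the sense that multiplying this matrix by the column vector with entries $x$ and $y$ gives $T(x, y)$.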

\vfill

\problem{}<prooffwd>
Show that the transformation $T: \mathbb{R}^n \to \mathbb{R}^m$ defined by $T(v) = Av$ is linear. \\
Before you start, answer the following questions:
\begin{itemize}
\item What is $A$?
\item What is $v$?
\item What are their sizes?
\end{itemize}

\vfill

\problem{}<proofback>
Show that any linear transformation can be written as a matrix.

\vfill
\pagebreak


\problem{}
Does \ref{thebigtheorem} hold in arbitrary vector spaces? \\
Repeat \ref{prooffwd} and \ref{proofback} using only axioms.

\vfill
\pagebreak