\section*{Part 0: Vector Basics}
\definition{Vectors}
An $n$-dimensional \textit{vector} is an element of $\mathbb{R}^n$. In this handout, we'll write vectors as columns. \par
For example, $\left[\begin{smallmatrix} 1 \\ 3 \\ 2 \end{smallmatrix}\right]$ is a vector in $\mathbb{R}^3$.
\definition{Euclidean norm}
The length (or \textit{Euclidean norm}) of an $n$-dimensional vector $v$, written $|v|$, is computed as follows:
\begin{equation*}
|v| = \sqrt{v_1^2 + ... + v_n^2}
\end{equation*}
where $v_1$ through $v_n$ are the individual components of $v$. For example,
\begin{equation*}
\left|\left[\begin{smallmatrix} 1 \\ 3 \\ 2 \end{smallmatrix}\right]\right| = \sqrt{1^2 + 3^2 + 2^2} = \sqrt{14}
\end{equation*}
\definition{Transpose}
The \textit{transpose} of a vector $v$, written $v^\text{T}$, is given as follows:
\begin{equation*}
\left[\begin{smallmatrix} 1 \\ 3 \\ 2 \end{smallmatrix}\right]^\text{T}
=
\left[\begin{smallmatrix} 1 & 3 & 2 \end{smallmatrix}\right]
\end{equation*}
That is, we rewrite the vector with its rows as columns and its columns as rows. \par
We can transpose matrices too, of course, but we'll get to that later.
\problem{}
What is the length of $\left[\frac{1}{\sqrt{2}}, \frac{1}{\sqrt{2}}\right]^\text{T}$? \par
\vfill
\definition{}
We say a vector $v$ is a \textit{unit vector} or a \textit{normalized} vector if $|v| = 1$.
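For example, $\left[\begin{smallmatrix} 1 \\ 0 \end{smallmatrix}\right]$ is a unit vector, since $\sqrt{1^2 + 0^2} = 1$.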
\pagebreak
\definition{Vector products}
The \textit{dot product} of two $n$-dimensional vectors $v$ and $u$ is computed as follows:
\begin{equation*}
v \cdot u = v_1u_1 + v_2u_2 + ... + v_nu_n
\end{equation*}
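For example, reusing the vector $\left[\begin{smallmatrix} 1 \\ 3 \\ 2 \end{smallmatrix}\right]$ from before,
\begin{equation*}
\left[\begin{smallmatrix} 1 \\ 3 \\ 2 \end{smallmatrix}\right]
\cdot
\left[\begin{smallmatrix} 2 \\ 0 \\ 1 \end{smallmatrix}\right]
= (1 \times 2) + (3 \times 0) + (2 \times 1)
= 4
\end{equation*}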
\vfill
\definition{Vector angles}<vectorangle>
For any two nonzero vectors $a$ and $b$ with angle $\phi$ between them, the following holds:
\null\hfill
\begin{minipage}{0.48\textwidth}
\begin{equation*}
\cos{(\phi)} = \frac{a \cdot b}{|a| \times |b|}
\end{equation*}
\end{minipage}
\hfill
\begin{minipage}{0.48\textwidth}
\begin{center}
\begin{tikzpicture}[scale=1.5]
\draw[->] (0, 0) -- (0.707, 0.707);
\draw[->, gray] (0.5, 0.0) arc (0:45:0.5);
\node[gray] at (0.6, 0.22) {$\phi$};
\draw[->] (0, 0) -- (1.2, 0);
\node[right] at (1.2, 0) {$a$};
\node[right] at (0.707, 0.707) {$b$};
\end{tikzpicture}
\end{center}
\end{minipage}
\hfill\null
This can easily be shown using the law of cosines. \par
For the sake of time, we'll skip the proof---it isn't directly relevant to this handout.
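\par
For example, if $a = \left[\begin{smallmatrix} 1 \\ 0 \end{smallmatrix}\right]$ and $b = \left[\begin{smallmatrix} 1 \\ 1 \end{smallmatrix}\right]$, which point in the same directions as the vectors in the figure above, then
\begin{equation*}
\cos{(\phi)} = \frac{a \cdot b}{|a| \times |b|} = \frac{1}{1 \times \sqrt{2}} = \frac{1}{\sqrt{2}}
\end{equation*}
so $\phi = 45^\circ$.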
\definition{Orthogonal vectors}
We say two vectors are \textit{perpendicular} or \textit{orthogonal} if the angle between them is $90^\circ$. \par
Note that this definition works with vectors of any dimension.
\note{
In fact, we don't need to think about other dimensions: two vectors in an $n$-dimensional space nearly always
define a unique two-dimensional plane (with two exceptions: $\phi = 0^\circ$ and $\phi = 180^\circ$).
}
\problem{}
What is the dot product of two orthogonal vectors?
\vfill
\pagebreak
\definition{Linear combinations}
A \textit{linear combination} of two or more vectors $v_1, v_2, ..., v_k$ is the weighted sum
\begin{equation*}
a_1v_1 + a_2v_2 + ... + a_kv_k
\end{equation*}
where $a_i$ are arbitrary real numbers.
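For example,
\begin{equation*}
2 \left[\begin{smallmatrix} 1 \\ 0 \end{smallmatrix}\right]
+ 3 \left[\begin{smallmatrix} 0 \\ 1 \end{smallmatrix}\right]
= \left[\begin{smallmatrix} 2 \\ 3 \end{smallmatrix}\right]
\end{equation*}
is a linear combination of $\left[\begin{smallmatrix} 1 \\ 0 \end{smallmatrix}\right]$ and $\left[\begin{smallmatrix} 0 \\ 1 \end{smallmatrix}\right]$ with weights $a_1 = 2$ and $a_2 = 3$.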
\definition{Linear dependence}
We say a set of vectors $\{v_1, v_2, ..., v_k\}$ is \textit{linearly dependent} if we can write $0$ as a nontrivial
linear combination of these vectors, and \textit{linearly independent} otherwise. For example, the following set is linearly dependent
\begin{equation*}
\Bigl\{
\left[\begin{smallmatrix} 1 \\ 0 \end{smallmatrix}\right],
\left[\begin{smallmatrix} 0 \\ 1 \end{smallmatrix}\right],
\left[\begin{smallmatrix} 0.5 \\ 0.5 \end{smallmatrix}\right]
\Bigr\}
\end{equation*}
since $
\left[\begin{smallmatrix} 1 \\ 0 \end{smallmatrix}\right] +
\left[\begin{smallmatrix} 0 \\ 1 \end{smallmatrix}\right] -
2 \left[\begin{smallmatrix} 0.5 \\ 0.5 \end{smallmatrix}\right]
= 0
$. A graphical representation of this is below.
\null\hfill
\begin{minipage}{0.48\textwidth}
\begin{center}
\begin{tikzpicture}[scale=1]
\fill[color = black] (0, 0) circle[radius=0.05];
\node[right] at (1, 0) {$\left[\begin{smallmatrix} 1 \\ 0 \end{smallmatrix}\right]$};
\node[above] at (0, 1) {$\left[\begin{smallmatrix} 0 \\ 1 \end{smallmatrix}\right]$};
\draw[->] (0, 0) -- (1, 0);
\draw[->] (0, 0) -- (0, 1);
\draw[->] (0, 0) -- (0.5, 0.5);
\node[above right] at (0.5, 0.5) {$\left[\begin{smallmatrix} 0.5 \\ 0.5 \end{smallmatrix}\right]$};
\end{tikzpicture}
\end{center}
\end{minipage}
\hfill
\begin{minipage}{0.48\textwidth}
\begin{center}
\begin{tikzpicture}[scale=1]
\fill[color = black] (0, 0) circle[radius=0.05];
\node[below] at (0.5, 0) {$\left[\begin{smallmatrix} 1 \\ 0 \end{smallmatrix}\right]$};
\node[right] at (1, 0.5) {$\left[\begin{smallmatrix} 0 \\ 1 \end{smallmatrix}\right]$};
\draw[->] (0, 0) -- (0.95, 0);
\draw[->] (1, 0) -- (1, 0.95);
\draw[->] (1, 1) -- (0.55, 0.55);
\draw[->] (0.5, 0.5) -- (0.05, 0.05);
\node[above left] at (0.5, 0.5) {$-2\left[\begin{smallmatrix} 0.5 \\ 0.5 \end{smallmatrix}\right]$};
\end{tikzpicture}
\end{center}
\end{minipage}
\hfill\null
\problem{}
Find a linearly independent set of vectors in $\mathbb{R}^3$.
\vfill
\definition{Coordinates}
Say we have a set of linearly independent vectors $B = \{b_1, ..., b_k\}$. \par
We can write linear combinations of the vectors in $B$ as \textit{coordinates} with respect to this set:
\vspace{2mm}
If we have a vector $v = x_1b_1 + x_2b_2 + ... + x_kb_k$, we can write $v = (x_1, x_2, ..., x_k)$ with respect to $B$.
\vspace{4mm}
For example, take
$B = \biggl\{
\left[\begin{smallmatrix} 1 \\ 0 \\ 0 \end{smallmatrix}\right],
\left[\begin{smallmatrix} 0 \\ 1 \\ 0\end{smallmatrix}\right],
\left[\begin{smallmatrix} 0 \\ 0 \\ 1 \end{smallmatrix}\right]
\biggr\}$ and $v = \left[\begin{smallmatrix} 8 \\ 3 \\ 9 \end{smallmatrix}\right]$. \par
The coordinates of $v$ with respect to $B$ are, of course, $(8, 3, 9)$.
\problem{}
What are the coordinates of the vector $v$ above with respect to the basis
$B = \biggl\{
\left[\begin{smallmatrix} 1 \\ 0 \\ 1 \end{smallmatrix}\right],
\left[\begin{smallmatrix} 0 \\ 1 \\ 0\end{smallmatrix}\right],
\left[\begin{smallmatrix} 0 \\ 0 \\ -1 \end{smallmatrix}\right]
\biggr\}$?
\vfill
\pagebreak