Edits & renames
parent bd33aeb739
commit 33e69a13d4
Advanced/Introduction to Quantum/parts/00 vectors.tex (new file)
@ -0,0 +1,110 @@
\section*{Prerequisite: Vector Basics}

\definition{Vectors}
An $n$-dimensional \textit{vector} is an element of $\mathbb{R}^n$. In this handout, we'll write vectors as columns. \par
For example, $\left[\begin{smallmatrix} 1 \\ 3 \\ 2 \end{smallmatrix}\right]$ is a vector in $\mathbb{R}^3$.


\definition{Euclidean norm}
The length of an $n$-dimensional vector $v$ is computed as follows:
\begin{equation*}
	|v| = \sqrt{v_0^2 + v_1^2 + \cdots + v_{n-1}^2}
\end{equation*}
where $v_0$ through $v_{n-1}$ are the individual components of this vector. For example,
\begin{equation*}
	\left|\left[\begin{smallmatrix} 1 \\ 3 \\ 2 \end{smallmatrix}\right]\right| = \sqrt{1^2 + 3^2 + 2^2} = \sqrt{14}
\end{equation*}


\definition{Transpose}
The \textit{transpose} of a vector $v$ is $v^\text{T}$, given as follows:
\begin{equation*}
	\left[\begin{smallmatrix} 1 \\ 3 \\ 2 \end{smallmatrix}\right]^\text{T}
	=
	\left[\begin{smallmatrix} 1 & 3 & 2 \end{smallmatrix}\right]
\end{equation*}
That is, we rewrite the vector with its rows as columns and its columns as rows. \par
We can transpose matrices too, of course, but we'll get to that later.


\problem{}
What is the length of $\left[\frac{1}{\sqrt{2}}, \frac{1}{\sqrt{2}}\right]^\text{T}$? \par

\vfill

\definition{}
We say a vector $v$ is a \textit{unit vector} or a \textit{normalized} vector if $|v| = 1$.

\pagebreak
\definition{Vector products}
The \textit{dot product} of two $n$-dimensional vectors $v$ and $u$ is computed as follows:
\begin{equation*}
	v \cdot u = v_0u_0 + v_1u_1 + \cdots + v_{n-1}u_{n-1}
\end{equation*}
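For instance (a worked computation added for illustration; the second vector is chosen arbitrarily):
\begin{equation*}
	\left[\begin{smallmatrix} 1 \\ 3 \\ 2 \end{smallmatrix}\right]
	\cdot
	\left[\begin{smallmatrix} 2 \\ 0 \\ 1 \end{smallmatrix}\right]
	= (1)(2) + (3)(0) + (2)(1) = 4
\end{equation*}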

\vfill

\definition{Vector angles}<vectorangle>
For any two vectors $a$ and $b$, the following holds:

\null\hfill
\begin{minipage}{0.48\textwidth}
	\begin{equation*}
		\cos{(\phi)} = \frac{a \cdot b}{|a| \times |b|}
	\end{equation*}
\end{minipage}
\hfill
\begin{minipage}{0.48\textwidth}
	\begin{center}
		\begin{tikzpicture}[scale=1.5]
			\draw[->] (0, 0) -- (0.707, 0.707);
			\draw[->, gray] (0.5, 0.0) arc (0:45:0.5);
			\node[gray] at (0.6, 0.22) {$\phi$};

			\draw[->] (0, 0) -- (1.2, 0);
			\node[right] at (1.2, 0) {$a$};

			\node[right] at (0.707, 0.707) {$b$};
		\end{tikzpicture}
	\end{center}
\end{minipage}
\hfill\null

This can easily be shown using the law of cosines. \par
For the sake of time, we'll skip the proof---it isn't directly relevant to this handout.
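As a quick sanity check (an example added here, with vectors chosen to match the figure above), take $a = \left[\begin{smallmatrix} 1 \\ 0 \end{smallmatrix}\right]$ and $b = \left[\begin{smallmatrix} 1 \\ 1 \end{smallmatrix}\right]$:
\begin{equation*}
	\cos{(\phi)} = \frac{(1)(1) + (0)(1)}{1 \times \sqrt{2}} = \frac{1}{\sqrt{2}},
	\qquad \text{so} \quad \phi = 45^\circ
\end{equation*}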

\definition{Orthogonal vectors}
We say two vectors are \textit{perpendicular} or \textit{orthogonal} if the angle between them is $90^\circ$. \par
Note that this definition works with vectors of any dimension.

\note{
	In fact, we don't need to think about other dimensions: two vectors in an $n$-dimensional space nearly always
	define a unique two-dimensional plane (with two exceptions: $\phi = 0^\circ$ and $\phi = 180^\circ$).
}


\problem{}
What is the dot product of two orthogonal vectors?


\vfill
\pagebreak


%For example, the set $\{[1,0,0], [0,1,0], [0,0,1]\}$ (which we usually call $\{x, y, z\}$)
%forms an orthonormal basis of $\mathbb{R}^3$. Every element of $\mathbb{R}^3$ can be written as a linear combination of these vectors:
%
%\begin{equation*}
%	\left[\begin{smallmatrix} a \\ b \\ c \end{smallmatrix}\right]
%	=
%	a \left[\begin{smallmatrix} 1 \\ 0 \\ 0 \end{smallmatrix}\right] +
%	b \left[\begin{smallmatrix} 0 \\ 1 \\ 0 \end{smallmatrix}\right] +
%	c \left[\begin{smallmatrix} 0 \\ 0 \\ 1 \end{smallmatrix}\right]
%\end{equation*}
%
%The tuple $[a,b,c]$ is called the \textit{coordinate} of a point with respect to this basis.


\vfill
\pagebreak
@ -142,56 +142,13 @@ We could, of course, mark the point \texttt{x} at $[1, 1]$, which is equal parts

\vspace{4mm}

But \texttt{x} isn't a member of $\mathbb{B}$; it's not a valid state. \par
Our bit is fully $\vec{e}_0$ or fully $\vec{e}_1$. By our current definitions, there's nothing in between.

\vspace{8mm}


\definition{Orthonormal Basis}
The unit vectors $\vec{e}_0$ and $\vec{e}_1$ form an \textit{orthonormal basis} of the plane $\mathbb{R}^2$. \par
Our bit is fully $\vec{e}_0$ or fully $\vec{e}_1$. By our current definitions, there's nothing in between. \par
\note{
	\say{ortho-} means \say{orthogonal}; \say{-normal} means \say{normalized,} which means length $= 1$. \\
}{
	Note that $\vec{e}_0$ and $\vec{e}_1$ are orthonormal by \textit{definition}. \\
	We don't have to prove anything; we simply defined them as such.
} \par

\vspace{2mm}

There's much more to say about basis vectors, but we don't need all the tools of linear algebra here. \par
We just need to understand that a set of $n$ orthogonal unit vectors defines an $n$-dimensional space. \par
This is fairly easy to think about: each vector corresponds to an axis of the space, and every point
in that space can be written as a \textit{linear combination} (i.e., a weighted sum) of these basis vectors.

\vspace{2mm}

For example, the set $\{[1,0,0], [0,1,0], [0,0,1]\}$ (which we usually call $\{x, y, z\}$)
forms an orthonormal basis of $\mathbb{R}^3$. Every element of $\mathbb{R}^3$ can be written as a linear combination of these vectors:

\begin{equation*}
	\left[\begin{smallmatrix} a \\ b \\ c \end{smallmatrix}\right]
	=
	a \left[\begin{smallmatrix} 1 \\ 0 \\ 0 \end{smallmatrix}\right] +
	b \left[\begin{smallmatrix} 0 \\ 1 \\ 0 \end{smallmatrix}\right] +
	c \left[\begin{smallmatrix} 0 \\ 0 \\ 1 \end{smallmatrix}\right]
\end{equation*}

The tuple $[a,b,c]$ is called the \textit{coordinate} of a point with respect to this basis.
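As a concrete instance (numbers chosen here purely for illustration):
\begin{equation*}
	\left[\begin{smallmatrix} 2 \\ 5 \\ 3 \end{smallmatrix}\right]
	= 2x + 5y + 3z,
	\qquad \text{so its coordinate with respect to } \{x, y, z\} \text{ is } [2, 5, 3].
\end{equation*}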

Note that the unit vectors $\vec{e}_0$ and $\vec{e}_1$ form an \textit{orthonormal basis} of the plane $\mathbb{R}^2$.
}

\vfill

\pagebreak
@ -232,7 +232,8 @@ $?


\problem{}
The vectors we found in \ref{basistp} are a basis of what space? \par
What is the \textit{span} of the vectors we found in \ref{basistp}? \par
In other words, what is the set of vectors that can be written as weighted sums of the vectors above?

\vfill
\pagebreak