\section{Probability}

\definition{}
A \textit{sample space} is a finite set $\Omega$. \par
The elements of this set are called \textit{outcomes}. \par
An \textit{event} is a set of outcomes (i.e., a subset of $\Omega$).
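
For example, a single roll of a six-sided die has the sample space
$\Omega = \{1, 2, 3, 4, 5, 6\}$, and \say{we roll an even number} is the event $\{2, 4, 6\}$.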

\definition{}
A \textit{probability function} over a sample space $\Omega$ is a function $\mathcal{P}: P(\Omega) \to [0, 1]$ \par
that maps events to real numbers between 0 and 1. \par
Any probability function has the following properties:
\begin{itemize}
\item $\mathcal{P}(\varnothing) = 0$
\item $\mathcal{P}(\Omega) = 1$
\item For events $A$ and $B$ where $A \cap B = \varnothing$, $\mathcal{P}(A \cup B) = \mathcal{P}(A) + \mathcal{P}(B)$
\end{itemize}
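
For example, for one flip of a fair coin we have $\Omega = \{\texttt{H}, \texttt{T}\}$, and the probability function is given by
$\mathcal{P}(\varnothing) = 0$, ~ $\mathcal{P}(\{\texttt{H}\}) = \mathcal{P}(\{\texttt{T}\}) = \nicefrac{1}{2}$, ~ and $\mathcal{P}(\Omega) = 1$.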

\problem{}<threecoins>
Say we flip a fair coin three times. \par
List all elements of the sample space $\Omega$ this experiment generates.

\vfill

\problem{}
Using the same setup as \ref{threecoins}, find the following:
\begin{itemize}
\item $\mathcal{P}(~ \{\omega \in \Omega ~|~ \omega \text{ has at least two \say{heads}}\} ~)$
\item $\mathcal{P}(~ \{\omega \in \Omega ~|~ \omega \text{ has an odd number of \say{heads}}\} ~)$
\item $\mathcal{P}(~ \{\omega \in \Omega ~|~ \omega \text{ has at least one \say{tails}}\} ~)$
\end{itemize}

\vfill
\pagebreak

%
% MARK: Page
%

\definition{}
Given a sample space $\Omega$ and a probability function $\mathcal{P}$, \par
a \textit{random variable} is a function from $\Omega$ to a specified output set.

\vspace{2mm}

For example, given the three-coin-toss sample space
$\Omega = \{
\texttt{TTT},~ \texttt{TTH},~ \texttt{THT},~
\texttt{THH},~ \texttt{HTT},~ \texttt{HTH},~
\texttt{HHT},~ \texttt{HHH}
\}$,
we can define a random variable $\mathcal{H}$ as \say{the number of heads in a throw of three coins}. \par
As a function, $\mathcal{H}$ maps values in $\Omega$ to values in $\mathbb{Z}^+_0$ and is defined as:
\begin{itemize}
\item $\mathcal{H}(\texttt{TTT}) = 0$
\item $\mathcal{H}(\texttt{TTH}) = 1$
\item $\mathcal{H}(\texttt{THT}) = 1$
\item $\mathcal{H}(\texttt{THH}) = 2$
\item ...and so on.
\end{itemize}

Intuitively, a random variable assigns a \say{value} in $\mathbb{R}$ to every possible outcome.

\definition{}
We can compute the probability that a random variable takes a certain value by computing the probability of
the set of outcomes that produce that value. \par

\vspace{2mm}

For example, if we wanted to compute $\mathcal{P}(\mathcal{H} = 2)$, we would find
$\mathcal{P}\bigl(\{\texttt{THH}, \texttt{HTH}, \texttt{HHT}\}\bigr)$.
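With a fair coin, each of the eight outcomes has probability $\nicefrac{1}{8}$, so this probability is $\nicefrac{3}{8}$.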

\problem{}
Say we flip a coin with $\mathcal{P}(\texttt{H}) = \nicefrac{1}{3}$ three times. \par
What is $\mathcal{P}(\mathcal{H} = 1)$, with $\mathcal{H}$ defined as above? \par
What is $\mathcal{P}(\mathcal{H} = 5)$?

\vfill

\problem{}
Say we roll a fair six-sided die twice. \par
Let $\mathcal{X}$ be a random variable measuring the sum of the two results. \par
Find $\mathcal{P}(\mathcal{X} = x)$ for all $x$ in $\mathbb{Z}$.

\vfill
\pagebreak

%
% MARK: Page
%

\definition{}<defexp>
Say we have a random variable $\mathcal{X}$ that produces outputs in $\mathbb{R}$. \par
The \textit{expected value} of $\mathcal{X}$ is then defined as
\begin{equation*}
\mathcal{E}(\mathcal{X})
~\coloneqq~ \sum_{x \in \mathbb{R}}\Bigl(x \times \mathcal{P}\bigl(\mathcal{X} = x\bigr)\Bigr)
~=~ \sum_{\omega \in \Omega}\Bigl(\mathcal{X}(\omega) \times \mathcal{P}(\omega)\Bigr)
\end{equation*}
That is, $\mathcal{E}(\mathcal{X})$ is the average of all possible outputs of $\mathcal{X}$ weighted by their probability.
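
For example, for $\mathcal{H}$ over three flips of a fair coin,
\begin{equation*}
\mathcal{E}(\mathcal{H})
~=~ 0 \times \nicefrac{1}{8}
+ 1 \times \nicefrac{3}{8}
+ 2 \times \nicefrac{3}{8}
+ 3 \times \nicefrac{1}{8}
~=~ \nicefrac{3}{2}
\end{equation*}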

\problem{}
Say we flip a coin with $\mathcal{P}(\texttt{H}) = \nicefrac{1}{3}$ two times. \par
Define $\mathcal{H}$ as the number of heads we see. \par
Find $\mathcal{E}(\mathcal{H})$.

\vfill

\problem{}
Let $\mathcal{A}$ and $\mathcal{B}$ be two random variables. \par
Show that $\mathcal{E}(\mathcal{A} + \mathcal{B}) = \mathcal{E}(\mathcal{A}) + \mathcal{E}(\mathcal{B})$.

\begin{solution}
Use the second definition of $\mathcal{E}$, $\sum_{\omega \in \Omega}\Bigl(\mathcal{X}(\omega) \times \mathcal{P}(\omega)\Bigr)$.
Since $(\mathcal{A} + \mathcal{B})(\omega) = \mathcal{A}(\omega) + \mathcal{B}(\omega)$, the sum splits:
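\begin{equation*}
\mathcal{E}(\mathcal{A} + \mathcal{B})
~=~ \sum_{\omega \in \Omega}\Bigl(\bigl(\mathcal{A}(\omega) + \mathcal{B}(\omega)\bigr) \times \mathcal{P}(\omega)\Bigr)
~=~ \sum_{\omega \in \Omega}\Bigl(\mathcal{A}(\omega) \times \mathcal{P}(\omega)\Bigr)
+ \sum_{\omega \in \Omega}\Bigl(\mathcal{B}(\omega) \times \mathcal{P}(\omega)\Bigr)
~=~ \mathcal{E}(\mathcal{A}) + \mathcal{E}(\mathcal{B})
\end{equation*}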

\vspace{2mm}

Make sure students understand all parts of \ref{defexp}, and are comfortable with the fact that a random variable \say{assigns values} to outcomes.
\end{solution}

\vfill

\definition{}
Let $A$ and $B$ be events on a sample space $\Omega$. \par
We say that $A$ and $B$ are \textit{independent} if $\mathcal{P}(A \cap B) = \mathcal{P}(A) \times \mathcal{P}(B)$. \par
Intuitively, events $A$ and $B$ are independent if the outcome of one does not affect the other.
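
For example, flip a fair coin twice and let $A$ be the event \say{the first flip is heads}
and $B$ the event \say{the second flip is heads}. \par
Then $\mathcal{P}(A \cap B) = \nicefrac{1}{4} = \mathcal{P}(A) \times \mathcal{P}(B)$, so $A$ and $B$ are independent.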

\definition{}
Let $\mathcal{A}$ and $\mathcal{B}$ be two random variables over $\Omega$. \par
We say that $\mathcal{A}$ and $\mathcal{B}$ are independent if the events $\{\omega \in \Omega ~|~ \mathcal{A}(\omega) = a\}$
and $\{\omega \in \Omega ~|~ \mathcal{B}(\omega) = b\}$ are independent for all $(a, b)$ that $\mathcal{A}$ and $\mathcal{B}$ can produce.
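
For example, in the fair three-coin toss, the number of heads among the first two coins is independent of
the number of heads on the third coin, but neither is independent of the total $\mathcal{H}$: \par
knowing that $\mathcal{H} = 0$ forces every coin to show tails.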

\pagebreak