Tom edits
parent 8925d5b823 · commit e43ff5e1fb
@@ -18,7 +18,7 @@
 \maketitle

-\input{parts/0 review.tex}
+\input{parts/0 probability.tex}
 \input{parts/1 intro.tex}
 \input{parts/2 secretary.tex}
 \input{parts/3 orderstat.tex}
@@ -1,4 +1,4 @@
-\section{Review}
+\section{Probability}

 \definition{}
 A \textit{sample space} is a finite set $\Omega$. \par
@@ -16,15 +16,14 @@ Any probability function has the following properties:
 \end{itemize}


-\problem{}
+\problem{}<threecoins>
 Say we flip a fair coin three times. \par
 List all elements of the sample space $\Omega$ this experiment generates.

 \vfill

 \problem{}
-Again, flip a fair coin three times. \par
-Find the following:
+Using the same setup as \ref{threecoins}, find the following:
 \begin{itemize}
 	\item $\mathcal{P}(~ \{\omega \in \Omega ~|~ \omega \text{ has at least two \say{heads}}\} ~)$
 	\item $\mathcal{P}(~ \{\omega \in \Omega ~|~ \omega \text{ has an odd number of \say{heads}}\} ~)$
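The two probabilities asked for above can be sanity-checked by brute force. A minimal sketch (editor's addition, not part of the worksheet) that enumerates the sample space of three fair coin flips:

```python
from itertools import product

# Omega for three fair coin flips: all 8 equally likely outcomes.
omega = list(product("HT", repeat=3))

def prob(event):
    # P(event) under the uniform probability function on Omega
    return sum(1 for w in omega if event(w)) / len(omega)

p_two_heads = prob(lambda w: w.count("H") >= 2)      # at least two heads
p_odd_heads = prob(lambda w: w.count("H") % 2 == 1)  # odd number of heads
print(p_two_heads, p_odd_heads)  # both 0.5
```

Each event contains exactly 4 of the 8 outcomes, so both probabilities are $\nicefrac{1}{2}$.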
@@ -94,7 +93,7 @@ Find $\mathcal{P}(\mathcal{X} = x)$ for all $x$ in $\mathbb{Z}$.


 \definition{}
-Say we have a random variable that produces outputs in a set $A$. \par
+Say we have a random variable $\mathcal{X}$ that produces outputs in $\mathbb{R}$. \par
 The \textit{expected value} of $\mathcal{X}$ is then defined as
 \begin{equation*}
 	\mathcal{E}(\mathcal{X})
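The equation is cut off by the hunk boundary, but the definition being edited is the standard one, $\mathcal{E}(\mathcal{X}) = \sum_x x \cdot \mathcal{P}(\mathcal{X} = x)$. A small sketch (editor's addition, assuming that formula) computing it for one roll of a fair six-sided die:

```python
from fractions import Fraction

# Expected value of a discrete random variable:
# E(X) = sum over outputs x of x * P(X = x).
die = {x: Fraction(1, 6) for x in range(1, 7)}  # fair six-sided die
expected = sum(x * p for x, p in die.items())
print(expected)  # 7/2
```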
@@ -118,7 +117,7 @@ Show that $\mathcal{E}(\mathcal{A} + \mathcal{B}) = \mathcal{E}(\mathcal{A}) + \

 \definition{}
 Let $A$ and $B$ be events on a sample space $\Omega$. \par
-We say that $A$ and $B$ are \textit{independent} if $\mathcal{E}(A \cap B) = \mathcal{P}(A) + \mathcal{P}(B)$. \par
+We say that $A$ and $B$ are \textit{independent} if $\mathcal{P}(A \cap B) = \mathcal{P}(A) \times \mathcal{P}(B)$. \par
 Intuitively, events $A$ and $B$ are independent if the outcome of one does not affect the other.

 \definition{}
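Recall that independence requires $\mathcal{P}(A \cap B) = \mathcal{P}(A) \times \mathcal{P}(B)$ (a product, not a sum). A quick check of the product rule (editor's addition, not part of the worksheet) on the two-coin sample space, with $A$ = "first flip is heads" and $B$ = "second flip is heads":

```python
from itertools import product
from fractions import Fraction

omega = list(product("HT", repeat=2))  # four equally likely outcomes

def prob(event):
    return Fraction(sum(1 for w in omega if event(w)), len(omega))

p_a = prob(lambda w: w[0] == "H")                     # P(A) = 1/2
p_b = prob(lambda w: w[1] == "H")                     # P(B) = 1/2
p_ab = prob(lambda w: w[0] == "H" and w[1] == "H")    # P(A and B) = 1/4
print(p_ab == p_a * p_b)  # True: the two flips are independent
```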
@@ -3,7 +3,7 @@
 \generic{Setup:}
 Suppose we toss a 6-sided die $n$ times. \par
 It is easy to detect the first time we roll a 6. \par
-What should we do if we want to annouce the \textit{last}?
+What should we do if we want to detect the \textit{last}?

 \problem{}<lastl>
 Given $l \leq n$, what is the probability that the last $l$
@@ -17,7 +17,8 @@ tosses of this die contain exactly one six? \par
 \vfill

 \problem{}
-For what value of $l$ is the probability in \ref{lastl} maximal?
+For what value of $l$ is the probability in \ref{lastl} maximal? \par
+The following table may help.

 \begin{center}
 	\begin{tabular}{|| c | c | c ||}
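The table (cut off by the hunk boundary) presumably tabulates the probability from \ref{lastl}, which for independent fair tosses is $l \cdot \frac{1}{6} \cdot \left(\frac{5}{6}\right)^{l-1}$. A sketch (editor's addition, assuming that formula) that reproduces it with exact arithmetic and finds the maximizing $l$:

```python
from fractions import Fraction

def p(l):
    # P(last l tosses contain exactly one six), assuming fair independent tosses
    return l * Fraction(1, 6) * Fraction(5, 6) ** (l - 1)

values = {l: p(l) for l in range(1, 11)}
best = max(values.values())
winners = [l for l, v in values.items() if v == best]
print(winners, round(float(best), 4))  # [5, 6] 0.4019
```

Exact fractions matter here: $l = 5$ and $l = 6$ tie exactly at $\nicefrac{3125}{7776}$, which floating point can miss.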
@@ -53,7 +54,8 @@ For what value of $l$ is the probability in \ref{lastl} maximal?

 \problem{}
 Finish your solution: \par
-In $n$ rolls of a six-sided die, when should we announce the last time we roll a 6? \par
+In $n$ rolls of a six-sided die, what strategy maximizes
+our chance of detecting the last $6$ that is rolled? \par
 What is the probability of our guess being right?

 \begin{solution}
@@ -229,20 +229,18 @@ Let $r = \frac{k-1}{n}$, the fraction of applicants we reject. Show that
 \vfill

 \problem{}
-With a bit of faily unpleasant calculus, we can show that
+With a bit of fairly unpleasant calculus, we can show that the following is true for large $n$:
 \begin{equation*}
-	\underset{n \rightarrow \infty}{\text{lim}}
 	\sum_{x=k}^{n}\frac{1}{x-1}
 	~\approx~ \text{ln}\Bigl(\frac{n}{k}\Bigr)
 \end{equation*}
-Use this fact to find $\underset{n \rightarrow \infty}{\text{lim}} \phi_n(k)$.~
-\hint{For large $n$, $\frac{k-1}{n} \approx \frac{k}{n}$.}
+Use this fact to find an approximation of $\phi_n(k)$ at large $n$ in terms of $r$. \par
+\hint{If $n$ is big, $\frac{k-1}{n} \approx \frac{k}{n}$.}

 \begin{solution}
 	\begin{equation*}
-		\underset{n \rightarrow \infty}{\text{lim}} \phi_n(k)
-		~=~ \underset{n \rightarrow \infty}{\text{lim}}
-		\Biggl( r \sum_{x = k}^{n}\left( \frac{1}{x-1} \right) \Biggr)
+		\phi_n(k)
+		~=~ r \sum_{x = k}^{n}\left( \frac{1}{x-1} \right)
 		~\approx~ r \times \text{ln}\left(\frac{n}{k}\right)
 		~=~ -r \times \text{ln}\left(\frac{k}{n}\right)
 		~\approx~ -r \times \text{ln}(r)
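The claimed approximation $\sum_{x=k}^{n} \frac{1}{x-1} \approx \text{ln}\bigl(\frac{n}{k}\bigr)$ is easy to check numerically. A sketch (editor's addition, not part of the worksheet):

```python
import math

# Compare the harmonic-style sum against ln(n/k) for one large-n case.
n, k = 100_000, 10_000
s = sum(1 / (x - 1) for x in range(k, n + 1))
print(round(s, 4), round(math.log(n / k), 4))  # estimate vs ln(n/k)
```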
@@ -252,7 +250,7 @@ Use this fact to find $\underset{n \rightarrow \infty}{\text{lim}} \phi_n(k)$.~
 \vfill

 \problem{}
-Find the $k$ that maximizes $\underset{n \rightarrow \infty}{\text{lim}} \phi_n(k)$. \par
+Find the $r$ that maximizes $\underset{n \rightarrow \infty}{\text{lim}} \phi_n$. \par
 Also, find the value of $\phi_n$ at this point. \par
 \note{If you aren't familiar with calculus, ask an instructor for help.}
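The calculus answer, $r = e^{-1}$ with maximum value $e^{-1}$, can be confirmed by a grid scan of $\phi(r) = -r \, \text{ln}(r)$. A sketch (editor's addition, not part of the worksheet):

```python
import math

def phi(r):
    # large-n success probability from the derivation above
    return -r * math.log(r)

# scan r on (0, 1) with step 1/10000 and take the maximizer
r_best = max((i / 10000 for i in range(1, 10000)), key=phi)
print(round(r_best, 3), round(phi(r_best), 3))  # ≈ 0.368 0.368
```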
@@ -270,7 +268,7 @@ Also, find the value of $\phi_n$ at this point. \par

 \vfill

-Following this strategy, we should thus expect to select the best candidate about $e^{-1} = 37\%$ of the time,
+Thus, the \say{look-then-leap} strategy with $r = e^{-1}$ should select the best candidate about $e^{-1} \approx 37\%$ of the time,
 \textit{regardless of $n$.} Our probability of success does not change as $n$ gets larger! \par
 \note{Recall that the random strategy succeeds with probability $\nicefrac{1}{n}$. \par
 That is, it quickly becomes small as $n$ gets large.}
@@ -85,14 +85,13 @@ Given some $y$, what is the probability that all five $\mathcal{X}_i$ are smalle


 \definition{}
-Say we have a random variable $\mathcal{X}$ which we observe $n$ times. \par
+Say we have a random variable $\mathcal{X}$ which we observe $n$ times. \note{(for example, we repeatedly roll a die)}
 We'll arrange these observations in increasing order, labeled $x_1 < x_2 < ... < x_n$. \par
 Under this definition, $x_i$ is called the \textit{$i^\text{th}$ order statistic}---the $i^\text{th}$ smallest sample of $\mathcal{X}$.


 \problem{}<ostatone>
-Say we have a random variable $\mathcal{X}$ uniformly distributed on $[0, 1]$. \par
-We take $5$ observations of $\mathcal{X}$. \par
+Say we have a random variable $\mathcal{X}$ uniformly distributed on $[0, 1]$, of which we take $5$ observations. \par
 Given some $y$, what is the probability that $x_5 < y$? How about $x_4 < y$?

 \begin{solution}
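For \ref{ostatone}, the closed forms are $\mathcal{P}(x_5 < y) = y^5$ (all five observations below $y$) and $\mathcal{P}(x_4 < y) = y^5 + 5y^4(1-y)$ (at least four below $y$). A Monte Carlo sketch (editor's addition, not part of the worksheet) checking both at $y = 0.7$:

```python
import random

random.seed(0)  # deterministic run
y, trials = 0.7, 100_000
c5 = c4 = 0
for _ in range(trials):
    xs = sorted(random.random() for _ in range(5))
    c5 += xs[4] < y  # largest of 5 uniform draws below y
    c4 += xs[3] < y  # second-largest below y
print(c5 / trials, y ** 5)                           # estimate vs exact
print(c4 / trials, y ** 5 + 5 * y ** 4 * (1 - y))    # estimate vs exact
```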