\section{Approximation}

\generic{Observation:}

For small values of $a$, $\log_2(1 + a)$ is approximately equal to $a$. \par
Note that this approximation is exact for $a = 0$ and $a = 1$, since $\log_2(1) = 0$ and $\log_2(2) = 1$.
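In between, the two sides stay close: at $a = 0.5$, for example,
\begin{equation*}
\log_2(1 + 0.5) \approx 0.585
\qquad \text{versus} \qquad
a = 0.5,
\end{equation*}
so even at the midpoint the approximation is off by less than $0.1$. \par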
\vspace{2mm}
We'll add a \say{correction term} $\varepsilon$ to this approximation, so that $\log_2(1 + a) \approx a + \varepsilon$.
On $(0, 1)$, the graph of $\log_2(1 + a)$ lies strictly above the line $y = a$: the function is concave and the two agree at the endpoints, so the plain approximation always underestimates. Adding a small positive $\varepsilon$ shifts the line up, giving up exactness at the endpoints in exchange for a smaller error in the middle. \par
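To see how big this gap can get, maximize $f(a) = \log_2(1 + a) - a$ by setting its derivative to zero:
\begin{equation*}
f'(a) = \frac{1}{(1 + a)\ln 2} - 1 = 0
\qquad \Longrightarrow \qquad
a = \frac{1}{\ln 2} - 1 \approx 0.443,
\end{equation*}
where the gap is $f(a) \approx 0.086$. Any sensible correction term therefore lies somewhere between $0$ and about $0.086$. \par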
\problem{}
Use the fact that $\log_2(1 + a) \approx a + \varepsilon$ to approximate $\log_2(x_f)$ in terms of $x_i$. \par
\vspace{5mm}
Namely, show that
\begin{equation*}
\log_2(x_f) ~=~ \frac{1}{2^{23}}(x_i) - 127 + \varepsilon
\end{equation*}
for some error term $\varepsilon$.
\begin{solution}
Let $E$ and $F$ be the exponent and fraction bits of $x_f$. \par
We then have:
\begin{align*}
\log_2(x_f)
&=~ \log_2 \left( 2^{E-127} \times \left(1 + \frac{F}{2^{23}}\right) \right) \\
&=~ E-127 + \log_2\left(1 + \frac{F}{2^{23}}\right) \\
&\approx~ E-127 + \frac{F}{2^{23}} + \varepsilon \\
&=~ \frac{1}{2^{23}}(2^{23}E + F) - 127 + \varepsilon \\
&=~ \frac{1}{2^{23}}(x_i) - 127 + \varepsilon
\end{align*}
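The final substitution uses the fact that reading the bits of $x_f$ as an integer gives $x_i = 2^{23}E + F$. As a quick sanity check, take $x_f = 2.0$, so that $E = 128$ and $F = 0$:
\begin{equation*}
\frac{1}{2^{23}}(x_i) - 127 = \frac{128 \cdot 2^{23}}{2^{23}} - 127 = 1 = \log_2(2.0),
\end{equation*}
matching the observation that the underlying approximation is exact when $a = 0$.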
\end{solution}
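The same computation is easy to try on an actual \texttt{float}. The sketch below (a minimal illustration; the test value and variable names are arbitrary) copies the bits of $x_f$ into an integer $x_i$ with \texttt{memcpy}, then compares $\frac{1}{2^{23}} x_i - 127$, with no correction term, against the true value of $\log_2(x_f)$:

\begin{verbatim}
#include <math.h>
#include <stdint.h>
#include <stdio.h>
#include <string.h>

int main(void) {
    float x_f = 3.7f;

    /* Read the bits of x_f as a 32-bit integer; this is x_i. */
    uint32_t x_i;
    memcpy(&x_i, &x_f, sizeof x_i);

    /* Approximate log2(x_f) as x_i / 2^23 - 127, correction term omitted. */
    double approx = x_i / (double)(1 << 23) - 127.0;

    printf("approx = %f   exact = %f\n", approx, log2(x_f));
    return 0;
}
\end{verbatim}

For $x_f = 3.7$ this prints an approximation of roughly $1.850$ against an exact value of about $1.888$; the difference is the sort of gap the correction term $\varepsilon$ is meant to absorb.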
\vfill
\pagebreak