
EEL6825: HW#1

Due Friday, September 10, 1999 in class. Late homework will lose $e^{\char93 ~of~days~late} - 1$ percentage points. To see the current late penalty, click on
http://www.cnel.ufl.edu/analog/harris/latepoints.html A computer is not necessary for this assignment.

1.
Three one-dimensional distributions are given as uniform in [-1/3,1/3] for $\omega_1$, uniform in [-1/2,1/2] for $\omega_2$ and uniform in [-1,1] for $\omega_3$. Assume the a priori probabilities are equal.
(a)
Compute $P(\omega_i\vert x)$ for each class and sketch each function on a separate plot.
(b)
Implement a Bayes classifier for the three distributions. Be sure to state the decided class for each possible value of $x$.
(c)
Compute the Bayes error.
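Although a computer is not necessary, a short numerical sketch like the following (the helper names and grid resolution are my own, not part of the assignment) can be used to check the posteriors and the Bayes error you derive analytically:

```python
import numpy as np

# Class-conditional densities: uniform on [-1/3, 1/3], [-1/2, 1/2], [-1, 1].
def densities(x):
    return np.array([
        np.where(np.abs(x) <= 1/3, 1.5, 0.0),
        np.where(np.abs(x) <= 1/2, 1.0, 0.0),
        np.where(np.abs(x) <= 1.0, 0.5, 0.0),
    ])

def posteriors(x):
    """P(omega_i | x); with equal priors the prior cancels in the ratio."""
    p = densities(x)
    total = p.sum(axis=0)
    with np.errstate(invalid="ignore", divide="ignore"):
        return np.where(total > 0, p / total, 0.0)

# Bayes error = integral of (1 - max_i P(omega_i|x)) * p(x) dx,
# approximated with a midpoint rule on a fine grid over the support [-1, 1].
N = 1_000_000
x = np.linspace(-1, 1, N, endpoint=False) + 1 / N  # cell midpoints
post = posteriors(x)
mix = densities(x).sum(axis=0) / 3  # mixture density p(x), equal priors 1/3
bayes_error = np.mean((1 - post.max(axis=0)) * mix) * 2  # interval length 2
print(f"estimated Bayes error: {bayes_error:.4f}")
```

Comparing this estimate against your analytic value is a quick sanity check on parts (a)-(c).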

2.
Two normal distributions are characterized by:

\begin{displaymath}P(\omega_1)=P(\omega_2)=0.5\end{displaymath}


\begin{displaymath}\mu_1=
\left[
\begin{array}{c}
0 \\
1
\end{array}\right]
,
\mu_2=
\left[
\begin{array}{c}
0 \\
-1
\end{array}\right]
\end{displaymath}

Derive the analytic form and sketch the Bayes decision boundary for the following cases: (Also sketch some equi-probability contours for each distribution.)
(a)

\begin{displaymath}\Sigma_1=\Sigma_2=I\end{displaymath}

(b)

\begin{displaymath}\Sigma_1=I\end{displaymath}


\begin{displaymath}\Sigma_2=
\left[
\begin{array}{cc}
2&0 \\
0&1
\end{array}\right]
\end{displaymath}

(c)

\begin{displaymath}\Sigma_1=
\left[
\begin{array}{cc}
1&0.5 \\
0.5&1
\end{array}\right]
\end{displaymath}


\begin{displaymath}\Sigma_2=
\left[
\begin{array}{cc}
1&-0.5 \\
-0.5&1
\end{array}\right]
\end{displaymath}
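A numeric spot-check of the boundaries you derive can be sketched as follows (the `discriminant` helper is my own; it is the standard Gaussian log-discriminant, dropping terms common to both classes):

```python
import numpy as np

# Gaussian log-discriminant, up to additive terms shared by all classes:
# g_i(x) = -1/2 (x - mu_i)^T Sigma_i^{-1} (x - mu_i) - 1/2 ln|Sigma_i| + ln P(omega_i)
def discriminant(x, mu, sigma, prior=0.5):
    d = x - mu
    return (-0.5 * d @ np.linalg.inv(sigma) @ d
            - 0.5 * np.log(np.linalg.det(sigma))
            + np.log(prior))

mu1, mu2 = np.array([0.0, 1.0]), np.array([0.0, -1.0])
I2 = np.eye(2)

# Case (a): equal identity covariances, so the boundary should be the
# perpendicular bisector of the means, i.e. the line x2 = 0.
for x1 in np.linspace(-3.0, 3.0, 7):
    x = np.array([x1, 0.0])
    assert np.isclose(discriminant(x, mu1, I2), discriminant(x, mu2, I2))

# Case (b): unequal covariances make the boundary quadratic in x1.
# g(x) > 0 means "decide omega_1".
S2 = np.diag([2.0, 1.0])
g = lambda x: discriminant(x, mu1, I2) - discriminant(x, mu2, S2)
print(g(np.array([0.0, 0.0])), g(np.array([2.0, 0.0])))
```

Evaluating the sign of `g` on a grid is one way to confirm your sketches of the boundary and the equi-probability contours.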

3.
In many pattern classification problems one has the option either to assign the pattern to one of c classes or to reject it as unrecognizable. If the cost of rejection is not too high, rejection may be a desirable action. Let

\begin{displaymath}c_{ij} =
\left\{
\begin{array}{cll}
0 & i=j & i,j=1,\dots,c \\
\lambda_r & i=c+1 & \\
\lambda_s & {\rm otherwise} &
\end{array}\right.
\end{displaymath}

where $\lambda_r$ is the loss incurred for choosing the (c+1)th action of rejection, and $\lambda_s$ is the loss incurred for making a substitution error. Show that the minimum risk is obtained if we decide $\omega_i$ when $p(\omega_i\vert x) > p(\omega_j\vert x)$ for all $j$ and $p(\omega_i\vert x) > 1 - \lambda_r/\lambda_s$, and reject otherwise. What happens if $\lambda_r=0$? What happens if $\lambda_r>\lambda_s$?
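The decision rule to be proved can be sketched as a small function (a hypothetical helper of my own, not part of the assignment; ties in the argmax are broken arbitrarily):

```python
import numpy as np

# Minimum-risk decision with a reject option: pick the class with the
# largest posterior if that posterior exceeds 1 - lambda_r / lambda_s,
# otherwise take the (c+1)th action, rejection (returned as -1).
def classify_with_reject(posteriors, lam_r, lam_s):
    posteriors = np.asarray(posteriors, dtype=float)
    i = int(np.argmax(posteriors))
    if posteriors[i] > 1 - lam_r / lam_s:
        return i
    return -1  # reject

# When lambda_r = 0 the threshold is 1, so rejection is free and always taken;
# when lambda_r > lambda_s the threshold is negative, so nothing is rejected.
print(classify_with_reject([0.6, 0.3, 0.1], lam_r=0.5, lam_s=1.0))
print(classify_with_reject([0.6, 0.3, 0.1], lam_r=0.2, lam_s=1.0))
```

Experimenting with the two limiting cases of $\lambda_r$ in this sketch previews the answers to the two final questions.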

4.
Problem 2.6 in T&K (Neyman-Pearson Test)

5.
Problem 2.9 in T&K (Analytic form of Bayes error for normal distributions)


Dr John Harris
1999-12-10