Exam 1 - Review

The following is written by Gabriella Shalumov. I will add some remarks based on what we discussed in class, change a few things here and there and add some exercises.

Events and Bayes’ Theorem

Tip: What to know

  • Know how to compute probabilities if an experiment is described and the outcomes are equally likely to occur (a counting sketch in code follows this list):

  1. \(\displaystyle P(E)=\frac{n(\textrm{outcomes you like})}{n(\textrm{outcomes you can choose from})} = \frac{n(E)}{n(\Omega)}\)

  2. \(\displaystyle P(E_1 \cup E_2)=\frac{n(\textrm{outcomes you like})}{n(\textrm{outcomes you can choose from})}=\frac{n(\textrm{outcomes in one event})+n(\textrm{outcomes in the other event})-n(\textrm{outcomes that were counted twice})}{n(\textrm{all outcomes you can choose from})}=\frac{n(E_1)+n(E_2)-n(E_1E_2)}{n(\Omega)}\)

    2’. if the outcomes are not equally likely to occur:
    \(P(E_1 \cup E_2)=P(E_1)+P(E_2)-P(E_1E_2)\)

  3. \(\displaystyle P(E|F) = \frac{n(\textrm{outcomes you like})}{n(\textrm{outcomes you can choose from})} = \frac{n(EF)}{n(F)}\)

    3’. if the outcomes are not equally likely to occur:
    \(\displaystyle P(E|F)=\frac{P(EF)}{P(F)}\)

  • Know the axioms of probability in general:

  1. \(P(E)\geq 0\)

  2. \(P(\Omega)=1\)

  3. \(P(E\cup F)=P(E)+P(F)\), if \(EF=\emptyset\)

  4. \(P(E|F)P(F)=P(EF)\) (this is the multiplication rule; it corresponds to multiplying probabilities as you move along the branches of a probability tree)

  • Know Bayes’ theorem (a worked numeric example follows this list):
    \(\displaystyle \textrm{posterior}=\frac{\textrm{likelihood}\cdot \textrm{prior}}{\textrm{marginal}}\)

    \(\displaystyle P(E|F)=\frac{P(F|E)P(E)}{P(F)}\)

  • Know how to compute probabilities on contingency tables, know how to work with probability trees.

  • Know what independence of (multiple) events means.
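
As a concrete check of formulas 1–3 above, here is a minimal Python sketch using a two-dice experiment of my own choosing (the events below are illustrative assumptions, not from the notes). It computes \(P(E)\), \(P(E_1\cup E_2)\), and \(P(E|F)\) by counting equally likely outcomes:

```python
from itertools import product

# Sample space: the 36 equally likely outcomes of rolling two fair dice.
omega = list(product(range(1, 7), repeat=2))

# Illustrative events (assumptions for this sketch):
E1 = {w for w in omega if w[0] + w[1] == 7}   # the sum is 7
E2 = {w for w in omega if w[0] == w[1]}       # doubles
F  = {w for w in omega if w[0] == 3}          # the first die shows 3

# 1. P(E) = n(E) / n(Omega)
print(len(E1) / len(omega))                              # 6/36

# 2. P(E1 ∪ E2) = (n(E1) + n(E2) - n(E1 E2)) / n(Omega)
p_union = (len(E1) + len(E2) - len(E1 & E2)) / len(omega)
print(p_union, len(E1 | E2) / len(omega))                # both give 12/36

# 3. P(E|F) = n(EF) / n(F)
print(len(E1 & F) / len(F))                              # 1/6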
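
Similarly, a minimal sketch of Bayes’ theorem with a probability tree, assuming made-up numbers for a diagnostic-test story (the prevalence, sensitivity, and false-positive rate below are illustrative, not from the notes):

```python
# Illustrative numbers (assumptions for this sketch):
p_D = 0.01                 # prior P(D): prevalence of a condition
p_pos_given_D = 0.95       # likelihood P(+|D): test sensitivity
p_pos_given_notD = 0.10    # P(+|not D): false-positive rate

# Marginal P(+): multiply along each branch of the probability tree
# (rule 4 above), then add the two disjoint branches (axiom 3).
p_pos = p_pos_given_D * p_D + p_pos_given_notD * (1 - p_D)

# Posterior P(D|+) = P(+|D) * P(D) / P(+)
print(p_pos_given_D * p_D / p_pos)   # ≈ 0.0876
```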

Discrete Random Variables

There are countably many (possibly finitely many) outcomes. Each outcome is labeled by a number (by the random variable \(X\)), so the labels are discrete. For each label (i.e. each possible value of \(X\)), you can create a row in a probability distribution table, or its own bar in a histogram. (A numeric sketch follows the list below.)

  • \(\displaystyle E[X]=\sum_k k\cdot P(X=k)\)

    • \(\displaystyle E[f(X)]=\sum_k f(k)\cdot P(X=k)\)

    • \(\displaystyle E[X+Y]=E[X]+E[Y]\)

    • \(\displaystyle E[XY]=E[X]\cdot E[Y]\), if \(X\) and \(Y\) are independent, where \(X\) and \(Y\) are independent if \(P(X=k,Y=l)=P(X=k)\cdot P(Y=l)\) for any \(k,l\).
      (The converse is not true.)
      \(Cov(X,Y)=E[XY]-E[X]E[Y]=E\big[(X-E[X])(Y-E[Y])\big]\) measures the degree to which the two variables change together. \(Cov(X,Y)=0\) suggests there is no linear relationship between the two variables, but doesn’t necessarily mean the two variables are independent (they may have a nonlinear relationship).

  • \(\displaystyle Var(X)=\sum_k (k-E[X])^2\cdot P(X=k)=E[X^2]-(E[X])^2\)
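
To tie these definitions together, here is a minimal Python sketch that computes \(E[X]\), \(Var(X)\), and \(Cov(X,Y)\) directly from a joint probability table. The joint pmf is an assumption chosen for illustration; it is deliberately set up so that \(X\) and \(Y\) are dependent yet \(Cov(X,Y)=0\), matching the caveat above:

```python
from itertools import product

# Illustrative joint pmf P(X=k, Y=l) (an assumption for this sketch).
joint = {
    (0, 0): 0.2, (0, 1): 0.1,
    (1, 0): 0.1, (1, 1): 0.3,
    (2, 0): 0.2, (2, 1): 0.1,
}

# Marginal pmfs, obtained by summing out the other variable.
px, py = {}, {}
for (k, l), p in joint.items():
    px[k] = px.get(k, 0) + p
    py[l] = py.get(l, 0) + p

# E[X] = sum_k k * P(X=k); E[f(X)] with f(x) = x^2 for the variance.
EX  = sum(k * p for k, p in px.items())
EX2 = sum(k**2 * p for k, p in px.items())
EY  = sum(l * p for l, p in py.items())

# Var(X) = E[X^2] - (E[X])^2
print(EX2 - EX**2)                                  # 0.6

# Cov(X,Y) = E[XY] - E[X]E[Y]
EXY = sum(k * l * p for (k, l), p in joint.items())
print(EXY - EX * EY)                                # 0.0 (up to rounding)

# Independence would require P(X=k, Y=l) = P(X=k) * P(Y=l) for all k, l.
# It fails here even though Cov(X,Y) = 0: zero covariance does not
# imply independence.
print(all(abs(joint[k, l] - px[k] * py[l]) < 1e-12
          for k, l in product(px, py)))             # False
```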