Information Theory
- Motivate entropy (amount of information in a sample)
- Entropy is (1) always >= 0, (2) independent of the values assumed by the random variable (it depends only on their probabilities)
- Obtain differential entropy
- Prove/motivate that the uniform distribution maximizes entropy
- Conditional entropy, with examples
- KL divergence
- Convex and concave functions
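The entropy, maximum-entropy, and KL-divergence items above can be illustrated numerically; a minimal Python sketch (function names `entropy` and `kl_divergence` are ours, not from any fixed library) showing that the uniform distribution over n outcomes attains the maximum of log2(n) bits:

```python
import math

def entropy(p):
    """Shannon entropy H(p) = -sum_i p_i log2 p_i, in bits; 0*log 0 := 0."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """KL(p || q) = sum_i p_i log2(p_i / q_i); assumes q_i > 0 wherever p_i > 0."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

uniform = [0.25] * 4
skewed = [0.7, 0.1, 0.1, 0.1]

# Uniform over 4 outcomes attains the maximum log2(4) = 2 bits.
print(entropy(uniform))  # → 2.0
# Any non-uniform distribution has strictly lower entropy.
print(entropy(skewed))
# KL(p || uniform) = log2(n) - H(p) >= 0, with equality iff p is uniform.
print(kl_divergence(skewed, uniform))
```

Note that the entropy values depend only on the probabilities, never on the outcome labels, matching item (2) above.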