This document discusses information theory and entropy. Information is defined as that which reduces uncertainty: a message is informative to the extent that it tells the receiver something they did not already know. Entropy is the average amount of information conveyed by a message, or equivalently, the expected information content of the symbols drawn from a probability distribution; it therefore measures the uncertainty of that distribution, and higher entropy means greater uncertainty. Logarithms are used to quantify information and entropy because they give a measure that is non-negative, additive for independent events, and continuous in the underlying probabilities.
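
As a concrete illustration of the definition above, here is a minimal sketch, not taken from the source, that computes the Shannon entropy of a discrete distribution as the expected self-information, -sum p(x) * log2 p(x), in bits. The function name and the example distributions are illustrative assumptions.

    import math

    def shannon_entropy(probabilities):
        """Entropy in bits: the expected self-information, -sum(p * log2(p))."""
        # Terms with p == 0 contribute nothing (p * log p tends to 0 as p -> 0).
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    # A fair coin is maximally uncertain for two outcomes: 1 bit of entropy.
    print(shannon_entropy([0.5, 0.5]))   # 1.0
    # A biased coin is more predictable, so its entropy is lower.
    print(shannon_entropy([0.9, 0.1]))   # about 0.469

The comparison between the fair and biased coin shows the claim in the text directly: the more uncertain distribution has the higher entropy.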