Entropy and Shannon Information

Entropy $S$ is a measure of disorder. The harmonic oscillator periodically revisits the same states in phase space corresponding to a given energy. For a chaotic system this periodicity is absent: after long times the trajectory fills out the whole accessible phase space. We can define a probability density $\rho_i$ of finding the system in a certain state. Numerically this density can be estimated by counting, after a long time of iteration, the number of points $n_i$ per unit volume $\Delta V$ and dividing by the total number of points:
\begin{align}
\rho_i=\frac{n_i}{\Delta V \sum_i n_i}
\end{align}
The probability of finding the system in a certain state is then
\begin{align}
P_i=\rho_i \, \Delta V=\frac{n_i}{\sum_i n_i}
\end{align}
The entropy is defined as
\begin{align}
S=-\sum_i P_i \, \log(P_i)
\end{align}
where $\log$ denotes the natural logarithm. A pendulum at rest occupies just one point in phase space. Thus $P=\frac{n_i}{\sum_i n_i}=\frac{1}{1}=1$ and $S=0$: we have maximal information about the state of the system. A system that can be in two states and spends equal amounts of time in each has $P_i=1/2$ and $S=-2 \cdot \frac{1}{2} \log(\frac{1}{2})=\log 2 \approx 0.69$. However, if the system spends only $1/4$ of the time in one state and the remaining $3/4$ in the other, the entropy decreases: with $P_1=1/4$ and $P_2=3/4$ we get $S=-\frac{1}{4}\log(\frac{1}{4})-\frac{3}{4}\log(\frac{3}{4}) \approx 0.56$.
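The procedure above can be sketched numerically: estimate the occupation probabilities $P_i = n_i/\sum_i n_i$ from visit counts and evaluate $S=-\sum_i P_i \log P_i$ with the natural logarithm. The function name and the example counts below are illustrative choices, not taken from the text.

```python
import math

def shannon_entropy(probs):
    """S = -sum_i P_i * ln(P_i); terms with P_i = 0 contribute nothing."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# Pendulum at rest: one occupied state, P = 1, so S = 0.
print(shannon_entropy([1.0]))                   # 0.0

# Two states, equal occupation: S = ln 2 ≈ 0.69.
print(round(shannon_entropy([0.5, 0.5]), 2))    # 0.69

# Unequal occupation 1/4 and 3/4: S ≈ 0.56.
print(round(shannon_entropy([0.25, 0.75]), 2))  # 0.56

# Probabilities from phase-space visit counts n_i (hypothetical counts):
counts = [250, 750]
probs = [n / sum(counts) for n in counts]       # P_i = n_i / sum_i n_i
print(round(shannon_entropy(probs), 2))         # 0.56
```

Note that only the normalized probabilities $P_i$ enter the entropy, so the bin volume $\Delta V$ cancels out of the numerical estimate.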