In a group of symbols, the average information content per symbol is known as the entropy of the group, and it is represented by H.
Suppose we have n symbols with probabilities of occurrence p1, p2, p3, ......., pn. Then the entropy H is given by

H = sum over i = 1 to n of pi log2(1/pi) bits/symbol
Entropy is maximum when all the symbols are equally likely (pi = 1/n for every i), and in that case

Hmax = log2 n bits/symbol
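As a minimal sketch of the definition above, the following computes H for a list of symbol probabilities; the probability values are made-up examples, not from the text:

```python
import math

def entropy(probs):
    """Average information content per symbol, in bits/symbol.

    H = sum of pi * log2(1/pi); terms with pi = 0 contribute nothing.
    """
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# Four equally likely symbols: entropy is maximum, log2(4) = 2 bits/symbol.
print(entropy([0.25, 0.25, 0.25, 0.25]))  # → 2.0

# Unequal probabilities always give a lower entropy.
print(entropy([0.5, 0.25, 0.125, 0.125]))  # → 1.75
```

Note that any skewed distribution over the same four symbols yields less than the 2 bits/symbol maximum.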
If a source emits symbols at a rate of R symbols/sec, then the source information rate Is is given by

Is = R * H bits/sec
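The information-rate formula can be sketched as follows; the symbol rate of 1000 symbols/sec and the probability values are hypothetical, chosen only for illustration:

```python
import math

def entropy(probs):
    """Entropy H in bits/symbol: sum of pi * log2(1/pi)."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

def information_rate(symbol_rate, probs):
    """Source information rate Is = R * H, in bits/sec."""
    return symbol_rate * entropy(probs)

# Hypothetical source: R = 1000 symbols/sec, four equally likely symbols,
# so H = log2(4) = 2 bits/symbol and Is = 1000 * 2 = 2000 bits/sec.
print(information_rate(1000, [0.25] * 4))  # → 2000.0
```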