
In a group of symbols, the average information content per symbol is known as the entropy of the source, and it is represented by H.

Suppose we have n symbols with probabilities of occurrence p1, p2, p3, ......., pn. The entropy H is then given by

H = p1 log2(1/p1) + p2 log2(1/p2) + ....... + pn log2(1/pn) bits/symbol.

Entropy is maximum when all the symbols have equal probability of occurrence, in which case it is Hmax = log2 n bits/symbol.
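The entropy formula above can be sketched in a few lines of Python; the probability values used here are illustrative, not from the text.

```python
import math

def entropy(probs):
    """Average information content per symbol, in bits/symbol."""
    # Each symbol with probability p contributes p * log2(1/p);
    # zero-probability symbols contribute nothing.
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# A biased four-symbol source
print(entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75 bits/symbol

# Equal probabilities give the maximum, log2(n)
print(entropy([0.25, 0.25, 0.25, 0.25]))   # 2.0 bits/symbol = log2(4)
```

Note that the biased source carries less information per symbol (1.75 bits) than the equiprobable one (2 bits), matching the statement that equal probabilities maximize entropy.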

If a source emits symbols at the rate of R symbols/sec, then the source information rate IS is given by

IS = RH bits/sec.
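As a quick check of the relation IS = RH, the snippet below combines the entropy computation with an assumed symbol rate; the rate R = 1000 and the symbol probabilities are illustrative values, not from the text.

```python
import math

def entropy(probs):
    """Average information content per symbol, in bits/symbol."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

R = 1000                      # assumed symbol rate, symbols/sec
H = entropy([0.5, 0.5])       # two equiprobable symbols -> 1 bit/symbol
Is = R * H                    # source information rate, bits/sec
print(Is)  # 1000.0
```

With two equiprobable symbols each symbol carries exactly 1 bit, so the information rate equals the symbol rate.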
