Saturday, 12 October 2013

Fundamental limits on performance and channels for communication

Entropy Functions and Equivocation:
Average self-information is called entropy. The self-information of a symbol transmitted over the channel with probability Pk is given by: Ik = log(1/Pk).
Entropy is given by H(S) = Σ (i = 1 to r) Pi log(1/Pi), where Pi is the probability of the i-th symbol.
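As a quick illustration, here is a minimal Python sketch of these two formulas (using log base 2, so the results are in bits; the four symbol probabilities are invented for the example):

import math

def self_information(p):
    # Self-information Ik = log2(1/Pk) of a symbol with probability p, in bits.
    return math.log2(1.0 / p)

def entropy(probs):
    # Entropy H(S) = sum over i of Pi * log2(1/Pi): the average self-information.
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

probs = [0.5, 0.25, 0.125, 0.125]   # example source with four symbols
print(self_information(probs[0]))   # 1.0 bit
print(entropy(probs))               # 1.75 bits/symbol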
1. A priori entropy: The entropy of the input symbols a1, a2, ..., ar before transmission is called the a priori entropy. It is denoted by H(A).
2. A posteriori (conditional) entropy: The entropy of the input symbols a1, a2, ..., ar after transmission, given that a particular output bj has been received, is called the a posteriori entropy. It is denoted by H(A/bj).
3. Equivocation: The equivocation is defined as the average of the a posteriori (conditional) entropies, weighted by the output probabilities: H(A/B) = Σj P(bj) H(A/bj). It is denoted by H(A/B); a numerical sketch of all three definitions follows this list.
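The sketch below assumes, purely for illustration, a binary symmetric channel with crossover probability 0.1 and equiprobable inputs (all numbers invented for the example):

import math

def entropy(probs):
    # H = sum of p * log2(1/p) over the distribution, in bits.
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

p_a = [0.5, 0.5]                       # a priori input probabilities P(ai)
p_b_given_a = [[0.9, 0.1],             # channel matrix: row i holds P(bj/ai)
               [0.1, 0.9]]

# Output probabilities P(bj) = sum over i of P(ai) * P(bj/ai).
p_b = [sum(p_a[i] * p_b_given_a[i][j] for i in range(2)) for j in range(2)]

# Backward probabilities P(ai/bj) by Bayes' rule; row j holds P(ai/bj).
p_a_given_b = [[p_a[i] * p_b_given_a[i][j] / p_b[j] for i in range(2)]
               for j in range(2)]

H_A = entropy(p_a)                                     # 1. a priori entropy H(A)
H_A_given_bj = [entropy(row) for row in p_a_given_b]   # 2. a posteriori entropies H(A/bj)
H_A_given_B = sum(p_b[j] * H_A_given_bj[j]             # 3. equivocation H(A/B)
                  for j in range(2))

print(H_A, H_A_given_bj, H_A_given_B)

For this channel the a priori entropy H(A) is 1 bit, each a posteriori entropy H(A/bj) is about 0.469 bits, and so the equivocation H(A/B) is also about 0.469 bits.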

Mutual Information: On average, observation of the output symbols provides us with [H(A) - H(A/B)] bits of information about the input. This quantity is called "Mutual Information" or "Transinformation." It is represented by I(A,B).
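Continuing the sketch above, the mutual information of that hypothetical channel follows in one line:

# Bits of information about the input gained, on average, per received output symbol.
I_AB = H_A - H_A_given_B            # I(A,B) = H(A) - H(A/B)
print(I_AB)                         # about 0.531 bits for the channel above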

Properties of Mutual Information:
  1. The mutual information of a channel is symmetric: I(A,B) = I(B,A).
  2. The mutual information is always non-negative: I(A,B) >= 0.
  3. The mutual information of a channel may be expressed in terms of the channel outputs as:
           I(A,B) = H(B) - H(B/A).
  4. The mutual information is related to the joint entropy of the channel as:
           I(A,B) = H(A) + H(B) - H(A,B).
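Continuing the same sketch, properties 1, 3 and 4 can be checked numerically for the hypothetical channel above (property 2 holds because every printed value is non-negative):

H_B = entropy(p_b)                  # output entropy H(B)

# H(B/A) = sum over i of P(ai) * H(B/ai), using the rows of the channel matrix.
H_B_given_A = sum(p_a[i] * entropy(p_b_given_a[i]) for i in range(2))

# Joint entropy H(A,B) over the joint probabilities P(ai,bj) = P(ai) * P(bj/ai).
p_joint = [p_a[i] * p_b_given_a[i][j] for i in range(2) for j in range(2)]
H_AB = entropy(p_joint)

print(H_A - H_A_given_B)            # I(A,B) = H(A) - H(A/B)
print(H_B - H_B_given_A)            # equals I(A,B) by symmetry (property 1)
print(H_A + H_B - H_AB)             # equals I(A,B) via the joint entropy (property 4)

All three lines print the same value, about 0.531 bits.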
        
