Example of a Binary Symmetric Channel (BSC)

Let us consider a channel input alphabet X = {a1, a2} and a channel output alphabet Y = {b1, b2}. Further, let P(b1|a1) = P(b2|a2) = 1 − e and P(b1|a2) = P(b2|a1) = e. This is an example of a binary channel, as both the input and output alphabets have two elements each. Further, the channel is symmetric and unbiased in its behavior towards the two possible input letters a1 and a2. Note that, in general, the output alphabet Y need not be the same as the input alphabet.

[Fig. P.1.3.1: Representation of a binary symmetric channel; e indicates the probability of an error during transmission.]

Usually, if a1 is transmitted through the channel, b1 will be received at the output, provided the channel has not caused an error. So e in our description represents the average probability of the channel causing an error in a transmitted letter.

Let us assume that the two input letters occur with equal probability, i.e. P_X(a1) = P_X(a2) = 0.5. A source presenting a finite set of letters, each emitted independently of the letters preceding it, is known as a discrete memoryless source (DMS); with equal letter probabilities, such a source is unbiased towards any letter or symbol. Now, observe that

$$P_{XY}(a_1, b_1) = P_{XY}(a_2, b_2) = \frac{1 - e}{2}$$

and

$$P_{XY}(a_1, b_2) = P_{XY}(a_2, b_1) = \frac{e}{2}.$$

Note that the output letters are also equally probable, i.e.

$$P_Y(b_j) = \sum_{i=1}^{2} P(b_j \mid a_i)\, P_X(a_i) = \frac{1}{2}, \quad j = 1, 2.$$

Average Mutual Information

The concept of averaging information over an ensemble is also applicable over a joint ensemble, and the average information thus obtained is known as the Average Mutual Information:

$$I(X; Y) = \sum_{k=1}^{K} \sum_{j=1}^{J} P_{XY}(a_k, b_j) \log_2 \frac{P_{X|Y}(a_k \mid b_j)}{P_X(a_k)} \ \text{bit}$$
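As a quick cross-check of the joint and marginal probabilities derived above, here is a minimal Python sketch (not part of the original text; the names e, p_x, and channel are illustrative) that computes P_XY(a_i, b_j) and P_Y(b_j) for the BSC with equiprobable inputs:

```python
e = 0.1                      # crossover (error) probability of the channel
p_x = [0.5, 0.5]             # equiprobable DMS input: P_X(a1) = P_X(a2) = 0.5

# Channel transition matrix: channel[i][j] = P(b_j | a_i)
channel = [[1 - e, e],
           [e, 1 - e]]

# Joint probabilities P_XY(a_i, b_j) = P_X(a_i) * P(b_j | a_i)
joint = [[p_x[i] * channel[i][j] for j in range(2)] for i in range(2)]
print(joint)  # [[0.45, 0.05], [0.05, 0.45]] -> (1-e)/2 and e/2, as derived

# Output marginals P_Y(b_j) = sum over i of P_XY(a_i, b_j)
p_y = [joint[0][j] + joint[1][j] for j in range(2)]
print(p_y)    # [0.5, 0.5] -> the output letters are also equally probable
```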
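The double sum defining I(X; Y) can likewise be evaluated numerically. The sketch below (again an illustration, not code from the source) implements the formula directly from a prior vector and a transition matrix. For the BSC with equal priors the result agrees with the well-known closed form 1 − H(e), where H is the binary entropy function.

```python
from math import log2

def mutual_information(p_x, channel):
    """I(X;Y) = sum_k sum_j P_XY(a_k, b_j) * log2( P_X|Y(a_k|b_j) / P_X(a_k) )."""
    K = len(p_x)
    J = len(channel[0])
    # Joint distribution P_XY(a_k, b_j) = P_X(a_k) * P(b_j | a_k)
    joint = [[p_x[k] * channel[k][j] for j in range(J)] for k in range(K)]
    # Output marginals P_Y(b_j)
    p_y = [sum(joint[k][j] for k in range(K)) for j in range(J)]
    I = 0.0
    for k in range(K):
        for j in range(J):
            if joint[k][j] > 0:                       # skip zero-probability terms
                p_x_given_y = joint[k][j] / p_y[j]    # P_X|Y(a_k | b_j) by Bayes' rule
                I += joint[k][j] * log2(p_x_given_y / p_x[k])
    return I

e = 0.1
print(mutual_information([0.5, 0.5], [[1 - e, e], [e, 1 - e]]))
# With e = 0.1 this prints about 0.531, matching 1 - H(0.1) bit.
```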