Information Flow / Conditional Entropy

Hi,

Bit of a long shot, but I'm wondering if anyone knows how to do this.

x can take any integer value 0 <= x <= 63 (inclusive), with a uniform distribution.

So the entropy of x is H(X) = log2(64) = 6 bits.

Now given this code:

Code:
int y;
int r = rand() % 2; // random integer, 0 or 1
y = r * 64 + x;

I need to work out the mutual information between y and x, i.e. how much information y reveals about x.

It's defined as I(Y;X) = H(Y) - H(Y|X)

I don't really understand how to apply conditional entropy to this problem though.

Anyone have any ideas?