Networks exam - Huffman Code/Entropy

Currently in the middle of a cram and I've hit a snag. A common question among past papers is to calculate the entropy from a set of input probabilities using Shannon's formula. Now I have the formula and understand it; the problem presents itself when trying to calculate it on exam day.

For example we have these probabilities:

0.4, 0.3, 0.1, 0.1, 0.06, 0.04

So entropy = -(0.4) log_2(0.4) - (0.3) log_2(0.3) - (0.1) log_2(0.1) - (0.1) log_2(0.1) - (0.06) log_2(0.06) - (0.04) log_2(0.04)

In this case the entropy is meant to be 2.14 bits.
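
(As a quick way to verify an answer like this, here's a minimal Python sketch; the script is my own illustration, not part of the original post:)

import math

# Symbol probabilities from the question above
probs = [0.4, 0.3, 0.1, 0.1, 0.06, 0.04]

# Shannon entropy: H = -sum of p * log_2(p) over all symbols
entropy = -sum(p * math.log2(p) for p in probs)

print(round(entropy, 2))  # prints 2.14

On a basic exam calculator you'd do the same sum by hand, with log(p)/log(2) in place of log_2(p).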

Now log is in base 10, yes? So getting the log to base 2 is the major problem here. I have done some searching and came across the basic formula to convert:

log_a(x) = log(x) / log(a)

However, due to the value in front of the log, does this change things?

So e.g.: (0.4) log_2(0.4) = (0.4) log(0.4) / (0.4) log(2)

Is this correct, or am I going the wrong way about things? I don't wish to go further if I'm being an idiot here. Pretty annoying, as the lecturer's notes are absolutely terrible: we're given Shannon's formula but not told how to apply it! Seems working that out could end up pretty lengthy for only a few marks... bugger.
 
Anybody with any idea? I'm just confused as to how to apply it to my numbers, i.e.

-(0.3) log_2(0.3)

is not what the conversion formula is usually given for; it's normally just log(x) on its own. I know it's a fairly simple workaround, but I've fried my brain trying, and I don't think my lecturer is going to bother responding.
 
The last line of your example is wrong; the 0.4 shouldn't be applied to log(2) as well.

You know that log_2(0.4) = log(0.4) / log(2), where log_2 is log to base 2.

So 0.4 log_2(0.4) is simply 0.4 log(0.4) / log(2). The coefficient just multiplies the converted logarithm; it never enters the change-of-base formula itself.
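
(Worked through numerically as a check; these figures are my own, not from the original reply: log(0.4) ≈ -0.3979 and log(2) ≈ 0.3010, so log_2(0.4) ≈ -1.3219 and -(0.4) log_2(0.4) ≈ 0.5288. Doing the same for all six probabilities and summing gives ≈ 2.14 bits, the expected answer.)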
 
DaveF, thank you VERY much; that's been annoying me for quite a bit of time today.

Quite amazing that a lecturer supplies his students with bugger all of the information needed to complete an exam question that comes up every year!

But once again, thank you, very much appreciated... on to P-boxes and S-boxes now (yay).
 