I'm currently in the middle of a cram and have hit a snag. A common question in past papers is to calculate the entropy of a set of input probabilities using Shannon's formula. I have the formula and understand it; the problem is actually calculating it on exam day.
For example we have these probabilities:
0.4, 0.3, 0.1, 0.1, 0.06, 0.04
So entropy = -(0.4)log2(0.4) -(0.3)log2(0.3) -(0.1)log2(0.1) -(0.1)log2(0.1) -(0.06)log2(0.06) -(0.04)log2(0.04)
In this case the entropy is meant to be 2.14 bits.
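Outside exam conditions I can at least sanity-check that figure. This is just a rough sketch (assuming Python's math.log2 behaves the way I expect), but it does seem to reproduce the expected answer:

import math

# probabilities from the past-paper question
probs = [0.4, 0.3, 0.1, 0.1, 0.06, 0.04]

# Shannon entropy: H = -sum of p * log2(p) over all probabilities
entropy = -sum(p * math.log2(p) for p in probs)
print(entropy)  # roughly 2.1435, i.e. the expected 2.14 bits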
Now, the log on a standard calculator is base 10, yes? So getting a base-2 log is the main problem here. I have done some searching and came across the basic change-of-base formula:
log_a(x) = log(x) / log(a)
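If I've understood that right, then for a single value it should mean log2(0.4) = log10(0.4) / log10(2). A quick check of that (again just a sketch, assuming math.log10 is the plain base-10 log):

import math

# change of base: log2(x) should equal log10(x) / log10(2)
x = 0.4
print(math.log2(x))                   # about -1.3219
print(math.log10(x) / math.log10(2))  # same value, so the conversion holds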
However, does the coefficient in front of the log change things?
So e.g.: (0.4) log2(0.4) = (0.4) log(0.4) / (0.4) log(2)
Is this correct, or am I going the wrong way about things? I don't want to go further if I'm being an idiot here. It's pretty annoying, as the lecturer's notes are absolutely terrible: we're given Shannon's formula but not told how to apply it! Seems like working that out could end up pretty lengthy for only a few marks... bugger.