Re: VMs: meaning of entropy
Hi Rene,
Thank you for the explanations.
> (about entropy)
> Actually, h1 is the single-character entropy,
> independent of context (independent of the
> preceding ones). If all 26 characters are equally
> frequent, this equals log2(26). That, in fact,
> is sometimes written as h0, which is simply the
> theoretical upper limit of h1.
So in Monkey:
h0 - entropy if all characters used were equally frequent
h1 - unconditional single char entropy
h2 - conditional single char entropy (1 preceding known)
hn - as above (n-1 preceding known)
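
In case it helps, here is a minimal Python sketch of how I
understand these quantities (the block_entropy helper, the
function names and the toy sample string are mine, not part
of Monkey's output):

import math
from collections import Counter

def block_entropy(text, n):
    # Shannon entropy (in bits) of overlapping n-character blocks
    blocks = [text[i:i + n] for i in range(len(text) - n + 1)]
    counts = Counter(blocks)
    total = sum(counts.values())
    return -sum(c / total * math.log2(c / total) for c in counts.values())

def h(text, n):
    # conditional single-character entropy, n-1 preceding chars known:
    # h1 is just the entropy of single characters,
    # hn = H(n-char blocks) - H((n-1)-char blocks)
    if n == 1:
        return block_entropy(text, 1)
    return block_entropy(text, n) - block_entropy(text, n - 1)

text = "qokeedy qokeedy daiin shedy"   # toy string, not a real VMs line
h0 = math.log2(len(set(text)))         # all characters equally frequent
print(h0, h(text, 1), h(text, 2))
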
> h3 as calculated by monkey is also a conditional
> single-character entropy, but assumes that the
> preceding 2 characters are known. It can be
> calculated as the difference between entropy of
> character triplets and that of pairs.
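
If I read that correctly, then with the block_entropy helper
from the sketch above that would simply be:

h2 = block_entropy(text, 2) - block_entropy(text, 1)  # pairs minus singles
h3 = block_entropy(text, 3) - block_entropy(text, 2)  # triplets minus pairs
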
But the entropies of pairs, triplets etc. are not
output, are they? Would they add any more meaning
to the text statistics?
> Hope that clarifies, Rene
Yes - thank you!
Best regards,
Rafal