
Naive question

Are there other measures of information content than entropy?

As I understand it, if you interpret "in" as two characters instead of one, the
entropy of the message containing it ought to appear lower, because there are
more possible combinations of characters (e.g. "ii" or "nn" is possible when "i"
and "n" are separate characters, but not when "in" represents a single character).
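To make the tokenization-dependence concrete, here is a small sketch (my own illustration, not from the original post) that computes the empirical per-symbol Shannon entropy of the same string under two different choices of "character": every letter on its own, versus the digraph "in" treated as a single symbol. The example string and the greedy digraph scan are arbitrary choices for illustration.

```python
from collections import Counter
from math import log2

def entropy_per_symbol(tokens):
    """Empirical Shannon entropy in bits per token: -sum p * log2(p)."""
    counts = Counter(tokens)
    total = len(tokens)
    return -sum(c / total * log2(c / total) for c in counts.values())

msg = "information in intervals"

# Tokenization 1: every letter (and space) is a separate symbol.
chars = list(msg)

# Tokenization 2: greedily treat the digraph "in" as one symbol.
digraph = []
i = 0
while i < len(msg):
    if msg[i:i + 2] == "in":
        digraph.append("in")
        i += 2
    else:
        digraph.append(msg[i])
        i += 1

print(len(chars), entropy_per_symbol(chars))
print(len(digraph), entropy_per_symbol(digraph))
```

The two runs report different sequence lengths and different bits-per-symbol figures for the very same message, which is exactly the ambiguity the question is pointing at: the entropy estimate depends on what you decide counts as a symbol.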

Is there such a thing as a measure of information content that does not depend
so heavily on knowing exactly what is or is not a character?