Re: Naive question
16/01/02 23:18:56, Bruce Grant <bgrant@xxxxxxxxxxxxx> wrote:
>Are there other measures of information content than entropy?
Thorny question, that. Think of Arabic, with its internal
flexions. The "basic" information is carried in the
"consonantal skeleton" of words, e.g. kbr "great",
ktb "book, to write", whereas the vowels carry the
grammatical information, e.g. kitab "book", kutub
"books", kataba "he wrote", kaatib "writing".
Compare with English write, writing, writer, etc.
Or again:
English great, greatest = Arabic kabiir, akbar
A measure of information I have seen used is the
(negative) logarithm of the relative frequency of a
word, on the basis that a very common word carries
little information (viz. a, the, of, etc.). But averaged
over all words this is still entropy, first-order word
entropy actually. Je donne ma langue au chat (I give up).
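
For what it's worth, here is a rough sketch in Python of
that measure and its average. The toy corpus and the three
sample words are made up purely for illustration, not taken
from any real frequency data:

import math
from collections import Counter

# Toy corpus: purely illustrative, not real frequency data.
corpus = ("the cat wrote the book and the writer read the book "
          "a great writer wrote a greater book").split()

counts = Counter(corpus)
total = sum(counts.values())

# Information content of a word: -log2 of its relative
# frequency. Very common words score low, rare words high.
def self_information(word):
    p = counts[word] / total
    return -math.log2(p)

for w in ("the", "book", "greater"):
    print(f"{w:8s} p={counts[w]/total:.3f}  "
          f"I={self_information(w):.2f} bits")

# First-order word entropy: the frequency-weighted average
# of that quantity over all word types in the corpus.
entropy = -sum((c / total) * math.log2(c / total)
               for c in counts.values())
print(f"first-order word entropy = {entropy:.2f} bits/word")

With base-2 logs the figures come out in bits: in this toy
text the very common "the" carries about 2.2 bits, while the
one-off "greater" carries about 4.2 bits.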