
Re: VMs: Slightly modified VMS generator output



Hi Jeff,

At 23:35 03/12/2003 +0000, Jeff wrote:
> Everyone here is making the unfounded assumption that because a word that
> is generated by an algorithm does not appear in the VMS text it cannot be
> valid. In the case of an artificial language the VMS will be a subset of a
> universal set. If the method can generate every word in the subset then how
> can it be said that the extra words are not part of the universal set that
> the author chose not to select?

No, it's only you who's making that assumption. What we're *actually* trying to do is build models for generating text that are strongly predictive of the text as observed (ie minimising the size of the [negentropic] residual), whilst simultaneously keeping the model itself as small as possible (ie minimising the number & complexity of assumptions).
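To make that "residual" concrete, here is a minimal sketch (in Python, using a crude character-unigram model purely as a stand-in for a real candidate generator - not anything anyone has actually proposed for the VMS): the residual is the number of bits needed to encode the observed text under the model, and the model cost is the number of bits needed to describe the model itself.

    import math
    from collections import Counter

    def fit_unigram(text):
        # Maximum-likelihood character frequencies: the simplest possible stand-in model.
        counts = Counter(text)
        total = len(text)
        return {c: n / total for c, n in counts.items()}

    def residual_bits(text, probs):
        # Code length (in bits) of the text under the model: the residual that a
        # more strongly predictive model would shrink.
        return -sum(math.log2(probs[c]) for c in text)

    sample = "okoko okokokoko okoko"      # placeholder string, not a real transcription
    model = fit_unigram(sample)
    print(residual_bits(sample, model))   # fit term: smaller is better
    print(32 * len(model))                # crude model cost: 32 bits per stored frequency

A better generator drives the first figure down without inflating the second.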


AIUI, these two values are essentially what the Bayesian Information Criterion (BIC) tries to merge into a single composite measure.
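For reference, the standard definition is BIC = k*ln(n) - 2*ln(Lhat), where k is the number of free parameters, n the number of observations, and Lhat the maximised likelihood; lower is better. A one-function sketch in the same vein as above (nothing VMS-specific here):

    import math

    def bic(log_likelihood, num_params, num_observations):
        # Lower is better: -2*logL penalises poor fit, k*ln(n) penalises complexity.
        return num_params * math.log(num_observations) - 2.0 * log_likelihood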

IMO, what we have so far failed to do is operationalise our ideas fully: what is the BIC of the OKOKO model, or of my four-column model? How much information do the three (quite reasonable) assumptions mentioned by John Grove discard? And so on.
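As a sketch of what operationalising the first of those questions might look like (reusing the bic() helper above; the log-likelihoods and parameter counts below are pure placeholders, not measured values for either model):

    # Hypothetical placeholder figures only - the real values would have to be
    # measured by fitting each generator to the same transcription.
    bic_okoko       = bic(log_likelihood=-41000.0, num_params=60,  num_observations=35000)
    bic_four_column = bic(log_likelihood=-39500.0, num_params=240, num_observations=35000)
    print("prefer OKOKO" if bic_okoko < bic_four_column else "prefer four-column")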

Cheers, .....Nick Pelling.....

