The Mathematics of Crankery
Here's something I've long wondered about. We know
that if you have a decipherment system with enough
knobs to twiddle and/or an unknown text that is short
enough, you can read anything into anything. Typically
such bogus systems are exposed by reduction to
absurdity. Thus Friedman used the Shakespearean
ciphers to read that he, William Friedman, had written
Shakespeare's plays himself. The detractors of the
"Bible Code" used that system to read prophecies of
assassinations of world leaders into "Moby Dick".
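To make the reductio concrete, here is a toy sketch in
Python of my own devising, in the spirit of Friedman's
demonstration: a Baconian-style reading in which the
decipherer is free to classify every carrier letter as
type A or type B. The carrier and the target sentence
below are stand-ins I made up; the point is only that one
free choice per carrier letter is more than enough
freedom to plant any message in any text.

import string

def bacon_groups(message):
    """Encode a message as 5-bit A/B groups, one group per letter."""
    groups = []
    for ch in message.upper():
        if ch not in string.ascii_uppercase:
            continue
        idx = ord(ch) - ord('A')
        groups.append(format(idx, '05b').replace('0', 'A').replace('1', 'B'))
    return ''.join(groups)

def plant_reading(carrier, target):
    """Pick the letter classification that spells out the target."""
    wanted = bacon_groups(target)
    carrier_letters = [ch for ch in carrier if ch.isalpha()]
    if len(carrier_letters) < len(wanted):
        raise ValueError("carrier too short for this target")
    # The "key" is nothing but the classification we are free to
    # invent, one A/B choice per carrier letter used.
    return {i: cls for i, cls in enumerate(wanted)}

def read_back(classification):
    """Decode a classification back into letters, five choices each."""
    bits = ''.join(classification[i] for i in sorted(classification))
    letters = []
    for j in range(0, len(bits), 5):
        value = int(bits[j:j + 5].replace('A', '0').replace('B', '1'), 2)
        letters.append(chr(ord('A') + value))
    return ''.join(letters)

carrier = "To be or not to be that is the question " * 4  # any text will do
target = "FRIEDMAN WROTE THE PLAYS"                        # stand-in sentence
key = plant_reading(carrier, target)
print(len(key), "free choices read back as:", read_back(key))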
I wonder if there is some rigorous mathematical way
to disprove such systems. I think that the statistical
concept of "degrees of freedom" is involved, but I'm
not a good enough mathematician to carry it further.
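As a first stab, and with every number below made up
purely for illustration, one could compare the
information the decipherer is free to choose (the key)
against the information the ciphertext can actually
supply (its length times the redundancy of the claimed
target language). Something like this:

from math import log2

sign_types = 40        # distinct signs the scheme assigns values to (made up)
values_per_sign = 100  # candidate readings allowed per sign (made up)
redundancy = 3.2       # bits of constraint per sign; ballpark English-letter figure

key_freedom = sign_types * log2(values_per_sign)  # bits the decipherer gets to choose
print(f"freedom in the scheme: {key_freedom:.0f} bits")

for text_length in (30, 250, 2000):               # sign occurrences available (made up)
    text_constraint = text_length * redundancy    # bits the corpus can supply
    if key_freedom > text_constraint:
        verdict = "spurious readings expected"
    else:
        verdict = "long enough to discriminate, in principle"
    print(f"{text_length:5d} signs -> {text_constraint:6.0f} bits of constraint: {verdict}")

When the freedom in the scheme exceeds the constraint in
the text, spurious readings are not just possible but
expected.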
Some systems are loose enough to read things into a
text of any size, no matter how large. Edo Nyland's
system may be an example of this.
Some systems read isolated snippets of intelligible
text in a large text. The Bible Code is an example of
this.
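For what it's worth, that style of search is easy to
imitate: scan every starting position and every skip for
a target word spelled out by equidistant letters. The
carrier below is only a rough paraphrase from memory of
the opening of "Moby Dick", repeated as padding; any long
text would do, and short words turn up by chance.

def find_els(text, word, max_skip=50):
    """Return (start, skip) pairs where word appears as an equidistant
    letter sequence, skipping 2..max_skip letters at a time."""
    letters = [c for c in text.upper() if c.isalpha()]
    word = word.upper()
    hits = []
    for skip in range(2, max_skip + 1):
        last_start = len(letters) - (len(word) - 1) * skip
        for start in range(last_start):
            candidate = ''.join(letters[start + i * skip] for i in range(len(word)))
            if candidate == word:
                hits.append((start, skip))
    return hits

# Rough paraphrase of the opening of "Moby Dick", repeated as padding.
carrier = ("call me ishmael some years ago never mind how long precisely "
           "having little or no money in my purse and nothing particular "
           "to interest me on shore i thought i would sail about a little "
           "and see the watery part of the world ") * 10
hits = find_els(carrier, "sea")
print(len(hits), "chance occurrences of 'sea' as an equidistant letter sequence")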
Other systems only work on short unknown texts. The
Phaistos Disk is a very short text; therefore many
systems can read intelligible text into it, and many of
these systems would fail on a longer sample of Phaistos
Disk text.
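A toy Monte Carlo of my own, not a model of the Disk
itself and with an admittedly crude plausibility test,
shows the effect of text length: the shorter the
ciphertext, the larger the fraction of arbitrary
substitution keys under which it looks vaguely
language-like.

import random
import string

VOWELS = set("AEIOU")

def looks_vaguely_english(text):
    """Crude plausibility filter: vowel rate in a typical English range."""
    rate = sum(ch in VOWELS for ch in text) / len(text)
    return 0.30 <= rate <= 0.50

def spurious_key_rate(cipher_len, trials=5000):
    """Fraction of arbitrary substitution keys under which a random
    ciphertext of the given length passes the plausibility filter."""
    alphabet = string.ascii_uppercase
    ciphertext = [random.choice(alphabet) for _ in range(cipher_len)]
    hits = 0
    for _ in range(trials):
        key = dict(zip(alphabet, random.sample(alphabet, 26)))
        if looks_vaguely_english(''.join(key[c] for c in ciphertext)):
            hits += 1
    return hits / trials

# 240 is roughly the number of sign impressions on the Disk, though its
# signary is of course not a 26-letter alphabet; the lengths are only
# meant to show the trend.
for n in (15, 45, 240):
    print(f"length {n:3d}: {spurious_key_rate(n):.3f} of arbitrary keys pass the filter")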
So. Could we measure the degrees of freedom in a
decipherment scheme and the degrees of freedom in the
ciphertext, and from the two together prove whether a
proposed decipherment is valid or bogus?
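One way the question might be made precise, offered only
as a possibility and not as an answer: treat a proposed
scheme as a black box and run it on scrambled controls of
the same ciphertext. If it "reads" the controls as
readily as the real text, the meaning is coming from the
scheme's degrees of freedom rather than from the text.
The names and stand-ins below are mine, just to show the
shape of the test.

import random

def control_test(scheme, ciphertext, scoring, n_controls=1000):
    """Fraction of scrambled controls the scheme reads at least as well
    as the real ciphertext; near 1.0 means the scheme, not the text,
    is supplying the 'meaning'."""
    real_score = scoring(scheme(ciphertext))
    symbols = list(ciphertext)
    at_least_as_good = 0
    for _ in range(n_controls):
        random.shuffle(symbols)
        if scoring(scheme(''.join(symbols))) >= real_score:
            at_least_as_good += 1
    return at_least_as_good / n_controls

# Dummy stand-ins, only to exercise the harness: a "scheme" that reads
# every sign the same way no matter what, and a "scoring" that counts
# how much output it produced.
def dummy_scheme(signs):
    return "A" * len(signs)

def dummy_scoring(reading):
    return reading.count("A")

print(control_test(dummy_scheme, "XYZZYPLUGH", dummy_scoring, n_controls=200))
# Prints 1.0: this scheme reads scrambled gibberish as happily as the
# original, which is exactly what a bogus system would do.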
Dennis