
RE: VMs: Entropy, was WAR against EVA



Hi Larry,

At 07:57 04/03/03 -0500, Larry Roux wrote:
> There are Voynich glyphs that never appear next to each other. The question is: is it due to the encryption scheme? Or is it due to the author running an "e" into "i" to make "ch"?

This is only one of the many features which differentiate Voynichese (however you stroke it) from other languages. Even if you think you have an answer that explains one such factor, it is barely the smallest of stepping stones towards an integrated/holistic answer which addresses them all.


> In either case, I would think that if this was a hoax then you would find a much more random occurrence of characters and would find ql more often. Obviously the author had a strict set of rules which were followed.

This is, in essence, the core of the argument against more complex enciphering or encoding schemes (such as polyalphabetic substitution) which would - in almost all circumstances - tend to "flatten the stats". That is, the various distribution curves for the result of a "straight" polyalpha encoding wouldn't be so enormously "peaky" as the ones we see - all the individual symbols would share a similar probability of occurring.
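To put that "flattening" in numbers, here's a toy Python sketch (entirely my own illustration - the text and key are made up, and of course period ciphers were worked by hand):

```python
from collections import Counter
from math import log2

def entropy(text):
    """Shannon entropy (bits per symbol) of a character stream."""
    counts = Counter(text)
    n = len(text)
    return -sum(c / n * log2(c / n) for c in counts.values())

def vigenere(plain, key):
    """Classic polyalphabetic substitution over A-Z."""
    return "".join(
        chr((ord(p) - 65 + ord(key[i % len(key)]) - 65) % 26 + 65)
        for i, p in enumerate(plain)
    )

plain = "THEQUICKBROWNFOXJUMPSOVERTHELAZYDOG" * 8
mono = vigenere(plain, "D")         # key of length 1 = a simple shift
poly = vigenere(plain, "GIORDANO")  # longer key = polyalphabetic

# A simple substitution merely permutes the alphabet, so the "peaky"
# frequency profile (and its entropy) is unchanged; a polyalpha averages
# several shifted profiles together, pushing the entropy up towards flatness.
print(round(entropy(mono), 3), round(entropy(poly), 3))
```

Run it and the polyalpha's symbol entropy comes out strictly higher than the monoalphabetic version's - which is exactly the opposite of the VMS' hugely peaky distributions.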


Code-makers and code-breakers pre-1550 understood (some intuitively, some [like Cicco Simonetta] explicitly) that it was frequency distribution that usually gives the game away. Polyalpha (in normal use) flattens that distribution, which removes many of the obvious ways in.
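For the curious, the classic frequency attack can be sketched in a few lines of Python (again my own toy illustration, using the standard published English letter frequencies - nothing here comes from the VMS):

```python
from collections import Counter

# Rough relative frequencies of A-Z in English prose (percent) -
# standard published values, close enough for this illustration.
ENGLISH = [8.2, 1.5, 2.8, 4.3, 12.7, 2.2, 2.0, 6.1, 7.0, 0.15, 0.77,
           4.0, 2.4, 6.7, 7.5, 1.9, 0.095, 6.0, 6.3, 9.1, 2.8, 0.98,
           2.4, 0.15, 2.0, 0.074]

def caesar(text, shift):
    """Monoalphabetic shift over A-Z."""
    return "".join(chr((ord(c) - 65 + shift) % 26 + 65) for c in text)

def chi_squared(text):
    """How far the letter counts of `text` sit from English expectations."""
    counts = Counter(text)
    n = len(text)
    return sum((counts.get(chr(65 + i), 0) - n * f / 100) ** 2 / (n * f / 100)
               for i, f in enumerate(ENGLISH))

def crack(cipher):
    """Recover the shift by picking the candidate closest to English."""
    return min(range(26), key=lambda s: chi_squared(caesar(cipher, -s)))

plain = ("ITWASTHEBESTOFTIMESITWASTHEWORSTOFTIMES"
         "ITWASTHEAGEOFWISDOMITWASTHEAGEOFFOOLISHNESS")
cipher = caesar(plain, 3)
print(crack(cipher))  # the true shift was 3
```

For English text of this length, the chi-squared score at the true shift is far lower than at any wrong shift - which is precisely why the frequency distribution "gives the game away", and why flattening it was such a powerful move.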

The main reason that polyalpha didn't take over from "simple" ciphers (many of which were not actually that simple) seems to have been a general emotional attachment to the whole culture that had sprung up around ciphering 1350-1500 - polyalpha was stronger, but people preferred the old ways.

To claim - as GC does, following Leonell Strong's alleged decryption in the 1940s - that the VMS' code is a tricky polyalpha, perhaps with the keys hand-picked for every page, would almost certainly require the later dating, as claimed (say, the 1560s in England).

However, when I look at the 9-rosette page, I see Milan circa 1460; when I look at the cipherbet, I see resonant echoes of several tricky Northern Italian ciphers circa 1440-1460; the majolica designs appear geometric (which points to NE Italy pre-1480), etc.; and so I draw the general date and place conclusions you'd expect.

Polyalpha would always tend to destroy low-level structure (while introducing its own subtle cycle-length level of structure), and it would require an extraordinary act of will not only to retain structure with a polyalpha, but also to get it to the level where we can observe word-centric paradigms (such as core-mantle-crust) or adjacency-centric rules being consistently followed.
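A toy sketch (mine, not from the VMS - the text and 7-letter key are invented) makes the destruction of adjacency rules visible: count which letter-pairs ever occur, before and after a polyalpha pass.

```python
def vigenere(plain, key):
    """Classic polyalphabetic substitution over A-Z."""
    return "".join(
        chr((ord(p) - 65 + ord(key[i % len(key)]) - 65) % 26 + 65)
        for i, p in enumerate(plain)
    )

def bigrams(text):
    """The set of adjacent letter-pairs that actually occur."""
    return {text[i:i + 2] for i in range(len(text) - 1)}

plain = "QUICKBROWNFOXESJUMPOVERLAZYDOGSANDQUAYS" * 50
cipher = vigenere(plain, "SIMONET")  # hypothetical 7-letter key

# Natural language obeys adjacency rules (many pairs never occur at all);
# the polyalpha smears those rules out, so far more pairs turn up.
print(len(bigrams(plain)), len(bigrams(cipher)))
```

The ciphertext exhibits many more distinct adjacent pairs than the plaintext - so a polyalpha that somehow *preserved* Voynichese's strict never-adjacent rules would be a very strange beast indeed.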

Strong believed this to be the case; but it simply doesn't gel with my reading of the art historical evidence, while also failing to fit my understanding of basic statistics.

If the dating is pre-1480, then I believe the code would need to have been a composite of several simpler codes/ciphers well-known by that time - ciphers, codes, Roman numerals, shorthand, steganography, etc. However, the structure within which these separate mechanisms are combined may be the root cause of many of the "strict set of rules" you mention.

One of my long-held suspicions is that the VMS may simply be an obfuscated shorthand, or (more precisely) an obfuscated tachygraphic ("fast writing") system. The basic shorthand system its alphabet is based on appears (as evidenced by its single-stroke letter formation) to have been designed for writing on wax tablets.

But given a small (but fast) alphabet without vast numbers of Tironian notae - specialist symbols for frequently occurring words, like <8> for "cum-"/"con-" or <9> for "-us" - to help, how could a tachygrapher keep up with the flow of speech?

The only recorded small-alphabet system that precedes Dr Timothie Bright's "characterie" of 1588 is that of a Mr Ratcliffe of Bristol (who was using it in the early 1500s): his system simply involved dropping vowels and other unnecessary letters, but used normal English letters for the rest. However, that is exactly as far back as the history of shorthand goes - the rest has yet to be found out.
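No specimen of Ratcliffe's system survives in enough detail to reconstruct it, so the following Python sketch is purely my own guess at its flavour (drop the vowels, keep everything else):

```python
def ratcliffe_style(text):
    """Toy sketch of a Ratcliffe-style consonantal shorthand: drop the
    vowels, keep the remaining ordinary letters. (An assumption on my
    part - the real system's details are lost.)"""
    vowels = set("aeiouAEIOU")
    out = []
    for word in text.split():
        squeezed = "".join(ch for ch in word if ch not in vowels)
        out.append(squeezed or word[0])  # keep something for all-vowel words
    return " ".join(out)

print(ratcliffe_style("the quick brown fox jumps over the lazy dog"))
# -> "th qck brwn fx jmps vr th lzy dg"
```

Even this crude sketch shows why such a system is fast to write yet still readable back - and why a consonant-heavy stream would have very different letter statistics from ordinary prose.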

It's a perfectly reasonable hypothesis to propose that - especially given its alphabet structure - the VMS is based on a wax-tablet shorthand system parallel to Ratcliffe's, constructed in the period between the death of notae and the birth of characterie.

I think that a system built around a structured (though obfuscated), largely consonant-based, shorthand-driven text would have many of the features we see in the VMS. Though YMMV! :-)

Cheers, .....Nick Pelling.....

PS: if someone were constructing a universal language circa 1500, why on earth would they construct it to look just like a simple cipher (without actually being one)? :-9

______________________________________________________________________
To unsubscribe, send mail to majordomo@xxxxxxxxxxx with a body saying:
unsubscribe vms-list