
Re: VMs: VMS: more past stuff: "On some properties of informative texts"



sigh... sorry, all (just me again) :-(
Jacques is closing in on Nick & GC (we only need 8 more?)

 
 From: Jacques Guy <jguy@xxxxxxxxxxxxxxxx>
 Subject: VMs: VMS: more past stuff: "On some properties of informative texts"
 
 Again an article by B. V. Sukhotin:
(SNIP)..................
 We have:
 
 Number of 3-tuples occurring N times in each partition:

  N     P1     P2     P3
  0     10      7      6
  1      3      6      5
  2      3      5      5
  3      1      3      4
  4      4      0      0
  5      3      2      2
  6      0      1      2
  7      1      2      1
  8      2      0      1
  9      0      0      0
 10      0      0      0
 11      0      1      0
 
 
 The objective function, here the sum of the absolute deviations from
 the mean frequency, should be very small for a truly random text.
 Now, since an erroneous partition should resemble a random text more
 than a correct partition does, the maximum value of the objective
 function should correspond to the correct partition.
 
 In fact, the sum of the absolute deviations from the mean frequency is
 60 for partition P1, 53 for partition P2, and 49 for partition P3, and the
 maximum sum, 60, does correspond to the correct partition, P1.
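
For the record, a quick Python sketch (not from Sukhotin's article; just
an illustration) recomputes this from the table above, taking the mean
frequency over all the 3-tuples the table counts, including those that
occur zero times. Run as written it prints 60, 56 and 49 against the
quoted 60, 53 and 49 -- the middle figure differs slightly, but the
ordering, with P1 maximal, is the same.

# hist[n] = how many 3-tuples occur exactly n times (from the table;
# rows whose count is 0 contribute nothing and are omitted)
HISTOGRAMS = {
    "P1": {0: 10, 1: 3, 2: 3, 3: 1, 4: 4, 5: 3, 7: 1, 8: 2},
    "P2": {0: 7, 1: 6, 2: 5, 3: 3, 5: 2, 6: 1, 7: 2, 11: 1},
    "P3": {0: 6, 1: 5, 2: 5, 3: 4, 5: 2, 6: 2, 7: 1, 8: 1},
}

def abs_deviation_sum(hist):
    """Objective: sum over 3-tuples of |frequency - mean frequency|."""
    tuples = sum(hist.values())                  # distinct 3-tuples counted
    occurrences = sum(n * c for n, c in hist.items())
    mean = occurrences / tuples                  # mean frequency per 3-tuple
    return sum(c * abs(n - mean) for n, c in hist.items())

for name, hist in HISTOGRAMS.items():
    print(name, round(abs_deviation_sum(hist)))  # P1 comes out maximal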
 
 But the four possible objective functions suggested above take
 an extreme value when symbols are cyclically repeated throughout
 the text. Such texts are clearly not what our intuition sees as
 informative. Therefore, none of these functions provides a measure
 of the informativeness of texts.
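
A rough illustration of that objection (not from the article; the
alphabet and lengths are arbitrary): counting the non-overlapping
3-tuples of a cyclic text against all 27 possible tuples drives the
deviation sum to near its maximum, far above a random text of the same
length.

import itertools
import random
from collections import Counter

ALPHABET = "abc"
K = 3

def abs_deviation_sum(text):
    """Deviation sum over ALL possible K-tuples, zero counts included,
    for the partition of `text` into non-overlapping K-tuples."""
    counts = Counter(text[i:i+K] for i in range(0, len(text) - K + 1, K))
    universe = ["".join(t) for t in itertools.product(ALPHABET, repeat=K)]
    mean = sum(counts.values()) / len(universe)
    return sum(abs(counts[t] - mean) for t in universe)

cyclic = "abc" * 100                      # one 3-tuple, repeated 100 times
noise = "".join(random.choice(ALPHABET) for _ in range(300))

print("cyclic:", round(abs_deviation_sum(cyclic)))  # ~193: near the maximum
print("random:", round(abs_deviation_sum(noise)))   # much smaller, around 40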
 
 Assume now that a mostly random text is given. Whichever two possible
 partitions are considered, the corresponding values of the objective
 function (whether it is the sum of the absolute deviations, of their
 square, or the entropy of any order) can be expected to be very close.
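
That expectation is easy to check (again, a sketch not from the
article): on a random text, the three possible partitions into
3-tuples -- starting at offsets 0, 1 and 2 -- give objective values
that differ only by sampling noise.

import itertools
import random
from collections import Counter

ALPHABET = "abc"
K = 3

def abs_deviation_sum(text, offset):
    """Deviation sum over all possible K-tuples for the partition of
    `text` into non-overlapping K-tuples starting at `offset`."""
    counts = Counter(text[i:i+K] for i in range(offset, len(text) - K + 1, K))
    universe = ["".join(t) for t in itertools.product(ALPHABET, repeat=K)]
    mean = sum(counts.values()) / len(universe)
    return sum(abs(counts[t] - mean) for t in universe)

noise = "".join(random.choice(ALPHABET) for _ in range(3000))
for offset in range(K):                   # the three candidate partitions
    print(offset, round(abs_deviation_sum(noise, offset)))
# the three values are close: no partition stands out on a random text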
 
 If, on the other hand, a mostly non-random text is given, the values
 of these same functions calculated on any two partitions will be just
 as close as for a random text.
 
 The above observations suggest that, when an informative text is
 encoded as in our example, the difference in quality between the best
 and the worst partition can only be vanishingly small.
 
 Consider now a text somehow encoded.
(End SNIP...........)

-=se=->
Think not 3-tuples but 4 (fours!)... THESE are MIRRORED on themselves
(but in reverse order). The 'partitions' stay STATIC on the key pages
(per ES).
I would love to know who 'removed' which pages from the VMS, and when.

FWIW - not nearly enough were removed to destroy the KEY(s)!

WE are surrounding "IT" !  coooooooooooooL :-)


Best to you & yours
-=se=-
steve (FOLD IT / FlIP it) ekwall


ps: an EVA "T" in one place does NOT equal an EVA "T" in another
(depending on where you are in the text!) ALWAYS START AT THE
BEGINNING OF EACH PAGE.

<-=se=-
