Thanks Glurk for the graph. It shows me that the breaking point for programs is a real problem. The issue I see is that a computer program is inflexible when it comes to the idiosyncrasies of deciphering a code. The Hardens had the human ability to reason their way around all of the below, something a computer program only has if it is programmed to seek it out.
A) Spelling errors
B) Mistakes by the code's designer
C) "+" used instead of "and"
D) Filler
E) The amount of multiplicity encoded by the author
I would expect that just a few of the above could push any code past the breaking point on the graph Glurk has shown.
So my question is: instead of trying to holy-grail the code in one hit, has reducing it into smaller sections been tried? I understand the longer a code, the better... but not if it is beyond a program's capabilities.
Break it into halves, thirds, or runs of 40, 50, 60, 80, or 100 letters at a time, and try to draw out a few words. Surely the code must have some multiplicity weaknesses in parts. Then put pen to paper and use the laptop attached to our shoulders. I did have this discussion with Jarlve but don't know of any results; life got a bit busy for a while...
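To make the chunking idea concrete, here is a minimal sketch of one way to do it: slice a ciphertext into fixed-size sections and score each one for repetition, so the "weakest" (most repetitive, least multiplicity) parts could be attacked by hand first. The ciphertext string and the index-of-coincidence scoring are my own illustrative choices, not anything established in this thread.

```python
def index_of_coincidence(section):
    """Chance that two randomly chosen symbols in the section match.
    Higher values mean more repetition, i.e. less multiplicity."""
    n = len(section)
    if n < 2:
        return 0.0
    counts = {}
    for sym in section:
        counts[sym] = counts.get(sym, 0) + 1
    return sum(c * (c - 1) for c in counts.values()) / (n * (n - 1))

def score_chunks(ciphertext, size):
    """Split the ciphertext into consecutive chunks of `size` symbols
    and return (start_offset, chunk, ioc), weakest (most repetitive) first."""
    chunks = [ciphertext[i:i + size] for i in range(0, len(ciphertext), size)]
    scored = [(i * size, ch, index_of_coincidence(ch))
              for i, ch in enumerate(chunks)]
    return sorted(scored, key=lambda t: -t[2])

# Hypothetical symbol string standing in for a homophonic ciphertext.
cipher = "AFKPBQAGLCRAHMDSAINETAAJOFUABKPGVACLQHW"
for start, chunk, ioc in score_chunks(cipher, 10):
    print(f"offset {start:3d}  {chunk}  IoC={ioc:.3f}")
```

The same `score_chunks` call can be rerun with sizes of 40, 50, 60, 80, and 100 to see which window length exposes the most repetitive section.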
And I wish not to cross swords.
Cheers.