Austin Obyrne wrote:

> Recent help from readers has enabled me to put the final touches to an encryption project that has been ongoing for at least 10 years such that I can now say categorically that I have an Ada-driven vector-based cryptographic scheme that can encrypt any alphanumeric character that can be keyed in from a standard keyboard.
>
> That includes the entire writable subset of ASCII which is the character set numbered 32 to 126 inclusively in ASCII decimal representation.

Sorry to be blunt, but this inability to handle binary data indicates a design flaw in your program. Any decent encryption program can encrypt arbitrary files – any data whatsoever.

> The cryptography, selectively chosen to be programmed in Ada only, is copyright registered by me in the USA and in the United Kingdom and I have not intention of agreeing to it being programmed in any other language except Ada.

Copyright doesn't allow you to forbid anyone from reimplementing your algorithm in any language they choose. Only a patent can give you that right. If you have obtained a patent on your algorithm, then it won't be widely used before the patent expires, because there are several patent-free strong ciphers available.

> if it is key-able by a standard key board I can encrypt it unbreakably.

The only encryption scheme that has been proven unbreakable is the One-Time Pad, which is quite impractical for most uses. If you can present a more practical scheme and prove that it's unbreakable, then you're going to make a big splash when you publish your paper in a cryptology journal. But if your proof doesn't hold water, then you will probably be quickly dismissed.

> The foregoing claim is made in good faith

Right, so you don't have a proof, and what you meant is that you haven't been able to break your own cipher. Any fool can come up with a cipher that he himself can't break.
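As an aside, both points above are easy to demonstrate concretely. A one-time pad works on raw bytes, so it handles arbitrary binary data with no notion of a "keyboard character" at all. Here is a minimal Python sketch (illustrative only, not the poster's Ada scheme):

```python
import os

def otp_xor(data: bytes, key: bytes) -> bytes:
    # A one-time pad XORs each data byte with a key byte.
    # The key must be truly random, at least as long as the
    # message, and never reused -- those conditions are what
    # the proof of unbreakability rests on.
    assert len(key) >= len(data), "key must cover the whole message"
    return bytes(d ^ k for d, k in zip(data, key))

# Works on any bytes whatsoever, not just printable ASCII 32-126:
message = bytes([0, 31, 65, 127, 200, 255])  # arbitrary binary data
key = os.urandom(len(message))               # fresh random pad
ciphertext = otp_xor(message, key)
recovered = otp_xor(ciphertext, key)         # XOR is its own inverse
assert recovered == message
```

The impracticality is also visible here: the pad is as long as the message and must be delivered to the recipient over some already-secure channel, which is exactly the problem encryption was supposed to solve.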
When an algorithm has been published and multiple experienced cryptologists have scrutinized it and not found any significant weaknesses, that's when it's time to start thinking that the cipher might be good.

Björn Persson