From mboxrd@z Thu Jan 1 00:00:00 1970
From: dewar@merv.cs.nyu.edu (Robert Dewar)
Subject: Re: Ada vs. C: performance, size
Date: 1997/01/10
Message-ID: #1/1
References: <9701092022.AA15007@most>
Organization: New York University
Newsgroups: comp.lang.ada

Responding to one quote in Wes' message:

"I have been reading articles on code optimation problems with the Verdix compiler and was wondering if code size would be roughly the same on equivalent pieces of "C" and "Ada" code. One condition of course is that no optimizations is used on both, and Ada would be allowed to suppress checking. I would assume that they would be fairly close. Just a thought...."

Comparing quality of code with "no optimizations" is silly and tells you nothing. I can't see how it is an advantage of one compiler over another that, when you tell both compilers to generate lousy code, one generates better code under those conditions.

As to compilers generating errors with optimization on: yes, I know this has been a pattern for some compilers in the past, and we often find that people using GCC or GNAT assume that the code will be more reliable at -O0 than at -O2. In fact this is not the case. We have relatively few code generation problems (after all, at this stage the GCC code generator is pretty well shaken down), and those that we do have are as likely, or perhaps even more likely, to occur at -O0 than at -O2, since in practice virtually all GCC production code is delivered with at least -O1 optimization.
The quality of code at -O0 is (by design and intention) horrible -- GCC really believes you if you say "no optimization" and generates lots of junk!

I am certainly not saying you will never find a case with GCC or GNAT where turning on optimization causes problems, just that the frequency of such occurrences is very low, and probably no higher than that of turning *off* optimization causing problems. Note that by problems here I mean code generation problems, not all problems that can occur in your code. It is quite often the case that optimization will reveal underlying problems in your code. Apparently harmless erroneous programs can become not so harmless when optimized, since the optimizer is allowed to "believe" that it is dealing with correctly written code. So you can often find examples of poorly written code where turning optimization on (or moving from one Ada compiler to another) reveals previously unseen problems. In our experience of porting large Ada codes to GNAT, this kind of occurrence represents a significant part of the porting effort in some cases.

I'll give one example. In one program we worked on, there was a large buffer defined as an array of characters, which on the machine we were working on (Sun Solaris) was byte aligned -- perfectly reasonable in itself. The code, however, passed the address of the array to an external C routine, which, after many layers of passing backwards and forwards, treated the buffer as an array of words, causing an alignment trap. Of course this is a bug in the code: the alignment of the buffer needed to be specified (possible in Ada 95, not easily possible in Ada 83), and it was easily fixed once understood. But this kind of problem is not unusual.