comp.lang.ada
* Re: Assuming optimization? What is best of these code alternatives?
       [not found] <0868c42e-ed44-4b36-a929-2bffb338ee34@googlegroups.com>
@ 2014-09-11 13:34 ` J-P. Rosen
  2014-09-11 14:48   ` Adam Beneschan
  2014-09-12  4:32   ` Per Sandberg
  2014-09-11 14:19 ` gautier_niouzes
                   ` (3 subsequent siblings)
  4 siblings, 2 replies; 9+ messages in thread
From: J-P. Rosen @ 2014-09-11 13:34 UTC (permalink / raw)


On 11/09/2014 15:14, reinkor wrote:
> I am not sure to what extent one may assume the Ada compiler 
> makes optimization.
It does. An Ada compiler without optimization would not be usable.
(and I worked on Ada/ED !)

> For example, assume the following two program 
> code alternatives:
> 
> 
> -----------------------Alternative 1:
>    i1,i2,k : Integer;
> begin
>     (Read i1,i2 and n - do not change after)
> 
>     k := i1 + i2; -- store "i1 + i2" in k to avoid recalculation (?)
>     loop
>       for i in k .. n loop
>           Do something...
>       end loop;
>     end loop;
> 
> ------------------------Alternative 2:
>    i1,i2 : Integer;
> begin
>     (Read i1, i2 and n - do not change after)
> 
>     loop
>       for i in i1 + i2 .. n loop -- will recalculate "i1 + i2" ?
>           Do something...
>       end loop;
>     end loop;
> 
> What is the "best" alternative?  Alternative 2 
> is shortest (and probably easier to read), but
> could it give slower code than Alternative 1?
> 
Response 1: don't bother with such micro-optimization
Response 2: the language requires that the bounds be evaluated only
once, not on each loop. Therefore, it won't make any difference.

FWIW, here is my favorite collection of quotes about optimization:

Kernighan & Plauger, 1974.  Elements of Programming Style:
- Make it right before you make it faster.
- Keep it right when you make it faster.
- Make it clear before you make it faster.
- Don't sacrifice clarity for small gains in "efficiency."
- Let your compiler do the simple optimizations.
- Keep it simple to make it faster.
- Don't diddle code to make it faster - find a better algorithm.
- Instrument your programs.  Measure before making "efficiency" changes.

Ledgard, Nagin, Hueras, 1979. Pascal with Style: Programming Proverbs:
Shortening the code, running the program faster, or using fewer
variables are all popular pastimes. Not mentioning ... the extra testing
time needed to check the new and often subtle boundary conditions, are
you sure that fewer machine instructions or faster machine execution is
likely?

M.A. Jackson
Rules of Optimization:
- Rule 1: Don't do it.
- Rule 2 (for experts only): Don't do it yet.

W.A. Wulf
"More computing sins are committed in the name of efficiency (without
necessarily achieving it) than for any other single reason - including
blind stupidity."

Donald Knuth
"We should forget about small efficiencies, say about 97% of the time:
premature optimization is the root of all evil."

-- 
J-P. Rosen
Adalog
2 rue du Docteur Lombard, 92441 Issy-les-Moulineaux CEDEX
Tel: +33 1 45 29 21 52, Fax: +33 1 45 29 25 00
http://www.adalog.fr



* Re: Assuming optimization? What is best of these code alternatives?
       [not found] <0868c42e-ed44-4b36-a929-2bffb338ee34@googlegroups.com>
  2014-09-11 13:34 ` Assuming optimization? What is best of these code alternatives? J-P. Rosen
@ 2014-09-11 14:19 ` gautier_niouzes
  2014-09-11 14:49   ` Adam Beneschan
  2014-09-11 17:30 ` Jeffrey Carter
                   ` (2 subsequent siblings)
  4 siblings, 1 reply; 9+ messages in thread
From: gautier_niouzes @ 2014-09-11 14:19 UTC (permalink / raw)


On Thursday, September 11, 2014 15:14:39 UTC+2, reinkor wrote:
> I am not sure to what extent one may assume the Ada compiler 
> makes optimization.  For example, assume the following two program 
> code alternatives:
[...]
>     k := i1 + i2; -- store "i1 + i2" in k to avoid recalculation (?)
>     loop
>       for i in k .. n loop

In any case the value i1 + i2 is the initial value for the loop, so even a totally non-optimizing compiler will compute the initial loop value once.
So in this case, the presence of k will be, in terms of performance, either:
  - neutral (optimized compilation), or
  - slightly slower (a naive compiler will store i1 + i2 into k, then copy k into i when initializing the loop).
So here k will only be useful if you think it is more readable, especially if k appears in many places.
_________________________ 
Gautier's Ada programming 
http://sf.net/users/gdemont/




* Re: Assuming optimization? What is best of these code alternatives?
  2014-09-11 13:34 ` Assuming optimization? What is best of these code alternatives? J-P. Rosen
@ 2014-09-11 14:48   ` Adam Beneschan
  2014-09-12  4:32   ` Per Sandberg
  1 sibling, 0 replies; 9+ messages in thread
From: Adam Beneschan @ 2014-09-11 14:48 UTC (permalink / raw)


On Thursday, September 11, 2014 6:34:44 AM UTC-7, J-P. Rosen wrote:

> > begin
> >     (Read i1, i2 and n - do not change after)
> >     loop
> >       for i in i1 + i2 .. n loop -- will recalculate "i1 + i2" ?
> >           Do something...
> >       end loop;
> >     end loop;

> Response 1: don't bother with such micro-optimization
> Response 2: the language requires that the bounds be evaluated only
> once, not on each loop. Therefore, it won't make any difference.

I think you missed the outer loop.  The for loop is an inner loop, and i1+i2 does need to be recalculated for every iteration of the outer loop.  However, the OP has stated that i1 and i2 will not change, and his proposed initialization takes place outside the outer loop.

The question is: if i1 and i2 won't change, will the compiler optimize so that it doesn't have to re-perform the addition on each iteration of the outer loop?  The only answer is to try it with your compiler and see.
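One way to "try it and see" is to look at the generated assembly. A sketch of such a check with GNAT (the file name is hypothetical, and the instruction to grep for depends on your target):

```shell
# Assumes GNAT is installed; emit assembly for the optimized build.
gcc -c -S -O2 alternative2.adb -o alternative2.s

# If the compiler hoisted i1 + i2 out of the outer loop, the add
# appears once before the outer loop's label rather than inside it.
grep -n -A2 "\.L" alternative2.s | less
```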

                                  -- Adam




* Re: Assuming optimization? What is best of these code alternatives?
  2014-09-11 14:19 ` gautier_niouzes
@ 2014-09-11 14:49   ` Adam Beneschan
  0 siblings, 0 replies; 9+ messages in thread
From: Adam Beneschan @ 2014-09-11 14:49 UTC (permalink / raw)


On Thursday, September 11, 2014 7:20:00 AM UTC-7, gautier...@hotmail.com wrote:

> In any case the value i1 + i2 is the initial value for the loop, so even a totally non-optimizing compiler will compute the initial loop value once.

You also missed the outer loop.  Please see my other response.

                                 -- Adam




* Re: Assuming optimization? What is best of these code alternatives?
       [not found] <0868c42e-ed44-4b36-a929-2bffb338ee34@googlegroups.com>
  2014-09-11 13:34 ` Assuming optimization? What is best of these code alternatives? J-P. Rosen
  2014-09-11 14:19 ` gautier_niouzes
@ 2014-09-11 17:30 ` Jeffrey Carter
  2014-09-11 18:12 ` Stefan.Lucks
  2014-09-12  6:17 ` anon
  4 siblings, 0 replies; 9+ messages in thread
From: Jeffrey Carter @ 2014-09-11 17:30 UTC (permalink / raw)


On 09/11/2014 06:14 AM, reinkor wrote:
> 
> What is the "best" alternative?  Alternative 2 
> is shortest (and probably easier to read), but
> could it give slower code than Alternative 1?

The question, as always, is what are your timing requirements, and does the
clearest version meet them? If so, then there's no need to waste effort worrying
about "efficiency".

The optimization algorithm is

Write the clearest code

loop
   Measure the result

   exit when measurement meets the requirements

   Determine what's using the most time, and improve it,
      keeping it as clear as possible
end loop;

Other than GNAT with -O0, I'm not aware of any compiler that does no
optimization. 25 years ago an Ada-83 compiler produced smaller and faster code
than assembler optimized by a team of experts, and optimizers have surely
improved since then. So it's folly to think a human can do better than the
compiler without some evidence to the contrary.

-- 
Jeff Carter
"Clear? Why, a 4-yr-old child could understand this
report. Run out and find me a 4-yr-old child. I can't
make head or tail out of it."
Duck Soup
94



* Re: Assuming optimization? What is best of these code alternatives?
       [not found] <0868c42e-ed44-4b36-a929-2bffb338ee34@googlegroups.com>
                   ` (2 preceding siblings ...)
  2014-09-11 17:30 ` Jeffrey Carter
@ 2014-09-11 18:12 ` Stefan.Lucks
  2014-09-12  6:17 ` anon
  4 siblings, 0 replies; 9+ messages in thread
From: Stefan.Lucks @ 2014-09-11 18:12 UTC (permalink / raw)



On Thu, 11 Sep 2014, reinkor wrote:

>    k := i1 + i2; -- store "i1 + i2" in k to avoid recalculation (?)
>    loop
>      for i in k .. n loop
>          Do something...
>      end loop;
>    end loop;

>    loop
>      for i in i1 + i2 .. n loop -- will recalculate "i1 + i2" ?
>          Do something...
>      end loop;
>    end loop;
>
> What is the "best" alternative?

I assume you turn on the compiler's optimisation setting. I didn't try it, 
but I would expect the following:

   - If "Do something" is sufficiently simple, the Ada compiler will
     find out that i1, i2 and, for the first variant, k, don't
     change. Then the generated code should be the same.

   - If "Do something" is so complex that the Ada compiler is unable to
     find out that neither of i1, i2, and k change, you are unlikely to
     notice any difference in speed, since the time to compute i1+i2 is
     will be negligible, compared to the time for "Do something".

Nevertheless, if you want to make sure the compiler notices that neither 
i1 nor i2 change, you could declare them as constants:

    declare
      I1: constant Some_Type := Get_I1;
      I2: constant Some_Type := Get_I2;
    begin
      loop
        for I in I1+I2 .. N loop
          Do_Something;
        end loop;
      end loop;
    end;

------  I  love  the  taste  of  Cryptanalysis  in  the morning!  ------
     <http://www.uni-weimar.de/cms/medien/mediensicherheit/home.html>
--Stefan.Lucks (at) uni-weimar.de, Bauhaus-Universität Weimar, Germany--


* Re: Assuming optimization? What is best of these code alternatives?
  2014-09-11 13:34 ` Assuming optimization? What is best of these code alternatives? J-P. Rosen
  2014-09-11 14:48   ` Adam Beneschan
@ 2014-09-12  4:32   ` Per Sandberg
  1 sibling, 0 replies; 9+ messages in thread
From: Per Sandberg @ 2014-09-12  4:32 UTC (permalink / raw)


Well,
it all depends on the compiler and target architecture, so I would recommend 
the following actions, in order:
1 Read the compiler's manual on optimization flags.
2 Get yourself a decent analyzer.
3 If you are running GNAT, compile your program with "-O2".
4 Measure.
5 Analyze the results; they are usually surprising.
6 Make code changes.
7 Go back to 3.

So the usual approach is:
1) Write the program as readable and understandable as possible.
2) Analyze the execution behavior on the real target with real data.
3) Optimize after deep analysis and reflection.

The measurement tools usually point to parts of the code and constructs 
that are the complete opposite of what you expected.
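A minimal sketch of such a measurement workflow with GNAT and gprof (file and input names are hypothetical):

```shell
# Build with optimization plus profiling instrumentation (-pg)
gnatmake -f -O2 main.adb -cargs -pg -largs -pg

# Run on representative data; this writes gmon.out
./main < real_input.dat

# Flat profile: see where the time actually goes before changing code
gprof ./main gmon.out | head -40
```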

/Good quotes J-P
/Per


On 11.09.2014 15:34, J-P. Rosen wrote:
> [...]



* Re: Assuming optimization? What is best of these code alternatives?
       [not found] <0868c42e-ed44-4b36-a929-2bffb338ee34@googlegroups.com>
                   ` (3 preceding siblings ...)
  2014-09-11 18:12 ` Stefan.Lucks
@ 2014-09-12  6:17 ` anon
  2014-09-12  8:06   ` reinkor
  4 siblings, 1 reply; 9+ messages in thread
From: anon @ 2014-09-12  6:17 UTC (permalink / raw)


Actually, since the range ( i1 + i2 .. n ) for the "for statement" is 
evaluated once, alternative 2 is the faster code, because in 
alternative 1 the system must reference ( k ) at least once and 
possibly twice, plus provide storage for k.

Now, for optimization in Ada, the RM provides a "pragma Optimize" with 
two options, "Space" and "Time".  But in GNAT "pragma Optimize" is 
non-functional; instead, GNAT at the moment uses the GCC backend option 
-OX, where X is 0 to 3, to perform optimization.

Now, since GNAT is a front end to the compiler, it would be possible for 
GNAT to use "pragma Optimize ( Space );" to remove source-level dead 
code, while "pragma Optimize ( Time );" could turn on optimization in 
the backend to a preset default, or allow a command-line option to 
override that default.

Note: GCC does not have an option to remove dead code from precompiled 
libraries or routines during linking.


In <0868c42e-ed44-4b36-a929-2bffb338ee34@googlegroups.com>, reinkor <reinkor@gmail.com> writes:
> [...]
>Yes, this example is somewhat over-simplified :-)
>
>




* Re: Assuming optimization? What is best of these code alternatives?
  2014-09-12  6:17 ` anon
@ 2014-09-12  8:06   ` reinkor
  0 siblings, 0 replies; 9+ messages in thread
From: reinkor @ 2014-09-12  8:06 UTC (permalink / raw)


OK, thanks so far. I may have oversimplified my example.
The expression "i1 + i2" may be replaced by something more complex, such
as an ordered set (container). Here is a copy from my "real code", where A, B
and C are (container) ordered sets. Is there any point in storing the results of
the following set operations before entering the loops here?

         AandB := A and B;
         AmB   := A - B;
         AmC   := A - C;
         BmA   := B - A;  -- note: "set minus" = { x | x \in B and x not \in A }
         CmA   := C - A;

         for j in pos1.j + 1 .. pos2.j - 1 loop
         for i in pos1.i + 1 .. pos2.i - 1 loop
            Bpos := (i,j) + Br0;
            Cpos := (i,j) + Cr0;
            img2(i,j) := RelInf(i,j,img1,AmB);
            if not Rel(img2(i,j),img2(Bpos.i,Bpos.j)) then
               if (for some E of BmA => Rel(img1(i + E.i,j+E.j),img2(Bpos.i,Bpos.j))) then
                  img2(i,j) := RelInf(i,j,img1,AmC);
                  if not Rel(img2(i,j),img2(Cpos.i,Cpos.j)) then
                     if (for some E of CmA => Rel(img1(i + E.i,j+E.j),img2(Cpos.i,Cpos.j))) then
                        img2(i,j) :=  RelInf(i,j,img1,AandB);
                     else
                        img2(i,j) := img2(Cpos.i,Cpos.j);
                     end if;
                  end if;
               else
                  img2(i,j) :=  img2(Bpos.i,Bpos.j);
               end if;
            end if;
         end loop;
         end loop;
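Echoing the earlier suggestion in the thread, one way to make the invariance of these sets explicit to the compiler is to compute them once as constants in a declare block. A sketch (the type name Set_Type is an assumption; the initializing expressions are from the code above):

```ada
declare
   AandB : constant Set_Type := A and B;  -- each set computed exactly once
   AmB   : constant Set_Type := A - B;
   AmC   : constant Set_Type := A - C;
   BmA   : constant Set_Type := B - A;
   CmA   : constant Set_Type := C - A;
begin
   for j in pos1.j + 1 .. pos2.j - 1 loop
      for i in pos1.i + 1 .. pos2.i - 1 loop
         null;  -- loop body as above, unchanged
      end loop;
   end loop;
end;
```

Since the constants cannot change inside the loops, the compiler need not re-evaluate the set operations, whatever the loop body does.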


