comp.lang.ada
* Re: 9X and the NEED for preprocessing
       [not found] <629648260@<1825>
@ 1989-12-18 19:09 ` stt
  1989-12-19 21:29   ` preprocessing & optimization William Thomas Wolfe, 2847 
  1989-12-19 22:12   ` 9X and the NEED for preprocessing arny.b.engelson
  0 siblings, 2 replies; 6+ messages in thread
From: stt @ 1989-12-18 19:09 UTC (permalink / raw)



With regard to Ada preprocessors, and Ada9X:

First of all, preprocessors create no optimization problem,
since they operate at the lexical, or possibly syntactic, level,
long before the optimizer takes a look at the program.

However, I am not a great fan of Ada preprocessors.
We have implemented a compiler system and development tools
for 6 targets and 7 hosts without using a preprocessor.
Our general strategy is to define one or more target/host-independent
package specs with target/host-dependent bodies.
We minimize the size of such packages, and simply reimplement
them for each distinct target/host.

Sometimes, the package spec is target/host-dependent as well,
but only in its definitions, not in the names defined (e.g.,
one host might define the type "Link_Name" as an 8-character string,
another might define it as a 30-character string).
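
A minimal sketch of this strategy (package and subprogram names invented for illustration): the spec is shared in form across hosts, differing only in a definition, while each target/host supplies its own body, so a configuration is selected simply by choosing which source files go into the build.

```ada
--  link_names.ads (one copy per host; only the definition varies,
--  never the names defined):
package Link_Names is
   type Link_Name is new String (1 .. 8);  --  e.g., 1 .. 30 on another host
   function Object_Suffix return Link_Name;
end Link_Names;

--  link_names.adb (one body per target/host):
package body Link_Names is
   function Object_Suffix return Link_Name is
   begin
      return "MAIN.OBJ";  --  host-specific value, exactly 8 characters here
   end Object_Suffix;
end Link_Names;
```

Client code depends only on the spec, so swapping the body (or, where the spec's definitions differ, the matching spec/body pair) requires no source changes elsewhere.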

The net effect of this approach is that a particular configuration
is determined by a set of source files, not a set of preprocessor
switches.  Also, for cases where we do restrict differences to bodies,
we can select a different configuration at link time by choosing
a distinct "implementation catalog variant" (in Intermetrics AIE-speak),
requiring no recompilation.

Anyway, so much for truth and beauty.  If there is a compelling
argument for a standardized preprocessor, I am sure that the Ada9X
process will be willing to consider it, even though the "official"
public revision request period is over.  The Ada9X process is
going to include a number of public reviews, and the various
project teams working on Ada9X will continue to keep their
ears open for brilliant and/or urgent proposals.

S. Tucker Taft  (Ada9X DR -- aka "distinguished" reviewer)
Intermetrics, Inc.
Cambridge, MA  02138

^ permalink raw reply	[flat|nested] 6+ messages in thread

* Re: preprocessing & optimization
  1989-12-18 19:09 ` 9X and the NEED for preprocessing stt
@ 1989-12-19 21:29   ` William Thomas Wolfe, 2847 
  1989-12-21 20:58     ` stt
  1989-12-19 22:12   ` 9X and the NEED for preprocessing arny.b.engelson
  1 sibling, 1 reply; 6+ messages in thread
From: William Thomas Wolfe, 2847  @ 1989-12-19 21:29 UTC (permalink / raw)


From stt@inmet.inmet.com:
> preprocessors create no optimization problem,
> since they operate at the lexical, or possibly syntactic, level,
> long before the optimizer takes a look at the program.

   Are you sure that this is true in the general case?  (Not just
   for current compiler products, but in the general case?)  If a 
   preprocessor takes a high-level construct and reduces it to a 
   series of lower-level constructs, then isn't there a certain 
   loss of exploitable semantic content?

   Perhaps many present optimizers are not engaging in this level
   of sophistication, but analytically speaking, the most heavy-duty 
   of optimizing compilers would squeeze the final drops from every 
   single bit of semantic knowledge available to it, using semantics 
   to develop sophisticated reasoning leading to proofs that certain
   optimizations can be safely performed.

   If something like Anna/TSL were incorporated into Ada, and the
   preprocessor were to preserve all the semantic information during
   transformation (inserting annotations to restore the semantics 
   lost during preprocessing), then under those conditions the actions
   of a preprocessor would probably not incur any penalty.  But if this
   is not the case, then it would seem, at least theoretically, that
   there would be considerable losses in the potential for optimization. 


   Bill Wolfe, wtwolfe@hubcap.clemson.edu


* Re: 9X and the NEED for preprocessing
  1989-12-18 19:09 ` 9X and the NEED for preprocessing stt
  1989-12-19 21:29   ` preprocessing & optimization William Thomas Wolfe, 2847 
@ 1989-12-19 22:12   ` arny.b.engelson
  1 sibling, 0 replies; 6+ messages in thread
From: arny.b.engelson @ 1989-12-19 22:12 UTC (permalink / raw)


In article <20600027@inmet> stt@inmet.inmet.com writes:
>
>With regard to Ada preprocessors, and Ada9X:
>
>However, I am not a great fan of Ada preprocessors.
>We have implemented a compiler system and development tools
>for 6 targets and 7 hosts without using a preprocessor.

I doubt there is a situation that REQUIRES the use of a preprocessor, but
that doesn't mean we shouldn't have one.  We don't really NEED a "for loop"
in Ada, nor recursion, nor many other features that one programmer chooses
to use while another does not.  That doesn't mean we shouldn't have them.

>Our general strategy is to define one or more target/host-independent
>package specs with target/host-dependent bodies.
>We minimize the size of such packages, and simply reimplement
>them for each distinct target/host.
>Sometimes, the package spec is target/host-dependent as well,
>but only in its definitions, not in the names defined (e.g.,
>one host might define the type "Link_Name" as being an 8 character string,
>another might define it as being a 30 character string).

This is my preferred strategy as well, and I believe it is the "better"
method (since it stays within the language), but not all programmers agree.

Differences in package specs generally require a lot more compilation
to go from one target to another.  I think it also tends to be harder to
split the constant code from the code that varies by target when you are
dealing with package specs.  There tend to be a lot of dependencies.

>The net effect of this approach is that a particular configuration
>is determined by a set of source files, not a set of preprocessor
>switches.  Also, for cases where we do restrict differences to bodies,
>we can select a different configuration at link time by choosing
>a distinct "implementation catalog variant" (in Intermetrics AIE-speak),
>requiring no recompilation.

This doesn't work when your targets use compilers from different companies.
Sometimes you have no choice in picking your compiler(s) or your targets.
Also, what do you do when you have different variations of a package body,
and must change a piece of code that is common to all of them (but was too
difficult to split out)? You end up making the same changes in each version
of the package body.

>Anyway, so much for truth and beauty.  If there is a compelling
>argument for a standardized preprocessor, I am sure that the Ada9X
>process will be willing to consider it, even though the "official"
>public revision request period is over.  The Ada9X process is
>going to include a number of public reviews, and the various
>project teams working on Ada9X will continue to keep their
>ears open for brilliant and/or urgent proposals.
>
>S. Tucker Taft  (Ada9X DR -- aka "distinguished" reviewer)
>Intermetrics, Inc.
>Cambridge, MA  02138

As to what would constitute a compelling argument, I don't know.  But, I
can say that there are times a preprocessor is a handy thing.  We have a
case with multiple targets and multiple compilers (sorry, I can't be very
specific), resulting in up to 10 different versions of some parts of
a package.  This includes some targets with functionality left out, other
functionality added, different record (bit) layout, etc.  Some vary in the
definitions.  We also have different customers with slightly different
requirements and using different subsets of target processors.

Imagine maintaining 10 different versions of a file and ensuring that they
are all functionally equivalent (in their common areas).  Some of this is
reduced by properly separating the common code from the code that changes,
but in some cases this is not possible.  A preprocessor can ease
maintenance and testing, and provides a convenient way to add/remove
performance monitoring and debugging code.  It's not for me, but it may be
for others, and therefore we will all benefit from it being standardized.

As for the Ada9X process, I don't think it necessary that a standardized
preprocessor effort be tied to Ada9X.  A preprocessor is not part of the
language, it is a support tool.  It can easily be a separate effort
(such as CAIS, Ada/POSIX, Ada/SQL, etc.).  Besides, I think it is a more
difficult task than it may first seem.  What I think can end up being a
problem is contractual legalities (the code is not pure Ada).  Of course,
we could MAKE a preprocessor part of the language, forcing all compilers
to support it, adding ACVC tests for it, etc.

  -- Arny Engelson   att!wayback!arny


* Re: preprocessing & optimization
  1989-12-19 21:29   ` preprocessing & optimization William Thomas Wolfe, 2847 
@ 1989-12-21 20:58     ` stt
  1989-12-22 20:39       ` Tucker's new proposal William Thomas Wolfe, 2847 
  0 siblings, 1 reply; 6+ messages in thread
From: stt @ 1989-12-21 20:58 UTC (permalink / raw)



Regarding preprocessors, optimization, and standardization:

I was talking about preprocessors in the "C Preprocessor" vein.
E.g., defines, ifdefs, etc.

I am sorry if I gave the impression I was talking about
a preprocessor like C++ 2.0.  Such a "preprocessor" is
really a full compiler which happens to emit C rather than
object code.  For the C++ kind of "preprocessor," you are absolutely
right that optimization of the output can be more difficult.

I also presume that Dave Emery's proposal for Ada was oriented toward
the conditional compilation type of preprocessor.

Frankly, I had personally written off the concept of a standard
preprocessor for Ada.  However, seeing the interest in Dave's
proposal, it seems worth considering again.  
As a compiler implementor, I hated the old "pragma Include" because
it made automatic recompilation a different kind of problem.
As it is now, it is possible to recompile from parse trees.
With "pragma Include," that was effectively impossible.

However, I *can* imagine a careful definition of conditional
compilation mechanisms which would preserve the ability to
recompile from parse trees.  This would *not* be a preprocessor
approach, however.  Instead, it would mean creating new
syntax.  Here is a quick proposal:

   Allow if constructs and case constructs within a sequence of
   declarations, so long as all conditional and case
   expressions are static, and each if/case "arm" contains
   only declarations (rather than statements).

Note that variant records already use the case construct at the
end of a list of declarations.  This proposal would simply generalize
this a bit (?), and allow the case expression to be an arbitrary
static expression as well as the currently-legal discriminant.
Furthermore, identifiers *could* be reused in mutually exclusive
if/case arms, since this would not interfere with
compile-time type checking.

One could even imagine allowing if/case constructs in context clauses,
to do conditional "with"ing.
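
Under such a scheme, conditional "with"ing might look like the sketch below. This is hypothetical syntax only, not legal Ada; "Parameters", "VAX_IO", and "Portable_IO" are invented names for illustration.

```ada
--  Hypothetical syntax: an if construct in a context clause,
--  selecting the dependency via a static expression.
if Parameters.Target = Parameters.VAX then
   with VAX_IO;
else
   with Portable_IO;
end if;
package Fee is
   --  ...
end Fee;
```

As with the declaration case, the compiler could evaluate the static condition at parse time and retain only the selected context clause.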

The nice thing about this kind of approach is that it builds on the
existing concepts of the language (namely static expressions,
if/case constructs, and variant records), rather than introducing
a new/foreign syntax like "#ifdef".  

It is interesting to compare the situation
in Ada with that in C++.  Now that C++ has the concepts
of inlines and "consts", the need for preprocessors is reduced for C++,
though apparently not eliminated (for the same reasons that Dave
is providing for Ada).
It seems like it might be a better solution in the C++ arena
to also start allowing conditional constructs among declarations,
and strip them out as part of generating C (presuming that the
C++ "compiler" already has to be able to evaluate static expressions).

Here is an example demonstrating this proposal (though I admit
this example could probably be done better using other methods):

    with Parameters;
    package Fum is
        case Parameters.Target is
            when Parameters.VAX =>
                type Target_Double is digits 11;
            when Parameters.S370 =>
                type Target_Double is digits 14;
            when others =>
                type Target_Double is digits 15;
        end case;
        . . .
    end Fum;

Probably an important additional requirement would be that
the arms of an if/case construct which are not chosen
should not be required to be semantically correct (though
they would still require syntactic correctness).  This
should apply whether the if/case construct is acting as a declaration
or a statement.  This would allow reference in such unselected arms
to declarations which were similarly unselected.  This also
has the net effect of mandating conditional compilation, rather
than leaving it as an optional "optimization."

And while we're at it, AI-00128, which disallows the use of short-circuit
control forms in static expressions, should probably be revisited.
-----------------------
S. Tucker Taft   uunet!inmet!stt; taft@ajpo.sei.cmu.edu
Intermetrics, Inc.
Cambridge, MA  02138


* Re: Tucker's new proposal
  1989-12-21 20:58     ` stt
@ 1989-12-22 20:39       ` William Thomas Wolfe, 2847 
  1989-12-30  5:22         ` Metafont Consultant Account
  0 siblings, 1 reply; 6+ messages in thread
From: William Thomas Wolfe, 2847  @ 1989-12-22 20:39 UTC (permalink / raw)


From stt@inmet.inmet.com:
> I *can* imagine a careful definition of conditional
> compilation mechanisms which would preserve the ability to
> recompile from parse trees.  This would *not* be a preprocessor
> approach, however.  [...] 
%
%    Allow if constructs and case constructs within a sequence of
%    declarations, so long as all conditional and case
%    expressions are static, and each if/case "arm" contains
%    only declarations (rather than statements).
% 
> The nice thing about this kind of approach is that it builds on the
> existing concepts of the language (namely static expressions,
> if/case constructs, and variant records), rather than introducing
> a new/foreign syntax like "#ifdef".  

   I must be getting a bit too much of the spiked eggnog myself,
   because this is actually starting to sound like not too bad an
   idea...  can anyone think of counterarguments?  


   Bill Wolfe, wtwolfe@hubcap.clemson.edu


* Re: Tucker's new proposal
  1989-12-22 20:39       ` Tucker's new proposal William Thomas Wolfe, 2847 
@ 1989-12-30  5:22         ` Metafont Consultant Account
  0 siblings, 0 replies; 6+ messages in thread
From: Metafont Consultant Account @ 1989-12-30  5:22 UTC (permalink / raw)


In article <7518@hubcap.clemson.edu> billwolf%hazel.cs.clemson.edu@hubcap.clemson.edu writes:
>From stt@inmet.inmet.com:
>> I *can* imagine a careful definition of conditional
>> compilation mechanisms which would preserve the ability to
>> recompile from parse trees.  This would *not* be a preprocessor
>> approach, however.  [...] 
>%
>%    Allow if constructs and case constructs within a sequence of
>%    declarations, so long as all conditional and case
>%    expressions are static, and each if/case "arm" contains
>%    only declarations (rather than statements).
>% 
>> The nice thing about this kind of approach is that it builds on the
>> existing concepts of the language (namely static expressions,
>> if/case constructs, and variant records), rather than introducing
>> a new/foreign syntax like "#ifdef".  
>
>   I must be getting a bit too much of the spiked eggnog myself,
>   because this is actually starting to sound like not too bad an
>   idea...  can anyone think of counterarguments?  
>
>
>   Bill Wolfe, wtwolfe@hubcap.clemson.edu


Sure.  If you implement and encourage use of a "cpp" type
preprocessor, you completely demotivate the efforts to actually clean
up the portability problems in the existing language standard by
providing an out, and the ideal of programs that compile and execute
the same on all (reasonably configuration compatible) platforms goes
right down the tubes.  The completely predictable result is Ada
programs characterized by the nest of twisty #ifdef's, all different,
that make maintenance of any substantial C program such a heartache.
Not A Good Thing(tm).  ;-)

Again, my opinion, not the account furnisher's.
xanthian@well.sf.ca.us
Kent, the (bionic) man from xanth, now available
as a build-a-xanthian kit at better toy stores.

