From mboxrd@z Thu Jan  1 00:00:00 1970
From: Brian Rogoff
Subject: Re: Overload Resolution in Ada (Re: The Red Language)
Date: 1997/09/19
Message-ID: #1/1
References: <340E2DC5.25D7@worldnet.att.net>
Newsgroups: comp.lang.ada

On Fri, 19 Sep 1997, Robert A Duff wrote:
> In article ,
> Brian Rogoff wrote:
> >My apologies for the off topic post...
>
> Heh?  Surely you jest!

I don't jest, and my name isn't Shirley :-).

Yeah, I was kidding. I read news only once a day now, and snide little
remarks like that relieve my annoyance at the Eternal Interlingual
Flamewar whose skirmishes are now making a constant din on cla. I'll
use a ":-)" next time.

> >Out of curiosity, what is the worst case running time(s) of the two pass
> >algorithm(s)?

I suppose I should have elaborated further, though you took a very nice
stab at an answer. I'm not suggesting that overload resolution is in any
way a limiting factor on the speed of an Ada compiler; I'm just slowly
trying to understand how automatic generic instantiation could be made
to work, and there is a great deal of interaction between overload
resolution and type inference (as I'm sure you know, ML type inference
is exponential in the worst case, but not too bad in practice :-).

> Umm, I don't know.  I guess you can make it arbitrarily slow.  Are you
> asking for O(f(N)), where N is the number of nodes in the expression
> tree (in the "complete context", which is a single statement or
> declaration or whatever).

Yes, although you can certainly introduce other variables into the big-O
estimate for the other factors which matter.
I'm most interested in average case behavior. We can factor out symbol
table complexity and similar problem areas (those that can be tackled by
complexity-reducing "tricks") to get a first approximation of the
complexity.

> Well, maybe it's not totally useless.  I guess I would worry if the
> compiler had to look at each node O(N**2) times.  Hmm.  N isn't usually
> very big in Ada, since it's not an expression language, but it can be --
> I've seen things like "X := (... 10_000 component positional aggregate ...);"
> in machine-generated code.  Note that Ada has a special rule for
> aggregates -- the compiler has to figure out the type of the aggregate
> *before* looking inside it.

Good point. It's hard to anticipate what will be done in machine-generated
code -- well, actually it's easy; I guess what I mean is that I'd never
expect to see a 10_000 component positional aggregate in *my* code :-).

> Anyway, in an *optimizing* compiler, overload resolution is probably not
> the bottleneck.

Sure. As I said, this is really an academic sort of question. We can now
return to the "God only codes in Eiffel" threads. :-)

-- Brian
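For anyone following along: a toy sketch of the kind of two-pass scheme
being discussed, in Python rather than a real compiler's terms. The
overload table, node shape, and type names here are all made up for
illustration -- this is not GNAT's or anyone's actual algorithm, just
the general idea: a bottom-up pass collects every type each subtree
could have, and a top-down pass lets the context pick the unique
interpretation, so each pass touches each of the N nodes once.

```python
from dataclasses import dataclass, field

# Hypothetical overload table: operator name -> list of
# (parameter_types, result_type). A bare numeric literal is modeled
# as a nullary "lit" that could denote either an Int or a Float.
OVERLOADS = {
    "+":   [(("Int", "Int"), "Int"), (("Float", "Float"), "Float")],
    "lit": [((), "Int"), ((), "Float")],
}

@dataclass
class Node:
    op: str
    kids: list = field(default_factory=list)
    candidates: set = field(default_factory=set)  # filled by pass 1
    resolved: str = None                          # filled by pass 2

def pass1_up(n: Node) -> set:
    """Bottom-up: compute every result type this subtree could have."""
    kid_sets = [pass1_up(k) for k in n.kids]
    n.candidates = {
        res for params, res in OVERLOADS[n.op]
        if len(params) == len(kid_sets)
        and all(p in s for p, s in zip(params, kid_sets))
    }
    return n.candidates

def pass2_down(n: Node, expected: str) -> None:
    """Top-down: the context fixes the type; push it into the children."""
    if expected not in n.candidates:
        raise TypeError(f"no interpretation of {n.op} as {expected}")
    n.resolved = expected
    for params, res in OVERLOADS[n.op]:
        if (res == expected and len(params) == len(n.kids)
                and all(p in k.candidates
                        for p, k in zip(params, n.kids))):
            for p, k in zip(params, n.kids):
                pass2_down(k, p)
            return

# "lit + lit" is ambiguous bottom-up ({Int, Float}); a context that
# demands Float resolves the whole tree, literals included.
e = Node("+", [Node("lit"), Node("lit")])
pass1_up(e)
pass2_down(e, "Float")
print(e.resolved, e.kids[0].resolved)  # Float Float
```

Note the aggregate rule Bob mentions goes the other way: there the
expected type must be known *before* descending into the aggregate, i.e.
the top-down pass does the work and the components never contribute
candidates upward.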