From: "Jed"
Newsgroups: comp.lang.c++,comp.lang.ada
Subject: Re: Why use C++?
Date: Sat, 13 Aug 2011 02:53:32 -0500

"Dmitry A. Kazakov" wrote in message
news:1gu6ni1yb54k3$.4nbvfqqndl8m$.dlg@40tude.net...
> On Fri, 12 Aug 2011 14:06:55 -0500, Jed wrote:
>
>> "Dmitry A. Kazakov" wrote in message
>> news:fwnlcp4mgj03$.1tjeqjtos00o8$.dlg@40tude.net...
>
>>> I want the type semantics specified. C/C++ int is neither lower level
>>> nor closer to the machine it is just ill-defined. The concern is
>>> this, not its relation to the machine, of which I (as a programmer)
>>> just do not care.
>>
>> What more definition do you want?
>
> More than what? I see nothing.

The semantics of C++ types are specified.

>> Size guarantee?
>> Other than that, what is ambiguous (platform-specific) about C++ int?
>
> That is one of the meanings of ill-defined.

Don't use it. Consider it deprecated. Some baggage stays around for
compatibility even when new constructs are introduced.

>
>>>>>>> That depends. In Ada integer types are constructed from ranges.
>>>>>>
>>>>>> Oh, I thought ranges in Ada were "subtype ranges", hence being
>>>>>> based upon the built-ins.
>>>>>
>>>>> Both:
>>>>>
>>>>> type Tiny is range -100..100; -- A new type
>>>>> subtype Nano is Tiny range 0..2; -- A subtype
>>>>
>>>> What underlies those "new" types though?
>>>
>>> That is up to the compiler.
>>
>> Do you care about the size of the type ever?
>
> You mean the size of the representation taking into account its memory
> alignment? I don't care except for the cases I have to communicate the
> hardware or to implement a communication protocol. Though I have to do
> this quite often, so I cannot represent a typical case, not so many
> people are doing embedded/protocol stuff these days. Anyway, that is
> well under 1% for me.

Interesting.
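
For that protocol/hardware case, by the way, I wouldn't reach for int at
all. Here's roughly what I mean -- a C++ sketch using the fixed-width
types from <cstdint> and spelling the byte order out by hand. The message
layout is made up for illustration, not taken from any real protocol:

#include <cstdint>
#include <cstddef>

// Hypothetical message: a 16-bit id followed by a 32-bit reading,
// both big-endian on the wire.
struct Reading {
    std::uint16_t id;
    std::int32_t  value;
};

// Write the fields most-significant byte first, regardless of what
// the host's int happens to look like.
inline std::size_t encode(const Reading& r, unsigned char* out)
{
    out[0] = static_cast<unsigned char>(r.id >> 8);
    out[1] = static_cast<unsigned char>(r.id);
    std::uint32_t v = static_cast<std::uint32_t>(r.value);
    out[2] = static_cast<unsigned char>(v >> 24);
    out[3] = static_cast<unsigned char>(v >> 16);
    out[4] = static_cast<unsigned char>(v >> 8);
    out[5] = static_cast<unsigned char>(v);
    return 6; // bytes written
}
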
>
> BTW, if you aim at this application domain, note endianness and
> encoding stuff. You have very little chance that int is the type the
> protocol or hardware uses.

Picking on int? You don't have to use int. You probably should never use
it. It's virtually useless (maybe not just "virtually" either). Why would
you use int instead of another integer type? You have a bunch of others
to pick from.

> In fact, you need here even more language support, because the type
> semantics is to be defined far more rigorously. You need to take the
> description of the protocol and rewrite it in the language terms. Even
> Ada's representation clauses cannot this, e.g. to describe the integers
> used in an analogue input terminal (as an integral type).

So much for dumping knowledge of the representation then.

>
>>>> Aren't they just "syntactic" sugar over some well-defined
>>>> primitives?
>>>
>>> What are "well-defined primitives"?
>>
>> The built-in things that are closest to the machine, with consistent
>> representation across machines that can be relied on, that other
>> things are built on top of. (That's a stab at it, anywho).
>
> How these contradictory requirements could be fulfilled by something
> well-defined? In fact the vagueness of such "primitives" is a
> consequence of that contradiction. There is no such primitives, in
> principle.

Yeah, I went too far. I should have left out "across machines".

>
>>>> Wouldn't they have to be? I.e., at some point, the hardware has to
>>>> be interfaced.
>>>
>>> Yes. Consider a that the target is a pile of empty bier cans
>>> maintained by a robot. I presume that somehow the position of the
>>> cans or maybe their colors in that pile must reflect values of the
>>> type. Why should I (as a programmer) worry about that?
>>
>> Because you know it isn't beer cans (call it, say, "a simplifying
>> assumption") and you may want to (probably will want to) send some of
>> those integers to another machine across the internet or to disk.
>
> In each such case I will have to use a type different from the machine
> type. This is just another argument why types need to be precisely
> specified. I addressed this issue above.

But they are precisely specified on any given platform. I think you are
seeking to abstract that away into the implementation. So you want more
than just "precisely specified" (for it is already so), you want it
hidden away.
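
(To be concrete about "precisely specified on a given platform": every
one of those properties has a definite answer on whatever implementation
you compile against. A throwaway C++ sketch, nothing more:)

#include <iostream>
#include <limits>

int main()
{
    typedef std::numeric_limits<int> L;
    // All of this is pinned down by the implementation you are on;
    // only the portable guarantees are loose.
    std::cout << "sizeof(int): " << sizeof(int)  << '\n'
              << "value bits:  " << L::digits    << '\n'
              << "min:         " << L::min()     << '\n'
              << "max:         " << L::max()     << '\n'
              << "is_modulo:   " << L::is_modulo << '\n';
    return 0;
}
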
>
>>>>> Mathematically there is no need to have a supertype containing the
>>>>> ranged one.
>>>>
>>>> I'm only concerned about pragmatics (implementation).
>>>
>>> It is simple to implement such a type based on an integer machine
>>> type when values of the language type correspond to the values of the
>>> machine type (injection). It is less simple but quite doable to
>>> implement this type on a array of machine values. The algorithms are
>>> well known and well studied. I see no problem with that.
>>
>> Just another layer of abstraction. So it's as simple as that (another
>> layer of abstraction), yes?
>
> Yes, what in your opinion a machine word is?

Either 32 bits or 64 bits, depending on whether I'm on Win32 or Win64
(little-endian and 2's complement is a given). ;)

> Just another layer of abstraction above states of the p-n junctions of
> some transistors. Do you care?

At some level, I care, yes. At that level, no.

I'm not against your proposal, I just don't know the scope of it. I have
a feeling that going to the level you are calling for is what I would
say is "an exercise" instead of practical engineering. Just a feeling
though, mind you.

Tell me this: how close is Ada (and which one) to your ideal? All I know
about Ada is what I've read about it. Maybe I should write some programs
in it.

>
>>>>> (The way compiler would implement that type is uninteresting,
>>>>
>>>> Ha! To me that is KEY.
>>>
>>> This is not a language design question. It is a matter of compiler
>>> design targeting given machine.
>>
>> Oh? I've been having a language design discussion (primarily).
>
> Then that cannot by any means be a key issue.
>
That's opinion, not fact.

>>> Why should you? Again, considering design by contract, and attempt to
>>> reveal the implementation behind the contract is a design error. You
>>> shall not rely on anything except the contract, that is a fundamental
>>> principle.
>>
>> While that paradigm does have a place, that place is not "everywhere".
>
> Where?
>
The formality of "design by contract" as used in developing
functions/methods. Sure, one can easily move to the more general meaning
of "contract" and use it when talking about just about anything, but
that just dilutes what "design by contract" means.

>> The comments being made in this thread are really suggesting 4GL.
>> While that is fine, there is a place for 3GLs and hybrids. One size
>> fits all, really doesn't.
>
> Maybe, but in software engineering it certainly does.

Your strongly-believed, but wrong, opinion noted.

>
>> Well then, if wrapping is OK, what else is needed and why?
>
> Wrapping is an implementation, the question is WHAT does this
> implementation actually implement?

I was just asking why you find that inadequate such that something
(largely more complex?) is needed.
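
Just so we're talking about the same thing, here is the sort of wrapping
I have in mind -- a bare-bones C++ sketch of a checked, ranged integer
layered on a built-in type. The names and the throw-on-violation policy
are mine, and plenty is missing (the full operator set, overflow of the
underlying type, picking the narrowest representation, compile-time
checking); it's only meant to show the shape of the thing:

#include <cstdint>
#include <stdexcept>

// Ranged<Lo, Hi> holds only values in [Lo, Hi]; anything else throws.
// The underlying type is fixed as std::int64_t purely for simplicity.
template <std::int64_t Lo, std::int64_t Hi>
class Ranged {
public:
    Ranged(std::int64_t v = Lo) : value_(check(v)) {}

    std::int64_t get() const { return value_; }

    Ranged& operator+=(Ranged other)
    {
        value_ = check(value_ + other.value_); // range check after the add
        return *this;
    }

private:
    static std::int64_t check(std::int64_t v)
    {
        if (v < Lo || v > Hi)
            throw std::range_error("value outside declared range");
        return v;
    }

    std::int64_t value_;
};

// Roughly the spirit of an Ada range declaration like the one quoted
// below (-200_000_000 .. 100_000), minus everything a compiler would do.
typedef Ranged<-200000000, 100000> I;
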
>
>>> Inadequate would be to expose such integer types in the contracts.
>>
>> What contracts?
>
>>> Implementations based on existing types are possible, but you would
>>> need much language support (e.g. reflection) in order to ensure that
>>> the implementation fulfills the contract. As an example, consider an
>>> implementation of
>>>
>>> type I is range -200_000_000..100_000; -- This is the contract, which
>>> -- includes the range and the behavior of +,-,*/,**,mod,rem, overflow
>>> -- checks etc
>>>
>>> on top of C's int.
>>
>> Oh, THAT "contract". (I reserve that term for function calls, just to
>> avoid overloading it, but I actually prefer "specification" as I
>> associate "contract" with Eiffel).
>>
>> I don't know what you are suggesting the ideal implementation would be
>> of the ranged type above.
>
> There is no ideal implementations, there are ones fulfilling functional
> and non-functional requirements.
>
>>>>>>> The point is that it is the application domain requirements to
>>>>>>> drive the design, and the choice of types in particular.
>>>>>>
>>>>>> Can that be achieved? At what degree of complexity?
>>>>>
>>>>> We probably are close to the edge.
>>>>
>>>> Meaning "highly complex"?
>>>
>>> Enough complex to make unsustainable the ways programs are designed
>>> now.
>>
>> Explain please.
>
> If the bugs rate will not be reduced, there will not be enough human
> resources to keep software development economically feasible. And this
> is not considering future losses of human lives in car accidents caused
> by software faults etc.

Oh, you were thinking something else when I wrote: "Can that be
achieved? At what degree of complexity?".

I asked if creating your ideal language (this "definitively specified
higher level" language) is feasible and, if so, what does that do to the
complexity of the implementation (compiler)? Is a comparison of Ada and
C++ pretty much that answer?

>
>>>>>> Can it/should it be hardware-supported?
>>>>>
>>>>> No. Hardware becomes less and less relevant as the software
>>>>> complexity increases.
>>>>
>>>> OK, so another layer of abstraction is what you want. The syntax of,
>>>> say, Ada's ranged types, for example. So your call is just for more
>>>> syntactic sugar then, yes?
>>>
>>> Rather more support to contract driven design and static analysis.
>>
>> That doesn't appear to be a lot of complexity to introduce into a
>> compiler. It seems like common sense. So, what am I missing here?
>
> Contract specifications. "int" is not a contract.

Does Ada meet your desired requirements?

>
>> The new enums in C++11 are a step in the right direction, yes? Maybe
>> in the next iteration we'll get ranges.
>
> Let's see.
>
And not hold our breaths!
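
P.S. Since the C++11 enums came up: this is the sort of tightening I
mean by "a step in the right direction". Scoped enums don't decay to int
behind your back and let you pin the underlying type; ranges would be
the obvious next step. A small sketch (the names are just illustration):

#include <cstdint>

// Old-style enum: implicitly converts to int, underlying type is
// whatever the implementation picks.
enum OldColor { OldRed, OldGreen, OldBlue };

// C++11 scoped enum: no implicit conversion, underlying type spelled out.
enum class Color : std::uint8_t { Red, Green, Blue };

int main()
{
    int a = OldRed;                        // compiles -- and that's the problem
    // int b = Color::Red;                 // error: no implicit conversion
    int c = static_cast<int>(Color::Red);  // conversion must be explicit
    (void)a; (void)c;
    return 0;
}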