From: "Dmitry A. Kazakov"
Reply-To: mailbox@dmitry-kazakov.de
Subject: Re: Larger matrices
Newsgroups: comp.lang.ada
Organization: cbb software GmbH
Date: Fri, 8 Aug 2008 18:37:55 +0200

On Fri, 08 Aug 2008 17:40:16 +0200, Georg Bauhaus wrote:

> Dmitry A. Kazakov schrieb:
>
>> They are not. Lisp is a list-oriented language, Prolog is a logical
>> inference language.
>
> Lisp and Prolog with FFI(!) are not universal? Come on.

Of course they are not.
Both are domain-specific languages.

>>> This notion of "purpose" is not very specific.
>>> Let me put it this way: Ada has to be especially good at
>>> systems programming, hence it has to be fairly low level.
>>
>> Wrong. Where does that follow from? Systems programming need not be
>> low-level.
>
> A systems programming language must provide for the low
> level.

Nope, it must provide systems programming domain abstractions. Ada does
this. That does not make it low-level.

It seems that you are confusing different concepts of abstraction. The
level depends on where the ground is: the underlying computational
environment. This is not necessarily the hardware, though it can be.

1. Considering the hardware as the environment: Ada running on Ada
hardware (if such existed) would make that implementation of Ada
low-level. But it would not make Ada itself low-level, because Ada also
runs on hardware for which creating an Ada compiler takes much work.
Therefore Ada in general is high-level relative to the hardware.

2. Considering programming paradigms as the environment, i.e. the terms
in which programs are thought out and composed: Ada is again very high
level, as it supports up to third-order abstractions over sets:

   value -> type (a set of values) -> class / generic (a set of types)

plus OO decomposition, concurrency, etc.

> A language that does not provide for the low level
> is not a systems programming language.

A logical fallacy: A => B does not imply A = B.

> Ada has many features
> that provide for low level programming. It lacks a number of
> features used in higher level programming (e.g. function
> environments, unbounded numbers, ...).

Since when did unbounded numbers become high-level? Consider them used
to implement modular arithmetic or ASCII characters. How low-level!

> Systems programming
> is not the same as low level programming or high level programming;
> rather, the set of objects is typically not as abstract as
> some mathematical N-array of numbers.
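As an aside on the unbounded-numbers point above: that "high-level"
unbounded integers can implement "low-level" modular arithmetic or ASCII
characters is easy to sketch. A minimal illustration in Python, whose
built-in integers are unbounded (the function names are mine, purely
illustrative, not from this thread):

```python
# Unbounded integers used for "low-level" 8-bit modular arithmetic:
# mask the result back into 0..255 after each operation, exactly as a
# machine byte would wrap around.

def add_mod256(a: int, b: int) -> int:
    """8-bit wraparound addition built on unbounded integers."""
    return (a + b) & 0xFF

def ascii_shift(c: str, k: int) -> str:
    """Caesar-style shift kept inside the 7-bit ASCII range."""
    return chr((ord(c) + k) % 128)

print(add_mod256(250, 10))   # 250 + 10 = 260, wraps to 4
print(ascii_shift('A', 2))   # 'C'
```

In Ada the same idea is spelled natively with a modular type
(`type Byte is mod 256;`), which is rather the point: the abstraction is
available regardless of whether the underlying numbers are bounded.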
I don't see how an interrupt, a task or an I/O port is less abstract
than an array.

>>> What hardware?
>>
>> Vector processors.
>
> OK, I wasn't aware that the LA packages were made for vector
> processors only.

Take Intel x86 instead, if you don't like vector processors.

>>> Assume data flow hardware, and assume a way
>>> to put the function result in the input box of the next processing unit.
>>> What now?
>>
>> Nothing, Ada.Numerics.Generic_Real_Arrays was not designed in order to
>> support this hardware.
>
> Aha? I see that GNAT delegates to Fortran libraries. Traditionally,
> Fortran is certainly an associate of non-PC hardware.

That was not the point. The point was, as stated: the design did not
target any particular hardware and will be *relatively* inefficient on
any existing hardware. Relatively, because it will most likely still
beat both Lisp and Prolog.

>>> I don't see how prototyping a hypertext graph algorithm
>>> requires a maximally efficient implementation of matrix
>>> computations.
>>
>> Because of the problem size. Incidence matrices grow as O(n**2), i.e.
>> extremely fast.
>
> O(N**2) is not extremely fast; many non-extreme algorithms
> are in this class. The case discussed hits some memory barrier
> at n = 5_000 and that's it. We get elaborate array indexing support
> in some array programming languages. If choosing one of those
> PLs costs me a constant factor of 10, I get all the indexing
> stuff in return, it seems worth the cost during prototyping.

No chance. O(N**2) is only the memory complexity. You should also
consider the number of operations required per element. Naive matrix
multiplication is O(N**3): scale N by a factor of 10 and (10 x N)**3
makes it a thousand times slower! And that is only the lower bound.

But all this, in order to learn that the fancy language X is vastly
slower than Ada and totally unsuitable for production code? I know that
in advance!

--
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de