From mboxrd@z Thu Jan 1 00:00:00 1970
From: clovis@wartech.com
Subject: Re: What is wrong with OO ?
Date: 1996/12/31
Message-ID: <5aadbr$ad8@masters0.InterNex.Net>
References: <5a0niaINNlda@topdog.cs.umbc.edu> <32C43AC8.24E2@sn.no> <32C557F6.532C@rase.com> <5aa0eo$thd@krusty.irvine.com>
Organization: InterNex Information Services 1-800-595-3333
Reply-To: clovis@wartech.com
Newsgroups: comp.lang.c++,comp.lang.smalltalk,comp.lang.eiffel,comp.lang.ada,comp.object,comp.software-eng

In <5aa0eo$thd@krusty.irvine.com>, adam@irvine.com (Adam Beneschan) writes:

>In article <32C557F6.532C@rase.com> tansel@rase.com writes:

You're quite right. Tansel is acting like quite a pretender on this one, and he obviously doesn't know either what a Von Neumann machine is or what a Turing machine is.

A Von Neumann "machine" is really a recommended architecture, and in truth there aren't many of them. They feature regular instruction sets, with zero duplication of function, and all operations orthogonal. The basic feature of the Von Neumann machine is all classes of basic arithmetic operation, and all classes of comparison, within the two basic numeric types.
The "ideal" Von Neumann machine supports natural numbers, integers, and reals (actually, length-limited rational numbers), in which the operations -- add, subtract, multiply, divide and compare -- are entirely separate. That is ALL we mean by a Von Neumann machine. If you delete ANY aspect of a Von Neumann machine, you can't do basic computation: you are either missing discrete whole numbers, or the ability to compute rational numbers and their real-number simulation, or the ability to tell whether one number is the same size as, or larger than, another number.

The Von Neumann model, while not strictly followed in terms of orthogonality, is inherent in every general-purpose digital computing machine ever made, even the Intel 80x86 family (which does it all, but which is not symmetrical or wholly orthogonal; and the same is largely true of RISC, whose primary function is to reduce transistor count). If you are using a discrete representation that is noise-resistant at speed (in practice this mandates binary encoding at the hardware level), Von Neumann is the ideal model.

The Turing Machine is the minimal machine -- generally thought of as, but not necessarily, a pure Von Neumann architecture. Anything computable is computable on a Turing Machine with sufficient memory.

In short, all digital computers are essentially just variations on Von Neumann and Turing, both of whom were mathematicians interested in computing technology as the notion of code-based computation became available -- that is, a computing machine capable of responding to codes which could control execution based on previous results. Similarly, compilers are merely translators which first parse an artificial language, translate (generally) to P-code, and then finally translate to native object code in the code generator. None of this is really rocket science. And it's never ceased to amaze me how many of the "hip" guys can't even define the basic terms.
The most "Von Neumann" architecture seen on a microprocessor was the National Semi 16032 family in its original incarnation. It was entirely symmetrical and entirely orthogonal, as nearly as I could tell. The Motorola 68k family was mostly DEC PDP-like, and borrowed a bit from the VAX as well. The Intel 80x86 family was, and is, quite a mess in its lack of symmetry, and is also not at all fully orthogonal; sometimes it takes two instructions to do one function (a memory-to-memory move through a single register), and sometimes "special" instructions perform the function of several available, simpler instructions.

The only "real" operations the hardware CAN perform at this point in time are the four arithmetic operations (+, -, *, /), compares, and jumps/branches based on flag settings. The most advanced math co-processing units are just larger blocks of these operations.

Assembler will ALWAYS be the most efficient language. The more you abstract the problem -- the more you "generalize" the solution to a given problem -- the more you necessarily give up in efficiency. C takes a minimal 3x hit on integer arithmetic, 10x on more complex stuff; and OO, because of the overhead of interpreting where to send things, is 10x on top of that. Whether one can AFFORD to spend the time hand-tweaking a fullbore assembler application is another matter. Obviously, we also have to have the talent to actually write code 3x faster than C, and 30x faster than OO. The optimal solution is a tradeoff -- a genuine cost/benefit relationship. My notion of the ideal and genuine Systems Analyst is someone who has the intellect to break a problem into its component parts, and to localize which tools and components meet the cost/benefit analysis that gives maximum bang for the buck at each component level.

Assembly language has its place. No systems programmer would discard it -- and it is universally used in realtime, since only hand-carved assembler allows one to rigorously control latency and pathlength.
Procedural is much easier to maintain, and is typically the language of choice for people doing numerical analysis. Equations are more readily written/debugged/maintained in procedural languages; the prototype was FORTRAN, which was the FIRST procedural language and was intended primarily to support this kind of work. OO is, in my view, strictly a Smalltalk proposition until something slicker comes along.

I personally use all three in Smalltalk applications which involve a variety of disciplines. Smalltalk handles the GUI, but engines in C and ASM are also required in some cases. Any realtime requirement points to ASM, and any number crunching points to C most of the time (Assembler if the computation must be made in realtime, and be interruptible in mid-stream).

Smalltalk itself is not entirely Smalltalk. It contains both procedural (often C) and frequently ASM routines at the kernel, since no general-purpose machine ever built comprehends objects -- all the underlying hardware knows about is basic arithmetic, compares, and jumps/branches based on flag settings.

ASM is REALITY. C is an abstraction of that reality. Smalltalk is an abstraction of reality at a higher level than C. And that's all that is REALLY happening here. We're building in convenience features to avoid having to write 200 lines of Assembler to handle a single Smalltalk message. And we get "safety features" to go along with it (where Assembler has no safety features at all). Where Smalltalk simply reports "does not understand message XXX", ASM returns garbage, or plain crashes. This is why, although I'm as good as they come at basic Von Neumann coding, I don't sneer at Smalltalk. One instance of a Smalltalk debugger coming up with "YYY doesn't understand XXX" can be worth 8 hours of assembler debugging.
It's a tradeoff, and if we want to represent ourselves as real computer scientists, and real computing professionals, it behooves us to be able to explain in clear terms why one paradigm is better, where it is better, and the details of why it is better AT A GIVEN TASK. I'd never code a GUI in assembler, nor use Smalltalk for solving systems of differential equations.

We can get by without Smalltalk if we must. If we try to do without Von Neumann, there is no such thing as a working computer. If we try to do without Turing theory and practice, we are again without a working computer. If we try to do without object code -- and that means, basically, the Assembler -- the machine will never understand us, and we can't write working programs even with a working computer. We CAN live without C, or C++, or Smalltalk. I wouldn't much like doing without them. The whole point is that I'm far enough advanced at all levels that I can use any of them interchangeably.

Let's get it right, people. Object code (the assembler's output) is reality. Von Neumann is reality. Turing is reality. Procedural and OO are only TOOLS built on these basics. Machines work just fine without them. They don't work AT ALL without object code, assemblers, Von Neumann basics and Turing basics. The rest is extremely convenient and extremely productive. But they're just the decorations on the icing on the cake. And if they disappeared tomorrow, we'd all live, and still get our work done -- just more slowly and with more frustration than before.

> >The numerical number crunching problems are perfect problems that a von Neumann machine is designed for, so can be handled quite elegantly with procedural languages. They are well defined algorithms that take up a very reasonable number of lines of code. However we need Runge-Kutta in our real life even less than we need our calculator.
> >I use Smalltalk extensively, but revert back to C or even assembler when I need procedural number crunching, and offer these as DLLs. On the other hand, a simple ORB is reasonably trivial in Smalltalk, but if you want to develop in any procedure oriented system, even in C++, it takes a lot of time and effort.

>A number of times in this thread, OO has been compared to "von Neumann machines" as if they are opposing paradigms. This is confusing to me -- could someone explain it? My understanding of von Neumann machines is that they execute one statement at a time, in order. Most of the high-level languages I've seen do the same thing, whether or not they're OO languages. It seems to me that if (as implied by earlier posts in this thread) the "von Neumann" paradigm is the problem, then the solution is something like Backus' FP or Prolog or Haskell or dataflow -- not OO, which seems to me to have nothing to do with whether the von Neumann model is being followed or not. Am I missing something?
>
>                                 -- Adam