From mboxrd@z Thu Jan  1 00:00:00 1970
From: c2a192@ugrad.cs.ubc.ca (Kazimir Kylheku)
Subject: Re: C/C++ knocks the crap out of Ada
Date: 1996/03/17
Message-ID: <4ihf28INNsob@keats.ugrad.cs.ubc.ca>
References: <00001a73+00002504@msn.com> <4idh80$6tj@solutions.solon.com> <4idk8oINNrr2@keats.ugrad.cs.ubc.ca> <4if97t$1cqm@saba.info.ucla.edu>
Organization: Computer Science, University of B.C., Vancouver, B.C., Canada
Newsgroups: comp.lang.ada,comp.lang.c,comp.lang.c++
Date: 1996-03-17T00:00:00+00:00
List-Id:

In article <4if97t$1cqm@saba.info.ucla.edu>, Jay Martin wrote:
>c2a192@ugrad.cs.ubc.ca (Kazimir Kylheku) writes:
>
>> >Like all tools, lex and yacc are excellent for some tasks, and useless
>> >for others. I can only assume you've been trying to use them for
>> >inappropriate tasks, or more likely, that you haven't ever used them,
>> >and that you're not familiar with C, either. You've posted many
>> >claims, with *no* documentation, *no* examples, and *no* rationale.
>
>>That trick allows an author to retreat infinitely without admitting he is
>>wrong, because there is no wall to back into. ``But I never said this or
>>that...''. Of course not. Didn't say a damn thing, in fact. Without the
>>documentation, examples and whatnot, it is just *.advocacy fluff.
>
>Alright dumbshits lesson time:
>
> -- Tools and operating systems that only support one language are junk.
>    Basic computer science, anyone who thinks otherwise is incompetent.

What operating systems are those?
Clearly you can't be talking about UNIX, which supports too many languages
for me to even try to enumerate (including Ada).

> -- Stupidly go on and on about how an certain IO routine is not as fast
>    as Lex, not too swift. (1) Won't matter if your reading in 1K.
>    (2) Just shows the IO routine is broken. Its trivia.

Why does it show the IO routine is broken? Maybe it shows that
flex-generated programs (it was GNU flex that made the faster code---lex's
was _slightly_ slower) are in fact faster. Why do you assume that the IO
routine must necessarily be broken if it can't outperform a flex program?
Have you actually _looked_ at the body of a flex program before spouting
off?

As soon as you do anything less trivial than just scanning a bunch of
numbers, lex-generated programs quickly become faster than the standard
scanf() routine. The scanf() routine has to interpret its format string
each time it is invoked. On the other hand, lex generates a static table.

Suppose I have just a few keywords to match:

"foo"	{ }
"bar"	{ }
"lex"	{ }

If I used scanf(), I would have to extract a token, and then do a search to
find which of the keywords it matches, using hashing or what have you. A
lex program doesn't have to do that, since when its automaton arrives at an
acceptance state, it knows which keyword has been matched. It matches them
all simultaneously.

My performance test (which I ran on three machines/operating systems, not
just one) showed that even for scanning a simple file of numbers, the lex
program was no less efficient than the IO routine scanf(), while the one
generated by flex from the same specification was twice as fast.

> -- Go on and on how its great to over-engineer something simple.
>    How great it is to be a "savant" at a criptic tool with little

Cryptic tool in your eyes. A compiler is also a cryptic tool to someone who
doesn't ``speak'' the language. So is a text editor, etc. You train to
learn these things.
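To make the keyword-matching point concrete, here is a hand-written C
sketch of the two strategies. It is hypothetical illustration only, not the
benchmark code or actual lex output (real lex/flex scanners use
machine-generated transition tables, not a switch): the scanf()-style path
extracts a token and then searches the keyword list, while the automaton
path makes one pass over the characters, and the state it lands in names
the keyword directly -- all three patterns are matched simultaneously.

```c
#include <string.h>

enum kw { KW_NONE, KW_FOO, KW_BAR, KW_LEX };

/* scanf()-style: first extract a token, THEN search the keyword list
 * (linear search here; hashing changes the constant, not the idea). */
static enum kw lookup_search(const char *tok)
{
    static const char *names[] = { "foo", "bar", "lex" };
    static const enum kw vals[] = { KW_FOO, KW_BAR, KW_LEX };
    for (int i = 0; i < 3; i++)
        if (strcmp(tok, names[i]) == 0)
            return vals[i];
    return KW_NONE;
}

/* lex-style: a single pass driven by a DFA. States: 0 start; 1 "f",
 * 2 "fo", 3 "foo" (accept); 4 "b", 5 "ba", 6 "bar" (accept); 7 "l",
 * 8 "le", 9 "lex" (accept). Any other character kills the match. */
static enum kw lookup_dfa(const char *tok)
{
    int state = 0;
    for (const char *p = tok; *p; p++) {
        int next;
        switch (state) {
        case 0: next = (*p=='f') ? 1 : (*p=='b') ? 4 : (*p=='l') ? 7 : -1; break;
        case 1: next = (*p=='o') ? 2 : -1; break;
        case 2: next = (*p=='o') ? 3 : -1; break;
        case 4: next = (*p=='a') ? 5 : -1; break;
        case 5: next = (*p=='r') ? 6 : -1; break;
        case 7: next = (*p=='e') ? 8 : -1; break;
        case 8: next = (*p=='x') ? 9 : -1; break;
        default: return KW_NONE;   /* input past an accept state, or dead */
        }
        if (next < 0)
            return KW_NONE;
        state = next;
    }
    /* End of input: accept only if we stopped in an acceptance state. */
    if (state == 3) return KW_FOO;
    if (state == 6) return KW_BAR;
    if (state == 9) return KW_LEX;
    return KW_NONE;
}
```

The point of the sketch: lookup_search() touches the token once to extract
it and again (per candidate keyword) to compare, while lookup_dfa() decides
everything in the single left-to-right pass -- which is exactly the work a
lex-built table does for you.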
You think that lex and yacc mastery is congenital? I learned by taking a
course, reading books and documentation, and, of course, practice. The same
way I learned everything else in computing: programming languages,
operating systems, tools, etc.

Just because you are not willing to put in the extra effort doesn't mean
that you have to denigrate the skills learned by others who are willing to
put in the effort. You surely must have put out _some_ effort to get where
you are, wherever that may be.

>    eye to the effects of using that tool to the maintainability
>    of the software. This is called "eat shit next guy" hacking.

In the past I have had to modify some programs that embodied yacc parsers.
I found it quite easy. One time I was looking for a bug in the ``tcpdump''
utility. Because its expression compiler was done with yacc, the task was
made all the simpler. The program was far more understandable to me. I
rejoiced at the program's nice layout---far from feeling like the
programmers had the above sentiment toward me, the next guy.

>Of course when you tell of maintainance headaches caused by abuse of
>these tools do to they are ready availability and it is supposed to be
>so cool to use them. Then its the usual C/Unix hacker macho attitude
>of "they are just lame programmers" the tools are great!

What do ``cool'' and ``macho'' have to do with anything? Get out of
puberty! Are you suffering from some kind of inferiority complex?

>The Unix philosophy is great for the quick hack, but for larger
>software development the philosophy becomes counter productive.

I think you are stretching your little lex and yacc slam a little too far.
You are also contradicting yourself. A ``quick hack'' is an activity
diametrically opposed to ``over-engineering something simple''. At first
you were advocating the use of quick hacks rather than tools which abstract
away from the low-level.
Now you are saying that since these tools make your work go faster, you are
guilty of quick hacking if you use them.

Getting something done quickly is a worthwhile goal. It's called
``productivity'', and is not necessarily incompatible with other goals.
--