comp.lang.ada
* Re: Should I learn C or Pascal?
  1996-07-15  0:00   ` Should I learn C or Pascal? Ralph Silverman
@ 1996-07-15  0:00     ` Steve Sobol
  1996-07-16  0:00     ` Lee Crites
  1996-07-23  0:00     ` Richard A. O'Keefe
  2 siblings, 0 replies; 688+ messages in thread
From: Steve Sobol @ 1996-07-15  0:00 UTC (permalink / raw)



z007400b@bcfreenet.seflin.lib.fl.us (Ralph Silverman) wrote:

>Gabor Egressy (gegressy@uoguelph.ca) wrote:
>: Seth Perlman (sperlman@ezo.net) wrote:

Ralph,

This thing is huge... isn't it on a web page somewhere? :-/

>: : bear with me...

>: : First, my situation: I am going to be a senior in high school in the fall.

-- 
North Shore Technologies		Me == Steve Sobol == sjsobol@nstc.com
Web Consulting, PC Sales		Personal page: http://junior.apk.net/~sjsobol
Custom Win3.1/Win95 Programming 	Corporate page: http://www.nstc.com/nstc
        				(both under construction... bear with me..)
                    (Speak for North Shore? I *AM* North Shore. :)





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
       [not found] ` <4rs76l$aqd@ccshst05.uoguelph.ca>
@ 1996-07-15  0:00   ` Ralph Silverman
  1996-07-15  0:00     ` Steve Sobol
                       ` (2 more replies)
  0 siblings, 3 replies; 688+ messages in thread
From: Ralph Silverman @ 1996-07-15  0:00 UTC (permalink / raw)




Gabor Egressy (gegressy@uoguelph.ca) wrote:
: Seth Perlman (sperlman@ezo.net) wrote:
: : Hi, all. I'm new to this newsgroup (and most of the comp. hierarchy), so
: : bear with me...

: : First, my situation: I am going to be a senior in high school in the fall.
: : I've been toying with programming computers ever since I learned BASIC on
: : my Commodore 64 at the age of 6. I learned a smattering of C a few years
: : ago, and I took a semester course in school this year which was an intro
: : to Pascal. Sadly, it's the most advanced computer course at my school,
: : which means that I have nowhere else to turn for instruction in
: : programming.


: : *** Should I learn to program in C or Pascal? ***

: Here is a quote from Brian W. Kernighan of "The C Programming Language" fame.
: He is also an internationally respected lecturer who has been there since 
: the inception of UNIX and C.
: "I feel that it is a mistake to use Pascal for anything much beyond its 
: original target. In its pure form, Pascal is a toy language, suitable for 
: teaching but not for real programming."
: Draw your own conclusions.

: To read his essay on this topic point your web browser at
: http://www.lysator.liu.se/c
: There are other articles of interest there.

: --
: ---------------------------------------------------------------------
: Gabor Egressy                gegressy@uoguelph.ca
: Guelph, Ontario              gabor@snowhite.cis.uoguelph.ca
: Canada

: No beast so fierce but knows some touch of pity
: But I know none, And therefore am no beast
:               Richard III., William Shakespeare
: vim : the best editor
: ftp.fu-berlin.de/misc/editors/vim/beta-test
: ---------------------------------------------------------------------

--
***********begin r.s. response***************

From schwarze@imn.th-leipzig.de Fri Apr 12 11:47:54 1996
Date: Wed, 28 Jun 95 11:42:19 +0200
From: rainer schwarze <schwarze@imn.th-leipzig.de>
To: z007400b@bcfreenet.seflin.lib.fl.us

Why Pascal is Not My Favorite Programming Language

Brian W. Kernighan, April 2, 1981
AT&T Bell Laboratories, Murray Hill, New Jersey 07974

Abstract

The programming language Pascal has become the dominant language of instruction in computer science
education. It has also strongly influenced languages developed subsequently, in particular Ada.

Pascal was originally intended primarily as a teaching language, but it has been more and more often
recommended as a language for serious programming as well, for example, for system programming tasks
and even operating systems. 

Pascal, at least in its standard form, is just plain not suitable for serious programming. This
paper discusses my personal discovery of some of the reasons why. 

1. Genesis

This paper has its origins in two events - a spate of papers that compare C and Pascal(1, 2, 3, 4)
and a personal attempt to rewrite 'Software Tools'(5) in Pascal. 

Comparing C and Pascal is rather like comparing a Learjet to a Piper Cub - one is meant for getting
something done while the other is meant for learning - so such comparisons tend to be somewhat
farfetched. But the revision of Software Tools seems a more relevant comparison. The programs
therein were originally written in Ratfor, a ``structured'' dialect of Fortran implemented by a
preprocessor. Since Ratfor is really Fortran in disguise, it has few of the assets that Pascal
brings - data types more suited to character processing, data structuring capabilities for better
defining the organization of one's data, and strong typing to enforce telling the truth about the
data. 

It turned out to be harder than I had expected to rewrite the programs in Pascal. This paper is an
attempt to distill out of the experience some lessons about Pascal's suitability for programming (as
distinguished from learning about programming). It is not a comparison of Pascal with C or Ratfor.

The programs were first written in that dialect of Pascal supported by the Pascal interpreter pi
provided by the University of California at Berkeley. The language is close to the nominal standard
of Jensen and Wirth,(6) with good diagnostics and careful run-time checking. Since then, the
programs have also been run, unchanged except for new libraries of primitives, on four other systems:
an interpreter from the Free University of Amsterdam (hereinafter referred to as VU, for Vrije
Universiteit), a VAX version of the Berkeley system (a true compiler), a compiler purveyed by
Whitesmiths, Ltd., and UCSD Pascal on a Z80. All but the last of these Pascal systems are written in
C. 

Pascal is a much-discussed language. A recent bibliography(7) lists 175 items under the heading of
``discussion, analysis and debate.'' The most often cited papers (well worth reading) are a strong
critique by Habermann(8) and an equally strong rejoinder by Lecarme and Desjardins.(9) The paper by
Boom and DeJong(10) is also good reading. Wirth's own assessment of Pascal is found in [11]. I have
no desire or ability to summarize the literature; this paper represents my personal observations and
most of it necessarily duplicates points made by others. I have tried to organize the rest of the
material around the issues of 

   types and scope 
   control flow 
   environment 
   cosmetics 

and within each area more or less in decreasing order of significance. 

To state my conclusions at the outset: Pascal may be an admirable language for teaching beginners how
to program; I have no first-hand experience with that. It was a considerable achievement for 1968.
It has certainly influenced the design of recent languages, of which Ada is likely to be the most
important. But in its standard form (both current and proposed), Pascal is not adequate for writing
real programs. It is suitable only for small, self-contained programs that have only trivial
interactions with their environment and that make no use of any software written by anyone else. 

2. Types and Scopes

Pascal is (almost) a strongly typed language. Roughly speaking, that means that each object in a
program has a well-defined type which implicitly defines the legal values of and operations on the
object. The language guarantees that it will prohibit illegal values and operations, by some mixture
of compile- and run-time checking. Of course compilers may not actually do all the checking implied
in the language definition. Furthermore, strong typing is not to be confused with dimensional
analysis. If one defines types 'apple' and 'orange' with

     type
             apple = integer;
             orange = integer;

then any arbitrary arithmetic expression involving apples and oranges is perfectly legal.

Strong typing shows up in a variety of ways. For instance, arguments to functions and procedures are
checked for proper type matching. Gone is the Fortran freedom to pass a floating point number into a
subroutine that expects an integer; this I deem a desirable attribute of Pascal, since it warns of a
construction that will certainly cause an error.

Integer variables may be declared to have an associated range of legal values, and the compiler and
run-time support ensure that one does not put large integers into variables that only hold small
ones. This too seems like a service, although of course run-time checking does exact a penalty.
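
A minimal sketch of such a declaration (the names here are invented for illustration, not taken
from 'Tools'):

     var     day : 1..31;
     ...
     day := 45;      { rejected at compile time, or trapped at run time }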

Let us move on to some problems of type and scope. 

2.1. The size of an array is part of its type

If one declares 

     var     arr10 : array [1..10] of integer;
             arr20 : array [1..20] of integer;

then arr10 and arr20 are arrays of 10 and 20 integers respectively. Suppose we want to write a
procedure 'sort' to sort an integer array. Because arr10 and arr20 have different types, it is not
possible to write a single procedure that will sort them both. 

The place where this affects Software Tools particularly, and I think programs in general, is that it
makes it difficult indeed to create a library of routines for doing common, general-purpose
operations like sorting. 

The particular data type most often affected is 'array of char', for in Pascal a string is an array
of characters. Consider writing a function 'index(s,c)' that will return the position in the string
s where the character c first occurs, or zero if it does not. The problem is how to handle the
string argument of 'index'. The calls 'index('hello',c)' and 'index('goodbye',c)' cannot both be
legal, since the strings have different lengths. (I pass over the question of how the end of a
constant string like 'hello' can be detected, because it can't.) The next try is 

     var     temp : array [1..10] of char;
     temp := 'hello';

     n := index(temp,c);

but the assignment to 'temp' is illegal because 'hello' and 'temp' are of different lengths. 

The only escape from this infinite regress is to define a family of routines with a member for each
possible string size, or to make all strings (including constant strings like 'define' ) of the same
length.

The latter approach is the lesser of two great evils. In 'Tools', a type called 'string' is declared
as 

     type    string = array [1..MAXSTR] of char;

where the constant 'MAXSTR' is ``big enough,'' and all strings in all programs are exactly this
size. This is far from ideal, although it made it possible to get the programs running. It does not
solve the problem of creating true libraries of useful routines. 

There are some situations where it is simply not acceptable to use the fixed-size array
representation. For example, the 'Tools' program to sort lines of text operates by filling up memory
with as many lines as will fit; its running time depends strongly on how full the memory can be
packed.

Thus for 'sort', another representation is used, a long array of characters and a set of indices into
this array: 

     type    charbuf = array [1..MAXBUF] of char;
             charindex = array [1..MAXINDEX] of 0..MAXBUF;

But the procedures and functions written to process the fixed-length representation cannot be used
with the variable-length form; an entirely new set of routines is needed to copy and compare strings
in this representation. In Fortran or C the same functions could be used for both.

As suggested above, a constant string is written as 

     'this is a string'

and has the type 'packed array [1..n] of char', where n is the length. Thus each string literal of
different length has a different type. The only way to write a routine that will print a message and
clean up is to pad all messages out to the same maximum length: 

     error('short message                    ');
     error('this is a somewhat longer message');

Many commercial Pascal compilers provide a 'string' data type that explicitly avoids the problem; '
string's are all taken to be the same type regardless of size. This solves the problem for this
single data type, but no other. It also fails to solve secondary problems like computing the length
of a constant string; another built-in function is the usual solution. 
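
In one common vendor style the code looks roughly like the sketch below; the exact syntax and the
'length' built-in vary from compiler to compiler and are not part of standard Pascal:

     var     s : string;
     ...
     s := 'hello';
     n := length(s);         { vendor-supplied built-in }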

Pascal enthusiasts often claim that to cope with the array-size problem one merely has to copy some
library routine and fill in the parameters for the program at hand, but the defense sounds weak at
best:(12) 

  ``Since the bounds of an array are part of its type (or, more exactly, of the type of its
  indexes), it is impossible to define a procedure or function which applies to arrays with
  differing bounds. Although this restriction may appear to be a severe one, the experiences we
  have had with Pascal tend to show that it tends to occur very infrequently. [...] However, the
  need to bind the size of parametric arrays is a serious defect in connection with the use of
  program libraries.'' 

This botch is the biggest single problem with Pascal. I believe that if it could be fixed, the
language would be an order of magnitude more usable.� The proposed ISO standard for Pascal(13)
provides such a fix (``conformant array schemas''), but the acceptance of this part of the standard
is apparently still in doubt. 
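
Under such a scheme a single 'sort' could be written against bounds supplied by the caller,
roughly as follows (a sketch of the draft syntax, which may differ in detail from the final
standard):

     procedure sort (var a : array [lo..hi : integer] of integer);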

2.2. There are no static variables and no initialization

A 'static' variable (often called an 'own' variable in Algol-speaking countries) is one that is
private to some routine and retains its value from one call of the routine to the next. De facto,
Fortran variables are internal static, except for COMMON; in C there is a 'static' declaration that
can be applied to local variables. (Strictly speaking, in Fortran 77 one must use SAVE to force the
static attribute.) 

Pascal has no such storage class. This means that if a Pascal function or procedure intends to
remember a value from one call to another, the variable used must be external to the function or
procedure. Thus it must be visible to other procedures, and its name must be unique in the larger
scope. A simple example of the problem is a random number generator: the value used to compute the
current output must be saved to compute the next one, so it must be stored in a variable whose
lifetime includes all calls of the random number generator. In practice, this is typically the
outermost block of the program. Thus the declaration of such a variable is far removed from the
place where it is actually used. 

One example comes from the text formatter described in Chapter 7 of 'Tools'. The variable 'dir'
controls the direction from which excess blanks are inserted during line justification, to obtain
left and right alternately.� In Pascal, the code looks like this: 

     program formatter (...);

     var
             dir : 0..1;     { direction to add extra spaces }
             .
             .
             .
     procedure justify (...);
     begin
             dir := 1 - dir; { opposite direction from last time }
             ...
     end;

             ...

     begin { main routine of formatter }
             dir := 0;
             ...
     end;

The declaration, initialization and use of the variable 'dir' are scattered all over the program,
literally hundreds of lines apart. In C or Fortran, 'dir' can be made private to the only routine
that needs to know about it: 

             ...
     main()
     {
             ...
     }

             ...

     justify()
     {
             static int dir = 0;

             dir = 1 - dir;
             ...
     }

There are of course many other examples of the same problem on a larger scale; functions for buffered
I/O, storage management, and symbol tables all spring to mind. 

There are at least two related problems. Pascal provides no way to initialize variables statically
(i.e., at compile time); there is nothing analogous to Fortran's DATA statement or initializers like 

     int dir = 0;

in C. This means that a Pascal program must contain explicit assignment statements to initialize
variables (like the 

     dir := 0;

above). This code makes the program source text bigger, and the program itself bigger at run time.

Furthermore, the lack of initializers exacerbates the problem of too-large scope caused by the lack
of a static storage class. The time to initialize things is at the beginning, so either the main
routine itself begins with a lot of initialization code, or it calls one or more routines to do the
initializations. In either case, variables to be initialized must be visible, which means in effect
at the highest level of the hierarchy. The result is that any variable that is to be initialized has
global scope.

The third difficulty is that there is no way for two routines to share a variable unless it is
declared at or above their least common ancestor. Fortran COMMON and C's external static storage
class both provide a way for two routines to cooperate privately, without sharing information with
their ancestors. 

The new standard does not offer static variables, initialization or non-hierarchical communication. 

2.3. Related program components must be kept separate

Since the original Pascal was implemented with a one-pass compiler, the language believes strongly in
declaration before use. In particular, procedures and functions must be declared (body and all)
before they are used. The result is that a typical Pascal program reads from the bottom up - all the
procedures and functions are displayed before any of the code that calls them, at all levels. This
is essentially opposite to the order in which the functions are designed and used. 

To some extent this can be mitigated by a mechanism like the #include facility of C and Ratfor:
source files can be included where needed without cluttering up the program. #include is not part of
standard Pascal, although the UCB, VU and Whitesmiths compilers all provide it. 

There is also a 'forward' declaration in Pascal that permits separating the declaration of the
function or procedure header from the body; it is intended for defining mutually recursive
procedures. When the body is declared later on, the header on that declaration may contain only the
function name, and must not repeat the information from the first instance. 
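
For example, a pair of mutually recursive procedures might be declared this way (an illustrative
sketch, not code from 'Tools'):

     procedure b (n : integer); forward;

     procedure a (n : integer);
     begin
             if n > 0 then b(n - 1)
     end;

     procedure b;            { the parameter list is not repeated here }
     begin
             if n > 0 then a(n - 1)
     end;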

A related problem is that Pascal has a strict order in which it is willing to accept declarations.
Each procedure or function consists of 

  label label declarations, if any
  const constant declarations, if any
  type type declarations, if any
  var variable declarations, if any

  procedure and function declarations, if any
  begin
  body of function or procedure
  end

This means that all declarations of one kind (types, for instance) must be grouped together for the
convenience of the compiler, even when the programmer would like to keep together things that are
logically related so as to understand the program better. Since a program has to be presented to the
compiler all at once, it is rarely possible to keep the declaration, initialization and use of types
and variables close together. Even some of the most dedicated Pascal supporters agree:(14)

  ``The inability to make such groupings in structuring large programs is one of Pascal's most
  frustrating limitations.'' 

A file inclusion facility helps only a little here. 

The new standard does not relax the requirements on the order of declarations. 

2.4. There is no separate compilation

The ``official'' Pascal language does not provide separate compilation, and so each implementation
decides on its own what to do. Some (the Berkeley interpreter, for instance) disallow it entirely;
this is closest to the spirit of the language and matches the letter exactly. Many others provide a
declaration that specifies that the body of a function is externally defined. In any case, all such
mechanisms are non-standard, and thus done differently by different systems. 
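
One common (but, again, non-standard) form of such a declaration looks roughly like this, with the
exact keyword varying from system to system:

     procedure sort (var buf : charbuf; var idx : charindex); external;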

Theoretically, there is no need for separate compilation - if one's compiler is very fast (and if the
source for all routines is always available and if one's compiler has a file inclusion facility so
that multiple copies of source are not needed), recompiling everything is equivalent. In practice,
of course, compilers are never fast enough and source is often hidden and file inclusion is not part
of the language, so changes are time-consuming. 

Some systems permit separate compilation but do not validate consistency of types across the
boundary. This creates a giant hole in the strong typing. (Most other languages do no
cross-compilation checking either, so Pascal is not inferior in this respect.) I have seen at least
one paper (mercifully unpublished) that on page n castigates C for failing to check types across
separate compilation boundaries while suggesting on page n+1 that the way to cope with Pascal is to
compile procedures separately to avoid type checking. 

The new standard does not offer separate compilation. 

2.5. Some miscellaneous problems of type and scope

Most of the following points are minor irritations, but I have to stick them in somewhere. 

It is not legal to name a non-basic type as the literal formal parameter of a procedure; the
following is not allowed: 

     procedure add10 (var a : array [1..10] of integer);

Rather, one must invent a type name, make a type declaration, and declare the formal parameter to be
an instance of that type: 

     type    a10 = array [1..10] of integer;
     ...
     procedure add10 (var a : a10);

Naturally the type declaration is physically separated from the procedure that uses it. The
discipline of inventing type names is helpful for types that are used often, but it is a distraction
for things used only once. 

It is nice to have the declaration 'var' for formal parameters of functions and procedures; the
procedure clearly states that it intends to modify the argument. But the calling program has no way
to declare that a variable is to be modified - the information is only in one place, while two places
would be better. (Half a loaf is better than none, though - Fortran tells the user nothing about who
will do what to variables.) 

It is also a minor bother that arrays are passed by value by default - the net effect is that every
array parameter is declared 'var' by the programmer more or less without thinking. If the 'var'
declaration is inadvertently omitted, the resulting bug is subtle. 
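
For instance, if the 'var' is dropped by accident (an invented example, using the type 'a10' from
above):

     procedure clear (a : a10);      { 'var' accidentally omitted }
     var     i : integer;
     begin
             for i := 1 to 10 do
                     a[i] := 0       { changes only the local copy; the caller sees nothing }
     end;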

Pascal's 'set' construct seems like a good idea, providing notational convenience and some free type
checking. For example, a set of tests like

     if (c = blank) or (c = tab) or (c = newline) then ...

can be written rather more clearly and perhaps more efficiently as 

     if c in [blank, tab, newline] then ...

But in practice, set types are not useful for much more than this, because the size of a set is
strongly implementation dependent (probably because it was so in the original CDC implementation: 59
bits). For example, it is natural to attempt to write the function 'isalphanum(c)' (``is c
alphanumeric?'') as 

     { isalphanum(c) -- true if c is letter or digit }
     function isalphanum (c : char) : boolean;
     begin
             isalphanum := c in ['a'..'z', 'A'..'Z', '0'..'9']
     end;

But in many implementations of Pascal (including the original) this code fails because sets are just
too small. Accordingly, sets are generally best left unused if one intends to write portable
programs. (This specific routine also runs an order of magnitude slower with sets than with a range
test or array reference.) 

2.6. There is no escape

There is no way to override the type mechanism when necessary, nothing analogous to the ``cast''
mechanism in C. This means that it is not possible to write programs like storage allocators or I/O
systems in Pascal, because there is no way to talk about the type of object that they return, and no
way to force such objects into an arbitrary type for another use. (Strictly speaking, there is a
large hole in the type-checking near variant records, through which some otherwise illegal type
mismatches can be obtained.) 

3. Control Flow

The control flow deficiencies of Pascal are minor but numerous - the death of a thousand cuts, rather
than a single blow to a vital spot.

There is no guaranteed order of evaluation of the logical operators 'and' and 'or' - nothing like &&
and || in C. This failing, which is shared with most other languages, hurts most often in loop
control: 

     while (i <= XMAX) and (x[i] > 0) do ...

is extremely unwise Pascal usage, since there is no way to ensure that i is tested before x[i] is. 

By the way, the parentheses in this code are mandatory - the language has only four levels of
operator precedence, with relationals at the bottom. 
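
Without them, 'and' (which has the precedence of a multiplying operator) binds first, so

     while i <= XMAX and x[i] > 0 do ...

is parsed as 'i <= (XMAX and x[i]) > 0' and rejected by the compiler.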

There is no 'break' statement for exiting loops. This is consistent with the one entry-one exit
philosophy espoused by proponents of structured programming, but it does lead to nasty
circumlocutions or duplicated code, particularly when coupled with the inability to control the order
in which logical expressions are evaluated. Consider this common situation, expressed in C or
Ratfor: 

     while (getnext(...)) {
             if (something)
                     break
             rest of loop
     }

With no 'break' statement, the first attempt in Pascal is 

     done := false;
     while (not done) and (getnext(...)) do
             if something then
                     done := true
             else begin
                     rest of loop
             end

But this doesn't work, because there is no way to force the ``not done'' to be evaluated before the
next call of 'getnext'. This leads, after several false starts, to

     done := false;
     while not done do begin
             done := getnext(...);
             if something then
                     done := true
             else if not done then begin
                     rest of loop
             end
     end

Of course recidivists can use a 'goto' and a label (numeric only and it has to be declared) to exit a
loop. Otherwise, early exits are a pain, almost always requiring the invention of a boolean variable
and a certain amount of cunning. Compare finding the last non-blank in an array in Ratfor:

     for (i = max; i > 0; i = i - 1)
             if (arr(i) != ' ')
                     break

with Pascal: 

     done := false;
     i := max;
     while (i > 0) and (not done) do
             if arr[i] = ' ' then
                     i := i - 1
             else
                     done := true;

The index of a 'for' loop is undefined outside the loop, so it is not possible to figure out whether
one went to the end or not. The increment of a 'for' loop can only be +1 or -1, a minor restriction.
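
So a loop that needs to know where it stopped must record the position in a separate variable,
as in this invented sketch:

     pos := 0;
     for i := 1 to max do
             if arr[i] = target then
                     pos := i;       { i itself is undefined after the loop }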

There is no 'return' statement, again for one in-one out reasons.� A function value is returned by
setting the value of a pseudo-variable (as in Fortran), then falling off the end of the function.
This sometimes leads to contortions to make sure that all paths actually get to the end of the
function with the proper value. There is also no standard way to terminate execution except by
reaching the end of the outermost block, although many implementations provide a 'halt' that causes
immediate termination. 
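
A function result is assigned to the function name, and control must reach the end of the body,
as in this sketch (not from 'Tools'):

     { max2 -- larger of two integers }
     function max2 (x, y : integer) : integer;
     begin
             if x > y then
                     max2 := x
             else
                     max2 := y
             { no 'return'; execution simply falls off the end }
     end;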

The 'case' statement is better designed than in C, except that there is no 'default' clause and the
behavior is undefined if the input expression does not match any of the cases. This crucial omission
renders the 'case' construct almost worthless. In over 6000 lines of Pascal in 'Software Tools in
Pascal', I used it only four times, although if there had been a 'default', a 'case' would have
served in at least a dozen places.
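
A small invented example of the difficulty:

     case c of
     '+': r := a + b;
     '-': r := a - b
     end

leaves the behavior undefined for any other value of c, so every use of 'case' must be guarded by
an 'if' that checks the selector first.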

The new standard offers no relief on any of these points. 

4. The Environment

The Pascal run-time environment is relatively sparse, and there is no extension mechanism except
perhaps source-level libraries in the ``official'' language. 

Pascal's built-in I/O has a deservedly bad reputation. It believes strongly in record-oriented input
and output. It also has a look-ahead convention that is hard to implement properly in an interactive
environment. Basically, the problem is that the I/O system believes that it must read one record
ahead of the record that is being processed. In an interactive system, this means that when a
program is started, its first operation is to try to read the terminal for the first line of input,
before any of the program itself has been executed.� But in the program 

     write('Please enter your name: ');
     read(name);
     ...

read-ahead causes the program to hang, waiting for input before printing the prompt that asks for it.

It is possible to escape most of the evil effects of this I/O design by very careful implementation,
but not all Pascal systems do so, and in any case it is relatively costly. 

The I/O design reflects the original operating system upon which Pascal was designed; even Wirth
acknowledges that bias, though not its defects.(15) It is assumed that text files consist of records,
that is, lines of text. When the last character of a line is read, the built-in function 'eoln'
becomes true; at that point, one must call 'readln' to initiate reading a new line and reset 'eoln'.
Similarly, when the last character of the file is read, the built-in 'eof' becomes true. In both
cases, 'eoln' and 'eof' must be tested before each 'read' rather than after. 

Given this, considerable pains must be taken to simulate sensible input. This implementation of '
getc' works for Berkeley and VU I/O systems, but may not necessarily work for anything else: 

     { getc -- read character from standard input }
     function getc (var c : character) : character;
     var
             ch : char;
     begin
             if eof then
                     c := ENDFILE
             else if eoln then begin
                     readln;
                     c := NEWLINE
             end

             else begin
                     read(ch);
                     c := ord(ch)
             end;
             getc := c
     end;

The type 'character' is not the same as 'char', since ENDFILE and perhaps NEWLINE are not legal
values for a 'char' variable. 

There is no notion at all of access to a file system except for predefined files named by (in effect)
logical unit number in the 'program' statement that begins each program. This apparently reflects
the CDC batch system in which Pascal was originally developed.� A file variable 

     var fv : file of type

is a very special kind of object - it cannot be assigned to, nor used except by calls to built-in
procedures like 'eof', 'eoln', 'read', 'write', 'reset' and 'rewrite'. ('reset' rewinds a file and
makes it ready for rereading; 'rewrite' makes a file ready for writing.) 

Most implementations of Pascal provide an escape hatch to allow access to files by name from the
outside environment, but not conveniently and not standardly. For example, many systems permit a
filename argument in calls to 'reset' and 'rewrite': 

     reset(fv, filename);

But 'reset' and 'rewrite' are procedures, not functions - there is no status return and no way to
regain control if for some reason the attempted access fails. (UCSD provides a compile-time flag
that disables the normal abort.) And since fv's cannot appear in expressions like 

     reset(fv, filename);
     if fv = failure then ...

there is no escape in that direction either. This straitjacket makes it essentially impossible to
write programs that recover from mis-spelled file names, etc. I never solved it adequately in the
'Tools' revision. 

There is no notion of access to command-line arguments, again probably reflecting Pascal's
batch-processing origins. Local routines may allow it by adding non-standard procedures to the
environment. 

Since it is not possible to write a general-purpose storage allocator in Pascal (there being no way
to talk about the types that such a function would return), the language has a built-in procedure
called 'new' that allocates space from a heap. Only defined types may be allocated, so it is not
possible to allocate, for example, arrays of arbitrary size to hold character strings. The pointers
returned by 'new' may be passed around but not manipulated: there is no pointer arithmetic. There is
no way to regain control if storage runs out.
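
Allocation therefore always looks something like this sketch, with the size determined entirely by
the declared type (the names are invented):

     type    link = ^node;
             node = record
                     val  : integer;
                     next : link
             end;
     var     p : link;
     ...
     new(p);                 { allocates exactly one 'node' }
     p^.val := 1;
     p^.next := nil;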

The new standard offers no change in any of these areas. 

5. Cosmetic Issues

Most of these issues are irksome to an experienced programmer, and some are probably a nuisance even
to beginners. All can be lived with.

Pascal, in common with most other Algol-inspired languages, uses the semicolon as a statement
separator rather than a terminator (as it is in PL/I and C). As a result one must have a reasonably
sophisticated notion of what a statement is to put semicolons in properly. Perhaps more important,
if one is serious about using them in the proper places, a fair amount of nuisance editing is
needed. Consider the first cut at a program:

     if a then
             b;
     c;

But if something must be inserted before b, it no longer needs a semicolon, because it now precedes
an 'end': 

     if a then begin
             b0;
             b
     end;
     c;

Now if we add an 'else', we must remove the semicolon on the 'end': 

     if a then begin
             b0;
             b
     end
     else
             d;
     c;

And so on and so on, with semicolons rippling up and down the program as it evolves. 

One generally accepted experimental result in programmer psychology is that semicolon as separator is
about ten times more prone to error than semicolon as terminator.(16) (In Ada,(17) the most
significant language based on Pascal, semicolon is a terminator.) Fortunately, in Pascal one can
almost always close one's eyes and get away with a semicolon as a terminator. The exceptions are in
places like declarations, where the separator vs. terminator problem doesn't seem as serious anyway,
and just before 'else', which is easy to remember.

C and Ratfor programmers find 'begin' and 'end' bulky compared to { and }. 

A function name by itself is a call of that function; there is no way to distinguish such a function
call from a simple variable except by knowing the names of the functions. Pascal uses the Fortran
trick of having the function name act like a variable within the function, except that where in
Fortran the function name really is a variable, and can appear in expressions, in Pascal, its
appearance in an expression is a recursive invocation: if f is a zero-argument function, 'f:=f+1' is
a recursive call of f. 
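
For example (an invented fragment):

     function f : integer;
     begin
             f := f + 1      { the right-hand f is a recursive call, not the result variable }
     end;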

There is a paucity of operators (probably related to the paucity of precedence levels). In
particular, there are no bit-manipulation operators (AND, OR, XOR, etc.). I simply gave up trying to
write the following trivial encryption program in Pascal: 

     i := 1;
     while getc(c) <> ENDFILE do begin
             putc(xor(c, key[i]));
             i := i mod keylen + 1
     end

because I couldn't write a sensible 'xor' function. The set types help a bit here (so to speak), but
not enough; people who claim that Pascal is a system programming language have generally overlooked
this point.� For example, [18, p. 685] 

  ``Pascal is at the present time [1977] the best language in the public domain for purposes of
  system programming and software implementation.'' 

seems a bit naive. 

There is no null string, perhaps because Pascal uses the doubled quote notation to indicate a quote
embedded in a string: 

     'This is a '' character'

There is no way to put non-graphic symbols into strings.� In fact, non-graphic characters are
unpersons in a stronger sense, since they are not mentioned in any part of the standard language.
Concepts like newlines, tabs, and so on are handled on each system in an 'ad hoc' manner, usually by
knowing something about the character set (e.g., ASCII newline has decimal value 10). 
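
Typically one ends up writing something like the following sketch, which is correct only on an
ASCII machine:

     const   NEWLINE = 10;           { ASCII assumption }
     ...
     write(chr(NEWLINE));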

There is no macro processor. The 'const' mechanism for defining manifest constants takes care of
about 95 percent of the uses of simple #define statements in C, but more involved ones are hopeless.
It is certainly possible to put a macro preprocessor on a Pascal compiler. This allowed me to
simulate a sensible 'error' procedure as 

     #define error(s)begin writeln(s); halt end

('halt' in turn might be defined as a branch to the end of the outermost block.) Then calls like 

     error('little string');
     error('much bigger string');

work since 'writeln' (as part of the standard Pascal environment) can handle strings of any size. It
is unfortunate that there is no way to make this convenience available to routines in general. 

The language prohibits expressions in declarations, so it is not possible to write things like 

      const   SIZE = 10;
      type    arr = array [1..SIZE+1] of integer;

or even simpler ones like 

      const   SIZE = 10;
              SIZE1 = SIZE + 1;

6. Perspective

The effort to rewrite the programs in 'Software Tools' started in March, 1980, and, in fits and
starts, lasted until January, 1981. The final product(19) was published in June, 1981. During that
time I gradually adapted to most of the superficial problems with Pascal (cosmetics, the inadequacies
of control flow), and developed imperfect solutions to the significant ones (array sizes, run-time
environment). 

The programs in the book are meant to be complete, well-engineered programs that do non-trivial
tasks. But they do not have to be efficient, nor are their interactions with the operating system
very complicated, so I was able to get by with some pretty kludgy solutions, ones that simply
wouldn't work for real programs. 

There is no significant way in which I found Pascal superior to C, but there are several places where
it is a clear improvement over Ratfor.� Most obvious by far is recursion: several programs are much
cleaner when written recursively, notably the pattern-search, quicksort, and expression evaluation. 

Enumeration data types are a good idea. They simultaneously delimit the range of legal values and
document them. Records help to group related variables. I found relatively little use for pointers.

Boolean variables are nicer than integers for Boolean conditions; the original Ratfor programs
contained some unnatural constructions because Fortran's logical variables are badly designed. 

Occasionally Pascal's type checking would warn of a slip of the hand in writing a program; the
run-time checking of values also indicated errors from time to time, particularly subscript range
violations. 

Turning to the negative side, recompiling a large program from scratch to change a single line of
source is extremely tiresome; separate compilation, with or without type checking, is mandatory for
large programs. 

I derived little benefit from the fact that characters are part of Pascal and not part of Fortran,
because the Pascal treatment of strings and non-graphics is so inadequate. In both languages, it is
appallingly clumsy to initialize literal strings for tables of keywords, error messages, and the
like. 

The finished programs are in general about the same number of source lines as their Ratfor
equivalents. At first this surprised me, since my preconception was that Pascal is a wordier and
less expressive language. The real reason seems to be that Pascal permits arbitrary expressions in
places like loop limits and subscripts where Fortran (that is, portable Fortran 66) does not, so some
useless assignments can be eliminated; furthermore, the Ratfor programs declare functions while
Pascal ones do not. 

To close, let me summarize the main points in the case against Pascal. 

1. Since the size of an array is part of its type, it is not possible to write general-purpose
   routines, that is, to deal with arrays of different sizes. In particular, string handling is very
   difficult. 
2. The lack of static variables, initialization and a way to communicate non-hierarchically combine
   to destroy the ``locality'' of a program - variables require much more scope than they ought to. 
3. The one-pass nature of the language forces procedures and functions to be presented in an
   unnatural order; the enforced separation of various declarations scatters program components that
   logically belong together. 
4. The lack of separate compilation impedes the development of large programs and makes the use of
   libraries impossible. 
5. The order of logical expression evaluation cannot be controlled, which leads to convoluted code
   and extraneous variables. 
6. The 'case' statement is emasculated because there is no default clause. 
7. The standard I/O is defective. There is no sensible provision for dealing with files or program
   arguments as part of the standard language, and no extension mechanism. 
8. The language lacks most of the tools needed for assembling large programs, most notably file
   inclusion. 
9. There is no escape. 

This last point is perhaps the most important. The language is inadequate but circumscribed, because
there is no way to escape its limitations. There are no casts to disable the type-checking when
necessary. There is no way to replace the defective run-time environment with a sensible one, unless
one controls the compiler that defines the ``standard procedures.'' The language is closed. 

People who use Pascal for serious programming fall into a fatal trap.

Because the language is so impotent, it must be extended. But each group extends Pascal in its own
direction, to make it look like whatever language they really want.� Extensions for separate
compilation, Fortran-like COMMON, string data types, internal static variables, initialization, octal
numbers, bit operators, etc., all add to the utility of the language for one group, but destroy its
portability to others. 

I feel that it is a mistake to use Pascal for anything much beyond its original target. In its pure
form, Pascal is a toy language, suitable for teaching but not for real programming. 

Acknowledgments

I am grateful to Al Aho, Al Feuer, Narain Gehani, Bob Martin, Doug McIlroy, Rob Pike, Dennis Ritchie,
Chris Van Wyk and Charles Wetherell for helpful criticisms of earlier versions of this paper. 

1. Feuer, A. R. and N. H. Gehani, ``A Comparison of the Programming Languages C and Pascal - Part I:
   Language Concepts,'' Bell Labs internal memorandum (September 1979). 
2. N. H. Gehani and A. R. Feuer, ``A Comparison of the Programming Languages C and Pascal - Part II:
   Program Properties and Programming Domains,'' Bell Labs internal memorandum (February 1980). 
3. P. Mateti, ``Pascal versus C: A Subjective Comparison,'' Language Design and Programming
   Methodology Symposium, Springer-Verlag, Sydney, Australia (September 1979). 
4. A. Springer, ``A Comparison of Language C and Pascal,'' IBM Technical Report G320-2128, Cambridge
   Scientific Center (August 1979). 
5. B. W. Kernighan and P. J. Plauger, Software Tools, Addison-Wesley, Reading, Mass. (1976). 
6. K. Jensen and N. Wirth, Pascal User Manual and Report, Springer-Verlag (1978). (2nd edition.)
7. David V. Moffat, ``A Categorized Pascal Bibliography,'' SIGPLAN Notices 15(10), pp. 63-75 (October
   1980). 
8. A. N. Habermann, ``Critical Comments on the Programming Language Pascal,'' Acta Informatica 3, pp.
   47-57 (1973). 
9. O. Lecarme and P. Desjardins, ``More Comments on the Programming Language Pascal,'' Acta
   Informatica 4, pp. 231-243 (1975). 
10. H. J. Boom and E. DeJong, ``A Critical Comparison of Several Programming Language
   Implementations,'' Software Practice and Experience 10(6), pp. 435-473 (June 1980). 
11. N. Wirth, ``An Assessment of the Programming Language Pascal,'' IEEE Transactions on Software
   Engineering SE-1(2), pp. 192-198 (June, 1975). 
12. O. Lecarme and P. Desjardins, ibid, p. 239. 
13. A. M. Addyman, ``A Draft Proposal for Pascal,'' SIGPLAN Notices 15(4), pp. 1-66 (April 1980). 
14. J. Welsh, W. J. Sneeringer, and C. A. R. Hoare, ``Ambiguities and Insecurities in Pascal,''
   Software Practice and Experience 7, pp. 685-696 (1977). 
15. N. Wirth, ibid., p. 196. 
16. J. D. Gannon and J. J. Horning, ``Language Design for Programming Reliability,'' IEEE Trans.
   Software Engineering SE-1(2), pp. 179-191 (June 1975). 
17. J. D. Ichbiah, et al, ``Rationale for the Design of the Ada Programming Language,'' SIGPLAN
   Notices 14(6) (June 1979). 
18. J. Welsh, W. J. Sneeringer, and C. A. R. Hoare, ibid. 
19. B. W. Kernighan and P. J. Plauger, Software Tools in Pascal, Addison-Wesley (1981). 



***********end r.s. response*****************
Ralph Silverman
z007400b@bcfreenet.seflin.lib.fl.us





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
       [not found] <sperlman-0507961717550001@p121.ezo.net>
       [not found] ` <4rs76l$aqd@ccshst05.uoguelph.ca>
@ 1996-07-16  0:00 ` Darin Johnson
  1996-07-24  0:00   ` Ralph Silverman
  1996-07-17  0:00 ` Aron Felix Gurski
                   ` (18 subsequent siblings)
  20 siblings, 1 reply; 688+ messages in thread
From: Darin Johnson @ 1996-07-16  0:00 UTC (permalink / raw)



> Also keep in mind that this rather lengthy diatribe was comparing c with the 
> 'standard' pascal, not what it has become.  Today's pascal is as
> different from what was being discussed as today's c++ is from the old c.

True, Pascal has been mostly subsumed by Modula II and III.  These are
nice languages, and you can do real-world and systems programming in
them.  They're not as popular (you probably have to go commercial to
get a compiler).
-- 
Darin Johnson
djohnson@ucsd.edu	O-
       Support your right to own gnus.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-15  0:00   ` Should I learn C or Pascal? Ralph Silverman
  1996-07-15  0:00     ` Steve Sobol
@ 1996-07-16  0:00     ` Lee Crites
  1996-07-17  0:00       ` David Verschoore
                         ` (3 more replies)
  1996-07-23  0:00     ` Richard A. O'Keefe
  2 siblings, 4 replies; 688+ messages in thread
From: Lee Crites @ 1996-07-16  0:00 UTC (permalink / raw)



z007400b@bcfreenet.seflin.lib.fl.us (Ralph Silverman) wrote:
>Gabor Egressy (gegressy@uoguelph.ca) wrote:
>: Seth Perlman (sperlman@ezo.net) wrote:
>: : First, my situation: I am going to be a senior in high school in the fall.
>: : I've been toying with programming computers ever since I learned BASIC on
>: : my Commodore 64 at the age of 6. I learned a smattering of C a few years
>: : ago, and I took a semester course in school this year which was an intro
>: : to Pascal. Sadly, it's the most advanced computer course at my school,
>: : which means that I have nowhere else to turn for instruction in
>: : programming.
>
>
>: : *** Should I learn to program in C or Pascal? ***
>
>: Here is a quote from Brian W. Kernighan of "The C Programming Language" fame.
>: He is also an internationally respected lecturer who has been there since 
>: the inception of UNIX and C.
>: "I feel that it is a mistake to use Pascal for anything much beyond its 
>: original target. In its pure form, Pascal is a toy language, suitable for 
>: teaching but not for real programming."
>: Draw your own conclusions.

Before you "draw your own conclusions," keep in mind that Kernighan is one of the 
founding voices in c.  A more biased opinion could hardly be found.  Ask Wirth what 
he thinks about the C vs Pascal dispute.  It would be the same kind of thing.

Also keep in mind that this rather lengthy diatribe was comparing c with the 
'standard' pascal, not what it has become.  Today's pascal is as different from what 
was being discussed as today's c++ is from the old c.

Both languages are evolving and becoming more and more of what the programmers need. 
Part and parcel of the "should I learn this or that" question, though, is the 
thought that one can do without knowing one of them.  I believe this is wrong.  The 
real answer, from my perspective, is you should learn BOTH, and learn both WELL!

After all, these c vs pascal disputes are chiefly religious in nature.  Dispense 
with the arm waving and name calling; either one will do for most applications.  I 
have heard some tell me how some application could only be written in one or the 
other, but so far nobody's been able to really convince me.

I remember the arguments about how COBOL was the ultimate language and nothing else 
could be needed.  I've been hearing these rather anal arguments for nearly 20 years. 
(I started learning programming in college in 1976)

So please, don't trap yourself in the one-or-the-other mindset.  Learn both.  You 
will be a better programmer -- and a more valuable employee -- for it.

Lee Crites
Computer Mavericks









^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
       [not found] <sperlman-0507961717550001@p121.ezo.net>
       [not found] ` <4rs76l$aqd@ccshst05.uoguelph.ca>
  1996-07-16  0:00 ` Darin Johnson
@ 1996-07-17  0:00 ` Aron Felix Gurski
  1996-07-19  0:00 ` Andrew Gierth
                   ` (17 subsequent siblings)
  20 siblings, 0 replies; 688+ messages in thread
From: Aron Felix Gurski @ 1996-07-17  0:00 UTC (permalink / raw)



Darin Johnson wrote:
> 
> > Also keep in mind that this rather lengthy diatribe was comparing c with the
> > 'standard' pascal, not what it has become.  Today's pascal is as
> > different from what was being discussed as today's c++ is from the old c.
> 
> True, Pascal has been mostly subsumed by Modula II and III.  These are
> nice languages, and you can do real-world and systems programming in
> them.  They're not as popular (you probably have to go commercial to
> get a compiler).
> --
> Darin Johnson
> djohnson@ucsd.edu       O-
>        Support your right to own gnus.

The language names are Modula-2 and Modula-3. There is a shareware Modula-2
compiler available for the DOS platform. Check the FAQ in comp.lang.modula2
for information about this and other compilers. Modula-3 compilers are 
available by ftp; check the faq in comp.lang.modula3.

		-- Aron




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-17  0:00       ` David Verschoore
@ 1996-07-17  0:00         ` Anthony Kanner
  1996-07-17  0:00         ` Mark McKinney
  1996-07-20  0:00         ` TRAN PHAN ANH
  2 siblings, 0 replies; 688+ messages in thread
From: Anthony Kanner @ 1996-07-17  0:00 UTC (permalink / raw)



David Verschoore wrote:
> 
> [snip]
> > So please, don't trap yourself in the one-or-the-other mindset.  Learn
> both.  You
> > will be a better programmer -- and a more valuable employee -- for it.
> >
> > Lee Crites
> > Computer Mavericks
> >
> Bravo!
> I would like to point out that a language is a tool.  Any tool used
> improperly will give less than expected results.
> What makes a good programmer is not necessarily the language but the
> technique in which it is used. Learn a
> language well, but more importantly, learn the technique of good
> programming practices.  Once you develop your
> personal 'technique', try another language and see how well your
> techniques port to the new language.
> 
> An artist is more likely able to paint a masterpiece than the man selling
> the paints. ;-)
> 
> You may want to check out Steve McConnell's book Code Complete as you
> learn your target language.
> -Dave

I most definitely agree with Dave.  Code Complete is a GREAT book. I am on page 100 and 
it is one of the best books on computer programming I have read.

Anthony
-- 
----------------------------------------------------
saturn/psx/snes
kanner@pacificnet.net
check out my web site at 
http://www.pacificnet.net/~kanner/
----------------------------------------------------




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-16  0:00     ` Lee Crites
@ 1996-07-17  0:00       ` David Verschoore
  1996-07-17  0:00         ` Anthony Kanner
                           ` (2 more replies)
  1996-07-18  0:00       ` Carlos DeAngulo
                         ` (2 subsequent siblings)
  3 siblings, 3 replies; 688+ messages in thread
From: David Verschoore @ 1996-07-17  0:00 UTC (permalink / raw)




[snip]
> So please, don't trap yourself in the one-or-the-other mindset.  Learn
both.  You 
> will be a better programmer -- and a more valuable employee -- for it.
> 
> Lee Crites
> Computer Mavericks
> 
Bravo!
I would like to point out that a language is a tool.  Any tool used
improperly will give less than expected results.
What makes a good programmer is not necessarily the language but the
technique in which it is used. Learn a 
language well, but more importantly, learn the technique of good
programming practices.  Once you develop your 
personal 'technique', try another language and see how well your
techniques port to the new language.

An artist is more likely able to paint a masterpiece than the man selling
the paints. ;-)

You may want to check out Steve McConnell's book Code Complete as you
learn your target language.
-Dave




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-17  0:00       ` David Verschoore
  1996-07-17  0:00         ` Anthony Kanner
@ 1996-07-17  0:00         ` Mark McKinney
  1996-07-19  0:00           ` Philip Brashear
  1996-07-20  0:00         ` TRAN PHAN ANH
  2 siblings, 1 reply; 688+ messages in thread
From: Mark McKinney @ 1996-07-17  0:00 UTC (permalink / raw)



Dave Verschoore wrote:

>What makes a good programmer is not necessarily the language but the
>technique in which it is used. Lean a 
>language well. but more importantly, learn the technique of good
>programming practices.

This raises a big concern I have always had about how programming is taught 
in general. Problem solving techniques, style, methodologies, etc. should 
be taught or learned prior to a programming language. The "this is how you 
do it and then this is how you do it well" approach seems highly 
ineffective. 
-Mark






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-16  0:00     ` Lee Crites
  1996-07-17  0:00       ` David Verschoore
@ 1996-07-18  0:00       ` Carlos DeAngulo
  1996-07-18  0:00         ` Robert Dewar
                           ` (2 more replies)
  1996-07-18  0:00       ` Patrick Horgan
  1996-07-18  0:00       ` Walter B. Hollman Sr.
  3 siblings, 3 replies; 688+ messages in thread
From: Carlos DeAngulo @ 1996-07-18  0:00 UTC (permalink / raw)



You should definitely learn C/C++. The business world today uses C++ as its
power language to develop the finest applications. Don't let anyone guide
you wrong.





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-16  0:00     ` Lee Crites
  1996-07-17  0:00       ` David Verschoore
  1996-07-18  0:00       ` Carlos DeAngulo
@ 1996-07-18  0:00       ` Patrick Horgan
  1996-07-18  0:00         ` Robert Dewar
                           ` (4 more replies)
  1996-07-18  0:00       ` Walter B. Hollman Sr.
  3 siblings, 5 replies; 688+ messages in thread
From: Patrick Horgan @ 1996-07-18  0:00 UTC (permalink / raw)



In my company and in many other startups in Silicon Valley doing the bleeding
edge work in the newest cool stuff, you can't get a job without being a C++
programmer, period.

If C++ is not a choice for you learn C.  Pascal won't get you a job.

Of course I think you should learn at least seven or eight high level languages
just for fun, and five or six assemblers for the same reason.

-- 

   Patrick J. Horgan    patrick@broadvision.com   Have horse will ride.
   Opinions mine, not my employer's except by most bizarre coincidence.






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-18  0:00       ` Patrick Horgan
@ 1996-07-18  0:00         ` Robert Dewar
  1996-07-19  0:00           ` Billy Chambless
  1996-07-18  0:00         ` Jason Alan Turnage
                           ` (3 subsequent siblings)
  4 siblings, 1 reply; 688+ messages in thread
From: Robert Dewar @ 1996-07-18  0:00 UTC (permalink / raw)



Patrick said

"In my company and in many other startups in Silicon Valley doing the bleeding
edge work in the newest cool stuff, you can't get a job without being a C++
programmer, period."

Note the phrase "C++ programmer"

the second word is by FAR the most important. Concentrate on learning the
basic principles of computer science and software engineering. The language
you learn at first is not so important, and I would say Pascal is probably
a far better choice than C for experimentation with algorithms. You can
easily learn whatever is the language-du-jour when it comes to getting
a job later on.





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-18  0:00       ` Carlos DeAngulo
@ 1996-07-18  0:00         ` Robert Dewar
       [not found]           ` <01bb7588$236982e0$7b91f780@deangulo>
  1996-07-19  0:00           ` Jon Bell
       [not found]         ` <01bb7591$83087d60$87ee6fce@timpent.airshields.com>
  1996-07-19  0:00         ` Dirk Dickmanns
  2 siblings, 2 replies; 688+ messages in thread
From: Robert Dewar @ 1996-07-18  0:00 UTC (permalink / raw)



Carlos says

"You should definitely learn C/C++. The business world today uses C++ as its
power language to develop the finest applications. Don't let anyone guide
you wrong."

Well I would say Carlos gives good advice (don't let anyone guide you wrong)
and you can start by not letting Carlos guide you wrong.

First, the business world uses many languages -- even today far more programs
are written in COBOL than in C and C++ combined by a very large margin. It is
true that a segment of the technical engineering and software development
market uses C and C++ heavily today, but who knows what they may be using
tomorrow. Don't concentrate on learning languages, concentrate on learning
how to program.





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-18  0:00       ` Patrick Horgan
  1996-07-18  0:00         ` Robert Dewar
@ 1996-07-18  0:00         ` Jason Alan Turnage
  1996-07-19  0:00           ` Vic Metcalfe
  1996-07-19  0:00           ` Robert Dewar
  1996-07-19  0:00         ` Andrew Gierth
                           ` (2 subsequent siblings)
  4 siblings, 2 replies; 688+ messages in thread
From: Jason Alan Turnage @ 1996-07-18  0:00 UTC (permalink / raw)



Patrick Horgan (patrick@broadvision.com) wrote:

: If C++ is not a choice for you learn C.  Pascal won't get you a job.

   Unless you want to be a high school computer teacher (snicker, snicker)

: Of course I think you should learn at least seven or eight high level
: languages just for fun, and five or six assemblers for the same reason.

   No doubt.  No good programmer only knows one language.  And no
   really good programmer doesn't know assembly.


--
Jason Turnage
Georgia Tech
turnage@cc.gatech.edu




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-16  0:00     ` Lee Crites
                         ` (2 preceding siblings ...)
  1996-07-18  0:00       ` Patrick Horgan
@ 1996-07-18  0:00       ` Walter B. Hollman Sr.
  3 siblings, 0 replies; 688+ messages in thread
From: Walter B. Hollman Sr. @ 1996-07-18  0:00 UTC (permalink / raw)



Would you be terribly upset if I asked you to take your SILLY-ASSED
question to the "C" or "Pascal" newsgroups?
 
On Jul 16, 1996 05:30:47 in article <Re: Should I learn C or Pascal?>,
'Lee Crites <adonai@mail.jump.net>' wrote:

>z007400b@bcfreenet.seflin.lib.fl.us (Ralph Silverman) wrote:
>>Gabor Egressy (gegressy@uoguelph.ca) wrote:
>>: Seth Perlman (sperlman@ezo.net) wrote:
>>: : First, my situation: I am going to be a senior in high school in the fall.
>>: : I've been toying with programming computers ever since I learned BASIC on
>>: : my Commodore 64 at the age of 6. I learned a smattering of C a few years
>>: : ago, and I took a semester course in school this year which was an intro
>>: : to Pascal. Sadly, it's the most advanced computer course at my school,
>>: : which means that I have nowhere else to turn for instruction in
>>: : programming.
>>
>>: : *** Should I learn to program in C or Pascal? ***
>>
>>: Here is a quote from Brian W. Kernighan of "The C Programming Language" fame.
>>: He is also an internationally respected lecturer who has been there since
>>: the inception of UNIX and C.
>>: "I feel that it is a mistake to use Pascal for anything much beyond its
>>: original target. In its pure form, Pascal is a toy language, suitable for
>>: teaching but not for real programming."
>>: Draw your own conclusions.
>
>Before you "draw your own conclusions," keep in mind that Kernighan is one of
>the founding voices in c.  A more biased opinion could hardly be found.  Ask
>Wirth what he thinks about the C vs Pascal dispute.  It would be the same
>kind of thing.
>
>Also keep in mind that this rather lengthy diatribe was comparing c with the
>'standard' pascal, not what it has become.  Today's pascal is as different
>from what was being discussed as today's c++ is from the old c.
>
>Both languages are evolving and becoming more and more of what the
>programmers need.  Part and parcel to the "should I learn this or that"
>question, though, is the thought that one can do without knowing one of
>them.  I believe this is wrong.  The real answer, from my perspective, is
>you should learn BOTH, and learn both WELL!
>
>After all, these c vs pascal disputes are chiefly religious in nature.
>Dispense with the arm waving and name calling, either one will do for most
>applications.  I have heard some tell me how some application could only be
>written in one or the other, but so far nobody's been able to really
>convince me.
>
>I remember the arguments about how COBOL was the ultimate language and
>nothing else could be needed.  I've been hearing these rather anal arguments
>for nearly 20 years.  (I started learning programming in college in 1976)
>
>So please, don't trap yourself in the one-or-the-other mindset.  Learn both.
>You will be a better programmer -- and a more valuable employee -- for it.
>
>Lee Crites
>Computer Mavericks
-- 
Walter B. Hollman Sr
 
 
 
 
 
 
 





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-18  0:00       ` Patrick Horgan
  1996-07-18  0:00         ` Robert Dewar
  1996-07-18  0:00         ` Jason Alan Turnage
@ 1996-07-19  0:00         ` Andrew Gierth
  1996-07-19  0:00         ` Scott McMahan - Softbase Systems
  1996-07-19  0:00         ` Reto Koradi
  4 siblings, 0 replies; 688+ messages in thread
From: Andrew Gierth @ 1996-07-19  0:00 UTC (permalink / raw)



[This thread should never have been injected into comp.unix.programmer.
Would further contributors *please* take note. Followups set.]

billy@cast.msstate.edu wrote:
>In article <dewar.837728071@schonberg>, dewar@cs.nyu.edu (Robert Dewar) writes:
>|> Note the phrase "C++ programmer"
 
>|> the second word is by FAR the most important. Concentrate on learning the
>|> basic principles of computer science and software engineering. The language
>|> you learn at first is not so important, and I would say Pascal is probably

>Amen! Any good programmer can learn any new language that comes along.
>The danger new programmers often fall into is that of learning to be a
>"foo programmer", where foo is the language du jour.

>The technology changes too fast to be locked into one mode. C++ is
>mega-way-cool today, but remember, C was the end-all a while back, and
>COBOL before that.

>Don't strive to be a C++ Programmer or a Java Programmer -- become a
>Good Programmer.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-18  0:00         ` Jason Alan Turnage
  1996-07-19  0:00           ` Vic Metcalfe
@ 1996-07-19  0:00           ` Robert Dewar
  1996-07-20  0:00             ` steved
  1996-07-23  0:00             ` Ralph Silverman
  1 sibling, 2 replies; 688+ messages in thread
From: Robert Dewar @ 1996-07-19  0:00 UTC (permalink / raw)



Jason said

": Of course I think you should learn at least seven or eight high level
: languages just for fun, and five or six assemblers for the same reason.

   No doubt.  No good programmer only knows one language.  And no
   really good programmer doesn't know assembly."

I worry at this recommendation. It encourages what I often see at the
beginning level of the "language collecting" phenomenon. People think
that learning about programming is learning the syntax of lots of 
different languages, while not really knowing how to program in any of
them.

Yes it is true that really good programmers tend to know several languages
and to know assembler, but this is not always true, I know some quite
brilliant COBOL programmers around who don't know other languages, and
these days you quite often find very good programmers who only know C.

On the other hand, I sure know lots of *terrible* programmers who can
program equally terribly in many different languages.

I still think the important thing at the start is to learn how to program. It
is worth using a language that is rich enough to introduce all the necessary
abstraction concepts (Borland Object Pascal, Ada 95, C++ meet this criterion,
this is not a complete list of course, but you get the idea). It is a 
mistake to learn C to start with, since it lacks critical abstraction
features and so you will tend to miss the importance of data abstraction
and parametrization at the module level (it is not that this cannot be done
in C, just that you are unlikely to learn it if you start by learning C).
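
To make that concrete, here is only a rough sketch (the file and function
names are made up for illustration) of the kind of module-level data
abstraction meant above, done by hand in C: a "spec" header exposing an
opaque type, and a "body" file owning the representation.

   /* counter.h -- the "spec": clients see only this interface */
   typedef struct Counter Counter;    /* opaque; representation is hidden */
   Counter *counter_new(void);
   void     counter_bump(Counter *c);
   int      counter_value(const Counter *c);

   /* counter.c -- the "body": the representation lives only here */
   #include <stdlib.h>
   #include "counter.h"
   struct Counter { int value; };
   Counter *counter_new(void)               { return calloc(1, sizeof(Counter)); }
   void     counter_bump(Counter *c)        { c->value = c->value + 1; }
   int      counter_value(const Counter *c) { return c->value; }

A client that includes counter.h cannot touch the representation at all
(error handling omitted for brevity). The language permits this style; it
just does nothing to lead a beginner toward it.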

But in any case, the important thing is to concentrate on programming, not
on language collecting, at an early stage. Unfortunately many high school
teachers who teach computing have not progressed much beyond the language
collecting stage themselves, so you often have to rely on books at that
level.





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-18  0:00         ` Robert Dewar
       [not found]           ` <01bb7588$236982e0$7b91f780@deangulo>
@ 1996-07-19  0:00           ` Jon Bell
  1996-07-22  0:00             ` Tim Oxler
  1 sibling, 1 reply; 688+ messages in thread
From: Jon Bell @ 1996-07-19  0:00 UTC (permalink / raw)



 Robert Dewar <dewar@cs.nyu.edu> wrote:
>First, the business world uses many languages -- even today far more programs
>are written in COBOL than in C and C++ combined by a very large margin.

Just out of curiosity, how much *new* development takes place in COBOL, 
as opposed to maintenance and extension of existing systems?   This does
not imply that I'm downgrading maintenance, by the way.  I've done some
of it (although not in COBOL), and I know that it can be a real challenge.

-- 
Jon Bell <jtbell@presby.edu>                        Presbyterian College
Dept. of Physics and Computer Science        Clinton, South Carolina USA




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-20  0:00             ` steved
@ 1996-07-19  0:00               ` Peter Seebach
  1996-07-20  0:00                 ` Jon Bell
  1996-07-20  0:00                 ` Robert Dewar
  0 siblings, 2 replies; 688+ messages in thread
From: Peter Seebach @ 1996-07-19  0:00 UTC (permalink / raw)



In article <4spj1f$prf@news.pacifier.com>,
Steve Doiel <steved@pacifier.com> wrote:
>In 'C' I find that programmers talk about the types "integer", and "double".
>In Pascal I find that programmers talk about "inches" and "liters".

I object to this generalization.

This is true of inexperienced programmers in many languages.  It may be
true of more C programmers, but then, there are an awful lot of C programmers
from which to select the worst.  :)

However, it's not the C language that predisposes people to think in terms of
implementation rather than interface; rather, such people are likely to drift
towards C, because it allows them to work in this domain, and even makes it
tolerably easy.

I don't think it's fair or appropriate to blame C for the vast population of
idiots who get involved with it.  It's an excellent language in some ways,
which led to its eventual success in some fields, which led to a lot of utter
morons trying to use it, or write books about it.

This doesn't mean it can't be used well; although C doesn't always help you
write good code, it certainly doesn't prevent you from doing so.

-s
-- 
Peter Seebach - seebs@solon.com - Copyright 1996 - http://www.solon.com/~seebs
Unix/C Wizard - send mail for help, or send money for consulting!
The *other* C FAQ, the hacker FAQ, et al.  See web page above.
Unsolicited email (junk mail and ads) is unwelcome, and will be billed for.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
       [not found]           ` <01bb7588$236982e0$7b91f780@deangulo>
@ 1996-07-19  0:00             ` Robert Dewar
  1996-07-20  0:00             ` steidl
  1 sibling, 0 replies; 688+ messages in thread
From: Robert Dewar @ 1996-07-19  0:00 UTC (permalink / raw)



Carlos said

"And about your COBOL remark... The only reason COBOL is still alive in the
business world, is because large corporations have this code so spread out
through out their corporations that changing to anything else would cost
millions of dollars. Convincing customers to change to another language
because it is more powerful usually doesn't "cost justify" itself to them.
Fortunately we as programmers have the choice of what environment we want
to spend our time in. If you want to spend your time working on a dinosaur
language like COBOL, or if you want to spend your time on C++ which is the
language that most of the software development corporations in the world
are using."

In the information systems world, which clearly from your remarks you
are unfamiliar with, C++ is not particularly attractive, and for example
Smalltalk has generated much more interest. COBOL, which I also would guess
from your remarks is something you are not familiar with, is in fact a 
powerful tool, and is the language of choice, even for a lot of new
development being done in client/server setups using PC's.

One thing you quickly learn in this field is that most people have a rather
narrow view of the world (for example, it is not unusual to find people
who assume that Unix has a large percentage of the operating system
market).





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-19  0:00             ` Jason Alan Turnage
@ 1996-07-19  0:00               ` Robert Dewar
  1996-07-20  0:00                 ` TRAN PHAN ANH
                                   ` (3 more replies)
  1996-07-22  0:00               ` Stephen M O'Shaughnessy
  1 sibling, 4 replies; 688+ messages in thread
From: Robert Dewar @ 1996-07-19  0:00 UTC (permalink / raw)



"
        Never, never, never try to start learning a language before you
        learn how to program.  A good algorithm simply cannot be replaced,
        and learning how to write an algorithm is in programming, not
        in learning a language.  You can sit down and read a hundred books
        about how to use pointers and linked lists in c++, and you still
        won't know how to use them in a good manner, if at all."


I am very familiar with the highly unusual approach Georgia Tech takes, but
I find the above remark rubbish. You cannot express algorithms unless you
use a language to express them in, and for my taste, a well chosen 
programming language is as good a choice as anything.





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-18  0:00         ` Jason Alan Turnage
@ 1996-07-19  0:00           ` Vic Metcalfe
  1996-07-19  0:00           ` Robert Dewar
  1 sibling, 0 replies; 688+ messages in thread
From: Vic Metcalfe @ 1996-07-19  0:00 UTC (permalink / raw)



Jason Alan Turnage (turnage@cc.gatech.edu) wrote:
: Patrick Horgan (patrick@broadvision.com) wrote:

: : If C++ is not a choice for you learn C.  Pascal won't get you a job.
:    Unless you want to be a high school computer teacher (snicker, snicker)

My company does development in Delphi, which is a mutant version of Pascal.
We recently tried to hire additional Delphi programmers, but found that
none who were at all qualified could be found.  We would have been happy
to find any programmer that knew Pascal as well as most good programmers 
know C.

I didn't see in your orignal post what platform you are learning on, but
if it is MS-Windows, and you have access to Delphi, there are plenty of
jobs out there.  On the other hand, if you are using unix, and have a
standard or extended pascal compiler, I wouldn't waste my time with it,
but would learn other more common languages.

Good luck,
  Vic.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-18  0:00         ` Robert Dewar
@ 1996-07-19  0:00           ` Billy Chambless
  0 siblings, 0 replies; 688+ messages in thread
From: Billy Chambless @ 1996-07-19  0:00 UTC (permalink / raw)



In article <dewar.837728071@schonberg>, dewar@cs.nyu.edu (Robert Dewar) writes:
 
|> Note the phrase "C++ programmer"
 
|> the second word is by FAR the most important. Concentrate on learning the
|> basic principles of computer science and software engineering. The language
|> you learn at first is not so important, and I would say Pascal is probably

Amen! Any good programmer can learn any new language that comes along.
The danger new programmers often fall into is that of learning to be a
"foo programmer", where foo is the language du jour.

The technology changes too fast to be locked into one mode. C++ is
mega-way-cool today, but remember, C was the end-all a while back, and
COBOL before that.

Don't strive to be a C++ Programmer or a Java Programmer -- become a
Good Programmer.
-- 
* Billy Chambless  | billy@cast.msstate.edu |  voice: 601-688-7608
* Mississippi State University Center for Air-Sea Technology 
* "And I don't like doing silly things (except on purpose)."
*                  --Larry Wall in <1992Jul3.191825.14435@netlabs.com>





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-19  0:00           ` johnf
@ 1996-07-19  0:00             ` Jeremy Nelson
  1996-07-19  0:00             ` Jason Alan Turnage
                               ` (3 subsequent siblings)
  4 siblings, 0 replies; 688+ messages in thread
From: Jeremy Nelson @ 1996-07-19  0:00 UTC (permalink / raw)



johnf <johnf@nando.com> wrote:
>So, am I only learning C, and not "how to program"? I don't understand
>how the two can be exclusive.

"Programming" is seeing a problem and knowing how to construct a
language-independant algorithm for solving the problem.  Ie, programming
is problem solving, with the added value that you understand how to make
a computer help you do the work.

"Learning C" is understanding the syntax and semantics of a specific
computer programming language without understanding the nuances of how
it can be applied to solve larger problems that may transcend the ability
of that language.


If you know how to program, then you already know how to determine which
language is the right tool for the problem.  Not every problem can be solved
in every language, and knowing which one to use at which time is fundamental.

If you know how to write semantically correct C programs, but you don't have
any idea what to do if you are confronted with a problem that you can't solve
using C, then you really don't have the "programming" side of it.

As warped as it sounds, I knew several languages, but I never really
became a "programmer" until I took a Fortran class where the whole idea was
using the computer to do problem solving.

jfn




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
       [not found] <sperlman-0507961717550001@p121.ezo.net>
                   ` (4 preceding siblings ...)
  1996-07-19  0:00 ` Andrew Gierth
@ 1996-07-19  0:00 ` Andrew Gierth
  1996-07-21  0:00 ` Laurent Guerby
                   ` (14 subsequent siblings)
  20 siblings, 0 replies; 688+ messages in thread
From: Andrew Gierth @ 1996-07-19  0:00 UTC (permalink / raw)




[This thread should never have been injected into comp.unix.programmer.
Would respondents *please* take that into account. Followups set.]

brashear@ns1.sw-eng.falls-church.va.us wrote:
>In article <4sjmtk$e95@herald.concentric.net>,
>Mark  McKinney  <mckmark@mail.concentric.net> wrote:
>>
>>This raises a big concern I have always had about how programming is taught
>>in general. Problem solving techniques, style, methodologies, etc. should
>>be taught or learned prior to a programming language. The "this is how you
>>do it and then this is how you do it well" approach seems highly
>>ineffective.
>>-Mark
>>
>>
>
>This reminds me of the high school English teacher who said "Teach them
>grammar in elementary school, and I'll teach them how to write (compose)."
>
>How do you learn grammar without writing (composing)?  How do you learn
>problem solving techniques, style, methodologies, etc. without actually
>solving problems, creating programs according to certain styles, using
>a programming language to apply a methodology?  Might as well try to teach
>a child the mathematical discipline of knot theory before teaching her how
>to tie her shoes!
>
>Phil B




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
       [not found] <sperlman-0507961717550001@p121.ezo.net>
                   ` (3 preceding siblings ...)
  1996-07-19  0:00 ` Andrew Gierth
@ 1996-07-19  0:00 ` Andrew Gierth
  1996-07-19  0:00 ` Andrew Gierth
                   ` (15 subsequent siblings)
  20 siblings, 0 replies; 688+ messages in thread
From: Andrew Gierth @ 1996-07-19  0:00 UTC (permalink / raw)




[This thread should never have been injected into comp.unix.programmer.
Would all respondents *please* take that into account. Followups set.]

jtbell@presby.edu wrote:
> Robert Dewar <dewar@cs.nyu.edu> wrote:
>>First, the business world uses many languages -- even today far more programs
>>are written in COBOL than in C and C++ combined by a very large margin.
>
>Just out of curiosity, how much *new* development takes place in COBOL, 
>as opposed to maintenance and extension of existing systems?   This does
>not imply that I'm downgrading maintenance, by the way.  I've done some
>of it (although not in COBOL), and I know that it can be a real challenge.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
       [not found] <sperlman-0507961717550001@p121.ezo.net>
                   ` (2 preceding siblings ...)
  1996-07-17  0:00 ` Aron Felix Gurski
@ 1996-07-19  0:00 ` Andrew Gierth
  1996-07-19  0:00 ` Andrew Gierth
                   ` (16 subsequent siblings)
  20 siblings, 0 replies; 688+ messages in thread
From: Andrew Gierth @ 1996-07-19  0:00 UTC (permalink / raw)



[This thread should never have been injected into comp.unix.programmer.
Would respondents *please* take that into account. Followups set.]

Craig Franck <clfranck@worldnet.att.net> wrote:
>Just an observation about people who argue (excuse me, discuss)
>programming languages. The people who know and use several languages
>seem much less inclined to declare any one the best, while those who
>know or use only one language seem to think it's always the best.
>My answer to the question would be with another question:
>Why not learn both?

>Since Pascal is like so many other languages, it's good to know in case
>you ever need to learn something like Ada or join a team that is
>going to use Borland's Delphi to develop database applications.

>Also, since C is so unlike so many other languages, i.e. its terse
>syntax and heavy use of operators as opposed to keywords, it's
>possible after years of heavy use to become "syntactically crippled".
>I know some excellent, highly intelligent programmers who cannot
>look at a COBOL listing without having to sit down and stare out
>the window just to get reoriented to reality...






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
       [not found]         ` <01bb7591$83087d60$87ee6fce@timpent.airshields.com>
  1996-07-19  0:00           ` johnf
@ 1996-07-19  0:00           ` Craig Franck
  1 sibling, 0 replies; 688+ messages in thread
From: Craig Franck @ 1996-07-19  0:00 UTC (permalink / raw)



Just an observation about people who argue (excuse me, discuss)
programming languages. The people who know and use several languages
seem much less inclined to declare any one the best, while those who
know or use only one language seem to think it's always the best.
My answer to the question would be with another question:
Why not learn both?

Since Pascal is like so many other languages, it's good to know in case
you ever need to learn something like Ada or join a team that is
going to use Borland's Delphi to develop database applications.

Also, since C is so unlike so many other languages, i.e. its terse
syntax and heavy use of operators as opposed to keywords, it's
possible after years of heavy use to become "syntactically crippled".
I know some excellent, highly intelligent programmers who cannot
look at a COBOL listing without having to sit down and stare out
the window just to get reoriented to reality...






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-19  0:00           ` johnf
  1996-07-19  0:00             ` Jeremy Nelson
@ 1996-07-19  0:00             ` Jason Alan Turnage
  1996-07-19  0:00               ` Robert Dewar
  1996-07-22  0:00               ` Stephen M O'Shaughnessy
  1996-07-20  0:00             ` Tim Behrendsen
                               ` (2 subsequent siblings)
  4 siblings, 2 replies; 688+ messages in thread
From: Jason Alan Turnage @ 1996-07-19  0:00 UTC (permalink / raw)



johnf (johnf@nando.com) wrote:

: OK

: I am one of these newbies.
: I haven't programmed anything, ever, with any language.
: I am currently learning C with the help of Dave Mark (Learn C on Mac) as
: my baptism into programming. 
: So, am I only learning C, and not "how to program"? I don't understand
: how the two can be exclusive.
: How does one learn how to be a "Good Programmer" without picking a
: language to learn first, learning it well, then learning others as they
: interest you? 

	Never, never, never try to start learning a language before you
	learn how to program.  A good algorithm simply cannot be replaced,
	and learning how to write an algorithm is in programming, not
	in learning a language.  You can sit down and read a hundred books
	about how to use pointers and linked lists in c++, and you still
	won't know how to use them in a good manner, if at all.

	Here at GA Tech, in the first computer course you can take, a
	pseudo language is taught so that the real objective of the class,
	algorithm teaching, won't be disturbed by students trying to jump
	the gun and diving in to any one language.  This pseudo language
	is a mixture of c, pascal, and a few other languages, along with
	some non-language stuff like 'var1 isa number', so the non-programmers
	can understand it.  Probably half or more students going into
	a programming major here have never programmed before, and would be
	totally lost in the second computer programming course (programming
	in Pascal) if they hadn't taken this course first.

: I am not trying to be a wise guy, just a guy who can learn to program well
: enough to get out of his crappy job and into this (for me) exciting field
: as a career.
: I don't expect to start as the Sr. Developer on some project, I will
: happily slog it out in the trenches and pay my dues, just explain to me
: how to get there...

	The best programmers can do anything on a computer, absolutely
	anything that the computer will let them do, and sometimes even
	more.  And these are the guys that make the money.  And the only
	way to become one of these guys is to learn how to program.
	Once you learn how to program, a language is no problem, it's the
	easy part.  If you start with learning a language, you'll have one
	hell of a time if you ever have to learn another language, or better
	yet, to convert your current code to another language.

	Good luck.

--
Jason Turnage
Georgia Tech
turnage@cc.gatech.edu
www.prism.gatech.edu/~gt8678a




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-18  0:00       ` Carlos DeAngulo
  1996-07-18  0:00         ` Robert Dewar
       [not found]         ` <01bb7591$83087d60$87ee6fce@timpent.airshields.com>
@ 1996-07-19  0:00         ` Dirk Dickmanns
  2 siblings, 0 replies; 688+ messages in thread
From: Dirk Dickmanns @ 1996-07-19  0:00 UTC (permalink / raw)



"Carlos DeAngulo" <cdvi@msg.itg.ti.com> writes:

>You should definitely learn C/C++. The business world today uses C++ as its
>power language to develop the finest applications. Don't let anyone guide
>you wrong.

Yes, and don't look at anything else, it could happen that you
would learn something about programming!

The world does _NOT_ end at C or C++, even if Carlos says so.

Dirk

--
Dirk Dickmanns -- REALIS -- real-time dynamic computer vision
Sun OS 4.1.3; PC Linux; Ada, OCCAM, C, Eiffel, PROLOG, C++




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-17  0:00         ` Mark McKinney
@ 1996-07-19  0:00           ` Philip Brashear
  1996-07-23  0:00             ` John A Hughes
  0 siblings, 1 reply; 688+ messages in thread
From: Philip Brashear @ 1996-07-19  0:00 UTC (permalink / raw)



In article <4sjmtk$e95@herald.concentric.net>,
Mark  McKinney  <mckmark@mail.concentric.net> wrote:
>
>This raises a big concern I have always had about how programming is taught
>in general. Problem solving techniques, style, methodologies, etc. should
>be taught or learned prior to a programming language. The "this is how you
>do it and then this is how you do it well" approach seems highly
>ineffective.
>-Mark
>
>

This reminds me of the high school English teacher who said "Teach them
grammar in elementary school, and I'll teach them how to write (compose)."

How do you learn grammar without writing (composing)?  How do you learn
problem solving techniques, style, methodologies, etc. without actually
solving problems, creating programs according to certain styles, using
a programming language to apply a methodology?  Might as well try to teach
a child the mathematical discipline of knot theory before teaching her how
to tie her shoes!

Phil B





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-18  0:00       ` Patrick Horgan
                           ` (3 preceding siblings ...)
  1996-07-19  0:00         ` Scott McMahan - Softbase Systems
@ 1996-07-19  0:00         ` Reto Koradi
  1996-07-23  0:00           ` TRAN PHAN ANH
  4 siblings, 1 reply; 688+ messages in thread
From: Reto Koradi @ 1996-07-19  0:00 UTC (permalink / raw)



Patrick Horgan wrote:
> In my company and in many other startups in Silicon Valley doing
> the bleeding edge work in the newest cool stuff, you can't get a
> job without being a C++ programmer, period.

Such statements keep irritating me. Programming languages are nothing
but tools, and if you know the principles of programming and have
learned a few other languages, you can start programming in C on the
first day, and learn about the more subtle points with time.
I grew up with the Pascal/Modula-2/Oberon line (what do you expect
when studying at Wirth's university?), and didn't have the
slightest problem programming in C when I started my first job.

Even though C and C++ dominate the workplace, languages like Modula-2
are still much better for learning programming.
-- 
Reto Koradi (kor@mol.biol.ethz.ch, http://www.mol.biol.ethz.ch/~kor)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-18  0:00       ` Patrick Horgan
                           ` (2 preceding siblings ...)
  1996-07-19  0:00         ` Andrew Gierth
@ 1996-07-19  0:00         ` Scott McMahan - Softbase Systems
  1996-07-20  0:00           ` steidl
  1996-07-20  0:00           ` Tim Behrendsen
  1996-07-19  0:00         ` Reto Koradi
  4 siblings, 2 replies; 688+ messages in thread
From: Scott McMahan - Softbase Systems @ 1996-07-19  0:00 UTC (permalink / raw)



Patrick Horgan (patrick@broadvision.com) wrote:

: Of course I think you should learn at least seven or eight high level languages
: just for fun 

Once you know C, Pascal and other structured languages
aren't much of a challenge. Perl is very similar. You can
learn 5-6 languages just from C and a little study. Throw
in C++ and a couple of others and you'll be all set.

: and five or six assemblers for the same reason.

I'd rather just learn C and port it. Asm isn't as important
anymore now that there are so many different platforms.

Scott





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
       [not found]         ` <01bb7591$83087d60$87ee6fce@timpent.airshields.com>
@ 1996-07-19  0:00           ` johnf
  1996-07-19  0:00             ` Jeremy Nelson
                               ` (4 more replies)
  1996-07-19  0:00           ` Craig Franck
  1 sibling, 5 replies; 688+ messages in thread
From: johnf @ 1996-07-19  0:00 UTC (permalink / raw)



In article <01bb7591$83087d60$87ee6fce@timpent.airshields.com>, "Tim
Behrendsen" <tim@airshields.com> wrote:

>Carlos DeAngulo <cdvi@msg.itg.ti.com> wrote in article
><01bb74ac$b7aa7860$7b91f780@deangulo>...
>> You should definitely learn C/C++. The business world today uses C++ as
>> its power language to develop the finest applications. Don't let anyone
>> guide you wrong.
>
>Not to start a flame war on C++, but all you newbie programmers
>out there, don't believe everything you hear about C++.  Object
>oriented programming has a lot of good concepts, but C++ is a bad
>implementation of them.  Not that you shouldn't learn it, but
>don't think it's the ultimate expression of what OOP is all about.
>
>C++: The PL/I of the 90s.

OK

I am one of these newbies.
I haven't programmed anything, ever, with any language.
I am currently learning C with the help of Dave Mark (Learn C on Mac) as
my baptism into programming. 
> So, am I only learning C, and not "how to program"? I don't understand
how the two can be exclusive.
How does one learn how to be a "Good Programmer" without picking a
language to learn first, learning it well, then learning others as they
interest you? 
I am not trying to be a wise guy, just a guy who can learn to program well
enough to get out of his crappy job and into this (for me) exciting field
as a career.
I don't expect to start as the Sr. Developer on some project, I will
happily slog it out in the trenches and pay my dues, just explain to me
how to get there...

Thank you,

Johnf

-- 
johnf@nando.com

Go Falcons




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-19  0:00           ` Robert Dewar
@ 1996-07-20  0:00             ` steved
  1996-07-19  0:00               ` Peter Seebach
  1996-07-23  0:00             ` Ralph Silverman
  1 sibling, 1 reply; 688+ messages in thread
From: steved @ 1996-07-20  0:00 UTC (permalink / raw)



Robert Dewar writes:
[snip]
>I still think the important thing at the start is to learn how to program. It
>is worth using a language that is rich enough to introduce all the necessary
>abstraction concepts (Borland Object Pascal, Ada 95, C++ meet this criterion,
>this is not a complete list of course, but you get the idea). It is a 
>mistake to learn C to start with, since it lacks critical abstraction
>features and so you will tend to miss the importance of data abstraction
>and parametrization at the module level (it is not that this cannot be done
>in C, just that you are unlikely to learn it if you start by learning C).

Agreed.  I have an academic background in Computer Information Science and
Physics.  Since then I have worked as a "Software Engineer" for the last ten
years.

Physics teaches problem solving.  This involves breaking things down in an
abstract manner to learn things about the physical world.

IMHO Computer Science (really engineering) is best applied by breaking
a problem down to an abstract description, and then designing a software
solution in terms of that abstraction.

Pascal, Ada, Modula-2 (and more I'm sure) permit you to write software in
what I would call "the problem domain".

In my experience, people with a 'C' only background have great difficulty in
understanding this form of abstraction.

In 'C' I find that programmers talk about the types "integer", and "double".
In Pascal I find that programmers talk about "inches" and "liters".

I also think it's a lot easier to go from Pascal to C than from C to Pascal,
although I did the first so it's hard to say.

Steve Doiel







^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-19  0:00               ` Robert Dewar
  1996-07-20  0:00                 ` TRAN PHAN ANH
@ 1996-07-20  0:00                 ` Jon Bell
  1996-07-20  0:00                   ` Robert Dewar
                                     ` (3 more replies)
  1996-07-20  0:00                 ` Crash
  1996-07-23  0:00                 ` Ralph Silverman
  3 siblings, 4 replies; 688+ messages in thread
From: Jon Bell @ 1996-07-20  0:00 UTC (permalink / raw)



[removed comp.unix.programmer (irrelevant) and comp.dos.programmer
(bogus) from the 'Newsgroups:' line; added comp.edu]

 Robert Dewar <dewar@cs.nyu.edu> wrote:
>  You cannot express algorithms unless you
>use a language to express them in, and for my taste, a well chosen 
>programming language is as good a choice as anything.

For *my* taste, a real programming language is *better*, because you can 
test the correctness of your solution by executing it.  Try *that* with 
pseudocode or data flow diagrams!  :-)

(Which is not to downgrade their usefulness as design tools.)

I do appreciate the counter-argument that it's asking a lot of students 
to master both program design skills and language syntax at the same 
time.  Therefore I respect the decisions made by schools that use a
"non-marketable" language such as Modula-"n" or Scheme as their first 
programming language.  I think it is a reasonable strategy to start with 
such a language, then switch to a "real" language later, in the context 
of a four-year degree program.  Of course, students may need to be 
persuaded that this is actually worthwhile!

-- 
Jon Bell <jtbell@presby.edu>                        Presbyterian College
Dept. of Physics and Computer Science        Clinton, South Carolina USA
[for beginner's Usenet info, see http://cs1.presby.edu/~jtbell/usenet/]




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-19  0:00               ` Peter Seebach
@ 1996-07-20  0:00                 ` Jon Bell
  1996-07-20  0:00                   ` Andy Askey
  1996-07-20  0:00                 ` Robert Dewar
  1 sibling, 1 reply; 688+ messages in thread
From: Jon Bell @ 1996-07-20  0:00 UTC (permalink / raw)



 Peter Seebach <seebs@solon.com> wrote:
>I don't think it's fair or appropriate to blame C for the vast population of
>idiots who get involved with it.  It's an excellent language in some ways,
>which led to its eventual success in some fields, which led to a lot of utter
>morons trying to use it, or write books about it.

It also led to C being used as a "Swiss-army-knife" language, often in 
areas in which another language would be a better choice.

>This doesn't mean it can't be used well; although C doesn't always help you
>write good code, it certainly doesn't prevent you from doing so.

Agreed.  By the same token, it's perfectly possible to design good
programs and write good code in older versions of Fortran or BASIC.  But
you have to take a lot more care when doing this; you have to invent your
own "building blocks" for good design, and you have to use them
consistently.  Languages with good modularizing features built into them
make this easier. 

When someone is beginning to program, it's hard enough already to get 
them to use these modularizing features, even when they're built in, 
because they require some (gasp!) thought and planning before coding.  
When they have to "simulate" these features using lower-level constructs, 
it just makes more work for them, and they resist even more strongly.

-- 
Jon Bell <jtbell@presby.edu>                        Presbyterian College
Dept. of Physics and Computer Science        Clinton, South Carolina USA
[for beginner's Usenet info, see http://cs1.presby.edu/~jtbell/usenet/]




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-19  0:00               ` Peter Seebach
  1996-07-20  0:00                 ` Jon Bell
@ 1996-07-20  0:00                 ` Robert Dewar
  1996-07-22  0:00                   ` steidl
  1 sibling, 1 reply; 688+ messages in thread
From: Robert Dewar @ 1996-07-20  0:00 UTC (permalink / raw)



Peter says

"This doesn't mean it can't be used well; although C doesn't always help you
write good code, it certainly doesn't prevent you from doing so."

Although I agree that it does not *prevent* you, it certainly makes it
hard, and what we see in practice is that few programmers persevere
through the road blocks.

For example, headers can almost be used like package specs. They very 
seldom are (it is common for example to find uncommented headers).
But if you use headers as package specs, you find you have a nasty
problem if these headers define data, since the notion of macro copying
is not quite right. Yes, you can get around this, but most people don't
bother to do it systematically.
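
Done systematically, that means (file names here are purely illustrative)
the header carries only a declaration and exactly one .c file owns the
definition:

   /* config.h -- the "spec"; safe to #include from many files */
   extern int verbosity;

   /* config.c -- the "body"; the one and only definition */
   #include "config.h"
   int verbosity = 0;

Because #include is plain textual copying, putting an initialized definition
in the header instead typically draws duplicate-definition errors at link
time, and getting around that consistently is exactly the discipline most
C code skips.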

As another example, the lack of named types that are well separated means
that you gain much less from

   typedef int Inches;

than from (in Ada)

   type Inches is new Integer;

since in C you can assign Inches to int (or anything else) anyway. The
result is that, although it would still be useful for documentation and
specification purposes to introduce a typedef like the above, most C code
does not bother.
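
Concretely (the unit names are only illustrative), a C compiler accepts all
of the following without a murmur, where the Ada declaration above would
reject the mixed assignment outright:

   typedef int Inches;
   typedef int Centimeters;

   int mixed_units(void)
   {
      Inches      i = 10;
      Centimeters c = i;     /* fine: both names are just int */
      return i + c;          /* mixed-unit arithmetic, no complaint */
   }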

In general both data abstraction and separation of interfaces and 
implementation do not come very naturally in a C environment. Now if
you have learned these principles in the context of another language
like Ada 95, then you will find that you can indeed apply them in C,
but people who learn C as their only, or first language, if not taught
*really* well are likely to establish their understanding of what
programming is about with a complete lack of understanding of these
fundamental issues.

Languages are more than just a set of syntax and semantic rules, they also
come with a "typical style" of writing code. This style is partly a matter
of tradition, but is strongly influenced by the design of the syntax and
semantics.

In language comparisons and discussions, people often trot out arguments
that boil down to the following:

   Anyone can write bad code in style x in language y for all x,y
   Anyone can use good technique x in language y for all x,y

These statements are generally true for any language other than a very
trivial one, or for techniques that are very specifically geared to 
language features.

However, they do not address the real issue, which is the typical
style of writing code in a given language and why? For the majority of 
programs, these are the real issues, and what we look for are languages
where the typical style of writing in the language embodies the principles
and techniques that we would like to encourage.

One clue of the importance of these typical style issues is the difficulty
of writing effective automatic translation programs from one language to
another. It is not too hard to write a translator that translates one
computer language to another, but to write one that takes a program written
in the typical style of language X and produces an idiomatic program written
in the typical style of language Y is extremely difficult.

In fact I would venture to guess that at this stage translation of natural
languages from one to another is more successful than translation of
computer programming languages. That's because the entire structure of
a program can be altered as a result of the shift of styles.

Note that this applies to a human too. Someone who knows French and 
English really well can take a novel written in one language, and produce
an acceptable idiomatic translation by translating sentence by sentence.

But even if you know C and Ada well, you will typically find that translating
a C program into Ada will involve major restructuring, and cannot be done
on a statement by statement basis.






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-19  0:00           ` johnf
  1996-07-19  0:00             ` Jeremy Nelson
  1996-07-19  0:00             ` Jason Alan Turnage
@ 1996-07-20  0:00             ` Tim Behrendsen
  1996-07-22  0:00             ` Ralph Silverman
  1996-07-23  0:00             ` John A Hughes
  4 siblings, 0 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-07-20  0:00 UTC (permalink / raw)



johnf <johnf@nando.com> wrote in article
<johnf-1907961506170001@johnf-mac.nando.com>...
> In article <01bb7591$83087d60$87ee6fce@timpent.airshields.com>, "Tim
> Behrendsen" <tim@airshields.com> wrote:
> 
> >Carlos DeAngulo <cdvi@msg.itg.ti.com> wrote in article
> ><01bb74ac$b7aa7860$7b91f780@deangulo>...
> >> You should definitely learn C/C++. The business world today uses C++ as
> >> its power language to develop the finest applications. Don't let anyone
> >> guide you wrong.
> >
> >Not to start a flame war on C++, but all you newbie programmers
> >out there, don't believe everything you hear about C++.  Object
> >oriented programming has a lot of good concepts, but C++ is a bad
> >implementation of them.  Not that you shouldn't learn it, but
> >don't think it's the ultimate expression of what OOP is all about.
> >
> >C++: The PL/I of the 90s.
> 
> OK
> 
> I am one of these newbies.
> I haven't programmed anything, ever, with any language.
> I am currently learning C with the help of Dave Mark (Learn C on Mac) as
> my baptism into programming. 
> So, am I only learning C, and not "how to program"? I don't understand
> how the two can be exclusive.
> How does one learn how to be a "Good Programmer" without picking a
> language to learn first, learning it well, then learning others as they
> interest you? 
> I am not trying to be a wise guy, just a guy who can learn to program well
> enough to get out of his crappy job and into this (for me) exciting field
> as a career.
> I don't expect to start as the Sr. Developer on some project, I will
> happily slog it out in the trenches and pay my dues, just explain to me
> how to get there...
> 
> Thank you,
> Johnf
> 
> -- 
> johnf@nando.com

I will go so far as to say when you are learning to program, it doesn't
matter which language you learn. The important things to learn are
conceptual, not syntactical. In other words, you want to learn how
the computer works, and to understand the "mechanistic nature" of
programs. The actual details of the language are not as important.

If you want to program recreationally, then what you're doing is fine.

If you are really serious about programming as a career, and you want
to be ahead of 99% of all the other programmers, then let me make
a suggestion: Learn assembly language. It will probably be a bit more
confusing at the beginning, but start small and work your way up.
I guarantee you will learn more about programming than *any*
high-level language you could learn. After you gain confidence there,
you will truly understand how high-level languages *work*.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-20  0:00         ` TRAN PHAN ANH
@ 1996-07-20  0:00           ` Mark Eissler
  1996-07-25  0:00             ` Erik Seaberg
  1996-07-26  0:00             ` What's the best language to start with? [was: Re: Should I learn C or Pascal?] Tim Behrendsen
  1996-07-20  0:00           ` Should I learn C or Pascal? Robert Dewar
                             ` (2 subsequent siblings)
  3 siblings, 2 replies; 688+ messages in thread
From: Mark Eissler @ 1996-07-20  0:00 UTC (permalink / raw)



In article <1996Jul20.124025.122789@kuhub.cc.ukans.edu>,
anh@kuhub.cc.ukans.edu (TRAN PHAN ANH) wrote:

> Absolutely right.
> 
> But Pascal or C was the original question.  Start with C is what I say.  
> Better yet, why not C++ then move on to JAVA?
> 
> Besides, if you can master C/C++, and JAVA, it will take you 5 min. to learn 
> Pascal.  Actually, if one has a solid foundation in programming techniques and
> a solid understanding of one or two languages, one can aquire a working 
> knowledge of any language in no time.
> 
> From my point of view, right now, C/C++, and JAVA on a resume is hotter 
> than Pascal.
> 

Yes, but since just about everyone else has said something, I'd say follow
this path: BASIC -> Pascal -> C -> C++ -> JAVA.

Jumping from nothing to C is likely to get you nowhere. Pascal is a
simple language that will teach you certain things that will make it
easier to adapt to or "learn" C. While Pascal may not be used all that
much at a commercial level, it is still an excellent teaching language and
therefore a good place to start. 

Adapting to C is a breeze once you know Pascal. Of course, having had some
passing familiarity with the concepts of programming (possibly picked up
from BASIC) will help a great deal. 

Becoming a "programmer" is something that happens in steps, not all at
once. And the learning continues as the languages evolve.

--
Mark Eissler                      |  Still in Toronto.
tequila@interlog.com              |  Still Raining.
http://www.interlog.com/~tequila/ |  Still...






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-20  0:00                 ` Crash
@ 1996-07-20  0:00                   ` Robert Dewar
  0 siblings, 0 replies; 688+ messages in thread
From: Robert Dewar @ 1996-07-20  0:00 UTC (permalink / raw)



"An algorithm is just a set of instructions on how to accomplish a task. An
algorithm does not depend at all on any sort of programming "language". In
fact, I was taught that you should write out your "game plan" and pseudo
code before you even touch any sort of syntax in any language. Yes, I do
tend to write up my pseudo-code in a C-like syntax frequently, but you
don't need to.

     Loop until variable is equal to 7
        Add 1 to variable
        Print out the following string to screen: ;-)
        end of loop

There's an algorithm written in plain english. It's a stupid algorithm,"

First of all "pseudo-code" is still written in a language, which has syntax
and semantics. The trouble with an algorithm written in "plain english"
like the above is that it is imprecise. For example does the string that
is written to screen include a leading blank? I have no idea from the above.

If variable is 2**32-1, does the above still work? I have no idea.

Are the strings on separate lines of the screen? I have no idea.

writing

   while variable /= 7 loop
      variable := variable + 1;
      put_line (";-)");
   end loop;

is less writing than you did, just as clear, and importantly, exactly
precise (I had to make some guesses as to what your intention was, but
if my guesses were wrong, the above is easily modified to accommodate
them).

Writing at a high semantic level is desirable, and your example would
have been much more effective if for example it had included universal
quantification over finite sets, but that too would better be written
in a well-defined language.

If you went a step further, and wrote non-effective computation steps
involving for example membership in infinite sets, then of course what
you wrote would not correspond to an executable program, but would still
be better written in a precise language (e.g. one of the many
specification languages like Z)

Writing in "plain english" may appeal to those who are petrified of
formal notation of any kind, but such people are not likely to get
very far in this field anyway!





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-19  0:00         ` Scott McMahan - Softbase Systems
  1996-07-20  0:00           ` steidl
@ 1996-07-20  0:00           ` Tim Behrendsen
  1996-07-21  0:00             ` Rich Maggio
  1996-07-22  0:00             ` Ralph Silverman
  1 sibling, 2 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-07-20  0:00 UTC (permalink / raw)



Scott McMahan - Softbase Systems <softbase@mercury.interpath.com> wrote in
article <4sokr1$4c9@news.interpath.net>...
> Patrick Horgan (patrick@broadvision.com) wrote:
> 
> 
> : and five or six assemblers for the same reason.
> 
> I'd rather just learn C and port it. Asm isn't as important
> anymore now that there's so many different platforms.
> 
> Scott

Big disagreement ... assembler is the most critical thing any
programmer can learn.  Not because you're going to use it
every day, but because it will teach you more about what's
*really* going on than 10 high-level language classes.

That's like saying that since most Electronic Engineers use
ICs, it's not necessary to learn the fundamentals of resistors,
capacitors, and transistors.

Programmers who do not know assembly language are dangerous,
because they do not fundamentally understand what the
compiler is generating. They believe in "The Myth of the
Optimizing Compiler", that "compilers are so good nowadays
that you don't have to worry about writing efficient code.
If you do have to worry, then get a better compiler."




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-20  0:00           ` Andy Askey
@ 1996-07-20  0:00             ` steidl
  1996-07-21  0:00               ` Andy Askey
  0 siblings, 1 reply; 688+ messages in thread
From: steidl @ 1996-07-20  0:00 UTC (permalink / raw)



In <4srb1i$n04@news-e2c.gnn.com>, ajaskey@gnn.com (Andy Askey) writes:
>anh@kuhub.cc.ukans.edu (TRAN PHAN ANH) wrote:
>
>>Absolutely right.
>
>>But Pascal or C was the original question.  Start with C is what I say.  
>>Better yet, why not C++ then move on to JAVA?
>
>>Besides, if you can master C/C++, and JAVA, it will take you 5 min. to learn 
>>Pascal.  Actually, if one has a solid foundation in programming techniques and
>>a solid understanding of one or two languages, one can acquire a working 
>>knowledge of any language in no time.

It seems naive to think that one could have a "solid foundation in
programming techniques" if one only has a "solid understanding of one
or two languages" such as C/C++ and JAVA.  How can you obtain a solid
foundation of making use of dynamic types if you haven't had a lot of
practice with SmallTalk or some other language that supports it?  How
can you have a solid foundation in lazy evaluation if you don't have
a decent amount of experience with Haskell or some other language that
supports it?  How can you have a solid foundation in non-procedural
programming if you've only used procedural languages?  What about
functional programming, reflective programming, higher-order functions,
continuation semantics, syntactical pattern matching, metaclasses?
Yes, most of these can be emulated in C/C++, but not easily, and you
are not likely to be well versed in their use if C/C++/JAVA are the
only languages you've really used.
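
For one small illustration of that "possible but not easy" emulation (this
is just a sketch, not taken from any library): a higher-order "apply f to
every element" in C has to be spelled out with function pointers.

   #include <stdio.h>

   /* apply f to each element in place: a poor man's higher-order map */
   static void map_int(int *a, int n, int (*f)(int))
   {
      int i;
      for (i = 0; i < n; i++)
         a[i] = f(a[i]);
   }

   static int twice(int x) { return 2 * x; }

   int main(void)
   {
      int a[] = { 1, 2, 3 };
      map_int(a, 3, twice);
      printf("%d %d %d\n", a[0], a[1], a[2]);   /* prints: 2 4 6 */
      return 0;
   }

It works, but set it next to the one-line equivalent in a functional
language and the point stands.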

The statement "one can aquire a working knowledge of any language in no
time" also seems quite naive.  The languages that do support these
features are not as easy to pick up as you might think - using them to
their fullest requires learning substantially different approaches to
programming problems, just as OOP C++ requires a significantly
different approach than non-OOP C (or non-OOP C++).  And the learning
that comes out of taking that different approach is what will make you
a better programmer.

>>From my point of view, right now, C/C++, and JAVA on a resume is hotter 
>>than Pascal.
>>Anh

That's true, but how long will C++ be king of the hill?  Besides, when
I interview someone for a job, I don't care if their C++ specific
skills are at 100% or not.  I have even let interviewees answer
interview questions using their language of choice because I know that
picking up syntax is simple, but learning concepts is hard (and some
people never learn them).

>Absolutely right, Anh.  I started with Fortran, taught myself C and
>C++ in about a month or so, and then picked up Ada in a couple weeks.
>
>Andy Askey

If you think you learned C++ (and OOP) in a month after only knowing
Fortran, then either you are fooling yourself, or you're a hell of a
lot smarter than me or anyone else I know.  Most of the syntax can be
learned in a week or less.  But OOP (and some of the more obscure
facets of C++) take much longer.

As to the original question, if it is an either-or situation and you
are only considering strict Pascal, then yes C++ is probably the better
choice.  If you allow for Pascal variants (such as Delphi) or more
recent descendants of Pascal (e.g. Modula-3), then the choice is not so
clear-cut.  If your only goal in life is to get a job [:-(], then C++
probably still edges out the newer Pascals.  But that question is moot
and not even worth arguing since no aspiring programmer should limit
themselves to only one language.


-Jeff

steidl@centuryinter.net - http://www.dont.i.wish.com/
All opinions are my own, and are subject to change without notice.






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-19  0:00               ` Robert Dewar
@ 1996-07-20  0:00                 ` TRAN PHAN ANH
  1996-07-22  0:00                   ` Ralph Silverman
  1996-07-20  0:00                 ` Jon Bell
                                   ` (2 subsequent siblings)
  3 siblings, 1 reply; 688+ messages in thread
From: TRAN PHAN ANH @ 1996-07-20  0:00 UTC (permalink / raw)



In article <dewar.837815529@schonberg>, dewar@cs.nyu.edu (Robert Dewar) writes:
> "
>         Never, never, never try to start learning a language before you
>         learn how to program.  A good algorithm simply cannot be replaced,
>         and learning how to write an algorithm is in programming, not
>         in learning a language.  You can sit down and read a hundred books
>         about how to use pointers and linked lists in c++, and you still
>         won't know how to use them in a good manner, if at all."
> 
> 
> I am very familiar with the highly unusual approach Georgia Tech takes, but
> I find the above remark rubbish. You cannot express algorithms unless you
> use a language to express them in, and for my taste, a well chosen 
> programming language is as good a choice as anything.

Rubbish....absolutely.  I know some friendly dudes who know all the bloody 
O(n) for every algorithm, but as soon as they sit down to implement 
one...oh oops...core dump.   Oh, to make matters worse, they choose algorithms 
based solely on O(n).

Anh 




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-19  0:00         ` Scott McMahan - Softbase Systems
@ 1996-07-20  0:00           ` steidl
  1996-07-20  0:00           ` Tim Behrendsen
  1 sibling, 0 replies; 688+ messages in thread
From: steidl @ 1996-07-20  0:00 UTC (permalink / raw)



In <4sokr1$4c9@news.interpath.net>, softbase@mercury.interpath.com (Scott McMahan - Softbase Systems) writes:
>Patrick Horgan (patrick@broadvision.com) wrote:
>
>: Of course I think you should learn at least seven or eight high level languages
>: just for fun 
>
>Once you know C, Pascal and other structured languages
>aren't much of a challenge. Perl is very similar. You can
>learn 5-6 languages just from C and a little study. Throw
>in C++ and couple of others and you'll be all set.

It is more challenging (and beneficial) if you pick languages that are
*not* like C/C++ when picking other languages to learn.  Just a few
examples:  Smalltalk, Haskell, Prolog.  Each of these has unique and
fundamental ways of doing things that are not a part of C or C++.

>: and five or six assemblers for the same reason.
>
>I'd rather just learn C and port it. Asm isn't as important
>anymore now that there's so many different platforms.

He didn't say to write your programs in assembler, just learn it.  It
can give you a better understanding of what's going on "under the hood"
and can thus improve your C/C++ programming.  (I personally found that
pointers were much easier to learn in 6809 than in C.)  Someone in the
comp.lang.c++ group recently asked why they were getting a stack
overflow when their procedure said they had sufficient stack space.  If
that person had been initiated into the world of assembly language
programming and interrupt handling, they would have known why their C++
program wasn't working.
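
For instance, here is a contrived toy (not anyone's real program) that
blows the stack for reasons that are obvious once you have seen how
calls work at the machine level: each call pushes a frame -- return
address, saved registers, locals -- and with no base case the frames
pile up until the stack is exhausted.

/* contrived sketch: unbounded recursion exhausting the stack */
#include <stdio.h>

static long depth = 0;

static void recurse(void)
{
    char local[1024];               /* each frame reserves another 1K */
    local[0] = (char)(depth & 0x7f);
    depth++;
    recurse();                      /* no base case: stack eventually overflows */
}

int main(void)
{
    recurse();
    printf("never reached; depth = %ld\n", depth);
    return 0;
}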


-Jeff

steidl@centuryinter.net - http://www.dont.i.wish.com/
All opinions are my own, and are subject to change without notice.






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-20  0:00         ` TRAN PHAN ANH
  1996-07-20  0:00           ` Mark Eissler
@ 1996-07-20  0:00           ` Robert Dewar
  1996-07-22  0:00             ` TRAN PHAN ANH
  1996-07-23  0:00             ` Ken Garlington
  1996-07-20  0:00           ` Andy Askey
  1996-07-22  0:00           ` Stephen M O'Shaughnessy
  3 siblings, 2 replies; 688+ messages in thread
From: Robert Dewar @ 1996-07-20  0:00 UTC (permalink / raw)



Tran said

"Besides, if you can master C/C++, and JAVA, it will take you 5 min. to learn
Pascal.  Actually, if one has a solid foundation in programming techniques and
a solid understanding of one or two languages, one can acquire a working
knowledge of any language in no time."

Even allowing for a reasonable amount of rhetorical exaggeration, this is
false. First of all, there is no language C/C++; they are two quite
separate languages, and it definitely is NOT the case that if you have
learned one language you can learn another in five minutes. Pretty
quickly, sure, but not in five minutes; for example, the notion of 
non-deterministic semantics for "and" in Pascal will be quite unfamiliar
to a C (or for that matter C++) programmer, and there are always enough
fine points like this to make it more than a trivial matter to become
a knowledgeable programmer in a new language.
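
To illustrate the point with a small sketch (mine, not part of the
original argument): C guarantees that && evaluates left to right and
stops as soon as the result is known, so the idiom below is safe.
Standard Pascal's "and" gives no such guarantee -- a compiler is free
to evaluate both operands -- so a literal translation of this test can
fail, and that is exactly the sort of habit a C programmer has to
unlearn.

/* sketch: relying on C's guaranteed short-circuit evaluation of && */
#include <stddef.h>

struct node { int value; struct node *next; };

int second_is_positive(const struct node *p)
{
    /* p->next is never touched when p is NULL, by definition of && */
    return p != NULL && p->next != NULL && p->next->value > 0;
}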





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-19  0:00               ` Robert Dewar
  1996-07-20  0:00                 ` TRAN PHAN ANH
  1996-07-20  0:00                 ` Jon Bell
@ 1996-07-20  0:00                 ` Crash
  1996-07-20  0:00                   ` Robert Dewar
  1996-07-23  0:00                 ` Ralph Silverman
  3 siblings, 1 reply; 688+ messages in thread
From: Crash @ 1996-07-20  0:00 UTC (permalink / raw)



On 19 Jul 1996, Robert Dewar wrote:

><O<)"
><O<)        Never, never, never try to start learning a language before you
><O<)        learn how to program.  A good algorithm simply cannot be replaced,
><O<)        and learning how to write an algorithm is in programming, not
><O<)        in learning a language.  You can sit down and read a hundred books
><O<)        about how to use pointers and linked lists in c++, and you still
><O<)        won't know how to use them in a good manner, if at all."
><O<)
><O<)
><O<)I am very familiar with the highly unusual approach Georgia Tech takes, but
><O<)I find the above remark rubbish. You cannot express algorithms unless you
><O<)use a language to express them in, and for my taste, a well chosen 
><O<)programming language is as good a choice as anything.
><O<)
><O<)

An algorithm is just a set of instructions on how to accomplish a task. An
algorithm does not depend at all on any sort of programming "language". In
fact, I was taught that you should write out your "game plan" and
pseudo-code before you even touch any sort of syntax in any language. Yes,
I do tend to write up my pseudo-code in a C-like syntax frequently, but you
don't need to.

     Loop until variable is equal to 7
        Add 1 to variable
        Print out the following string to screen: ;-)
     End of loop

There's an algorithm written in plain English. It's a stupid algorithm,
I'll give you that, but one nonetheless, and it's not written in any
programming "language/syntax" I have ever worked with.






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-20  0:00         ` TRAN PHAN ANH
  1996-07-20  0:00           ` Mark Eissler
  1996-07-20  0:00           ` Should I learn C or Pascal? Robert Dewar
@ 1996-07-20  0:00           ` Andy Askey
  1996-07-20  0:00             ` steidl
  1996-07-22  0:00           ` Stephen M O'Shaughnessy
  3 siblings, 1 reply; 688+ messages in thread
From: Andy Askey @ 1996-07-20  0:00 UTC (permalink / raw)



anh@kuhub.cc.ukans.edu (TRAN PHAN ANH) wrote:

>Absolutely right.

>But Pascal or C was the original question.  Start with C is what I say.  
>Better yet, why not C++ then move on to JAVA?

>Besides, if you can master C/C++, and JAVA, it will take you 5 min. to learn 
>Pascal.  Actually, if one has a solid foundation in programming techniques and
>a solid understanding of one or two languages, one can acquire a working 
>knowledge of any language in no time.

>From my point of view, right now, C/C++, and JAVA on a resume is hotter 
>than Pascal.

>Anh


Absolutely right, Anh.  I started with Fortran, taught myself C and
C++ in about a month or so, and then picked up Ada in a couple weeks.

Andy Askey
--
May your karma be excellent for forgiving my spelling mishaps.

Andy Askey
ajaskey@gnn.com





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-20  0:00                 ` Jon Bell
@ 1996-07-20  0:00                   ` Andy Askey
  0 siblings, 0 replies; 688+ messages in thread
From: Andy Askey @ 1996-07-20  0:00 UTC (permalink / raw)



jtbell@presby.edu (Jon Bell) wrote:

> Peter Seebach <seebs@solon.com> wrote:
>>This doesn't mean it can't be used well; although C doesn't always help you
>>write good code, it certainly doesn't prevent you from doing so.

>Agreed.  By the same token, it's perfectly possible to design good
>programs and write good code in older versions of Fortran or BASIC.  But
>you have to take a lot more care when doing this; you have to invent your
>own "building blocks" for good design, and you have to use them
>consistently.  Languages with good modularizing features built into them
>make this easier. 

>When someone is beginning to program, it's hard enough already to get 
>them to use these modularizing features, even when they're built in, 
>because they require some (gasp!) thought and planning before coding.  
>When they have to "simulate" these features using lower-level constructs, 
>it just makes more work for them, and they resist even more strongly.

>-- 
>Jon Bell <jtbell@presby.edu>                        Presbyterian College
>Dept. of Physics and Computer Science        Clinton, South Carolina USA
>[for beginner's Usenet info, see http://cs1.presby.edu/~jtbell/usenet/]


I have found that there are always little nuances of algorithm
implementation depending on the language chosen.  While most well
designed algorithms can be implemented in any language, it is not
always a 1-to-1 mapping when switching languages.  If you wish to see
the complexity of implementing a C or Fortran algorithm in Ada, take a
look at the book "Numerical Recipes in C" (or Fortran).  Try porting
these routines from one language to another.  You can usually do it,
but you can always find a neater approach in the other language.

So much for the standard languages.  If you begin to look at parallel
processing on multiple CPUs, then the algorithms are in no way
similar.  Computing an FFT on a set of data in Ada or C calls for one
or more loops of simplistic calculations and data moving.  If you
write the same code in parallel C on a massively parallel computer
with 16,000 processors, then no looping is required.

And with PCs moving to multiprocessors as the next great jump in
"in my den" technology, algorithms will become more and more dependent on the
language and hardware used.  (Kinda like when computers were first
discovered -- I believe earth must be the cosmic junk yard of old
technology after other beings grow tired of "the next great thing".)

Oh, by the way, learn C and not Pascal.  Everyone will hire a C
programmer but I see very few jobs for Pascal gurus.

--
May your karma be excellent for forgiving my spelling mishaps.

Andy Askey
ajaskey@gnn.com





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-17  0:00       ` David Verschoore
  1996-07-17  0:00         ` Anthony Kanner
  1996-07-17  0:00         ` Mark McKinney
@ 1996-07-20  0:00         ` TRAN PHAN ANH
  1996-07-20  0:00           ` Mark Eissler
                             ` (3 more replies)
  2 siblings, 4 replies; 688+ messages in thread
From: TRAN PHAN ANH @ 1996-07-20  0:00 UTC (permalink / raw)



Absolutely right.

But Pascal or C was the original question.  Start with C is what I say.  
Better yet, why not C++ then move on to JAVA?

Besides, if you can master C/C++, and JAVA, it will take you 5 min. to learn 
Pascal.  Actually, if one has a solid foundation in programming techniques and
a solid understanding of one or two languages, one can acquire a working 
knowledge of any language in no time.

From my point of view, right now, C/C++, and JAVA on a resume is hotter 
than Pascal.

Anh


In article <01bb73e3.1c6a0060$6bf467ce@dave.iceslimited.com>, "David Verschoore" <dversch@ibm.net> writes:
> [snip]
>> So please, don't trap yourself in the one-or-the-other mindset.  Learn
> both.  You 
>> will be a better programmer -- and a more valuable employee -- for it.
>> 
>> Lee Crites
>> Computer Mavericks
>> 
> Bravo!
> I would like to point out that a language is a tool.  Any tool used
> improperly will give less than expected results.
> What makes a good programmer is not necessarily the language but the
> technique in which it is used. Learn a 
> language well, but more importantly, learn the technique of good
> programming practices.  Once you develop your 
> personal 'technique', try another language and see how well your
> techniques port to the new language.
> 
> An artist is more likely able to paint a masterpiece than the man selling
> the paints. ;-)
> 
> You may want to check out Steve McConnell's book Code Complete as you
> learn your target language.
> -Dave




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
       [not found]           ` <01bb7588$236982e0$7b91f780@deangulo>
  1996-07-19  0:00             ` Robert Dewar
@ 1996-07-20  0:00             ` steidl
  1 sibling, 0 replies; 688+ messages in thread
From: steidl @ 1996-07-20  0:00 UTC (permalink / raw)



In <01bb7588$236982e0$7b91f780@deangulo>, "Carlos DeAngulo" <cdvi@msg.itg.ti.com> writes:
>Robert Dewar <dewar@cs.nyu.edu> wrote in article
><dewar.837728192@schonberg>...
>> Carlos says
>> 
>> "You should definitely learn C/C++. The business world today uses C++ as
>its
>> power language to develop the finest applications. Don't let anyone guide
>> you wrong."
>> 
>> Well I would say Carlos gives good advice (don't let anyone guide you
>wrong)
>> and you can start by not letting Carlos guide you wrong.
>> 
>> First, the business world uses many languages -- even today far more
>programs
>> are written in COBOL than in C and C++ combined by a very large margin.
>It is
>> true that a segment of the technical engineering and software devlopment
>> market uses C and C++ heavily today, but who knows what they may be using
>> tomorrow. Don't concentrate on learning languages, concentrate on
>learning
>> how to program.
>> 
>When I say business world I mean the working world. The question was
>whether to learn C or Pascal. All you have to do is pick up any newspaper
>anywhere in the country and see how many jobs are offered for Pascal
>programmers, and compare it to how many jobs are offered to C/C++
>programmers. If you spend your time learning Pascal, that's all you will
>end up doing... spending your time. Pick up a real language, don't waste
>your time on other ones.

OK, so your point is that we all need to program in C/C++ - we don't
really have a choice because that's where the jobs are.

>And about your COBOL remark... The only reason COBOL is still alive in the
[Reasons for COBOL's continued existence snipped]
>Fortunatelly we as programmers have the choice of what environment we want
>to spend our time in.

Oh, but Carlos, poor little me is confused.  Now you say we have a
choice?

>If you want to spend your time working on a dinosaur
>language like COBOL, or if you want to spend your time on C++ which is the
>language that most of the software development corporations in the world
>are using. If it isn't obvious how easy the decision is to go with C++ over
>COBOL and Pascal then your bank account will show the error you made!

Robert never said the person should learn COBOL, and he certainly never
said the person should learn COBOL to the exclusion of all else.  He
just said the person should learn "programming", and that C/C++ is
neither the only language used, nor is it the last word on programming.

You seemed to react quite unreasonably and vehemently to the mere
*mention* of COBOL.  To this I just have one thing to say:

   COBOL  COBOL  COBO   COBOL  C      CO
   C      C   L  C   L  C   L  C      CO
   C      C   L  COBOL  C   L  C      CO
   C      C   L  C   L  C   L  C
   COBOL  COBOL  COBO   COBOL  COBOL  CO

:-)

-Jeff

steidl@centuryinter.net - http://www.dont.i.wish.com/
All opinions are my own, and are subject to change without notice.






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-20  0:00                 ` Jon Bell
@ 1996-07-20  0:00                   ` Robert Dewar
  1996-07-21  0:00                     ` Alexander Vrenios
  1996-07-21  0:00                   ` Steve Tate
                                     ` (2 subsequent siblings)
  3 siblings, 1 reply; 688+ messages in thread
From: Robert Dewar @ 1996-07-20  0:00 UTC (permalink / raw)



Jon Bell said

"I do appreciate the counter-argument that it's asking a lot of students
to master both program design skills and language syntax at the same
time.  Therefore I respect the decisions made by schools that use a
"non-marketable" language such as Modula-"n" or Scheme ...

And what makes you think that Modula-"n" does not have a language syntax?
Even Scheme has a syntax, but it is a simple one. Modula-"n" is a fully
developed procedural language with a syntax not particularly simpler
than obviously comparable languages.

The motivation for teaching Modula-"n" or Scheme is not to avoid teaching
syntax (although it is true that Scheme reduces the syntactic burden, that
is not a primary reason for introducing it as a first language; actually,
in practice Scheme is almost never introduced to anyone as a first
language -- at schools where Scheme is taught in the equivalent of CS1,
most students already know how to program in some other language).





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
       [not found] <sperlman-0507961717550001@p121.ezo.net>
                   ` (6 preceding siblings ...)
  1996-07-21  0:00 ` Laurent Guerby
@ 1996-07-21  0:00 ` Wayne
  1996-07-22  0:00 ` Darin Johnson
                   ` (12 subsequent siblings)
  20 siblings, 0 replies; 688+ messages in thread
From: Wayne @ 1996-07-21  0:00 UTC (permalink / raw)



Darin Johnson wrote:
> 
> > Also keep in mind that this rather lengthy diatribe was comparing c with the
> > 'standard' pascal, not what it has become.  Today's pascal is as
> > different from what was being discussed as today's c++ is from the old c.
> 
> True, Pascal has been mostly subsumed by Modula II and III.  These are
> nice languages, and you can do real-world and systems programming in
> them.  They're not as popular (you probably have to go commercial to
> get a compiler).
> --
> Darin Johnson
> djohnson@ucsd.edu       O-
>        Support your right to own gnus.

Isn't there a GNU Pascal compiler?  I think it also supports some Borland
extensions to the ISO standard.


Wayne
-- 
Hit any user to continue.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-20  0:00                 ` Jon Bell
  1996-07-20  0:00                   ` Robert Dewar
@ 1996-07-21  0:00                   ` Steve Tate
  1996-07-21  0:00                     ` Robert Dewar
  1996-07-21  0:00                     ` Phil Howard
  1996-07-22  0:00                   ` Stephen M O'Shaughnessy
  1996-07-25  0:00                   ` ++           robin
  3 siblings, 2 replies; 688+ messages in thread
From: Steve Tate @ 1996-07-21  0:00 UTC (permalink / raw)



Jon Bell (jtbell@presby.edu) wrote:

>  Robert Dewar <dewar@cs.nyu.edu> wrote:
> >  You cannot express algorithms unless you
> >use a language to express them in, and for my taste, a well chosen 
> >programming language is as good a choice as anything.

> For *my* taste, a real programming language is *better*, because you can 
> test the correctness of your solution by executing it.  Try *that* with 
> pseudocode or data flow diagrams!  :-)

I believe that pseudo-code is better for getting across general
principles without getting bogged down in implementation details.  As
another reason to promote pseudocode, I suggest in a half-serious way
that it is less likely to have people say things like this last
poster.  To explain that a little better: you can never, EVER "test
the correctness" of an *algorithm* by executing it.  I don't care how
you design your test cases, the ONLY way to show the correctness of an
algorithm is with a formal mathematical proof.  Of course,
implementations are good for demonstrating the INcorrectness of some
algorithms!  :-)

--
Steve Tate  ---  srt@cs.unt.edu | "As often as a study is cultivated by narrow
Dept. of Computer Sciences      |  minds, they will draw from it narrow
University of North Texas       |  conclusions."  -- John Stuart Mill, 1865.
Denton, TX  76201               | 




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-20  0:00                   ` Robert Dewar
@ 1996-07-21  0:00                     ` Alexander Vrenios
  0 siblings, 0 replies; 688+ messages in thread
From: Alexander Vrenios @ 1996-07-21  0:00 UTC (permalink / raw)



   C. Check the job ads in your Sunday paper and count those that are
looking for C programmers and those that are looking for Pascal skills.






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
       [not found] <sperlman-0507961717550001@p121.ezo.net>
                   ` (5 preceding siblings ...)
  1996-07-19  0:00 ` Andrew Gierth
@ 1996-07-21  0:00 ` Laurent Guerby
  1996-07-22  0:00   ` Stephen M O'Shaughnessy
  1996-07-21  0:00 ` Wayne
                   ` (13 subsequent siblings)
  20 siblings, 1 reply; 688+ messages in thread
From: Laurent Guerby @ 1996-07-21  0:00 UTC (permalink / raw)



Robert> Well of course, Andy may be a brilliant exception, but most
Robert> people who learn "Ada in a couple of weeks" on their own with
Robert> a background like his tend to write just terrible Ada, having
Robert> completely misunderstood the critical use of abstraction, the
Robert> right way to use exceptions, the right way to use generics,

   I would say it depends on what they were looking for in
learning Ada. If a (C/Fortran/Basic) programmer feels that his
language is not enough to build good code in a satisfactory way and is
looking for new SE principles in learning Ada, then there's a chance
that the misunderstanding won't be that systematic. And remember that
very good and *free* Ada material is available all over the
net. Example: the Ada Quality & Style guide is full of very interesting
discussions (pros & cons, more interesting than the guidelines most of the
time) on good use of Ada features. (Aren't you a reviewer of this
document? ;-). URL:

   http://sw-eng.falls-church.va.us/AdaIC/standards/Welcome.html
   (or something near this URL)

Robert> and probably I would guess that they only know a small subset
Robert> of the language (typically not including the annexes for
Robert> example).

   The advantage of the annexes in Ada 95 is that you don't have to
learn them if you don't use them; that's not "subsetting". I guess the
right example and common case is learning Ada without tasking.

   (Knowing all Ada 95 annexes means that you know: 
Annex A: all the core standard libraries (IO, Strings, ...)
Annex B: all of C, Fortran, COBOL languages
Annex C: all of your machine  (interrupts, assembly)
Annex D: all of priority scheduling (RMS, Ceiling, ...)
Annex E: all of distributed systems programming
Annex F: all of COBOL pictures strings editing and decimal arithmetic
Annex G: all of numerics (accuracy, complex arithmetic)
Annex H: all of safety critical system concerns (implementation choices)
Annex J: all of Ada 83 obsolescent features
... definitely a large amount of knowledge and experience!)

Robert> Learning a language is more than learning where the semicolons
Robert> go!

   Right (since most languages come with a philosophy of programming,
especially Ada in the realm of SE).

-- 
Laurent Guerby <guerby@gnat.com>, Team Ada.
   "Use the Source, Luke. The Source will be with you, always (GPL)."




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-21  0:00                   ` Steve Tate
  1996-07-21  0:00                     ` Robert Dewar
@ 1996-07-21  0:00                     ` Phil Howard
  1996-07-21  0:00                       ` Robert Dewar
  1 sibling, 1 reply; 688+ messages in thread
From: Phil Howard @ 1996-07-21  0:00 UTC (permalink / raw)



On 21 Jul 1996 19:24:07 GMT Steve Tate (srt@zaphod.csci.unt.edu) wrote:

| I believe that pseudo-code is better for getting across general
| principles without getting bogged down in implementation details.  As
| another reason to promote pseudocode, I suggest in a half-serious way
| that it is less likely to have people say things like this last
| poster.  To explain that a little better: you can never, EVER "test
| the correctness" of an *algorithm* by executing it.  I don't care how
| you design your test cases, the ONLY way to show the correctness of an
| algorithm is with a formal mathematical proof.  Of course,
| implementations are good for demonstrating the INcorrectness of some
| algorithms!  :-)

Even that isn't right.  I can send you bad code and _claim_ it to be an
implementation of _any_ algorithm, thus disproving it.  Not!

There are multiple issues in programming that one needs to deal with.
There are abstract concepts (what data structure do I need to use?)
and concrete concepts (how do I format the output to be readable?)
involved.  Programming is essentially the bringing together of these
concepts.  The program is thus the bridge between the abstract and the
concrete.  You need to learn and understand both.

Programming languages are just tools.  The right tools for the job do
work better, but you have to know how to choose the correct tool and
also how to use that correct tool in the correct way.  Often these things
are not yet answered for what you may need to do.  Thus you need to look
at programming also as a problem solving situation and that takes a good
foundational understanding of all aspects.  That includes the abstract,
the concrete, and the tools for making programs that are the bridges.

--
Phil Howard KA9WGN   +-------------------------------------------------------+
Linux Consultant     |  Linux installation, configuration, administration,   |
Milepost Services    |  monitoring, maintenance, and diagnostic services.    |
phil@milepost.com    +-------------------------------------------------------+




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-21  0:00             ` Rich Maggio
@ 1996-07-21  0:00               ` Robert Dewar
  0 siblings, 0 replies; 688+ messages in thread
From: Robert Dewar @ 1996-07-21  0:00 UTC (permalink / raw)



"> Programmers who do not assembly language are dangerous,
> because they do not fundamentally understand what the
> compiler is generating."

What I tell my students is that learning assembly language (or rather
machine architecture and machine language, which is really the issue) is
like learning how internal combustion engines and other elements of a car
work. Even if you do not plan to become an auto mechanic, you will find
that this knowledge is very useful, both from the point of view of allowing
you to figure out what might be wrong, and in understanding what the
auto-mechanic has to say when something really does go wrong.





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-21  0:00                   ` Steve Tate
@ 1996-07-21  0:00                     ` Robert Dewar
  1996-07-21  0:00                     ` Phil Howard
  1 sibling, 0 replies; 688+ messages in thread
From: Robert Dewar @ 1996-07-21  0:00 UTC (permalink / raw)



Steve Tate said

"I believe that pseudo-code is better for getting across general
principles without getting bogged down in implementation details.  As
another reason to promote pseudocode, I suggest in a half-serious way
that it is less likely to have people say things like this last
poster.  To explain that a little better: you can never, EVER "test
the correctness" of an *algorithm* by executing it.  I don't care how
you design your test cases, the ONLY way to show the correctness of an
algorithm is with a formal mathematical proof.  Of course,
implementations are good for demonstrating the INcorrectness of some
algorithms!  :-)"


But you certainly cannot prove the correctness of an algorithm written
in pseudo-code; at best you could produce an informal proof. If you
really want to write proofs of correctness, then you need to write
not just in a programming language, but in one whose semantics are
rigorously understood and formally defined.





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-20  0:00             ` steidl
@ 1996-07-21  0:00               ` Andy Askey
  0 siblings, 0 replies; 688+ messages in thread
From: Andy Askey @ 1996-07-21  0:00 UTC (permalink / raw)



steidl@centuryinter.net wrote:

>If you think you learned C++ (and OOP) in a month after only knowing
>Fortran, then either you are fooling yourself, or you're a hell of a
>lot smarter than me or anyone else I know.  Most of the syntax can be
>learned in a week or less.  But OOP (and some of the more obscure
>facets of C++) take much longer.


>-Jeff

>steidl@centuryinter.net - http://www.dont.i.wish.com/
>All opinions are my own, and are subject to change without notice.


Jeff,
I find OOD/OOP more of a state of mind than anything mystical and not
immediately apparent.  I guess I have also looked at all things
objectively.  System design and coding is an art form, and it really
does not take a lot of brains.  It is like the 5-year-old who can play
concert piano.  The kid ain't necessarily smart; he/she can just play
piano.  Don't make OOP out to be rocket science.

The really amazing achievement, as I see it, is the fact that I was able
to write Fortran code for 5 years without going nuts.

--
May your karma be excellent for forgiving my spelling mishaps.

Andy Askey
ajaskey@gnn.com





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-20  0:00           ` Tim Behrendsen
@ 1996-07-21  0:00             ` Rich Maggio
  1996-07-21  0:00               ` Robert Dewar
  1996-07-22  0:00             ` Ralph Silverman
  1 sibling, 1 reply; 688+ messages in thread
From: Rich Maggio @ 1996-07-21  0:00 UTC (permalink / raw)



> That's like saying that since most Electronic Engineers use
> ICs, it's not necessary to learn the fundamentals of resistors,
> capacitors, and transistors.

Amen!

> Programmers who do not know assembly language are dangerous,
> because they do not fundamentally understand what the
> compiler is generating. They believe in "The Myth of the
> Optimizing Compiler", that "compilers are so good nowadays
> that you don't have to worry about writing efficient code.
> If you do have to worry, then get a better compiler."

An additional thought along these lines.  A programmer should be VERY familiar with the inner workings of the 
OS that they are working with.  When working in a multitasking environment, it is important for the programmer 
(software engineer) to understand just how the multitasking is achieved in the OS.  With this understanding, 
efficient and bug-free code can be written.  Programmers who see multitasking as "some magic thing that just 
happens" are pretty dangerous.  I have seen numerous bugs that can be explained and understood very easily with an 
understanding of what the OS is really doing under the hood.
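
A minimal sketch of the kind of bug I mean (a toy example, using POSIX
threads purely for concreteness): two threads bump a shared counter with
no lock.  "counter++" is really a load, an add, and a store, and the
scheduler is free to preempt a thread between those steps, so updates
get lost.  Obvious once you know how the OS actually multitasks;
mystifying if you think it is magic.

/* sketch: a lost-update race between two threads */
#include <pthread.h>
#include <stdio.h>

static long counter = 0;

static void *worker(void *arg)
{
    long i;
    (void)arg;
    for (i = 0; i < 1000000; i++)
        counter++;              /* unprotected read-modify-write */
    return NULL;
}

int main(void)
{
    pthread_t t1, t2;
    pthread_create(&t1, NULL, worker, NULL);
    pthread_create(&t2, NULL, worker, NULL);
    pthread_join(t1, NULL);
    pthread_join(t2, NULL);
    printf("counter = %ld\n", counter);  /* usually well under 2000000 */
    return 0;
}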

Rich Maggio





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-21  0:00                     ` Phil Howard
@ 1996-07-21  0:00                       ` Robert Dewar
  1996-07-22  0:00                         ` Steve Tate
  0 siblings, 1 reply; 688+ messages in thread
From: Robert Dewar @ 1996-07-21  0:00 UTC (permalink / raw)



Steve said

"| poster.  To explain that a little better: you can never, EVER "test
| the correctness" of an *algorithm* by executing it.  I don't care how
| you design your test cases, the ONLY way to show the correctness of an
| algorithm is with a formal mathematical proof.  Of course,
| implementations are good for demonstrating the INcorrectness of some
| algorithms!  :-)"

Well it is a reasonable guess that Steve is pretty new to the field and
is repeating what he has been taught. But still, this concentration on
correctness seems a mistake to me, and it particularly seems a mistake
to concentrate on correctness to the exclusion of pragmatic understanding
of the real issues.

Steve, you need to understand that correctness is just one aspect of a 
program. Correct programs are not necessarily usable or acceptable, and
incorrect programs are often both usable and acceptable. There are two
issues here.

Correctness only reflects the match between a program and its spec (Note: I
know we had a discussion about the confusion caused by taking a general
term like correctness and using it in this restricted sense, but that is clearly
what Steve is talking about when he talks about a mathematical proof, since
clearly you cannot prove a program correct if you use the more general
informal meaning of correct :-)

But:

 a) how do you know the spec is correct? (now we are using the informal sense
    of the term -- since sooner or later you have to track back to the real
    world, and at the point you do, the notion of mathematical proof fails
    you.)

 b) what if the spec is not formalizable? ("make the screen appearance 
    aesthetically pleasing", "generate good error messages", etc.) Such
    specs can *only* be realized via testing, since they are subjective,
    but often subjective requirements like this are as important as
    anything else (is the interface of Windows 95 correct?????)

Proof of correctness is an important tool, but accepting the orthodoxy that
it is the only useful approach and that testing of implementations is
useless reflects a rather large gap between your thinking and the real world!





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-22  0:00             ` Tim Oxler
  1996-07-22  0:00               ` Stig Norland
  1996-07-22  0:00               ` Janus
@ 1996-07-22  0:00               ` Robert Dewar
  1996-07-30  0:00                 ` Tim Behrendsen
  1996-07-31  0:00                 ` Patrick Horgan
  2 siblings, 2 replies; 688+ messages in thread
From: Robert Dewar @ 1996-07-22  0:00 UTC (permalink / raw)



Tim Oxler quoted:

Sentry Market Research surveyed 700 IS managers about what language they used
for client/server application development:

Visual Basic    23%
Cobol           21%
C++             18%
C               15%

and note that client/server applications probably have a lower percentage
of COBOL than all applications, because there are still lots of 
traditional batch programs being generated in IS shops in COBOL.





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-20  0:00                 ` TRAN PHAN ANH
@ 1996-07-22  0:00                   ` Ralph Silverman
  0 siblings, 0 replies; 688+ messages in thread
From: Ralph Silverman @ 1996-07-22  0:00 UTC (permalink / raw)



TRAN PHAN ANH (anh@kuhub.cc.ukans.edu) wrote:
: In article <dewar.837815529@schonberg>, dewar@cs.nyu.edu (Robert Dewar) writes:
: > "
: >         Never, never, never try to start learning a language before you
: >         learn how to program.  A good algorithm simply cannot be replaced,
: >         and learning how to write an algorithm is in programming, not
: >         in learning a language.  You can sit down and read a hundred books
: >         about how to use pointers and linked lists in c++, and you still
: >         won't know how to use them in a good manner, if at all."
: > 
: > 
: > I am very familiar with the highly unusual approach Georgia Tech takes, but
: > I find the above remark rubbish. You cannot express algorithms unless you
: > use a language to express them in, and for my taste, a well chosen 
: > programming language is as good a choice as anything.

: Rubbish....absolutely.  I know some friendly dudes who know all the bloody 
: O(n) for every algorithm, but as soon as they sit down to implement 
: one...oh oops...core dump.   Oh, to make matters worse, they choose algorithms 
: based solely on O(n).

: Anh 

--
***********begin r.s. response***************

	(theory vs.  practice)

	obviously...
	these are interrelated and interdependent!

	(since "the beginning of time"...)

	stone throwing like this...
	in either direction...
	is unintelligent!

***********end r.s. response*****************
Ralph Silverman
z007400b@bcfreenet.seflin.lib.fl.us





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-22  0:00             ` Tim Oxler
@ 1996-07-22  0:00               ` Stig Norland
  1996-07-22  0:00               ` Janus
  1996-07-22  0:00               ` Robert Dewar
  2 siblings, 0 replies; 688+ messages in thread
From: Stig Norland @ 1996-07-22  0:00 UTC (permalink / raw)
  To: Tim Oxler


In article <4svvjf$c3i@news1.i1.net>, troxler@i1.net (Tim Oxler) writes:

[snip]

> Sentry Market Research surveyed 700 IS managers about what language they used
> for client/server application development:
> 
> Visual Basic	23%
> Cobol		21%
> C++		18%
> C		15%
> 4GL		15%
> Other		  8%
> 
> Key findings:
> 
> The use of languages has doubled since 1993.
> 
> The fastest growing are Cobol and VB.
> 
> 4GLs, Object Cobol, and C++ are the highest rated.
> 
> The use of model driven development tools is growing.
> 
> VB's strength is in graphical tool development, but placed in the
> middle of the ratings list.
> 
> 
> Tim Oxler

And Delphi wasn't in the survey at all???





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-20  0:00                 ` Jon Bell
  1996-07-20  0:00                   ` Robert Dewar
  1996-07-21  0:00                   ` Steve Tate
@ 1996-07-22  0:00                   ` Stephen M O'Shaughnessy
  1996-07-25  0:00                   ` ++           robin
  3 siblings, 0 replies; 688+ messages in thread
From: Stephen M O'Shaughnessy @ 1996-07-22  0:00 UTC (permalink / raw)



In article <Duuoxq.IK1@presby.edu>, jtbell@presby.edu says...
>I do appreciate the counter-argument that it's asking a lot of students 
>to master both program design skills and language syntax at the same 
>time.  Therefore I respect the decisions made by schools that use a
>"non-marketable" language such as Modula-"n" or Scheme as their first 
>programming language.  I think it is a reasonable strategy to start with 
>such a language, then switch to a "real" language later, in the context 
>of a four-year degee program.  Of course, students may need to be 
>persuaded that this is actually worthwhile!
>

What separates a 'real' language from a (fake?) language? Would not the students still have to 
learn the syntax of even pseudocode?  Is a 'non-marketable' language inherent in all 
programmers?  Because I can program in assembler, Fortran, Forth, HP Basic, Borland C and Ada, 
but I have never written a single line of Modula-"n" or Scheme.  All the pseudocode I have ever 
written was made up of bits and pieces of the aforementioned languages, i.e., I have never 
used a for loop in normal conversation, written or spoken.





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-21  0:00                       ` Robert Dewar
@ 1996-07-22  0:00                         ` Steve Tate
  0 siblings, 0 replies; 688+ messages in thread
From: Steve Tate @ 1996-07-22  0:00 UTC (permalink / raw)



Robert Dewar (dewar@cs.nyu.edu) wrote:
> Steve said

> "| poster.  To explain that a little better: you can never, EVER "test
> | the correctness" of an *algorithm* by executing it.  I don't care how
> | you design your test cases, the ONLY way to show the correctness of an
> | algorithm is with a formal mathematical proof.  Of course,
> | implementations are good for demonstrating the INcorrectness of some
> | algorithms!  :-)"

> Well it is a reasonable guess that Steve is pretty new to the field and
> is repeating what he has been taught. 

Ummm... no.  I've been programming for about 18 years, and have had a
Ph.D. and been teaching for about 5.

> But still, this concentration on
> correctness seems a mistake to me, and it particularly seems a mistake
> to concentrate on correctness to the exclusion of pragmatic understanding
> of the real issues.

> Steve, you need to understand that correctness is just one aspect of a 
> program. Correct programs are not necessarily usable or acceptable, and
> incorrect programs are often both usable and acceptable. There are two
> issues here.

And you seem to have totally missed what I said.  Nowhere (not even
once) did I say anything about proving that a *program* was correct.
I was talking about correctness proofs for *algorithms*.  Take, for
example, Prim's algorithm for finding a minimum spanning tree.  There
are a huge number of different ways you can implement this, using
different data structures or different languages, but the abstract
sequence of steps that makes it Prim's algorithm is very well
defined.  Furthermore, you can formally prove the correctness of the
algorithm which, of course, doesn't depend in the least on
implementation details such as exact data structures of programming
language.  I think using a real programming language to describe
Prim's algorithm would be a mistake because it wouldn't clearly
emphasize that these details are not relevant to what "Prim's
algorithm" really is.

Let me give you another example that better illustrates what I said
about an implementation telling you nothing about the correctness of
an algorithm.  There is a proposed primality testing algorithm (based
on a sequence of numbers called the "Perrin Sequence") which was
recently (June 1996) described in Scientific American.  The algorithm
has been known for about 100 years, but no one has been able to either
prove that it works correctly or give a counterexample.  There have
been implementations, and for almost a century all empirical tests
have shown that the algorithm works.  Until a couple of weeks ago.
That's when a counterexample was found.  So I repeat what I said
before: you can tell absolutely nothing about the correctness of an
algorithm by testing --- the only thing that reasonably shows the
correctness of an algorithm is a good mathematical proof.
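
For the curious, here is a sketch of the test as I understand it from
the column (my own throwaway code, not anyone's reference
implementation).  The Perrin sequence is P(0)=3, P(1)=0, P(2)=2,
P(n) = P(n-2) + P(n-3), and the conjecture was that n > 1 divides P(n)
exactly when n is prime.  Computing P(n) mod n keeps the numbers small;
a composite n that passes is precisely the kind of counterexample that
was found.

/* sketch of the Perrin divisibility test */
#include <stdio.h>

/* returns P(n) mod n for n >= 2, computed iteratively */
long perrin_mod(long n)
{
    long p0 = 3 % n, p1 = 0, p2 = 2 % n;   /* P(0), P(1), P(2) mod n */
    long i, next;

    for (i = 3; i <= n; i++) {
        next = (p0 + p1) % n;               /* P(i) = P(i-2) + P(i-3) */
        p0 = p1;
        p1 = p2;
        p2 = next;
    }
    return p2;                              /* P(n) mod n */
}

int main(void)
{
    long n;
    for (n = 2; n <= 30; n++)
        if (perrin_mod(n) == 0)
            printf("%ld passes the Perrin test\n", n);
    return 0;
}

Running it for ever-larger n never proves the conjecture; the
counterexample is what finally settled it.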

> Correctness only reflects the match between a program and its spec (Note: I
> know we had a discussion about the confusion caused by taking a general
> term like correctness and using it in this restricted sense, but that is clearly
> what Steve is talking about when he talks about a mathematical proof, since
> clearly you cannot prove a program correct if you use the more general
> informal meaning of correct :-)

Again, I was talking about algorithms, not writing a program "to
spec".  When you talk about algorithm, you are normally given a clear,
unambiguous, mathematical description of what needs to be computed.
Proving correctness is clearly desirable in such a situation.

--
Steve Tate  ---  srt@cs.unt.edu | "As often as a study is cultivated by narrow
Dept. of Computer Sciences      |  minds, they will draw from it narrow
University of North Texas       |  conclusions."  -- John Stuart Mill, 1865.
Denton, TX  76201               | 




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-21  0:00 ` Laurent Guerby
@ 1996-07-22  0:00   ` Stephen M O'Shaughnessy
  0 siblings, 0 replies; 688+ messages in thread
From: Stephen M O'Shaughnessy @ 1996-07-22  0:00 UTC (permalink / raw)



All this rhetoric is fine as far as it goes, BUT, for me it is not what you can write, but what you 
can read.  It does not matter how many languages you know.  It does not really matter what style you 
use.  In most cases it does not matter which language you apply to which problem.  Problem spaces 
change, evolve.  Sooner or later you or someone else will look at your code and want to change it.  
In most cases this occurs before the product is even delivered.  Most institutions have procedures 
to review code.  From turning in assignments at school to walk-throughs and peer reviews in 
industry, someone is looking at your code.  This is the motivation behind Ada, the human aspect.  
And Ada wins, hands down.  If you want/need to learn a language, Ada has it all.  Object 
Orientation, all the constructs, availability, understandability, and most important for new 
programmers -- readability.





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-22  0:00                   ` steidl
@ 1996-07-22  0:00                     ` Stephen M O'Shaughnessy
  1996-07-23  0:00                       ` Richard A. O'Keefe
  0 siblings, 1 reply; 688+ messages in thread
From: Stephen M O'Shaughnessy @ 1996-07-22  0:00 UTC (permalink / raw)



In article <4suk39$9h2@news.ld.centuryinter.net>, steidl@centuryinter.net says...
>However, I think that human novels are
>generally harder to translate than computer programs because a
>computer program is self-contained (if you include the language and
>libraries), whereas a novel draws on an entire culture.  Translating
>the novel effectively often means having intimate knowledge of two
>cultures and being able to make mappings (which are sometimes very
>subtle) from one to the other.  Some really difficult problems
>(difficult even for very intelligent humans) can arise when trying to
>perform this task.  To see examples of such, you should read "Godel,
>Escher, Bach:  An Eternal Golden Braid" by Douglas Hofstadter.
>
>BTW, everyone who likes to program should read this book, IMHO.
>
Most people believe the Bible to be the inerrant word of God.  For a real eye-opener, read two 
versions side by side, say the King James and the Living Bible.

Sorry for straying off the subject.





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-20  0:00         ` TRAN PHAN ANH
                             ` (2 preceding siblings ...)
  1996-07-20  0:00           ` Andy Askey
@ 1996-07-22  0:00           ` Stephen M O'Shaughnessy
  1996-07-23  0:00             ` TRAN PHAN ANH
  3 siblings, 1 reply; 688+ messages in thread
From: Stephen M O'Shaughnessy @ 1996-07-22  0:00 UTC (permalink / raw)



In article <1996Jul20.124025.122789@kuhub.cc.ukans.edu>, anh@kuhub.cc.ukans.edu 
says...
>
>Absolutely right.
>
>But Pascal or C was the original question.  Start with C is what I say.  
>Better yet, why not C++ then move on to JAVA?
>
>Besides, if you can master C/C++, and JAVA, it will take you 5 min. to learn 
>Pascal.  Actually, if one has a solid foundation in programming techniques and
>a solid understanding of one or two languages, one can acquire a working 
>knowledge of any language in no time.
>
>Anh
>

Yes, but "Hello World" is not a 'real' program. 8}






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-22  0:00                 ` Jeremy Nelson
@ 1996-07-22  0:00                   ` Stephen M O'Shaughnessy
  0 siblings, 0 replies; 688+ messages in thread
From: Stephen M O'Shaughnessy @ 1996-07-22  0:00 UTC (permalink / raw)



In article <4t01v4$8a6@news.inc.net>, nelson@cs.uwp.edu says...

>I think you missed your attributions here -- the only post I made in this
>thread was on an entirely different branch (the posting I made has not been 
>followed up to yet, even), so just a friendly reminder to go back and make
>sure that you don't get me into a flame war that I never really intended to
>get into. ;-)
>
You are correct, I misquoted you.  Sorry.





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-20  0:00           ` Tim Behrendsen
  1996-07-21  0:00             ` Rich Maggio
@ 1996-07-22  0:00             ` Ralph Silverman
  1996-07-23  0:00               ` Tim Behrendsen
  1 sibling, 1 reply; 688+ messages in thread
From: Ralph Silverman @ 1996-07-22  0:00 UTC (permalink / raw)



Tim Behrendsen (tim@airshields.com) wrote:
: Scott McMahan - Softbase Systems <softbase@mercury.interpath.com> wrote in
: article <4sokr1$4c9@news.interpath.net>...
: > Patrick Horgan (patrick@broadvision.com) wrote:
: > 
: > 
: > : and five or six assemblers for the same reason.
: > 
: > I'd rather just learn C and port it. Asm isn't as important
: > anymore now that there's so many different platforms.
: > 
: > Scott

: Big disagreement ... assembler is the most critical thing any
: programmer can learn.  Not because you're going to use it
: every day, but because it will teach you more about what's
: *really* going on than 10 high-level language classes.

: That's like saying that since most Electronic Engineers use
: ICs, it's not necessary to learn the fundamentals of resistors,
: capacitors, and transistors.

: Programmers who do not know assembly language are dangerous,
: because they do not fundamentally understand what the
: compiler is generating. They believe in "The Myth of the
: Optimizing Compiler", that "compilers are so good nowadays
: that you don't have to worry about writing efficient code.
: If you do have to worry, then get a better compiler."

--
***********begin r.s. response*************

	a) start low and go high...
	b) start high and go low...

	common sense tells us either might work...
	experience tells us each has...

***********end r.s. response***************
Ralph Silverman
z007400b@bcfreenet.seflin.lib.fl.us





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-22  0:00             ` Tim Oxler
  1996-07-22  0:00               ` Stig Norland
@ 1996-07-22  0:00               ` Janus
  1996-07-22  0:00               ` Robert Dewar
  2 siblings, 0 replies; 688+ messages in thread
From: Janus @ 1996-07-22  0:00 UTC (permalink / raw)



With nothing better to do on Mon, 22 Jul 1996 13:21:32 GMT,
troxler@i1.net (Tim Oxler) dropped 2 cents in the slot and wrote:

<snip>
>Sentry Market Research surveyed 700 IS managers about what language they used
>for client/server application development:
>
<snip>
>
>The use of languages has doubled since 1993.
>
Maybe I'm particularly dense tonight - but what does that sentence
mean ? 


Bye from
Janus at Kerry, Ireland
email : jab@iol.ie
WWW : http://www.iol.ie/~jab





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-19  0:00             ` Jason Alan Turnage
  1996-07-19  0:00               ` Robert Dewar
@ 1996-07-22  0:00               ` Stephen M O'Shaughnessy
  1996-07-22  0:00                 ` Jeremy Nelson
  1 sibling, 1 reply; 688+ messages in thread
From: Stephen M O'Shaughnessy @ 1996-07-22  0:00 UTC (permalink / raw)



In article <4sord0$l0k@solaria.cc.gatech.edu>, turnage@cc.gatech.edu says...

>        Never, never, never try to start learning a language before you
>        learn how to program.  A good algorithm simply cannot be replaced,
>        and learning how to write an algorithm is in programming, not
>        in learning a language.  You can sit down and read a hundred books
>        about how to use pointers and linked lists in c++, and you still
>        won't know how to use them in a good manner, if at all.
>
I am with Mr. Dewar on this one.  What Jeremy is saying is like saying learn theology or 
philosophy before you learn to read or write or even speak!  You can't understand the 
concepts of any discipline until you learn the language that describes that discipline.

There is some validity in Mr. Nelson's statements.  Don't get caught up in the language 
war when trying to understand the concepts of programming.  It is like arguing the merits of 
Greek or Hebrew in understanding Christianity.  It misses the whole point.  But you must 
have at least one language to even begin.

So for the original poster, does Mr. Nelson have a reference for this non-language-
specific programming?





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-20  0:00           ` Should I learn C or Pascal? Robert Dewar
@ 1996-07-22  0:00             ` TRAN PHAN ANH
  1996-07-23  0:00             ` Ken Garlington
  1 sibling, 0 replies; 688+ messages in thread
From: TRAN PHAN ANH @ 1996-07-22  0:00 UTC (permalink / raw)



In article <dewar.837900271@schonberg>, dewar@cs.nyu.edu (Robert Dewar) writes:
> Tran said
> 
> "Besides, if you can master C/C++, and JAVA, it will take you 5 min. to learn
> Pascal.  Actually, if one has a solid foundation in programming techniques and
> a solid understanding of one or two languages, one can acquire a working
> knowledge of any language in no time."
> 
> Even allowing for a reasonable amount of rhetorical exaggeration, this is
> false. First of all, there is no language C/C++; they are two quite

This is nitpicking, buddy.  Let's replace / with and, and say C and C++.

> separate languages, and it definitely is NOT the case that if you have
> learned one language you can learn another in five minutes. Pretty

6, make that 6 minutes...

> quickly, sure, but not in five minutes; for example, the notion of 
> non-deterministic semantics for "and" in Pascal will be quite unfamiliar
> to a C (or for that matter C++) programmer, and there are always enough
> fine points like this to make it more than a trivial matter to become
> a knowledgeable programmer in a new language.

Add an extra minute or 2 for unexpected problems. :-) Besides, I never 
claimed that you will become an expert in the language.  I said "a 
working knowledge".

Loosen up, dude... :-)

Anh




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-22  0:00               ` Stephen M O'Shaughnessy
@ 1996-07-22  0:00                 ` Jeremy Nelson
  1996-07-22  0:00                   ` Stephen M O'Shaughnessy
  0 siblings, 1 reply; 688+ messages in thread
From: Jeremy Nelson @ 1996-07-22  0:00 UTC (permalink / raw)



In article <Duy1nn.7t5@most.fw.hac.com>,
>>
>I am with Mr. Dewar on this one.  What Jeremy is saying is like saying learn theology or 
>philosophy before you learn to read or write or even speak!  You can't understand the 
>concepts of any discipline until you learn the language that describes that discipline.

I think you missed your attributions here -- the only post I made in this
thread was on an entirely different branch (the posting I made has not been 
followed up to yet, even), so just a friendly reminder to go back and make
sure that you don't get me into a flame war that I never really intended to
get into. ;-)

>So, for the original poster, does Mr. Nelson have a reference for this non-language-specific 
>programming?


Please read my post again.  What I said was "You aren't really a programmer
until you learn to solve problems."  That does not preclude the ability to 
learn a computer language, but there is much more to programming than just
producing semantically correct programs.  You have to understand how to apply
the nuances of the _appropriate_ language to get the _appropriate_ answer.

FORTRAN is easy to learn, and works well for math computation problems.
C is more difficult to learn, but can be used for most pragmatic problems.
C++ is even more difficult to learn, but can be used for more abstract
   problems.
Lisp is easy to learn, and can be used for a very useful set of specific
   problems requiring a functional programming approach.
Pascal is trivially easy to learn, and that was the point.  Pure Pascal
   is limited in its usefulness, but several enhanced dialects of Pascal
   can be quite useful for what may be referred to as "user-land" problems.
   I don't believe Pascal is an appropriate tool for system programming.

I wouldn't write space shuttle software in Lisp, but FORTRAN has done very 
well in this respect, for many years.

As I said in my (one and only) post, 

"Not every language solves every problem.  Programming is knowing which 
language to use at what time." And I might also add "And you also have
to know how to use that language to get the computer to do the most
work for you in the least amount of human-time."

Programming is not about languages.  It's about problem-solving using computers.

Jeremy




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-19  0:00           ` johnf
                               ` (2 preceding siblings ...)
  1996-07-20  0:00             ` Tim Behrendsen
@ 1996-07-22  0:00             ` Ralph Silverman
  1996-07-23  0:00               ` Joe Gwinn
  1996-07-23  0:00             ` John A Hughes
  4 siblings, 1 reply; 688+ messages in thread
From: Ralph Silverman @ 1996-07-22  0:00 UTC (permalink / raw)



johnf (johnf@nando.com) wrote:
: In article <01bb7591$83087d60$87ee6fce@timpent.airshields.com>, "Tim
: Behrendsen" <tim@airshields.com> wrote:

: >Carlos DeAngulo <cdvi@msg.itg.ti.com> wrote in article
: ><01bb74ac$b7aa7860$7b91f780@deangulo>...
: >> You should definitely learn C/C++. The business world today uses C++ as
: >its
: >> power language to develop the finest applications. Don't let anyone guide
: >> you wrong.
: >
: >Not to start a flame war on C++, but all you newbie programmers
: >out there, don't believe everything you hear about C++.  Object
: >oriented programming has a lot of good concepts, but C++ is a bad
: >implementation of them.  Not that you shouldn't learn it, but
: >don't think it's the ultimate expression of what OOP is all about.
: >
: >C++: The PL/I of the 90s.

: OK

: I am one of these newbies.
: I haven't programmed anything, ever, with any language.
: I am currently learning C with the help of Dave Mark (Learn C on Mac) as
: my baptism into programming. 
: So, am I only learning C, and not "how to program"? I don't understand
: how the two can be exclusive.
: How does one learn how to be a "Good Programmer" without picking a
: language to learn first, learning it well, then learning others as they
: interest you? 
: I am not trying to be a wise guy, just a guy who can learn to program well
: enough to get out of his crappy job and into this (for me) exciting field
: as a career.
: I don't expect to start as the Sr. Developer on some project, I will
: happily slog it out in the trenches and pay my dues, just explain to me
: how to get there...

: Thank you,

: Johnf

: -- 
: johnf@nando.com

: Go Falcons

--
**************begin r.s. response****************

	poster admits to being inexperienced
	but manifests great common sense and
	good intelligence...

	the question is...
	is this adequate anymore...
	?

	the sense of frustration manifested
	is on the mark...
	what unseen and unadmitted factors
	are in operation here???

**************end r.s. response******************
Ralph Silverman
z007400b@bcfreenet.seflin.lib.fl.us





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
       [not found] <sperlman-0507961717550001@p121.ezo.net>
                   ` (7 preceding siblings ...)
  1996-07-21  0:00 ` Wayne
@ 1996-07-22  0:00 ` Darin Johnson
  1996-07-22  0:00 ` Darin Johnson
                   ` (11 subsequent siblings)
  20 siblings, 0 replies; 688+ messages in thread
From: Darin Johnson @ 1996-07-22  0:00 UTC (permalink / raw)



> Robert> Well of course, Andy may be a brilliant exception, but most
> Robert> people who learn "Ada in a couple of weeks" on their own with
> Robert> a background like his tend to write just terrible Ada, having
> Robert> completely misunderstood the critical use of abstraction, the
> Robert> right way to use exceptions, the right way to use generics,

This makes the assumption that the learner has never seen abstraction,
exceptions or generics before coming to Ada.  If the learner has seen
these before then Ada is just a matter of learning the syntax,
semantics and standard packages.  And even if not all of these have
been seen, having a decent brain gets over those hurdles (ie, a brain
that says "I don't understand generics fully, so I won't use them just
yet").
-- 
Darin Johnson
djohnson@ucsd.edu	O-
    Where am I?  In the village...  What do you want?  Information...




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-19  0:00           ` Jon Bell
@ 1996-07-22  0:00             ` Tim Oxler
  1996-07-22  0:00               ` Stig Norland
                                 ` (2 more replies)
  0 siblings, 3 replies; 688+ messages in thread
From: Tim Oxler @ 1996-07-22  0:00 UTC (permalink / raw)



jtbell@presby.edu (Jon Bell) wrote:

> Robert Dewar <dewar@cs.nyu.edu> wrote:
>>First, the business world uses many languages -- even today far more programs
>>are written in COBOL than in C and C++ combined by a very large margin.

>Just out of curiosity, how much *new* development takes place in COBOL, 
>as opposed to maintenance and extension of existing systems?   This does 
>not imply that I'm downgrading maintenance, by the way.  I've done some 
>of it (although not in COBOL), and I know that it can be a real challenge.

>-- 
>Jon Bell <jtbell@presby.edu>                        Presbyterian College
>Dept. of Physics and Computer Science        Clinton, South Carolina USA


Quite a bit.  ALL of my work is new development (12 new CICS pgms in
the past 2 months).  All of it, for my current client, has been on
mainframe.  Just because it's "legacy" doesn't mean that it's static.
Corporations are very dynamic.  Some systems move to client/server
(with the mainframe connected), and many are written in Cobol.

Here's the source: Infoworld, April 8, 1996, page 56.

Sentry Market Research surveyed 700 IS managers about what language they used
for client/server application development:

Visual Basic	23%
Cobol		21%
C++		18%
C		15%
4GL		15%
Other		  8%

Key findings:

The use of languages has doubled since 1993.

The fastest growing are Cobol and VB.

4GLs, Object Cobol, and C++ are the highest rated.

The use of model driven development tools is growing.

VB's strength is in graphical tool development, but it placed in the
middle of the ratings list.


Tim Oxler


TEO Computer Technologies Inc.
troxler@i1.net
http://www.i1.net/~troxler
http://users.aol.com/TEOcorp





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
       [not found] <sperlman-0507961717550001@p121.ezo.net>
                   ` (8 preceding siblings ...)
  1996-07-22  0:00 ` Darin Johnson
@ 1996-07-22  0:00 ` Darin Johnson
  1996-07-23  0:00 ` Darin Johnson
                   ` (10 subsequent siblings)
  20 siblings, 0 replies; 688+ messages in thread
From: Darin Johnson @ 1996-07-22  0:00 UTC (permalink / raw)



> Add an extra minute or 2 for unexpected problems. :-) Besides, I never 
> claimed that you will become an expert in the language.  I said "a 
> working knowledge".

I learned C during a lecture.  The prof let us do programs in either
Pascal or C, so during the lecture I would occasionally ask my friend
for the equivalents of certain constructs.  Then after class I sat
down and wrote my assignment in C, then alternated each assignment
between C and Pascal after that.  The knowledge I was lacking was in
the details, stuff like printf specifications.  Certainly no abstract
programming details were missing (like what's a pointer, and how to do
recursion, etc, although it was odd losing out on the modularization I
was used to).

Ok, it wasn't 6 minutes, but I didn't have a book or summary sheet to
go on, only whispers.  When I actually got around to reading K&R, much
of it was old hat.
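
The "details" really were that thin -- conversion specifiers, field widths,
that sort of thing.  Something like this throwaway sketch (not from any real
assignment) is about the extent of what I had to look up:

    #include <stdio.h>

    int main(void)
    {
        int    n = 42;
        double x = 3.14159;

        /* The conversion characters and field widths are the kind of
           detail you pick up from K&R, not from knowing Pascal's writeln. */
        printf("n = %d, x = %8.3f, hex = %#x\n", n, x, (unsigned)n);
        return 0;
    }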
-- 
Darin Johnson
djohnson@ucsd.edu	O-
    "Particle Man, Particle Man, doing the things a particle can"




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-20  0:00                 ` Robert Dewar
@ 1996-07-22  0:00                   ` steidl
  1996-07-22  0:00                     ` Stephen M O'Shaughnessy
  0 siblings, 1 reply; 688+ messages in thread
From: steidl @ 1996-07-22  0:00 UTC (permalink / raw)



In <dewar.837864318@schonberg>, dewar@cs.nyu.edu (Robert Dewar) writes:
>Peter says
>
[Lots of stuff talking about the limitations of C header files snipped.]
>
>Languages are more than just a set of syntax and semantic rules, they also
>come with a "typical style" of writing code. This style is partly a matter
>of tradition, but is strongly influenced by the design of the syntax and
>semantics.
>
[Some style stuff snipped.]
>
>One clue of the importance of these typical style issues is the difficulty
>of writing effective automatic translation programs from one language to
>another. It is not too hard to write a translator that translates one
>computer language to another, but to write one that takes a program written
>in the typical style of language X and produces an idiomatic program written
>in the typical style of language Y is extremely difficult. 
>
>In fact I would venture to guess that at this stage translation of natural
>languages from one to another is more successful than translation of
>computer programming languages. That's because the entire structure of
>a program can be altered as a result of the shift of styles.
>
>Note that this applies to a human too. Someone who knows French and 
>English really well can take a novel written in one language, and produce
>an acceptable idiomatic translation by translating sentence by sentence.

It sounds like you have some experience with translating computer
programs, but no real experience translating novels.  Human language
translation presents many of the same problems as computer program
translation does - i.e. different sentences may talk about the same
thing or be in the same style, or even talk about each other, and
translating them from one language to another without consistency
will result in a lackluster outcome.  Both programs and novels involve
a certain style and content.  However, I think that human novels are
generally harder to translate than computer programs because a
computer program is self-contained (if you include the language and
libraries), whereas a novel draws on an entire culture.  Translating
the novel effectively often means having intimate knowledge of two
cultures and being able to make mappings (which are sometimes very
subtle) from one to the other.  Some really difficult problems
(difficult even for very intelligent humans) can arise when trying to
perform this task.  To see examples of such, you should read "Godel,
Escher, Bach:  An Eternal Golden Braid" by Douglas Hofstadter.

BTW, everyone who likes to program should read this book, IMHO.

Of course, if you want to make computer program translation as hard as
translating novels, requiring the translator to translate the comments
from the context of the first language's community to that of the
second would be a start.


-Jeff

steidl@centuryinter.net - http://www.dont.i.wish.com/
All opinions are my own, and are subject to change without notice.






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-22  0:00           ` Stephen M O'Shaughnessy
@ 1996-07-23  0:00             ` TRAN PHAN ANH
  0 siblings, 0 replies; 688+ messages in thread
From: TRAN PHAN ANH @ 1996-07-23  0:00 UTC (permalink / raw)



In article <Duy3Iy.86A@most.fw.hac.com>, smosha@most.fw.hac.com (Stephen M O'Shaughnessy) writes:
> In article <1996Jul20.124025.122789@kuhub.cc.ukans.edu>, anh@kuhub.cc.ukans.edu 
> says...
>>
>>Absolutely right.
>>
>>But Pascal or C was the original question.  Start with C is what I say.  
>>Better yet, why not C++ then move on to JAVA?
>>
>>Besides, if you can master C/C++, and JAVA, it will take you 5 min. to learn 
>>Pascal.  Actually, if one has a solid foundation in programming techniques and
>>a solid understanding of one or two languages, one can acquire a working 
>>knowledge of any language in no time.
>>
>>Anh
>>
> 
> Yes, but "Hello World" is not a 'real' program. 8}

We learned Smalltalk, Prolog, ML, and a dataflow language (can't recall 
the name right now) in one semester (adv. prg. lang. topics course), and we 
did not say hello to the world one single time. :-)

About the projects, for sure, no one lost sleep over the syntactic aspects 
and the different programming paradigms. :-)

The course, of course, did not make us experts on all these languages, 
but which course makes students experts in its area?

Anh




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-22  0:00                     ` Stephen M O'Shaughnessy
@ 1996-07-23  0:00                       ` Richard A. O'Keefe
  1996-07-23  0:00                         ` Michael Ickes
  1996-07-24  0:00                         ` system
  0 siblings, 2 replies; 688+ messages in thread
From: Richard A. O'Keefe @ 1996-07-23  0:00 UTC (permalink / raw)



smosha@most.fw.hac.com (Stephen M O'Shaughnessy) writes:

>In article <4suk39$9h2@news.ld.centuryinter.net>, steidl@centuryinter.net says...
>Most people believe the Bible to be the in-errant word of God.

Quantifying over the whole world, this statement is false.

For those people who *do* believe it, the agreed definition of
inerrancy applies *solely* to the original text in the original languages;
every copy, every translation, and every interpretation is corrigible.

>For a real eye opener read two 
>versions side by side, say the King James and the Living Bible.

(a) The Authorised Version came out in 1611.  That's a long time ago,
    and English has changed quite a lot.
(b) The Living Bible IS NOT A TRANSLATION!  It is openly and unashamedly
    a *paraphrase*.

For a fairer comparison, consider the current Jewish Publication Society
translation of the Tanach, and a really professional Christian translation
such as The Revised English Bible or the International Version.  Despite
being produced by disjoint committees with radically different
theological biases, some of the sentences are word for word identical.

I think that what this shows is that it *is* possible to do a very good
job of translating between languages in two unrelated families 2500+
years apart in dramatically different cultures *if* you take hundreds of
scholars, hundreds of years, and build up a "translation technology",
and libraries full of information about the cultural background.

-- 
Fifty years of programming language research, and we end up with C++ ???
Richard A. O'Keefe; http://www.cs.rmit.edu.au/~ok; RMIT Comp.Sci.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-23  0:00                       ` Richard A. O'Keefe
@ 1996-07-23  0:00                         ` Michael Ickes
  1996-07-25  0:00                           ` Andy Askey
  1996-07-24  0:00                         ` system
  1 sibling, 1 reply; 688+ messages in thread
From: Michael Ickes @ 1996-07-23  0:00 UTC (permalink / raw)



Richard A. O'Keefe wrote:
> 
> smosha@most.fw.hac.com (Stephen M O'Shaughnessy) writes:
> 
> >In article <4suk39$9h2@news.ld.centuryinter.net>, steidl@centuryinter.net says...
> >Most people believe the Bible to be the in-errant word of God.
> 
> Quantifying over the whole world, this statement is false.
> 
> For those people who *do* believe it, the agreed definition of
> inerrancy applies *solely* to the original text in the original languages;
> every copy, every translation, and every interpretation is corrigible.
> 
> >For a real eye opener read two
> >versions side by side, say the King James and the Living Bible.
> 
> (a) The Authorised Version came out in 1611.  That's a long time ago,
>     and English has changed quite a lot.
> (b) The Living Bible IS NOT A TRANSLATION!  It is openly and unashamedly
>     a *paraphrase*.
> 
> For a fairer comparison, consider the current Jewish Publication Society
> translation of the Tanach, and a really professional Christian translation
> such as The Revised English Bible or the International Version.  Despite
> being produced by disjoint committees with radically different
> theological biases, some of the sentences are word for word identical.
> 
> I think that what this shows is that it *is* possible to do a very good
> job of translating between languages in two unrelated families 2500+
> years apart in dramatically different cultures *if* you take hundreds of
> scholars, hundreds of years, and build up a "translation technology",
> and libraries full of information about the cultural background.
> 
> --
> Fifty years of programming language research, and we end up with C++ ???
> Richard A. O'Keefe; http://www.cs.rmit.edu.au/~ok; RMIT Comp.Sci.

 Shouldn't this post go in alt.religious.stuff..................?




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-15  0:00   ` Should I learn C or Pascal? Ralph Silverman
  1996-07-15  0:00     ` Steve Sobol
  1996-07-16  0:00     ` Lee Crites
@ 1996-07-23  0:00     ` Richard A. O'Keefe
  2 siblings, 0 replies; 688+ messages in thread
From: Richard A. O'Keefe @ 1996-07-23  0:00 UTC (permalink / raw)



>Gabor Egressy (gegressy@uoguelph.ca) wrote:
>: Here is a quote from Brian W. Kernighan of "The C Programming Language" fame.
>: He is also an internationally respected lecturer who has been there since 
>: the inception of UNIX and C.
>: "I feel that it is a mistake to use Pascal for anything much beyond its 
>: original target. In its pure form, Pascal is a toy language, suitable for 
>: teaching but not for real programming."
>: Draw your own conclusions.


I have read that paper, and have often recommended it to people.
But by now it is an OLD paper, so it is important to understand that
Kernighan is basically talking about ISO 7185 Level 0 Pascal.
 - Turbo Pascal is a completely different language
 - Delphi is a completely different language
 - ISO Pascal Extended (with or without the Object-Oriented Extensions
   to Pascal) is a completely different language

It would be very easy for someone familiar with ISO Pascal Extended to
write a paper "Why C is not my favourite programming language" pointing
out that unlike (the current international standard entitled to the name)
Pascal, C
 - does not have modules
 - does not have type-safe separate compilation
 - does not have Ada-style "type schemas"
 - does not have syntax for string concatenation, comparison, substring
 - does not have support for arrays whose size is not known until run time
   (the usual C workaround is sketched just after this list)
 - does not have nested procedures
 - does not have complex arithmetic
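
To be concrete about the run-time-sized-array point: in C you fake it with
the heap, where Extended Pascal's schema types let you declare such an array
directly.  A rough sketch only (nothing normative about the names or the
numbers):

    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        int     n, i;
        double *v;

        /* The size arrives at run time; C's answer is malloc/free,
           not an array type whose bound is a run-time expression. */
        if (scanf("%d", &n) != 1 || n <= 0)
            return 1;
        v = malloc(n * sizeof *v);
        if (v == NULL)
            return 1;
        for (i = 0; i < n; i++)
            v[i] = i * 0.5;
        printf("last element: %g\n", v[n - 1]);
        free(v);
        return 0;
    }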


The only thing is, after studying ISO 10206 I can see little reason for
using Pascal instead of Ada; Ada has everything that Pascal has and then
some.  As for the 1993 Object Oriented Extensions to Pascal, I greatly
prefer the object model of Ada 95 (having to *manually* destroy every
object because objects are really pointers does *not* appeal to me).

My answer to the question that is the Subject of this thread is
    - don't learn Pascal first
    - don't learn C first either
    - do learn Scheme first (try "The Little Schemer", then "Simply Scheme")
    - or learn Ada first.

-- 
Fifty years of programming language research, and we end up with C++ ???
Richard A. O'Keefe; http://www.cs.rmit.edu.au/~ok; RMIT Comp.Sci.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-22  0:00             ` Ralph Silverman
@ 1996-07-23  0:00               ` Tim Behrendsen
  0 siblings, 0 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-07-23  0:00 UTC (permalink / raw)



Ralph Silverman <z007400b@bcfreenet.seflin.lib.fl.us> wrote in article
<4t0pcb$poq@nntp.seflin.lib.fl.us>...
> Tim Behrendsen (tim@airshields.com) wrote:
> : Scott McMahan - Softbase Systems <softbase@mercury.interpath.com> wrote
in
> : article <4sokr1$4c9@news.interpath.net>...
> : > Patrick Horgan (patrick@broadvision.com) wrote:
> : > 
> : > 
> : > : and five or six assemblers for the same reason.
> : > 
> : > I'd rather just learn C and port it. Asm isn't as important
> : > anymore now that there's so many different platforms.
> : > 
> : > Scott
> 
> : Big disagreement ... assembler is the most critical thing any
> : programmer can learn.  Not because you're going to use it
> : every day, but because it will teach you more about what's
> : *really* going on than 10 high-level language classes.
> 
> : That's like saying that since most Electronic Engineers use
> : ICs, it's not necessary to learn the fundamentals of resistors,
> : capacitors, and transistors.
> 
> : Programmers who do not know assembly language are dangerous,
> : because they do not fundamentally understand what the
> : compiler is generating. They believe in "The Myth of the
> : Optimizing Compiler", that "compilers are so good nowadays
> : that you don't have to worry about writing efficient code.
> : If you do have to worry, then get a better compiler."
> 
> 	a) start low and go high...
> 	b) start high and go low...
> 
> 	common sense tells us either might work...
> 	experience tells us each has...

Uh, so what?  The fact that people have learned both ways
is completely irrelevant.

The important question is what is the best way for the most
people, and experience has shown me that it's way better to
ground people in foundations of how computers really work. That
way they get a sense of the procedural nature of computers, and
they are not bogged down with 10 tons of abstract crap before
they are prepared to know what it really means.
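
To make that concrete (and this is only a sketch -- the exact instructions
depend entirely on the compiler and the target), take something as dumb as
summing an array:

    /* The loop body below is essentially a load, an add, an index
       increment, a compare and a conditional branch.  A programmer who
       has written a little assembler can picture that and can reason
       about what the code costs; one who never has will find that
       much harder. */
    long sum(const long *a, int n)
    {
        long total = 0;
        int  i;

        for (i = 0; i < n; i++)
            total += a[i];
        return total;
    }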




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
       [not found] <sperlman-0507961717550001@p121.ezo.net>
                   ` (9 preceding siblings ...)
  1996-07-22  0:00 ` Darin Johnson
@ 1996-07-23  0:00 ` Darin Johnson
  1996-07-24  0:00   ` Michael Feldman
                     ` (2 more replies)
  1996-07-24  0:00 ` Darin Johnson
                   ` (9 subsequent siblings)
  20 siblings, 3 replies; 688+ messages in thread
From: Darin Johnson @ 1996-07-23  0:00 UTC (permalink / raw)



> A sizeable portion of the class,
> however, whined because they talked to "big, smart people in the real
> world" who told them they were wasting their time, no one uses Scheme,
> it's a stupid language, you can't do XYZ like you can in C or
> whatever, man, universities are stupid, this country is fucked up
> blablabla.

Unfortunately, this group exists regardless of what you teach or how.
I've TA'd and proctored a variety of classes (programming and
architecture), and you don't get away from them.  And it's not just
that they want the university to be a trade school either, I had
people in '81 (back when you had to know a variety of things and be
able to adapt in order to program) and they were complaining about why
they should learn Pascal since they already knew BASIC and didn't need
this intro to programming class.  They didn't do that well in the
class however.  Had a student complain about why he should learn how
compilers work since we already have a good C compiler.  And of course,
plenty were upset that they had to learn about the ENIAC or
computability or PDA's or what-not.

Of course, IMHO, I think the solution would be to take the military
route.  Don't reason with the students, make them do laps or pushups
instead.  Call them names and insult their mothers when they claim to
be smarter than you.  Get rid of all their preconceived notions in
boot camp so they can actually learn something later on.  (what, do
students actually say "only wussies use Pascal in the real world" at
West Point?).

The old saying goes, "he can write Fortran in any language".  Which is
just a way of saying that you can present a programmer with any
language you want, but if they can't program well you won't get a good
program at the end.
-- 
Darin Johnson
djohnson@ucsd.edu	O-
    Gravity is a harsh mistress - The Tick




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-19  0:00           ` Robert Dewar
  1996-07-20  0:00             ` steved
@ 1996-07-23  0:00             ` Ralph Silverman
  1 sibling, 0 replies; 688+ messages in thread
From: Ralph Silverman @ 1996-07-23  0:00 UTC (permalink / raw)



Robert Dewar (dewar@cs.nyu.edu) wrote:
: Jason said

: ": Of course I think you should learn at least seven or eight high level
: : languages just for fun, and five or six assemblers for the same reason.

:    No doubt.  No good programmer only knows one language.  And no
:    really good programmer doesn't know assembly."

: I worry at this recommendation. It encourages what I often see at the
: beginning level of the "language collecting" phenomenon. People think
: that learning about programming is learning the syntax of lots of 
: different languages, while not really knowing how to program in any of
: them.

: Yes it is true that really good programmers tend to know several languages
: and to know assembler, but this is not always true, I know some quite
: brilliant COBOL programmers around who don't know other languages, and
: these days you quite often find very good programmers who only know C.

: On the other hand, I sure know lots of *terrible* programmers who can
: program equally terribly in many different languages.

: I still think the important thing at the start is to learn how to program. It
: is worth using a language that is rich enough to introduce all the necessary
: abstraction concepts (Borland Object Pascal, Ada 95, C++ meet this criterion,
: this is not a complete list of course, but you get the idea). It is a 
: mistake to learn C to start with, since it lacks critical abstraction
: features and so you will tend to miss the importance of data abstraction
: and parametrization at the module level (it is not that this cannot be done
: in C, just that you are unlikely to learn it if you start by learning C).

: But in any case, the important thing is to concentrate on programming, not
: on language collecting, at an early stage. Unfortunately many high school
: teachers who teach computing have not progressed much beyond the language
: collecting stage themselves, so you often have to rely on books at that
: level.


--
************begin r.s. response*****************

	1) learn programming
		without practising programming
	??????????
	actually,  some very advanced minds of the
	past did the like of this and produced the 
	foundations of computer programming of today...

	arguably,  gottlob frege,  a german mathematics
	specialist of the 19th century researched
		formal languages
	which are much like programming languages...

	in ancient greece,  a mathematics specialist
	known as
		eratosthenes
	produced the algorithm for a
		prime number finder
	known as the
		sieve of eratosthenes
	this has been used as the basis for
	widely utilized benchmarking programs
	(in the decade 1980-1989) and now.  
	
	(a team i worked with in 1983 coded this in
	the 'c' programming language
	and ran this on a
		western electric 3b20s!)
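
	(not the code from 1983, of course --
	just a bare-bones sketch of the sieve itself,
	in the spirit of the old benchmark programs;
	the limit is arbitrary)

    #include <stdio.h>
    #include <string.h>

    #define LIMIT 8190   /* a bound in the tradition of the old benchmark runs */

    /* sieve of eratosthenes: cross off the multiples of every number
       that has survived so far; whatever is never crossed off is prime. */
    int main(void)
    {
        static char composite[LIMIT + 1];
        int i, j, count = 0;

        memset(composite, 0, sizeof composite);
        for (i = 2; i <= LIMIT; i++) {
            if (!composite[i]) {
                count++;
                for (j = 2 * i; j <= LIMIT; j += i)
                    composite[j] = 1;
            }
        }
        printf("%d primes up to %d\n", count, LIMIT);
        return 0;
    }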

	however...
	reasonably,  the great mathematics minds
	of the past generally would have preferred to use
	computers such as we have,  if such then were
	available...

	2) collecting languages?????????
	the computer science curriculum of the
		acm
	(in past times)
	included such.
	personally,  i learned this way...
	this was very difficult...
	and certainly beneficial!

************end r.s. response*******************
Ralph Silverman
z007400b@bcfreenet.seflin.lib.fl.us





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-19  0:00               ` Robert Dewar
                                   ` (2 preceding siblings ...)
  1996-07-20  0:00                 ` Crash
@ 1996-07-23  0:00                 ` Ralph Silverman
  3 siblings, 0 replies; 688+ messages in thread
From: Ralph Silverman @ 1996-07-23  0:00 UTC (permalink / raw)



Robert Dewar (dewar@cs.nyu.edu) wrote:
: "
:         Never, never, never try to start learning a language before you
:         learn how to program.  A good algorithm simply cannot be replaced,
:         and learning how to write an algorithm is in programming, not
:         in learning a language.  You can sit down and read a hundred books
:         about how to use pointers and linked lists in c++, and you still
:         won't know how to use them in a good manner, if at all."


: I am very familiar with the highly unusual approach Georgia Tech takes, but
: I find the above remark rubbish. You cannot express algorithms unless you
: use a language to express them in, and for my taste, a well chosen 
: programming language is as good a choice as anything.


--
***********begin r.s. response*************

	(computer programmers as
	misfits,  nerds,  outcasts,
	outlaws,  renegades etc.)

	traditionally,
	the art of computer programming
	fostered,  and required,
	individual,  independent
	thinking.

	this trend was fostered by the
	availability,  and use,  of objective
	arbitration of competing interpretations
	and ideas by computer systems...

	largely,  specifically,
	software development systems,
	such as compilers...

	because of this,  programmers tended
	to develop empirical,  authoritarian
	views on verification and applicability...

	in view of this,
	practically speaking,
	in the traditional system,
	clever students might readily challenge
	teachers.

	efforts to overturn this traditional
	system in the education of programmers
	may be rooted in the perceived social
	disruption caused by such,  more than by
	any putative effort to improve the education
	of these.

***********end r.s. response***************
Ralph Silverman
z007400b@bcfreenet.seflin.lib.fl.us





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-19  0:00         ` Reto Koradi
@ 1996-07-23  0:00           ` TRAN PHAN ANH
  0 siblings, 0 replies; 688+ messages in thread
From: TRAN PHAN ANH @ 1996-07-23  0:00 UTC (permalink / raw)





Irritating, but that's the reality.  There are subtle points and pitfalls 
with all languages.  How many times have you wondered why something that 
should work does not?  Then you find that the reason it does not work 
is due to some "idiosyncrasies" or "features" of the language? :-)  
Therefore, companies ask for experience.  And C/C++ is a skill that is 
asked about quite often.

Of course, a good problem solver makes a better employee than a person who 
"just" knows a dozen languages.  But a good problem solver with expertise 
in a dozen languages is a dream for any company.

Anh

In article <31EF54CA.15FB@spectrospin.ch>, Reto Koradi <kor@spectrospin.ch> writes:
> Patrick Horgan wrote:
>> In my company and in many other startups in Silicon Valley doing
>> the bleeding edge work in the newest cool stuff, you can't get a
>> job without being a C++ programmer, period.
> 
> Such statements keep irritating me. Programming languages are nothing
> but tools, and if you know the principles of programming and have
> learned a few other languages, you start programming in C on the
> first day, and learn about the more subtle points with time.
> I grew up with the Pascal/Modula-2/Oberon line (what do you expect
> when studying at Wirth's university?), and didn't have the
> slightest problem programming in C when I started my first job.
> 
> Even though C and C++ dominate the workplace, languages like Modula-2
> are still much better for learning programming.
> -- 
> Reto Koradi (kor@mol.biol.ethz.ch, http://www.mol.biol.ethz.ch/~kor)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-19  0:00           ` johnf
                               ` (3 preceding siblings ...)
  1996-07-22  0:00             ` Ralph Silverman
@ 1996-07-23  0:00             ` John A Hughes
  4 siblings, 0 replies; 688+ messages in thread
From: John A Hughes @ 1996-07-23  0:00 UTC (permalink / raw)



In article <johnf-1907961506170001@johnf-mac.nando.com>,
johnf <johnf@nando.com> wrote:
>In article <01bb7591$83087d60$87ee6fce@timpent.airshields.com>, "Tim
>Behrendsen" <tim@airshields.com> wrote:
>
>>Carlos DeAngulo <cdvi@msg.itg.ti.com> wrote in article
>><01bb74ac$b7aa7860$7b91f780@deangulo>...
>>> You should definitely learn C/C++. The business world today uses C++ as
>>its
>>> power language to develop the finest applications. Don't let anyone guide
>>> you wrong.
>>
>>Not to start a flame war on C++, but all you newbie programmers
>>out there, don't believe everything you hear about C++...
>OK
>
>I am one of these newbies.
>I haven't programmed anything, ever, with any language.
>I am currently learning C with the help of Dave Mark (Learn C on Mac) as
>my baptism into programming. 
>So, am I only learning C, and not "how to program"? I don't understand
>how the two can be exclusive...
>[deletia]
>I don't expect to start as the Sr. Developer on some project, I will
>happily slog it out in the trenches and pay my dues, just explain to me
>how to get there...

I was a TA for introductory computer science classes one year in grad
school.  The experience was pretty illuminating and I always remember
it when I read these silly arguments.

The first semester, they taught the poor kids Scheme, a dialect of
LISP.  These are kids who've never seen a program before in their
lives. However, one thing about the language is that its syntax and
semantics are crystal clear. When you program in Scheme, you're really
doing pure problem solving. The brightest kids in the class understood
this after a while and went to work. A sizeable portion of the class,
however, whined because they talked to "big, smart people in the real
world" who told them they were wasting their time, no one uses Scheme,
it's a stupid language, you can't do XYZ like you can in C or
whatever, man, universities are stupid, this country is fucked up
blablabla. I was a very friendly TA and did what I could to try to
persuade these students that there is more to learning a skill than
aping what big smart people in the real world do, and point out that
they were learning basics that they would use over and over in
whatever language they eventually used-- they were just starting out
with the basics and not spending time on syntactic crypticness and
memory management exotica. Some understood. Some never shut up with
their whining. And they were lousy students, too, and had bizarre,
undeserved arrogances because they were hobbyists who could write
programs to do this or that in whatever language and didn't need to
pay any attention to what we were trying to teach them.

Of course this would not be so interesting if I did not TA most of the
same students the very next semester in part II of intro comp sci,
which was taught in C++. Now, I personally don't believe C++ is a
language for beginners at all, though I don't share the opinion that
the language "sucks" or is a "bad implementation of OOL". However, the
good students, who had paid attention in the first semester, got used
to thinking about how to solve problems and express them in a
formalism, and didn't whine constantly about all those stupid
parentheses, managed to pick up the basics of C++ just as easily and
go on to solve the significantly harder problems they were given in
that semester. A significant portion of the whiners all practically
flunked; it was very sad. They hadn't learned a damn thing the first
semester except how stupid computer scientists are, and then with
their first taste of the "real world" they were hopeless.

I am not saying that this is anything other than anecdotal, but I
think it illustrates very nicely how someone who is really interested
in what programming is differs from someone who isn't, and exercises
linguistic bigotry in the name of what goes on in the real world. What
goes on in the real world, in fact, is that a bunch of morons who
really don't know how to program but have "used" hot new languages get
lots of jobs writing lousy code that other people who like solving
problems and expressing them in a formalism (any one will do, really)
have to maintain, debug, and rewrite totally when the smallest thing
changes because they have no foresight.

THAT is the real world.

Pick a language you feel comfortable with and that looks kind of easy
to start with. Read what EVERYone has to say about programming and
think about those things while you learn. Stuff will become clear to
you. Then you can move on to other languages to see how different
formalisms can help you express solutions differently. You will have
wasted no time learning any of the languages, and you'll be
appreciated more for being intelligent and having broad experience
than your coworkers who learned C++ out of the womb but have no idea
what it's for. IMHO, C and Pascal are not different enough to warrant
this silly argument; Pascal's a bit easier and probably better for
newbies. Learning C from there will be a total snap. And if you really
know what you're doing, you'll impress people in interviews and you
WILL get a job.


jah




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-19  0:00           ` Philip Brashear
@ 1996-07-23  0:00             ` John A Hughes
  1996-07-26  0:00               ` Randy Kaelber
  0 siblings, 1 reply; 688+ messages in thread
From: John A Hughes @ 1996-07-23  0:00 UTC (permalink / raw)



In article <4sntoi$i71@ns1.sw-eng.falls-church.va.us>,
Philip Brashear <brashear@ns1.sw-eng.falls-church.va.us> wrote:
>In article <4sjmtk$e95@herald.concentric.net>,
>Mark  McKinney  <mckmark@mail.concentric.net> wrote:
>>
>>This raises a big concern I have always had about how programming is taught 
>>in general. Problem solving techniques, style, methodologies etc. should 
>>be taught or learned prior to a programming language. The "this is how you 
>>do it and then this is how yu do it well" approach seems highly 
>>ineffective. 
>>-Mark
>
>This reminds me of the high school English teacher who said "Teach them
>grammar in elementary school, and I'll teach them how to write (compose)."
>
>How do you learn grammar without writing (composing)?  How do you learn
>problem solving techniques, style, methodologies, etc. without actually
>solving problems, creating programs according to certain styles, using
>a programming language to apply a methodology?  Might as well try to teach
>a child the mathematical discipline of knot theory before teaching her how
>to tie her shoes!

I personally think the one doesn't preclude the other. Programming
*is* a little more than just problem solving. A lot of times people
around here have bizarre bugs they can't find that end up depending on
the language-- some flaky syntax fact or a well-known obscure
implementation detail (and yes, I did intend to write that--
computation is rife with well-known obscurities :). All that needs to
be learned all at once. What doesn't need to be learned is linguistic
fossilization. The particular language doesn't matter much except as a
source of trivia; however, the PROCESS of USING a precise language
PROPERLY is very important.

I think classroom-style instruction for computer science is bogus. The
typical ratio of lecture to lab should be inverted in my opinion-- 3
labs to each lecture. And lectures should introduce the labs and then
discuss problems in the labs, and the broader significance of problems
people had or differences in various solutions, NOT the other way
around, where labs are some kind of vestigial "illustration" of lecture
concepts.

I also think at more advanced levels the labs should really teach
collaborative programming, where everyone must work on a large
project, everyone must discuss how the project should be broken up,
and maybe teams should have to trade software components during
implementation to teach them how to write maintainable code.

This kind of class would be very hard to design and teach, and would
be a total blast.


jah




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-22  0:00             ` Ralph Silverman
@ 1996-07-23  0:00               ` Joe Gwinn
  1996-07-24  0:00                 ` John A Hughes
  1996-07-24  0:00                 ` Theodore E. Dennison
  0 siblings, 2 replies; 688+ messages in thread
From: Joe Gwinn @ 1996-07-23  0:00 UTC (permalink / raw)



Shouldn't we answer the man's question, without drifting into theological
discussions about the relative merits of various languages?  He wants to
find a better job, not find religion, or become a better person.  So,
where are the jobs?  

Joe Gwinn


> : I am one of these newbies.
> : I haven't programmed anything, ever, with any language.
> : I am currently learning C with the help of Dave Mark (Learn C on Mac) as
> : my baptism into programming. 
> : So, am I only learning C, and not "how to program"? I don't understand
> : how the two can be exclusive.
> : How does one learn how to be a "Good Programmer" without picking a
> : language to learn first, learning it well, then learning others as they
> : interest you? 
> : I am not trying to be a wise guy, just a guy who can learn to program well
> : enough to get out of his crappy job and into this (for me) exciting field
> : as a career.
> : I don't expect to start as the Sr. Developer on some project, I will
> : happily slog it out in the trenches and pay my dues, just explain to me
> : how to get there...




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-20  0:00           ` Should I learn C or Pascal? Robert Dewar
  1996-07-22  0:00             ` TRAN PHAN ANH
@ 1996-07-23  0:00             ` Ken Garlington
  1 sibling, 0 replies; 688+ messages in thread
From: Ken Garlington @ 1996-07-23  0:00 UTC (permalink / raw)



Robert Dewar wrote:
> 
> Tran said
> 
> "Besides, if you can master C/C++, and JAVA, it will take you 5 min. to learn
> Pascal.  Actually, if one has a solid foundation in programming techniques and
> a solid understanding of one or two languages, one can acquire a working
> knowledge of any language in no time."

Except RPG or APL, of course. No one has ever learned those two :)

-- 
LMTAS - "Our Brand Means Quality"




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
       [not found] <sperlman-0507961717550001@p121.ezo.net>
                   ` (11 preceding siblings ...)
  1996-07-24  0:00 ` Darin Johnson
@ 1996-07-24  0:00 ` Jon S Anthony
  1996-07-25  0:00 ` ++           robin
                   ` (7 subsequent siblings)
  20 siblings, 0 replies; 688+ messages in thread
From: Jon S Anthony @ 1996-07-24  0:00 UTC (permalink / raw)



In article <gwinn-2307961502440001@smc19.ed.ray.com> gwinn@res.ray.com (Joe Gwinn) writes:

> He wants to find a better job, not find religion, or become a better
                     ^^^^^^
> person.  So, where are the jobs?
                         ^^^
I don't think there's much correlation between these, but if the latter
is what's desired, then clearly the answer is:


Visual Basic...

/Jon
-- 
Jon Anthony
Organon Motives, Inc.
1 Williston Road, Suite 4
Belmont, MA 02178

617.484.3383
jsa@organon.com





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-24  0:00   ` Andrew J Steinbach
  1996-07-24  0:00     ` John A Hughes
  1996-07-24  0:00     ` Jon Bell
@ 1996-07-24  0:00     ` system
  2 siblings, 0 replies; 688+ messages in thread
From: system @ 1996-07-24  0:00 UTC (permalink / raw)



 stei0113@maroon.tc.umn.edu (Andrew J Steinbach) writes:
>Darin Johnson (djohnson@tartarus.ucsd.edu) wrote:
>: > A sizeable portion of the class,
>: > however, whined because they talked to "big, smart people in the real
>: > world" who told them they were wasting their time, no one uses Scheme,
>: > it's a stupid language, you can't do XYZ like you can in C or
>: > whatever, man, universities are stupid, this country is fucked up
>: > blablabla.
>
>: Unfortunately, this group exists regardless of what you teach or how.
>: I've TA'd and proctored a variety of classes (programming and
>: architecture), and you don't get away from them.  And it's not just

[clip]

>: Of course, IMHO, I think the solution would be to take the military
>: route.  Don't reason with the students, make them do laps or pushups
>: instead.  Call them names and insult their mothers when they claim to
>: be smarter than you.  Get rid of all their preconceived notions in
>: boot camp so they can actually learn something later on.  (what, do
>: students actually say "only wussies use Pascal in the real world" at
>: West Point?).

I personally assumed that this was at least partly tongue-in-cheek.

I have been a T.A. as well (for physics) and fully sympathize with
the above posters' feelings about the whiners.  Most who have valid
complaints (e.g. "The only reason I am being forced to take physics is to
weed out students applying for my program") don't whine about it,
they state it and go to work.

>	Absolutely.  "We know what's good for you.  Your ideas are 
>invalid and stupid.  Now swallow everything we feed you, because it is 
>the WORD."  Free thought, opinions, what a concept.  

Have you considered that maybe, just maybe, the teachers know something 
the students don't?  That the students should work on learning what is
being taught?  The students expressed their feelings and the Profs and 
TAs argued (ARGUED, not DECLARED) that the coursework was valid and useful.

>Ugh.  Have you 
>considered that these people may actually have *valid* complaints with 
>Scheme?

After reading the original post? no.

>: The old saying goes, "he can write Fortran in any language".  Which is
>: just a way of saying that you can present a programmer with any
>: language you want, but if they can't program well you won't get a good
>: program at the end.
>
>	This is true, but the choice of language does make a difference in
>the final result, too.  I do have a problem with using Scheme as a
>learning tool in intro courses.  Why is it wrong to teach students basic
>algorithms and data structures using a language which they probably 
>already know?  

like Basic?

More to the point, it is wrong because with a new language it is easier
to break their bad habits.  Once they learn the new habits taught by
the class they can (in the long run) compare and decide which is better.

>Most intro CSci students aren't programming virgins (well, 
>at least the ones I know).

I took assembler here at NIU; many of the students in the class had had
one prior programming course, COBOL.  Blech, the prof and T.A.s had to
teach them programming as well as assembler.

Robert Morphis

>Andy Steinbach
>stei0113@maroon.tc.umn.edu




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-23  0:00 ` Darin Johnson
  1996-07-24  0:00   ` Michael Feldman
@ 1996-07-24  0:00   ` Ralph Silverman
  1996-07-24  0:00     ` TRAN PHAN ANH
  1996-07-24  0:00   ` Andrew J Steinbach
  2 siblings, 1 reply; 688+ messages in thread
From: Ralph Silverman @ 1996-07-24  0:00 UTC (permalink / raw)



Darin Johnson (djohnson@tartarus.ucsd.edu) wrote:
: > A sizeable portion of the class,
: > however, whined because they talked to "big, smart people in the real
: > world" who told them they were wasting their time, no one uses Scheme,
: > it's a stupid language, you can't do XYZ like you can in C or
: > whatever, man, universities are stupid, this country is fucked up
: > blablabla.

: Unfortunately, this group exists regardless of what you teach or how.
: I've TA'd and proctored a variety of classes (programming and
: architecture), and you don't get away from them.  And it's not just
: that they want the university to be a trade school either, I had
: people in '81 (back when you had to know a variety of things and be
: able to adapt in order to program) and they were complaining about why
: they should learn Pascal since they already knew BASIC and didn't need
: this intro to programming class.  They didn't do that well in the
: class however.  Had a student complain about why he should learn how
: compilers work since we already have a good C compiler.  And of course,
: plenty were upset that they had to learn about the ENIAC or
: computability or PDA's or what-not.

: Of course, IMHO, I think the solution would be to take the military
: route.  Don't reason with the students, make them do laps or pushups
: instead.  Call them names and insult their mothers when they claim to
: be smarter than you.  Get rid of all their preconceived notions in
: boot camp so they can actually learn something later on.  (what, do
: students actually say "only wussies use Pascal in the real world" at
: West Point?).

: The old saying goes, "he can write Fortran in any language".  Which is
: just a way of saying that you can present a programmer with any
: language you want, but if they can't program well you won't get a good
: program at the end.
: -- 
: Darin Johnson
: djohnson@ucsd.edu	O-
:     Gravity is a harsh mistress - The Tick

--
*************begin r.s. response*****************

	(re.  '...military style...'
		in posting cited above)

	who will watch the watchers?

	...
	a teacher who can not hold the interest
	of the students in this area without
	resort to draconian measures may be
		the problem!!!


	(re.  '...Don't reason with the students...'
		in the posting cited above)

	this is a pretty pass!!!
	if we can not reason about
		computer programming!!!

*************end r.s. response*******************
Ralph Silverman
z007400b@bcfreenet.seflin.lib.fl.us





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-24  0:00   ` Andrew J Steinbach
@ 1996-07-24  0:00     ` John A Hughes
  1996-07-24  0:00     ` Jon Bell
  1996-07-24  0:00     ` system
  2 siblings, 0 replies; 688+ messages in thread
From: John A Hughes @ 1996-07-24  0:00 UTC (permalink / raw)



In article <4t49om$ebi@epx.cis.umn.edu>,
Andrew J Steinbach <stei0113@maroon.tc.umn.edu> wrote:
>Darin Johnson (djohnson@tartarus.ucsd.edu) wrote:
>: Of course, IMHO, I think the solution would be to take the military
>: route.  Don't reason with the students, make them do laps or pushups
>: instead.  Call them names and insult their mothers when they claim to
>: be smarter than you.  Get rid of all their preconceived notions in
>: boot camp so they can actually learn something later on.  (what, do
>: students actually say "only wussies use Pascal in the real world" at
>: West Point?).
>
>	Absolutely.  "We know what's good for you.  Your ideas are 
>invalid and stupid.  Now swallow everything we feed you, because it is 
>the WORD."  Free thought, opinions, what a concept.  Ugh.  Have you 
>considered that these people may actually have *valid* complaints with 
>Scheme?

I have yet to encounter anything in my life that someone could not have valid
complaints about. One can argue which style of teaching and which tools are
the best for that style, and I would find that a pretty interesting argument,
but I do believe the classes I was TAing were tailored towards a certain
set of goals and that the choice of Scheme was not unreasonable for those
goals. The most troubling thing is that people feel there are expendable
concepts that you can ignore if you use the right language, that all this
discussion about correct programming, giving an eye to maintainability and
extensibility, and using the features of various languages properly is
all airy-fairy and gets in the way of doing "real work". There are brilliant
people who can do anything with any tool and any methodology, but normal
people need to be taught specific methodologies with specific tools. And
as far as I can tell there was nothing we taught in our class that was
not worth learning, and the people who complained about the language are
morons who I hope no one ever hires, especially not in any company where
I might have to deal with them and their code.

>	This is true, but the choice of language does make a difference in
>the final result, too.  I do have a problem with using Scheme as a
>learning tool in intro courses.  Why is it wrong to teach students basic
>algorithms and data structures using a language which they probably 
>already know?  Most intro CSci students aren't programming virgins (well, 
>at least the ones I know).

I think you have to pick a language for intro classes. And I don't think
picking one they're already familiar with, and presumably already have
bad habits in, is a priori any better than choosing one they've never 
seen before intending them to actually pay attention to someone else's
experience in that language and in programming in general.

I'll say it again; if you think a particular language only teaches you things 
not worth learning, you have a pretty impoverished idea of what useful 
knowledge is.


jah






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-24  0:00   ` Andrew J Steinbach
  1996-07-24  0:00     ` John A Hughes
@ 1996-07-24  0:00     ` Jon Bell
  1996-07-24  0:00     ` system
  2 siblings, 0 replies; 688+ messages in thread
From: Jon Bell @ 1996-07-24  0:00 UTC (permalink / raw)



[followups set to comp.edu, which is where this discussion really belongs]

 Andrew J Steinbach <stei0113@maroon.tc.umn.edu> wrote:
> Why is it wrong to teach students basic
>algorithms and data structures using a language which they probably 
>already know?  Most intro CSci students aren't programming virgins (well, 
>at least the ones I know).

Believe it or not, many prospective CS majors do *not* have significant 
programming experience when they arrive at college/university.  This 
varies from school to school, of course.  Here, it is unusual for a 
student in my CS1/CS2 courses to have *any* real programming experience.
Our situation isn't really relevant because we don't offer a CS major 
(just a minor); nevertheless, I have seen comments from people at larger 
schools with CS degree programs that a significant number of their 
prospective majors are programming "virgins" or almost so.

Also, the ones that *do* have programming experience haven't all been using 
the same language.  One reason for using a "different" programming 
language like Scheme is that it tends to "level the playing field" for 
students in the course.

Many factors go into the choice of an introductory programming language.  
Different schools weigh those factors differently.  In our case, since we 
offer only a minor, we don't have *time* to start with one language, then 
switch to another; so we use C++ for CS1/CS2.  Actually, we also have a 
"CS0" course which is mostly descriptive, but has a bit of programming in 
the HyperTalk scripting language, and practically all our CS minors take 
that first.

-- 
Jon Bell <jtbell@presby.edu>                        Presbyterian College
Dept. of Physics and Computer Science        Clinton, South Carolina USA




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-23  0:00               ` Joe Gwinn
  1996-07-24  0:00                 ` John A Hughes
@ 1996-07-24  0:00                 ` Theodore E. Dennison
  1 sibling, 0 replies; 688+ messages in thread
From: Theodore E. Dennison @ 1996-07-24  0:00 UTC (permalink / raw)



Joe Gwinn wrote:
> 
> Shouldn't we answer the man's question, without drifting into theological
> discussions about the relative merits of various languages?  He wants to
> find a better job, not find religion, or become a better person.  So,
> where are the jobs?

Everywhere, and in every language.

Not a particularly helpful answer, I'm afraid...

-- 
T.E.D.          
                |  Work - mailto:dennison@escmail.orl.mmc.com  |
                |  Home - mailto:dennison@iag.net              |
                |  URL  - http://www.iag.net/~dennison         |




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
       [not found] <sperlman-0507961717550001@p121.ezo.net>
                   ` (10 preceding siblings ...)
  1996-07-23  0:00 ` Darin Johnson
@ 1996-07-24  0:00 ` Darin Johnson
  1996-07-25  0:00   ` Andy Askey
                     ` (2 more replies)
  1996-07-24  0:00 ` Jon S Anthony
                   ` (8 subsequent siblings)
  20 siblings, 3 replies; 688+ messages in thread
From: Darin Johnson @ 1996-07-24  0:00 UTC (permalink / raw)



gwinn@res.ray.com (Joe Gwinn) writes:
> Shouldn't we answer the man's question, without drifting into theological
> discussions about the relative merits of various languages?  He wants to
> find a better job, not find religion, or become a better person.  So,
> where are the jobs?  

And speaking of theology - give a man a fish and you feed him for a
day, teach a man to fish and you feed him for life.

That's why the thread has drifted.  The original poster wanted to know
where to get a fish.  If he learns the language that gets him a job
now, what happens next year?  The language will change - and more
often the way the language is used will change.  When you know how to
program, the choice of language is just a matter of syntax and
idiosyncrasies.  If all you know is one language/methodology, then
everything else is viewed as "a silly way of doing things".  If you
learn abstraction, you can use it in any language.  If you learn C and
have never seen abstraction, you're not going to use abstraction until
years of experience cause you to use it.  These aren't things you
learn on the job unless you work with people that also know these
things and they make an effort to teach you (and this is becoming less
and less likely).

I have to maintain code written by someone who just learned C after
decades of assembler.  He thinks it's the greatest language in the
world, and is always wondering why I am spending the time improving
code (because it saves me time in the long run).  There's absolutely
no sense of abstraction in the code, no comments (except to tell the
name of the file), little portability (he thinks separate source trees
are fine), and whatever gets the job done is what gets done.  That's
what happens when you just learn the language but not the programming.

Same thing everywhere.  They teach English to English speakers in
school.  They teach music theory to people who can already play.
Are computers supposed to be the exception?
-- 
Darin Johnson
djohnson@ucsd.edu	O-
       The opinions expressed are not necessarily those of the
       Frobozz Magic Hacking Company, or any other Frobozz affiliates.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-23  0:00                       ` Richard A. O'Keefe
  1996-07-23  0:00                         ` Michael Ickes
@ 1996-07-24  0:00                         ` system
  1 sibling, 0 replies; 688+ messages in thread
From: system @ 1996-07-24  0:00 UTC (permalink / raw)



 ok@goanna.cs.rmit.edu.au (Richard A. O'Keefe) writes:
>smosha@most.fw.hac.com (Stephen M O'Shaughnessy) writes:
>
>(b) The Living Bible IS NOT A TRANSLATION!  It is openly and unashamedly
>    a *paraphrase*.

amen (and a lousy one IMHO)

>consider the current Jewish Publication Society
>translation of the Tanach, and a really professional Christian translation
>such as The Revised English Bible or the International Version.  Despite
>being produced by disjoint committees with radically different
>theological biases; some of the sentences are word for word identical.

>I think that what this shows is that it *is* possible to do a very good
>job of translating between languages in two unrelated families 2500+
>years apart in dramatically different cultures *if* you take hundreds of
>scholars, hundreds of years, and build up a "translation technology",
>and libraries full of information about the cultural background.

And if you include enough footnotes to double the amount of text.

[although this started out as a religious discussion and so discussing
the Bible seems on topic I will try to make it even more on-topic]

It also shows (by analogy) that it is possible to do good programming
in any language as long as you work at it.

>Fifty years of programming language research, and we end up with C++ ???
>Richard A. O'Keefe; http://www.cs.rmit.edu.au/~ok; RMIT Comp.Sci.

Robert




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-16  0:00 ` Darin Johnson
@ 1996-07-24  0:00   ` Ralph Silverman
  0 siblings, 0 replies; 688+ messages in thread
From: Ralph Silverman @ 1996-07-24  0:00 UTC (permalink / raw)



Darin Johnson (djohnson@tartarus.ucsd.edu) wrote:
: > Also keep in mind that this rather lengthy diatribe was comparing c with the 
: > 'standard' pascal, not what it has become.  Today's pascal is as
: > different from what was being discussed as today's c++ is from the old c.

: True, Pascal has been mostly subsumed by Modula II and III.  These are
: nice languages, and you can do real-world and systems programming in
: them.  They're not as popular (you probably have to go commercial to
: get a compiler).
: -- 
: Darin Johnson
: djohnson@ucsd.edu	O-
:        Support your right to own gnus.

--
************begin r.s. response********************

	(re.  modula2
		in posting cited above)

	an amazing
		shareware
		^^^^^^^^^
		modula2 compiler
	for
		(ms pc dr)dos
	is widely available,  named

		fmodula2
		^^^^^^^^
	.

************end r.s. response**********************
Ralph Silverman
z007400b@bcfreenet.seflin.lib.fl.us





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-24  0:00   ` Ralph Silverman
@ 1996-07-24  0:00     ` TRAN PHAN ANH
  0 siblings, 0 replies; 688+ messages in thread
From: TRAN PHAN ANH @ 1996-07-24  0:00 UTC (permalink / raw)



Allow me to round off this thread.

You don't need to start with lots of parentheses or Pascal to learn 
problem solving.  And you don't screw up your learning ability if you 
start off with C or machine code.  Heck, I wonder if toggling switches 
made Knuth a worse computer scientist than he is?  :-))

Anyway, reality check: 2 students looking for an internship or a 
part-time job which requires programming.  All things being equal, one 
with a working knowledge of C/C++, and one with a working knowledge of 
parentheses, who gets the job (most of the time)?  By the time they
graduate, both will have covered the same material.  But one will have 
EXPERIENCE to put on the resume.

If one has the desire, and an open mind...anything can be learned.  On 
the other hand, if one thinks that what one knows is enough, be it lisp, 
basic, scheme, C, or anything, one will never learn new things.  That's 
given by definition.

Anh


In article <4t5ebb$hp3@nntp.seflin.lib.fl.us>, z007400b@bcfreenet.seflin.lib.fl.us (Ralph Silverman) writes:
> Darin Johnson (djohnson@tartarus.ucsd.edu) wrote:
> : > A sizeable portion of the class,
> : > however, whined because they talked to "big, smart people in the real
> : > world" who told them they were wasting their time, no one uses Scheme,
> : > it's a stupid language, you can't do XYZ like you can in C or
> : > whatever, man, universities are stupid, this country is fucked up
> : > blablabla.
> 
> : Unfortunately, this group exists regardless of what you teach or how.
> : I've TA'd and proctored a variety of classes (programming and
> : architecture), and you don't get away from them.  And it's not just
> : that they want the university to be a trade school either, I had
> : people in '81 (back when you had to know a variety of things and be
> : able to adapt in order to program) and they were complaining about why
> : they should learn Pascal since they already knew BASIC and didn't need
> : this intro to programming class.  They didn't do that well in the
> : class however.  Had a student complain about why he should learn how
> : compiler work since we already have a good C compiler.  And of course,
> : plenty were upset that they had to learn about the eniac or
> : computability or PDA's or what-not.
> 
> : Of course, IMHO, I think the solution would be to take the military
> : route.  Don't reason with the students, make them do laps or pushups
> : instead.  Call them names and insult their mothers when they claim to
> : be smarter than you.  Get rid of all their preconceived notions in
> : boot camp so they can actually learn something later on.  (what, do
> : students actually say "only wussies use Pascal in the real world" at
> : West Point?).
> 
> : The old saying goes, "he can write Fortran in any language".  Which is
> : just a way of saying that you can present a programmer with any
> : language you want, but if they can't program well you won't get a good
> : program at the end.
> : -- 
> : Darin Johnson
> : djohnson@ucsd.edu	O-
> :     Gravity is a harsh mistress - The Tick
> 
> --
> *************begin r.s. response*****************
> 
> 	(re.  '...military style...'
> 		in posting cited above)
> 
> 	who will watch the watchers?
> 
> 	...
> 	a teacher who can not hold the interest
> 	of the students in this area without
> 	resort to draconian measures may be
> 		the problem!!!
> 
> 
> 	(re.  '...Don't reason with the students...'
> 		in the posting cited above)
> 
> 	this is a pretty pass!!!
> 	if we can not reason about
> 		computer programming!!!
> 
> *************end r.s. response*******************
> Ralph Silverman
> z007400b@bcfreenet.seflin.lib.fl.us
> 




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-23  0:00               ` Joe Gwinn
@ 1996-07-24  0:00                 ` John A Hughes
  1996-07-24  0:00                 ` Theodore E. Dennison
  1 sibling, 0 replies; 688+ messages in thread
From: John A Hughes @ 1996-07-24  0:00 UTC (permalink / raw)



In article <gwinn-2307961502440001@smc19.ed.ray.com>,
Joe Gwinn <gwinn@res.ray.com> wrote:
>Shouldn't we answer the man's question, without drifting into theological
>discussions about the relative merits of various languages?  He wants to
>find a better job, not find religion, or become a better person.  So,
>where are the jobs?  

I hope the jobs are where the people who know what they're doing are.
I thought I *did* answer the guy's question-- pick the one that looks
easiest to you or for which you have the most resources, pay attention
to what people say about programming in the language, and when you're
comfortable with the ideas behind programming, learn whatever you
want.

The surest sign that a programmer is in the wrong field is his
inability to move from one language to another, or even, I would
argue, one paradigm to another. (There are tons of such people in this
industry. I'm no genius, but I think what they end up doing is pretty
sad.)  Time you spend learning any language will not be wasted. You do
not suddenly become magic and omniscient because you learn C or
Pascal.  And you do not suddenly become marketable because you taught
yourself C instead of Pascal. You're up against a wall of "experience
necessary" no matter what you do when you're learning. By all means,
learn C, because there really are tons of jobs for that, but "learning
C" is not enough.

To the original poster: Please, ignore this thread, and read the other
ones, and try to understand the advice you find there. You'll get much
farther paying close attention to that sort of thing and only
practical attention to the particular language you practice in. My
personal belief is that Pascal is better for starting out, as it's a
little easier for total neophytes. Learning C from there, once you
have the right habits, is a snap. If you find moving from Pascal to C
a big step, you should probably look for a different career.


jah




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-23  0:00 ` Darin Johnson
@ 1996-07-24  0:00   ` Michael Feldman
  1996-07-24  0:00   ` Ralph Silverman
  1996-07-24  0:00   ` Andrew J Steinbach
  2 siblings, 0 replies; 688+ messages in thread
From: Michael Feldman @ 1996-07-24  0:00 UTC (permalink / raw)



In article <qq3f2i61zo.fsf@tartarus.ucsd.edu>,

>(what, do
>students actually say "only wussies use Pascal in the real world" at
>West Point?).

Actually, starting this year, they'll probably say "only wussies 
use Ada....":-)

West Point is switching its intro courses to Ada in Sept.

Mike Feldman
------------------------------------------------------------------------
Michael B. Feldman -  chair, SIGAda Education Working Group
Professor, Dept. of Electrical Engineering and Computer Science
The George Washington University -  Washington, DC 20052 USA
202-994-5919 (voice) - 202-994-0227 (fax) 
http://www.seas.gwu.edu/faculty/mfeldman
------------------------------------------------------------------------
       Pork is all that money the government gives the other guys.
------------------------------------------------------------------------
WWW: http://lglwww.epfl.ch/Ada/ or http://info.acm.org/sigada/education
------------------------------------------------------------------------




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-23  0:00 ` Darin Johnson
  1996-07-24  0:00   ` Michael Feldman
  1996-07-24  0:00   ` Ralph Silverman
@ 1996-07-24  0:00   ` Andrew J Steinbach
  1996-07-24  0:00     ` John A Hughes
                       ` (2 more replies)
  2 siblings, 3 replies; 688+ messages in thread
From: Andrew J Steinbach @ 1996-07-24  0:00 UTC (permalink / raw)



Darin Johnson (djohnson@tartarus.ucsd.edu) wrote:
: > A sizeable portion of the class,
: > however, whined because they talked to "big, smart people in the real
: > world" who told them they were wasting their time, no one uses Scheme,
: > it's a stupid language, you can't do XYZ like you can in C or
: > whatever, man, universities are stupid, this country is fucked up
: > blablabla.

: Unfortunately, this group exists regardless of what you teach or how.
: I've TA'd and proctored a variety of classes (programming and
: architecture), and you don't get away from them.  And it's not just
: that they want the university to be a trade school either, I had
: people in '81 (back when you had to know a variety of things and be
: able to adapt in order to program) and they were complaining about why
: they should learn Pascal since they already knew BASIC and didn't need
: this intro to programming class.  They didn't do that well in the
: class however.  Had a student complain about why he should learn how
: compiler work since we already have a good C compiler.  And of course,
: plenty were upset that they had to learn about the eniac or
: computability or PDA's or what-not.

: Of course, IMHO, I think the solution would be to take the military
: route.  Don't reason with the students, make them do laps or pushups
: instead.  Call them names and insult their mothers when they claim to
: be smarter than you.  Get rid of all their preconceived notions in
: boot camp so they can actually learn something later on.  (what, do
: students actually say "only wussies use Pascal in the real world" at
: West Point?).

	Absolutely.  "We know what's good for you.  Your ideas are 
invalid and stupid.  Now swallow everything we feed you, because it is 
the WORD."  Free thought, opinions, what a concept.  Ugh.  Have you 
considered that these people may actually have *valid* complaints with 
Scheme?

: The old saying goes, "he can write Fortran in any language".  Which is
: just a way of saying that you can present a programmer with any
: language you want, but if they can't program well you won't get a good
: program at the end.

	This is true, but the choice of language does make a difference in
the final result, too.  I do have a problem with using Scheme as a
learning tool in intro courses.  Why is it wrong to teach students basic
algorithms and data structures using a language which they probably 
already know?  Most intro CSci students aren't programming virgins (well, 
at least the ones I know).

--
Andy Steinbach
stei0113@maroon.tc.umn.edu




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
       [not found] <sperlman-0507961717550001@p121.ezo.net>
                   ` (13 preceding siblings ...)
  1996-07-25  0:00 ` ++           robin
@ 1996-07-25  0:00 ` ++           robin
  1996-07-25  0:00 ` ++           robin
                   ` (5 subsequent siblings)
  20 siblings, 0 replies; 688+ messages in thread
From: ++           robin @ 1996-07-25  0:00 UTC (permalink / raw)



	Carlos DeAngulo <cdvi@msg.itg.ti.com> wrote in article
	<01bb74ac$b7aa7860$7b91f780@deangulo>...
	> You should definitely learn C/C++. The business world today uses C++ as
	>its
	> power language to develop the finest applications. Don't let anyone guide
	> you wrong.
	
	>Not to start a flame war on C++, but all you newbie programmers
	>out there, don't believe everything you hear about C++.  Object
	>oriented programming has a lot of good concepts, but C++ is a bad
	>implementation of them.  Not that you shouldn't learn it, but
	>don't think it's the ultimate expression of what OOP is all about.
	
	>C++: The PL/I of the 90s.

---The PL/I of the 90s is considerably enhanced,
with strong typing for list processing,
ordinals, an extended macroprocessor, Year 2000
date processing, and much more.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
       [not found] <sperlman-0507961717550001@p121.ezo.net>
                   ` (12 preceding siblings ...)
  1996-07-24  0:00 ` Jon S Anthony
@ 1996-07-25  0:00 ` ++           robin
  1996-07-25  0:00 ` ++           robin
                   ` (6 subsequent siblings)
  20 siblings, 0 replies; 688+ messages in thread
From: ++           robin @ 1996-07-25  0:00 UTC (permalink / raw)



	Crash <wakko@starbase1.htls.lib.il.us> writes:

	>An algorithm is just a set of instructions on how to accomplish a task. An
	>algorithm does not depend at all on any sort of programming "language". In
	>fact, I was taught that you should write out your "game plan" and pseudo
	>code before you even touch any sort of syntax in any language. Yes, I do
	>tend to write up my pseudo-code in a C-like syntax frequently but you
	>don't need to.

	>     Loop until variable is equal to 7
	>	Add 1 to variable
	>	Print out the following string to screen: ;-)
	>	end of loop

	>There's an algorithm written in plain english.

---To be an algorithm, it must be definite.  This one isn't.
The variable "variable" is never initialized.

	 It's a stupid algorithm,
	>I'll give you that, but one none the less and it's not written in any
	>programming "lanugage/syntax" I have ever worked with.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
       [not found] <sperlman-0507961717550001@p121.ezo.net>
                   ` (14 preceding siblings ...)
  1996-07-25  0:00 ` ++           robin
@ 1996-07-25  0:00 ` ++           robin
  1996-07-30  0:00   ` Robert Barnes
  1996-07-31  0:00 ` What's the best language to start with? [was: Re: Should I learn C or Pascal?] Darin Johnson
                   ` (4 subsequent siblings)
  20 siblings, 1 reply; 688+ messages in thread
From: ++           robin @ 1996-07-25  0:00 UTC (permalink / raw)



	dewar@cs.nyu.edu (Robert Dewar) writes:

	>"An algorithm is just a set of instructions on how to accomplish a task. An
	>algorithm does not depend at all on any sort of programming "language". In
	>fact, I was taught that you should write out your "game plan" and pseudo
	>code before you even touch any sort of syntax in any language. Yes, I do
	>tend to write up my pseudo-code in a C-like syntax frequently but you
	>don't need to.

	>     Loop until variable is equal to 7
	>        Add 1 to variable
	>        Print out the following string to screen: ;-)
	>        end of loop

	>There's an algorithm written in plain english."

---To be an algorithm, it must be definite.  This one isn't.
The variable "variable" must be intialized.

	>First of all "pseudo-code" is still written in a language, which has syntax
	>and semantics. The trouble with an algorithm written in "plain english"
	>like the above is that it is imprecise. For example does the string that
	>is written to screen include a leading blank? I have no idea from the above.

	>If variable is 2**32-1, does the above still work? I have no idea.

	>Are the strings on separate lines of the screen? I have no idea.

	>writing

	>   while variable /= 7 loop
	>      variable := variable + 1;
	>      put_line (";-)");
	>   end loop;

	>is less writing than you did, just as clear, and importantly, exactly
	>precise

---This isn't any more of an algorithm than that of the
previous writer.  The variable "variable" still isn't
initialized.

Now, how about:

	DO I = 1 TO 7 BY 1;
		PUT SKIP LIST ( ';-)' );
	END;

This makes it abundantly clear that the loop is being executed with
clear start and finish values of the control variable.
Furthermore, the loop will be executed exactly 7 times.

Might be better to consider learning PL/I . . .
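
A minimal C sketch of the same definite loop (just an illustration,
not from any of the quoted posts) declares, initializes, and bounds
the control variable in one place, and likewise runs exactly 7 times:

	#include <stdio.h>

	int main(void)
	{
	    int i;                           /* control variable, explicitly declared    */

	    for (i = 1; i <= 7; i = i + 1)   /* initialized, tested, stepped in one spot */
	        puts(";-)");                 /* one smiley per line                      */
	    return 0;
	}

Nothing is left to the reader's imagination about the starting value.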




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-20  0:00                 ` Jon Bell
                                     ` (2 preceding siblings ...)
  1996-07-22  0:00                   ` Stephen M O'Shaughnessy
@ 1996-07-25  0:00                   ` ++           robin
  3 siblings, 0 replies; 688+ messages in thread
From: ++           robin @ 1996-07-25  0:00 UTC (permalink / raw)



	jtbell@presby.edu (Jon Bell) writes:

	> Robert Dewar <dewar@cs.nyu.edu> wrote:
	>>  You cannot express algorithms unless you
	>>use a language to express them in, and for my taste, a well chosen 
	>>programming language is as good choice as anything.

	>For *my* taste, a real programming language is *better*, because you can 
	>test the correctness of your solution by executing it.  Try *that* with 
	>pseudocode or data flow diagrams!  :-)

You can test the correctness of a solution with
the *algorithm*, whether it's expressed in flowchart,
pseudocode, etc. form.

It's considered better to get the design correct *before*
committing the design into detailed code.  Re-designing
the detailed code tends to be more expensive than
re-designing the algorithm.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-20  0:00           ` Mark Eissler
@ 1996-07-25  0:00             ` Erik Seaberg
  1996-07-26  0:00             ` What's the best language to start with? [was: Re: Should I learn C or Pascal?] Tim Behrendsen
  1 sibling, 0 replies; 688+ messages in thread
From: Erik Seaberg @ 1996-07-25  0:00 UTC (permalink / raw)



In article <tequila-2007961836190001@tequila.interlog.com>,
	Mark Eissler <tequila@interlog.com> writes:
> Yes, but since just about everyone else has said something I'd say follow
> this path BASIC -> Pascal -> C -> C++ -> JAVA. 

I find this odd.  I'd suggest learning Java (it has garbage collection
and simplified semantics, and it's almost impossible to hurt yourself)
as a prelude to modern C++ (new-style casts, RTTI, STL), perhaps
followed by C if you expect to run into a platform lacking a C++
compiler (or really want to know about the few dark corners that
aren't subsets of C++).  It seems to me that Pascal offers very little
over Java even for teaching - Java is lexically weird because C is,
but Pascal is syntactically weird for cheaper parsers, a less sensible
tradeoff these days.

I suppose my wish list order (with what one at least ought to learn
from them):

* Scheme (functional algorithms, recursion, closures and continuations
  as "all you need")
* Java (imperative algorithms, classes, polymorphism, containers)
* C++ (runtime costs, memory management, real-world utility)
* Eiffel (programming by contract, module isolation and cohesion)
  [perhaps Sather or Ada would work here, I don't know]

but then I expect each language I learn to demand I wrap my brain
around it for a while, or I won't bother studying it unless using some
implementation happens to be expedient.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-23  0:00                         ` Michael Ickes
@ 1996-07-25  0:00                           ` Andy Askey
  0 siblings, 0 replies; 688+ messages in thread
From: Andy Askey @ 1996-07-25  0:00 UTC (permalink / raw)



Michael Ickes <mickes@gnn.com> wrote:

> Shouldn't this post go in alt.religious.stuff..................?

I thought Ada was a religion.  A form of Buddhism, I believe.
--
May your karma be excellent for forgiving my spelling mishaps.

Andy Askey
ajaskey@gnn.com





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-24  0:00 ` Darin Johnson
@ 1996-07-25  0:00   ` Andy Askey
  1996-07-26  0:00     ` Mark Eissler
  1996-08-02  0:00   ` Patrick Horgan
  1996-08-05  0:00   ` Should I learn C or Pascal? Sherwin Anthony Sequeira
  2 siblings, 1 reply; 688+ messages in thread
From: Andy Askey @ 1996-07-25  0:00 UTC (permalink / raw)



djohnson@tartarus.ucsd.edu (Darin Johnson) wrote:

>gwinn@res.ray.com (Joe Gwinn) writes:
>> Shouldn't we answer the man's question, without drifting into theological
>> discussions about the relative merits of various languages?  He wants to
>> find a better job, not find religion, or become a better person.  So,
>> where are the jobs?  

>And speaking of theology - give a man a fish and you feed him for a
>day, teach a man to fish and you feed him for life.
snip
>Darin Johnson
>djohnson@ucsd.edu	O-
>       The opinions expressed are not necessarily those of the
>       Frobozz Magic Hacking Company, or any other Frobozz affiliates.



See, I knew programming was like Buddhism.

--
May your karma be excellent for forgiving my spelling mishaps.

Andy Askey
ajaskey@gnn.com





^ permalink raw reply	[flat|nested] 688+ messages in thread

* What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-07-20  0:00           ` Mark Eissler
  1996-07-25  0:00             ` Erik Seaberg
@ 1996-07-26  0:00             ` Tim Behrendsen
  1996-07-27  0:00               ` Rick Elbers
                                 ` (4 more replies)
  1 sibling, 5 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-07-26  0:00 UTC (permalink / raw)



Mark Eissler <tequila@interlog.com> wrote in article
<tequila-2007961836190001@tequila.interlog.com>...
> Yes, but since just about everyone else has said something I'd say follow
> this path BASIC -> Pascal -> C -> C++ -> JAVA. 

I think that is fine for the casual programmer.  If you want to be
a professional programmer, I think this is the *best* course, but
not the easiest ...

Assembly -> C [non-GUI] -> C-GUI -> C++

All the rest of the languages are variations on the same theme.
Here's my rationale ...

Assembly:   Learn what's *really* going on.  The most important.

C:          Learn structured programming.  C is close enough
            to assembly that a student can really *see* how the
            compiler translates the code to assembly, and really
            understand what languages are all about.

C-GUI:      Learn the concept of event-driven programming (which
            is all GUI is, stripped of the extraneous stuff).

C++:        Even though I think C++ is brain damaged, it is
            close enough to C that the student can see what OOP
            really is in the context of, again, assembly
            language. Assembly is without the "abstraction
            bias" that other languages have. IMO, the only
            thing OOP brings to the table is the concept of
            assigning methods to areas of memory, AKA objects.
            Non-OOP: Apply methods to memory.  OOP: Execute
            method abstractly bound to memory.  All the rest
            of the concepts of OOP derive from that. Obviously,
            I'm speaking from a "reality" point of view, not from
            the OO abstraction.

My point is there is nothing that is all that complicated
in computer science, if it can be expressed in the fundamental
components of programming, which are move, arithmetic, logicals,
test, and branch (might be something else I'm leaving out). If
a student is grounded in these fundamentals, there is nothing else
they can't learn.
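
To make that concrete, here is a minimal sketch in C (purely
illustrative, with made-up names) of one small loop written twice:
once in the usual structured form, and once spelled out in exactly
those primitives:

	#include <stdio.h>

	int main(void)
	{
	    int sum  = 0;   /* accumulator for the structured version    */
	    int sum2 = 0;   /* move: put 0 into sum2 (primitive version) */
	    int i;

	    /* the structured form */
	    for (i = 1; i <= 5; i++)
	        sum += i;

	    /* the same loop as move / test / branch / arithmetic */
	    i = 1;                     /* move: put 1 into i   */
	top:
	    if (i > 5) goto done;      /* test, then branch out */
	    sum2 = sum2 + i;           /* arithmetic            */
	    i = i + 1;                 /* arithmetic            */
	    goto top;                  /* branch back           */
	done:
	    printf("%d %d\n", sum, sum2);   /* both are 15 */
	    return 0;
	}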

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-25  0:00   ` Andy Askey
@ 1996-07-26  0:00     ` Mark Eissler
  0 siblings, 0 replies; 688+ messages in thread
From: Mark Eissler @ 1996-07-26  0:00 UTC (permalink / raw)



gwinn@res.ray.com (Joe Gwinn) writes:

>> Shouldn't we answer the man's question, without drifting into theological
>> discussions about the relative merits of various languages?  He wants to
>> find a better job, not find religion, or become a better person.  So,
>> where are the jobs?  
 
The jobs? What jobs? I don't know about your town, but around here most places
require a degree these days. That being the case, a teaching institution
and its curriculum will figure this out. Going it alone would (I suppose)
require some type of portfolio (someone correct me on this?) that could be
shown to prospective employers as proof you have a clue.

That said, C and C++ programmers are probably more in demand than anything
else, although there's also JAVA, COBOL, and Ada... See, there's still
demand for programmers that can maintain older systems (institutions don't
tend to sack their mainframes and legacy apps every couple of years).

All in all, I think a lot of people will agree that it is better to evolve
as a programmer rather than just learn ONE language and say "Here I am,
give me $80k/yr plus moving expenses!"

Or something like that.

For instance: get a book on Pascal, get a compiler, breeze through the
book, write a few utilities (apps, whatever) and then move on to the next
language.

As you learn a new language, with its own convoluted concepts, it will be
somewhat comforting to have another language to relate to.

--
Mark Eissler                      |  If something doesn't start
tequila@interlog.com              |  happening soon, I'm gonna
http://www.interlog.com/~tequila/ |  take up Yak farming!






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-23  0:00             ` John A Hughes
@ 1996-07-26  0:00               ` Randy Kaelber
  1996-07-29  0:00                 ` Ralph Silverman
  1996-08-06  0:00                 ` StHeller
  0 siblings, 2 replies; 688+ messages in thread
From: Randy Kaelber @ 1996-07-26  0:00 UTC (permalink / raw)



John A Hughes (jah@cais.cais.com) wrote:
> I think classroom-style instruction for computer science is bogus. The
> typical ratio of lecture to lab should be inverted in my opinion-- 3
> labs to each lecture. And lectures should introduce the labs and then
> discuss problems in the labs, and the broader significance of problems
> people had or differences in various solutions, NOT the other way
> around, where labs are some kind of vestigial "illustration" of lecture
> concepts.

I think I agree with the concepts you're suggesting, but I still feel 
there is a need for lecture... maybe 2 lec/3 lab, especially in more 
concentrated classes (databases, operating systems, automata, compilers, 
blah blah blah). But I am absolutely tired of having 8 or 9 dinky little 
assignments through a semester. Give me one or two BIG projects and make 
it half my grade.

> I also think at more advanced levels the labs should really teach
> collaborative programming, where everyone must work on a large
> project, everyone must discuss how the project should be broken up,
> and maybe teams should have to trade software components during
> implementation to teach them how to write maintainable code.

YES! YES! YES! YES!!! Absolutely! Very little software development 
happens in a vacuum. The team concept should be highly stressed. Getting 
a certificate in computer programming from Sally Struthers does not a 
computing professional make. One needs to be able to maintain code and be 
able to write maintainable code. I think that's why new programmers get 
maintenance jobs... so they can see what maintainable and unmaintainable 
code looks like. Plus, all the experienced people want to develop new 
systems, not maintain old code. :)

> This kind of class would be very hard to design and teach, and would
> be a total blast.

God knows I'd love to try it! Teach it, take it, I don't care.

--
Randy Kaelber:  kaelbers@muohio.edu
DARS Programmer/Analyst, Miami University, Oxford, OH 45056 USA
http://avian.dars.muohio.edu/~kaelbers/

Unsolicited commercial E-mail will be spell checked for a fee of $50.
Sending such mail constitutes acceptance of these terms. 





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-07-26  0:00             ` What's the best language to start with? [was: Re: Should I learn C or Pascal?] Tim Behrendsen
@ 1996-07-27  0:00               ` Rick Elbers
  1996-07-28  0:00                 ` Mark Eissler
  1996-07-28  0:00                 ` J. Christian Blanchette
  1996-07-28  0:00               ` What's the best language to start with? [was: Re: Should I learn C or Pascal?] Robert Dewar
                                 ` (3 subsequent siblings)
  4 siblings, 2 replies; 688+ messages in thread
From: Rick Elbers @ 1996-07-27  0:00 UTC (permalink / raw)



"Tim Behrendsen" <tim@airshields.com> wrote:

>Mark Eissler <tequila@interlog.com> wrote in article
><tequila-2007961836190001@tequila.interlog.com>...
>> Yes, but since just about everyone else has said something I'd say follow
>> this path BASIC -> Pascal -> C -> C++ -> JAVA. 

>I think that is fine for the casual programmer.  If you want to be
>a professional programmer, I think this is the *best* course, but
>not the easiest ...

>Assembly -> C [non-GUI] -> C-GUI -> C++

>All the rest of the languages are variations on the same theme.
>Here's my rationale ...

>Assembly:   Learn what's *really* going on.  The most important.
I fully agree. I did not get the *concept* (I do not mean just the
use) of pointers and references down without an understanding of
assembly. I wonder how other C++ers do this....


>C:          Learn structured programming.  C is close enough
>            to assembly that a student can really *see* how the
>            compiler translates the code to assembly, and really
>            understand what languages are all about.

>C-GUI:      Learn the concept of event-driven programming (which
>            is all GUI is, stripped of the extraneous stuff).

>C++:        Even though I think C++ is brain damaged, it is
>            close enough to C that the student can see what OOP
>            really is in the context of, again, assembly
>            language. Assembly is without the "abstraction
>            bias" that other languages have. IMO, the only
>            thing OOP brings to the table is the concept of
>            assigning methods to areas of memory, AKA objects.
>            Non-OOP: Apply methods to memory.  OOP: Execute
>            method abstractly bound to memory.  All the rest
>            of the concepts of OOP derive from that. Obviously,
>            I'm speaking from a "reality" point of view, not from
>            the OO abstraction.

>My point is there is nothing that is all that complicated
>in computer science, if it can be expressed in the fundamental
>components of programming, which are move, arithmetic, logicals,
>test, and branch (might be something else I'm leaving out). If
>a student is grounded in these fundamentals, there is nothing else
>they can't learn.
Yep, this approach is IMHO the very best way to understand almost
every *over-mystified* topic (though later on you might just switch
the other way around... and consider it "nothing compared to......").
But when radically following this lead I think you can just as well start
with BASIC to get the hang of it (since all the concepts of
branch, test, arithmetic, move and logic are there also...)

rick
>-- Tim Behrendsen (tim@airshields.com)

Rick Elbers
e-mail: rick.elbers@tip.nl






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-07-27  0:00               ` Rick Elbers
  1996-07-28  0:00                 ` Mark Eissler
@ 1996-07-28  0:00                 ` J. Christian Blanchette
  1996-07-28  0:00                   ` Robert Dewar
                                     ` (9 more replies)
  1 sibling, 10 replies; 688+ messages in thread
From: J. Christian Blanchette @ 1996-07-28  0:00 UTC (permalink / raw)



> >Assembly -> C [non-GUI] -> C-GUI -> C++

This is really crazy!

It's maybe interesting to understand how arguments are passed on the 
stack, but I strongly believe that simple high-level languages must be 
learned first, and more complex ones after.  Since there is a performance 
vs. simplicity tradeoff all along, the desire for performance (usually for 
graphical applications) makes people learn lower level languages.  I know 
many people, including myself, who made the Basic/C step for 
performance.

C must be learned before C++; that's a given, since C++ is a much more 
complex superset.

I see no reason to learn GUI programming before OOP: to my mind 
they're not related at all.  I've never done any GUI app, but the concept 
of multiple entry points is easy to understand, as well as that of 
object-orientedness (which can be found even in C programs, although 
C++/Java are more adequate).
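
For instance, a minimal sketch of the OO idea in plain C (just an
illustration, with hypothetical names) is a struct whose data travels
together with a function that operates on it:

	#include <stdio.h>

	/* a hypothetical "class": data plus an operation bound to it */
	struct counter {
	    int value;
	    void (*bump)(struct counter *self);  /* method kept as a function pointer */
	};

	static void bump_by_one(struct counter *self)
	{
	    self->value = self->value + 1;
	}

	int main(void)
	{
	    struct counter c = { 0, bump_by_one };
	    c.bump(&c);                /* "send a message" to the object */
	    c.bump(&c);
	    printf("%d\n", c.value);   /* prints 2 */
	    return 0;
	}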

Understanding the machine architecture is one thing, using assembly 
languages is another.  There's no real interest in knowing all the 
mnemonics of a particular assembly language for a C coder: knowing how 
stacks work or how system calls are performed is enough to make efficient 
C programs.

Jas.





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-07-27  0:00               ` Rick Elbers
@ 1996-07-28  0:00                 ` Mark Eissler
  1996-07-28  0:00                 ` J. Christian Blanchette
  1 sibling, 0 replies; 688+ messages in thread
From: Mark Eissler @ 1996-07-28  0:00 UTC (permalink / raw)




> "Tim Behrendsen" <tim@airshields.com> wrote:
> 
> >Mark Eissler <tequila@interlog.com> wrote in article
> ><tequila-2007961836190001@tequila.interlog.com>...
> >> Yes, but since just about everyone else has said something I'd say follow
> >> this path BASIC -> Pascal -> C -> C++ -> JAVA. 
> 
> >I think that is fine for the casual programmer.  If you want to be
> >a professional programmer, I think this is the *best* course, but
> >not the easiest ...
> 
> >Assembly -> C [non-GUI] -> C-GUI -> C++
> 
> >All the rest of the languages are variations on the same theme.

I would still argue that some form of BASIC should be taken as a
preliminary step to any programming. Going from nothing into Assembly is
suicidal. Something as simple as a loop tends to appear 8 billion times
more complicated than the same thing in BASIC. Not only that, but you
can't argue that ASM isn't more cryptic than any high level language. 

Although I've only dabbled in Assembly (I do intend to crack this beast!),
I don't think it's a great starting point. Possibly a next step. Structure
is more clearly learned from BASIC (writing sequentially executed
programs). ASM code looks like it jumps around an awful lot (because it
does).

I think it can be agreed, though, that learning to program is a
progression. You can't just pick up one book one day and expect to write
the next killer app the day after.

My first programming experiences began with a ZX-81 about 15 or 16 years
ago. I must have learned at least 4 or 5 different interpretations of
BASIC after that before moving on to dBASE, C (got confused), Pascal
(needed it to understand the developer docs. provided by Apple for the
Mac), C again (Pascal knowledge helped me real big with this), C++. I'm
still dealing with C++ right now. Plan to learn JAVA next (don't feel too
pressured though) and want to get back into learning ASM. I think I might
add Ada to my list of things to do. 

I guess there are arguments that you don't need to learn a billion
languages in order to learn how to program. Having spent a great deal of
time learning this  (without too much formal instruction -- 'cause I
really suck at the math courses they always make mandatory at
Universities) I think the progression from one language to the next has
only helped me.

--
Mark Eissler                      |  If something doesn't start
tequila@interlog.com              |  happening soon, I'm gonna
http://www.interlog.com/~tequila/ |  take up Yak farming!






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-07-26  0:00             ` What's the best language to start with? [was: Re: Should I learn C or Pascal?] Tim Behrendsen
  1996-07-27  0:00               ` Rick Elbers
@ 1996-07-28  0:00               ` Robert Dewar
  1996-07-29  0:00                 ` Tim Behrendsen
                                   ` (3 more replies)
  1996-07-29  0:00               ` Byron B. Kauffman
                                 ` (2 subsequent siblings)
  4 siblings, 4 replies; 688+ messages in thread
From: Robert Dewar @ 1996-07-28  0:00 UTC (permalink / raw)



Tim recommends

"Assembly -> C [non-GUI] -> C-GUI -> C++

All the rest of the languages are variations on the same theme.
Here's my rationale ...

Assembly:   Learn what's *really* going on.  The most important."


I strongly disagree. A student treading this path will have a hard time
learning what abstraction is all about. Teaching abstraction is the
biggest challenge in teaching at an elementary level. It is possible
to recover from a (mis)education that follows the path above, but
very difficult for most people.





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-07-28  0:00                 ` J. Christian Blanchette
@ 1996-07-28  0:00                   ` Robert Dewar
  1996-07-29  0:00                   ` Tim Behrendsen
                                     ` (8 subsequent siblings)
  9 siblings, 0 replies; 688+ messages in thread
From: Robert Dewar @ 1996-07-28  0:00 UTC (permalink / raw)



Jas says

"C must me learned before C++, that's a point since C++ is a really more
complex superset."

This is a mistake. Learning C before C++ makes it hard to understand
what C++ is all about. I still see huge quantities of C++ code clearly
written by C programmers, and it is basically C++ written in C style.

It is much more effective to teach C++ first, at an appropriate level
of abstraction, and then later on you can introduce the low level
features of C++ that are inherited from C, and are occasionally
appropriate for low level code. 





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-26  0:00               ` Randy Kaelber
@ 1996-07-29  0:00                 ` Ralph Silverman
  1996-08-06  0:00                 ` StHeller
  1 sibling, 0 replies; 688+ messages in thread
From: Ralph Silverman @ 1996-07-29  0:00 UTC (permalink / raw)



Randy Kaelber (kaelbers@avian.dars.muohio.edu) wrote:
: John A Hughes (jah@cais.cais.com) wrote:
: > I think classroom-style instruction for computer science is bogus. The
: > typical ratio of lecture to lab should be inverted in my opinion-- 3
: > labs to each lecture. And lectures should introduce the labs and then
: > discuss problems in the labs, and the broader significance of problems
: > people had or differences in various solutions, NOT the other way
: > around, where labs are some kind of vestigial "illustration" of lecture
: > concepts.

: I think I agree with the concepts you're suggesting, but I still feel 
: there is a need for lecture... maybe 2 lec/3 lab, especially in more 
: concentrated classes (databases, operating systems, automata, compilers, 
: blah blah blah). But I am absolutely tired of having 8 or 9 dinky little 
: assignments through a semester. Give me one or two BIG projects and make 
: it half my grade.

: > I also think at more advanced levels the labs should really teach
: > collaborative programming, where everyone must work on a large
: > project, everyone must discuss how the project should be broken up,
: > and maybe teams should have to trade software components during
: > implementation to teach them how to write maintainable code.

: YES! YES! YES! YES!!! Absolutely! Very little software development 
: happens in a vacuum. The team concept should be highly stressed. Getting 
: a certificate in computer programming from Sally Struthers does not a 
: computing professional make. One needs to be able to maintain code and be 
: able to write maintainable code. I think that's why new programmers get 
: maintenance jobs... so they can see what maintainable and unmaintainable 
: code looks like. Plus, all the experienced people want to develop new 
: systems, not maintain old code. :)

: > This kind of class would be very hard to design and teach, and would
: > be a total blast.

: God knows I'd love to try it! Teach it, take it, I don't care.

: --
: Randy Kaelber:  kaelbers@muohio.edu
: DARS Programmer/Analyst, Miami University, Oxford, OH 45056 USA
: http://avian.dars.muohio.edu/~kaelbers/

: Unsolicited commercial E-mail will be spell checked for a fee of $50.
: Sending such mail constitutes acceptance of these terms. 


--
*************begin r.s. response***************

	done both...
	if the career you are looking at
	involves working in groups...
	then there is plenty of time
	for that later...
	would suggest...
	enjoy programming by,
	and for,
	yourself...
	while you can...

*************end r.s. response*****************
Ralph Silverman
z007400b@bcfreenet.seflin.lib.fl.us





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-07-26  0:00             ` What's the best language to start with? [was: Re: Should I learn C or Pascal?] Tim Behrendsen
  1996-07-27  0:00               ` Rick Elbers
  1996-07-28  0:00               ` What's the best language to start with? [was: Re: Should I learn C or Pascal?] Robert Dewar
@ 1996-07-29  0:00               ` Byron B. Kauffman
  1996-07-30  0:00               ` Alan Peake
       [not found]               ` <dewar. <peake.206.002D549F@dstos3.dsto.gov.au>
  4 siblings, 0 replies; 688+ messages in thread
From: Byron B. Kauffman @ 1996-07-29  0:00 UTC (permalink / raw)



Tim Behrendsen wrote:

snip...
> 
> C:          Learn structured programming.  C is close enough
>             to assembly that a student can really *see* how the
>             compiler translates the code to assembly, and really
>             understand what languages are all about.
> 

Excuse me?  IMHO, 'C' and 'structured programming' in the same sentence
constitutes an oxymoron. I would think that in a learning environment,
you would want to teach the student good habits, not how to hack. Once
you've taught them something about structured programming (forget BASIC
and Pascal - use Ada - at least you can get a job using it), you can
THEN give them a look at the 'down and dirty' side of the business.

-- 
Byron Kauffman




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-07-28  0:00                 ` J. Christian Blanchette
  1996-07-28  0:00                   ` Robert Dewar
@ 1996-07-29  0:00                   ` Tim Behrendsen
  1996-07-30  0:00                     ` Arra Avakian
  1996-07-31  0:00                   ` Patrick Horgan
                                     ` (7 subsequent siblings)
  9 siblings, 1 reply; 688+ messages in thread
From: Tim Behrendsen @ 1996-07-29  0:00 UTC (permalink / raw)



J. Christian Blanchette <jblanc@ivic.qc.ca> wrote in article
<31FBC584.4188@ivic.qc.ca>...
> > >Assembly -> C [non-GUI] -> C-GUI -> C++
> 
> This is really crazy!
> 
> It's maybe interesting to understand how arguments are passed on the 
> stack, but I strongly believe that simple high-level languages must be 
> learned first, and more complex ones after.  Since there is a performance 
> vs. simplicity tradeoff all along, the desire for performance (usually for 
> graphical applications) makes people learn lower level languages.  I know

> many people, including myself, who made the Basic/C step for 
> performance.

The point isn't that you're actually going to use it day-to-day; the point is
to really understand what's going on in terms of the fundamentals.  People
who program in C who do not understand assembly are dangerous
because they don't truly understand what the compiler is producing.

> I see no reason to learn GUI programming before OOP: to my mind 
> they're not related at all.  I've never done any GUI app, but the concept 
> of multiple entry points is easy to understand, as well as that of 
> object-orientedness (which can be found even in C programs, although 
> C++/Java are more adequate).

I could go either way on this one; I chose GUI first because the student
can understand event-driven programming without the added baggage of OOP
abstractions.

> Understanding the machine architecture is one thing, using assembly 
> languages is another.  There's no real interest in knowing all the 
> mnemonics of a particular assembly language for a C coder: knowing how 
> stacks work or how system calls are performed is enough to make efficient

> C programs.

The point isn't to learn a specific one; I don't think that really matters.  The
student needs *something* to play on.  The point is to see what's really
going on, to see the bytes flow in and out of memory locations, to really
experience that it's only a big table of bytes that you're moving around.

The most important thing any student can learn is the stripping away
of the shroud of abstractions, and seeing the simplicity of what's
really underneath.  Once they get that, all the rest of it comes naturally.
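
One way to let a student see exactly that (a minimal sketch, assuming
a typical byte-addressed machine; the byte order it prints depends on
the hardware) is to look at an ordinary int through an unsigned char
pointer:

	#include <stdio.h>

	int main(void)
	{
	    int n = 1000;                            /* some value sitting in memory    */
	    unsigned char *p = (unsigned char *)&n;  /* the same storage, seen as bytes */
	    unsigned i;

	    for (i = 0; i < sizeof n; i++)           /* walk the bytes one at a time    */
	        printf("byte %u = %02x\n", i, (unsigned)p[i]);
	    return 0;
	}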

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-07-28  0:00               ` What's the best language to start with? [was: Re: Should I learn C or Pascal?] Robert Dewar
@ 1996-07-29  0:00                 ` Tim Behrendsen
  1996-07-30  0:00                   ` Paul Campbell
                                     ` (3 more replies)
  1996-08-06  0:00                 ` What's the best language to start with? [was: Re: Should I learn C or Pascal?] Alf P. Steinbach
                                   ` (2 subsequent siblings)
  3 siblings, 4 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-07-29  0:00 UTC (permalink / raw)



Robert Dewar <dewar@cs.nyu.edu> wrote in article
<dewar.838609515@schonberg>...
> Tim recommends
> 
> "Assembly -> C [non-GUI] -> C-GUI -> C++
> 
> All the rest of the languages are variations on the same theme.
> Here's my rationale ...
> 
> Assembly:   Learn what's *really* going on.  The most important."
> 
> I strongly disagree. A student treading this path will have a hard time
> learning what abstraction is all about. Teaching abstraction is the
> biggest challenge in teaching at an elementary level. It is possible
> to recover from a (mis)education that follows the path above, but
> very difficult for most people.

I arrived at this conclusion based on the students that were coming
to me to be hired.  I give a standard test to all my applicants that
tests two primary attributes:

1) How well they understand what's *really* going on; the best
programmers have a solid foundation.
2) How well they can take a problem they've (probably) never seen
before and generate a solution.

I was shocked at the results.  I had people with Masters and
Doctorates who were completely incapable of creating new solutions
that they had never seen before.  I had people, with *degrees* now,
tell me "convert argument to binary" as one of the steps on a
logical operation problem!  The latter are people who are grounded
in the "abstraction" of an integer, but are completely clueless
that the computer works in binary. How can a student get a
full-blown degree, and not understand a computer works in binary?
It's like graduating someone with a writing degree who is
illiterate.
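
To make the point concrete, here is a minimal C sketch (mine, not
taken from the test in question): a bitwise operation acts on the
stored bits directly, so there simply is no "convert to binary" step:

	#include <stdio.h>

	int main(void)
	{
	    unsigned flags = 0x5A;         /* the value is already stored as bits      */
	    unsigned mask  = 0x0F;
	    unsigned low   = flags & mask; /* the AND works on those bits directly     */

	    printf("%02X\n", low);         /* prints 0A -- no conversion step anywhere */
	    return 0;
	}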

This is what convinced me that our CS education is being done
completely wrong.  I've said this before [excuse my repetition];
it would be as if EE students were taught IC design in the first
course, and were only given resistors, capacitors, Ohm's law,
etc. in their senior year, almost as an afterthought!

Bottom line, a student cannot fundamentally understand an
abstraction until they understand the abstraction
in terms of the fundamental components of programming; again:
Move, Arithmetic, Test, Branch, and Logicals.

CS is currently taught the way you describe.  Why do you think
it's so hard to teach abstractions at the elementary level?
It's because the students get so wrapped up in the fancy
terminology, they don't see that it's really all smoke and
mirrors, and the concepts are very, very basic.  If it's
expressed in fundamental ways, it's very easy to see what
abstractions are really saying in *real* terms.

I have seen it!  Teaching CS the way you describe it is
producing brain damage of the highest degree.

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn
  1996-07-29  0:00                 ` Tim Behrendsen
  1996-07-30  0:00                   ` Paul Campbell
@ 1996-07-30  0:00                   ` TRAN PHAN ANH
  1996-07-31  0:00                   ` What's the best language to start with? [was: Re: Should I learn C or Pascal?] Arne W. Flones
  1996-08-02  0:00                   ` David Wheeler
  3 siblings, 0 replies; 688+ messages in thread
From: TRAN PHAN ANH @ 1996-07-30  0:00 UTC (permalink / raw)



In article <01bb7da1$323102a0$96ee6fcf@timhome2>, "Tim Behrendsen" <tim@airshields.com> writes:
> 
> 1) How well they understand what's *really* going on; the best
> programmers have a solid foundation.
> 2) How well they can take a problem they've (probably) never seen
> before and generate a solution.

[snip]
 
> Bottom line, a student cannot fundamentally understand an
> abstraction until they understand the abstraction
> in terms of the fundamental components of programming; again:
> Move, Arithmetic, Test, Branch, and Logicals.
> 
> CS is currently taught the way you describe.  Why do you think
> it's so hard to teach abstractions at the elementary level?
> It's because the students get so wrapped up in the fancy
> terminology, they don't see that it's really all smoke and
> mirrors, and the concepts are very, very basic.  If it's
> expressed in fundamental ways, it's very easy to see what
> abstractions are really saying in *real* terms.

Hoola...clap clap clap...:-)  What's the reason for abstraction?  We 
abstract in order to hide details.  But if we do not know what the 
details are, how in heaven can we abstract them 'away' ? :-)

Anh

 > -- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-07-26  0:00             ` What's the best language to start with? [was: Re: Should I learn C or Pascal?] Tim Behrendsen
                                 ` (2 preceding siblings ...)
  1996-07-29  0:00               ` Byron B. Kauffman
@ 1996-07-30  0:00               ` Alan Peake
       [not found]               ` <dewar. <peake.206.002D549F@dstos3.dsto.gov.au>
  4 siblings, 0 replies; 688+ messages in thread
From: Alan Peake @ 1996-07-30  0:00 UTC (permalink / raw)





>it would be as if EE students were taught IC design in the first
>course, and were only given resistors, capacitors, Ohm's law,
>etc. in their senior year, almost as an afterthought!

Well, it may be going that way. Most of the logic design in my department is 
now done in a Hardware Description Language;  no-one needs to know about 
gates and counters anymore. All they need to know about capacitors is what 
types to use for bypassing the gate array chips. This level of hardware 
abstraction is a bit like OO in programming. The class libraries will contain 
most functions that the programmer is likely to need in much the same way as 
the HDL elements are the building blocks for the array chip.

Sure, there is a need for a few people to understand the nuts and bolts, but 
these few will be writing the libraries and designing the silicon i.e., making 
the tools. As long as the rest of us can use the tools, what does it matter 
how they work?

Alan







^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-07-30  0:00                   ` Paul Campbell
@ 1996-07-30  0:00                     ` Robert Dewar
  1996-08-02  0:00                       ` Tim Behrendsen
  1996-08-03  0:00                     ` Patrick Horgan
  1 sibling, 1 reply; 688+ messages in thread
From: Robert Dewar @ 1996-07-30  0:00 UTC (permalink / raw)



Paul said

"We have had CS students on placement here after 2 years of study who
didn't even understand hexedecimal !."

Perhaps they might have understood hexadecimal
                                      ^

yes, yes, I know, I make more spelling errors than most on newsgroup
posts because I type in furiously and do not bother to correct, but
this one was hard to resist :-)

I actually don't find it so terrible for CS students with two years
of study not to know hexadecimal notation. I would be more disturbed
if they did not know and understand what an abstract data type was,
or what call by reference means.





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-07-29  0:00                   ` Tim Behrendsen
@ 1996-07-30  0:00                     ` Arra Avakian
  1996-07-31  0:00                       ` James Youngman
                                         ` (3 more replies)
  0 siblings, 4 replies; 688+ messages in thread
From: Arra Avakian @ 1996-07-30  0:00 UTC (permalink / raw)



In article <01bb7da2$6c505ac0$96ee6fcf@timhome2>, "Tim Behrendsen" <tim@airshields.com> wrote:
..
>The most important thing any student can learn is the stripping away
>of the shroud of abstractions, and seeing the simplicity of what's
>really underneath.  Once they get that, all the rest of it comes naturally.
>

I can see both sides of this issue: the importance of understanding 
abstractions, and the importance of understanding what is underneath.
I think that a programmer needs to have a model of how a computation occurs in 
order to understand issues such as time and space efficiency of the 
computation. On the other hand, understanding the abstractions presented by 
both language constructs and by APIs is absolutely critical to being able to 
deal with complex systems. A curriculum that does not accomplish the learning 
of computing abstractions fails. It's hard for me to judge the "best" approach 
for someone learning now, when I have had the experience of learning gradually 
over a lifetime.

I learned programming in the summer between my freshman and sophomore years in 
high school (1960!!). I was taught two "languages" that summer: IBM 1620 
assembly language and FORTRAN II. I do recall being mystified and puzzled by 
the concept of how FORTRAN source code could "execute", but could readily 
"grok" how a dumb machine could blindly execute machine code. It wasn't until 
I understood the concepts behind a compiler that the mystery faded and I could 
accept the abstraction of FORTRAN. What is interesting in hindsight was that 
the concept of an assembler did not cause any mystification - its role as a 
translator was "obvious", but the role of a compiler was definitely not 
obvious. The foundation of understanding the concepts of a stored program 
computer was essential for me to understand anything else.

I still to this day need to understand the execution model of an abstraction 
in order to "really" understand it. I guess my character is to be suspicious 
of the mystery, and not be able to take it on "faith".

Arra Avakian
Intermetrics, Inc.
733 Concord Avenue
Cambridge, Massachusetts 02138
USA
(617) 661-1840
arra@inmet.com




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-22  0:00               ` Robert Dewar
@ 1996-07-30  0:00                 ` Tim Behrendsen
  1996-07-31  0:00                 ` Patrick Horgan
  1 sibling, 0 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-07-30  0:00 UTC (permalink / raw)



Robert Dewar <dewar@cs.nyu.edu> wrote in article
<dewar.838067532@schonberg>...
> Tim Oxler quoted:
> 
> Sentry Market Research surveyed 700 IS mangers what language they used
> for client/server application development:
> 
> Visual Basic    23%
> Cobol           21%
> C++             18%
> C               15%
> 
> and note that client server applications probably have a lower percentage
> of COBOL than all applications, because there are still lots of 
> traditional batch programs being generated in IS shops in COBOL.

I wonder how many of those C++ people actually use the language
features beyond // comments?  The reason I ask this is I see so
many resumes with "C/C++" listed on it (which is a dead giveaway,
since they are very different languages) but when you ask them about
it, they admit "well, I read a book about it.  My previous positions
didn't actually use it".  That makes me think that they people who
are answering this survey checked the C++ box to be "hip".  "Well,
I'm *really* going to crack that C++ book next week and convert
my 100,000 lines over, so I might as well check the box."

I sort of doubt that C++ is being used more than straight C in
anything more than a trivial way.

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-07-29  0:00                 ` Tim Behrendsen
@ 1996-07-30  0:00                   ` Paul Campbell
  1996-07-30  0:00                     ` Robert Dewar
  1996-08-03  0:00                     ` Patrick Horgan
  1996-07-30  0:00                   ` What's the best language to start with? [was: Re: Should I learn TRAN PHAN ANH
                                     ` (2 subsequent siblings)
  3 siblings, 2 replies; 688+ messages in thread
From: Paul Campbell @ 1996-07-30  0:00 UTC (permalink / raw)



>>>>>>
I was shocked at the results.  I had people with Masters and
Doctorates who were completely incapable of creating new solutions
that they had never seen before.  I had people, with *degrees* now,
tell me "convert argument to binary" as one of the steps on a
logical operation problem!  The latter are people who are ground
in the "abstraction" of an integer, but are completely clueless
that the computer works in binary. How can a student get a
full-blown degree, and not understand a computer works in binary?
It's like graduating someone with a writing degree who is
illiterate.
<<<<<<<

We have had CS students on placement here after 2 years of study who
didn't even understand hexedecimal !.

Paul C.
UK.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-25  0:00 ` ++           robin
@ 1996-07-30  0:00   ` Robert Barnes
  1996-07-30  0:00     ` Rob(t.) Brannan
  0 siblings, 1 reply; 688+ messages in thread
From: Robert Barnes @ 1996-07-30  0:00 UTC (permalink / raw)



++ robin wrote:
> 
>         dewar@cs.nyu.edu (Robert Dewar) writes:
> 
>         >"An algorithm is just a set of instructions on how to accomplish a task. An
>         >algorithm does not depend at all on any sort of programming "language".

True, but the way that you express an algorithm is at least partly conditioned 
by your proposed implementation.  Not that there's anything wrong with this 
IMHO.  I used to use a mixture of PL/I and English - where PL/I was at least 
as concise, then I saw no point in writing "English" just to be "not a 
programming language".  The same goes if you're going to be expressing it in 
C, or Pascal, or whatever.  

The rule for pseudocode should be "use the clearest communication to your audience".  Who is your audience?  
Yourself, and your colleagues.  There needs to be some degree of consensus with your colleagues (= documentation 
standards), but these needn't be rigid.  If your task was doing something with valid account data (let's 
assume that this is properly understood in this context), then in my programming group I'd have allowed forms 
such as:-
	DO for every valid account
	   xxxxx
	ENDDO
and
	FOR EVERY valid account DO
	   xxxxx
	END
and variations on these - LOOP, FOR ALL, WITH EACH ...   whatever.  As long as it's clear and unequivocal.

Pseudocode should summarize.  Thus in the example given:-
> 
>         >     Loop until variable is equal to 7
>         >        Add 1 to variable
>         >        Print out the following string to screen: ;-)
>         >        end of loop

the introduction of "variable" is unnecessary, and has only added to the confusion.  Hence the subsequent 
discussion which has generated more heat than light. In fact, like Robin, at this level I'd have written the 
PL/I directly, but if I did feel the need for pseudocode (because there was some real complexity to the logic 
within the loop), I'd probably have written something like:-
	DO 7 times
	  xxxx
The whole point about successive refinement is to avoid getting bogged down in detail that is irrelevant to the 
level that you're designing.
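
For illustration, a hypothetical C rendering of the "DO for every valid
account" form above (the type and helper names are invented for the example):

    #include <stddef.h>
    #include <stdio.h>

    struct account { double balance; int valid; };   /* invented for the example */

    static int  account_is_valid(const struct account *a) { return a->valid; }
    static void process_account(const struct account *a) { printf("%.2f\n", a->balance); }

    /* the loop is the pseudocode, line for line; everything the pseudocode
       deliberately leaves out stays hidden inside the two helpers */
    static void process_valid_accounts(const struct account *acct, size_t n)
    {
        size_t i;
        for (i = 0; i < n; i++)
            if (account_is_valid(&acct[i]))
                process_account(&acct[i]);
    }

    int main(void)
    {
        struct account book[2] = { { 10.50, 1 }, { 0.0, 0 } };
        process_valid_accounts(book, 2);
        return 0;
    }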




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-30  0:00   ` Robert Barnes
@ 1996-07-30  0:00     ` Rob(t.) Brannan
  1996-08-01  0:00       ` Tony Konashenok
  1996-08-01  0:00       ` ++           robin
  0 siblings, 2 replies; 688+ messages in thread
From: Rob(t.) Brannan @ 1996-07-30  0:00 UTC (permalink / raw)



As a first language Pascal is the way to go for many reasons.

C is very complicated as a first language (libraries, pointers, numerous
low level routines), not to mention some of the errors that you will
encounter are not exactly well covered in any book.

As a newbie, remember trying to figure out what's wrong when an
included file or library was just left out!
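
A hypothetical example of the sort of thing meant here (the file name is
invented); the program below is correct as written -- the point is what
happens when the one marked line is forgotten:

    /* hello_sqrt.c */
    #include <stdio.h>
    #include <math.h>   /* leave this line out and, on a classic C compiler,
                           sqrt() gets an implicit int declaration: the program
                           may still compile, then quietly print garbage --
                           baffling the first time a newcomer sees it */

    int main(void)
    {
        printf("%f\n", sqrt(2.0));
        return 0;
    }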




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-07-30  0:00                     ` Arra Avakian
@ 1996-07-31  0:00                       ` James Youngman
  1996-07-31  0:00                       ` Stephen M O'Shaughnessy
                                         ` (2 subsequent siblings)
  3 siblings, 0 replies; 688+ messages in thread
From: James Youngman @ 1996-07-31  0:00 UTC (permalink / raw)



In article <DvD01C.4v0.0.-s@inmet.camb.inmet.com>, arra@inmet.com says...
>
>In article <01bb7da2$6c505ac0$96ee6fcf@timhome2>, "Tim Behrendsen" 
<tim@airshields.com> wrote:
>.
>>The most important thing any student can learn is the stripping away
>>of the shroud of abstractions, and seeing the simplicity of what's
>>really underneath.  Once they get that, all the rest of it comes naturally.
>>
>
>I can see both sides of this issue: the importance of understanding 
>abstractions, and the importance of understanding what is underneath.
>I think that a programmer needs to have a model of how a computation occurs in 
>order to understand issues such as time and space efficiency of the 
>computation. 

This IMHO makes Knuth's "The Art Of Computer Programming" indispensable 
even now.  He teaches many algorithms but talks __all the way through__
about the mechanics of the machine.  He shows the student the wood and the 
trees simultaneously.





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-07-29  0:00                 ` Tim Behrendsen
  1996-07-30  0:00                   ` Paul Campbell
  1996-07-30  0:00                   ` What's the best language to start with? [was: Re: Should I learn TRAN PHAN ANH
@ 1996-07-31  0:00                   ` Arne W. Flones
  1996-08-02  0:00                   ` David Wheeler
  3 siblings, 0 replies; 688+ messages in thread
From: Arne W. Flones @ 1996-07-31  0:00 UTC (permalink / raw)



"Tim Behrendsen" <tim@airshields.com> wrote:
[snip]
>Why do you think
>it's so hard to teach abstractions at the elementary level?
>It's because the students get so wrapped up in the fancy
>terminology, they don't see that it's really all smoke and
>mirrors, and the concepts are very, very basic.  If it's
>expressed in fundamental ways, it's very easy to see what
>abstractions are really saying in *real* terms.

I agree.  My first computer was one of those single board computers
that you programmed through a hexadecimal keypad.  The display was a 6
digit hexadecimal LED display.  It had all of 1024 bytes of memory and
a 2K monitor ROM.  I cut my teeth programming it in hexadecimal
machine language. (Who needs an assembler? :-)  That knowledge has
served me well for the two decades since.

Beneath the complexities of a modern programming language is a very
simple thing--a CPU that doesn't do much, but does it very fast.  The
modern processor is more complex than those early ones, but not much.
It is extremely helpful to understand what the processor is going to
do when you, for instance, add two numbers together.

Computers are not magic.  They are very simple things that can be
easily understood.  When one realizes that all that happens inside a
computer is simple addition, subtraction, bit-shifts, bit-testing and
jumping, all suddenly becomes clear.
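
A hypothetical illustration of that point: even multiplication reduces to the
shift/test/add/jump primitives named above.

    #include <stdio.h>

    /* multiply two unsigned numbers using only shifts, bit tests and
       additions -- the classic "shift and add" method */
    static unsigned long mul(unsigned long a, unsigned long b)
    {
        unsigned long product = 0;

        while (b != 0) {          /* test, and jump out when done */
            if (b & 1)            /* bit-test the low bit of b    */
                product += a;     /* addition                     */
            a <<= 1;              /* shift left                   */
            b >>= 1;              /* shift right                  */
        }
        return product;
    }

    int main(void)
    {
        printf("%lu\n", mul(6, 7));   /* prints 42 */
        return 0;
    }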

Regards,
Arne
flonesaw@netonecom.net





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
       [not found]               ` <dewar. <peake.206.002D549F@dstos3.dsto.gov.au>
  1996-07-31  0:00                 ` P. Cnudde VH14 (8218)
@ 1996-07-31  0:00                 ` Tim Behrendsen
  1996-07-31  0:00                 ` Stephen M O'Shaughnessy
  2 siblings, 0 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-07-31  0:00 UTC (permalink / raw)



Alan Peake <peake@dstos3.dsto.gov.au> wrote in article
<peake.206.002D549F@dstos3.dsto.gov.au>...
> >it would be as if EE students were taught IC design in the first
> >course, and were only given resistors, capacitors, Ohm's law,
> >etc. in their senior year, almost as an afterthought!
> 
> Well, it may be going that way. Most of the logic design in my department is 
> now done in a Hardware Description Language;  no-one needs to know about 
> gates and counters anymore. All they need to know about capacitors is what 
> types to use for bypassing the gate array chips. This level of hardware 
> abstraction is a bit like OO in programming. The class libraries will contain 
> most functions that the programmer is likely to need in much the same way as 
> the HDL elements are the building blocks for the array chip.
> 
> Sure, there is a need for a few people to understand the nuts and bolts, but 
> these few will be writing the libraries and designing the silicon i.e., making 
> the tools. As long as the rest of us can use the tools, what does it matter 
> how they work?

Hmmm... the analogy I would use is the people who string together prefab
components are the programming equivalent of VB programmers, who string
together prefab dialogs and glue them together with some BASIC code.

The VB programmers are not going to be the compiler designers or O/S
builders (at that stage in their career, anyway) and the HDL people
are not going to be the Microprocessor designers. Certainly there is room
in the world for high level and low level people.

Also, software design is usually a much more difficult science, since the
"flow of execution" is much slower than the flow of electrons, so
inefficiency at the circuit level is not as critical.  If circuit
propagation times are important, a company will still need to call in the
nuts and bolts people to take care of business.

The latter BTW (off the subject) is my theory of why we have so many
standardized components in EE, but standardization has generally failed
for CS.  EEs have so much signal propagation bandwidth to play with,
they can afford to have very general components and not worry about
the loss of efficiency.  In CS, computer cycles are always important, and
thus it's very difficult to make libraries that are efficient in a wide
variety of circumstances (generalization always costs in efficiency).

-- Tim Behrendsen (tim@airshields.com) 






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
       [not found] <sperlman-0507961717550001@p121.ezo.net>
                   ` (15 preceding siblings ...)
  1996-07-25  0:00 ` ++           robin
@ 1996-07-31  0:00 ` Darin Johnson
  1996-08-01  0:00   ` Tim Behrendsen
  1996-07-31  0:00 ` Darin Johnson
                   ` (3 subsequent siblings)
  20 siblings, 1 reply; 688+ messages in thread
From: Darin Johnson @ 1996-07-31  0:00 UTC (permalink / raw)



> In my opinion the same holds for software design. Assembly
> language is no longer really needed. If you have the knowledge it can be useful,
> but is it really worth the effort?

The effort is already minimal.  One lousy class.  And assembler is
constantly showing up in the workplace still, sorry.  Sure, maybe your
employees have no need of assembler, but other employees elsewhere may
run across it.  I see too often employers trying to convince schools
that every student should be customized to their particular needs, and
it ends up with conflicting requirements by the prospective employers.
Almost everything in CS is considered useless by at least one person
on the outside - architecture, assembler, theory, algorithms, ai,
mathematics, software engineering, operating systems, and every single
language, including C and C++.

The other common complaint I see all the time is the "I personally
don't use XXX, therefore no one needs to use XXX".  This common
attitude needs no further arguments to strike it down...

The most common systems out there are Windows 3.1 and MVS.  MVS needs
assembler, and Windows 3.1 programming is difficult to understand
without understanding architectural details.  Find your best
programmers and chances are they are somewhat familiar with some form
of assembler.

Sheesh, one class in 5 years covering *basics*, and you think it's not
worth the effort...  I suppose you thought calculus and history were
useful but not worth the effort too.
-- 
Darin Johnson
djohnson@ucsd.edu	O-
	My shoes are too tight, and I have forgotten how to dance - Babylon 5




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-07-28  0:00                 ` J. Christian Blanchette
                                     ` (2 preceding siblings ...)
  1996-07-31  0:00                   ` Patrick Horgan
@ 1996-07-31  0:00                   ` AJ Musgrove
  1996-08-01  0:00                     ` Sam Harris
                                       ` (3 more replies)
  1996-08-08  0:00                   ` William Clodius
                                     ` (5 subsequent siblings)
  9 siblings, 4 replies; 688+ messages in thread
From: AJ Musgrove @ 1996-07-31  0:00 UTC (permalink / raw)



J. Christian Blanchette (jblanc@ivic.qc.ca) wrote:
: > >Assembly -> C [non-GUI] -> C-GUI -> C++

: This is really crazy!

: It's maybe interesting to understand how arguments are passed on the 
: stack, but i strongly believe that simple high-level languages must be 
: learned first, and more complex one after.  Since there is a performance 

In its purest form, assembly is much simpler than C. Each instruction does
1 and only 1 thing (in general). If someone will spend the "blood, sweat,
and tears" to learn the basics, the rest will come easily. The first
language I was TRULY proficient in was x86 Assembly. Then I went to Pascal,
then C, and I've learned some more since then (but still prefer C for its
simple elegance).

I also recommend compiling with the "generate assembly" option when a
programmer first starts learning C, to really see what that code becomes in
assembly. (option is -s on most unix systems).

: vs. simplicity tradeoff all along, the performance desire (usually for 
: graphical applications) makes people learn lower level languages.  I know 
: many people, including myself, who made the Basic/C step for 
: performances.
Uhh.. BASIC, I'm sorry.

: C must me learned before C++, that's a point since C++ is a really more 
: complex superset.
The first C++ "compilers" really only generated C code, and compiled it.
Most C compilers still really only generate assembly, then assemble it. They
don't go straight to binary/machine format.

: I see no reason why learning GUI programming before OOP: in my sense 
: they're not related at all.  I've never did any GUI app, but the concept 
: of multiple entry points is easy to understand, as well as that of 
: object-orientedness (which can be found even in C programs, although 
: C++/Java are more adequate).
Objects are easier to understand from a graphical viewpoint. A Button is an
object, and a Window is an object. After one fully understands that, it is
easy to see how a Database Connection can be an object.
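
A hypothetical sketch (invented names, plain C assumed) of that Button-as-object
idea, with a struct plus a function pointer standing in for a class:

    #include <stdio.h>

    struct button {                             /* a "Button object" in plain C */
        const char *label;
        void (*on_click)(struct button *self);  /* its one "method"             */
    };

    static void say_hello(struct button *self)
    {
        printf("clicked: %s\n", self->label);
    }

    int main(void)
    {
        struct button ok = { "OK", say_hello };
        ok.on_click(&ok);                       /* dispatch through the object  */
        return 0;
    }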

: Understanding the machine architecture is one thing, using assembly 
: languages is another.  There's no real interest in knowing all the 
: mnemonics of a peculiar assembly language for a C coder: knowing how 
: stacks work or how system calls are performed is enough to make efficient 
: C programs.
In the military, soldiers are taught how to disassemble guns.
Why? There are people who could do that for them. If one FULLY understands the
tools they are using, one can use them better. Period.

: Jas.


--
AJ Musgrove

----------------------------------------------------------------
My opinions do not necessarily reflect those of MFS, or anyone
else for that matter. O-
----------------------------------------------------------------





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
       [not found] <sperlman-0507961717550001@p121.ezo.net>
                   ` (16 preceding siblings ...)
  1996-07-31  0:00 ` What's the best language to start with? [was: Re: Should I learn C or Pascal?] Darin Johnson
@ 1996-07-31  0:00 ` Darin Johnson
  1996-08-02  0:00   ` Alan Peake
  1996-08-01  0:00 ` Stefan 'Stetson' Skoglund
                   ` (2 subsequent siblings)
  20 siblings, 1 reply; 688+ messages in thread
From: Darin Johnson @ 1996-07-31  0:00 UTC (permalink / raw)



> >it would be as if EE students were taught IC design in the first
> >course, and were only given resistors, capacitors, Ohm's law,
> >etc. in their senior year, almost as an afterthought!
> 
> Well, it may be going that way. Most of the logic design in my department is 
> now done in a Hardware Description Language;

Is that an EE class though?  I doubt an *EE* class would get to the
point where resistors/capacitors/etc are only taught as an
afterthought (unless of course, the students are expected to already
know this from physics classes).  Most EE students don't even get to
do IC design, or even get jobs doing IC design.

> Sure, there is a need for a few people to understand the nuts and
> bolts, but these few will be writing the libraries and designing the
> silicon i.e., making the tools. As long as the rest of us can use
> the tools, what does it matter how they work? 

Then the "rest of us" don't need to go to universities.  We're not
talking about how to use tools, we're talking about learning at the
university level.  Why bother even learning programming if "the rest
of us" are only going to use the end-products?  Do you advocate that
arithmetic need not be taught, because "the rest of us" can just buy
calculators?  If that's all you want from programming, then don't go
to a university, there are plenty of 2-bit tradeschools, 4-bit
tradeschools, and even some excellent tradeschools.  (however, most EE
oriented tradeschools will teach capacitors, resistors, and Ohm's law,
even if they never get around to Fourier or Laplace transforms)  If
you don't like CS, then don't get a degree in CS, it's that simple.
If you're in a CS program, the university can only assume you want to
learn CS; even if they have a programming focus available, it surely
won't be a "learn the minimum only" sort of degree.  If you don't want
to learn, you don't have to, just don't spread your anti-learning
philosophy to others.

Yes, today at this moment, if you only know C++ you can get a job.
But in the past, and hopefully in the future again, you're not going
to get and keep a job if you aren't adaptable and flexible and able to
pick up a new OS or language or programming technique or whatever.
This field is not static.  Few fields are static.  You always need to
learn new things and do things differently and use new tools, even if
you're not the one designing the new things.  What good is your CNE
going to do you when no one uses Novell anymore?  What about when
TCP/IP never gets used anymore?  Or C, C++, or Ada?  Do you think the
standard state of affairs should be to require massive retraining
whenever the industry changes?
-- 
Darin Johnson
djohnson@ucsd.edu	O-
	The trouble with conspiracy theories are that they assume
	the government is organized.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-07-31  0:00                 ` P. Cnudde VH14 (8218)
@ 1996-07-31  0:00                   ` Nicolas Devillard
  1996-08-02  0:00                   ` Matt Austern
  1 sibling, 0 replies; 688+ messages in thread
From: Nicolas Devillard @ 1996-07-31  0:00 UTC (permalink / raw)



P. Cnudde VH14 (8218) wrote:
> As a project leader in IC design at Alcatel, I agree fully with Alan.
> 
> Software knowledge is being considered more valuable than knowing how a
> transistor works. This will even become more and more the case in the
> future. And I see no problem in giving first a course in IC design
> and only later going to transistors. Abstraction is the point which
> is difficult to learn. When we design a new ASIC we almost never think
> about resistors etc.; understanding the behaviour of the chip is where the
> real problem is. In my opinion the same holds for software design. Assembly
> language is no longer really needed. If you have the knowledge it can be useful,
> but is it really worth the effort?

Software design, whatever it is, should at one moment
be aware of hardware constraints. These constraints
can only be understood by knowing what computer guts look
like.
You can stay at a very high level of abstraction,
accumulating underlying layers of software, and
eventually you will end up with monsters such as MS
Word, a gigantic, slow, buggy code needing a powerful
workstation just for word processing... Or even worse:
you will only think in terms of abstract functionalities,
reuse an old piece of software library, and when you
realize the hardware constraints are not the same,
Ariane 5 already exploded.

Designing software without knowledge of computer
electronics at all is like architecture without
knowledge of structure mechanics. You can dream
a lot when designing, but many constraints bring
you back to reality, and among them you should be
able to understand hardware ones to take them into
account.
Furthermore, learning how ASICs and microprocessors
work is quite fast, easy, and does not require much
knowledge. It is mostly a question of conventions.

My 5 cents opinion, though, certainly not a definitive
view. ;)

--Nicolas




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-07-30  0:00                     ` Arra Avakian
  1996-07-31  0:00                       ` James Youngman
@ 1996-07-31  0:00                       ` Stephen M O'Shaughnessy
  1996-08-02  0:00                       ` Tim Behrendsen
  1996-08-13  0:00                       ` Chris Sonnack
  3 siblings, 0 replies; 688+ messages in thread
From: Stephen M O'Shaughnessy @ 1996-07-31  0:00 UTC (permalink / raw)



In article <DvD01C.4v0.0.-s@inmet.camb.inmet.com>, arra@inmet.com says...
>
>In article <01bb7da2$6c505ac0$96ee6fcf@timhome2>, "Tim Behrendsen" <tim@airshields.com> 
wrote:
>..
>>The most important thing any student can learn is the stripping away
>>of the shroud of abstractions, and seeing the simplicity of what's
>>really underneath.  Once they get that, all the rest of it comes naturally.
>>
>
>I can see both sides of this issue: the importance of understanding 
>abstractions, and the importance of understanding what is underneath.
>I think that a programmer needs to have a model of how a computation occurs in 
>order to understand issues such as time and space efficiency of the 
>computation. On the other hand, understanding the abstractions presented by 
>both language constructs and by APIs is absolutely critical to being able to 
>deal with complex systems. A curriculum that does not accomplish the learning 
>of computing abstractions fails. It's hard for me to judge the "best" approach 
>for someone learning now, when I have had the experience of learning gradually 
>over a lifetime.
>

As with most hot debates, both sides have equally valid arguments. 
In this I tend toward the  abstract side of the fence.  We learn to 
tie our shoes without knowledge of knot theory.  We learn to read
without a knowledge of grammar. We learn addition, subtraction, multiplication
and division without any formal mathematical proofs.  In fact learning basic 
arithmetic is an abstraction as most school children learn by counting
pennies and dimes.  This prepares us for 90% or more of the problems we
encounter in life.  Likewise with programming.  I believe, once you learn
a language, it will be quite sufficient for 90% of the programming problems
you encounter. (I do not consider assembly as a language. Despite it's title
of assembly *language* it is a code not far enough removed (i.e. abstracted)
from the underlying hardware).

I am not saying that the basic principles are not important.  If one is going
to make a career of programming these principles are crucial.  But I don't
believe one can recognize the underlying principles without the *shroud* of
abstractions to frame them.  

problem: Add two numbers
Q: How?
A: Put them in registers A and R5 then do ADD A,R5
Q: What is a register? 
A: A place to hold data. 
Q: What is data?
A: A collection of 8 bit bytes.
Q: What is a bit/Byte?
A: ...

OR

problem: Add two numbers
Q: How?
A: C := A + B

The original question was where do you start.  I have not learned Pascal so
I can't comment about its quality as a beginning language.  For me the best
way to start is with Ada.  Programming is solving real world problems with a computer.
That is an abstraction.  The beginning student already understands real world problems.
So first teach them to express these problems with a computer language.  Ada,
with its strong typing, maps very well to real world objects.  Strong typing
is about abstraction and that enables the hiding of irrelevant details.
This is so important when we are trying to learn something.  (That something
we are trying to learn is how to solve real world problems with a computer, 
not how a computer works).

Ada syntax was designed to be read by a human being.  I have yet to read C
code, my own included, that was quickly and easily decipherable a year
after it was written.  Ada systems catch more errors earlier than other 
language systems.  I know all you old pros who studied under Babbage never make
mistakes.  But for a beginner, it is a nice feature not to have to go through
the whole compile, link, execute process before finding bugs.

Should I learn C or Pascal? 

No. Learn Ada





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
       [not found]               ` <dewar. <peake.206.002D549F@dstos3.dsto.gov.au>
  1996-07-31  0:00                 ` P. Cnudde VH14 (8218)
  1996-07-31  0:00                 ` Tim Behrendsen
@ 1996-07-31  0:00                 ` Stephen M O'Shaughnessy
  1996-08-02  0:00                   ` Tim Behrendsen
  2 siblings, 1 reply; 688+ messages in thread
From: Stephen M O'Shaughnessy @ 1996-07-31  0:00 UTC (permalink / raw)




>
>
>>it would be as if EE students were taught IC design in the first
>>course, and were only given resistors, capacitors, Ohm's law,
>>etc. in their senior year, almost as an afterthought!
>

You are misleading here.  Your analogy assumes EEs don't need the basics, which
everyone knows to be absurd.  And you use that absurdity to prove IC design
should not be taught first.  But there is no link between one and the other.

By IC design do you mean the design of the ICs themselves or designing circuits
with ICs?

ICs are resistors, capacitors and semiconductors.  So designing ICs without
a knowledge of these basic elements would be impossible.  In this case you are
correct but have no point.

But if you are talking about designing with, say, logic ICs, I would argue that
it can easily be done without a knowledge of resistors, capacitors, transistors 
or even Ohm's law.  However, you would not be a student learning EE.  The 
point I am trying to make is that we must be careful about what basics are
necessary to learning a new skill.

I don't know much about Lady Lovelace (namesake of the Ada programming
language).  I wonder how much of what we today call the basics she
knew and understood.  I don't think she understood bytes and bits.  I
am sure, with her math background, she knew about number bases, but I
don't believe she had a working knowledge of hexadecimal numbers.  Yet
she was able to conceive a *computer* language of sorts and use it
to solve real world problems on a machine that did not exist then, nor
ever did.  She is purported to be the first programmer.  Start with the 
abstract.





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-22  0:00               ` Robert Dewar
  1996-07-30  0:00                 ` Tim Behrendsen
@ 1996-07-31  0:00                 ` Patrick Horgan
  1 sibling, 0 replies; 688+ messages in thread
From: Patrick Horgan @ 1996-07-31  0:00 UTC (permalink / raw)



In article <dewar.838067532@schonberg>, dewar@cs.nyu.edu (Robert Dewar) writes:
> Tim Oxler quoted:
> 
> Sentry Market Research surveyed 700 IS mangers what language they used
> for client/server application development:
> 
> Visual Basic    23%
> Cobol           21%
> C++             18%
> C               15%
> 
> and note that client server applications probably have a lower percentage
> of COBOL than all applications, because there are still lots of 
> traditional batch programs being generated in IS shops in COBOL.
> 

Most of us don't work in IS shops though...it's a very different market than
the kind of development environment you find in a software shop.  Those numbers
are completely inapplicable to the workplaces most of us live in.

Out here in Silicon Valley it's almost impossible to get a job without C++.
I've been involved in several companies with the interviewing/hiring 
process, (cause I do a good job of finding out how much air is in a
resume;) and you don't even get in the door without C++ on your resume.  You
don't get called back for a second interview unless you can demonstrate
C++ ability.

Note that this is not a comment on C++ as a good/poor language for learning/
object-oriented-programming/life.  Just a note on industry practice.

Some of the defense contractors out here are using Ada, and others still use
C, and a few use Cobol, but you really limit your market with those.  Defense
contractors don't pay nearly as well as the rest of the market either. (So
where ARE all those tax dollars going?)


-- 

   Patrick J. Horgan    patrick@broadvision.com   Have horse will ride.
   Opinions mine, not my employer's except by most bizarre coincidence.






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-07-28  0:00                 ` J. Christian Blanchette
  1996-07-28  0:00                   ` Robert Dewar
  1996-07-29  0:00                   ` Tim Behrendsen
@ 1996-07-31  0:00                   ` Patrick Horgan
  1996-07-31  0:00                   ` AJ Musgrove
                                     ` (6 subsequent siblings)
  9 siblings, 0 replies; 688+ messages in thread
From: Patrick Horgan @ 1996-07-31  0:00 UTC (permalink / raw)



In article <31FBC584.4188@ivic.qc.ca>, "J. Christian Blanchette" <jblanc@ivic.qc.ca> writes:
> 
> C must me learned before C++, that's a point since C++ is a really more 
> complex superset.

Last year I taught a C++ for non-programmers class using Pohl's "Object Oriented
Programming Using C++".  The class went fine.


-- 

   Patrick J. Horgan    patrick@broadvision.com   Have horse will ride.
   Opinions mine, not my employer's except by most bizarre coincidence.






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
       [not found]               ` <dewar. <peake.206.002D549F@dstos3.dsto.gov.au>
@ 1996-07-31  0:00                 ` P. Cnudde VH14 (8218)
  1996-07-31  0:00                   ` Nicolas Devillard
  1996-08-02  0:00                   ` Matt Austern
  1996-07-31  0:00                 ` Tim Behrendsen
  1996-07-31  0:00                 ` Stephen M O'Shaughnessy
  2 siblings, 2 replies; 688+ messages in thread
From: P. Cnudde VH14 (8218) @ 1996-07-31  0:00 UTC (permalink / raw)



Alan Peake wrote:
> 
> >it would be as if EE students were taught IC design in the first
> >course, and were only given resistors, capacitors, Ohm's law,
> >etc. in their senior year, almost as an afterthought!
> 
> Well, it may be going that way. Most of the logic design in my department is
> now done in a Hardware Description Language;  no-one needs to know about
> gates and counters anymore. All they need to know about capacitors is what
> types to use for bypassing the gate array chips. This level of hardware
> abstraction is a bit like OO in programming. The class libraries will contain
> most functions that the programmer is likely to need in much the same way as
> the HDL elements are the building blocks for the array chip.
> 
> Sure, there is a need for a few people to understand the nuts and bolts, but
> these few will be writing the libraries and designing the silicon i.e., making
> the tools. As long as the rest of us can use the tools, what does it matter
> how they work?
> 
> Alan

As a project leader in IC design at Alcatel, I agree fully with Alan.

Software knowledge is being considered more valuable than knowing how a 
transistor works. This will even become more and more the case in the 
future. And I see no problem in giving first a course in IC design
and only later going to transistors. Abstraction is the point which
is difficult to learn. When we design a new ASIC we almost never think
about resistors etc.; understanding the behaviour of the chip is where the
real problem is. In my opinion the same holds for software design. Assembly
language is no longer really needed. If you have the knowledge it can be useful,
but is it really worth the effort?
 


-- 


   ____________          Peter Cnudde
   \          /          Alcatel Telecom
    \ ALCATEL/           Switching Systems Division 
     \ BELL /            Microelectronics Design Center
      \    /             
       \  /              F. Wellesplein 1, B-2018 Antwerp
        \/                                        BELGIUM
                         e-mail  : cnuddep@sh.bel.alcatel.be
                         Phone   : +32 3 240 82 18
                         Fax     : +32 3 240 99 47




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
       [not found] <sperlman-0507961717550001@p121.ezo.net>
                   ` (17 preceding siblings ...)
  1996-07-31  0:00 ` Darin Johnson
@ 1996-08-01  0:00 ` Stefan 'Stetson' Skoglund
  1996-08-05  0:00   ` Stephen M O'Shaughnessy
  1996-08-06  0:00   ` Patrick Horgan
  1996-08-01  0:00 ` Andy Hardy
  1996-08-07  0:00 ` Fergus Henderson
  20 siblings, 2 replies; 688+ messages in thread
From: Stefan 'Stetson' Skoglund @ 1996-08-01  0:00 UTC (permalink / raw)



How do we write complex software such as air-traffic control
systems, engine control systems, big-time banking systems
and so on without abstracting away from the computer hardware ?

How do we design such beasts ??
-- 
---------------------------------------------------------------------
Stefan 'Stetson' Skoglund          I               |
sp2stes1@ida.his.se                I               |
<http://www.his.se/ida/~sp2stes1/> I         _____/0\_____
                                   I ____________O(.)O___________
H\"ogskolan i Sk\"ovde, Sverige    I      I-+-I    O    I-+-I
                                   I
                                   I      Viggen with two Rb04
---------------------------------------------------------------------




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
       [not found] <sperlman-0507961717550001@p121.ezo.net>
                   ` (18 preceding siblings ...)
  1996-08-01  0:00 ` Stefan 'Stetson' Skoglund
@ 1996-08-01  0:00 ` Andy Hardy
  1996-08-07  0:00 ` Fergus Henderson
  20 siblings, 0 replies; 688+ messages in thread
From: Andy Hardy @ 1996-08-01  0:00 UTC (permalink / raw)



In article <qqg268at31.fsf@tartarus.ucsd.edu>, Darin Johnson
<djohnson@tartarus.ucsd.edu> writes
[snip]
>  MVS needs assembler

Pardon?

Andy

Andy Hardy (Internet: aph@ahardy.demon.co.uk, CIS: 100015,2603)
PGP key available on request
===============================================================
Acid absorbs 47 times it's weight in excess Reality.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-08-01  0:00       ` ++           robin
@ 1996-08-01  0:00         ` Ralph Silverman
  1996-08-06  0:00           ` ++           robin
  0 siblings, 1 reply; 688+ messages in thread
From: Ralph Silverman @ 1996-08-01  0:00 UTC (permalink / raw)



++           robin (rav@goanna.cs.rmit.edu.au) wrote:
: 	robt2@ix.netcom.com(Rob(t.) Brannan) writes:

: 	>As a first language Pascal is the way to go for many reasons.

: 	>C is very complicated as a first language (libraries,pointers,numerous
: 	>low level routines), not to mention some of the errors that you will
: 	>encounter are not exactly well covered in any book.

: 	>As a newbie , remember trying to figure out whats wrong when an
: 	>included file or library was just left out!

: ---Then use PL/I.  That just can't happen!

: PL/I is an excellent first language.  And the
: output is even *easier* to do than Pascal or C.

: BTW, one of the advantages of PL/I for a beginner
: is the excellent diagnostic messages not only at run time,
: but also at compile time.

--
*************begin r.s. response***************

	pl/1  certainly would be good...
	but is this available for the pc?

	once found a kind of interpreter/simulator...
	but this was quite limited in language features!

*************end r.s. response*****************
Ralph Silverman
z007400b@bcfreenet.seflin.lib.fl.us





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-30  0:00     ` Rob(t.) Brannan
  1996-08-01  0:00       ` Tony Konashenok
@ 1996-08-01  0:00       ` ++           robin
  1996-08-01  0:00         ` Ralph Silverman
  1 sibling, 1 reply; 688+ messages in thread
From: ++           robin @ 1996-08-01  0:00 UTC (permalink / raw)



	robt2@ix.netcom.com(Rob(t.) Brannan) writes:

	>As a first language Pascal is the way to go for many reasons.

	>C is very complicated as a first language (libraries,pointers,numerous
	>low level routines), not to mention some of the errors that you will
	>encounter are not exactly well covered in any book.

	>As a newbie , remember trying to figure out whats wrong when an
	>included file or library was just left out!

---Then use PL/I.  That just can't happen!

PL/I is an excellent first language.  And the
output is even *easier* to do than Pascal or C.

BTW, one of the advantages of PL/I for a beginner
is the excellent diagnostic messages not only at run time,
but also at compile time.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-07-31  0:00                   ` AJ Musgrove
  1996-08-01  0:00                     ` Sam Harris
  1996-08-01  0:00                     ` Tim Hollebeek
@ 1996-08-01  0:00                     ` Ken Pizzini
  1996-08-03  0:00                     ` Raffael Cavallaro
  3 siblings, 0 replies; 688+ messages in thread
From: Ken Pizzini @ 1996-08-01  0:00 UTC (permalink / raw)



In article <4toc18$4j0@ns3.iamerica.net>,
AJ Musgrove <amusgrov@varmm.com> wrote:
>In its purest form, assembly is much simpler than C. Each instruction does
>1 and only 1 thing (in general). If someone will spend the "blood, sweat,
>and tears" to learn the basics, the rest will come easily. The first
>language I was TRULY proficient in was x86 Assembly. Then I went to Pascal,
>then C, and I've learned some more since then (but still prefer C for its
>simple elegance).

I really didn't grok pointers until I learned assembly.
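
A hypothetical illustration of that point (invented names): the C below is
exactly the "it's just an address" picture that assembly forces you to see.

    #include <stdio.h>

    int main(void)
    {
        int  x = 42;
        int *p = &x;     /* p holds the *address* of x, nothing more          */

        *p = 7;          /* store through that address: x is now 7            */
        p  = p + 1;      /* pointer arithmetic: the address moves forward by  */
                         /* sizeof(int) bytes, like bumping an index register */
        printf("%d\n", x);
        return 0;
    }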

>I also recommend compiling with the "generate assembly" option when a
>programmer first starts learning C, to really see what that code becomes in
>assembly. (option is -s on most unix systems).

Er, make that "-S"; "-s" strips symbols from the output file.


>: Understanding the machine architecture is one thing, using assembly 
>: languages is another.  There's no real interest in knowing all the 
>: mnemonics of a peculiar assembly language for a C coder: knowing how 
>: stacks work or how system calls are performed is enough to make efficient 
>: C programs.
>In the military, soldiers are taught how to disassemble guns.
>Why? There are people who could do that for them. If one FULLY understands the
>tools they are using, one can use them better. Period.

I stopped learning the idiosyncrasies of various assembly
languages after my third, but I still find that knowing the general
feel of assembly languages is useful.  Although I no
longer work with processors which I can write assembly for, I
sometimes find that I need to go to the assembly level during
debugging, either because I need to track down a bug in a
program whose source I don't have access to, or because I feel
that the source-code debugger is obscuring the details of a
critical instruction sequence.

That said, I would recommend that assembly be taught as
language number two.  I think that it is useful to learn the
general concepts of variables, flow control, and problem
decomposition with a language that is at a higher level of
abstraction.  If it were only more widespread I would suggest
the Turing programming language (not to be confused with Turing
machines) as a first language: it was designed to minimize the
incidence of peripheral errors that beginners would tend to
make (such as misplaced semicolon errors in Pascal, runtime
detection of the use of uninitialized variables, etc.).  Of the
"popular" languages, I think they all have serious flaws when
considered as a first programming language.

Once the basic concepts are learned, then the choice of a third
language depends on external issues, mainly:  what kinds of
programs will you be writing, and who are you writing them
for?  While any Turing-complete language can compute anything
that is computable, given enough memory and time, different
languages really do have different strengths.  In the days
before Perl I had written some mildly complex sed programs, and
maintained even more complex sed programs, but I really would
rather not program a compiler in sed, thank you very much ;-).


		--Ken Pizzini




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-07-31  0:00 ` What's the best language to start with? [was: Re: Should I learn C or Pascal?] Darin Johnson
@ 1996-08-01  0:00   ` Tim Behrendsen
  1996-08-01  0:00     ` Stephen M O'Shaughnessy
  1996-08-05  0:00     ` Patrick Horgan
  0 siblings, 2 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-01  0:00 UTC (permalink / raw)



Darin Johnson <djohnson@tartarus.ucsd.edu> wrote in article
<qqg268at31.fsf@tartarus.ucsd.edu>...
> > In my opinion the same holds for software design. Assembly
> > language is no longer really needed. If you have the knowledge it can be useful,
> > but is it really worth the effort?
> 
> The effort is already minimal.  One lousy class.  And assembler is

Except, IMO assembly should be used *exclusively* for the first two
years of a CS degree.  The first two years is usually all algorithmic
analysis, anyway.  There's nothing you can learn about algorithms
that you can't learn, and learn better, by doing it in assembly.

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-30  0:00     ` Rob(t.) Brannan
@ 1996-08-01  0:00       ` Tony Konashenok
  1996-08-04  0:00         ` Lawrence Kirby
  1996-08-09  0:00         ` Verne Arase
  1996-08-01  0:00       ` ++           robin
  1 sibling, 2 replies; 688+ messages in thread
From: Tony Konashenok @ 1996-08-01  0:00 UTC (permalink / raw)



One of my pet peeves about C is REALLY poor diagnostics from the vast
majority of compilers. Certain easily-correctable syntax errors cause
a fatal error when they could easily be fixed by compiler and only induce
a warning; many error messages point very far from the place where the
error actually happened (well, it's mostly the language syntax that caused
it, but compilers could also be somewhat smarter); hardly any compiler
produces a cross-reference dictionary.
Another serious limitation is that fixed-point numbers can only be integer.
I can't count how many times I had to do dirty tricks to circumvent it.
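
For illustration, a hypothetical sketch of the usual "dirty trick" -- scaled
integers -- that C pushes you toward when you want fixed-point values such as
dollars and cents:

    #include <stdio.h>

    /* store money as a whole number of cents, since C has no
       fixed-point decimal type of its own */
    int main(void)
    {
        long price = 1999;                 /* $19.99                         */
        long tax   = price * 825 / 10000;  /* 8.25% tax, truncated to a cent */
        long total = price + tax;

        printf("total: %ld.%02ld\n", total / 100, total % 100);
        return 0;
    }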

Pascal is better... but I am not into B&D. I vote for PL/I!

Tony Konashenok
Team PL/I




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-07-31  0:00                   ` AJ Musgrove
@ 1996-08-01  0:00                     ` Sam Harris
  1996-08-02  0:00                       ` Eric W. Nikitin
  1996-08-01  0:00                     ` Tim Hollebeek
                                       ` (2 subsequent siblings)
  3 siblings, 1 reply; 688+ messages in thread
From: Sam Harris @ 1996-08-01  0:00 UTC (permalink / raw)



AJ Musgrove wrote:

> In the military, soldiers are taught how to disassemble guns.
> Why? There are people who could do that for them. If one FULLY understands the
> tools they are using, one can use them better. Period.

In the field, there is no one else to strip and clean your weapon.
In fact, while in garrison, a soldier is held ultimately responsible
for his weapon from time of issuance until it is returned to the
armoury and is expected to do all the cleaning and stripping
personally. No one else does it because the soldier must know
how to quickly disassemble and clear or even repair the weapon
in a dark, wet hole in no man's land.

In addition to those hard realities of military life, knowing those
intimate details of the workings of the weapon does indeed enable
the soldier to use it to better effect. Military specs on performance
envelopes are always conservative. A knowledgeable soldier with the
weapon in hand knows better how far the performance can be pushed.

As to language order, I learned BASIC and Z80 assembler at about
the same time (ages ago) and matured my techniques in both in parallel.
If someone cannot deal with the level of detail and the span of
attention required to effectively program assembly, then that
individual may be in the wrong business. Best to find out early
so the student can make an informed choice about their potential
success in the field.

When I was an Air ROTC cadet, I wanted to be a pilot. In that program,
you got 7 hours of flight instruction, 1 hour for solo, 3 hours of
additional instruction including more solos. The 12th hour was a
flight test. The candidate was expected to perform at the level of
40 hours of flight instruction. The reasoning being, if the candidate
did not have the talent to "pick it up" that quickly, then performance
would never be acceptable at the Air Force level. As a personal note,
after 7 hours, I was a competant pilot as long as everything went
a-okay, but I was no where near ready for the flight test. Like
Dirty Harry said, "A man's got to know his limitations."

I believe teaching a higher-level language with constant feedback
from the produced assembler output is a good combination worth
serious consideration and trial. In my college days tutoring other
comp-sci students, I noticed a disheartening trend of students
competent in reasoning with their familiar language becoming
totally lost in the minutiae of assembly. They just couldn't map
their higher-level language constructs into the dozens of lines
of equivalent assembly code. Once I showed them how those constructs
worked in assembly, they now had another "brand name" collection
to go with their tool set and off they went.

-- 
Samuel T. Harris, Senior Engineer
Hughes Training, Inc. - Houston Operations
2224 Bay Area Blvd. Houston, TX 77058-2099
"If you can make it, We can fake it!"




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-01  0:00   ` Tim Behrendsen
@ 1996-08-01  0:00     ` Stephen M O'Shaughnessy
  1996-08-03  0:00       ` Tim Behrendsen
  1996-08-05  0:00     ` Patrick Horgan
  1 sibling, 1 reply; 688+ messages in thread
From: Stephen M O'Shaughnessy @ 1996-08-01  0:00 UTC (permalink / raw)



In article <01bb7fcc$c5a98de0$87ee6fce@timpent.airshields.com>, 
tim@airshields.com says...
>
>Darin Johnson <djohnson@tartarus.ucsd.edu> wrote in article
><qqg268at31.fsf@tartarus.ucsd.edu>...
>> > In my opinion the same holds for software design. Assembly
>> > language is no longer really needed. If you have the knowledge it can be useful,
>> > but is it really worth the effort?
>> 
>> The effort is already minimal.  One lousy class.  And assembler is
>
>Except, IMO assembly should be used *exclusively* for the first two
>years of a CS degree.  The first two years is usually all algorithmic
>analysis, anyway.  There's nothing you can learn about algorithms
>that you can't learn, and learn better, by doing it in assembly.
>
>-- Tim Behrendsen (tim@airshields.com)

Learn sorting algorithms in assembly?  Are you serious!?





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-07-31  0:00                   ` AJ Musgrove
  1996-08-01  0:00                     ` Sam Harris
@ 1996-08-01  0:00                     ` Tim Hollebeek
  1996-08-01  0:00                     ` Ken Pizzini
  1996-08-03  0:00                     ` Raffael Cavallaro
  3 siblings, 0 replies; 688+ messages in thread
From: Tim Hollebeek @ 1996-08-01  0:00 UTC (permalink / raw)



AJ Musgrove (amusgrov@varmm.com) wrote:

: I also recommend compiling with the "generate assembly" option when a
: programmer first starts learning C, to really see what that code becomes in
: assembly. (option is -s on most unix systems).

Hmm, isn't it usually -S ?  Also, I wouldn't suggest showing this to
newbie programmers unless you force them to look at the output from at
least two different compilers on two different platforms.  Some of the
_worst_ C I've seen comes from attempts to write, for example, x86 ML
in C.
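
The exercise itself is tiny.  A minimal sketch, assuming gcc (the flag
and the file names may differ with other compilers):

/* scale.c -- a trivial function to inspect in assembly form */
int scale(int x)
{
    return x * 16;      /* watch whether the compiler emits a shift */
}

Compiling with "gcc -S -O2 scale.c" leaves the generated assembly in
scale.s; doing the same with a second compiler on a second platform and
comparing the two outputs is the instructive part.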

: : C must be learned before C++, that's a point since C++ is a really more 
: : complex superset.
: The first C++ "compilers" really only generated C code, and compiled it.
: Most C compilers still really only generate assembly, then assemble it. They
: don't go straight to binary/machine format.

Actually, if you're going to learn C++, I'd suggest avoiding C, and
going back to it later.  Many things that are necessary to write C
programs, while available in C++, can be completely avoided.  String
and output handling, for example, differs widely.

: : Understanding the machine architecture is one thing, using assembly 
: : languages is another.  There's no real interest in knowing all the 
: : mnemonics of a peculiar assembly language for a C coder: knowing how 
: : stacks work or how system calls are performed is enough to make efficient 
: : C programs.

I agree with this sentiment.

: In the military, soldiers are taught how to disassemble guns.
: Why? There are people who could do that for them. If one FULLY understands the
: tools they are using, one can use them better. Period.

That particular tool only, though.  Too much code is written with a
particular compiler in mind, instead of writing clear code that can be
compiled well for _any_ platform.

For example:

x *= 16; /* will be compiled well by any decent compiler, probably
            generating a bitshift operation if available */

x <<= 4; /* Harder to read, and will generate _slower_ code on many
	    architectures which lack a multibit shift operation */

---------------------------------------------------------------------------
Tim Hollebeek         | Disclaimer :=> Everything above is a true statement,
Electron Psychologist |                for sufficiently false values of true.
Princeton University  | email: tim@wfn-shop.princeton.edu
----------------------| http://wfn-shop.princeton.edu/~tim (NEW! IMPROVED!)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-02  0:00                   ` David Wheeler
@ 1996-08-02  0:00                     ` Peter Seebach
  1996-08-02  0:00                       ` Gary M. Greenberg
                                         ` (2 more replies)
  1996-08-06  0:00                     ` What's the best language to start with? [was: Re: Should I learn C or Pasca StHeller
  1 sibling, 3 replies; 688+ messages in thread
From: Peter Seebach @ 1996-08-02  0:00 UTC (permalink / raw)



In article <4ttdlg$2pr@news.ida.org>,
David Wheeler <wheeler@aphrodite.csed.ida.org> wrote:
>That's pretty clueless.  However, depending on how well they
>understood their programming language, they might still produce
>a reasonable result.  Why?  Because, although they may not realize
>that some of these "conversion" operations have no run time impact,
>they may know how to combine the features of their language to
>produce solutions.

And, if their language doesn't specify a binary representation, they may get
an algorithm right on a weird machine that still uses BCD or decimal
internally.

>Any professional developer must understand several assembly languages,
>and how they work in general.

I think this is no longer true.  Understanding an assembler buys you nothing;
all it tells you is that at least one machine had a certain kind of semantics
inside it.  You *don't care*.  If you're writing a *solution* to a *problem*,
those are the only things you need to be working with.  If you try to make
sure your solution fits your imaginings of the underlying machine, you
constrain it unnaturally.  Leave the unnatural constraints for the rare
occasions when your code actually has a detectable performance problem that
can't be quickly traced to a broken algorithm.

>The language you use also determines the amount of detail
>you need to know about the underlying machine abstraction.

Quite true.

>When you're using C or C++, you really need to know how the
>underlying assembly language works.

Quite untrue.

>Why? Because pointer arithmetic
>makes little sense if you don't, and tracking down rogue pointers
>overwriting your local stack values and other nasties is impossible if
>you don't understand how underlying machines work. 

This is simply nonsense.  Professionals and competent engineers do not need to
be told how it is that rogue pointers work; they need to be told what kinds of
operations produce unpredictable results.

I have never even tried to understand an assembly language.  Yet, I am
mysteriously able to use pointer arithmetic correctly, and I have tracked down
any number of bugs involving rogue pointers.

Pointer arithmetic is defined in terms of objects in C.  You don't need to
know what the underlying processor is doing, because you never see it.  You
work in terms of sizeof() and you know that sizeof(char) is 1.  You know
that pointers move and increment by the size of the thing pointed to.

None of this depends on assembly. (conceptually.  It is often implemented in
machine code, but you don't need to know that to make it work, or to use it
correctly.)
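
For illustration, a minimal sketch; nothing in it depends on any
particular machine, and the names are just examples:

#include <stdio.h>

int main(void)
{
    double a[4] = {1.0, 2.0, 3.0, 4.0};
    double *p = a;

    p += 2;              /* advances by two objects, not two bytes */
    printf("%f\n", *p);  /* prints 3.000000 on any conforming implementation */
    printf("%lu\n", (unsigned long) sizeof(char));   /* always prints 1 */
    return 0;
}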

>Most of the
>basic C/C++ operations are defined as "whatever your local machine does",
>so it's really part of the C/C++ definition.

Will you folks *please* stop discussing "C/C++"?  I'm discussing C here,
because C++ is a vastly different language, and I don't know it.

You have attained only the second level of enlightenment.

The newbie uses a machine, and to the newbie, that is all machines; the
newbie's code runs only on that machine.

The experienced programmer uses several machines, and has learned how each
behaves.  The experienced programmer's code runs on several machines.

The true programmer uses no machines, and uses the specifications granted by
the language.  The true programmer's code runs on any machine.

The C definition explicitly *DOES NOT* include the semantics of the local
machine; rather, it says that each implementation must document these things,
but a portable C program *can not use any of them*.

Further, it defines quite a few semantics explicitly.  For instance, unsigned
integers are handled modulo N, for some suitable N.  Always.  Everywhere.  No
C compiler may signal overflow on unsigned arithmetic.
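
A small sketch of that guarantee (the value of UINT_MAX varies between
implementations, but the wrap-around behaviour does not):

#include <stdio.h>
#include <limits.h>

int main(void)
{
    unsigned int u = UINT_MAX;

    u = u + 1;                   /* defined: wraps to 0, never traps */
    printf("%u\n", u);           /* prints 0 */
    printf("%u\n", 0u - 1u);     /* prints the value of UINT_MAX */
    return 0;
}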

When a program depends on the way the local machine handles a basic operation,
that C program is either OS or library code, or broken.

The semantics guaranteed by the language are quite strong enough for 99% of
most real projects.  The other 1% is why people ask what your experience is
specifically when hiring a programmer.

Many programmers seem to be able to push that number down, by finding more and
more ways to depend on things they don't need.  This is stupid.

>A professional developer using other languages should still know this
>level of detail, but it's frankly less important in Java, Ada 95, Eiffel,
>and Smalltalk (among others).  Who cares that "if" becomes a branch
>instruction? Unless you call outside the language, you won't even get
>to see how some capabilities are implemented in some languages. The
>result: the detail you need to know to solve a problem shrinks.  Thus,
>you can concentrate on learning how to abstract - the real problem.

A C programmer who cares whether an "if" becomes a branch or not is wasting
his own, and everyone else's, time.  That's not part of the language, nor
should it be.

>It's not clear it's REALLY taught that way.  Lots of schools just give
>a quick intro to a programming language or two, and then show other
>people's abstractions (instead of teaching how to abstract).  Besides,
>many students just "get by" instead of learning, and many people
>find abstraction really hard to grasp.

Especially because unenlightened ones keep telling them that they should care
how or why their code works, rather than telling them to study the
specifications.  If I had a dollar for every student who's been burned, and
badly, by the infamous advice "try it on your compiler and see what happens",
I wouldn't need to work for a living.

>The answer: because you don't REALLY need to know that level of detail;
>an abstraction of a computer is good enough.

Empirically, the abstraction of "A C implementation" is good enough for most
code.

>What I do need to know is how to abstract a problem
>into parts so I can solve it.  I agree with Mr. Dewar - abstraction
>is most important, worry about that first.

This is quite true.

>Thus: If solving problems is the key skill, start with a high-level
>language that lets you concentrate on learning how to solve problems.

This is good advice.  I'm inclined to think that, properly taught, C can be
such a language.  However, I'd rather start people in Icon or perl, myself.

-s
-- 
Peter Seebach - seebs@solon.com - Copyright 1996 - http://www.solon.com/~seebs
Unix/C Wizard - send mail for help, or send money for consulting!
The *other* C FAQ, the hacker FAQ, et al.  See web page above.
Unsolicited email (junk mail and ads) is unwelcome, and will be billed for.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-03  0:00                       ` Alf P. Steinbach
@ 1996-08-02  0:00                         ` Peter Seebach
  0 siblings, 0 replies; 688+ messages in thread
From: Peter Seebach @ 1996-08-02  0:00 UTC (permalink / raw)



In article <3202876B.BC7@online.no>, Alf P. Steinbach <alfps@online.no> wrote:
>Peter Seebach wrote:
>> And, if their language doesn't specify a binary representation, they may get
>> an algorithm right on a weird machine that still uses BCD or decimal
>> internally.

>Been to a museum lately?

Well, I work for a large corporation... :)

Who says binary is here to stay?  C will always act like it's on a binary
machine, unless they change that aspect of the standard, but assuming that all
languages will act like that is silly.  Among other things, who does bignums
in binary?

>(A)
>> >Any professional developer must understand several assembly languages,
>> >and how they work in general.

>(B)
>> I think this is no longer true.  Understanding an assembler buys you nothing;
>> all it tells you is that at least one machine had a certain kind of semantics
>> inside it.  You *don't care*.  If you're writing a *solution* to a *problem*,
>> those are the only things you need to be working with.

>Pardon me, but this is utter bullshit, both (A) and the response (B).  That
>(A) is untrue is not necessary to discuss.  (B) is more subtle, but builds on
>several assumptions, which can be summed up as a "mathematicians" view of
>programming:  only abstract semantics matter.  At least when discussing C,
>the most popular high level assembler in existence, that argument is
>clearly not valid.  Could be valid in other contexts, though.

I would say it certainly *is* valid.  I have a tiny little 75k project I've
written over the last couple of weeks, and it doesn't depend in any way on
anything but the abstract semantics of the language.  It's not like you *need*
anything else for the majority of code.

I'm a reasonably active real-world programmer, and I concern myself
exclusively with the abstract semantics.  I believe this is called an
existence proof.

>The advice as misguiding advice
>says something quite different:  disregard the machine completely.  That's
>stupid.  When you build a house, you need both a good understanding of the
>plans (abstractions) and the physical materials and other resources. 
>Disregarding or playing down the importance of one or the other leads to
>failure.

It is a common mistake to assume that the machine is the building block of
code.  This is true only in device drivers.  In the majority of code, your
building materials can be, and *must* be, the abstract semantics of the
language.  To write good code, you generally have to ignore the machine
entirely, and leave the machine details to the compiler.

Anything else leads to code which is too tightly coupled to your understanding
of a given machine.

>A very nice goal (getting paid for programming without using computers would
>indeed be ideal).  But still, porting isn't quite that easy... :-)

*I* always find porting easy.  Generally, 50% of my porting time is waiting
for the compile.  The other 50% is writing a makefile for a new platform.
Hmm.  As soon as I eliminate my current bug, I'll try to compile this program
under Borland C, MPW on the Mac, and SAS/C on the Amiga.  I betcha it all
compiles and runs without trouble.  This is about 75k of code, written over
about 3 weeks *exclusively* on BSD-derived Unix using gcc to compile.

Any bets?  :)

-s
-- 
Peter Seebach - seebs@solon.com - Copyright 1996 - http://www.solon.com/~seebs
Unix/C Wizard - send mail for help, or send money for consulting!
The *other* C FAQ, the hacker FAQ, et al.  See web page above.
Unsolicited email (junk mail and ads) is unwelcome, and will be billed for.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-02  0:00                     ` Peter Seebach
@ 1996-08-02  0:00                       ` Gary M. Greenberg
  1996-08-03  0:00                       ` Alf P. Steinbach
  1996-08-05  0:00                       ` Chris Sonnack
  2 siblings, 0 replies; 688+ messages in thread
From: Gary M. Greenberg @ 1996-08-02  0:00 UTC (permalink / raw)



In article <4ttksk$9lt@solutions.solon.com>, seebs@solon.com wrote:

["Far snip;" ... is that the right `model' for that cut ;p]
 
>  The newbie uses a machine, and to the newbie, that is all machines; the
>  newbie's code runs only on that machine.

Whoa, vast generalization. Slow down. I've been at C* a bit over a
year; I'm still quite the neophyte. I _know_** what is ANSI-C and what
isn't. I know that what runs on my machine using extensions isn't what
runs everywhere.
    * My first computer language.
   ** there are many things about ANSI-C I still don't know, but what
     I know, I can distinguish between Standard and non-standard.

From day one, the code I wrote was keyed in one of three machines,
then copied to each of the other two to learn if and what made a
difference. The three OS'es are quite different:
(1) Macintosh; (2) BSD on a 386, since upgraded to an OS/2 box; and
(3) a Sun Sparc 20 running Solaris 2.4, recently upgraded to an
    Ultra running Solaris 2.5

Guess what. Today, there is a unix compressed tar archive of source
code for a GIS application I wrote sitting at:
    ftp://users.southeast.net/private/garyg

It's called mapbook. The code was originally written on the Mac,
tested on the BSD box, downloaded to the Sun, now Ultra and recompiled
without changing one line of code.
The program writes a program.
I venture to say that it will download and compile on _any_
platform (excepting that it is _no respecter_ of the 8.3
filename constraint).

This is just to say that there are some of us who can learn properly.
There are some of us who don't care about the machine, per se.
I had a task I wanted to automate; I figured out the steps to
automate it. I wrote the code within the constraints provided by
the language that I was learning to use. Over time, I added more and
more error checking, to protect the user from their own mistakes,
but I did it all by staying strictly conforming.

[snip]
>  The experienced programmer's code runs on several machines.
>  The true programmer uses no machines, and uses the specifications
>  granted by the language.  The true programmer's code runs on any
>  machine.

And, sometimes the apprentice's code does too. We ain't all ijits.

>  [snip] a portable C program *can not use any of them*.

   That's a key to reuse too.

[snip]
>  Many programmers seem to be able to push that number down,
>  by finding more and more ways to depend on things they don't need.
>  This is stupid.

    Bingo! Every problem that needs to be handled by the application
    should be evaluated with regard to the Standard library, and the
    functions provided within. If it's in the library, use it rather
    than an extension provided with the tools du jour.
    And, new programmers should check each function they use, to be
    sure they are using one that is, in fact, in the standard as
    opposed to an extension. Given a choice, use the Standard library
    functions.
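
    For instance (a small sketch; the names are just the classic
    examples): strchr() and memcpy() are in the Standard library, while
    index() and bcopy() are BSD extensions that may not exist elsewhere.

    #include <string.h>

    /* Portable: strchr() is Standard, so this compiles anywhere; the  */
    /* BSD extension index() does the same job but is not guaranteed.  */
    char *find_slash(const char *path)
    {
        return strchr(path, '/');
    }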

[much to agree with clipped]
>  -- 
>  Peter Seebach - seebs@solon.com

FWIW: There is an awful lot of chaff to sift thru in c.l.c.
Some of us, while new to the language, have seen which shoulders
to stand on, yours included. Glad there are a few beacons out there.

Cheers,

gary    /* the Sorcerer's Apprentice */
       Contribute to the Randal Schwartz Legal Defense Fund.
           This URL hosts the C Programmers' Reference:
         http://users.southeast.net/~garyg/main_page.html




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-24  0:00 ` Darin Johnson
  1996-07-25  0:00   ` Andy Askey
@ 1996-08-02  0:00   ` Patrick Horgan
  1996-08-04  0:00     ` Gary M. Greenberg
       [not found]     ` <4u76ej$7s9@newsbf02.news.aol.com>
  1996-08-05  0:00   ` Should I learn C or Pascal? Sherwin Anthony Sequeira
  2 siblings, 2 replies; 688+ messages in thread
From: Patrick Horgan @ 1996-08-02  0:00 UTC (permalink / raw)



In article <qqspaho8h3.fsf@tartarus.ucsd.edu>, djohnson@tartarus.ucsd.edu (Darin Johnson) writes:

> That's why the thread has drifted.  The original poster wanted to know
> where to get a fish.  If he learns the language that gets him a job
> now, what happens next year?  The language will change - and more
> often the way the language is used will change.  When you know how to
> program, the choice of language is just a matter of syntax and
> idiosyncracies.  If all you know is one language/methodology, then
> everything else is viewed as "a silly way of doing things".  If you
> learn abstraction, you can use it in any language.  If you learn C and
> have never seen abstraction, you're not going to use abstraction until
> years of experience cause you to use it.  These aren't things you
> learn on the job unless you work with people that also know these
> things and they make an effort to teach you (and this is becoming less
> and less likely).

You're making a few unspoken assumptions here.  First that the
previous thread had anything to do with learning abstractions, (and I'll
expand that to mean "all the good things that make the difference 
between someone that I would respect as an elegant developer"), second 
that everyone can and will learn these things if exposed to them, and most
chilling, that no one will make an effort to continue their education on
their own unless someone else takes the bull by the horns and leads them.
Nevertheless what you've said is valid and good and I agree with it.  To
show that I do, let me say what you've said only in my usual blathering 
way with large amounts of voluminous text;)

Most of the previous discussion has been along the lines of, "well, learn it
in this order and somehow that will teach you the higher level abstractions
you need to know."  I think that's silly.  Programming languages might lead
you along the path a bit, (yes even assembler even though you ~seem~ to
feel that all assembler programmers are archaic spaghetti code hacks;),
but courses in data structures, algorithms, object oriented programming,
and other abstractions do a better job.  The better intro programming
courses cover a bit of all of this along with some language, and the
people might come out of the course assuming that learning the language
taught them these things, but it's not true.  

I can write spaghetti, structured, procedural, object-oriented, or anything
you desire in ada, scheme, assembler, C++, pascal, Bourne shell, or logo.
Languages don't teach you programming, they're just the medium in which
you do it.  I can learn them in any order I want, and as long as all I'm
learning is the language constructs I'll still suck;)

Nevertheless it's obvious that these things can be learned (by some;).  Often
times the genesis is a set of rules to follow.  I remember when structured 
programming was all the rage.  Its intent was to do away with spaghetti
code in assembler and fortran and basic, and lead programmers to a better,
in that it was more understandable and more maintainable, way of programming.
Most people's exposure to this was as a set of rules such as:

o modularise
o a module should have no more than N lines/should fit on M screens
o a module should have only one entrance and exit (well, in a non-procedural
  language you had to tell people these things!)
o frequently reused code segments should be abstracted into their own module

etc...some people blindly adhered to these rules in a completely anal way that
demonstrated that they didn't get it...they thought that following the rules
was the thing, but the original intent was to show better tools to get things
done.  Like in all other things, the better people got the idea, the abstraction
behind the rules, and applied it without having to think about the rules, or
worry about whether it was ever appropriate to do something outside the rules.

Later more and more design methodologies came along.  Some people blindly adhered
to them, some people took the good ideas behind them that they were trying to
express and added them to their repertoire.  (Many projects that insisted on
blindly following a particular design methodology failed; see the literature
for many many details;)

It became clearer and clearer that people fell into a few categories: there were
those that just liked it, and learned more and more on their own all the time,
and considered it a moving, changing, growing body of knowledge.  There were
those that wanted to know "what worked", studied it, decided they knew and
wrote up whatever particular abstractions and methodologies they'd come up
with as if it was on tablets from heaven (although I know with the best
intentions).  There were those who just believed what they were told and
blindly followed it.

There were more in the last category.

I believe the first guys that want to learn and grow and add more and more tools
and concepts and paradigms to their toolkits all the time are the best.  They
are flexible.  They use what works.  They understand why it works.  They know
where the ideas for structured programming came from, they know where the ideas
for object oriented programming came from.  They've internalised these things
to the point that they are common sense, and they're still spending more time
learning about their field (and usually lots of other things in life) than
most people spent in college.  They like to think.  They like elegant abstractions.

We should be teaching free thinking and flexibility in college.  Instead, since
MANY college professors and their TAs fall into the second category (excluding
well-known examples like Pohl and Knuth and others that are the good guys),
we get rules to follow.  We get punishment for free thinking and flexibility.

It's sad, but not actually that bad, because only a small percentage of the
folks in college have the ability to be in the first group.  I don't know
what it is that separates the top guys from the rest...it's not intelligence.
I've known a lot of really bright but rigid folks in the field.

The folks that belong in the top group will most likely get there in spite of
any obstacles placed in front of them by the educational system.

So, if the question is what language should you learn to get a job, I'll still
say C++.  Most work these days in Unix or Windows is done in C++, and that's
most of the work.  Is it enough to just learn a language?  No.  Will learning
several languages in some particular order teach you what you need?  No.  Learn
everything, learn constantly, seek to always grow.  Learn what you need for
school, learn what you need for work, but don't stop there.  Learn the basic
underpinnings of algorithms and data structures...if it's hard for you read
it again.  If it's still hard get a different book and read it.  Learn until
it's common sense to you and then go on to something else.  Learn always.

If you show up on a job interview I should be able to say, "write me a quick
binary sort algorithm" and you have no problem doing it because you know how
it works...you don't have to memorise it, you create it from well understood
basic principles as needed.
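
For instance, reading "binary sort" as a binary search over a sorted
array, a sketch of the kind of answer I mean (plain C, hypothetical
names) is only a dozen lines once the principle is understood:

/* Return the index of key in a[0..n-1], or -1 if it is absent.  */
/* The array must already be sorted in ascending order.          */
int find(const int *a, int n, int key)
{
    int lo = 0, hi = n - 1;

    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;   /* midpoint, avoiding overflow */

        if (a[mid] == key)
            return mid;
        else if (a[mid] < key)
            lo = mid + 1;
        else
            hi = mid - 1;
    }
    return -1;
}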

Then you'll end up being the developer that's sought out.  You'll make more
money!  And if you're lucky, you'll end up in a startup that hired a lot of
people like you and your peers will push you to new heights, and appreciate
what makes you different from the vast bulk of your peers:)  (Can you tell
I'm happy where I work?:)  If you find yourself in a place that's stifling,
a place where you're a big fish in a pool of minnows, move on.

And always, always, always keep learning, be one of the people that learns 
new stuff first.  Be a leader.  But always learn in depth.  It doesn't really
get added to your repertoire until it becomes common sense.

-- 

   Patrick J. Horgan    patrick@broadvision.com   Have horse will ride.
   Opinions mine, not my employer's except by most bizarre coincidence.






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-07-30  0:00                     ` Robert Dewar
@ 1996-08-02  0:00                       ` Tim Behrendsen
  1996-08-03  0:00                         ` Peter Seebach
  0 siblings, 1 reply; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-02  0:00 UTC (permalink / raw)



Robert Dewar <dewar@cs.nyu.edu> wrote in article
<dewar.838737842@schonberg>...
> Paul said
> 
> "We have had CS students on placement here after 2 years of study who
> didn't even understand hexedecimal !."
> 
> Perhaps they might have understood hexadecimal
>                                       ^
> 
> yes, yes, I know, I make more spelling errors than most on newsgroup
> posts because I type in furiously and do not bother to correct, but
> this one was hard to resist :-)
> 
> I actually don't find it so terrible for CS students with two years
> of study not to know hexadecimal notation. I would be more disturbed
> if they did not know and understand what an abstract data type was,
> or what call by reference means.

Yes, but if they don't even know what hexadecimal is (or by extension
the fundamentals), do they *really* understand what an abstract data
type is, or what call by reference means?  I'm sure they could quote
the book definition, but do they really *understand* it?

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-07-30  0:00                     ` Arra Avakian
  1996-07-31  0:00                       ` James Youngman
  1996-07-31  0:00                       ` Stephen M O'Shaughnessy
@ 1996-08-02  0:00                       ` Tim Behrendsen
  1996-08-05  0:00                         ` Henrik Wetterstrom
                                           ` (2 more replies)
  1996-08-13  0:00                       ` Chris Sonnack
  3 siblings, 3 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-02  0:00 UTC (permalink / raw)



Arra Avakian <arra@inmet.com> wrote in article
<DvD01C.4v0.0.-s@inmet.camb.inmet.com>...
> In article <01bb7da2$6c505ac0$96ee6fcf@timhome2>, "Tim Behrendsen"
<tim@airshields.com> wrote:
> ..
> >The most important thing any student can learn is the stripping away
> >of the shroud of abstractions, and seeing the simplicity of what's
> >really underneath.  Once they get that, all the rest of it comes
naturally.
> >
> I can see both sides of this issue: the importance of understanding 
> abstractions, and the importance of understanding what is underneath.
> I think that a programmer needs to have a model of how a computation occurs
> in order to understand issues such as time and space efficiency of the
> computation. On the other hand, understanding the abstractions presented by
> both language constructs and by APIs is absolutely critical to being able to
> deal with complex systems. A curriculum that does not accomplish the learning
> of computing abstractions fails. It's hard for me to judge the "best" approach
> for someone learning now, when I have had the experience of learning gradually
> over a lifetime.

Let's try a thought experiment.  We take two students; Jane is taught
assembly from day 1 for two years.  John is taught C for two years.
Both are exposed to identical curriculums of algorithmic analysis,
data structures, etc.

Two years later, they switch roles.  Who will learn the other's skills
the easiest?  I say C will be the most obvious thing in the world to
Jane, and within a few weeks she will out-code John by a longshot.
She will instantly see what subroutines, automatic variables,
static variables, pointers, argument passing, call-by-value v.s.
call-by-reference, arrays, you name it, are all about.  She will
already know the concepts; it's just a matter of learning the syntax.
And if she's ever confused, she can always look at the assembly
output.

John will be hopelessly confused at first, because he will have
absolutely no grounding in what the C syntax really means.  He will
have to start completely from scratch.  His C syntactical experience
will be almost useless in learning assembly, because the mechanisms
behind the "magic" have been hidden to him.

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-07-31  0:00                 ` Stephen M O'Shaughnessy
@ 1996-08-02  0:00                   ` Tim Behrendsen
  1996-08-05  0:00                     ` Mark McKinney
                                       ` (2 more replies)
  0 siblings, 3 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-02  0:00 UTC (permalink / raw)



On Wed, 31 Jul 1996, Stephen M O'Shaughnessy wrote:
> 
> But if you are talking about designing with, say, logic ICs, I would argue that
> it can easily be done without a knowledge of resistors, capacitors, transistors 
> or even ohms law.  However, you would not be a student learning EE.  The 
> point I am trying to make is that we must be careful about what basics are
> necessary to learning a new skill.

This is what I meant, and I agree with you.  Just as a CS student is not
learning programming without assembly skills.

> I don't know much about Lady Lovelace (Namesake of the Ada programming
> language).  I wonder how much of what we today call the basics did she
> know and understand?  I don't think she understood bytes and bits.  I
> am sure, with her math background, she knew about number bases but I
> don't believe she had a working knowledge of hexidecimal numbers.  Yet
> she was able to conceive a *computer* language of sorts and use it
> to solve real world problems on a machine that did not then, nor did
> it ever exist.  She is purported to be the first programmer. Start with the 
> abstract.

Wait ... this would be like saying that since Newton invented calculus,
(Let's not start the age-old Leibnitz argument, BTW! :) ) we shouldn't
teach calculus because Newton didn't need it!  And if that was good enough
for ol' Isaac ...

Come to think of it, that's an interesting analogy. Maybe we should start
math students off with the "abstraction" of calculus, and fill in the
algebra/arithmetic details later!

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-01  0:00                     ` Sam Harris
@ 1996-08-02  0:00                       ` Eric W. Nikitin
  0 siblings, 0 replies; 688+ messages in thread
From: Eric W. Nikitin @ 1996-08-02  0:00 UTC (permalink / raw)



Sam Harris (s_harris@hso.link.com) wrote:
: In fact, while in garrison, a soldier is held ultimately responsible
: for his weapon from time of issuance until it is returned to the
: armoury and is expected to do all the cleaning and stripping
: personally. No one else does it because the soldier must know
: how to quickly disassemble and clear or even repair the weapon
: in a dark, wet hole in no man's land.

: In addition to those hard realities of military life, knowing those
: intimate details of the workings of the weapon does indeed enable
: the soldier to use it to better effect. Military specs on performance
: envelopes are always conservative. A knowledgeable soldier with the
: weapon in hand knows better how far the performance can be pushed.

: When I was an Air ROTC cadet, I wanted to be a pilot. In that program,
: you got 7 hours of flight instruction, 1 hour for solo, 3 hours of
: additional instruction including more solos. The 12th hour was a
: flight test. The candidate was expected to perform at the level of
: 40 hours of flight instruction. The reasoning being, if the candidate
: did not have the talent to "pick it up" that quickly, then performance
: would never be acceptable at the Air Force level. As a personal note,
: after 7 hours, I was a competent pilot as long as everything went
: a-okay, but I was nowhere near ready for the flight test. Like
: Dirty Harry said, "A man's got to know his limitations."

How many hours did you spend disassembling/reassembling/repairing the plane?





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-07-31  0:00 ` Darin Johnson
@ 1996-08-02  0:00   ` Alan Peake
  0 siblings, 0 replies; 688+ messages in thread
From: Alan Peake @ 1996-08-02  0:00 UTC (permalink / raw)




>> silicon i.e., making the tools. As long as the rest of us can use
>> the tools, what does it matter how they work? 

>Then the "rest of us" don't need to go to universities.  We're not
>talking about how to use tools, we're talking about learning at the
>university level.  Why bother even learning programming if "the rest
>of us" are only going to use the end-products? 

You still need to learn programming but use the tools that were made by an 
earlier generation to create something more advanced than the earlier 
generation were able to create with the tools that they inherited. There is 
only so much you can stuff into the head of a student in 4 or 5 years at Uni. 
Technology is expanding at such a rate that while it may be nice to 
understand programming from assembler up, you have to put your effort into 
where it will be most useful so if that entails starting at a higher level of 
abstraction, then so be it. There is a very interesting article by Andrew 
Koenig in the June '96 issue of the Journal of Object-Oriented 
Programming, advocating C++ as a first language - worth a read.


>This field is not static.  Few fields are static.  You always need to
>learn new things and do things differently and use new tools, even if
>you're not the one designing the new things.

Quite so.

> Do you think the
>standard state of affairs should be to require massive retraining
>whenever the industry changes?

Well, if you look at universities as training grounds for industry then that 
is what happens; not massively but progressively at any rate. 

Alan





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-07-31  0:00                 ` P. Cnudde VH14 (8218)
  1996-07-31  0:00                   ` Nicolas Devillard
@ 1996-08-02  0:00                   ` Matt Austern
  1996-08-15  0:00                     ` Lawrence Kirby
  1 sibling, 1 reply; 688+ messages in thread
From: Matt Austern @ 1996-08-02  0:00 UTC (permalink / raw)



smosha@most.fw.hac.com (Stephen M O'Shaughnessy) writes:

> >Except, IMO assembly should be used *exclusively* for the first two
> >years of a CS degree.  The first two years is usually all algorithmic
> >analysis, anyway.  There's nothing you can't learn about algorithms
> >that you can't learn and learn it better doing it in assembly.
> 
> Learn sorting algorithms in assembly?  Are you serious!?

Why not?  Volume 3 of Knuth is all about sorting algorithms, and
every program in it is written in MIX assembly language.





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-07-29  0:00                 ` Tim Behrendsen
                                     ` (2 preceding siblings ...)
  1996-07-31  0:00                   ` What's the best language to start with? [was: Re: Should I learn C or Pascal?] Arne W. Flones
@ 1996-08-02  0:00                   ` David Wheeler
  1996-08-02  0:00                     ` Peter Seebach
  1996-08-06  0:00                     ` What's the best language to start with? [was: Re: Should I learn C or Pasca StHeller
  3 siblings, 2 replies; 688+ messages in thread
From: David Wheeler @ 1996-08-02  0:00 UTC (permalink / raw)



Tim Behrendsen (tim@airshields.com) wrote:
: Robert Dewar <dewar@cs.nyu.edu> wrote in article
: <dewar.838609515@schonberg>...
: > Tim recommends
: > 
: > "Assembly -> C [non-GUI] -> C-GUI -> C++
: > 
: > Assembly:   Learn what's *really* going on.  The most important."
: > 
: > I strongly disagree. A student treading this path will have a hard time
: > learning what abstraction is all about.


: I arrived at this conclusion based on the students that were coming
: to me to be hired.  I give a standard test to all my applicants that
: tests two primary attributes,

: 1) How well they understand what's *really* going on; the best
: programmers have a solid foundation.
: 2) How well the can take a problem they've (probably) never seen
: before and generate a solution.

I strongly agree with points 1 and 2. The best programmers understand
how things really work, and are able to generate a solution from a
problem.  Makes sense to me.  We just disagree on how to get there.

: I was shocked at the results.  I had people with Masters and
: Doctorates who were completely incapable of creating new solutions
: that they had never seen before.

I also agree that many people are coming out of schools and are
incapable of creating new solutions.  I've met them myself.

The biggest problem I encounter is not that they don't understand the
first principles.  The problem is that they don't understand how to
break a problem into parts, and then solve the problem.  It's a problem
in _abstraction_, not in understanding how the pieces work.

: It's like graduating someone with a writing degree who is
: illiterate.

Continuing your analogy, what they really need is experience in writing
seriously-sized software -- and a CRITIQUE of their work by good writers.
I haven't seen much in the way of critique in universities, and students
tend to work with tiny problems (because nobody wants to evaluate big
ones).  This is an area where things really break down; it's possible
to get a Master's or PhD and not write anything longer than a thousand
lines, and with no feedback from experienced people.

: I had people, with *degrees* now,
: tell me "convert argument to binary" as one of the steps on a
: logical operation problem!  The latter are people who are ground
: in the "abstraction" of an integer, but are completely clueless
: that the computer works in binary. How can a student get a
: full-blown degree, and not understand a computer works in binary?

That's pretty clueless.  However, depending on how well they
understood their programming language, they might still produce
a reasonable result.  Why?  Because, although they may not realize
that some of these "conversion" operations have no run time impact,
they may know how to combine the features of their language to
produce solutions.

And it's the solutions, not the knowledge, that you're paying for.

I agree with you, though, you don't want such clueless people working
for you. The clueless ones won't know how to write high-performance
software where it's needed.  You want people who understand the
underlying technology _AND_ how to apply it.  But don't assume that the
former implies the latter.  In fact, given someone who knows how to
apply the technology, I can quickly get them up to speed on how it works.


: Bottom line, a student cannot fundamentally understand an
: abstraction until they understand the abstraction
: in terms of the fundamental components of programming; again:
: Move, Arithmetic, Test, Branch, and Logicals.

Any professional developer must understand several assembly languages,
and how they work in general.  I think that goes without saying.  That
doesn't mean it should be their first language.  There are various
pedagogical approaches, and I haven't seen any strong evidence for this
approach.  I think assembly language is excessively difficult as a
first step, and it obfuscates other necessary skills (like learning how
to abstract things).  A short course in how they work might be okay,
but serious work starting with assembly would IMHO be a disservice.

The language you use also determines the amount of detail
you need to know about the underlying machine abstraction.
When you're using C or C++, you really need to know how the
underlying assembly language works. Why? Because pointer arithmetic
makes little sense if you don't, and tracking down rogue pointers
overwriting your local stack values and other nasties is impossible if
you don't understand how underlying machines work.  Most of the
basic C/C++ operations are defined as "whatever your local machine does",
so it's really part of the C/C++ definition.

A professional developer using other languages should still know this
level of detail, but it's frankly less important in Java, Ada 95, Eiffel,
and Smalltalk (among others).  Who cares that "if" becomes a branch
instruction? Unless you call outside the language, you won't even get
to see how some capabilities are implemented in some languages. The
result: the detail you need to know to solve a problem shrinks.  Thus,
you can concentrate on learning how to abstract - the real problem.

In the immortal words of managers everywhere: Who cares how the
underlying system works? When will the code be ready?


: CS is currently taught the way you describe.  Why do you think
: it's so hard to teach abstractions at the elementary level?

It's not clear it's REALLY taught that way.  Lots of schools just give
a quick intro to a programming language or two, and then show other
people's abstractions (instead of teaching how to abstract).  Besides,
many students just "get by" instead of learning, and many people
find abstraction really hard to grasp.


: This is what convinced me that our CS education is being done
: completely wrong.  I've said this before [excuse my repetition];
: it would be as if EE students were taught IC design in the first
: course, and were only given resistors, capacitors, Ohm's law,
: etc. in their senior year, almost as an afterthought!

But even you are suggesting that there be an abstraction for
software developers.  Computers use resistors and
capacitors, too. Why not require a full EE degree?
The answer: because you don't REALLY need to know that level of detail;
an abstraction of a computer is good enough.

I've got a EE and CS degree; given sand and enough time I could build
a computer, and then build an operating system and compiler on top of it.
But in most of my work I don't need that level of low-level
understanding.  What I do need to know is how to abstract a problem
into parts so I can solve it.  I agree with Mr. Dewar - abstraction
is most important, worry about that first.

Thus: If solving problems is the key skill, start with a high-level
language that lets you concentrate on learning how to solve problems.


--- David A. Wheeler
Net address: wheeler@ida.org





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-02  0:00                     ` Peter Seebach
  1996-08-02  0:00                       ` Gary M. Greenberg
@ 1996-08-03  0:00                       ` Alf P. Steinbach
  1996-08-02  0:00                         ` Peter Seebach
  1996-08-05  0:00                       ` Chris Sonnack
  2 siblings, 1 reply; 688+ messages in thread
From: Alf P. Steinbach @ 1996-08-03  0:00 UTC (permalink / raw)



Peter Seebach wrote:
> 
> In article <4ttdlg$2pr@news.ida.org>,
> David Wheeler <wheeler@aphrodite.csed.ida.org> wrote:
> >That's pretty clueless.  However, depending on how well they
> >understood their programming language, they might still produce
> >a reasonable result.  Why?  Because, although they may not realize
> >that some of these "conversion" operations have no run time impact,
> >they may know how to combine the features of their language to
> >produce solutions.
> 
> And, if their language doesn't specify a binary representation, they may get
> an algorithm right on a weird machine that still uses BCD or decimal
> internally.

Been to a museum lately?



(A)
> >Any professional developer must understand several assembly languages,
> >and how they work in general.
> 

(B)
> I think this is no longer true.  Understanding an assembler buys you nothing;
> all it tells you is that at least one machine had a certain kind of semantics
> inside it.  You *don't care*.  If you're writing a *solution* to a *problem*,
> those are the only things you need to be working with.

Pardon me, but this is utter bullshit, both (A) and the response (B).  That
(A) is untrue is not necessary to discuss.  (B) is more subtle, but builds on
several assumptions, which can be summed up as a "mathematicians" view of
programming:  only abstract semantics matter.  At least when discussing C,
the most popular high level assembler in existence, that argument is
clearly not valid.  Could be valid in other contexts, though.



> If you try to make sure your solution fits your imaginings of the underlying
> machine, you constrain it unnaturally.  Leave the unnatural constraints for
> the rare occasions when your code actually has a detectable performance
> problem that can't be quickly traced to a broken algorithm.

On the surface, this is good advice.  Like Knuth said, "Premature optimization
is the root of all evil" (or something on these lines).  But that's the surface.
The advice as sound advice assumes that the design is *already* guided by a
good understanding of the underlying machine.  The advice as misguiding advice
says something quite different:  disregard the machine completely.  That's
stupid.  When you build a house, you need both a good understanding of the plans 
(abstractions) and the physical materials and other resources.  Disregarding or
playing down the importance of one or the other leads to failure.


> You have attained only the second level of enlightenment.
> 
> The newbie uses a machine, and to the newbie, that is all machines; the
> newbie's code runs only on that machine.
> 
> The experienced programmer uses several machines, and has learned how each
> behaves.  The experienced programmer's code runs on several machines.
> 
> The true programmer uses no machines, and uses the specifications granted by
> the language.  The true programmer's code runs on any machine.

A very nice goal (getting paid for programming without using computers would
indeed be ideal).  But still, porting isn't quite that easy... :-)



- Alf   (quite aware of the flame-inviting nature of this thread... :-) )




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-07-31  0:00                   ` AJ Musgrove
                                       ` (2 preceding siblings ...)
  1996-08-01  0:00                     ` Ken Pizzini
@ 1996-08-03  0:00                     ` Raffael Cavallaro
  1996-08-05  0:00                       ` Chris Sonnack
  3 siblings, 1 reply; 688+ messages in thread
From: Raffael Cavallaro @ 1996-08-03  0:00 UTC (permalink / raw)


[-- Warning: decoded text below may be mangled, UTF-8 assumed --]
[-- Attachment #1: Type: text/plain, Size: 1814 bytes --]


In article <4toc18$4j0@ns3.iamerica.net>, amusgrov@varmm.com (AJ Musgrove)
wrote:


> In the military, soldiers are taught how to disassemble guns.
> Why? There are people who could do that for them. If one FULLY understands the
> tools they are using, one can use them better. Period.

Soldiers don't take their weapons apart and reassemble them so they can
"fully" understand them. Most infantrymen are not too bright, and they
sure as hell don't "fully understand" how the blowback mechanism of an
automatic rifle functions. Soldiers are taught to disassemble guns for a
simple reason--in order to clean the parts and keep their weapons in
operating condition. In the field of battle, there aren't gun repair
shops, so you have to do it yourself.

 Unless you're planning to use your computer away from civilization (and
some people do--I used to when I did field archaeology) you don't need to
know how to take it apart and put it back together -- there are
professional technicians for that.

Of course, if you're writing software, it is advantageous to know how your
microprocessor works, what its instructions are, busses, registers etc. so
you can code for speed/low memory if you need to. Of course, if you're not
writing such software, you can remain blissfully ignorant of such things.
To this day I have no idea what the processor instructions or machine
architecture were on the system I first learned to program on, but it
didn't stop me from writing fully functional software. Of course, my
programs might have been a bit faster if I had known, but a decent
compiler often obviates the one time necessity to "get down to the metal."
This is only becoming *increasingly* true. As compiler technology
advances, assembly language programming is becoming less and less
necessary.

just my 2¢

Raf




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-01  0:00     ` Stephen M O'Shaughnessy
@ 1996-08-03  0:00       ` Tim Behrendsen
  1996-08-06  0:00         ` Stephen M O'Shaughnessy
  0 siblings, 1 reply; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-03  0:00 UTC (permalink / raw)



Stephen M O'Shaughnessy <smosha@most.fw.hac.com> wrote in article
<DvH2MI.n40@most.fw.hac.com>...
> In article <01bb7fcc$c5a98de0$87ee6fce@timpent.airshields.com>, 
> tim@airshields.com says...
> Learn sorting algorithms in assembly?  Are you serious!?

Quite serious, but why would you think sorting algorithms are
especially hard?  That class of algorithm would be of identical
difficulty in C or Asm.

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-07-30  0:00                   ` Paul Campbell
  1996-07-30  0:00                     ` Robert Dewar
@ 1996-08-03  0:00                     ` Patrick Horgan
  1996-08-04  0:00                       ` Kurt E. Huhner
  1 sibling, 1 reply; 688+ messages in thread
From: Patrick Horgan @ 1996-08-03  0:00 UTC (permalink / raw)



I interviewed someone with a master's in CS and years of experience recently
who couldn't correctly code a strcmp() function.  He's been working with C++,
yet thought a constructor returned something.  Things like this are not the
exception but the rule in the interviews I do...it's sad.  How do people stay
in the industry when they don't even know the basic use of their tools?



In article <31FDFB6C.2781E494@att.com>, Paul Campbell <Paul_Campbell__Mr@att.com> writes:
> >>>>>>
> I was shocked at the results.  I had people with Masters and
> Doctorates who were completely incapable of creating new solutions
> that they had never seen before.  I had people, with *degrees* now,
> tell me "convert argument to binary" as one of the steps on a
> logical operation problem!  The latter are people who are ground
> in the "abstraction" of an integer, but are completely clueless
> that the computer works in binary. How can a student get a
> full-blown degree, and not understand a computer works in binary?
> It's like graduating someone with a writing degree who is
> illiterate.
> <<<<<<<
> 
> We have had CS students on placement here after 2 years of study who
> didn't even understand hexedecimal !.
> 
> Paul C.
> UK.



-- 

   Patrick J. Horgan    patrick@broadvision.com   Have horse will ride.
   Opinions mine, not my employer's except by most bizarre coincidence.






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-02  0:00                       ` Tim Behrendsen
@ 1996-08-03  0:00                         ` Peter Seebach
  1996-08-04  0:00                           ` Alf P. Steinbach
  0 siblings, 1 reply; 688+ messages in thread
From: Peter Seebach @ 1996-08-03  0:00 UTC (permalink / raw)



In article <01bb8023$4bfe7a80$96ee6fcf@timhome2>,
Tim Behrendsen <tim@airshields.com> wrote:
>Yes, but if they don't even know what hexadecimal is (or by extension
>the fundamentals), do they *really* understand what an abstract data
>type is, or what call by reference means?  I'm sure they could quote
>the book definition, but do they really *understand* it?

Certainly.  Knowing hex won't buy you anything WRT abstract data types.
Knowing the bottom half of the machine might help you understand the
*mechanics* of call by reference.  It would not help you understand the
*implications*.  You can explain the implications easily using metaphors,
and most students will quickly grasp why these operate differently.
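
For example, a minimal sketch of the distinction as it appears in C,
where "call by reference" is simulated by passing a pointer by value:

#include <stdio.h>

void by_value(int x)      { x = 99; }    /* changes a private copy      */
void by_reference(int *x) { *x = 99; }   /* changes the caller's object */

int main(void)
{
    int n = 1;

    by_value(n);
    printf("%d\n", n);     /* still 1 */
    by_reference(&n);
    printf("%d\n", n);     /* now 99  */
    return 0;
}

The mechanics underneath may be registers, stacks, or something
stranger; the implications above are the same everywhere.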

-s


-- 
Peter Seebach - seebs@solon.com - Copyright 1996 - http://www.solon.com/~seebs
Unix/C Wizard - send mail for help, or send money for consulting!
The *other* C FAQ, the hacker FAQ, et al.  See web page above.
Unsolicited email (junk mail and ads) is unwelcome, and will be billed for.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-03  0:00                         ` Peter Seebach
@ 1996-08-04  0:00                           ` Alf P. Steinbach
  1996-08-04  0:00                             ` Peter Seebach
  0 siblings, 1 reply; 688+ messages in thread
From: Alf P. Steinbach @ 1996-08-04  0:00 UTC (permalink / raw)



Peter Seebach wrote:
> 
> In article <01bb8023$4bfe7a80$96ee6fcf@timhome2>,
> Tim Behrendsen <tim@airshields.com> wrote:
> >Yes, but if they don't even know what hexadecimal is (or by extension
> >the fundamentals), do they *really* understand what an abstract data
> >type is, or what call by reference means?  I'm sure they could quote
> >the book definition, but do they really *understand* it?
> 
> Certainly.  Knowing hex won't buy you anything WRT abstract data types.

The question had nothing whatsoever to do with the benefits of knowing hex,
but with the absence of knowledge of such a fundamental matter.  Knowing that
a car needs gasoline won't buy you anything with respect to driving a
car (so long as that knowledge never becomes important, as hex numbers can
become important, e.g. while designing iostreams).  However, I wouldn't
entrust a car to a person who doesn't know about gasoline.


> Knowing the bottom half of the machine might help you understand the
> *mechanics* of call by reference.  It would not help you understand the
> *implications*.  You can explain the implications easily using metaphors,
> and most students will quickly grasp why these operate differently.


I have a really hard time understanding why students should *not* learn
about the physical machine.  If they're not able to grasp a simple thing
like hex and binary, then they most certainly will not be able to grasp
abstract datatypes (design, implementation and usage of).  Not for lack
of the fundamentals  --  as you've pointed out, it is possible to reason
at any given level of abstraction  --  but for lack of brains.  Any
reasonably intelligent student will understand hex & bin after just a
few minutes of explanation, and will understand conversion methods
after just an hour or so of carrying out exercises.  If not, then the
person has no business studying C.Sc. or programming.
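
For what it's worth, the conversion being talked about is only a few lines
of C; a minimal sketch (the value 173 and the 32-bit width are arbitrary
assumptions for illustration):

    #include <stdio.h>

    int main(void)
    {
        unsigned int n = 173;
        int bit;

        printf("%u in hex is %x\n", n, n);        /* printf already knows hex */

        printf("%u in binary is ", n);
        for (bit = 31; bit >= 0; bit--)           /* assumes a 32-bit unsigned int */
            putchar(((n >> bit) & 1) ? '1' : '0');
        putchar('\n');
        return 0;
    }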

So, the cost of learning the fundamentals is very low, while the payoff
is great.  It's a simple & fundamental principle of teaching and learning:
build up from the known and concrete, instead of down from airy
abstractions.  And I expect you do the same (I do!):  when confronted with
some problem which seems too abstract and complicated, I always find it
beneficial to consider a simplified and more concrete version first.

Leaving out the fundamentals simply means the student will fill the
vacuum at the bottom with something out of fantasy and intuition.  I
can't see any good reason  --  or even any reason at all  --  for
actively suppressing / leaving out this information.


- Alf




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-04  0:00                           ` Alf P. Steinbach
@ 1996-08-04  0:00                             ` Peter Seebach
  1996-08-04  0:00                               ` Jerry van Dijk
  1996-08-05  0:00                               ` Tim Behrendsen
  0 siblings, 2 replies; 688+ messages in thread
From: Peter Seebach @ 1996-08-04  0:00 UTC (permalink / raw)



In article <320433CE.6D5A@online.no>, Alf P. Steinbach <alfps@online.no> wrote:
>The question had nothing whatsoever to do with the benefits of knowing hex,
>but with the absence of knowledge of such a fundamental matter.  Knowing that
>a car needs gasoline won't buy you anything with respect to driving a
>car (so long as that knowledge never becomes important, as hex numbers can
>become important, e.g. while designing iostreams).  However, I wouldn't
>entrust a car to a person who doesn't know about gasoline.

Hex isn't a part of the operation of a computer; it's a possible interface
we can use, but we don't have to.  I would trust a car to someone who
hadn't seen automatic locks, because they don't generally need to know.

>I have a really hard time understanding why students should *not* learn
>about the physical machine.

Because it will incline them to assume that the physical machine is
a constant.  Beginners are constantly writing programs which assume
that you can treat a pointer to int as a pointer to char, and read the
bottom 8 bits of the integer out of it.
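
A sketch of the habit being described, next to the portable way of doing the
same thing (the constant and the variable names are invented for
illustration):

    #include <stdio.h>

    int main(void)
    {
        int n = 0x1234;

        /* The beginner's assumption: the first byte of an int holds its
           low 8 bits.  True on little-endian machines only. */
        unsigned char guess = *(unsigned char *)&n;

        /* The portable way to ask for the low 8 bits of the value. */
        unsigned char low = n & 0xFF;

        printf("guess=%x  portable=%x\n", guess, low);
        return 0;
    }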

>If they're not able to grasp a simple thing
>like hex and binary, then they most certainly will not be able to grasp
>abstract datatypes (design, implementation and usage of).

Probably.  But similarly, no student who can't learn Chinese, Spanish, or
German is likely to be able to comprehend abstract data types.  This is *not*
sufficient reason to require students to learn these languages in the CS
major.

>Not for lack
>of the fundamentals  --  as you've pointed out, it is possible to reason
>at any given level of abstraction  --  but for lack of brains.  Any
>reasonably intelligent student will understand hex & bin after just a
>few minutes of explanation, and will understand conversion methods
>after just an hour so of carrying out exercises.  If not, then the
>person has nothing whatsoever to do studying C.Sc. or programming.

True.  I would actually be inclined to think that people should be exposed
to hex and binary, because these are, themselves, useful *abstractions*.
Especially hex.  Even binary is a useful abstraction.  Thinking about bits
is not necessarily bad, and may be helpful.  Believing you know where they
are is downright dangerous.

>So, the cost of learning the fundamentals is very low, while the payoff
>is great.  It's a simple & fundamental principle of teaching and learning:
>build up from the known and concrete, instead of down from airy
>abstractions.  And I expect you do the same (I do!):  when confronted with
>some problem which seems too abstract and complicated, I always find it
>beneficial to consider a simplified and more concrete version first.

Uhm.  I prefer to think of simplified and more abstract versions.  But then, I
was raised by mathematicians; I consider the physical world to be a special
case.  :)

>Leaving out the fundamentals simply means the student will fill the
>vacuum at the bottom with something out of fantasy and intuition.  I
>can't see any good reason  --  or even any reason at all  --  for
>actively suppressing / leaving out this information.

I guess I distinguish between the fundamentals and *any* form of assembly.
The fundamentals that are worth teaching are the fundamentals of *all*
machines.  I think a study of a Turing machine would be a great thing for
students.  You could make a Turing-machine simulator for them.  (Obviously,
it would only simulate finite machines...)

But studying a *specific* architecture, or even two, is very dangerous to
young programmers.  Young minds are impressionable, and people find it much
harder to unlearn than to learn.  It took me *years* to get out of some
of my early bad habits.

A lot of programmers used to habitually use a null pointer as a shortcut
for "" for string functions; you'd see
	if (strcmp(s, 0))
used like
	if (strcmp(s, ""))
because it worked on some machines.

For a concrete example, about a year ago someone was writing in about the
famed "null pointer assignment" message Borland C would emit after his
programs ran.  He didn't want to know what caused it; he *knew* what caused
it.

He wanted to turn it off.

See, he'd discovered that
	if (strcmp(gets(NULL), "hello"))
let him save a variable.

Students should be kept *away* from the kind of mindset that makes this seem
reasonable.  That feeling that, now that I know what's happening, I can take
advantage of it - heady, but ultimately dangerous.

Many books on C encourage beginners to experiment with expressions like
	a[i] = ++i;
and "see what happens".  The exercise is generally like "think about this.
What do you think will happen?  Write a test program to try it.  See what
really happens.  What does this tell you about ..."

Ugh!

(For those who don't know:  The expression is completely devoid of meaning
in standard C.  There are two obvious interpretations, but in fact, it's
not merely a bit odd, but *completely undefined*.  A compiler may end up
corrupting memory if you pull tricks like this.)

(And no, no combination of ()'s, or moving the ++ around, will make that
expression valid or meaningful C.)
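
If anything, the exercise worth setting is to rewrite the expression so that
it says one thing unambiguously.  A minimal sketch of one well-defined
reading (the array and index are hypothetical):

    #include <stdio.h>

    int main(void)
    {
        int a[4] = {0, 0, 0, 0};
        int i = 1;

        /* One unambiguous reading: increment first, store at the new index. */
        i = i + 1;
        a[i] = i;

        printf("i=%d a[2]=%d\n", i, a[2]);   /* prints i=2 a[2]=2 */
        return 0;
    }

(If the old index was meant, write a[i] = i + 1; and then i = i + 1; -- either
way, the order is now stated rather than left to the compiler.)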

-s
-- 
Peter Seebach - seebs@solon.com - Copyright 1996 - http://www.solon.com/~seebs
Unix/C Wizard - send mail for help, or send money for consulting!
The *other* C FAQ, the hacker FAQ, et al.  See web page above.
Unsolicited email (junk mail and ads) is unwelcome, and will be billed for.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-08-01  0:00       ` Tony Konashenok
@ 1996-08-04  0:00         ` Lawrence Kirby
  1996-08-09  0:00         ` Verne Arase
  1 sibling, 0 replies; 688+ messages in thread
From: Lawrence Kirby @ 1996-08-04  0:00 UTC (permalink / raw)



In article <4tqoru$kdn@overload.lbl.gov>
           tonyk@sseos.lbl.gov "Tony Konashenok" writes:

>One of my pet peeves about C is REALLY poor diagnostics from the vast
>majority of compilers. Certain easily-correctable syntax errors cause
>a fatal error when they could easily be fixed by the compiler and only induce
>a warning;

You're saying that the compiler complaining in the strongest terms about
invalid syntax is poor diagnostics? I call it an essential diagnostic.
I do not want the compiler to second-guess my intent; it can all too easily
get it wrong (even another programmer can easily misjudge where that
extra brace is supposed to go). A syntax error means the source code
needs fixing, and a good compiler will highlight that and not try to
work around it.

>many error messages point very far from the place where the
>error actually happened (well, it's mostly the language syntax that caused
>it, but compilers could also be somewhat smarter);

Some compilers are better than others, I don't find this to be much of
a problem.

-- 
-----------------------------------------
Lawrence Kirby | fred@genesis.demon.co.uk
Wilts, England | 70734.126@compuserve.com
-----------------------------------------




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-03  0:00                     ` Patrick Horgan
@ 1996-08-04  0:00                       ` Kurt E. Huhner
  0 siblings, 0 replies; 688+ messages in thread
From: Kurt E. Huhner @ 1996-08-04  0:00 UTC (permalink / raw)
  To: patrick; +Cc: Timothy E. Raborn


Patrick Horgan wrote:
> 
> I recently interviewed someone with a master's in CS and years of experience
> who couldn't correctly code a strcmp() function.  He's been working with C++,
> yet thought a constructor returned something.  Things like this are not the
> exception but the rule in the interviews I do...it's sad.  How do people stay
> in the industry when they don't even know the basic use of their tools?

<DREAM>

I know!!! I have a Master's degree and roughly ten years of experience.  I
am a GUI software engineer with extensive experience in my field, including
C.  It kills me (actually makes me laugh) when a customer comes to me and
asks for solid code and then moans over the rate @ $80/hr.

They need to go and find one of these guys...

> who couldn't correctly code a strcmp() function.

</DREAM>


-- 


~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
      Check out "How'd You Do That? X/Motif Tips & Tricks," 
         a monthly column written for The X Advisor at
         http://www.unx.com/DD/advisor/huhner/toc.shtml.
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

============================================================================
Kurt E. Huhner, Software Consultant     MailTo:khuhner@ncs-ssc.com
Nation Computer Services, Inc.          http://www.ncs-ssc.com
Bldg. 9110, MSAAP                       Voice: (601)689-8100
Stennis Space Center, MS 39529          Fax:   (601)689-8130
============================================================================




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-08-02  0:00   ` Patrick Horgan
@ 1996-08-04  0:00     ` Gary M. Greenberg
       [not found]     ` <4u76ej$7s9@newsbf02.news.aol.com>
  1 sibling, 0 replies; 688+ messages in thread
From: Gary M. Greenberg @ 1996-08-04  0:00 UTC (permalink / raw)



Eloquently, patrick@broadvision.com wrote:

>  you have no problem doing it because you know how it works...
>  you don't have to memorise it, you create it from well understood
>  basic principles as needed.
> [ ... ] 
>  If you find yourself in a place that's stifling,
>  a place where you're a big fish in a pool of minnows, move on.
>  
>  And always, always, always keep learning, be one of the people
>  that learns new stuff first.  Be a leader.  But always learn in
>  depth.  It doesn't really get added to your repertoire until it
>  becomes common sense.
>  

    The quintessential statement of success! Bravo!

>     Patrick J. Horgan    patrick@broadvision.com   Have horse will ride.

gary    /* the Sorcerer's Apprentice */




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-05  0:00                               ` Tim Behrendsen
@ 1996-08-04  0:00                                 ` Peter Seebach
  1996-08-05  0:00                                   ` Chris Sonnack
  1996-08-06  0:00                                   ` Tim Behrendsen
  0 siblings, 2 replies; 688+ messages in thread
From: Peter Seebach @ 1996-08-04  0:00 UTC (permalink / raw)



In article <01bb826a$b222f400$9bee6fce@timhome2>,
Tim Behrendsen <tim@airshields.com> wrote:
>Peter Seebach <seebs@solutions.solon.com> wrote in article
><4u1g29$hc0@solutions.solon.com>...
>> A lot of programmers used to habitually use a null pointer as a shortcut
>> for "" for string functions; you'd see
>> 	if (strcmp(s, 0))
>> used like
>> 	if (strcmp(s, ""))
>> because it worked on some machines.

>This completely makes the point.  Someone who knows assembly language
>would *never* make that mistake, because it would be so obviously
>wrong.

An interesting theory.  Unfortunately, I long since lost track of these
articles, but in the old "use the OS vs. bang the metal" flame wars in
comp.sys.amiga.programmer, the assembly people would always talk about
how the first instruction in their programs was always
	movel #0,0L
(Or however it is that you spell "write 4 bytes of 0 to address 0")...

A high-level programmer would never do that, because someone who has
learned only the abstract semantics would not think of a null pointer
as the address of any real memory.  It isn't; it's *guaranteed* that
no object has a null pointer as its address.

>You cannot program in assembly and not understand a null
>pointer v.s. a pointer to a zeroed memory location.

And you cannot program in abstract C and think of a null pointer as a pointer
to *any* memory location.  On many machines that I use, you can't read or
write to it; it certainly doesn't *act* like it's memory.

The people who did things like that were always assembler programmers relying
on their knowledge of what was "really" happening inside the machine.  The C
abstract semantics clearly forbid it; however, on the majority of common
architectures, the hardware doesn't forbid it, and the assumption holds.
(By chance, mostly.)

-s
-- 
Peter Seebach - seebs@solon.com - Copyright 1996 - http://www.solon.com/~seebs
Unix/C Wizard - send mail for help, or send money for consulting!
The *other* C FAQ, the hacker FAQ, et al.  See web page above.
Unsolicited email (junk mail and ads) is unwelcome, and will be billed for.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-04  0:00                             ` Peter Seebach
@ 1996-08-04  0:00                               ` Jerry van Dijk
  1996-08-05  0:00                               ` Tim Behrendsen
  1 sibling, 0 replies; 688+ messages in thread
From: Jerry van Dijk @ 1996-08-04  0:00 UTC (permalink / raw)



Peter Seebach (seebs@solutions.solon.com) wrote:

: True.  I would actually be inclined to think that people should be exposed
: to hex and binary, because these are, themselves, useful *abstractions*.

I know, I shouldn't allow myself to be dragged into this, but...

I keep wondering if age/background might have something to do with the
positions taken. Me, I first learned about programming using a 4004
front panel system. And I tend to agree with the 'abstractions first'
school of thought.

Anyway, future student generations will be thankful that we are programmers,
not educators :-))  (Well, most of us :-)





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-03  0:00                     ` Raffael Cavallaro
@ 1996-08-05  0:00                       ` Chris Sonnack
  0 siblings, 0 replies; 688+ messages in thread
From: Chris Sonnack @ 1996-08-05  0:00 UTC (permalink / raw)



Raffael Cavallaro (raffael@tiac.net) wrote:

>> In the military, soldiers are taught how to disassemble guns.
>> Why? There are people who could do that for them. If one FULLY understands
>> the tools they are using, one can use them better. Period.
>
> Soldiers don't take their weapons apart and reassemble them so they can
> "fully" understand them. Most infantrymen are not too bright, and they
> sure as hell don't "fully understand" how the blowback mechanism of an
> automatic rifle functions. Soldiers are taught to disassemble guns for a
> simple reason--in order to clean the parts and keep their weapons in
> operating condition. In the field of battle, there aren't gun repair
> shops, so you have to do it yourself.

Well, in this case, you're both right.

It's the old, "You don't have to know how a car works to use one" argument.

And that's basically true...as far as it goes.

BUT! You'll, I think, be a //better// driver if you DO know; and you'll be
in a FAR better position to help yourself when you break down and there's
no other help in sight.

--
Chris Sonnack  <cjsonnack@mmm.com>                  http://eishcq.mmm.com
Engineering Information Services/Information Technology/3M, St.Paul, Minn
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
UNIX is a very user-friendly operating system ...
			... it's just picky about who it's friends with

Opinions expressed herein are my own and may not represent those of my employer.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-05  0:00                                   ` Chris Sonnack
@ 1996-08-05  0:00                                     ` Peter Seebach
  1996-08-07  0:00                                       ` Tom Watson
  1996-08-05  0:00                                     ` Tim Hollebeek
  1 sibling, 1 reply; 688+ messages in thread
From: Peter Seebach @ 1996-08-05  0:00 UTC (permalink / raw)



In article <4u5682$qgs@dawn.mmm.com>, Chris Sonnack <cjsonnack@mmm.com> wrote:
>Peter Seebach (seebs@solutions.solon.com) wrote:
>>> This completely makes the point.  Someone who knows assembly language
>>> would *never* make that mistake, because it would be so obviously
>>> wrong.

>> An interesting theory.

>Not a theory. I've seen it personally.

A theory, and a false one, because I've seen an assembly programmer, not only
make that mistake, but attempt to justify it based on his "knowledge" of the
machine.

>Because I know what "" really is under the hood, I know it has nothing in
>common with a NULL pointer. And I've seen a co-worker, a fellow professional
>who's been doing this for a number of years, make the above mistake. Not
>just once, but several times.

With no concept at all of how "" really works, it's obvious that it has
nothing in common with a null pointer.  It's an object, or the address of an
object in some contexts, and is very unlikely to have any connection to the
null pointer.

>Yep. True. (Although, honestly, 90+% of all implementations define NULL
>along the lines of: (void*)0) 

Or just plain 0.  They are required to.  However, *THIS DOES NOT MEAN IT IS
THE ADDRESS 0*.

There is no reason to expect that
	(void *)0
and
	(void *) 0xFFFFFFFF
will not compare equal.

0 must convert to a null pointer.  This does *not* mean that a null pointer is
a 0.
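
In source form the distinction looks like this -- a minimal sketch; nothing
here says what bit pattern the null pointer actually uses:

    #include <stdio.h>

    int main(void)
    {
        char *p = 0;        /* the constant 0 converts to a null pointer */

        if (p == 0)          /* true: p is null, whatever its representation */
            printf("p is a null pointer\n");

        /* A null pointer need not be all-zero bits, and the address of a
           real object is never null. */
        return 0;
    }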

>This is also true. The point is "" is a real object in memory, and any
>assembly programmer would know that. What the NULL pointer is doesn't
>matter here; the point is that "" ain't the NULL address.

But there's nothing wrong with the null address working as a "", and the
assumption visibly holds on many machines.

A *competent* assembly programmer would never make that mistake.  Neither would
a *competent* C programmer.  (IMHO.)  I don't believe that either the assembly
or the non-assembly background grants any protection.  People who are inclined
to try things and assume that they'll continue to work are dangerous in any
language.

I would suspect that people who want to learn assembly are slightly more prone
to this weakness, but it's hard to be sure.

>How's my programming?  Call 1-800-DEV-NULL

Cute.

-s
-- 
Peter Seebach - seebs@solon.com - Copyright 1996 - http://www.solon.com/~seebs
Unix/C Wizard - send mail for help, or send money for consulting!
The *other* C FAQ, the hacker FAQ, et al.  See web page above.
Unsolicited email (junk mail and ads) is unwelcome, and will be billed for.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-05  0:00                       ` Chris Sonnack
@ 1996-08-05  0:00                         ` Peter Seebach
  0 siblings, 0 replies; 688+ messages in thread
From: Peter Seebach @ 1996-08-05  0:00 UTC (permalink / raw)



In article <4u55hu$o73@dawn.mmm.com>, Chris Sonnack <cjsonnack@mmm.com> wrote:
>Peter Seebach (seebs@solutions.solon.com) wrote:
>> Understanding an assembler buys you nothing;
>                                      ^^^^^^^
>That's taking it too far the other way.

Oh, my, I think my fanaticism is showing again.

>Understanding assembler //helps// you learn C faster. I've seen it with
>my own eyes and experienced it personally (see previous post). I wouldn't
>care to code assembly now; I don't think in terms of assembly when I write
>C; and I sure as hell don't use the asm() feature of C. But knowing what
>it is that I'm really turning out has been helpful.

C has no asm() feature.  There is no provision in the C language for inline
assembly.

A little knowledge is dangerous.  In the hands of an inexperienced programmer
learning C, assembly is like a Herb Schildt book - it gets you into a language
much faster.  Unfortunately, the language you learn isn't really C.  Assembly
tends to teach you a language in which the effects of overflow on signed
integers and floating point numbers are well defined, where address literals
are meaningful, and where you know all sorts of useful things about the layout
of memory.

An interesting language, but not C.

Some people manage to avoid this trap.  To them, assembly is powerful and
helpful, and exposes them to a good understanding of the underlying system.
But this is a good thing only when you can distinguish between a particular
underlying system, and the semantics of the C language.

>And I've been able to troubleshoot some really bizarre problems FASTER by
>either looking at the ASM output (via the -S switch most compilers support)
>or by tracing code in a debugger at the ASM level. Not that I couldn't have
>fixed the problem at a high level; I just did it //faster// at a low one.

I have a compiler where the ASM output is *NOT* used in normal compilation; it
may differ in internals or semantics from the normal compilation, which
compiles to an intermediate format, and then direct to object code.

>Agree on both points. But, again, you're a more effective tool if you
>DO know what's going on "under the hood".

If it's really going on, yes.  But it's *very* dangerous when you get confused
about what's really going on vs. what may have gone on once.

>Absolutely. Programming is about solving problems. A true programmer knows
>how to analyze a problem and synthesize a solution. The language doesn't
>matter; not really. The language is just the local way of expressing the
>solution.

I've written a couple of languages.  In my experience, you can tell whether a
language is complete or not by whether or not you can program in it.  If you
can find tasks you can't do, the language isn't finished yet.

>HOWEVER: it seems obvious to me that the more you know about your craft,
>the better you will be at deriving the most effective solution. And the
>better you'll be at dealing with problems along the path to that solution.

Probably.  I may eventually learn an assembly or two, out of curiosity.  But
I'm more likely to learn Icon, which I love but do not comprehend, and Scheme,
first.  They'll teach me a lot more.  Assembly is just unportable, limited C,
without structures.  Why bother?  I know that model already.

-s
-- 
Peter Seebach - seebs@solon.com - Copyright 1996 - http://www.solon.com/~seebs
Unix/C Wizard - send mail for help, or send money for consulting!
The *other* C FAQ, the hacker FAQ, et al.  See web page above.
Unsolicited email (junk mail and ads) is unwelcome, and will be billed for.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-01  0:00 ` Stefan 'Stetson' Skoglund
@ 1996-08-05  0:00   ` Stephen M O'Shaughnessy
  1996-08-06  0:00     ` Bob Gilbert
  1996-08-06  0:00   ` Patrick Horgan
  1 sibling, 1 reply; 688+ messages in thread
From: Stephen M O'Shaughnessy @ 1996-08-05  0:00 UTC (permalink / raw)



My response to an E-mail from Patrick Horgan

Pat, 
 I tried to e-mail this back to you and it bounced, so I am posting it here on CLA

At 04:41 PM 8/4/96 -0700, you wrote:

>
>I think you and most of the people in this discussion err in thinking that
>there's this artificial dichotomy between specific and abstract knowledge,
>and that there's some reason one is better to learn first.  In the real
>world both are learned together, watch any kids.  We're always learning
>specifics and abstracting generalities.

I disagree.  I think we learn generalities and as we continue learning we 
pick up the specifics.  Continued learning is the learning of the specifics.

>
>> tie our shoes without knowledge of knot theory.  We learn to read
>> without a knowledge of grammar. We learn addition, subtraction, multi.
>> and division without any formal mathematical proofs.  In fact learning basic 
>> arithmetic is an abstraction as most school children learn by counting
>> pennies and dimes.  This prepares us for 90% or more of the problems we
>
>Apparently you haven't heard of the research work of Dr. Maria Dominguez of
>Venezuela.  She felt that mathematics was taught all wrong starting with
>concrete seeming pieces and eventually learning enough abstractions at the
>college level (if you stuck it out that far) to understand the whys and hows
>of math.  She taught what was essentially an Abstract Algebra course to
>preschool children and then built up through a ten year study.  She ended
>up with kids at the eighth grade level that were handling doctoral level
>mathematics...not just a few of the bright kids, but almost all of the kids.
>It seemed that understanding a conceptual framework to fit math in instead
>of memorizing rules made it easier for them...go figure:)
>

Though I am not familiar with this research, your brief description sounds like
support for my assertions.  We do not teach children number theory; we teach
them to count.  And they do not count numbers; they count apples and oranges
and fish.  "One fish, Two fish; Red fish, Blue fish".  When we teach math we
step back from the details (concrete) to a higher level, a set of apples in
front of the student.  Memorizing rules is not an abstraction; that is the
concrete.  Applying those rules to everyday situations, without direct
reference to those rules,
is the abstraction.


>> you encounter. (I do not consider assembly as a language. Despite its title
>> of assembly *language* it is a code not far enough removed (i.e. abstracted)
>> from the underlying hardware).

>
>That's a bit arbitrary;)  Most machine code is abstracted from the hardware
>at least a level or two through microcoding.  

You support my point.  We don't agree on how much abstraction is desirable, but
machine code is abstracted from hardware through microcode; to which I add that
assembly code is abstracted from machine code, and higher level languages are
abstracted from assembly code.  This higher level of abstraction is, I believe,
the best place to start when learning programming.


>> I am not saying that the basic principles are not important.  If one is going
>> to make a career of programming these principles are crucial.  But I don't
>> believe one can recognize the underlying principles without the *shroud* of
>> abstractions to frame them.
>
>Eh?  I thought the basic principles were the abstractions and their expression
>in a language was the concrete?  I'm completely unclear what you mean here.

No, basic principles of hardware, microcode, machine code (assembly) are the
concrete that I was referring to.  In the real world it is electrons moving
through circuits that is taking place at the most basic level.  It is from here
that we must abstract to higher levels to solve real world problems.

In looking at the problem space, yes the language that we express the problem
with is less abstract, more concrete.  I was responding to other posters who 
claimed that a fundamental understanding of the hardware, microcode, assembly
language was necessary for a *first* understanding of the problem.  I was coming
from the other direction: abstracting up from the hardware to the problem space,
not, as I think you are doing, moving from an abstract problem to a more
concrete solution.  I feel we are both correct, just arguing different points.

>> 
>> problem: Add two numbers
>> Q: How?
>> A: Put them in registers A and R5 then do ADD A,R5
>> Q: What is a register? 
>> A: A place to hold data. 
>> Q: What is data?
>> A: A collection of 8 bit bytes.
>> Q: What is a bit/Byte?
>> A: ...
>> 
>> OR
>> 
>> problem: Add two numbers
>> Q: How?
>> A: C := A + B
>
>You forgot to ask all the questions for this one.  If you did it would go much
>farther and be a lot more complicated.  General numeric expressions are much
>more complicated than hardware level expressions of them.
>Q: What's a number
>Q: What's add
>Q: What's =

These are valid questions.  I left them out because they apply to both situations.
My point was that if I need to add the sales tax on a purchase total, I really
don't care what register the result is stored in or how many bits it takes.  I
want to abstract away from those details.  At some point in my career I will need
to know more detail about data types, but for the beginning student that level
of detail is unnecessary.

From the book "Programming in Ada" by J. G. P. Barnes, pg 3:
"The first advance occured in the early 1950s with high level languages such
as FORTRAN and Autocode which introduced 'expression abstraction'.  It 
thus became possible to write statements such as

    X=A + B(I)

so that the use of machine registers to evaluate the expression was completely
hidden from the programmer."

>
>> So first teach them to express these problems with a computer language.  Ada,
>> with it's strong typing, maps very well to real world objects.  Strong typing
>> is about abstraction and that enables the hiding of irrelevant details.
>
>This just isn't true, strong typing has nothing to do with abstraction.

We are obviously not in agreement about the definition of abstraction, or
perhaps we are applying it differently.

"The brightest highlights of Ada95 are its inherent reliability and its
ability to provide ABSTRACTION through the package and PRIVATE TYPE"
Ada 95 Rationale, Introduction, pg II-1, first paragraph, first sentence.
(Emphasis is mine)

"The emphasis on high performance in Ada applications, and the requirement to
support interfacing to special hardware devices, mean that Ada programmers
must be able to engineer the low-level mapping of algorithms and data 
structures onto physical hardware.  On the other hand, to build large systems,
programmers must operate at a high level of ABSTRACTION and compose systems
from understandable building blocks.  The Ada TYPE system and facilities for
separate compilation are ideally suited to reconciling these seemingly conflicting
requirements."
Ada 95 Rationale, Overview of the Ada Language, para. III.1.5

"The Ada language goes further, providing for the definition of TYPEs
that can only be manipulated according to the type's abstract operations.
Thus the Ada language provides direct support for the idea of an abstract
data type."
Norman H. Cohen, Ada as a Second Language, pg 64.

Patrick, typing has everything to do with abstraction.
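
For readers on the C side of the crosspost, the nearest C analog to the
private type quoted above is an opaque struct whose representation is hidden
behind declared operations.  A minimal sketch (the counter type and all of
its names are invented for illustration, not taken from any of the texts
quoted):

    #include <stdlib.h>

    /* Interface: clients see only an incomplete type and its operations,
       much as clients of an Ada package see only the private type's name. */
    typedef struct counter counter;
    counter *counter_new(void);
    void     counter_bump(counter *c);
    int      counter_value(const counter *c);

    /* Implementation: the representation is visible only here. */
    struct counter { int value; };

    counter *counter_new(void)               { return calloc(1, sizeof(counter)); }
    void     counter_bump(counter *c)        { c->value++; }
    int      counter_value(const counter *c) { return c->value; }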

>> This is so important when we are trying to learn something.  (That something
>> we are trying to learn is how to solve real world problems with a computer, 
>> not how a computer works).
>
>No, learn both, learn everything, don't be prejudiced against some types of
>learning.  You don't know what you're missing.  Understanding things from 
>both the top down and the bottom up adds a richness to your ability to 
>deal with abstract concepts in your head, and figure out some concrete (i.e.
>programatic) expression of an abstract solution.
>
I agree with you.
I said we must learn both.  My point is that we can't learn both at the same
time.  I believe it is best to start with the higher level to get a good feel
for what the problems are and how to solve then, then work our way down to more
basic levels of the *machine*  to see if there might be better ways to solve
specific problems on specific machines.

With all due respect to Dr. Dominguez, I don't believe you can teach PhD-level
mathematics without first teaching students to count.  From my point of view,
counting, while a basic math skill, is an abstraction away from the details that
PhD-level students study.  Perhaps counting, because it is so basic, is a poor
example.  But I hope you see my point.  In the realm of mathematics, counting
abstracts away number bases, number theory, addition, etc.  When the child has
a firm grasp of this, then we can show her more of the details: that addition
is counting by ones, that zero is just a placeholder, that multiplication is
addition with fixed steps (i.e. 2*2 is 2+2, 2*3 is 2+2+2, 2*4 is 2+2+2+2 ...)

We are not in agreement as to where the abstraction is taking place and 
where assumptions about a solution are being made.
 
>> No. Learn Ada
>
>No, it just doesn't really matter compared to the real issue...just keep learning:)
>
>Most people aren't going to learn a number of languages though, and the original
>question was specifically about what would make him marketable.  Ada won't cut
>it since it's not used outside of the defense industry and the pay scale in
>defense is low and the work environment is fraught with risk.
>
You are correct in recommending C for the most job choices.  You are wrong in
assuming Ada is a niche language with no future.  More than half the projects
being written in Ada today are outside the defense industry.  As a result
of many of its side benefits, such as readability and maintainability, as well
as the fact that it is the only recognized, standardized object-oriented language,
Ada is rapidly becoming a language of choice for large projects.





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-01  0:00   ` Tim Behrendsen
  1996-08-01  0:00     ` Stephen M O'Shaughnessy
@ 1996-08-05  0:00     ` Patrick Horgan
  1996-08-06  0:00       ` Szu-Wen Huang
  1996-08-06  0:00       ` Dan Pop
  1 sibling, 2 replies; 688+ messages in thread
From: Patrick Horgan @ 1996-08-05  0:00 UTC (permalink / raw)



In article <01bb7fcc$c5a98de0$87ee6fce@timpent.airshields.com>, "Tim Behrendsen" <tim@airshields.com> writes:
> Except, IMO assembly should be used *exclusively* for the first two
> years of a CS degree.  The first two years is usually all algorithmic
> analysis, anyway.  There's nothing you can't learn about algorithms
> that you can't learn and learn it better doing it in assembly.

Good point Tim!  It's sure a lot easier counting cycles in assembler.
Unfortunately, a lot of schools aren't teaching algorithmic analysis
anymore.

-- 

   Patrick J. Horgan    patrick@broadvision.com   Have horse will ride.
   Opinions mine, not my employer's except by most bizarre coincidence.






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-05  0:00                                   ` Chris Sonnack
  1996-08-05  0:00                                     ` Peter Seebach
@ 1996-08-05  0:00                                     ` Tim Hollebeek
  1996-08-10  0:00                                       ` Mike Rubenstein
  1 sibling, 1 reply; 688+ messages in thread
From: Tim Hollebeek @ 1996-08-05  0:00 UTC (permalink / raw)



Chris Sonnack (cjsonnack@mmm.com) wrote:
: Peter Seebach (seebs@solutions.solon.com) wrote:

: >>> A lot of programmers used to habitually use a null pointer as a shortcut
: >>> for "" for string functions; you'd see
: >>> 	if (strcmp(s, 0))
: >>> used like
: >>> 	if (strcmp(s, ""))
: >>> because it worked on some machines.
: >>
: >> This completely makes the point.  Someone who knows assembly language
: >> would *never* make that mistake, because it would be so obviously
: >> wrong.
: >
: > An interesting theory.

: Not a theory. I've seen it personally.

: Because I know what "" really is under the hood, I know it has nothing in
: common with a NULL pointer. And I've seen a co-worker, a fellow professional
: who's been doing this for a number of years, make the above mistake. Not
: just once, but several times.

However I'd submit that knowing the difference between "" and NULL is
essential to good _C_ programming, and has no relation to whether they
know assembly or not.  It's a question of the distinction between zero
and pointer-to-zero, which one only needs the 'pointer' concept to
understand; one does not need to know the underlying machine language
constructs.  IMO if someone confuses X and pointer-to-X on a regular
basis, they don't know C, so you should teach them C instead of trying
to get them to misunderstand a new language.

: > comp.sys.amiga.programmer, the assembly people would always talk about
: > how the first instruction in their programs was always
: > 	movel #0,0L
: >
: > A high-level programmer would never do that, because someone who has
: > learned only the abstract semantics would not think of a null pointer
: > as the address of any real memory.  It isn't; it's *guaranteed* that
: > no object has a null pointer as its address.

: Yep. True. (Although, honestly, 90+% of all implementations define NULL
: along the lines of: (void*)0) 

: But that doesn't really change the point that's being made:

: >> You cannot program in assembly and not understand a null
: >> pointer v.s. a pointer to a zeroed memory location.

: This is also true. The point is "" is a real object in memory, and any
: assembly programmer would know that. What the NULL pointer is doesn't
: matter here; the point is that "" ain't the NULL address.

Any C programmer should know a string literal is a *POINTER TO* a real
object in memory.  I've seen tons of bad C code based on not
understanding this.  The famous 'if (x == "foo")' is just one example.
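
That example deserves spelling out, since it bites so many people.  A minimal
sketch (buf and its contents are invented):

    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        char buf[4] = "foo";

        if (buf == "foo")              /* compares two pointers; compilers often
                                          warn, and it is almost never the intent */
            printf("same object\n");

        if (strcmp(buf, "foo") == 0)   /* compares contents: the usual intent */
            printf("same contents\n");
        return 0;
    }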

Now for my biggest objection to this bogus assembly language argument:
a C pointer need not be an address at all.  Yet you constantly refer
to pointers as 'addresses'.  Someone has missed a useful abstraction,
methinks.

---------------------------------------------------------------------------
Tim Hollebeek         | Disclaimer :=> Everything above is a true statement,
Electron Psychologist |                for sufficiently false values of true.
Princeton University  | email: tim@wfn-shop.princeton.edu
----------------------| http://wfn-shop.princeton.edu/~tim (NEW! IMPROVED!)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-02  0:00                       ` Tim Behrendsen
@ 1996-08-05  0:00                         ` Henrik Wetterstrom
  1996-08-05  0:00                         ` Fergus Henderson
  1996-08-05  0:00                         ` Chris Sonnack
  2 siblings, 0 replies; 688+ messages in thread
From: Henrik Wetterstrom @ 1996-08-05  0:00 UTC (permalink / raw)



In article <01bb8027$de0e9c80$96ee6fcf@timhome2>, "Tim Behrendsen" <tim@airshields.com> writes:
>
>Let's try a thought experiment.  We take two students; Jane is taught
>assembly from day 1 for two years.  John is taught C for two years.
>Both are exposed to identical curriculums of algorithmic analysis,
>data structures, etc.
>
>Two years later, they switch roles.  Who will learn the other's skills
>the easiest?  I say C will be the most obvious thing in the world to
>Jane, and within a few weeks she will out-code John by a longshot.
>She will instantly see what subroutines, automatic variables,
>static variables, pointers, argument passing, call-by-value v.s.
>call-by-reference, arrays, you name it, are all about.  She will
>already know the concepts; it's just a matter of learning the syntax.
>And if she's ever confused, she can always look at the assembly
>output.
>
>John will be hopelessly confused at first, because he will have
>absolutely no grounding in what the C syntax really means.  He will
>have to start completely from scratch.  His C syntactical experience
>will be almost useless in learning assembly, because the mechanisms
>behind the "magic" have been hidden to him.

I consider myself an "instance" of Jane and agree completely.
I learned assembler years before any high-level language.
OK, when programming in assembler it is hard to ignore C, since
many system resources are documented as C structures, so you
usually get small parts of C for free when learning assembler.

However, the problem comes when you go further. Assume both
John and Jane learned both assembler and C. It is time to
learn C++ (or any other object oriented language).
The higher level of abstraction in OO programming is a bit harder
for a low-level programmer to get used to. Virtual functions,
inheritance, garbage collection etc. will take time to learn, but
I believe John would learn it faster than Jane. At least that's
my own experience.

I guess it could be better to learn C++ before C. C programmers
who move to C++ tend to still program in C, but with a few C++ features.
Once you are unsure about the C++, you fall back to C and eventually
end up with one class and 30 functions.

Still a good programmer should learn assembler, but when to
do it? I don't have a good answer...

There's no perfect solution to teaching computer programming.

/Henrik






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-24  0:00 ` Darin Johnson
  1996-07-25  0:00   ` Andy Askey
  1996-08-02  0:00   ` Patrick Horgan
@ 1996-08-05  0:00   ` Sherwin Anthony Sequeira
  2 siblings, 0 replies; 688+ messages in thread
From: Sherwin Anthony Sequeira @ 1996-08-05  0:00 UTC (permalink / raw)



In article <4ttmpi$l3k@ns.broadvision.com>,
	patrick@broadvision.com (Patrick Horgan) writes:
> In article <qqspaho8h3.fsf@tartarus.ucsd.edu>, djohnson@tartarus.ucsd.edu (Darin Johnson) writes:
> 
[lots snipped]
> And always, always, always keep learning, be one of the people that learns 
> new stuff first.  Be a leader.  But always learn in depth.  It doesn't really
> get added to your repertoire until it becomes common sense.

	Bingo! 
> -- 
> 
>    Patrick J. Horgan    patrick@broadvision.com   Have horse will ride.
>    Opinions mine, not my employer's except by most bizarre coincidence.
> 

-- 
Tony Sequeira




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-04  0:00                             ` Peter Seebach
  1996-08-04  0:00                               ` Jerry van Dijk
@ 1996-08-05  0:00                               ` Tim Behrendsen
  1996-08-04  0:00                                 ` Peter Seebach
  1 sibling, 1 reply; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-05  0:00 UTC (permalink / raw)



Peter Seebach <seebs@solutions.solon.com> wrote in article
<4u1g29$hc0@solutions.solon.com>...
> In article <320433CE.6D5A@online.no>, Alf P. Steinbach <alfps@online.no>
wrote:
> 
> A lot of programmers used to habitually use a null pointer as a shortcut
> for "" for string functions; you'd see
> 	if (strcmp(s, 0))
> used like
> 	if (strcmp(s, ""))
> because it worked on some machines.

This completely makes the point.  Someone who knows assembly language
would *never* make that mistake, because it would be so obviously
wrong.  You cannot program in assembly and not understand a null
pointer v.s. a pointer to a zeroed memory location.

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-02  0:00                   ` Tim Behrendsen
  1996-08-05  0:00                     ` Mark McKinney
@ 1996-08-05  0:00                     ` Mark McKinney
  1996-08-05  0:00                     ` Mark McKinney
  2 siblings, 0 replies; 688+ messages in thread
From: Mark McKinney @ 1996-08-05  0:00 UTC (permalink / raw)



Tim Behrendsen wrote
>
>Come to think of it, that's an interesing analogy. Maybe we should start
>math students off with the "abstraction" of calculus, and fill in the
>algebra/arithmetic details later!
>

The idea of 1 is an abstract idea. 1+1=2 is even more abstract. Yet we 
believe, and yes, claim to know these to be true. Numbers, and all math for 
that matter, are based on abstract concepts. We apply these concepts to 
the real world and find that they work. Calculus is itself an abstraction, 
as are algebra/arithmetic etc. The only way to learn these disciplines is 
through abstraction, because they do not exist in the physical world. What 
does 1 sound like, taste like, feel like, look like... (not the symbol or 
name, the actual number)? No, it is not a bit pattern stored in a register. 
That is a physical implementation of the abstraction that is the number.

  I'll bet you think far more abstractly than you're willing to admit.

  Some food for thought.

                                Mark McKinney



 








^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-04  0:00                                 ` Peter Seebach
@ 1996-08-05  0:00                                   ` Chris Sonnack
  1996-08-05  0:00                                     ` Peter Seebach
  1996-08-05  0:00                                     ` Tim Hollebeek
  1996-08-06  0:00                                   ` Tim Behrendsen
  1 sibling, 2 replies; 688+ messages in thread
From: Chris Sonnack @ 1996-08-05  0:00 UTC (permalink / raw)



Peter Seebach (seebs@solutions.solon.com) wrote:

>>> A lot of programmers used to habitually use a null pointer as a shortcut
>>> for "" for string functions; you'd see
>>> 	if (strcmp(s, 0))
>>> used like
>>> 	if (strcmp(s, ""))
>>> because it worked on some machines.
>>
>> This completely makes the point.  Someone who knows assembly language
>> would *never* make that mistake, because it would be so obviously
>> wrong.
>
> An interesting theory.

Not a theory. I've seen it personally.

Because I know what "" really is under the hood, I know it has nothing in
common with a NULL pointer. And I've seen a co-worker, a fellow professional
who's been doing this for a number of years, make the above mistake. Not
just once, but several times.

> comp.sys.amiga.programmer, the assembly people would always talk about
> how the first instruction in their programs was always
> 	movel #0,0L
>
> A high-level programmer would never do that, because someone who has
> learned only the abstract semantics would not think of a null pointer
> as the address of any real memory.  It isn't; it's *guaranteed* that
> no object has a null pointer as its address.

Yep. True. (Although, honestly, 90+% of all implementations define NULL
along the lines of: (void*)0) 

But that doesn't really change the point that's being made:

>> You cannot program in assembly and not understand a null
>> pointer v.s. a pointer to a zeroed memory location.

This is also true. The point is "" is a real object in memory, and any
assembly programmer would know that. What the NULL pointer is doesn't
matter here; the point is that "" ain't the NULL address.

--
Chris Sonnack  <cjsonnack@mmm.com>                  http://eishcq.mmm.com
Engineering Information Services/Information Technology/3M, St.Paul, Minn
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
How's my programming?  Call 1-800-DEV-NULL

Opinions expressed herein are my own and may not represent those of my employer.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-02  0:00                       ` Tim Behrendsen
  1996-08-05  0:00                         ` Henrik Wetterstrom
  1996-08-05  0:00                         ` Fergus Henderson
@ 1996-08-05  0:00                         ` Chris Sonnack
  1996-08-06  0:00                           ` Stephen M O'Shaughnessy
  2 siblings, 1 reply; 688+ messages in thread
From: Chris Sonnack @ 1996-08-05  0:00 UTC (permalink / raw)



Tim Behrendsen (tim@airshields.com) wrote:

> Let's try a thought experiment.  We take two students; Jane is taught
> assembly from day 1 for two years.  John is taught C for two years.
> Both are exposed to identical curriculums of algorithmic analysis,
> data structures, etc.
>
> Two years later, they switch roles.  Who will learn the other's skills
> the easiest?

Like many I started with BASIC. But then, starting with Knuth's MIX, I
spent years in various assemblies (Z80, 8086, etc). When I got to C, I
had no problems at all with pointers. In fact, one of my favorite things
about C is the pointers. And pointers to pointers. And pointers to ....
well, you get the idea.

Now I sometimes teach C, and I've observed time and time again that a
student with assembly background picks up the language //much// faster
than someone with, say, a BASIC (or even Pascal) background. A common
question the latter students ask is, "Why do I want pointers? What good
are they??" Those with assembler background already know the value and
utility of references.
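
The classic answer to "why do I want pointers?" fits in a few lines -- the
usual textbook swap, sketched here for completeness:

    #include <stdio.h>

    /* Without the pointers, swap() would only exchange its own local copies. */
    void swap(int *x, int *y)
    {
        int t = *x;
        *x = *y;
        *y = t;
    }

    int main(void)
    {
        int a = 1, b = 2;
        swap(&a, &b);
        printf("a=%d b=%d\n", a, b);   /* a=2 b=1 */
        return 0;
    }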

Of course, it can be explained. Of course, in time, they'll get it.

But if you've "lived in the metal", I've found you'll understand the higher
level stuff much faster and surer than those who see it as a "black box".

--
Chris Sonnack  <cjsonnack@mmm.com>                  http://eishcq.mmm.com
Engineering Information Services/Information Technology/3M, St.Paul, Minn
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
TODAY'S RULE: No Smoffing or Fnargling!

Opinions expressed herein are my own and may not represent those of my employer.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-02  0:00                     ` Peter Seebach
  1996-08-02  0:00                       ` Gary M. Greenberg
  1996-08-03  0:00                       ` Alf P. Steinbach
@ 1996-08-05  0:00                       ` Chris Sonnack
  1996-08-05  0:00                         ` Peter Seebach
  2 siblings, 1 reply; 688+ messages in thread
From: Chris Sonnack @ 1996-08-05  0:00 UTC (permalink / raw)



Peter Seebach (seebs@solutions.solon.com) wrote:

>> Any professional developer must understand several assembly languages,
>> and how they work in general.

(That's, I think, an overstatement unless you're creating device drivers
or O/S for a living. But....)

> Understanding an assembler buys you nothing;
                                      ^^^^^^^
That's taking it too far the other way.

Understanding assembler //helps// you learn C faster. I've seen it with
my own eyes and experienced it personally (see previous post). I wouldn't
care to code assembly now; I don't think in terms of assembly when I write
C; and I sure as hell don't use the asm() feature of C. But knowing what
it is that I'm really turning out has been helpful.

And I've been able to troubleshoot some really bizarre problems FASTER by
either looking at the ASM output (via the -S switch most compilers support)
or by tracing code in a debugger at the ASM level. Not that I couldn't have
fixed the problem at a high level; I just did it //faster// at a low one.

>> The language you use also determines the amount of detail
>> you need to know about the underlying machine abstraction.
>
> Quite true.
>
>> When you're using C or C++, you really need to know how the
>> underlying assembly language works.
>
> Quite untrue.

Agree on both points. But, again, you're a more effective tool if you
DO know what's going on "under the hood".

> The newbie uses a machine, and to the newbie, that is all machines; the
> newbie's code runs only on that machine.
>
> The experienced programmer uses several machines, and has learned how each
> behaves.  The experienced programmer's code runs on several machines.
>
> The true programmer uses no machines, and uses the specifications granted
> by the language.  The true programmer's code runs on any machine.

Absolutely. Programming is about solving problems. A true programmer knows
how to analyze a problem and synthesize a solution. The language doesn't
matter; not really. The language is just the local way of expressing the
solution.

HOWEVER: it seems obvious to me that the more you know about your craft,
the better you will be at deriving the most effective solution. And the
better you'll be at dealing with problems along the path to that solution.

--
Chris Sonnack  <cjsonnack@mmm.com>                  http://eishcq.mmm.com
Engineering Information Services/Information Technology/3M, St.Paul, Minn
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Chocolate will never replace sex....unless it's very good chocolate.

Opinions expressed herein are my own and may not represent those of my employer.
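
A side note on the -S switch mentioned above (added for illustration, not
part of the original post): most Unix-style C compilers, gcc and cc among
them, will emit the generated assembly instead of an object file.

    /* add.c -- a trivial function, useful for peeking at compiler output */
    int add(int a, int b)
    {
        return a + b;
    }

    /*
     * Typical usage (assuming a Unix-style compiler driver):
     *   cc -S add.c        writes the assembly listing to add.s
     *   cc -S -O2 add.c    shows what the optimizer actually emits
     */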




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-02  0:00                       ` Tim Behrendsen
  1996-08-05  0:00                         ` Henrik Wetterstrom
@ 1996-08-05  0:00                         ` Fergus Henderson
  1996-08-06  0:00                           ` Tim Behrendsen
  1996-08-05  0:00                         ` Chris Sonnack
  2 siblings, 1 reply; 688+ messages in thread
From: Fergus Henderson @ 1996-08-05  0:00 UTC (permalink / raw)



"Tim Behrendsen" <tim@airshields.com> writes:

>Let's try a thought experiment.  We take two students; Jane is taught
>assembly from day 1 for two years.  John is taught C for two years.
>Both are exposed to identical curriculums of algorithmic analysis,
>data structures, etc.

That's not a realistic thought experiment.  Teaching abstraction,
data structures, etc. is easier in a higher level language,
and so it is unlikely that Jane will have managed to complete
the same curriculum that John can, given the handicap of using
a low-level language.

I've tutored classes using C and Miranda, among other languages.
Much of the time tutoring C was spent explaining how to avoid common
traps and pitfalls in C that simply don't occur in Miranda.
In Miranda, the code for a simple quicksort algorithm fits in
about two or three lines.  This makes it much easier for students
to grasp the essential ideas of such algorithms.
I haven't tutored any classes using assembler, but I imagine that
students would get even more distracted by low-level issues (which
are not relevant to learning the algorithms, abstraction techniques,
etc. that the tutorial is trying to teach) than they do in C.

>Jane [...] will
>already know the concepts; it's just a matter of learning the syntax.

It's generally much easier to learn a concept in the first place
if you have some syntax to hang it on.  Assembler doesn't help much
in that respect.

--
Fergus Henderson <fjh@cs.mu.oz.au>   |  "I have always known that the pursuit
WWW: <http://www.cs.mu.oz.au/~fjh>   |  of excellence is a lethal habit"
PGP: finger fjh@128.250.37.3         |     -- the last words of T. S. Garp.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
       [not found]     ` <4u76ej$7s9@newsbf02.news.aol.com>
@ 1996-08-06  0:00       ` Ralph Silverman
  1996-08-12  0:00         ` Patrick Horgan
  0 siblings, 1 reply; 688+ messages in thread
From: Ralph Silverman @ 1996-08-06  0:00 UTC (permalink / raw)



StHeller (stheller@aol.com) wrote:
: In article <4ttmpi$l3k@ns.broadvision.com>, patrick@broadvision.com
: (Patrick Horgan) writes:

: >If you show up on a job interview I should be able to say, "write me a
: >quick binary sort algorithm" and you have no problem doing it because you
: >know how it works...you don't have to memorise it, you create it from
: >well understood basic principles as needed.
:   I assume you mean a "binary search". If you want a fast sort that's easy
: to program, then distribution counting fills the bill quite nicely; I've
: used it as a programming test with great success.

: Steve Heller
: http://ourworld.compuserve.com/homepages/steve_heller/

--
***********begin r.s. response***************

	re.
	"...binary sort algorithm..."

	i remember such from the
		military programming of the 1980's
	...

	has this reached the textbooks now?

	also,
	seems pretty hard for a
		technical interview question
	!!!

***********end r.s. response*****************
Ralph Silverman
z007400b@bcfreenet.seflin.lib.fl.us
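
For readers unfamiliar with the term: "distribution counting" is what is
usually called counting sort.  A minimal C sketch follows (illustrative
only; the fixed key range and buffer size are assumptions of the example,
not anything from Heller's post).

    #include <string.h>

    /* Sort bytes whose keys lie in 0..255; O(n + k) time, k = 256. */
    void dist_count_sort(unsigned char *a, int n)
    {
        int count[256] = {0};
        unsigned char out[1024];          /* assumes n <= 1024 here */
        int i, total = 0;

        for (i = 0; i < n; i++)           /* 1. count each key      */
            count[a[i]]++;
        for (i = 0; i < 256; i++) {       /* 2. running totals give */
            int c = count[i];             /*    each key's start    */
            count[i] = total;
            total += c;
        }
        for (i = 0; i < n; i++)           /* 3. distribute in order */
            out[count[a[i]]++] = a[i];
        memcpy(a, out, (size_t)n);
    }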





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-01  0:00 ` Stefan 'Stetson' Skoglund
  1996-08-05  0:00   ` Stephen M O'Shaughnessy
@ 1996-08-06  0:00   ` Patrick Horgan
  1 sibling, 0 replies; 688+ messages in thread
From: Patrick Horgan @ 1996-08-06  0:00 UTC (permalink / raw)



In article <y9yd91bw9bx.fsf@august.ida.his.se>, Stefan 'Stetson' Skoglund <sp2stes1@ida.his.se> writes:
> How do we write complex software such as air-traffic control
> systems, engine control systems, big-time banking systems
> and so on without abstracting away from the computer hardware ?
> 
> How do we design such beasts ??

Why?  Is someone asking you to do this?  I think the questions were in the wrong
order anyway...I'd hope that you at least had some idea of the high level systems
before you started writing software.

In this whole discussion I haven't seen anyone state nor imply that learning
high-level abstractions wasn't required.

-- 

   Patrick J. Horgan    patrick@broadvision.com   Have horse will ride.
   Opinions mine, not my employer's except by most bizarre coincidence.






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-06  0:00                           ` Tim Behrendsen
  1996-08-06  0:00                             ` Fergus Henderson
@ 1996-08-06  0:00                             ` Szu-Wen Huang
  1996-08-06  0:00                               ` Tim Behrendsen
  1996-08-06  0:00                             ` Dan Pop
  1996-08-12  0:00                             ` Robert I. Eachus
  3 siblings, 1 reply; 688+ messages in thread
From: Szu-Wen Huang @ 1996-08-06  0:00 UTC (permalink / raw)



Tim Behrendsen (tim@airshields.com) wrote:
: Fergus Henderson <fjh@mundook.cs.mu.OZ.AU> wrote in article
: <4u5a11$siv@mulga.cs.mu.OZ.AU>...

: > That's not a realistic thought experiment.  Teaching abstraction,
: > data structures, etc. is easier in a higher level language,
: > and so it is unlikely that Jane will have managed to complete
: > the same curriculum that John can, given the handicap of using
: > a low-level language.

: This seems to be a common theme; that programming things in
: assembly is necessarily harder than programming in a HLL.

It is, in general.

: Maybe I'm weird, but I just don't see assembly as being harder
: than a HLL, and in fact, it seems to me that it's much easier.
: The number of fundamental things to learn is *very* small, and
: I would think that being able to show a problem in terms of the
: "array of memory" being manipulated would just make it infinitely
: easier than having to wrestle with all the abstract nonsense.

I know what you're trying to say, but you neglect what the subject
is trying to teach.  I don't need my students to learn how to print
out a string by calling interrupt-this or function-that, or that the
instruction *after* a branch is always executed (in some pipelined
RISCs), or that you cannot divide by the ZZX register.  These will all
be useless in a few years, perhaps even a few months.  I need my
students to learn when and why quicksort is more efficient than
bubblesort, and telling them to use assembly sidetracks that effort.

: Now, you wouldn't want to *maintain* large systems of assembly,
: which is why HLLs have taken over the world, but it seems to
: me that assembly per se is just not that hard to use.

It's not hard to use.  It just hampers the *goal* by burdening the
students with more things to learn needlessly.
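
As a concrete point of reference for the quicksort-versus-bubblesort remark
above (an added illustration, not part of the original post): bubble sort in
C is only a few lines, and its O(n^2) comparison count is visible directly
in the nested loops.

    /* Bubble sort: two nested passes over the data, hence O(n^2)
       comparisons; quicksort averages O(n log n). */
    void bubble_sort(int *a, int n)
    {
        int i, j;
        for (i = 0; i < n - 1; i++)
            for (j = 0; j < n - 1 - i; j++)
                if (a[j] > a[j + 1]) {
                    int t = a[j];
                    a[j] = a[j + 1];
                    a[j + 1] = t;
                }
    }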




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-03  0:00       ` Tim Behrendsen
@ 1996-08-06  0:00         ` Stephen M O'Shaughnessy
  0 siblings, 0 replies; 688+ messages in thread
From: Stephen M O'Shaughnessy @ 1996-08-06  0:00 UTC (permalink / raw)



In article <01bb80f6$e02d5cc0$9cee6fce@timhome2>, tim@airshields.com says...
>
>Stephen M O'Shaughnessy <smosha@most.fw.hac.com> wrote in article
><DvH2MI.n40@most.fw.hac.com>...
>> In article <01bb7fcc$c5a98de0$87ee6fce@timpent.airshields.com>, 
>> tim@airshields.com says...
>> Learn sorting algorithms in assembly?  Are you serious!?
>
>Quite serious, but why would you think sorting algorithms are
>especially hard?  That class of algorithm would have identical
>difficulty in C or Asm.
>
>-- Tim Behrendsen (tim@airshields.com)

I don't think sorting is hard.  When I learned sorting, I sorted a deck of
cards.  I would not want to create an abstract data type in assembler.  I
have not done it, but I can't imagine it would be fun. 

The basic algorithm might be the same in any language (the theory), but I doubt
that the actual implementation of a real-world sorting problem would be 
the same.





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-05  0:00     ` Patrick Horgan
@ 1996-08-06  0:00       ` Szu-Wen Huang
  1996-08-06  0:00       ` Dan Pop
  1 sibling, 0 replies; 688+ messages in thread
From: Szu-Wen Huang @ 1996-08-06  0:00 UTC (permalink / raw)



[i feel the need, but i don't know which groups to trim.]

Patrick Horgan (patrick@broadvision.com) wrote:
: In article <01bb7fcc$c5a98de0$87ee6fce@timpent.airshields.com>, "Tim Behrendsen" <tim@airshields.com> writes:
: > Except, IMO assembly should be used *exclusively* for the first two
: > years of a CS degree.  The first two years is usually all algorithmic
: > analysis, anyway.  There's nothing you can't learn about algorithms
: > that you can't learn and learn it better doing it in assembly.

: Good point Tim!  It's sure a lot easier counting cycles in assembler.
: Unfortunately, a lot of schools aren't teaching algorithmic analysis
: anymore.

Counting cycles is not analysis of algorithms.  Knowing that a specific
implementation runs in 145 cycles on a 68HC11 buys you nothing if you
happen to develop for an 80486.  We study (generally) the time and
space complexities of algorithms using platform-independent notations
such as O() in their best, average, and worst cases (and whatever case
turns out interesting).

As for assembly, it has about the merit of BASIC when teaching
programming.  It's usually impossible to abstract the OS, and
unnecessarily forces novice programmers to learn quite a bit about
a specific platform before being able to do anything useful.  Assembly
enforces no structure, so just like BASIC it is easy to end up with students
who write spaghetti code.
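
To make the platform-independence point concrete (an added illustration,
not from the original post): the analysis below is the same whether the
code runs on a 68HC11 or an 80486, because it counts comparisons, not
cycles.

    /* Linear search: O(n) comparisons in the worst case. */
    int linear_search(const int *a, int n, int key)
    {
        int i;
        for (i = 0; i < n; i++)
            if (a[i] == key)
                return i;
        return -1;
    }

    /* Binary search on a sorted array: O(log n) comparisons,
       whatever the instruction set or cycle counts happen to be. */
    int binary_search(const int *a, int n, int key)
    {
        int lo = 0, hi = n - 1;
        while (lo <= hi) {
            int mid = lo + (hi - lo) / 2;
            if (a[mid] == key) return mid;
            if (a[mid] < key)  lo = mid + 1;
            else               hi = mid - 1;
        }
        return -1;
    }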




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-05  0:00   ` Stephen M O'Shaughnessy
@ 1996-08-06  0:00     ` Bob Gilbert
  1996-08-07  0:00       ` Stephen M O'Shaughnessy
  0 siblings, 1 reply; 688+ messages in thread
From: Bob Gilbert @ 1996-08-06  0:00 UTC (permalink / raw)



In article <Dvo8Cz.G39@most.fw.hac.com>, smosha@most.fw.hac.com (Stephen M O'Shaughnessy) writes:
> >
> >No, learn both, learn everything, don't be prejudiced against some types of
> >learning.  You don't know what you're missing.  Understanding things from 
> >both the top down and the bottom up adds a richness to your ability to 
> >deal with abstract concepts in your head, and figure out some concrete (i.e.
> >programmatic) expression of an abstract solution.
> >
> I agree with you.
> I said we must learn both.  My point is that we can't learn both at the same
> time.  I believe it is best to start with the higher level to get a good feel
> for what the problems are and how to solve them, then work our way down to more
> basic levels of the *machine*  to see if there might be better ways to solve
> specific problems on specific machines.

Why can't we learn both at the same time?  

When it came to learning computer science I think I tended to learn
both at the same time.  I took basic EE courses and learned about
operating transistors in saturation, how to build flip-flop circuits,
and how to implement logic using these circuits, and finally how to
design a computer architecture using these circuits (including micro-
code design).  At the same time I was learning PL/I programming, how
to write bubble sorts, learning about the merits of structured 
programming, top-down design methods, various data structures, data 
base design, discrete mathematics, etc.  This was overlapped and 
followed with learning assembly, state machine theory, Turing machines,
general compiler (language) theory, and so forth.  Somewhere in my 
second senior year it all started to come together and make sense, 
not necessarily with a sudden turning on of the light of understanding,
but a gradual turning up of the dimmer switch.

Everybody is different, some learn better one way than the other, but
I am a definite believer in the "burn the candle at both ends" method.
At least that way you cover both bases.

-Bob






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-06  0:00                           ` Tim Behrendsen
  1996-08-06  0:00                             ` Fergus Henderson
  1996-08-06  0:00                             ` Szu-Wen Huang
@ 1996-08-06  0:00                             ` Dan Pop
  1996-08-06  0:00                               ` Tim Behrendsen
       [not found]                               ` <01bb83cc$fb <tequila-0708960947140001@tequila.interlog.com>
  1996-08-12  0:00                             ` Robert I. Eachus
  3 siblings, 2 replies; 688+ messages in thread
From: Dan Pop @ 1996-08-06  0:00 UTC (permalink / raw)



In <01bb8342$88cc6f40$32ee6fcf@timhome2> "Tim Behrendsen" <tim@airshields.com> writes:

>Maybe I'm weird, but I just don't see assembly as being harder
>than a HLL, and in fact, it seems to me that it's much easier.
>The number of fundamental things to learn is *very* small, and
>I would think that being able to show a problem in terms of the
>"array of memory" being manipulated would just make it infinitely
>easier than having to wrestle with all the abstract nonsense.
>
>Now, you wouldn't want to *maintain* large systems of assembly,
>which is why HLLs have taken over the world, but it seems to
>me that assembly per se is just not that hard to use.

OK, try to implement the "hello world" program in assembly, so that it
works on x86, 68k, MIPS, SPARC, Alpha, PPC, PA-RISC machines and, why
not, a 3090, a Cray and a CM-5.  When you're done, you might get a clue 
about why assembly per se is a pain to use.  

Unix is what it is today because Thompson and Ritchie realized the
importance of having a kernel implemented almost exclusively in a HLL
and created a suitable HLL for that purpose.  Other people discovered
that that language was very well suited for other purposes, as well,
and the rest is history...

Dan
--
Dan Pop
CERN, CN Division
Email: Dan.Pop@cern.ch 
Mail:  CERN - PPE, Bat. 31 R-004, CH-1211 Geneve 23, Switzerland
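
For contrast with the assembly exercise Dan Pop proposes (an added
illustration): the portable C version is the familiar few lines, and the
same source compiles unchanged on every machine he lists, which is the
point.

    #include <stdio.h>

    int main(void)
    {
        printf("hello, world\n");   /* one source file for x86, 68k,
                                       MIPS, SPARC, Alpha, and the rest */
        return 0;
    }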




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pasca
  1996-08-02  0:00                   ` David Wheeler
  1996-08-02  0:00                     ` Peter Seebach
@ 1996-08-06  0:00                     ` StHeller
  1996-08-06  0:00                       ` Robert Dewar
  1 sibling, 1 reply; 688+ messages in thread
From: StHeller @ 1996-08-06  0:00 UTC (permalink / raw)



In article <4ttdlg$2pr@news.ida.org>, wheeler@aphrodite.csed.ida.org
(David Wheeler) writes:

>When you're using C or C++, you really need to know how the
>underlying assembly language works. Why? Because pointer arithmetic
>makes little sense if you don't, and tracking down rogue pointers
>overwriting your local stack values and other nasties is impossible if
>you don't understand how underlying machines work. 
  Absolutely correct. This is essential to get across to the student at the
beginning of their study of C++.

Steve Heller
http://ourworld.compuserve.com/homepages/steve_heller/




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-07-28  0:00               ` What's the best language to start with? [was: Re: Should I learn C or Pascal?] Robert Dewar
                                   ` (2 preceding siblings ...)
  1996-08-06  0:00                 ` Robert I. Eachus
@ 1996-08-06  0:00                 ` Conrad Herrmann
  3 siblings, 0 replies; 688+ messages in thread
From: Conrad Herrmann @ 1996-08-06  0:00 UTC (permalink / raw)



Alf P. Steinbach wrote:
> ....
> The argument against disregarding or hiding the concrete (levels n-1,
> n-2 etc. when you are learning about level n) is simply that the
> particular instance of a level n you're learning about was abstracted
> from the concrete levels, and your understanding of level n will be
> lacking if you don't know anything about the concrete, the *why*s of
> the solutions and abstractions at level n.  It's like saying "Do things
> this way, follow the cookbook recipes, but for God's sake don't try to
> understand it:  that will lead you into temptation and severe trouble".
> Likewise, disregarding higher level abstractions is just as silly.
> In many cases, solutions at a level n are guided by higher level
> abstractions  --  the "ideal"  --  and fit as best possible to n-1.
> 
> - Alf

To continue your analogy: Cookbook recipies are often a good way to start 
learning how to cook, since they describe a process that can actually be 
executed by someone who doesn't understand fully what they're doing.  
It would be terribly frustrating to have to start with sowing seeds, 
reaping the harvest, milling the wheat, etc. before you could graduate to 
the breadmaking class.

One of the things I enjoyed about the Joy Of Cooking is that it had a 
healthy dose of beginner's cooking theory in separate sections, but they 
didn't keep me from finishing the chocolate cake (yum).

Still, I'd say that one cannot learn programming without being able to 
break out of the "recipe" mentality and understand what's going on.  I 
started by copying Adventure listings out of Byte magazine (in Basic), 
but didn't stop there.

-- Conrad Herrmann
(Borland C++)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-05  0:00     ` Patrick Horgan
  1996-08-06  0:00       ` Szu-Wen Huang
@ 1996-08-06  0:00       ` Dan Pop
  1996-08-08  0:00         ` steidl
  1 sibling, 1 reply; 688+ messages in thread
From: Dan Pop @ 1996-08-06  0:00 UTC (permalink / raw)



In <4u5rqe$9gv@ns.broadvision.com> patrick@broadvision.com (Patrick Horgan) writes:

>In article <01bb7fcc$c5a98de0$87ee6fce@timpent.airshields.com>, "Tim Behrendsen" <tim@airshields.com> writes:
>> Except, IMO assembly should be used *exclusively* for the first two
>> years of a CS degree.  The first two years is usually all algorithmic
>> analysis, anyway.  There's nothing you can't learn about algorithms
>> that you can't learn and learn it better doing it in assembly.
>
>Good point Tim!  It's sure a lot easier counting cycles in assembler.

Yeah, predicting cache misses and pipe stalls is a piece of cake.
The superpipelined processors make the problem even easier.

>Unfortunately, a lot of schools aren't teaching algorithmic analysis
>anymore.

If there is any connection between algorithm analysis and cycle counting,
I definitely missed it.

Dan
--
Dan Pop
CERN, CN Division
Email: Dan.Pop@cern.ch 
Mail:  CERN - PPE, Bat. 31 R-004, CH-1211 Geneve 23, Switzerland




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-05  0:00                         ` Chris Sonnack
@ 1996-08-06  0:00                           ` Stephen M O'Shaughnessy
  0 siblings, 0 replies; 688+ messages in thread
From: Stephen M O'Shaughnessy @ 1996-08-06  0:00 UTC (permalink / raw)



In article <4u54so$m6i@dawn.mmm.com>, cjsonnack@mmm.com says...
>
>Tim Behrendsen (tim@airshields.com) wrote:
>
>> Let's try a thought experiment.  We take two students; Jane is taught
>> assembly from day 1 for two years.  John is taught C for two years.
>> Both are exposed to identical curriculums of algorithmic analysis,
>> data structures, etc.
>>
>> Two years later, they switch roles.  Who will learn the other's skills
>> the easiest?
>
>Like many I started with BASIC. But then, starting with Knuth's MIX, I
>spent years in various assemblies (Z80, 8086, etc). When I got to C, I
>had no problems at all with pointers. In fact, one of my favorite things
>about C is the pointers. And pointers to pointers. And pointers to ....
>well, you get the idea.
>
>Now I sometimes teach C, and I've observed time and time again that a
>student with assembly background picks up the language //much// faster
>than someone with, say, a BASIC (or even Pascal) background. A common
>question the latter students ask is, "Why do I want pointers? What good
>are they??" Those with assembler background already know the value and
>utility of references.
>
>Of course, it can be explained. Of course, in time, they'll get it.
>
>But if you've "lived in the metal", I've found you'll understand the higher
>level stuff much faster and surer than those who see it as a "black box".
>

This sounds like the experiment that was conducted many years ago when Apple
first introduced a windowing operating system.  The debates raged over which
was better, the mindless point and click of windows or the more cerebral
command line structure of DOS.  Researchers found that high school students 
that used DOS wrote better term papers than those that used Apple/Macs.  The 
results, on further study, revealed that it was NOT the operating system but
the quality of the PERSON.  In general it takes a sharper person to want to
use a command line style.  Today few high school students use command line
operating systems.  And, I venture to say, some are  still sharper than others.

In terms of learning programming, are assembly language/machine-level issues a
necessary prerequisite, or are people who know these things just more *involved*
in their craft?





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-06  0:00                             ` Szu-Wen Huang
@ 1996-08-06  0:00                               ` Tim Behrendsen
  1996-08-06  0:00                                 ` Peter Seebach
                                                   ` (2 more replies)
  0 siblings, 3 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-06  0:00 UTC (permalink / raw)



Szu-Wen Huang <huang@mnsinc.com> wrote in article
<4u7grn$eb0@news1.mnsinc.com>...
> Tim Behrendsen (tim@airshields.com) wrote:
> : Maybe I'm weird, but I just don't see assembly as being harder
> : than a HLL, and in fact, it seems to me that it's much easier.
> : The number of fundamental things to learn is *very* small, and
> : I would think that being able to show a problem in terms of the
> : "array of memory" being manipulated would just make it infinitely
> : easier than having to wrestle with all the abstract nonsense.
> 
> I know what you're trying to say, but you neglect what the subject
> is trying to teach.  I don't need my students to learn how to print
> out a string calling interrupt this function that, or that the
> instruction *after* a branch is always executed (in some pipelined
> RISCs), or you cannot divide by the ZZX register.  These will all
> be useless in a few years, perhaps even a few months.  I need my
> students to learn when and why quicksort is more efficient than
> bubblesort, and telling them to use assembly sidetracks that effort.

Let me bring it back full-circle where we started.  The reason
I mention assembly in the first place was the number of graduates
coming to me for a job that were failing the test I give
*abysmally*, particularly in the areas of creating an algorithm
for a problem they've never seen before, and doing logical
operations.

I chalked this up to the lack of the fundamentals being taught,
and the students having their brains filled up so much with
abstractions that they don't understand how to solve problems
anymore.

This is why I think assembly is the better way to teach
algorithms; it's just you and the algorithm.  It forces them
to really think about the problem, because they don't have any
"training wheels" to protect them from the problem.

Whatever we're doing now is *not working*, let me tell you.

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pasca
  1996-08-06  0:00                     ` What's the best language to start with? [was: Re: Should I learn C or Pasca StHeller
@ 1996-08-06  0:00                       ` Robert Dewar
  0 siblings, 0 replies; 688+ messages in thread
From: Robert Dewar @ 1996-08-06  0:00 UTC (permalink / raw)



Steve said

">When you're using C or C++, you really need to know how the
>underlying assembly language works. Why? Because pointer arithmetic
>makes little sense if you don't, and tracking down rogue pointers
>overwriting your local stack values and other nasties is impossible if
>you don't understand how underlying machines work.
  Absolutely correct. This is essential to get across to the student at the
beginning of their study of C++.

Steve Heller"



That would *really* be an indictment of C++ as a teaching language if it
were true, but it is not true in fact. Many people teach C++ using various
approaches without finding the need to talk about low level models. One
approach is to use a set of reasonabley high level classes, e.g. for
checked arrays, and completely avoid the pointer issue.





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-06  0:00                               ` Tim Behrendsen
@ 1996-08-06  0:00                                 ` Peter Seebach
  1996-08-07  0:00                                   ` Tim Behrendsen
  1996-08-07  0:00                                 ` What's the best language to start with Ian Ward
  1996-08-11  0:00                                 ` What's the best language to start with? [was: Re: Should I learn C or Pascal?] Jerone A. Bowers
  2 siblings, 1 reply; 688+ messages in thread
From: Peter Seebach @ 1996-08-06  0:00 UTC (permalink / raw)



In article <01bb83ad$29c3cfa0$87ee6fce@timpent.airshields.com>,
Tim Behrendsen <tim@airshields.com> wrote:
>Let me bring it back full-circle where we started.  The reason
>I mention assembly in the first place was the number of graduates
>coming to me for a job that were failing the test I give
>*abysmally*, particularly in the areas of creating an algorithm
>for a problem they've never seen before, and doing logical
>operations.

These are fundamentally abstract operations.  Generalization is the core of
algorithm design; you must abstract what the problem is, what a solution looks
like, and what their relationship is.  Logic is entirely an abstract beast.

>I chalked this up to the lack of the fundamentals being taught,
>and the students having their brains filled up so much with
>abstractions that they don't understand how to solve problems
>anymore.

The entire concept of a solution to a problem, rather than the answer to a
question, is a matter of abstraction.

>This is why I think assembly is the better way to teach
>algorithms; it's just you and the algorithm.  It forces them
>to really think about the problem, because they don't have any
>"training wheels" to protect them from the problem.

But the training wheels protect them, not from the problem, but from the
implementation.

Consider string searching.

An *algorithm* for searching strings has nothing to do with whether you can
use a postdecrement or postincrement addressing mode.

An assembly program for doing so may well *have to* consider these issues.

>Whatever we're doing now is *not working*, let me tell you.

What we're doing now is exactly what we do in every other field; we give them
lists of desired answers.  Most math programs would be just as bad if math
were as high paying and easy to get a job in as CS.

I've seen *very* few good teachers in any field, especially ones which require
abstraction.

The problem I've seen is that students think too much about how to implement,
and not enough about the abstract design.  I have this problem, even.  I spend
way too much time, during designs, thinking about what the data type will need
to look like, rather than what it will need to be able to do.  This frequently
turns into extra time about halfway through the implementation, because I
didn't think enough about the *abstract* part of the problem.

I haven't even thought about memory layouts, and I'm *still* too specific and
not abstract enough.

-s
-- 
Peter Seebach - seebs@solon.com - Copyright 1996 - http://www.solon.com/~seebs
Unix/C Wizard - send mail for help, or send money for consulting!
The *other* C FAQ, the hacker FAQ, et al.  See web page above.
Unsolicited email (junk mail and ads) is unwelcome, and will be billed for.
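
To make the string-searching example concrete (an added sketch, not
Seebach's code): the algorithm below never mentions addressing modes;
whether the compiler uses postincrement addressing is an implementation
detail one level down.

    /* Naive substring search: index of the first occurrence of
       needle in haystack, or -1 if absent.  The *algorithm* is the
       two nested loops; addressing modes live below this level. */
    int find(const char *haystack, const char *needle)
    {
        int i, j;
        for (i = 0; haystack[i] != '\0'; i++) {
            for (j = 0; needle[j] != '\0' && haystack[i + j] == needle[j]; j++)
                ;
            if (needle[j] == '\0')
                return i;
        }
        return -1;
    }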




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-06  0:00                             ` Dan Pop
@ 1996-08-06  0:00                               ` Tim Behrendsen
  1996-08-06  0:00                                 ` Peter Seebach
  1996-08-07  0:00                                 ` Mark Eissler
       [not found]                               ` <01bb83cc$fb <tequila-0708960947140001@tequila.interlog.com>
  1 sibling, 2 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-06  0:00 UTC (permalink / raw)



Dan Pop <Dan.Pop@cern.ch> wrote in article
<danpop.839349197@news.cern.ch>...
> In <01bb8342$88cc6f40$32ee6fcf@timhome2> "Tim Behrendsen" <tim@airshields.com> writes:
> 
> >Maybe I'm weird, but I just don't see assembly as being harder
> >than a HLL, and in fact, it seems to me that it's much easier.
> >The number of fundamental things to learn is *very* small, and
> >I would think that being able to show a problem in terms of the
> >"array of memory" being manipulated would just make it infinitely
> >easier than having to wrestle with all the abstract nonsense.
> >
> >Now, you wouldn't want to *maintain* large systems of assembly,
> >which is why HLLs have taken over the world, but it seems to
> >me that assembly per se is just not that hard to use.
> 
> OK, try to implement the "hello world" program in assembly, so that it
> works on x86, 68k, MIPS, SPARC, Alpha, PPC, PA-RISC machines and, why
> not, a 3090, a Cray and a CM-5.  When you're done, you might get a clue 
> about why assembly per se is a pain to use.  
> 
> Unix is what it is today because Thompson and Ritchie realized the
> importance of having a kernel implemented almost exclusively in a HLL
> and created a suitable HLL for that purpose.  Other people discovered
> that that language was very well suited for other purposes, as well,
> and the rest is history...

I think we've strayed away from the central point.  No one is
arguing a return to the days of assembly language.

The question is (or has since become), is it better to start a
student learning the fundamentals, i.e. assembly and the internals
of computers, and then move on to abstracts; or is it better to
start with abstractions such as C or C++ and perhaps never give
the fundamentals, since "compilers are so good nowadays that
it's useless to know assembly, and in fact, can be dangerous."

I say that based on my experiences testing people straight out of
college (BS, MS, or PhD, makes no difference), we are packing
their heads so full of abstractions that they are unable to think
anymore.  I think it's much better for students to learn pure
algorithmic analysis without all the abstraction distractions that
can be better learned later on, and learned easier.

Perhaps a better question is, which is more important: Learning
abstractions or algorithmic analysis?  I say that algorithmic
analysis is 10 to 1 more important than abstractions.

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-06  0:00                               ` Tim Behrendsen
@ 1996-08-06  0:00                                 ` Peter Seebach
  1996-08-07  0:00                                   ` Tim Behrendsen
  1996-08-07  0:00                                 ` Mark Eissler
  1 sibling, 1 reply; 688+ messages in thread
From: Peter Seebach @ 1996-08-06  0:00 UTC (permalink / raw)



In article <01bb83cc$fb35e180$87ee6fce@timpent.airshields.com>,
Tim Behrendsen <tim@airshields.com> wrote:
>The question is (or has since become), is it better to start a
>student learning the fundamentals, i.e. assembly and the internals
>of computers, and then move on to abstracts; or is it better to
>start with abstractions such as C or C++ and perhaps never give
>the fundamentals, since "compilers are so good nowadays that
>it's useless to know assembly, and in fact, can be dangerous."

What makes you think assembly is a fundamental?  Abstraction is *the*
fundamental tool we have.  Assembly is not fundamental by any stretch
of the imagination.

Learning the principles of computing machines would be helpful.  Seeing them
demonstrated on a specific machine might be helpful.  But assembly is not the
only way, or the best, to teach these fundamentals.

>I say that based on my experiences testing people straight out of
>college (BS, MS, or PhD, makes no difference), we are packing
>their heads so full of abstractions that they are unable to think
>anymore.  I think it's much better for students to learn pure
>algorithmic analysis without all the abstraction distractions that
>can be better learned later on, and learned easier.

What are you *talking* about?  Algorithmic analysis is fundamentally an
abstraction.  Rather than looking at the *specific* costs of the algorithm, we
look at the *kinds* of costs.  N^2 vs. log(N) complexity is entirely an
abstraction.

>Perhaps a better question is, which is more important: Learning
>abstractions or algorithmic analysis?  I say that algorithmic
>analysis is 10 to 1 more important than abstractions.

"Learning English is 10 to 1 more important than learning any language."

-s
-- 
Peter Seebach - seebs@solon.com - Copyright 1996 - http://www.solon.com/~seebs
Unix/C Wizard - send mail for help, or send money for consulting!
The *other* C FAQ, the hacker FAQ, et al.  See web page above.
Unsolicited email (junk mail and ads) is unwelcome, and will be billed for.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-06  0:00                           ` Tim Behrendsen
@ 1996-08-06  0:00                             ` Fergus Henderson
  1996-08-07  0:00                               ` Tim Behrendsen
                                                 ` (3 more replies)
  1996-08-06  0:00                             ` Szu-Wen Huang
                                               ` (2 subsequent siblings)
  3 siblings, 4 replies; 688+ messages in thread
From: Fergus Henderson @ 1996-08-06  0:00 UTC (permalink / raw)



"Tim Behrendsen" <tim@airshields.com> writes:

>This seems to be a common theme; that programming things in
>assembly is necessarily harder than programming in a HLL.
>
>Maybe I'm weird, but I just don't see assembly as being harder
>than a HLL, and in fact, it seems to me that it's much easier.
>The number of fundamental things to learn is *very* small, and
>I would think that being able to show a problem in terms of the
>"array of memory" being manipulated would just make it infinitely
>easier than having to wrestle with all the abstract nonsense.
>
>Now, you wouldn't want to *maintain* large systems of assembly,
>which is why HLLs have taken over the world, but it seems to
>me that assembly per se is just not that hard to use.

N lines of assembler is not much more difficult to understand
or to code than N lines of C.  But if you want the students to
understand say quicksort, it's a lot easier showing them 20 lines
of C than 100 lines of assembler.

Also, we want to get our students into the habit of writing
robust and reusable code, and this is very difficult in assembler.
At least C has malloc()/free(); with assembler, you need to write
the memory management from scratch.

--
Fergus Henderson <fjh@cs.mu.oz.au>   |  "I have always known that the pursuit
WWW: <http://www.cs.mu.oz.au/~fjh>   |  of excellence is a lethal habit"
PGP: finger fjh@128.250.37.3         |     -- the last words of T. S. Garp.
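
For scale, the "20 lines of C" estimate is about right; here is a minimal
in-place quicksort sketch (first-element pivot, no claim of production
robustness, and not code from the original posts).

    /* Quicksort a[lo..hi] in place. */
    static void swap(int *a, int i, int j)
    {
        int t = a[i]; a[i] = a[j]; a[j] = t;
    }

    void quicksort(int *a, int lo, int hi)
    {
        int i, last;
        if (lo >= hi)
            return;
        last = lo;
        for (i = lo + 1; i <= hi; i++)   /* partition around a[lo]    */
            if (a[i] < a[lo])
                swap(a, ++last, i);
        swap(a, lo, last);               /* pivot into its final slot */
        quicksort(a, lo, last - 1);
        quicksort(a, last + 1, hi);
    }

Called as quicksort(a, 0, n - 1); note that the heap-management question
never arises here, which is part of the point about malloc()/free() versus
writing an allocator from scratch in assembler.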




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-07-28  0:00               ` What's the best language to start with? [was: Re: Should I learn C or Pascal?] Robert Dewar
  1996-07-29  0:00                 ` Tim Behrendsen
  1996-08-06  0:00                 ` What's the best language to start with? [was: Re: Should I learn C or Pascal?] Alf P. Steinbach
@ 1996-08-06  0:00                 ` Robert I. Eachus
  1996-08-06  0:00                 ` Conrad Herrmann
  3 siblings, 0 replies; 688+ messages in thread
From: Robert I. Eachus @ 1996-08-06  0:00 UTC (permalink / raw)



In article <3202876B.BC7@online.no> "Alf P. Steinbach" <alfps@online.no> writes:

 > ...several assumptions, which can be summed up as a "mathematicians" view of
 > programming:  only abstract semantics matter.  At least when discussing C,
 > the most popular high level assembler in existence, that argument is
 > clearly not valid.  Could be valid in other contexts, though.

   There are still CPU chips being manufactured which implement the
semantics of something close to assembler in a direct manner.  But I
doubt any are used in new designs...  In most modern CPU chips the
relation between the abstract machine API and the execution state of
the chip is tenuous at best.  So the distinction between C and
assembler is a distinction without a difference.  They both provide
incompletely specified virtual machines which programs can be written
against.  (And no, the hardware is NOT a specification for the
assembler.  Things left unspecified by the API are different in
different SPARC chipsets for example.)

--

					Robert I. Eachus

with Standard_Disclaimer;
use  Standard_Disclaimer;
function Message (Text: in Clever_Ideas) return Better_Ideas is...




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-04  0:00                                 ` Peter Seebach
  1996-08-05  0:00                                   ` Chris Sonnack
@ 1996-08-06  0:00                                   ` Tim Behrendsen
  1 sibling, 0 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-06  0:00 UTC (permalink / raw)



Peter Seebach <seebs@solutions.solon.com> wrote in article
<4u3tt1$qub@solutions.solon.com>...
> In article <01bb826a$b222f400$9bee6fce@timhome2>,
> Tim Behrendsen <tim@airshields.com> wrote:
> >Peter Seebach <seebs@solutions.solon.com> wrote in article
> >[snip example of invalid usage of NULL pointer v.s. Null string]
> >This completely makes the point.  Someone who knows assembly language
> >would *never* make that mistake, because it would be so obviously
> >wrong.
> 
> An interesting theory.  Unfortunately, I long since lost track of these
> articles, but in the old "use the OS vs. bang the metal" flame wars in
> comp.sys.amiga.programmer, the assembly people would always talk about
> how the first instruction in their programs was always
> 	movel #0,0L
> (Or however it is that you spell "write 4 bytes of 0 to address 0")...
> 
> A high-level programmer would never do that, because someone who has
> learned only the abstract semantics would not think of a null pointer
> as the address of any real memory.  It isn't; it's *guaranteed* that
> no object has a null pointer as its address.
> 
> >You cannot program in assembly and not understand a null
> >pointer v.s. a pointer to a zeroed memory location.
> 
> And you cannot program in abstract C and think of a null pointer as a pointer
> to *any* memory location.  On many machines that I use, you can't read or
> write to it; it certainly doesn't *act* like it's memory.
> 
> The people who did things like that were always assembler programmers relying
> on their knowledge of what was "really" happening inside the machine.  The C
> abstract semantics clearly forbid it; however, on the majority of common
> architectures, the hardware doesn't forbid it, and the assumption holds.
> (By chance, mostly.)

Both your points come down to the same thing; that if a programmer
makes use of non-portable hardware-specific operations, then the
code is not well-written C.  I completely agree with this.

Where I disagree with you is that you seem to be claiming that
it would be bad to teach people the low level mechanisms, because
then they would be tempted to use them.  This seems very short
sighted to me; certainly with knowledge comes responsibility, and
along with teach the fundamentals there needs to be taught the
portability skills.  The problem is that the schools are so
top heavy with the latter that the fundamentals are being
neglected, and we end with the code-bloated world that we're
living in now.

-- Tim Behrendsen (tim@airshields.com)
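
Since the example above is snipped, here is a separate illustration of the
distinction being argued over (added for clarity, not the original code):
a null pointer, a pointer to an empty string, and a pointer to zeroed
memory are three different things in C.

    char *p1 = (char *)0;        /* null pointer: no object at all;      */
                                 /* dereferencing it is undefined        */
    char *p2 = "";               /* valid pointer to a single '\0' byte  */
    char zeroed[4] = {0};        /* zeroed memory the program owns       */
    char *p3 = zeroed;           /* valid pointer to that memory         */

    /* p2[0] and p3[0] may legally be read (both are '\0'); *p1 may not.
       And a null pointer is guaranteed not to compare equal to the
       address of any object, whatever bit pattern the hardware uses. */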




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-07-28  0:00               ` What's the best language to start with? [was: Re: Should I learn C or Pascal?] Robert Dewar
  1996-07-29  0:00                 ` Tim Behrendsen
@ 1996-08-06  0:00                 ` Alf P. Steinbach
  1996-08-06  0:00                 ` Robert I. Eachus
  1996-08-06  0:00                 ` Conrad Herrmann
  3 siblings, 0 replies; 688+ messages in thread
From: Alf P. Steinbach @ 1996-08-06  0:00 UTC (permalink / raw)



Robert I. Eachus wrote:
>    There are still CPU chips being manufactured which implement the
> semantics of something close to assembler in a direct manner.  But I
> doubt any are used in new designs...  In most modern CPU chips the
> relation between the abstract machine API and the execution state of
> the chip is tenuous at best.  So the distinction between C and
> assembler is a distinction without a difference.  They both provide
> incompletely specified virtual machines which programs can be written
> against.  (And no, the hardware is NOT a specification for the
> assembler.  Things left unspecified by the API are different in
> different SPARC chipsets for example.)

Either you're saying something I don't get, or you're not saying
anything at all.  There are at least three issues on the table:
(1) the header of the thread, now long forgotten;  (2) whether
students / beginning programmers are best off learning from the
bottom up, starting with the v.N. architecture, or from the top
down, starting with some suitably high-level language, or using
a gradient blending of the two (my view);  (3) a clarification of
what's meant by "abstract" and "concrete" in the context of
programming  --  which seems to be the issue of your posting.

Re issue (3), there are levels upon levels upon levels:  my point
here is that when you look from level n, "concrete" refers to
levels n-1, n-2 and so on (toward the physical world), whereas 
"abstract" refers to level n, n+1 and so on, of which there are
an infinity you can make up just as you want, and an infinity of
possible levels n+1 coexisting.  Of course these terms are relative.
Everything is relative to something.  For me, that's a truism.

The argument against disregarding or hiding the concrete (levels n-1,
n-2 etc. when you are learning about level n) is simply that the 
particular instance of a level n you're learning about was abstracted 
from the concrete levels, and your understanding of level n will be 
lacking if you don't know anything about the concrete, the *why*s of
the solutions and abstractions at level n.  It's like saying "Do things 
this way, follow the cookbook recipes, but for God's sake don't try to 
understand it:  that will lead you into temptation and severe trouble".
Likewise, disregarding higher level abstractions is just as silly.
In many cases, solutions at a level n are guided by higher level
abstractions  --  the "ideal"  --  and fit as best possible to n-1.

- Alf




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-05  0:00                         ` Fergus Henderson
@ 1996-08-06  0:00                           ` Tim Behrendsen
  1996-08-06  0:00                             ` Fergus Henderson
                                               ` (3 more replies)
  0 siblings, 4 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-06  0:00 UTC (permalink / raw)



Fergus Henderson <fjh@mundook.cs.mu.OZ.AU> wrote in article
<4u5a11$siv@mulga.cs.mu.OZ.AU>...
> "Tim Behrendsen" <tim@airshields.com> writes:
> 
> >Let's try a thought experiment.  We take two students; Jane is taught
> >assembly from day 1 for two years.  John is taught C for two years.
> >Both are exposed to identical curriculums of algorithmic analysis,
> >data structures, etc.
> 
> That's not a realistic thought experiment.  Teaching abstraction,
> data structures, etc. is easier in a higher level language,
> and so it is unlikely that Jane will have managed to complete
> the same curriculum that John can, given the handicap of using
> a low-level language.

This seems to be a common theme; that programming things in
assembly is necessarily harder than programming in a HLL.

Maybe I'm weird, but I just don't see assembly as being harder
than a HLL, and in fact, it seems to me that it's much easier.
The number of fundamental things to learn is *very* small, and
I would think that being able to show a problem in terms of the
"array of memory" being manipulated would just make it infinitely
easier than having to wrestle with all the abstract nonsense.

Now, you wouldn't want to *maintain* large systems of assembly,
which is why HLLs have taken over the world, but it seems to
me that assembly per se is just not that hard to use.

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-07-26  0:00               ` Randy Kaelber
  1996-07-29  0:00                 ` Ralph Silverman
@ 1996-08-06  0:00                 ` StHeller
  1 sibling, 0 replies; 688+ messages in thread
From: StHeller @ 1996-08-06  0:00 UTC (permalink / raw)



In article <4taik0$1o2q@rose.muohio.edu>, kaelbers@avian.dars.muohio.edu
(Randy Kaelber) writes:

>God knows I'd love to try it! Teach it, take it, I don't care.
  Yes, it has been a lot of fun for both the teacher (me) and my students.  They
also have learned a lot about programming in the process.

Steve Heller
http://ourworld.compuserve.com/homepages/steve_heller/




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-08-01  0:00         ` Ralph Silverman
@ 1996-08-06  0:00           ` ++           robin
  0 siblings, 0 replies; 688+ messages in thread
From: ++           robin @ 1996-08-06  0:00 UTC (permalink / raw)



	z007400b@bcfreenet.seflin.lib.fl.us (Ralph Silverman) writes:

	>++           robin (rav@goanna.cs.rmit.edu.au) wrote:
	>: 	robt2@ix.netcom.com(Rob(t.) Brannan) writes:

	>: 	>As a first language Pascal is the way to go for many reasons.

	>: 	>C is very complicated as a first language (libraries,pointers,numerous
	>: 	>low level routines), not to mention some of the errors that you will
	>: 	>encounter are not exactly well covered in any book.

	>: 	>As a newbie , remember trying to figure out whats wrong when an
	>: 	>included file or library was just left out!

	>: ---Then use PL/I.  That just can't happen!

	>: PL/I is an excellent first language.  And the
	>: output is even *easier* to do than Pascal or C.

	>: BTW, one of the advantages of PL/I for a beginner
	>: is the excellent diagnostic messages not only at run time,
	>: but also at compile time.

	>*************begin r.s. response***************

	>	pl/1  certainly would be good...
	>	but is this available for the pc?

---PL/I is available on the PC in various versions:

1.  from Liant Software (PL/I for Windows)
2.  from IBM (PL/I for Windows 95/NT);
3.  from IBM (PL/I for OS/2).

	>	once found a kind of interpreter/simulator...
	>	but this was quite limited in language features!

	>*************end r.s. response*****************
	>Ralph Silverman
	>z007400b@bcfreenet.seflin.lib.fl.us




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-06  0:00                               ` Tim Behrendsen
  1996-08-06  0:00                                 ` Peter Seebach
@ 1996-08-07  0:00                                 ` Mark Eissler
  1 sibling, 0 replies; 688+ messages in thread
From: Mark Eissler @ 1996-08-07  0:00 UTC (permalink / raw)



In article <01bb83cc$fb35e180$87ee6fce@timpent.airshields.com>, "Tim
Behrendsen" <tim@airshields.com> wrote:

> 
> Perhaps a better question is, which is more important: Learning
> abstractions or algorithmic analysis?  I say that algorithmic
> analysis is 10 to 1 more important than abstractions.
> 

As someone that studied something other than CS in University...

I'd say that the ability to define an efficient algorithm is probably THE
most important thing in design. That's step one. The algorithm can then be
optimized according to CPU specs if one understands ASM. 

So figure it out. Code it. Tweak it. 

Obviously, someone that has no clue regarding things ASM won't be able to
tell you if a certain "for" loop may be better implemented as a "while." 

Or, you could do it the MicroSoft way: just write it and if it's slow
we'll make everyone upgrade their systems.

Although I've never written anything in Assembler, I do have a working
knowledge (waiting for the upgrade ;-} when prices come down) that allows
me to understand what's happening when I see it.

--
Mark Eissler                      | Now that my DNS is working...
tequila@interlog.com              | What will I do next weekend??
http://www.interlog.com/~tequila/ | --Configure SendMail!
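
A small C illustration of the for/while point (added here; not from the
post): the two loops below look equivalent, but the equivalence breaks down
as soon as a continue enters the body, because for's increment still runs.

    int i;

    for (i = 0; i < 10; i++) {   /* the canonical counted loop          */
        /* ... body ... */       /* a 'continue' here still does i++    */
    }

    i = 0;                       /* the "same" loop spelled with while  */
    while (i < 10) {
        /* ... body ... */       /* but a 'continue' here would skip    */
        i++;                     /* the increment, unlike the for loop  */
    }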






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-06  0:00                                 ` Peter Seebach
@ 1996-08-07  0:00                                   ` Tim Behrendsen
  1996-08-07  0:00                                     ` Peter Seebach
                                                       ` (3 more replies)
  0 siblings, 4 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-07  0:00 UTC (permalink / raw)



Peter Seebach <seebs@solutions.solon.com> wrote in article
<4u8lff$3bs@solutions.solon.com>...
> In article <01bb83cc$fb35e180$87ee6fce@timpent.airshields.com>,
> Tim Behrendsen <tim@airshields.com> wrote:
> >The question is (or has since become), is it better to start a
> >student learning the fundamentals, i.e. assembly and the internals
> >of computers, and then move on to abstracts; or is it better to
> >start with abstractions such as C or C++ and perhaps never give
> >the fundamentals, since "compilers are so good nowadays that
> >it's useless to know assembly, and in fact, can be dangerous."
> 
> What makes you think assembly is a fundamental?  Abstraction is *the*
> fundamental tool we have.  Assembly is not fundamental by any stretch
> of the imagination.
> 
> Learning the principles of computing machines would be helpful.  Seeing them
> demonstrated on a specific machine might be helpful.  But assembly is not the
> only way, or the best, to teach these fundamentals.

Not to get into a debate on the meaning of abstraction, but the
point is that there is very little hidden from the student when
they are learning assembly.  This allows them to better concentrate
on the basics of algorithms, because they are not distracted by syntax.

> >I say that based on my experiences testing people straight out of
> >college (BS, MS, or PhD, makes no difference), we are packing
> >their heads so full of abstractions that they are unable to think
> >anymore.  I think it's much better for students to learn pure
> >algorithmic analysis without all the abstraction distractions that
> >can be better learned later on, and learned easier.
> 
> What are you *talking* about?  Algorithmic analysis is fundamentally an
> abstraction.  Rather than looking at the *specific* costs of the algorithm, we
> look at the *kinds* of costs.  N^2 vs. log(N) complexity is entirely an
> abstraction.

Of course, but I'm talking about abstractions of assembly, i.e.,
HLLs.  Remember, C (or any HLL) does not really exist as far as
the computer knows.  Assembly is the direct raw instruction set of
the physical machine.  If the student is learning algorithms in
assembly, they are unquestionably learning the algorithm, and not
just some vague concept wrapped in 10 layers of wool.

The *reality* is, students graduating today are just not getting
it.

> >Perhaps a better question is, which is more important: Learning
> >abstractions or algorithmic analysis?  I say that algorithmic
> >analysis is 10 to 1 more important than abstractions.
> 
> "Learning English is 10 to 1 more important than learning any language."

True, but don't get me started on the English skills of some of
my applicants!

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-06  0:00                             ` Fergus Henderson
@ 1996-08-07  0:00                               ` Tim Behrendsen
  1996-08-08  0:00                                 ` Thomas Hood
  1996-08-17  0:00                                 ` Lawrence Kirby
  1996-08-08  0:00                               ` Stephen M O'Shaughnessy
                                                 ` (2 subsequent siblings)
  3 siblings, 2 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-07  0:00 UTC (permalink / raw)



Fergus Henderson <fjh@mundook.cs.mu.OZ.AU> wrote in article
<4u86lc$2gu@mulga.cs.mu.OZ.AU>...
> "Tim Behrendsen" <tim@airshields.com> writes:
> 
> >This seems to be a common theme; that programming things in
> >assembly is necessarily harder than programming in a HLL.
> >
> >Maybe I'm weird, but I just don't see assembly as being harder
> >than a HLL, and in fact, it seems to me that it's much easier.
> >The number of fundamental things to learn is *very* small, and
> >I would think that being able to show a problem in terms of the
> >"array of memory" being manipulated would just make it infinitely
> >easier than having to wrestle with all the abstract nonsense.
> >
> >Now, you wouldn't want to *maintain* large systems of assembly,
> >which is why HLLs have taken over the world, but it seems to
> >me that assembly per se is just not that hard to use.
> 
> N lines of assembler is not much more difficult to understand
> or to code than N lines of C.  But if you want the students to
> understand say quicksort, it's a lot easier showing them 20 lines
> of C than 100 lines of assembler.

Who's talking about showing them?  I would suggest that if
they wrote a quicksort in assembler, they will have a much
better "feel" for the algorithm, than if they wrote it in C.

> Also, we want to get our students into the habit of writing
> robust and reusable code, and this is very difficult in assembler.
> At least C has malloc()/free(); with assembler, you need to write
> the memory management from scratch.

Well, I don't think the student has to rewrite all the library
routines!  Nothing precludes you from using them from assembler.

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-06  0:00     ` Bob Gilbert
@ 1996-08-07  0:00       ` Stephen M O'Shaughnessy
  1996-08-09  0:00         ` Bob Gilbert
  0 siblings, 1 reply; 688+ messages in thread
From: Stephen M O'Shaughnessy @ 1996-08-07  0:00 UTC (permalink / raw)



In article <4u7fol$26s@zeus.orl.mmc.com>, rgilbert@unconfigured.xvnews.domain says...
>
>Why can't we learn both at the same time?  
>
>When it came to learning computer science I think I tended to learn
>both at the same time.  I took basic EE courses and learned about
>operating transistors in saturation, how to build flip-flop circuits,
>and how to implement logic using these circuits, and finally how to
>design a computer architecture using these circuits (including micro-
>code design).  At the same time I was learning PL/I programming, how
>to write bubble sorts, learning about the merits of structured 
>programming, top-down design methods, various data structures, data 
>base design, discrete mathematics, etc.  This was overlapped and 
>followed with learning assembly, state machine theory, Turing machines,
>general compiler (language) theory, and so forth.  Somewhere in my 
>second senior year it all started to come together and make sense, 
>not necessarily with a sudden turning on of the light of understanding,
>but a gradual turning up of the dimmer switch.
>
That must have been one hell of a lecture.  By *same time* I meant SAME TIME.  Some
of your above mentioned topics must have been learned, and mastered, before the
others.  Care to comment on what you mean by second senior year?





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
       [not found]                               ` <01bb83cc$fb <tequila-0708960947140001@tequila.interlog.com>
@ 1996-08-07  0:00                                 ` Peter Seebach
  0 siblings, 0 replies; 688+ messages in thread
From: Peter Seebach @ 1996-08-07  0:00 UTC (permalink / raw)



In article <tequila-0708960947140001@tequila.interlog.com>,
Mark Eissler <tequila@interlog.com> wrote:
>Obviously, someone that has no clue regarding things ASM won't be able to
>tell you if a certain "for" loop may be better implemented as a "while." 

Well, in C anyway, their semantics are different enough that there exist cases
where one is clearly a better way to express an algorithm.

-s
-- 
Peter Seebach - seebs@solon.com - Copyright 1996 - http://www.solon.com/~seebs
Unix/C Wizard - send mail for help, or send money for consulting!
The *other* C FAQ, the hacker FAQ, et al.  See web page above.
Unsolicited email (junk mail and ads) is unwelcome, and will be billed for.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-07  0:00                                   ` Tim Behrendsen
  1996-08-07  0:00                                     ` Peter Seebach
@ 1996-08-07  0:00                                     ` James A. Squire
  1996-08-08  0:00                                     ` David Weller
  1996-08-09  0:00                                     ` Bob Gilbert
  3 siblings, 0 replies; 688+ messages in thread
From: James A. Squire @ 1996-08-07  0:00 UTC (permalink / raw)



Tim Behrendsen wrote:

> Of course, but I'm talking about abstractions of assembly, i.e.,
> HLLs.  Remember, C (or any HLL) does not really exist as far as
> the computer knows.  Assembly is the direct raw instruction set of
> the physical machine.  If the student is learning algorithms in
> assembly, they are unquestionably learning the algorithm, and not
> just some vague concept wrapped in 10 layers of wool.

Actually, Assembly doesn't exist either, as far as the computer knows.
Machine language is the direct raw instruction set of the physical
machine.

Plan on teaching your students to program in 1's and 0's soon?
-- 
James Squire                             mailto:ja_squire@csehp3.mdc.com
MDA Avionics Tools & Processes
McDonnell Douglas Aerospace              http://www.mdc.com/
Opinions expressed here are my own and NOT my company's
"My God. Whoever's piloting that shuttle's a madman!"
	-- Ivanova (about Londo), "A Voice in the Wilderness II"




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-07  0:00                                   ` Tim Behrendsen
  1996-08-07  0:00                                     ` Dan Pop
@ 1996-08-07  0:00                                     ` Peter Seebach
  1996-08-08  0:00                                       ` Tim Behrendsen
  1996-08-08  0:00                                     ` Teaching sorts [was Re: What's the best language to start with?] Robert I. Eachus
                                                       ` (7 subsequent siblings)
  9 siblings, 1 reply; 688+ messages in thread
From: Peter Seebach @ 1996-08-07  0:00 UTC (permalink / raw)



In article <01bb83f5$923391e0$87ee6fce@timpent.airshields.com>,
Tim Behrendsen <tim@airshields.com> wrote:
>I agree; my point is that I think the student learns more if they
>are thinking purely in terms of fundamental operations (which are
>still abstractions above the raw hardware components), rather
>than layers and layers of syntax that hide the essential essence
>of what the computer is doing.

But your concept of what the "fundamental operations" are is completely tied
up in how the specific hardware you've seen operates.  Algorithms exist in terms
of abstract operations, not moves and branches.

Something like CWEB might be a good language in which to learn algorithms.
Scheme might too.

Let's look at a specific algorithm; the infamous gcd(x, y).

In C, we write

	int
	gcd(int x, int y) {
		if (y == 0)
			return x;
		return gcd(y, (x % y));
	}

or something similar.

What's important here is not any theories people may have about where the x
and y are stored, or how a stack works, but the concept that you can define an
algorithm in terms of a previous case.  Learning that the if() may be
implemented with "a branch" does not help the student understand how the
*algorithm* works, it helps the student understand how the *compiler* works.
These are distinct things.

>If programming is reduced to fundamentals of move, arithmetic,
>test, branch, etc it prevents the student from leaning on the
>abstraction rather than truly understanding the solution to the
>problem.  In other words, if they can express it in the above
>terms, you *know* they understand the solution.

But it also prevents them from learning the abstraction, and truly
understanding the *principle* of the solution.

A student who has memorized the multiplication table for x=1..10, y=1..20,
but doesn't know that 5*12 and 12*5 are the same thing, has learned nothing.

Learning details and implementation does not help except in so far as they are
good examples.

>Perhaps the solution is a middle ground one; rather than "real"
>assembly for students, maybe what they need is a "MIX" kind of
>thing where it is reduced to fundamental elements, an assembly
>abstraction if you will.  That way, "dangerous" concepts such
>as post-increment, etc. can be safely hidden away, but we can
>still allow them to think in pure data manipulation terms.

Ahh, like C.  :)

But even then, students should not be thinking in terms of tables of bytes
which they manipulate, but in terms of objects which they manipulate.

>The problem is that we *can't* think purely abstractly,
>otherwise we end up with slow crap code.

Nonsense.  A well designed and considered abstraction will generally lend
itself to an efficient and elegant implementation.  An ill-considered
abstraction will spend more time on cruft than it will on solving the problem.

I've heard that 50% of the articles Byte gets are marginal optimizations to
bubblesort.  People with a good understanding of what operations they are
using to express the algorithm are finding real improvements, frequently 40%.
Unfortunately, it's still an N^2 algorithm.
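
For reference, a plain textbook bubblesort -- just an illustrative sketch,
not any particular Byte submission -- is only a handful of lines; however
much the inner loop is tuned, the two nested loops still cost O(N^2)
comparisons:

	/* Textbook bubblesort: repeatedly swap adjacent out-of-order
	   elements.  Micro-optimizing the body never changes the O(N^2)
	   behaviour of the nested loops. */
	void bubblesort(int *a, int n)
	{
		int i, j, tmp;

		for (i = 0; i < n - 1; i++)
			for (j = 0; j < n - 1 - i; j++)
				if (a[j] > a[j + 1]) {
					tmp = a[j];
					a[j] = a[j + 1];
					a[j + 1] = tmp;
				}
	}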

The code bloat problem is almost invariably a result of an ill-considered
abstraction.  Someone comes up with a first approximation of a solution
(bubblesort, for the sorting problem).  Then management sees that it appears
to run, and ships it.  Poof!  Time that should have been spent writing it
correctly, taking advantage of the mistakes made in the prototype, turns into
time spent trying desperately to get a fundamentally flawed product to market,
and keep it running.  No one ever gets around to doing a good job of the
initial design.

>It is simply not
>possible to ignore the way code is structured, and completely
>depend on the compiler to save us.  That is just not the
>real world.

No, but I have never seen a good algorithm that didn't lend itself to elegant
and efficient implementation.  If we do decent designs first, and worry about
implementation second, we will find the implementation to be pleasant, easy,
and efficient.

>At least not my world, where I have to pack as many users
>as possible onto one CPU.

Compare the *efficiency* for this purpose of Unix, which is designed, and
MS-DOS, which has had millions of dollars thrown at making it as efficient as
possible.  Compare also things like NT and Berkeley Unix.  You *have* to do
your design in theoretical terms before you think about implementation, or you
end up with a system where 32 megs is seen as too small to run multiple users,
and you need a third party add-on to do it anyway.

The Berkeley people have thrown out more code than the system has in it.  The
algorithms are designed based on the *algorithmic* weaknesses of previous
designs.  Hardware efficiency is evaluated only after you've found a good
algorithm.  And they get excellent performance.

-s
-- 
Peter Seebach - seebs@solon.com - Copyright 1996 - http://www.solon.com/~seebs
Unix/C Wizard - send mail for help, or send money for consulting!
The *other* C FAQ, the hacker FAQ, et al.  See web page above.
Unsolicited email (junk mail and ads) is unwelcome, and will be billed for.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-07  0:00                                   ` Tim Behrendsen
@ 1996-08-07  0:00                                     ` Peter Seebach
  1996-08-08  0:00                                       ` Tim Behrendsen
  1996-08-07  0:00                                     ` James A. Squire
                                                       ` (2 subsequent siblings)
  3 siblings, 1 reply; 688+ messages in thread
From: Peter Seebach @ 1996-08-07  0:00 UTC (permalink / raw)



In article <01bb846c$e51df220$87ee6fce@timpent.airshields.com>,
Tim Behrendsen <tim@airshields.com> wrote:
>Peter Seebach <seebs@solutions.solon.com> wrote in article
><4u8lff$3bs@solutions.solon.com>...

>Not to get into a debate on the meaning of abstraction, but the
>point is that there is very little hidden from the student when
>they are learning assembly.  This allows them to better concentrate
>on the basics of algorithms, because they are not distracted by syntax.

Huh?

The basics of algorithms are hidden from them, when they are sitting
around counting bytes and remembering the mnemonics for operations.

Assembly has more syntax per operation done than anything else.

>Of course, but I'm talking about abstractions of assembly, i.e.,
>HLLs.  Remember, C (or any HLL) does not really exist as far as
>the computer knows.  Assembly is the direct raw instruction set of
>the physical machine.  If the student is learning algorithms in
>assembly, they are unquestionably learning the algorithm, and not
>just some vague concept wrapped in 10 layers of wool.

No, they're learning assembly.  Assembly is relatively hard.  No amount
of abstraction in a language will hide the algorithm; the more abstract the
language, the more visible the algorithm, because there's less and less there.

The only real exception is when the language encapsulates an algorithm; using
qsort() does not teach you a sorting algorithm.  But writing a quicksort() in
C will teach you at least as much about quicksort as writing it in assembly.
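
(For concreteness, the sort of quicksort a student might write -- a minimal
illustrative sketch, not the library qsort() -- looks like this:

	/* Quicksort sketch: partition around the last element, then
	   recurse on the two halves.  Sorts a[lo..hi] in place. */
	void quicksort(int *a, int lo, int hi)
	{
		int i, j, pivot, tmp;

		if (lo >= hi)
			return;
		pivot = a[hi];
		i = lo - 1;
		for (j = lo; j < hi; j++)
			if (a[j] < pivot) {
				i++;
				tmp = a[i]; a[i] = a[j]; a[j] = tmp;
			}
		i++;
		tmp = a[i]; a[i] = a[hi]; a[hi] = tmp;
		quicksort(a, lo, i - 1);
		quicksort(a, i + 1, hi);
	}

The algorithm is all there in those twenty-odd lines; an assembly version
adds bulk, not insight.)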

>> >Perhaps a better question is, which is more important: Learning
>> >abstractions or algorithmic analysis?  I say that algorithmic
>> >analysis is 10 to 1 more important than abstractions.

>> "Learning English is 10 to 1 more important than learning any language."

>True, but don't get me started on the English skills of some of
>my applicants!

Huh?  The point of my statement is that English *is* a language.  Algorithmic
analysis cannot be more important than abstractions, because it's a subset of
abstractions.

I have to agree, though, I'm sick to death of native speakers of English who
don't get it.

-s
-- 
Peter Seebach - seebs@solon.com - Copyright 1996 - http://www.solon.com/~seebs
Unix/C Wizard - send mail for help, or send money for consulting!
The *other* C FAQ, the hacker FAQ, et al.  See web page above.
Unsolicited email (junk mail and ads) is unwelcome, and will be billed for.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-05  0:00                                     ` Peter Seebach
@ 1996-08-07  0:00                                       ` Tom Watson
  0 siblings, 0 replies; 688+ messages in thread
From: Tom Watson @ 1996-08-07  0:00 UTC (permalink / raw)



In article <4u5h81$ol@solutions.solon.com>, seebs@solon.com wrote:

> In article <4u5682$qgs@dawn.mmm.com>, Chris Sonnack <cjsonnack@mmm.com> wrote:
> 
> >How's my programming?  Call 1-800-DEV-NULL
> 
> Cute.
> 

Calling from California, it is a disconnected number!!

Just in case anybody's interested...

-- 
Tom Watson
tsw@3do.com         (Home: tsw@johana.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
       [not found] <sperlman-0507961717550001@p121.ezo.net>
                   ` (19 preceding siblings ...)
  1996-08-01  0:00 ` Andy Hardy
@ 1996-08-07  0:00 ` Fergus Henderson
  1996-08-07  0:00   ` Tim Behrendsen
  20 siblings, 1 reply; 688+ messages in thread
From: Fergus Henderson @ 1996-08-07  0:00 UTC (permalink / raw)



"Tim Behrendsen" <tim@airshields.com> writes:

>Fergus Henderson <fjh@mundook.cs.mu.OZ.AU> wrote in article
>> N lines of assembler is not much more difficult to understand
>> or to code than N lines of C.  But if you want the students to
>> understand say quicksort, it's a lot easier showing them 20 lines
>> of C than 100 lines of assembler.
>
>Who's talking about showing them?  I would suggest that if
>they wrote a quicksort in assembler, they will have a much
>better "feel" for the algorithm, than if they wrote it in C.

So which is better use of a student's time, writing 100 lines of
quicksort in assembler, or using C and writing 20 lines of quicksort,
10 lines of insertion sort, 20 lines of heap sort, 20 lines of
merge sort, and 30 lines of glue to test them all out?

Which will give them more practice in choosing meaningful variable
names?  Which will give them understanding of a range of different
sorting algorithms?  Which will give them experience of the idea of
having multiple implementations of the same interface?  Which will
teach them how to write portable code?  Which will give them more
experience with the sort of software engineering problems they are
likely to encounter in the Real World [tm]?

--
Fergus Henderson <fjh@cs.mu.oz.au>   |  "I have always known that the pursuit
WWW: <http://www.cs.mu.oz.au/~fjh>   |  of excellence is a lethal habit"
PGP: finger fjh@128.250.37.3         |     -- the last words of T. S. Garp.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with
  1996-08-06  0:00                               ` Tim Behrendsen
  1996-08-06  0:00                                 ` Peter Seebach
@ 1996-08-07  0:00                                 ` Ian Ward
  1996-08-08  0:00                                   ` Tim Behrendsen
  1996-08-11  0:00                                 ` What's the best language to start with? [was: Re: Should I learn C or Pascal?] Jerone A. Bowers
  2 siblings, 1 reply; 688+ messages in thread
From: Ian Ward @ 1996-08-07  0:00 UTC (permalink / raw)



In article 87ee6fce@timpent.airshields.com, "Tim Behrendsen" <tim@airshields.com> () writes:
>Szu-Wen Huang <huang@mnsinc.com> wrote in article
><4u7grn$eb0@news1.mnsinc.com>...
>> Tim Behrendsen (tim@airshields.com) wrote:
>> : Maybe I'm weird, but I just don't see assembly as being harder
>> : than a HLL, and in fact, it seems to me that it's much easier.
>> : The number of fundamental things to learn is *very* small, and
>> : I would think that being able to show a problem in terms of the
>> : "array of memory" being manipulated would just make it infinitely
>> : easier than having to wrestle with all the abstract nonsense.

Assembler is easy to understand; it has very simple instructions,
and that is what makes it so useless at expressing solutions. One needs
so much of it to do anything. Besides, all the small solutions have
already been done.

Besides, the second-best female software engineer I know, who works
for Oracle (in support ???? duh.), got a first-class honours degree
in computing/combined science without knowing what hex was, and I'd
trust her code with my life (but not Oracle's), simply because her
course did not require it. If she had the brains to deal
with abstract concepts, then she could do the design work, and the
simple algorithms could be left to someone else. 

[That was not what I thought of this at the time, mind; I thought
she was a stupid bitch, but then in that same year I remember
defending Basic _and_ assembler against Pascal, simply because
I had not come across a problem that I could not solve (and
remember the complete solution months afterward) in my head. 
I wish I could go back to then, playing around for four hours a day in 
college, and then drinking ten pints of lager (56p a pint.)
Wall-to-wall vomiting, best days of your life.]

>> 
>> I know what you're trying to say, but you neglect what the subject
>> is trying to teach.  I don't need my students to learn how to print
>> out a string calling interrupt this function that, or that the
>> instruction *after* a branch is always executed (in some pipelined
>> RISCs), or you cannot divide by the ZZX register.  These will all
>> be useless in a few years, perhaps even a few months.  I need my
>> students to learn when and why quicksort is more efficient than
>> bubblesort, and telling them to use assembly sidetracks that effort.
>
>Let me bring it back full-circle where we started.  The reason
>I mention assembly in the first place was the number of graduates
>coming to me for a job that were failing the test I give
>*abysmally*, particularly in the areas of creating an algorithm
>for a problem they've never seen before, and doing logical
>operations.
>

There are two fundamental points here which until recently I had
not sussed, and which seem to be fighting each other in this thread.

1. Algorithmic problem solving.
Tim here suggests that assembler should be used to teach students
how to become software engineers, because the simple structure
of the assembly language is easy to understand. Students will
be able to follow it, etc. Additionally, if (and only if) people
studying this way become good programmers, then they are the type
of people who can work things out for themselves.

2. Software engineering.
There are two things here: one is abstraction, and the other is
best-fit solution from multiple points of reference. (I like the
second because it works for me; I always used to call this
abstraction, but the rest of the software engineering
community, in fact, does not seem to.)

I take no credit for this observation, it is blatantly plagiarised.

Abstraction, the software design technique of placing a model for
your design which is as wide as possible (such as designing your
car control system to actually be a system which can handle all
vehicles with up to, say, 14 wheels, and then constraining your
implementation to a four-wheeled vehicle by parameters), is part
of the whole series of modern developments in software engineering
which have been created or refined since early languages were
created. It is in these fields that languages such as Ada95 and C++
have advantages over their earlier counterparts.

Plagiarism over.

We see modern languages providing the means to implement,
efficiently (or not), modern design techniques (such as, but not
limited to, abstraction) which are the solutions to the increasingly
varied types of problems that software people now face. These
different approaches to problem solving are mirrored in
modern languages, and by learning (using in the way they were
intended) languages with sufficiently different front ends, the
methodology rubs off and affects the user's thinking. (Of course
the methodology will not rub off if the programmer writes the
new language in the same way as he would have his old language.
This is why I always feel you must love the language. This accounts
for the experiences of people achieving better solutions in their
preferred language, even though it may not be the best language to
solve the problem. It also explains why people hate some languages,
which of course they don't; they just resent having to use them.)

In contrast :

  Modern assembler does nothing more clever at the programmer
  interface than it did twenty years ago, even though modern
  processors are miles more efficient than they ever were.

  In my opinion, the main improvements in processor technology
  have come in the internal way they get their speed. There are,
  I guess, two exceptions to this that really affect the
  programmer: the ability to connect them together, that is,
  multiprocessing... and the useful test-and-set instructions.

-------------------------- ~ ----------------------------

In an interesting post a couple of weeks ago, Robert Dewar,
replying to somebody who said they had learned
5 or 6 assemblers 'just for fun', argued that this kind of experience
was not valuable. This fits in with my pet theory on the
subject, because all assemblers are basically the same: they
implement clever techniques below the programmer interface,
but nothing innovative at it.

Tim seems to be of the school that teaching CS should be done
in assembler because students then don't have to put up with loads
of abstractions (read: modern methods). Peter, on the other
hand, basically disagrees with him, on the grounds that,
with the exception of new device drivers, all the easy stuff (the
stuff that is so easy it can be done in assembler just with
simple jumps, adds, etc.) has been done. Modern methods have
evolved to cope with problems that could not be solved before, or
for which, at the very least, old methods provided unreliable or
slowly developing solutions.
I do not think problems will converge in their similarity either;
by their very nature, new problems will diverge as
ways are found to solve existing ones.

I have to agree with Peter in this case. I do not think that
assembler is a very useful grounding for programmers, even 
though Z80 was basically my first language. I think I would
have been a better programmer had this not been the case.

Just how much speed do we need anyway? Even if hand-written
assembler were faster than compiled code, which it is not, these
things need to be put into perspective. A modern high-end
desktop PC will be faster than an old Cray One (400 MHz, I think)
in _less_ than two years.

>
>I chalked this up to the lack of the fundamentals being taught,
>and the students having their brains filled up so much with
>abstractions that they don't understand how to solve problems
>anymore.

It was probably just a lack of what you consider to be fundamental;
however, with very few exceptions, straight assembler will be
almost totally unused, anywhere, twenty years from now.

Furthermore, if students were told that computers implemented,
at ground level, say, strings, and all these students'
interfaces provided strings, then they would think the bottom
level was their provided interface. There is no reason why they
even need to know assembler these days; ten or twenty years
from now, there will be even less of a case.

>
>This is why I think assembly is the better way to teach
>algorithms; it's just you and the algorithm.  It forces them
>to really think about the problem, because they don't have any
>"training wheels" to protect them from the problem.
>

Very few people have good problem-solving abilities, I agree,
but a lot of this minority have developed the ability themselves,
or inherited it from their parents (or a bit of both.)

Out of these, almost nobody has derived sound software engineering
methods from first principles. They have to be taught.

>Whatever we're doing now is *not working*, let me tell you.

Perhaps if you tested people differently, and employed the
people who could understand sound abstract concepts, then the
solutions your company provides would be designed better
in the first place, rather than needing lots of clever bits of
algorithmic code to rescue the inexperienced designs that the
hackers you have employed have made. Then you may see more
success, and not feel the situation is so hopeless. :-)
Inexperienced (or narrowly experienced) but talented
problem solvers, in my experience, often cause more damage
than they should, because they are more likely to just say,
"I can see how to do it!" and start coding. Yet all these
problems, where it is humanly possible to achieve a viable
solution using this technique, are drying up. Unfortunately,
these problem solvers, having relied on their wits since the
day they first started to think, are the last people to step
back and ask, "Is there another way of doing this?"

In the rare cases where they do, they (like ex-boy-racers
turned thirty, with children) still retain their speed but
become the safest people on the road, because they have seen
the behaviour of the vehicle beyond its limits, and they
know not only when it is safe to speed, but also when it
is not.

>
>-- Tim Behrendsen (tim@airshields.com)


I think Peter is on the right track here.

Best regards,
Ian.
---
Ian Ward's opinions only : ian@rsd.bel.alcatel.be




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-07  0:00                                   ` Tim Behrendsen
@ 1996-08-07  0:00                                     ` Dan Pop
  1996-08-08  0:00                                       ` Tim Behrendsen
  1996-08-08  0:00                                       ` Christopher R Volpe
  1996-08-07  0:00                                     ` Peter Seebach
                                                       ` (8 subsequent siblings)
  9 siblings, 2 replies; 688+ messages in thread
From: Dan Pop @ 1996-08-07  0:00 UTC (permalink / raw)



In <01bb83f5$923391e0$87ee6fce@timpent.airshields.com> "Tim Behrendsen" <tim@airshields.com> writes:

>The problem is that we *can't* think purely abstractly,
>otherwise we end up with slow crap code.

Care to provide some concrete examples?

>It is simply not
>possible to ignore the way code is structured, and completely
>depend on the compiler to save us.

This doesn't make any sense to me.  Could you be a little bit more
explicit?

The compiler definitely won't save my ass if I choose to use bubblesort
instead of quicksort on a large dataset, but the selection between the
two algorithms is made based on an abstraction (algorithm analysis) not
on how the compiler generates code for one or another.  It's very likely
that quicksort will be better, no matter the compiler and the underlying 
platform.

Once you put micro-optimizations based on knowledge about the
compiler and/or hardware into the code, you impair both the 
readability/maintainability/portability of the code and the opportunities
of another compiler, on another platform, to generate optimal code.
There are situations when this _has_ to be done, but they are isolated
exceptions, not the norm.

Dan
--
Dan Pop
CERN, CN Division
Email: Dan.Pop@cern.ch 
Mail:  CERN - PPE, Bat. 31 R-004, CH-1211 Geneve 23, Switzerland




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-07  0:00 ` Fergus Henderson
@ 1996-08-07  0:00   ` Tim Behrendsen
  1996-08-08  0:00     ` Szu-Wen Huang
  0 siblings, 1 reply; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-07  0:00 UTC (permalink / raw)



Fergus Henderson <fjh@mundook.cs.mu.OZ.AU> wrote in article
<4uaqqg$203@mulga.cs.mu.OZ.AU>...
> "Tim Behrendsen" <tim@airshields.com> writes:
> >Who's talking about showing them?  I would suggest that if
> >they wrote a quicksort in assembler, they will have a much
> >better "feel" for the algorithm, than if they wrote it in C.
> 
> So which is better use of a student's time, writing 100 lines of
> quicksort in assembler, or using C and writing 20 lines of quicksort,
> 10 lines of insertion sort, 20 lines of heap sort, 20 lines of
> merge sort, and 30 lines of glue to test them all out?

I think it's more valuable to truly understand one or two algorithms,
than to vaguely understand 5 algorithms.

> Which will give them more practice in choosing meaningful variable
> names?
> Which will give them experience of the idea of
> having multiple implementations of the same interface?  Which will
> teach them how to write portable code?  Which will give them more
> experience with the sort of software engineering problems they are
> likely to encounter in the Real World [tm]?

Easily learned. Later. More important to understand the
procedural nature of the computer at first.

> Which will give them understanding of a range of different
> sorting algorithms?

I would rather they have a better fundamental understanding
of algorithms in general.

If I may extend a famous quote,

"Teach a man an algorithm, and you have given him one solution.
Teach a man to think, and you have given him all solutions."

[Hey!  I like that... anybody want to quote me in their next
book? ;-> ]

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-06  0:00                                 ` Peter Seebach
@ 1996-08-07  0:00                                   ` Tim Behrendsen
  1996-08-07  0:00                                     ` Dan Pop
                                                       ` (9 more replies)
  0 siblings, 10 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-07  0:00 UTC (permalink / raw)



Peter Seebach <seebs@solutions.solon.com> wrote in article
<4u89c4$p7p@solutions.solon.com>...
> In article <01bb83ad$29c3cfa0$87ee6fce@timpent.airshields.com>,
> Tim Behrendsen <tim@airshields.com> wrote:
> >Let me bring it back full-circle where we started.  The reason
> >I mention assembly in the first place was the number of graduates
> >coming to me for a job that were failing the test I give
> >*abysmally*, particularly in the areas of creating an algorithm
> >for a problem they've never seen before, and doing logical
> >operations.
> 
> These are fundamentally abstract operations.  Generalization is the core of
> algorithm design; you must abstract what the problem is, what a solution looks
> like, and what their relationship is.  Logic is entirely an abstract beast.
> 
> >I chalked this up to the lack of the fundamentals being taught,
> >and the students having their brains filled up so much with
> >abstractions that they don't understand how to solve problems
> >anymore.
> 
> The entire concept of a solution to a problem, rather than the answer to a
> question, is a matter of abstraction.

I agree; my point is that I think the student learns more if they
are thinking purely in terms of fundamental operations (which are
still abstractions above the raw hardware components), rather
than layers and layers of syntax that hide the essential essence
of what the computer is doing.

The people I saw could quote me book definitions of C syntax.  But
they had very little knowledge of what it really meant, and how
to use it.

If programming is reduced to fundamentals of move, arithmetic,
test, branch, etc it prevents the student from leaning on the
abstraction rather than truly understanding the solution to the
problem.  In other words, if they can express it in the above
terms, you *know* they understand the solution.
 
> >This is why I think assembly is the better way to teach
> >algorithms; it's just you and the algorithm.  It forces them
> >to really think about the problem, because they don't have any
> >"training wheels" to protect them from the problem.
> 
> But the training wheels protect them, not from the problem, but from the
> implementation.
> 
> Consider string searching.
> 
> An *algorithm* for searching strings has nothing to do with whether you can
> use a postdecrement or postincrement addressing mode.
> 
> An assembly program for doing so may well *have to* consider these issues.

True; there is assembly syntax that could be considered irrelevant
to the central issues.  I really don't see that as a big problem
compared to the much larger problem of the brain-damaged students
that are asking me for jobs.

Perhaps the solution is a middle ground one; rather than "real"
assembly for students, maybe what they need is a "MIX" kind of
thing where it is reduced to fundamental elements, an assembly
abstraction if you will.  That way, "dangerous" concepts such
as post-increment, etc. can be safely hidden away, but we can
still allow them to think in pure data manipulation terms.
 
> >Whatever we're doing now is *not working*, let me tell you.
> 
> What we're doing now is exactly what we do in every other field; we give them
> lists of desired answers.  Most math programs would be just as bad if math
> were as high paying and easy to get a job in as CS.
> 
> I've seen *very* few good teachers in any field, especially ones which require
> abstraction.
> 
> The problem I've seen is that students think too much about how to implement,
> and not enough about the abstract design.  I have this problem, even.  I spend
> way too much time, during designs, thinking about what the data type will need
> to look like, rather than what it will need to be able to do.  This frequently
> turns into extra time about halfway through the implementation, because I
> didn't think enough about the *abstract* part of the problem.
> 
> I haven't even thought about memory layouts, and I'm *still* too specific and
> not abstract enough.

The problem is that we *can't* think purely abstractly,
otherwise we end up with slow crap code.  It is simply not
possible to ignore the way code is structured, and completely
depend on the compiler to save us.  That is just not the
real world.

At least not my world, where I have to pack as many users
as possible onto one CPU.

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-08  0:00                                       ` Tim Behrendsen
  1996-08-08  0:00                                         ` Peter Seebach
@ 1996-08-08  0:00                                         ` telnet user
  1996-08-09  0:00                                           ` Tim Behrendsen
  1996-08-09  0:00                                           ` Ed Hook
  1996-08-09  0:00                                         ` Mike Rubenstein
  2 siblings, 2 replies; 688+ messages in thread
From: telnet user @ 1996-08-08  0:00 UTC (permalink / raw)



Tim Behrendsen (tim@airshields.com) wrote:

: This is an interesting case, because it is somewhat inefficiently
: implemented.  If you're interested in speed, you would do...

: int gcd(int x, int y) {
:     int z;
:     while (y != 0)
:         z = y; y = x % y; x = z;
:     return(y);
: }

However, if you are interested in correctness, you use braces for the
loop.  I think this case, where you write assembly in C, get it wrong,
and ...

: Using my AIX compiler, I get a nominal improvement of about
: 10%, mostly because the speed of the modulo is much slower
: than the inefficiency of recursion.

only achieve a 10% speedup proves everyone else's point, and is an
appropriate place to end this thread.
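
(For the record, a braced -- and correct -- iterative version would be
something like the sketch below; note that it has to return x, not y,
since y is zero when the loop exits:

	int gcd(int x, int y)
	{
		int z;

		while (y != 0) {
			z = y;
			y = x % y;
			x = z;
		}
		return x;
	}

This is offered only as a reference point for the discussion above.)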

---------------------------------------------------------------------------
Tim Hollebeek         | Disclaimer :=> Everything above is a true statement,
Electron Psychologist |                for sufficiently false values of true.
Princeton University  | email: tim@wfn-shop.princeton.edu
----------------------| http://wfn-shop.princeton.edu/~tim (NEW! IMPROVED!)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-07-28  0:00                 ` J. Christian Blanchette
                                     ` (3 preceding siblings ...)
  1996-07-31  0:00                   ` AJ Musgrove
@ 1996-08-08  0:00                   ` William Clodius
  1996-08-11  0:00                     ` Dik T. Winter
  1996-08-11  0:00                     ` Fergus Henderson
  1996-08-08  0:00                   ` William Clodius
                                     ` (4 subsequent siblings)
  9 siblings, 2 replies; 688+ messages in thread
From: William Clodius @ 1996-08-08  0:00 UTC (permalink / raw)



There is one special case of recursion, fortunately the most commonly
encountered case, where it is "ridiculously" easy for a compiler to
see that the recursion can be replaced by iteration. The recursive
code must call itself directly on only one branch of the flow of
control, and the call must be the last thing done before returning on
that branch. Conventionally it is placed on the last branch of the flow
of control for clarity, hence the name "tail" recursion.

Quoting the NEW Hackers dictionary

"tail recursion: n. If you aren't sick of it already, see tail
recursion." 

which is essentially equivalent to

tail recursion: n. while you aren't sick of it, read "tail recursion:
n. If you aren't sick of it already, see tail recursion."

There are other more subtle cases (where indirect recursion is invoked
(A calls B which calls A), multiple recursion (A calls A more than
once), etc.) where recursion elimination is possible, but significantly
more difficult to detect and implement. For example, B could be
inlined into A and result in tail recursion for the new version of
A. It is doubtful that any compiler will recognize all special cases,
but there are several that can handle more than just tail recursion.
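
Using the gcd() function discussed elsewhere in this thread as an
illustration (a sketch, not any particular compiler's output), the
transformation looks roughly like this:

	/* Tail recursive: the recursive call is the last thing done. */
	int gcd(int x, int y)
	{
		if (y == 0)
			return x;
		return gcd(y, x % y);	/* tail call */
	}

	/* What a compiler can turn it into: the call becomes a jump back
	   to the top of the function, i.e. a loop. */
	int gcd_iterative(int x, int y)
	{
		while (y != 0) {
			int t = x % y;
			x = y;
			y = t;
		}
		return x;
	}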

-- 

William B. Clodius		Phone: (505)-665-9370
Los Alamos National Laboratory	Email: wclodius@lanl.gov
Los Alamos, NM 87545




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-08  0:00     ` Szu-Wen Huang
@ 1996-08-08  0:00       ` Tim Behrendsen
  1996-08-08  0:00         ` Peter Seebach
                           ` (2 more replies)
  0 siblings, 3 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-08  0:00 UTC (permalink / raw)



Szu-Wen Huang <huang@mnsinc.com> wrote in article
<4ubnhr$714@news1.mnsinc.com>...
> Tim Behrendsen (tim@airshields.com) wrote:
> 
> : I think it's more valuable to truly understand one or two algorithms,
> : than to vaguely understand 5 algorithms.
> 
> 1.  Being able to write algorithm X in assembly doesn't make you
>     understand the algorithm better.  It makes you understand whatever
>     platform du jour your school uses better.
> 2.  Vaguely understanding 5 algorithms is far better than understanding
>     one or two fully.  The primary objective of algorithms courses is
>     to produce students that can *choose* which algorithm to use,
>     not to produce students who can memorize one or two algorithms.
>     In fact, there's rarely a programmer more dangerous than one that
>     has no broad knowledge of algorithms.

My point is someone who has "vaguely" learned five algorithms has
simply memorized them, and learned nothing about the general
principles of how algorithms are created.

Someone who has "the feel" for two algorithms very strongly is
more likely to be able to extend that knowledge to creating new
algorithms.

Computer Science is one of the few, if not the only, sciences where a
student can derive all of it just by thinking.

> [snip]
> : > Which will give them experience of the idea of
> : > having multiple implementations of the same interface?  Which will
> : > teach them how to write portable code?  Which will give them more
> : > experience with the sort of software engineering problems they are
> : > likely to encounter in the Real World [tm]?
> 
> : Easily learned. Later. More important to understand the
> : procedural nature of the computer at first.
> 
> If you think software engineering is easily learned, you obviously
> haven't been in the real world.

I live lame software engineering every day, believe me I
understand the level of general ignorance.  You can always
fix someone's style, but you must plant the seed of thought
early on.

> : > Which will give them understanding of a range of different
> : > sorting algorithms?
> 
> : I would rather they have a better fundamental understanding
> : of algorithms in general.
> 
> I thought you just said it was better to "truly understand one or
> two algorithms"?  Make up your mind.  General knowledge, or specific
> knowledge?

Truly understanding two algorithms is better than memorizing five
algorithms, because that is what *gives" the fundamental
understanding.

> : If I may extend a famous quote,
> 
> : "Teach a man an algorithm, and you have given him one solution.
> : Teach a man to think, and you have given him all solutions."
> [snip]
> 
> Teach a man an algorithm and some I/O routines to enter the input
> and display the output, then some routines to set up the stack,
> then some routines to initialize data,

Yes, and wouldn't they truly understand I/O routines and stacks
after that?

> then some reasons why
> instruction X cannot follow instruction Y, then some reasons why
> a small memory model isn't enough, and you confuse the man for
> life.

Yes, if your model is the brain-damaged 8086 model.  I personally
would use a 68000 to teach on because it's a nice straight-forward
orthogonal instruction set.

-- Tim Behrendsen




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-08  0:00       ` Tim Behrendsen
  1996-08-08  0:00         ` Peter Seebach
@ 1996-08-08  0:00         ` Szu-Wen Huang
  1996-08-08  0:00           ` Tim Behrendsen
                             ` (2 more replies)
  1996-08-08  0:00         ` Christopher R Volpe
  2 siblings, 3 replies; 688+ messages in thread
From: Szu-Wen Huang @ 1996-08-08  0:00 UTC (permalink / raw)



Tim Behrendsen (tim@airshields.com) wrote:
: Szu-Wen Huang <huang@mnsinc.com> wrote in article
: <4ubnhr$714@news1.mnsinc.com>...

: > 1.  Being able to write algorithm X in assembly doesn't make you
: >     understand the algorithm better.  It makes you understand whatever
: >     platform du jour your school uses better.
: > 2.  Vaguely understanding 5 algorithms is far better than understanding
: >     one or two fully.  The primary objective of algorithms courses is
: >     to produce students that can *choose* which algorithm to use,
: >     not to produce students who can memorize one or two algorithms.
: >     In fact, there's rarely a programmer more dangerous than one that
: >     has no broad knowledge of algorithms.

: My point is someone who has "vaguely" learned five algorithms has
: simply memorized them, and learned nothing about the general
: principles of how algorithms are created.

No, your point was writing an algorithm in assembly helps to understand
it fully.  I'll be my own case in point.  I remember most sorting
algorithms vaguely, and I probably could not implement all of them
off the top of my head.  Does that hamper me?  No, because I know
which book to look them up in, and that's exactly what books are for.
The important thing, however, is that I remember that O(n^2) is bad
for a sorting algorithm, and O(n lg n) is pretty good.

Now, what about your student that knows *one* algorithm?

: Someone who has "the feel" for two algorithms very strongly is
: more likely to be able to extend that knowledge to creating new
: algorithms.

Uhm, theoretically.  Somebody with only in-depth knowledge of run-
length encoding is not very likely to come up spontaneously with
Huffman or arithmetic compression.

: Computer Science is one of the few, if not the only, sciences where a
: student can derive all of it just by thinking.

Assuming the student doesn't have a deadline and has all day to think
for 40 years.  Many algorithms are obvious.  Insertion sort, for example,
is what a human would do (if somewhat parallelized).  Quicksort, on
the other hand, is nowhere near intuitive.  Be realistic.

: > [snip]
: > If you think software engineering is easily learned, you obviously
: > haven't been in the real world.

: I live lame software engineering every day, believe me I
: understand the level of general ignorance.  You can always
: fix someone's style, but you must plant the seed of thought
: early on.

Are you asserting that teaching software engineering instead of assembly
language will somehow make the student unable to think?  Assembly
language is a powerful tool best used by experts, not beginners.
Beginners do not have the knowledge to use them properly, and tend
to get caught in details and miss the big picture.  If you give an
assignment of "implement quicksort in PC assembly", how much effort
is spent in:

1.  the algorithm
2.  I/O
3.  OS/platform intricacies
4.  debugging

?  I would imagine your goal would be to maximize #1, which is why
a HLL is better for the purpose.

: > I thought you just said it was better to "truly understand one or
: > two algorithms"?  Make up your mind.  General knowledge, or specific
: > knowledge?

: Truly understanding two algorithms is better than memorizing five
: algorithms, because that is what *gives" the fundamental
: understanding.

Fundamental understanding of RLE has little, if not nothing, to do
with Huffman compression.  Fundamental understanding of quicksort
has nothing to do with heapsort either.

: > Teach a man an algorithm and some I/O routines to enter the input
: > and display the output, then some routines to set up the stack,
: > then some routines to initialize data,

: Yes, and wouldn't they truly understand I/O routines and stacks
: after that?

They would, but would they understand the algorithm?

: > then some reasons why
: > instruction X cannot follow instruction Y, then some reasons why
: > a small memory model isn't enough, and you confuse the man for
: > life.

: Yes, if your model is the brain-damaged 8086 model.  I personally
: would use a 68000 to teach on because it's a nice straight-forward
: orthogonal instruction set.

It doesn't matter.  You are burdening beginners with details that
you could've avoided by using an HLL.  My school was contemplating
teaching one freshman class Prolog as a first language and another
Pascal, then having them switch in their second year to observe which
was more effective.  It's too bad the experiment has practical problems,
because it would be interesting to see if it's easier for a "recursive-
minded" student to study iteration or the reverse.  In any case,
you want to train students in problem solving, not why the OS requires
value X in register Y.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-07  0:00                                     ` Peter Seebach
@ 1996-08-08  0:00                                       ` Tim Behrendsen
  1996-08-08  0:00                                         ` Peter Seebach
  0 siblings, 1 reply; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-08  0:00 UTC (permalink / raw)



Peter Seebach <seebs@solutions.solon.com> wrote in article
<4uahfe$bao@solutions.solon.com>...
> In article <01bb846c$e51df220$87ee6fce@timpent.airshields.com>,
> Tim Behrendsen <tim@airshields.com> wrote:
> >Peter Seebach <seebs@solutions.solon.com> wrote in article
> ><4u8lff$3bs@solutions.solon.com>...
> 
> >Not to get into a debate on the meaning of abstraction, but the
> >point is that there is very little hidden from the student when
> >they are learning assembly.  This allows them to better concentrate
> >on the basics of algorithms, because they are not distracted by syntax.
> 
> Huh?
> The basics of algorithms are hidden from them, when they are sitting
> around counting bytes and remembering the mnemonics for operations.
> 
> Assembly has more syntax per operation done than anything else.

But the syntax is very straightforward and direct. There are
very few operations in assembly that need to be learned.

> >Of course, but I'm talking about abstractions of assembly, i.e.,
> >HLLs.  Remember, C (or any HLL) does not really exist as far as
> >the computer knows.  Assembly is the direct raw instruction set of
> >the physical machine.  If the student is learning algorithms in
> >assembly, they are unquestionably learning the algorithm, and not
> >just some vague concept wrapped in 10 layers of wool.
> 
> No, they're learning assembly.  Assembly is relatively hard.  No amount
> of abstraction in a language will hide the algorithm; the more abstract the
> language, the more visible the algorithm, because there's less and less there.
> 
> The only real exception is when the language encapsulates an algorithm; using
> qsort() does not teach you a sorting algorithm.  But writing a quicksort() in
> C will teach you at least as much about quicksort as writing it in assembly.

Yes, but will it teach you as much about *computers*?  The reality
is that the computer does not execute C.

> >> >Perhaps a better question is, which is more important: Learning
> >> >abstractions or algorithmic analysis?  I say that algorithmic
> >> >analysis is 10 to 1 more important than abstractions.
> 
> >> "Learning English is 10 to 1 more important than learning any
language."
> 
> >True, but don't get me started on the English skills of some of
> >my applicants!
> 
> Huh?  The point of my statement is that English *is* a language. 
Algorithmic
> analysis cannot be more important than abstractions, because it's a
subset of
> abstractions.

That is simply not true!  That is like saying that thought is
a subset of language.  Thoughts are expressed in language,
but they are not dependent on a particular language.

I don't know about anybody else, but when I think about an
algorithm, I have a visualization of the data moving around
and going through transformations.  I get a feel for the
efficiency by thinking about how much work is involved in doing
the movement/transformations.  I want students learning these
concepts for the first time to get this same feel, without
vaguely memorizing algorithms, which is what I think happens
now.

I think one of the reasons algorithms get memorized rather
than learned is that they are protected too much by the
abstraction of arrays, rather than the reality of memory.

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-08  0:00                                       ` Tim Behrendsen
@ 1996-08-08  0:00                                         ` Peter Seebach
  1996-08-08  0:00                                           ` Tim Behrendsen
  1996-08-09  0:00                                           ` Chris Sonnack
  1996-08-08  0:00                                         ` telnet user
  1996-08-09  0:00                                         ` Mike Rubenstein
  2 siblings, 2 replies; 688+ messages in thread
From: Peter Seebach @ 1996-08-08  0:00 UTC (permalink / raw)



In article <01bb853b$ca4c8e00$87ee6fce@timpent.airshields.com>,
Tim Behrendsen <tim@airshields.com> wrote:
> >> But your concept of what the "fundamental operations" are is completely tied
> >> up in how the specific hardware you've seen operates.  Algorithms exist in terms
> >> of abstract operations, not moves and branches.

>Please name me the hardware that does not use moves and branches.

That's not the *point*.  Can you name me hardware with no protons?  Can you
name me hardware students are likely to learn on that doesn't use 5 volt
power on the chip?

However, the algorithm is *NOT DEPENDENT ON* moves and branches.  I can
do quicksort.  In my head.  On paper.  I can sit down, with a pencil, and
implement quicksort.  You can say "these operations correspond to moves and
branches", but really, moves and branches are corresponding to more general
operations.  If you teach the students that it's all moves and branches, they
end up with more understanding of the specific hardware you teach them on, and
current trends in CPU design, *and less of the algorithm*.

I think algorithms should be taught on paper.  *Away* from implementations
on computer.

Believe me, you understand quicksort much better after *DOING* it than you
can from writing any reasonable number of implementations.

>> In C, we write

>> 	int
>> 	gcd(int x, int y) {
>> 		if (y == 0)
>> 			return x;
>> 		return gcd(y, (x % y));
>> 	}

>This is an interesting case, because it is somewhat inefficiently
>implemented.  If you're interested in speed, you would do...

>int gcd(int x, int y) {
>    int z;
>    while (y != 0)
>        z = y; y = x % y; x = z;
>    return(y);
>}

Yes, you would.  But so would at least one compiler I use.  Internally.

It doesn't *matter*.  The realization is that, for non-zero x and y, the
greatest common divisor of x and y is conveniently the greatest common
divisor of y and (x mod y).  *THAT* is the algorithm.  The rest is fluff
and nonsense.  You can do that operation on paper, too.

>Using my AIX compiler, I get a nominal improvement of about
>10%, mostly because the speed of the modulo is much slower
>than the inefficiency of recursion.

So?

What numbers are students entering that we *CARE* about the 10% speed
difference?  Optimization is an interesting thing to look at, but has
nothing to do with algorithms.

>My point is that how does a C programmer know that recursive
>implementations are somewhat inefficient?

The C programmer doesn't.  And at least one compiler has been known
to optimize tail-recursion into iteration, sometimes even in the 2-function
case.

>Recently the C/C++
>Users Journal had a little bite-sized article about a
>general looping algorithm, where you could specify the number
>of nested loops in a dynamic fashion.  It was implemented
>recursively.  I wrote a letter back reimplementing the
>algorithm non-recursively, and got like a 40% increase in
>speed.

Neat.  And?  The key here is that, once they showed you *the algorithm*, you
could implement it however you want.  The algorithm itself does not depend on
your speed.

>These sorts of issues are where code bloat comes from, and
>it comes from naive implementations of "valid C syntax".

No, they aren't.  All of those put together wouldn't explain code bloat.

We are seeing programs that are a factor of ten, or more, larger than we think
they are.  These would have to be *mind-numbing* idiocy.

Microoptimizing will not fix it.  Teaching people about *real* design
principles will.

It's worth observing good programmers rewriting.  A good programmer will
add features and cut the size of a program slightly; a bad programmer will
bloat it hugely to add the feature.  (Assuming the original program to be
mediocre.)

A good design will save more space and time than any amount of optimization of
a bad design.

>> But it also prevents them from learning the abstraction, and truly
>> understanding the *principle* of the solution.

>If the C abstraction is good, more abstraction must be better
>then.  How about we teach everything in APL, where we can
>*really* abstract away the details?  No data types, full
>array operations.  Talk about easy quicksort!  I can write it
>in one line of code using a handful of operations (my APL
>is *really* rusty, so I can't give the example).

I sorta favor lisp, which I don't know, but which is elegant and readable.

>The student learns Quicksort, there is no question about it.
>But what have they *really* learned?

They've learned the algorithm itself.  If you want to teach them about
optimization and efficiency, go right ahead, but distinguish O(n) vs O(N^2)
type efficiency from the kinds of efficiency you're talking about.

Remember, that O() includes arbitrary constant multipliers and additions.
8N is just the same as N in complexity.

I think it's a cool idea to teach people how to turn an 8N into a 4N
algorithm.  I don't think it's part of the algorithm, I think it's a different
thing.
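
As a rough illustration of that distinction (a sketch with hypothetical
sum functions; the actual constant factors depend entirely on the
compiler and machine): both routines below are O(N), and the second
merely trims loop overhead by handling two elements per pass.

long sum_simple(const int *a, int n)
{
    long s = 0;
    int i;
    for (i = 0; i < n; i++)
        s += a[i];
    return s;
}

long sum_unrolled(const int *a, int n)    /* assumes n is even */
{
    long s = 0;
    int i;
    for (i = 0; i < n; i += 2)
        s += a[i] + a[i + 1];             /* same work, fewer loop tests */
    return s;
}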

>> Nonsense.  A well designed and considered abstraction will generally lend
>> itself to an efficient and elegant implementation.  An ill-considered
>> abstraction will spend more time on cruft than it will on solving the
>problem.

>Like recursive algorithms?

I'm not sure what you're referring back to.  Recursion is one of the first
things I look to eliminate if I have an expensive implementation of what I
know to be a good algorithm.  But if the cost is small, it tends to stay,
because it's frequently obvious how it's supposed to work.

>Agreed; but there are general principles that can be learned
>across *all* architectures.  See follow up to Dan Pop post for
>example of this.

Sure, there are.  And these, I think, can reasonably be taught.  But that
teaching doesn't need any assembly.

-s
-- 
Peter Seebach - seebs@solon.com - Copyright 1996 - http://www.solon.com/~seebs
Unix/C Wizard - send mail for help, or send money for consulting!
The *other* C FAQ, the hacker FAQ, et al.  See web page above.
Unsolicited email (junk mail and ads) is unwelcome, and will be billed for.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-08  0:00       ` Tim Behrendsen
@ 1996-08-08  0:00         ` Peter Seebach
  1996-08-08  0:00           ` Tim Behrendsen
  1996-08-10  0:00           ` Mike Rubenstein
  1996-08-08  0:00         ` Szu-Wen Huang
  1996-08-08  0:00         ` Christopher R Volpe
  2 siblings, 2 replies; 688+ messages in thread
From: Peter Seebach @ 1996-08-08  0:00 UTC (permalink / raw)



In article <01bb8536$892ee260$87ee6fce@timpent.airshields.com>,
Tim Behrendsen <tim@airshields.com> wrote:
>My point is someone who has "vaguely" learned five algorithms has
>simply memorized them, and learned nothing about the general
>principles of how algorithms are created.

Which is exactly what will happen if they code them in assembly; they
can't possibly be learning how algorithms are created if they start with
implementing them in a low level language.  Algorithms are generally written
in natural languages, and written on napkins.

The details of which bytes move where are *not* part of the algorithm itself.

>I live lame software engineering every day, believe me I
>understand the level of general ignorance.  You can always
>fix someone's style, but you must plant the seed of thought
>early on.

Yes.

>Truly understanding two algorithms is better than memorizing five
>algorithms, because that is what *gives" the fundamental
>understanding.

But experimenting with the five algorithms, and comparing them, is a better
way to truly understand them than implementing them in a difficult language.

>> Teach a man an algorithm and some I/O routines to enter the input
>> and display the output, then some routines to set up the stack,
>> then some routines to initialize data,

>Yes, and wouldn't they truly understand I/O routines and stacks
>after that?

Sure.  Implementing I/O routines and stacks, *in any language*, is a good way
to learn about I/O routines and stacks.  It's a crappy way to learn about
sorting.

Further, learning about stacks is a bad way to learn about computing; stacks are
not a universal implementation of computers.  Students who learn about stacks
early on may start assuming that that's somehow a basic truth of computing.
They may do things like assert that the addresses of local variables in one
function are always lower than the addresses of local variables in a function
"above" it on the stack.  Or maybe that should be higher; both kinds exist.

Being aware of a *conceptual* stack is different.
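
A small sketch of the non-portable assumption being warned against
(illustrative only; which address comes out numerically lower varies by
machine and ABI, and relationally comparing pointers to distinct objects
isn't defined anyway):

#include <stdio.h>

static void callee(int *callers_local)
{
    int my_local;
    printf("caller's local: %p\n", (void *)callers_local);
    printf("callee's local: %p\n", (void *)&my_local);
    /* On some systems the callee's local has the lower address, on
       others the higher; portable code must not rely on either. */
}

int main(void)
{
    int outer = 0;
    callee(&outer);
    return 0;
}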

>Yes, if your model is the brain-damaged 8086 model.  I personally
>would use a 68000 to teach on because it's a nice straight-forward
>orthogonal instruction set.

But not nearly as straightforward as C or lisp.

-s
-- 
Peter Seebach - seebs@solon.com - Copyright 1996 - http://www.solon.com/~seebs
Unix/C Wizard - send mail for help, or send money for consulting!
The *other* C FAQ, the hacker FAQ, et al.  See web page above.
Unsolicited email (junk mail and ads) is unwelcome, and will be billed for.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-08  0:00                                       ` Tim Behrendsen
@ 1996-08-08  0:00                                         ` Peter Seebach
  1996-08-08  0:00                                           ` Randy Kaelber
                                                             ` (2 more replies)
  1996-08-09  0:00                                         ` Dan Pop
  1996-08-18  0:00                                         ` Sam B. Siegel
  2 siblings, 3 replies; 688+ messages in thread
From: Peter Seebach @ 1996-08-08  0:00 UTC (permalink / raw)



In article <01bb8534$b2718bc0$87ee6fce@timpent.airshields.com>,
Tim Behrendsen <tim@airshields.com> wrote:
>Look at the code-bloated and slow-software world we live in,
>particularly on desktop platforms.  I think this is caused by
>people not truly understanding what's *really* going on.

I believe it's caused by people who have a poor understanding of what
they're trying to do.

>I've spoken to enough people that have had C++ disasters to
>convince me that the more abstraction there is, the more
>danger there is of inefficient code.  This shouldn't be that
>hard to believe; any time you abstract away details you are
>giving up knowledge of what is going to be efficient.

Abstraction itself is an intentional inefficiency in most cases.  A
function which writes to files on disks *or* to terminal devices is
*guaranteed* to take at least one more test than a simpler function
which handles only one.

>I alluded to this in another post, but a good example is Motif
>and X11.  A programmer who only understands Motif, but does not
>understand X11 is going to write slow crap, period.

No, a programmer who uses Motif at all is likely to generate slow programs,
because there are too many layers involved, and I'm not sure either X *or*
Motif is a good model.

>Now, we know that this does not reflect the real world.  The
>question is, how does a programmer learn the efficient
>implementations that the optimizer can deal with effectively
>from the boneheaded ones?

A programmer does this, *IF IT BECOMES NECESSARY*, by experimenting.

>Here's an example:

>int a[50000],b[50000],c[50000],d[50000],e[50000];

This is not a strictly conforming C program (for one thing, arrays this
large exceed the minimum object size an implementation is required to
support).

>void test1()
>{
>    int i, j;
>    for (j = 0; j < 10; ++j) {
>        for (i = 0; i < 50000; ++i) {
>            ++a[i]; ++b[i]; ++c[i]; ++d[i]; ++e[i];
>        }
>    }
>}

>void test2()
>{
>    int i, j;
>    for (j = 0; j < 10; ++j) {
>        for (i = 0; i < 50000; ++i) ++a[i];
>        for (i = 0; i < 50000; ++i) ++b[i];
>        for (i = 0; i < 50000; ++i) ++c[i];
>        for (i = 0; i < 50000; ++i) ++d[i];
>        for (i = 0; i < 50000; ++i) ++e[i];
>    }
>}

>On my AIX system, test1 runs in 2.47 seconds, and test2
>runs in 1.95 seconds using maximum optimization (-O3).  The
>reason I knew the second would be faster is because I know
>to limit the amount of context information the optimizer has
>to deal with in the inner loops, and I know to keep memory
>localized.

>Now I submit that if I showed the average C programmer
>both programs, they would guess that test1 is faster because
>it has "less code", and that is where abstraction,
>ignorance, and naivete begin to hurt.

I am unconvinced that this example is remotely relevant to the general case.

In particular, in the real world, would you rather maintain code more like
the first or the second?  I'd prefer to maintain the first; it would be
easier to change all occurrences of 50000, if this needed to be done.

Abstraction is a trade-off of maintainability and readability vs. code
speed.

I would not expect test1 to be faster.  I would write it anyway, because
I would not expect it to be much slower.  If, *and only if*, there turned
out to be a performance problem, I'd probably start by unrolling the loop
in test1, possibly getting a much greater performance increase.

The cool thing is, I can unroll the loop in test1 much faster than you can
unroll all the loops in test2.  Further, it is obvious in test1 that the
intent is to do the same thing to each array.
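
For what it's worth, a sketch of that unrolling (not from the measured
programs; it assumes the same global arrays as the quoted code, and
relies on 50000 being divisible by 4 so no cleanup loop is needed):

extern int a[50000], b[50000], c[50000], d[50000], e[50000];

void test1_unrolled(void)
{
    int i, j;
    for (j = 0; j < 10; ++j) {
        for (i = 0; i < 50000; i += 4) {
            ++a[i];   ++b[i];   ++c[i];   ++d[i];   ++e[i];
            ++a[i+1]; ++b[i+1]; ++c[i+1]; ++d[i+1]; ++e[i+1];
            ++a[i+2]; ++b[i+2]; ++c[i+2]; ++d[i+2]; ++e[i+2];
            ++a[i+3]; ++b[i+3]; ++c[i+3]; ++d[i+3]; ++e[i+3];
        }
    }
}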

The problem with most slow code is not microoptimizations like this; it's vast
inefficiencies at the basic design level.

If I could arrange to need only one of those five arrays, I bet my code would
be faster than yours, no matter *how* much you optimized it.

-s
-- 
Peter Seebach - seebs@solon.com - Copyright 1996 - http://www.solon.com/~seebs
Unix/C Wizard - send mail for help, or send money for consulting!
The *other* C FAQ, the hacker FAQ, et al.  See web page above.
Unsolicited email (junk mail and ads) is unwelcome, and will be billed for.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-08  0:00                                         ` Peter Seebach
@ 1996-08-08  0:00                                           ` Randy Kaelber
  1996-08-09  0:00                                           ` J. Blustein
  1996-08-09  0:00                                           ` Chris Sonnack
  2 siblings, 0 replies; 688+ messages in thread
From: Randy Kaelber @ 1996-08-08  0:00 UTC (permalink / raw)



Peter Seebach (seebs@solutions.solon.com) wrote:
> In article <01bb8534$b2718bc0$87ee6fce@timpent.airshields.com>,
> Tim Behrendsen <tim@airshields.com> wrote:

> In particular, in the real world, would you rather maintain code more like
> the first or the second?  I'd prefer to maintain the first; it would be
> easier to change all occurrences of 50000, if this needed to be done.

I'd prefer the second, myself, but first, I'd put in:

#define ARRAY_SIZE 50000

and replace all occurrences of 50000 with it. Then, you only have to 
change the 50000 once. :)
--
Randy Kaelber:  kaelbers@muohio.edu
DARS Programmer/Analyst, Miami University, Oxford, OH 45056 USA
http://avian.dars.muohio.edu/~kaelbers/

Unsolicited commercial E-mail will be spell checked for a fee of $50.
Sending such mail constitutes acceptance of these terms. 





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Teaching sorts [was Re: What's the best language to start with?]
  1996-08-07  0:00                                   ` Tim Behrendsen
  1996-08-07  0:00                                     ` Dan Pop
  1996-08-07  0:00                                     ` Peter Seebach
@ 1996-08-08  0:00                                     ` Robert I. Eachus
  1996-08-09  0:00                                       ` Robert Dewar
                                                         ` (4 more replies)
  1996-08-13  0:00                                     ` Robert I. Eachus
                                                       ` (6 subsequent siblings)
  9 siblings, 5 replies; 688+ messages in thread
From: Robert I. Eachus @ 1996-08-08  0:00 UTC (permalink / raw)




In article <4udb2o$7io@solutions.solon.com> seebs@solutions.solon.com (Peter Seebach) writes:

  > However, the algorithm is *NOT DEPENDENT ON* moves and branches.
  > I can do quicksort.  In my head.  On paper.  I can sit down, with
  > a pencil, and implement quicksort.  You can say "these operations
  > correspond to moves and branches", but really, moves and branches
  > are corresponding to more general operations.  If you teach the
  > students that it's all moves and branches, they end up with more
  > understanding of the specific hardware you teach them on, and
  > current trends in CPU design, *and less of the algorithm*.

  > I think algorithms should be taught on paper.  *Away* from implementations
  > on computer.

  > Believe me, you understand quicksort much better after *DOING* it than you
  > can from writing any reasonable number of implementations.

    My favorite way to teach sorts is with cards.  You can use playing
cards, but a set of index cards numbered from 1 to 25 (or even random
four digit numbers if you are teaching radix sort) eliminates
distractions.

    I managed to do the "fun" experiment once.  Take three students
and have them learn Quicksort, Heapsort, and Bubblesort on "small"
decks.  At even 50 to 60 cards, the students doing Heapsort and
Quicksort are racing each other*, and the Bubblesort victim is still
hard at work well after they have finished.

    *The race really is to finish the Quicksort before the Heapsorter
has built a heap. This also shows why quicksort is quick, since the
Quicksort step of picking up the cards is O(n), not O(n log n)

	
--

					Robert I. Eachus

with Standard_Disclaimer;
use  Standard_Disclaimer;
function Message (Text: in Clever_Ideas) return Better_Ideas is...




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-07  0:00                                     ` Dan Pop
  1996-08-08  0:00                                       ` Tim Behrendsen
@ 1996-08-08  0:00                                       ` Christopher R Volpe
  1 sibling, 0 replies; 688+ messages in thread
From: Christopher R Volpe @ 1996-08-08  0:00 UTC (permalink / raw)



Dan Pop wrote:
> 
> In <01bb83f5$923391e0$87ee6fce@timpent.airshields.com> "Tim Behrendsen" <tim@airshields.com> writes:
> 
> >The problem is that we *can't* think purely abstractly,
> >otherwise we end up with slow crap code.
> 
> Care to provide some concrete examples?

I'll give ya one: Michael Laszlo's implementation of the Bentley and
Ottmann plane sweep line intersection algorithm in "Computational
Geometry and Computer Graphics in C++". Very well designed code, in
terms of abstract data structures. Nice O(n log n) algorithm. Looks like
a work of art. Runs a hundred times as slow as a brute-force O(n^2)
algorithm, except when the dataset is a bit on the large side, in which
case it dies after exhausting the virtual memory of my Sparc 20.

--

Chris Volpe			Phone: (518) 387-7766 
GE Corporate R&D		Fax:   (518) 387-6560
PO Box 8 			Email: volpecr@crd.ge.com
Schenectady, NY 12301		Web:   http://www.crd.ge.com/~volpecr




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-08  0:00       ` Tim Behrendsen
  1996-08-08  0:00         ` Peter Seebach
  1996-08-08  0:00         ` Szu-Wen Huang
@ 1996-08-08  0:00         ` Christopher R Volpe
  2 siblings, 0 replies; 688+ messages in thread
From: Christopher R Volpe @ 1996-08-08  0:00 UTC (permalink / raw)



Tim Behrendsen wrote:
> 
> 
> My point is someone who has "vaguely" learned five algorithms has
> simply memorized them, and learned nothing about the general
> principles of how algorithms are created.
> 
> Someone who has "the feel" for two algorithms very strongly are
> more likely to be able to extend that knowledge to creating new
> algorithms.

Just to throw in my $.02 worth, trying to understand an algorithm by
writing/reading it in assembler is like trying to debug an algorithm by
hooking the CPU up to an oscilloscope. There's a reason that algorithms
texts present algorithms in pseudo-code. And the fact that pseudo-code
resembles high-level languages more than assembler is not a coincidence.

> > I thought you just said it was better to "truly understand one or
> > two algorithms"?  Make up your mind.  General knowledge, or specific
> > knowledge?
> 
> Truly understanding two algorithms is better than memorizing five
> algorithms, because that is what *gives" the fundamental
> understanding.
> 

I would say that truly understanding X algorithms is better than
memorizing Y algorithms for all positive values of X and Y. I think the
issue here is whether you can get a better grasp of an algorithm while
being bogged down in the details of assembler.

> > Teach a man an algorithm and some I/O routines to enter the input
> > and display the output, then some routines to set up the stack,
> > then some routines to initialize data,
> 
> Yes, and wouldn't they truly understand I/O routines and stacks
> after that?

Sure, if that's the intent. It doesn't help me understand quicksort,
though.

> 
> > then some reasons why
> > instruction X cannot follow instruction Y, then some reasons why
> > a small memory model isn't enough, and you confuse the man for
> > life.
> 
> Yes, if your model is the brain-damaged 8086 model.  I personally
> would use a 68000 to teach on because it's a nice straight-forward
> orthogonal instruction set.

So, what you're saying is, a simpler expression of an algorithm is
better? Hmmm, interesting :-)

--

Chris Volpe			Phone: (518) 387-7766 
GE Corporate R&D		Fax:   (518) 387-6560
PO Box 8 			Email: volpecr@crd.ge.com
Schenectady, NY 12301		Web:   http://www.crd.ge.com/~volpecr




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with
  1996-08-07  0:00                                 ` What's the best language to start with Ian Ward
@ 1996-08-08  0:00                                   ` Tim Behrendsen
  1996-08-09  0:00                                     ` Robert Dewar
  0 siblings, 1 reply; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-08  0:00 UTC (permalink / raw)



Ian Ward <ian@rsd.bel.alcatel.be> wrote in article
<4uaf5h$mid@btmpjg.god.bel.alcatel.be>...
> There are two fundamental points here which until recently I had
> not sussed, and which seem to be fighting each other in this thread.
> 
> 1. Algorithmic problem solving. 
> Tim here suggests that assembler should be used to teach students
> how to become software engineers, because the simple structure 
> of the assembly language is easy to understand. Students will 
> be able to follow it, etc. Additionally, people studying this, if
> (and only if) they become good programmers using this method, then
> they are the type of people who can work things out for themselves.

What I'm really saying is I want students to get the feel
for algorithmic analysis, and it seems to me that it's better
to "mix the fundamental ingredients" to learn to cook than it
is to reheat a frozen dinner.  Yes, both are technically
cooking, but I married the former. :)
 
> [interesting stuff snipped]
> Just how much speed do we need anyway? Even if hand written
> assembler was faster than compiled code, which it is not, these
> things need to be put into perspective. A modern high end
> desktop PC will be faster than an old Cray One (400Mhz. I think)
> in _less_ than two years.

I don't think there's much of anything that really needs to
be done in assembler, but I think there is no beating it
when it comes to getting the "feel" of how computers really
work.

> >I chalked this up to the lack of the fundamentals being taught,
> >and the students having their brains filled up so much with
> >abstractions that they don't understand how to solve problems
> >anymore.
> 
> It was probably just lack of what you consider to be fundamental,
> however, with the very few exceptions, straight assembler will be
> almost totally unused, anywhere, twenty years from now. 

I should say that my test had them render solutions in C.  I
gave them a moderately easy but not trivial algorithm to
implement, and they just plain couldn't do it.

> >This is why I think assembly is the better way to teach
> >algorithms; it's just you and the algorithm.  It forces them
> >to really think about the problem, because they don't have any
> >"training wheels" to protect them from the problem.
> >
> 
> Very few people have good problem solving abilities, I agree
> but a lot of this minority have developed this ability themselves,
> or inherited it from their parents (or a bit of both.)
> 
> Out of these, almost nobody has derived sound software engineering
> methods from first principles. They have to be taught.

Electronic Engineering seems to do pretty well from "first
principles".  I would say the average competency level of
an EE grad is *much* higher than the average CS grad.

> [more interesting stuff snipped]
> >Whatever were doing now is *not working*, let me tell you.
>[snip] 
> Inexperienced (or 
> people with narrow experience) but talented
> problem solvers, in my experience, often cause more damage
> than they should, because they are more likely to just say,
> "I can see how to do it!" and start coding. Yet all these 
> problems, where it is possible to humanly achieve a viable 
> solution using this technique are drying up. Unfortunately,
> these problem solvers, having relied on their wits since the
> day they first started to think, are the last people to step
> back and think, "Is there another way of doing this?"

I completely disagree!  Someone who is an "algorithm memorizer"
is *much* less likely to go back and think "Is there another
way to do this," because they by definition do not think
about their solutions.  They simply look them up in the book,
and if the book says that's the best, well, no need to go
further.

Take John Carmack, the 3D game engine programmer of Doom and
Quake.  An astoundingly talented guy, and definitely of
the "talented problem solver" class.  According to a mag
article, he has written over 20 gaming engines in search of
the most optimal implementation.

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-07  0:00                                     ` Peter Seebach
@ 1996-08-08  0:00                                       ` Tim Behrendsen
  1996-08-08  0:00                                         ` Peter Seebach
                                                           ` (2 more replies)
  0 siblings, 3 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-08  0:00 UTC (permalink / raw)





Peter Seebach <seebs@solutions.solon.com> wrote in article
<4uah1k$b2o@solutions.solon.com>...
> In article <01bb83f5$923391e0$87ee6fce@timpent.airshields.com>,
> Tim Behrendsen <tim@airshields.com> wrote:
> >I agree; my point is that I think the student learns more if they
> >are thinking purely in terms of fundamental operations (which are
> >still abstractions above the raw hardware components), rather
> >than layers and layers of syntax that hide the essential essence
> >of what the computer is doing.
> 
> But your concept of what the "fundamental operations" are is completely
> tied up in how specific hardware you've seen operates.  Algorithms exist
> in terms of abstract operations, not moves and branches.

Please name me the hardware that does not use moves and branches.

> Something like CWEB might be a good language in which to learn
> algorithms.  Scheme might too.
> 
> Let's look at a specific algorithm; the infamous gcd(x, y).
> 
> In C, we write
> 
> 	int
> 	gcd(int x, int y) {
> 		if (y == 0)
> 			return x;
> 		return gcd(y, (x % y));
> 	}
> 
> or something similar.
> 
> What's important here is not any theories people may have about where
> the x and y are stored, or how a stack works, but the concept that you
> can define an algorithm in terms of a previous case.  Learning that the
> if() may be implemented with "a branch" does not help the student
> understand how the *algorithm* works, it helps the student understand
> how the *compiler* works.  These are distinct things.

This is an interesting case, because it is somewhat inefficiently
implemented.  If you're interested in speed, you would do...

int gcd(int x, int y) {
    int z;
    while (y != 0) {
        z = y; y = x % y; x = z;
    }
    return(x);
}

Using my AIX compiler, I get a nominal improvement of about
10%, mostly because the speed of the modulo is much slower
than the inefficiency of recursion.

My point is that how does a C programmer know that recursive
implementations are somewhat inefficient?  Recently the C/C++
Users Journal had a little bite-sized article about a
general looping algorithm, where you could specify the number
of nested loops in a dynamic fashion.  It was implemented
recursively.  I wrote a letter back reimplementing the
algorithm non-recursively, and got like a 40% increase in
speed.

These sorts of issues are where code bloat comes from, and
it comes from naive implementations of "valid C syntax".

> >If programming is reduced to fundamentals of move, arithmetic,
> >test, branch, etc it prevents the student from leaning on the
> >abstraction rather than truly understanding the solution to the
> >problem.  In other words, if they can express it in the above
> >terms, you *know* they understand the solution.
> 
> But it also prevents them from learning the abstraction, and truly
> understanding the *principle* of the solution.

If the C abstraction is good, more abstraction must be better
then.  How about we teach everything in APL, where we can
*really* abstract away the details?  No data types, full
array operations.  Talk about easy quicksort!  I can write it
in one line of code using a handful of operations (my APL
is *really* rusty, so I can't give the example).

The student learns Quicksort, there is no question about it.
But what have they *really* learned?

[snip]
> Nonsense.  A well designed and considered abstraction will generally lend
> itself to an efficient and elegant implementation.  An ill-considered
> abstraction will spend more time on cruft than it will on solving the
> problem.

Like recursive algorithms?

[snip]
> >It is simply not
> >possible to ignore the way code is structured, and completely
> >depend on the compiler to save us.  That is just not the
> >real world.
> 
> No, but I have never seen a good algorithm that didn't lend itself to
> elegant and efficient implementation.  If we do decent designs first,
> and worry about implementation second, we will find the implementation
> to be pleasant, easy, and efficient.

Well, now you have (see above) :)

> >At least not my world, where I have to pack as many users
> >as possible onto one CPU.
> 
> Compare the *efficiency* for this purpose of Unix, which is designed, and
> MS-DOS, which has had millions of dollars thrown at making it as
> efficient as possible.  Compare also things like NT and Berkeley Unix.
> You *have* to do your design in theoretical terms before you think about
> implementation, or you end up with a system where 32 megs is seen as too
> small to run multiple users, and you need a third party add-on to do it
> anyway.
> 
> The Berkeley people have thrown out more code than the system has in it.
> The algorithms are designed based on the *algorithmic* weaknesses of
> previous designs.  Hardware efficiency is evaluated only after you've
> found a good algorithm.  And they get excellent performance.

Agreed; but there are general principles that can be learned
across *all* architectures.  See follow up to Dan Pop post for
example of this.

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-07  0:00                               ` Tim Behrendsen
@ 1996-08-08  0:00                                 ` Thomas Hood
  1996-08-09  0:00                                   ` Tim Behrendsen
  1996-08-17  0:00                                 ` Lawrence Kirby
  1 sibling, 1 reply; 688+ messages in thread
From: Thomas Hood @ 1996-08-08  0:00 UTC (permalink / raw)



Tim Behrendsen wrote:
<snip>
> > or to code than N lines of C.  But if you want the students to
> > understand say quicksort, it's a lot easier showing them 20 lines
> > of C than 100 lines of assembler.
> 
> Who's talking about showing them?  I would suggest that if
> they wrote a quicksort in assembler, they will have a much
> better "feel" for the algorithm, than if they wrote it in C.

Nonsense.  They will have a much better feel for how to shift stuff from
one register to another, but will never realize _why_ they are doing it.
Problem solving in anything but the most trivial of cases is the process
of moving from the specific (the problem at hand, for which we have no
solution) to the general (the algorithmic solution, which fits
the problem domain).  Accomplishing this in assembly language is
unnecessary and cruel.  Any HOL has the lexical elements necessary to
encapsulate the concept of a quicksort without the agony of dealing with
assembler.
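
To make the point concrete, here is a quicksort in roughly twenty lines
of C -- a sketch only (essentially the version in K&R), not anything
taken from the original posts:

void swap_ints(int v[], int i, int j)
{
    int t = v[i];
    v[i] = v[j];
    v[j] = t;
}

void quicksort(int v[], int lo, int hi)   /* sorts v[lo..hi] */
{
    int i, last;

    if (lo >= hi)                         /* fewer than two elements */
        return;
    swap_ints(v, lo, (lo + hi) / 2);      /* move middle element to front */
    last = lo;                            /*   and use it as the pivot    */
    for (i = lo + 1; i <= hi; i++)        /* partition around the pivot */
        if (v[i] < v[lo])
            swap_ints(v, ++last, i);
    swap_ints(v, lo, last);               /* restore pivot to final place */
    quicksort(v, lo, last - 1);
    quicksort(v, last + 1, hi);
}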


> 
> > Also, we want to get our students into the habit of writing
> > robust and reusable code, and this is very difficult in assembler.
> > At least C has malloc()/free(); with assembler, you need to write
> > the memory management from scratch.
> 
> Well, I don't think the student has to rewrite all the library
> routines!  Nothing precludes you from using them from assembler.

If you are allowing them to use higher level constructs (abstractions)
to accomplish lower level tasks (solve specific problems), then you
are upholding the very concept you are arguing against!

> 
> -- Tim Behrendsen (tim@airshields.com)

-- 
Thomas Hood
Senior Program Manager - Chief Engineer
Former Ada Geek
Neptune Interactive Communications
703.924.9234 x250
thomas@nicom.com
http://thomas.nicom.com




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-08  0:00                                       ` Tim Behrendsen
@ 1996-08-08  0:00                                         ` Peter Seebach
  0 siblings, 0 replies; 688+ messages in thread
From: Peter Seebach @ 1996-08-08  0:00 UTC (permalink / raw)



In article <01bb853c$be3b2620$87ee6fce@timpent.airshields.com>,
Tim Behrendsen <tim@airshields.com> wrote:
>Peter Seebach <seebs@solutions.solon.com> wrote in article
><4uahfe$bao@solutions.solon.com>...
>> Huh?
>> The basics of algorithms are hidden from them, when they are sitting
>> around counting bytes and remembering the mnemonics for operations.

>> Assembly has more syntax per operation done than anything else.

>But the syntax is very straight forward and direct. There are
>very few operations in assembly that need to be learned.

Fine.  Use scheme or lisp.  Or C.  I betcha C has fewer operators,
keywords, and standard library functions than most processors have
instructions.  :)

Sure, you don't need all of the instructions.  You also don't need about
80% of C for most teaching work.

>> C will teach you at least as much about quicksort as writing it in
>> assembly.

>Yes, but will it teach you as much about *computers*?  The reality
>is that the computer does not execute C.

No, nor does it execute assembly.  It executes machine code.

One does not learn C, or any other language, to learn about computers.
If you want to learn about computers also, this is a good thing, and may
well help you.

Learning C is, IMHO, a better way to learn about computers than assembly.  It
shows you the operations, rather than the noise.
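
A tiny illustration of that distinction (a sketch, nothing more): the
single C statement names the operation; the loads, stores, index
arithmetic, and branch it turns into are the part assembly forces a
beginner to spend attention on.

void add_arrays(int *a, const int *b, const int *c, int n)
{
    int i;
    for (i = 0; i < n; i++)
        a[i] = b[i] + c[i];   /* underneath: loads, an add, a store,
                                 an increment, and a branch per element */
}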

>> Algorithmic analysis cannot be more important than abstractions,
>> because it's a subset of abstractions.

>That is simply not true!  That is like saying that thought is
>a subset of language.  Thoughts are expressed in language,
>but they are not dependent on a particular language.

Not all thoughts are expressed in language.  If they were, thoughts would
be a kind of language, distinguished from other kinds by some set of traits.

All algorithmic analysis analyzes the *principles* of algorithms, not the
details of specific real implementations.

>I don't know about anybody else, but when I think about an
>algorithm, I have a visualization of the data moving around
>and going through transformations.  I get a feel for the
>efficiency by thinking about how much work is involved in doing
>the movement/transformations.  I want students learning these
>concepts for the first time to get this same feel, without
>vaguely memorizing algorithms, which is what I think happens
>now.

I haven't memorized an algorithm in my life.  If I want to do something, I
either think about how it would be done, or I copy it from a likely source and
ignore it.  Frequently, an understanding of the algorithm is not necessary to
what I need it to do.  I also don't understand the file system structures used
by any of my computers, nor do I understand the scheduling algorithms used by
any of them particularly well.

I tend to agree with you on this.  I just don't think that assembly is the
best way to teach these visualizations.

>I think one of the reasons algorithms get memorized rather
>than learned is that they are protected too much by the
>abstraction of arrays, rather than the reality of memory.

Memory is not a reality, it's an implementation.  A student's comprehension
of data had *better* not depend on memory, or that student will choke badly on
trying to implement merge sort on tapes.

Arrays are *one kind* of data structure.  The belief that data is always
conveniently available in some sort of table or memory is dangerous, too.
(For that matter, an elegant and efficient way to sort an array fails
miserably on a linked list, even though both may be stored in memory.)
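
One small sketch of why the array version doesn't carry over
(illustrative only): the element swap an in-place array sort leans on is
constant time, while merely reaching the i-th node of a singly linked
list already costs a walk down the list.

#include <stddef.h>

struct node { int value; struct node *next; };

void swap_in_array(int *a, size_t i, size_t j)
{
    int t = a[i]; a[i] = a[j]; a[j] = t;      /* O(1) */
}

int *nth_value(struct node *head, size_t n)   /* O(n) just to get there */
{
    while (n-- > 0 && head != NULL)
        head = head->next;
    return head != NULL ? &head->value : NULL;
}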

-s
-- 
Peter Seebach - seebs@solon.com - Copyright 1996 - http://www.solon.com/~seebs
Unix/C Wizard - send mail for help, or send money for consulting!
The *other* C FAQ, the hacker FAQ, et al.  See web page above.
Unsolicited email (junk mail and ads) is unwelcome, and will be billed for.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-08  0:00         ` Peter Seebach
@ 1996-08-08  0:00           ` Tim Behrendsen
  1996-08-08  0:00             ` Peter Seebach
  1996-08-10  0:00           ` Mike Rubenstein
  1 sibling, 1 reply; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-08  0:00 UTC (permalink / raw)



Peter Seebach <seebs@solutions.solon.com> wrote in article
<4ud8oo$61t@solutions.solon.com>...
> In article <01bb8536$892ee260$87ee6fce@timpent.airshields.com>,
> Tim Behrendsen <tim@airshields.com> wrote:
> >My point is someone who has "vaguely" learned five algorithms has
> >simply memorized them, and learned nothing about the general
> >principles of how algorithms are created.
> 
> Which is exactly what will happen if they code them in assembly; they
> can't possibly be learning how algorithms are created if they start with
> implementing them in a low level language.  Algorithms are generally
> written in natural languages, and written on napkins.

What? I guess before HLLs nobody "possibly" learned anything.

I don't know about you, but I think about algorithms in terms of
data flow and manipulation.  HLLs or Assembly don't enter into
either.


> The details of which bytes move where are *not* part of the algorithm
> itself.

What What?  That's what algorithms *are*!  What are algorithms
if they aren't black boxes that describe a particular output for
a particular input?  I don't know about your I/O, but mine are
in bytes.

> >Truly understanding two algorithms is better than memorizing five
> >algorithms, because that is what *gives" the fundamental
> >understanding.
> 
> But experimenting with the five algorithms, and comparing them, is a
> better way to truly understand them than implementing them in a
> difficult language.

I agree that I would rather have the student experiment with
*anything* rather than just crank out homework assignments with
no thought involved.

We disagree however on the difficulty of assembly.  I think it
is *much* easier, albeit more tedious, than a HLL.

> Sure.  Implementing I/O routines and stacks, *in any language*, is a
> good way to learn about I/O routines and stacks.  It's a crappy way to
> learn about sorting.
> 
> Further, learning about stacks is a bad way to learn about computing;
> stacks are not a universal implementation of computers.  Students who
> learn about stacks early on may start assuming that that's somehow a
> basic truth of computing.  They may do things like assert that the
> addresses of local variables in one function are always lower than the
> addresses of local variables in a function "above" it on the stack.  Or
> maybe that should be higher; both kinds exist.

But that contradicts the C standard.  You're saying that total
ignorance is better than partial ignorance, and I might
even agree with that.  But I don't think it's reasonable to
argue that total ignorance is better than adequate knowledge.

In fact, I might even argue it's better to expose those
mistakes early on, rather than graduate them completely innocent
of these issues, and then when they need to do some assembly,
they pick up the bad habits.

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-07-28  0:00                 ` J. Christian Blanchette
                                     ` (4 preceding siblings ...)
  1996-08-08  0:00                   ` William Clodius
@ 1996-08-08  0:00                   ` William Clodius
  1996-08-13  0:00                   ` Ole-Hjalmar Kristensen FOU.TD/DELAB
                                     ` (3 subsequent siblings)
  9 siblings, 0 replies; 688+ messages in thread
From: William Clodius @ 1996-08-08  0:00 UTC (permalink / raw)



In article <01bb853b$ca4c8e00$87ee6fce@timpent.airshields.com> "Tim
Behrendsen" <tim@airshields.com> writes: 

   Peter Seebach <seebs@solutions.solon.com> wrote in article
   <4uah1k$b2o@solutions.solon.com>...
<snip>
   > 
   > But your concept of what the "fundamental operations" are is completely
   > tied up in how specific hardware you've seen operates.  Algorithms exist
   > in terms of abstract operations, not moves and branches.

   Please name me the hardware that does not use moves and branches.

Show me an algorithm, not an implementation of an algorithm, that uses
moves and branches. Your phrasing has been poor on this thread. When
you say you want students to learn algorithms you seem to be really
saying you want students to learn how to implement a given algorithm
efficiently. They are typically two different things. The proper
algorithm can save orders of magnitude in runtime, the proper
implementation smaller, but sometimes important, factors of two or
three. 
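
The rough numbers behind "orders of magnitude" (illustrative arithmetic
only): for n of one million, an O(n^2) method performs on the order of
10^12 basic steps, an O(n log n) one on the order of 2 x 10^7 -- a gap of
tens of thousands that no constant-factor tuning of the slower method
will close.

#include <math.h>
#include <stdio.h>

int main(void)
{
    double n = 1.0e6;
    printf("n*n       = %.3g\n", n * n);                    /* ~1e12 */
    printf("n*log2(n) = %.3g\n", n * log(n) / log(2.0));    /* ~2e7  */
    return 0;
}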

   > Something like CWEB might be a good language in which to learn
   > algorithms.  Scheme might too.
   > 
   > Let's look at a specific algorithm; the infamous gcd(x, y).
   > 
   > In C, we write
   > 
   > 	int
   > 	gcd(int x, int y) {
   > 		if (y == 0)
   > 			return x;
   > 		return gcd(y, (x % y));
   > 	}
   > 
   > or something similar.
   > 
   > What's important here is not any theories people may have about where
   > the x and y are stored, or how a stack works, but the concept that you
   > can define an algorithm in terms of a previous case.  Learning that the
   > if() may be implemented with "a branch" does not help the student
   > understand how the *algorithm* works, it helps the student understand
   > how the *compiler* works.  These are distinct things.

   This is an interesting case, because it is somewhat inefficiently
   implemented.  If you're interested in speed, you would do...

   int gcd(int x, int y) {
       int z;
       while (y != 0) {
           z = y; y = x % y; x = z;
       }
       return(x);
   }

   Using my AIX compiler, I get a nominal improvement of about
   10%, mostly because the speed of the modulo is much slower
   than the inefficiency of recursion.

   My point is that how does a C programmer know that recursive
   implementations are somewhat inefficient?  Recently the C/C++
   Users Journal had a little bite-sized article about a
   general looping algorithm, where you could specify the number
   of nested loops in a dynamic fashion.  It was implemented
   recursively.  I wrote a letter back reimplementing the
   algorithm non-recursively, and got like a 40% increase in
   speed.

Actually the efficiency of Peter's implementation depends on the
optimizing capability of the compiler. Many compilers recognize his
example as employing tail recursion and eliminate the recursion by
replacing the recursive calls with the appropriate iterations. This
capability is required for Scheme compilers for example. Did you try
full optimization on your AIX compiler? There are of course problems
for which recursive algorithms can not employ such an obviously
optimizable construction. 

   These sorts of issues are where code bloat comes from, and
   it comes from naive implementations of "valid C syntax".

Are there really significant differences in the resulting code sizes?
Myself, I believe that code bloat is primarily due to feature bloat,
and old programs that are not completely rewritten to remove unneeded
code.

   > >If programming is reduced to fundamentals of move, arithmetic,
   > >test, branch, etc it prevents the student from leaning on the
   > >abstraction rather than truly understanding the solution to the
   > >problem.  In other words, if they can express it in the above
   > >terms, you *know* they understand the solution.
   > 
   > But it also prevents them from learning the abstraction, and truly
   > understanding the *principle* of the solution.

   If the C abstraction is good, more abstraction must be better
   then.  How about we teach everything in APL, where we can
   *really* abstract away the details?  No data types, full
   array operations.  Talk about easy quicksort!  I can write it
   in one line of code using a handful of operations (my APL
   is *really* rusty, so I can't give the example).

Sounds good to me, but J, NIAL, or NESL might be better. Or a higher
order language such as Scheme or Clean.

   The student learns Quicksort, there is no question about it.
   But what have they *really* learned?

They have the chance to implement Heapsort and Insertion sort, and find out
how all three scale with problem size and character. (Let them run a
simple heapsort on a previously sorted array, or a large number of
small arrays.)

   [snip]
   > Nonsense.  A well designed and considered abstraction will generally lend
   > itself to an efficient and elegant implementation.  An ill-considered
   > abstraction will spend more time on cruft than it will on solving the
   > problem.

   Like recursive algorithms?

Wasn't it Knuth who said something like, "Premature optimization is
the root of all evil"? We should write code that, to the greatest
extent possible is self explanatory, does the job that is needed, and
where computationally intensive parts of the code are expected, an
appropriately chosen algorithm. Then if testing shows a performance
problem, identify the problem routines and clean them up.

<snip>

Followups directed to comp.edu
-- 

William B. Clodius		Phone: (505)-665-9370
Los Alamos National Laboratory	Email: wclodius@lanl.gov
Los Alamos, NM 87545




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-08  0:00                                         ` Peter Seebach
@ 1996-08-08  0:00                                           ` Tim Behrendsen
  1996-08-08  0:00                                             ` Peter Seebach
  1996-08-14  0:00                                             ` Richard A. O'Keefe
  1996-08-09  0:00                                           ` Chris Sonnack
  1 sibling, 2 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-08  0:00 UTC (permalink / raw)



Peter Seebach <seebs@solutions.solon.com> wrote in article
<4udb2o$7io@solutions.solon.com>...
> >> In C, we write
> 
> >> 	int
> >> 	gcd(int x, int y) {
> >> 		if (y == 0)
> >> 			return x;
> >> 		return gcd(y, (x % y));
> >> 	}
> 
> >This is an interesting case, because it is somewhat inefficiently
> >implemented.  If you're interested in speed, you would do...
> 
> >int gcd(int x, int y) {
> >    int z;
> >    while (y != 0) {
> >        z = y; y = x % y; x = z;
> >    }
> >    return(x);
> >}
> 
> Yes, you would.  But so would at least one compiler I use.  Internally.

Which compiler is that?  I would like to see the compiler that
unravels recursion into non-recursive algorithms.
 
> It doesn't *matter*.  The realization is that, for non-zero x and y, the
> greatest common divisor of x and y is conveniently the greatest common
> divisor of y and (x mod y).  *THAT* is the algorithm.  The rest is fluff
> and nonsense.  You can do that operation on paper, too.
> 
> >Using my AIX compiler, I get a nominal improvement of about
> >10%, mostly because the speed of the modulo is much slower
> >than the inefficiency of recursion.
> 
> So?
> 
> What numbers are students entering that we *CARE* about the 10% speed
> difference?  Optimization is an interesting thing to look at, but has
> nothing to do with algorithms.

That is ivory tower insanity!  :)

It has *everything* to do with algorithms.  You showed me a recursive
algorithm, I rewrote it to be faster.  The fact that in this case
it's only 10% is irrelevant; for another algorithm it could have
been much more (compare recursive vs. non-recursive quicksort).

I consider a recursive solution to be a different algorithm than a
non-recursive one, although I realize that is arguably an
implementation issue.

Not teaching students about optimization is why we have such
a code-bloat problem right now.  They think it makes no difference
how something is coded.

> >My point is that how does a C programmer know that recursive
> >implementations are somewhat inefficient?
> 
> The C programmer doesn't.  And at least one compiler has been known
> to optimize tail-recursion into iteration, sometimes even in the
> 2-function case.

Big deal, one compiler does it, and I would like to see how
it handles more complex cases.  The point is that in the
real world, these issues make big differences.

> >These sorts of issues are where code bloat comes from, and
> >it comes from naive implementations of "valid C syntax".
> 
> No, they aren't.  All of those put together wouldn't explain code bloat.
> 
> We are seeing programs that are a factor of ten, or more, larger than we
> think they are.  These would have to be *mind-numbing* idiocy.
> 
> Microoptimizing will not fix it.  Teaching people about *real* design
> principles will.

What "real" design principles are you referring to?

If every algorithm is written efficiently and correctly, the
cumulative effect is a well written product.

> It's worth observing good programmers rewriting.  A good programmer will
> add features and cut the size of a program slightly; a bad programmer
> will bloat it hugely to add the feature.  (Assuming the original program
> to be mediocre.)
> 
> A good design will save more space and time than any amount of
> optimization of a bad design.

Agreed.  But a bad implementation of a good design will destroy
a product just as easily.

> >The student learns Quicksort [in APL], there is no question about it.
> >But what have they *really* learned?
> 
> They've learned the algorithm itself.  If you want to teach them about
> optimization and efficiency, go right ahead, but distinguish O(n) vs
> O(N^2) type efficiency from the kinds of efficiency you're talking about.
> 
> Remember, that O() includes arbitrary constant multipliers and additions.
> 8N is just the same as N in complexity.
> 
> I think it's a cool idea to teach people how to turn an 8N into a 4N
> algorithm.  I don't think it's part of the algorithm, I think it's a
> different thing.

Agreed.  That's why I think that starting out implementing
algorithms so that they inherently see the flow of the data
helps to understand these issues.  If you're doing quicksort,
*data moves* no matter what the architecture is, and the more
movement, the less efficiency.

If I implemented Quicksort in APL, say (which has direct
manipulation of arrays in one operation), the student would
not see the movement, because they would just see the computer
doing the array "in one fell swoop", but wouldn't really
experience the fact that the computer doesn't really do it
that way in reality.

> >> Nonsense.  A well designed and considered abstraction will generally
> >> lend itself to an efficient and elegant implementation.  An
> >> ill-considered abstraction will spend more time on cruft than it will
> >> on solving the problem.
> 
> >Like recursive algorithms?
> 
> I'm not sure what you're referring back to.  Recursion is one of the
> first things I look to eliminate if I have an expensive implementation
> of what I know to be a good algorithm.  But if the cost is small, it
> tends to stay, because it's frequently obvious how it's supposed to work.

But why do you look to eliminate recursion in the expensive case?
It is part of the C standard, it is perfectly valid C syntax.

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-08  0:00                                           ` Tim Behrendsen
@ 1996-08-08  0:00                                             ` Peter Seebach
  1996-08-14  0:00                                             ` Richard A. O'Keefe
  1 sibling, 0 replies; 688+ messages in thread
From: Peter Seebach @ 1996-08-08  0:00 UTC (permalink / raw)



In article <01bb8569$9910dca0$87ee6fce@timpent.airshields.com>,
Tim Behrendsen <tim@airshields.com> wrote:
>Peter Seebach <seebs@solutions.solon.com> wrote in article
><4udb2o$7io@solutions.solon.com>...
>> >> In C, we write

>> >> 	int
>> >> 	gcd(int x, int y) {
>> >> 		if (y == 0)
>> >> 			return x;
>> >> 		return gcd(y, (x % y));
>> >> 	}

>> >This is an interesting case, because it is somewhat inefficently
>> >implemented.  If you're interested in speed, you would do...

>> >int gcd(int x, int y) {
>> >    int z;
>> >    while (y != 0) {
>> >        z = y; y = x % y; x = z;
>> >    }
>> >    return(x);
>> >}

>> Yes, you would.  But so would at least one compiler I use.  Internally.

>Which compiler is that?  I would like to see the compiler that
>unravels recursion into non-recursive algorithms.

gcc handles trivial tail recursion, or so we're told.

>> >Using my AIX compiler, I get a nominal improvement of about
>> >10%, mostly because the speed of the modulo is much slower
>> >than the inefficiency of recursion.

>> What numbers are students entering that we *CARE* about the 10% speed
>> difference?  Optimization is an interesting thing to look at, but has
>> nothing to do with algorithms.

>That is ivory tower insanity!  :)

It is also the essence of learning the difference between algorithm and
implementation, which are very different arts.

>It has *everything* to do with algorithms.  You showed me a recursive
>algorithm, I rewrote it to be faster.  The fact that in this case
>it's only 10% is irrelevant; for another algorithm it could have
>been much more (compare recursive vs. non-recursive quicksort).

>I consider a recursive solution to be a different algorithm than a
>non-recursive one, although I realize that is arguably an
>implementation issue.

For the greatest common divisor, it certainly is.  It's like realizing
that you can say "a %= b" instead of "while (a > b) a -= b;".  It doesn't
change the *meaning*, only the representation.

>Not teaching students about optimization is why we have such
>a code-bloat problem right now.  They think it makes no difference
>how something is coded.

That doesn't make them identical fields of study.  Algorithms and
optimization are two different things.

>Big deal, one compiler does it, and I would like to see how
>it handles more complex cases.  The point is that in the
>real world, these issues make big differences.

Then real-world programmers will, rather than guessing about what compilers
are and aren't smart enough to do, try several implementations and benchmark
them.

>> Microoptimizing will not fix it.  Teaching people about *real* design
>> principles will.

>What "real" design principles are you referring to?

The basic questions of what data types to use to represent your model, and
what model to use to represent your problem.  If I design a word processor
as a set of operations, each of which is always done in sequence, I may end
up with a loop like
	while (1) {
		update_menu();
		update_display();
		update_cursor();
		...
		process_last_command();
		get_command();
	}

in which each command is processed by setting flags, which each of the various
update functions reacts to.  For instance, when I select "bold", the function
which redraws the entire display every time through the loop gets a flag set
to embolden something.

You could make your display algorithm take less than 1% of the time a "normal"
one would, and this program will *still* suck.

>If every algorithm is written efficiently and correctly, the
>cumulative effect is a well written product.

Unless the basic design is fundamentally flawed.

Lest you say that the above is too stupid to be imagined, let's remember how
much harder good design gets as the goals get larger and more complicated.

Further, I have seen *an actual commercial product* whose inner loop is
essentially a large set of nested switch statements to process the cursor
location in terms of which menus are active.  An actual reported bug, since
fixed, was that moving the mouse over a certain button *without clicking on
it* inserted text if you had previously selected a certain other button.

The person who designed that *didn't* design it, and no amount of improvement
in the "efficiency" will ever correct that.

>Agreed.  But a bad implementation of a good design will destroy
>a product just as easily.

I don't think so.  It'll be easier to correct, for once.

>Agreed.  That's why I think that starting out implementing
>algorithms so that they inherently see the flow of the data
>helps to understand these issues.  If you're doing quicksort,
>*data moves* no matter what the architecture is, and the more
>movement, the less efficiency.

Perhaps; however, counting move instructions in the code won't show you how
much the data moves.  A "good" bubblesort may easily have fewer moves.

>If I implemented Quicksort in APL, say (which has direct
>manipulation of arrays in one operation), the student would
>not see the movement, because they would just see the computer
>doing the array "in one fell swoop", but wouldn't really
>experience the fact that the computer doesn't really do it
>that way in reality.

So talk to them about it.  Take the opportunity to clarify something about
APL.

>But why do you look to eliminate recursion in the expensive case?
>It is part of the C standard, it is perfectly valid C syntax.

Because my experience has shown that, often, it produces slow code, and
I can produce better results iteratively.  But I don't worry about speed until
I've shown that my design is functional.

-s
-- 
Peter Seebach - seebs@solon.com - Copyright 1996 - http://www.solon.com/~seebs
Unix/C Wizard - send mail for help, or send money for consulting!
The *other* C FAQ, the hacker FAQ, et al.  See web page above.
Unsolicited email (junk mail and ads) is unwelcome, and will be billed for.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-08  0:00           ` Tim Behrendsen
@ 1996-08-08  0:00             ` Peter Seebach
  1996-08-09  0:00               ` Tim Behrendsen
  0 siblings, 1 reply; 688+ messages in thread
From: Peter Seebach @ 1996-08-08  0:00 UTC (permalink / raw)



In article <01bb8567$4adddbc0$87ee6fce@timpent.airshields.com>,
Tim Behrendsen <tim@airshields.com> wrote:
>Peter Seebach <seebs@solutions.solon.com> wrote in article
><4ud8oo$61t@solutions.solon.com>...
>> Which is exactly what will happen if they code them in assembly; they
>> can't possibly be learning how algorithms are created if they start with
>> implementing them in a low level language.  Algorithms are generally
>> written in natural languages, and written on napkins.

>What? I guess before HLLs nobody "possibly" learned anything.

Probably not.  We have *no* evidence that people learned or comprehended *any*
algorithms before we developed HLL's.  The one I use for more than 90% of my
discussion and analysis of algorithms is called "English", though this is
arguably a misnomer, as I'm not.

>I don't know about you, but I think about algorithms in terms of
>data flow and manipulation.  HLLs or Assembly don't enter into
>either.

I think about them in terms of descriptions of rules for behavior.

Cantor's diagonal proof is an algorithm.  It's trivially unimplementable;
however, it's a brilliant algorithm for generating a result from an input,
despite the fact that neither the input nor the result could possibly be
stored anywhere.
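
(A sketch of what is meant, with made-up names: if f(i, j) is the j-th binary
digit of the i-th enumerated sequence, the diagonal rule is one line, even
though the full input and output can never be stored.)

int diagonal_digit(int (*f)(int, int), int i)
{
    return 1 - f(i, i);     /* differ from the i-th sequence at position i */
}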

>> The details of which bytes move where are *not* part of the algorithm
>> itself.

>What What?  That's what algorithms *are*!  What are algorithms
>if they aren't black boxes that describe a particular output for
>a particular input?  I don't know about your I/O, but mine are
>in bytes.

Exactly.  You're thinking of the *implementation*.  When you say "swap a and
b", it doesn't *MATTER*, in terms of what swapping does, whether this is
implemented by 3 xor's, by 3 assigns with a temporary, or by leaving the
values where they are and swapping the names of a and b.
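
(An illustrative sketch only: two of the three spellings in C; the third,
swapping the *names* of a and b, is something only a compiler can do.)

void swap_with_temp(int *a, int *b)
{
    int t = *a; *a = *b; *b = t;
}

void swap_with_xor(int *a, int *b)
{
    if (a != b) {            /* the xor trick breaks if a and b alias */
        *a ^= *b;
        *b ^= *a;
        *a ^= *b;
    }
}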

>I agree that I would rather have the student experiment with
>*anything* rather than just crank out homework assignments with
>no thought involved.

*stunned silence*

We agree on something?  Someone say the H word, this thread is nearly gone.

>We disagree however on the difficulty of assembly.  I think it
>is *much* easier, albeit more tedious, than a HLL.

Huh.  Well, see, you're probably comfortable with it.  I can't *imagine* being
able to keep track of all of that.

>> Further, learning about stacks is a bad way to learn about computing;
>> stacks are not a universal implementation of computers.  Students who
>> learn about stacks early on may start assuming that that's somehow a
>> basic truth of computing.  They may do things like assert that the
>> addresses of local variables in one function are always lower than the
>> addresses of local variables in a function "above" it on the stack.  Or
>> maybe that should be higher; both kinds exist.

>But that contradicts the C standard.

What does?  C can be implemented on stack-free machines.  Remember, there's no
guarantee that the relationals (lt, gt, lte, gte) are meaningful *at all* to
pointers not into (or just past the end of) a single object.
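
(A small sketch of the distinction, assuming only standard C; names are
invented for the example.)

int points_into(const int *p, const int *base, int n)
{
    /* Defined: p and base + n refer to (or just past) the same array. */
    return p >= base && p < base + n;
}

/* By contrast, comparing &local_in_f with &local_in_g -- two distinct
   objects -- has no defined result, however the stack happens to be laid
   out on one particular machine. */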

>You're saying that total
>ignorance is better than partial ignorance, and I might
>even agree with that.  But I don't think it's reasonable to
>argue that total ignorance is better than adequate knowledge.

No.  I just have *very* high standards for the adequate knowledge that makes
it safe for a programmer in a high level language to have beliefs about an
alleged understanding of "what really happens".

>In fact, I might even argue it's better to expose those
>mistakes early on, rather than graduate them completely innocent
>of these issues, and then when they need to do some assembly,
>they pick up the bad habits.

If they've been taught well, they'll see how the operations they already
understand are happening, rather than use the assembly to fill in holes in an
incomplete understanding.

-s
-- 
Peter Seebach - seebs@solon.com - Copyright 1996 - http://www.solon.com/~seebs
Unix/C Wizard - send mail for help, or send money for consulting!
The *other* C FAQ, the hacker FAQ, et al.  See web page above.
Unsolicited email (junk mail and ads) is unwelcome, and will be billed for.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-08  0:00         ` Szu-Wen Huang
@ 1996-08-08  0:00           ` Tim Behrendsen
  1996-08-09  0:00             ` Szu-Wen Huang
  1996-08-09  0:00           ` some days weren't there at all
  1996-08-10  0:00           ` Mike Rubenstein
  2 siblings, 1 reply; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-08  0:00 UTC (permalink / raw)



Szu-Wen Huang <huang@mnsinc.com> wrote in article
<4ud8m5$ka5@news1.mnsinc.com>...
> Tim Behrendsen (tim@airshields.com) wrote:

> : My point is someone who has "vaguely" learned five algorithms has
> : simply memorized them, and learned nothing about the general
> : principles of how algorithms are created.
> 
> No, your point was writing an algorithm in assembly helps to understand
> it fully.  I'll be my own case in point.  I remember most sorting
> algorithms vaguely, and I probably could not implement all of them
> off the top of my head.  Does that hamper me?  No, because I know
> which book to look them up in, and that's exactly what books are for.
> The important thing, however, is that I remember that O(n^2) is bad
> for a sorting algorithm, and O(n lg n) is pretty good.
> 
> Now, what about your student that knows *one* algorithm?

That's fine, anybody can look something up in a book.  I want
people working for me that can *think*, not just look it
up in the book.  Otherwise, how do you know when a
particular algorithm is appropriate for a particular problem?

When doing large projects, it's not just a matter of reaching
into the bag o' algorithms and pulling it out.  It takes
careful analysis, and the best way a student learns *that*
is to really go through the process of taking a problem and
seeing how the problem translates into data movement, etc.

What we're doing now is *sure* not working.

> : Someone who has "the feel" for two algorithms very strongly are
> : more likely to be able to extend that knowledge to creating new
> : algorithms.
> 
> Uhm, theoretically.  Somebody with only in-depth knowledge of run-
> length encoding is not very likely to come up spontaneously with
> Huffman or arithmetic compression.

Perhaps not "spontaneously", but you have them thinking about
compression issues.  "Thinking" is the key word, not just
memorizing an algorithm that they will forget after the final.

> : Computer Science is one of the few, if not only, sciences where a
> : student can derive all of it just by thinking.
> 
> Assuming the student doesn't have a deadline and all day to think for
> 40 years.  Many algorithms are obvious.  Insertion sort, for example,
> is what a human would do (if somewhat parallelized).  Quicksort, on
> the other hand, is nowhere near intuitive.  Be realistic.

I'm not saying everyone *does* derive all of it, but the
best programmers, even if they can't remember the exact
details of an algorithm, can rederive it by thinking
about the problem.

And I wouldn't say Quicksort is not intuitive, it is a logical
extension of recursive tree algorithms.  Not obvious, but
not unintuitive.  Lempel-Ziv or even the 'diff' algorithm
I might agree.

> Are you asserting that teaching software engineering instead of assembly
> language will somehow make the student unable to think?  Assembly
> language is a powerful tool best used by experts, not beginners.
> Beginners do not have the knowledge to use them properly, and tend
> to get caught in details and miss the big picture.  If you give an
> assignment of "implement quicksort in PC assembly", how much effort
> is spent in:
> 
> 1.  the algorithm
> 2.  I/O
> 3.  OS/platform intricacies
> 4.  debugging
> 
> ?  I would imagine your goal would be to maximize #1, which is why
> a HLL is better for the purpose.

Not only am I asserting it, let me say in the strongest possible
terms that students are being graduated with CS degrees who are
unable to think.  I hate to keep harping on my
test, but on one particular hiring cycle, out of 50 applicants
49 were unable to create an algorithm for a problem they had
not seen before.  I hired the one guy, and he has been a great
success.

> : Truly understanding two algorithms is better than memorizing five
> : algorithms, because that is what *gives" the fundamental
> : understanding.
> 
> Fundamental understanding of RLE has little, if not nothing, to do
> with Huffman compression.  Fundamental understanding of quicksort
> has nothing to do with heapsort either.

That is simply not true.  Take compression: if you have done a
thorough analysis of RLE, and understand the issues involved
in finding patterns (sequences of common bytes) and replacing
them with another encoded value, that is the basis for
more complex pattern matching/replacement.  If the student
is thinking, they will see that a large majority of the
input data is not being compressed, and this leads to wondering
if there is a general way to find more complex patterns in the
data.
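
(To pin down the RLE being discussed, a minimal sketch -- the output format
and the function name are invented for the example.)

#include <stddef.h>

/* Replace each run of a repeated byte with a (count, byte) pair. */
size_t rle_encode(const unsigned char *in, size_t n, unsigned char *out)
{
    size_t i = 0, o = 0;
    while (i < n) {
        unsigned char c = in[i];
        size_t run = 1;
        while (i + run < n && in[i + run] == c && run < 255)
            run++;
        out[o++] = (unsigned char)run;
        out[o++] = c;
        i += run;
    }
    return o;       /* bytes written; at worst 2*n for pattern-free input */
}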
 
> : > Teach a man an algorithm and some I/O routines to enter the input
> : > and display the output, then some routines to set up the stack,
> : > then some routines to initialize data,
> 
> : Yes, and wouldn't they truly understand I/O routines and stacks
> : after that?
> 
> They would, but would they understand the algorithm?

Sure; how much time do you think it takes to make one library
call to output data?  Push the parameters on the stack and
call 'printf'.  Big deal.

> : Yes, if your model is the brain-damaged 8086 model.  I personally
> : would use a 68000 to teach on because it's a nice straight-forward
> : orthogonal instruction set.
> 
> It doesn't matter.  You are burdening beginners with details that
> you could've avoided by using an HLL.  My school was contemplating
> teaching one freshman class Prolog as a first language and another
> Pascal, then switch in their second year to observe which was more
> effective.  It's too bad the experiment has practical problems,
> because it would be interesting to see if it's easier for a "recursive-
> minded" student to study iteration or the reverse.  In any case,
> you want to train students in problem solving, not why the OS requires
> value X in register Y.

I think you exaggerate the complexity of assembly.  This is like
saying "we want to train students in problem solving, not where
the commas go in the syntax."

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-07  0:00   ` Tim Behrendsen
@ 1996-08-08  0:00     ` Szu-Wen Huang
  1996-08-08  0:00       ` Tim Behrendsen
  0 siblings, 1 reply; 688+ messages in thread
From: Szu-Wen Huang @ 1996-08-08  0:00 UTC (permalink / raw)



Tim Behrendsen (tim@airshields.com) wrote:
: Fergus Henderson <fjh@mundook.cs.mu.OZ.AU> wrote in article
: <4uaqqg$203@mulga.cs.mu.OZ.AU>...

: > So which is better use of a student's time, writing 100 lines of
: > quicksort in assembler, or using C and writing 20 lines of quicksort,
: > 10 lines of insertion sort, 20 lines of heap sort, 20 lines of
: > merge sort, and 30 lines of glue to test them all out?

: I think it's more valuable to truly understand one or two algorithms,
: than to vaguely understand 5 algorithms.

1.  Being able to write algorithm X in assembly doesn't make you
    understand the algorithm better.  It makes you understand whatever
    platform du jour your school uses better.
2.  Vaguely understanding 5 algorithms is far better than understanding
    one or two fully.  The primary objective of algorithms courses is
    to produce students that can *choose* which algorithm to use,
    not to produce students who can memorize one or two algorithms.
    In fact, there's rarely a programmer more dangerous than one that
    has no broad knowledge of algorithms.

[snip]
: > Which will give them experience of the idea of
: > having multiple implementations of the same interface?  Which will
: > teach them how to write portable code?  Which will give them more
: > experience with the sort of software engineering problems they are
: > likely to encounter in the Real World [tm]?

: Easily learned. Later. More important to understand the
: procedural nature of the computer at first.

If you think software engineering is easily learned, you obviously
haven't been in the real world.

: > Which will give them understanding of a range of different
: > sorting algorithms?

: I would rather they have a better fundamental understanding
: of algorithms in general.

I thought you just said it was better to "truly understand one or
two algorithms"?  Make up your mind.  General knowledge, or specific
knowledge?

: If I may extend a famous quote,

: "Teach a man an algorithm, and you have given him one solution.
: Teach a man to think, and you have given him all solutions."
[snip]

Teach a man an algorithm and some I/O routines to enter the input
and display the output, then some routines to set up the stack,
then some routines to initialize data, then some reasons why
instruction X cannot follow instruction Y, then some reasons why
a small memory model isn't enough, and you confuse the man for
life.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-06  0:00       ` Dan Pop
@ 1996-08-08  0:00         ` steidl
  0 siblings, 0 replies; 688+ messages in thread
From: steidl @ 1996-08-08  0:00 UTC (permalink / raw)



In <danpop.839353428@news.cern.ch>, Dan.Pop@cern.ch (Dan Pop) writes:
>In <4u5rqe$9gv@ns.broadvision.com> patrick@broadvision.com (Patrick Horgan) writes:
>
>>In article <01bb7fcc$c5a98de0$87ee6fce@timpent.airshields.com>, "Tim Behrendsen" <tim@airshields.com> writes:
>>> Except, IMO assembly should be used *exclusively* for the first two
>>> years of a CS degree.  The first two years is usually all algorithmic
>>> analysis, anyway.  There's nothing you can't learn about algorithms
>>> that you can't learn and learn it better doing it in assembly.
>>
>>Good point Tim!  It's sure a lot easier counting cycles in assembler.
>
>Yeah, predicting cache misses and pipe stalls is a piece of cake.
>The superpipelined processors make the problem even easier.

That's why I loved programming my CoCos (6809) - no cache misses (no
cache), no pipeline stalls (no pipeline), no complex instruction
interactions (no superscalar capabilities), no wait states (memory was
several times *faster* than the CPU :-).  But I don't live in despair,
maybe those days will return... (as in "ha ha, only serious")

>>Unfortunately, a lot of schools aren't teaching algorithmic analysis
>>anymore.
>
>If there is any connection between algorithm analysis and cycle counting,
>I definitely missed it.

Well, "way back then" I could do a quick sanity check on any given
algorithm analysis by plugging the cycles into the formula and pulling
out the old stopwatch.  Granted, doing so was typically not necessary,
but it was often-times still gratifying.  [OK, so shaving off a few
hundredths of a second fed my ego -- I'm much better now, really! ;-)]


-Jeff

steidl@centuryinter.net - http://www.dont.i.wish.com/
All opinions are my own, and are subject to change without notice.





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-07  0:00                                     ` Dan Pop
@ 1996-08-08  0:00                                       ` Tim Behrendsen
  1996-08-08  0:00                                         ` Peter Seebach
                                                           ` (2 more replies)
  1996-08-08  0:00                                       ` Christopher R Volpe
  1 sibling, 3 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-08  0:00 UTC (permalink / raw)



Dan Pop <Dan.Pop@cern.ch> wrote in article
<danpop.839450672@news.cern.ch>...
> In <01bb83f5$923391e0$87ee6fce@timpent.airshields.com> "Tim Behrendsen"
<tim@airshields.com> writes:
> 
> >The problem is that we *can't* think purely abstractly,
> >otherwise we end up with slow crap code.
> 
> Care to provide some concrete examples?

Look at the code-bloated and slow-software world we live in,
particularly on desktop platforms.  I think this is caused by
people not truly understanding what's *really* going on.

For example, look at OOP.  Very naive implementations of OOP
used a huge amount of dynamic memory allocation, which can cause
severe performance problems.  That's why I don't use C++ for my
products; to do it right you have to do a very careful analysis
of how your classes are going to fit together efficiently.

I've spoken to enough people that have had C++ disasters to
convince me that the more abstraction there is, the more
danger there is of inefficient code.  This shouldn't be that
hard to believe; any time you abstract away details you are
giving up knowledge of what is going to be efficient.

I alluded to this in another post, but a good example is Motif
and X11.  A programmer who only understands Motif, but does not
understand X11 is going to write slow crap, period.

> >It is simply not
> >possible to ignore the way code is structured, and completely
> >depend on the compiler to save us.
> 
> This doesn't make any sense to me.  Could you be a little bit more
> explicit?
> 
> The compiler definitely won't save my ass if I choose to use bubblesort
> instead of quicksort on a large dataset, but the selection between the
> two algorithms is made based on an abstraction (algorithm analysis) not
> on how the compiler generates code for one or another.  It's very likely
> that quicksort will be better, no matter the compiler and the underlying 
> platform.
> 
> Once you put micro-optimizations based on knowledge about the
> compiler and/or hardware into the code, you impair both the 
> readability/maintainability/portability of the code and the opportunities
> of another compiler, on another platform, to generate optimal code.
> There are situations when this _has_ to be done, but they are isolated
> exceptions, not the norm.

I gave this example in another post, but nobody responded.  I think
it was too good. :-)  I'll try again ...

I can prove that your statement is wrong.

Let's say I have the perfect optimizer that takes C code and
provides the absolute most efficient translation possible.  Given
that's the case, it won't improve an O(n) algorithm to an O(n^2)
algorithm.

Now, that should mean that I can order my C code into any
algorithmically valid sequence and end up with exactly the same
running time, because the optimizer is always perfect.

Now, we know that this does not reflect the real world.  The
question is, how does a programmer learn the efficient
implementations that the optimizer can deal with effectively
from the boneheaded ones?

Here's an example:

int a[50000],b[50000],c[50000],d[50000],e[50000];

void test1()
{
    int i, j;
    for (j = 0; j < 10; ++j) {
        for (i = 0; i < 50000; ++i) {
            ++a[i]; ++b[i]; ++c[i]; ++d[i]; ++e[i];
        }
    }
}

void test2()
{
    int i, j;
    for (j = 0; j < 10; ++j) {
        for (i = 0; i < 50000; ++i) ++a[i];
        for (i = 0; i < 50000; ++i) ++b[i];
        for (i = 0; i < 50000; ++i) ++c[i];
        for (i = 0; i < 50000; ++i) ++d[i];
        for (i = 0; i < 50000; ++i) ++e[i];
    }
}

On my AIX system, test1 runs in 2.47 seconds, and test2
runs in 1.95 seconds using maximum optimization (-O3).  The
reason I knew the second would be faster is because I know
to limit the amount of context information the optimizer has
to deal with in the inner loops, and I know to keep memory
localized.

Now I submit that if I showed the average C programmer
both programs, they would guess that test1 is faster because
it has "less code", and that is where abstraction,
ignorance, and naivete begin to hurt.

-- Tim Behrendsen (tim@airshields.com)






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-06  0:00                             ` Fergus Henderson
  1996-08-07  0:00                               ` Tim Behrendsen
@ 1996-08-08  0:00                               ` Stephen M O'Shaughnessy
  1996-08-09  0:00                               ` Stephen M O'Shaughnessy
       [not found]                               ` <01bb846d$ <Dvtnon.I49@most.fw.hac.com>
  3 siblings, 0 replies; 688+ messages in thread
From: Stephen M O'Shaughnessy @ 1996-08-08  0:00 UTC (permalink / raw)




In article <01bb84b4$75304ce0$87ee6fce@timpent.airshields.com>, 
tim@airshields.com says...
>
>
>Easily learned. Later. More important to understand the
>procedural nature of the computer at first.
>
I view my computer(s) more as a standard tool to solve problems.  Like the drill
in my garage.  I don't need to know the voltage and current rating of an electric
drill to use it,  BECAUSE most house wiring is standardized and the drill is
designed to work with house wiring.  The same can be said of HLLs, they
standardize away (abstract) the details of the computer that I do not need
to be concerned with.  This is not to say these details are not important.  When
my drill starts tripping circuit breakers I have a problem that needs to be 
solved. 

My point is that I take exception to Tim's claim that we need to learn the
computer "first".





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-07  0:00                                   ` Tim Behrendsen
  1996-08-07  0:00                                     ` Peter Seebach
  1996-08-07  0:00                                     ` James A. Squire
@ 1996-08-08  0:00                                     ` David Weller
  1996-08-09  0:00                                     ` Bob Gilbert
  3 siblings, 0 replies; 688+ messages in thread
From: David Weller @ 1996-08-08  0:00 UTC (permalink / raw)



In article <01bb846c$e51df220$87ee6fce@timpent.airshields.com>,
Tim Behrendsen <tim@airshields.com> continues his long, and relatively
indefensible, arguments for teaching assembler instead of a high-level
language for CS1-type classes...:
>
>Not to get into a debate on the meaning of abstraction, but the
>point is that there is very little hidden from the student when
>they are learning assembly.  This allows them to better concentrate
>on the basics of algorithms, because they are not distracted by syntax.
>

Yeah, for loops are sooo much clearer when you write in
assembler...and so much less error-prone because everything is right
in front of you!

>> What are you *talking* about?  Algorithmic analysis is fundamentally an
>> abstraction.  Rather than looking at the *specific* costs of the
>> algorithm, we look at the *kinds* of costs.  N^2 vs. log(N) complexity
>> is entirely an abstraction.
>
>Of course, but I'm talking about abstractions of assembly, i.e.,
>HLLs.  Remember, C (or any HLL) does not really exist as far as
>the computer knows.  Assembly is the direct raw instruction set of
>the physical machine.  If the student is learning algorithms in
>assembly, they are unquestionably learning the algorithm, and not
>just some vague concept wrapped in 10 layers of wool.
>
>The *reality* is, students graduating today are just not getting
>it.

Your experience with new employees/interviewees that "just don't get
it" provides 0% support to your claim that abstractions in assembly
will produce smarter students (which, from my recollection at the
beginning of this thread, is what you were claiming).

Please note: Followups have been redirected -- this argument has very
little to do with the newsgroups it originally started in, and this
thread is FAR more appropriate in comp.edu.

-- 
    Visit the Ada 95 Booch Components Homepage: www.ocsystems.com/booch
           This is not your father's Ada -- lglwww.epfl.ch/Ada




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-08-01  0:00       ` Tony Konashenok
  1996-08-04  0:00         ` Lawrence Kirby
@ 1996-08-09  0:00         ` Verne Arase
  1 sibling, 0 replies; 688+ messages in thread
From: Verne Arase @ 1996-08-09  0:00 UTC (permalink / raw)



In article <4tqoru$kdn@overload.lbl.gov>,
tonyk@sseos.lbl.gov (Tony Konashenok) wrote:

 >Another serious limitation is that fixed-point numbers can only be
 >integer.
 >I can't count how many times I had to do dirty tricks to circumvent it.

... or really stupid stuff like having a numeric integer constant suddenly
shift over into octal because you coded a leading zero.

Yes I know a leading zero makes it octal. Yes, I know that C was originally
designed to be run on much slower hardware, and some shortcuts had to be
taken.

Doesn't mean that I don't occasionally get bitten by this one though :-/.

---
The above are my own opinions, and not those of my employer.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-08  0:00         ` Szu-Wen Huang
  1996-08-08  0:00           ` Tim Behrendsen
@ 1996-08-09  0:00           ` some days weren't there at all
  1996-08-10  0:00           ` Mike Rubenstein
  2 siblings, 0 replies; 688+ messages in thread
From: some days weren't there at all @ 1996-08-09  0:00 UTC (permalink / raw)
  Cc: huang


Szu-Wen Huang wrote:
[speaking about the benefit of knowing one/two algorithms deeply
vs. knowing five/six more generally]
>  No, because I know
> which book to look them up in, and that's exactly what books are for.
> The important thing, however, is that I remember that O(n^2) is bad
> for a sorting algorithm, and O(n lg n) is pretty good.
> 
> Now, what about your student that knows *one* algorithm?
	This touches on an interesting point... I'm inclined to believe
that the _mathematical_/theoretical training behind the coding exercises
might make a bit of a difference in how much is retained or understood.
It doesn't matter how many sorting algorithms I can code, or how many
different ones I can memorize, if I don't know where they 'fit' in the
overall picture. This sort of training should be language-independent.
	I'm taking a CS class now; freshman-level stuff(I hope). The 
professor showed us today _why_ a comparison-based sorting algorithm 
can't take fewer than O(n log n) steps. Maybe I'm a twisted, 
academic-inclined person, but I acutally found the proof interesting. 
Maybe I would go so far as to say _beautiful_. He managed to present it 
in a straightforward way, w/o needing mathmatical notation, in such a 
way that it became obviously correct. That's independent of language.
	 
 
[responding to comment about all of CS being 'intuitive']
>  Many algorithms are obvious.  Insertion sort, for example,
> is what a human would do (if somewhat parallelized).  Quicksort, on
> the other hand, is nowhere near intuitive.  Be realistic.
	I would submit that recursive solutions require a different sort 
of intuition than the normal. At least it seems that way to me. If I'm 
looking to solve a problem recursively, I'll (now) naturally think about 
ways to split the problem set in half to minimize work. Why? Because 
I've been exposed to the idea.
	It seems to me that this kind of intuition can be developed. In 
my own case, early experience with Scheme (thank you, Mr. Grillmeyer), 
has led to a _much_ better grasp of recursion and recursive tree 
structures. I honestly think it was one of the best preparations I could 
have had...although I am the first to admit that I can't really back up 
that statement. :-)
	In any case, building/reinforcing these patterns of thought 
should be very important. I want to get to the point where QuickSort 
*is* intuitive, simply because, well, "it has to be that way". It's that 
kind of visceral understanding, coupled with the ability (when 
necessary) to back it up by referring to sound justification, which 
helps on the programming projects I am compelled to do. When I can't 
produce it, I flounder. 
	
[apologies for cutting the previous poster out of this post]
[back-and-forth about assembler vs. HLL]
> It doesn't matter.  You are burdening beginners with details that
> you could've avoided by using an HLL.  
	Um, just as an aside, I do not know assembler, and I do not 
presume to comment on its complexity or suitability for CS instruction, 
but many thanks to whomever posted the gcc -S switch. I tried it on one 
of my current C++ source files, and the result was, well, interesting. I 
understood little of it, but I spent a very small amount of time, and am 
unfamiliar with the conventions of the language (what's an .fstab? file 
system table?).
	At least I can look at the assembler output now. Never knew I 
could do that before. Thanks.

>My school was contemplating
> teaching one freshman class Prolog as a first language and another
> Pascal, then switch in their second year to observe which was more
> effective.  
	Out of curiosity, why Prolog? I was taught that it represents a 
different paradigm/model/whatever from both functional and imperative 
programming; sort of "off by itself". It's not a language I know much 
about. One more thing to do in the next couple years, I guess. :-)
	  	

>It's too bad the experiment has practical problems,
> because it would be interesting to see if it's easier for a "recursive-
> minded" student to study iteration or the reverse. 
	Personally, I think the form/paradigm of the language being 
studied, as well as what can and cannot be expressed, would have a much 
deeper effect on coding style and programming practice. The first 
language I studied academically (I refuse to count BASIC as my 'first 
language'!) was Scheme. It's a wonderfully expressive and functional way
to do things, or at least it _appears this way to me_ because I learned 
the concepts of programming by working with it. I know it has had a 
profound effect upon my understanding of recursion, and undoubtedly an 
influence on my coding style.
	I've gone on to Pascal, and now C++, and have had to adjust 
accordingly. Making the transition to Pascal was more than just studying 
iteration (actually, there's a (do construct in Scheme...not 
recommended, but it works) - it required a complete change in
understanding of what a function is and how it works. I remember one 
project when I tried to pass functions as arguments...and did not get 
particularly far. Of course, now, with C++, I can do this. I also have 
operator overloading. However, I'm trying to be a good C++ citizen and 
concentrate on this funny object-oriented concept. :-)
	
> In any case,
> you want to train students in problem solving, not why the OS requires
> value X in register Y.
	Any programming assignment in any language will require a great 
degree of problem solving. I spent/am spending far, far, far too much 
time tracking down and squashing bugs, no matter what language I have/am 
program(ming) in. Some of these are syntax errors, but far more are 
simply not thinking out my algorithm completely enough. Garbage in, 
garbage out...
	Again, I do not have the pleasure of being acquainted with 
assembler, but it seems that a student will simply adapt to the 
landscape of whatever language is his/her first and then use that to 
solve any proceeding problems. The question of "What language to learn 
first?" then becomes "What do you want your mind to look like?" If 
you've got a hammer...

-David Molnar




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-08  0:00                                         ` Peter Seebach
  1996-08-08  0:00                                           ` Randy Kaelber
@ 1996-08-09  0:00                                           ` J. Blustein
  1996-08-11  0:00                                             ` Peter Seebach
  1996-08-09  0:00                                           ` Chris Sonnack
  2 siblings, 1 reply; 688+ messages in thread
From: J. Blustein @ 1996-08-09  0:00 UTC (permalink / raw)



In article <4ud81d$5ii@solutions.solon.com>,
Peter Seebach <seebs@solon.com> wrote:
>Abstraction is a trade-off of maintainability and readability vs. code
>speed.
    Can we say that we should be concerned with issues of effectiveness
instead of efficiency?  I know that, as programmers, we pursue elegance
and that an important part of that is efficiency, but it is not the most
important aspect.  Finally, there are outside concerns such as deadlines.

>If I could arrange to need only one of those five arrays, I bet my code would
>be faster than yours, no matter *how* much you optimized it.

    I really didn't want to get dragged into this discussion, but I very
much wanted to make that one point.  It has been made before, but I found 
that referring to effectiveness made the point clearer for me.
-- 
J. Blustein   http://www.csd.uwo.ca/~jamie/.Refs/C-refs.html  <jamie@csd.uwo.ca>




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-08  0:00           ` Tim Behrendsen
@ 1996-08-09  0:00             ` Szu-Wen Huang
  1996-08-09  0:00               ` Tim Behrendsen
  0 siblings, 1 reply; 688+ messages in thread
From: Szu-Wen Huang @ 1996-08-09  0:00 UTC (permalink / raw)



Tim Behrendsen (tim@airshields.com) wrote:

: That's fine, anybody can look something up in a book.  I want
: people working for me that can *think*, not just look it
: up in the book.  Otherwise, how do you know when a
: particular algorithm is appropriate for a particular problem?

Because I know the range of performance I can expect from a
class of algorithms, expressed in time complexities.  In fact,
I also know what problems have polynomial-time solutions, and
what *don't*.  None of the above comes from my expert knowledge
of assembly.  Are you saying if I don't know assembly I can't
pick an algorithm?

: When doing large projects, it's not just a matter of reaching
: into the bag o' algorithms and pulling it out.

Really?  When you want a sorting algorithm, I suppose you spend
a few weeks thinking about it first?  When you want compression,
surely you meditate in the mountains?  The industry has many
many solved problems, some free, some not free.  It *is*, as
a matter of fact, just a matter of knowing where to look - and
when to stop looking.  Again, none of these have to do with
assembly language.

: It takes
: careful analysis, and the best way a student learns *that*
: is to really go through the process of taking a problem and
: seeing how the problem translates into data movement, etc.

Data movement, sir, is not restricted to assembly language.
Assembly language, in fact, impedes the understanding of data
movement because of all the extraneous things that have to be
done to make the program work.

: What we're doing now is *sure* not working.

Because some students graduate without understanding algorithm
complexity, not because they graduate without learning assembly
language.

: > Uhm, theoretically.  Somebody with only in-depth knowledge of run-
: > length encoding is not very likely to come up spontaneously with
: > Huffman or arithmetic compression.

: Perhaps not "spontaneously", but you have them thinking about
: compression issues.  "Thinking" is the key word, not just
: memorizing an algorithm that they will forget after the final.

I never said anything about a student that forgot everything after
an exam.  I said, it is better to have *general* knowledge than
specific knowledge in intermediate levels.  My student would've
remembered that he was taught Huffman, which compressed better
than RLE.  Now, how Huffman works he may not immediately remember,
but he would know where to look it up, and that's all he needs.
This student with general knowledge can use books to his advantage,
while your RLE expert sits in a cubicle and tries to reinvent the
wheel.

: > Assuming the student doesn't have a deadline and all day to think for
: > 40 years.  Many algorithms are obvious.  Insertion sort, for example,
: > is what a human would do (if somewhat parallelized).  Quicksort, on
: > the other hand, is nowhere near intuitive.  Be realistic.

: I'm not saying everyone *does* derive all of it, but the
: best programmers, even if they can't remember the exact
: details of an algorithm, can rederive it by thinking
: about the problem.

Of course, but we're not talking about the best programmers.  We're
talking about beginners and intermediates.  The best programmers are
probably experts in several related fields - the language, the
platform, the algorithms, and the tools.  Now, without comprehensive
knowledge, it is better to be general than specific.

: And I wouldn't say Quicksort is not intuitive, it is a logical
: extension of recursive tree algorithms.  Not obvious, but
: not unintuitive.  Lempel-Ziv or even the 'diff' algorithm
: I might agree.

How long did it take you to come up with Quicksort armed with
the knowledge from implementing bubblesort in assembly?

: Not only am I asserting, let me say in the strongest possible
: terms that students are being graduated with CS degrees
: that are unable to think.

I don't disagree.  In fact, I'll give you an example.  My students
in introductory Pascal courses started programming with Turbo
Pascal.  If you learned programming earlier, you'll probably be
the kind that thinks hard about a problem, goes "eureka" one day
in the shower, and then sits down to type.  This generation, on
the other hand, sits right down and types away and turns in some
truly embarrassing solutions to simple problems.

I tend to blame the TV.  Point, however, is on my side.  The
students were given tools they could not use properly.  The
debugger that was so convenient for an expert actually destroyed
their will to think about a problem ahead of implementation.  Now
they just hack together some code and trace through it until they
get it right.  Assembly will have the same ills.  Firstly, it
distracts the students and forces them into implementation details
specific to the platform, and secondly it is simply too powerful for
beginners to use properly.

: I hate to keep harping on my
: test, but on one particular hiring cycle, out of 50 applicants
: 49 were unable to create an algorithm for a problem they had
: not seen before.  I hired the one guy, and he has been a great
: success.

I gave the Vigenere encoding as a hands-on exam.  You are of course
familiar with the:

   A B C D E F G ... Z
 A A B C D E F G ... Z
 B B C D E F G H ... A
 C C D E F G H I ... B
 .
 Z Z A B C D E F ... Y

table, wherein you ask for a plaintext string and a cipher key, then
index the row with plaintext and the column with the key to get the
ciphertext.

Trivial?  None of the ten or so students considered the fact that
the table can be expressed as a formula.  *All* of them stored this
table somewhere.  One particularly creative one even read the table
from a textfile.

I don't think this phenomenon was due to the fact that they didn't
know assembly.
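
(For the record, the one-line formula the students missed -- a sketch
assuming a character set where 'A'..'Z' are contiguous, such as ASCII, and
that both arguments are uppercase letters.)

char encipher(char plain, char key)
{
    return 'A' + (plain - 'A' + key - 'A') % 26;
}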

: > Fundamental understanding of RLE has little, if not nothing, to do
: > with Huffman compression.  Fundamental understanding of quicksort
: > has nothing to do with heapsort either.

: That is simply not true.  Taking compression; if have done
: thorough analysis of RLE, and understand the issues involved
: in finding patterns (sequences of common bytes) and replacing
: them with another encoded value, that is the basis for
: more complex pattern matching/replacement.  If the student
: is thinking, they will see that a large majority of the
: input data is not being compressed, and this leads to wondering
: if there is a general way to find more complex patterns in the
: data.

Even if what you say is true (it's difficult, at best), I still don't
see how this student can be a more effective engineer than one who
has vague knowledge of both, but distinctly know that something *is*
better than RLE.

: > : > Teach a man an algorithm and some I/O routines to enter the input
: > : > and display the output, then some routines to set up the stack,
: > : > then some routines to initialize data,

: > : Yes, and wouldn't they truly understand I/O routines and stacks
: > : after that?

: > They would, but would they understand the algorithm?

: Sure; how much time do you think it takes to make one library
: call to output data?  Push the parameters on the stack and
: call 'printf'.  Big deal.

Then why not use C instead?  Realize that you are the one trying to
prove that assembly is better for teaching algorithms.  If you're
going to use library functions extensively, what's wrong with C
then?

: > It doesn't matter.  You are burdening beginners with details that
: > you could've avoided by using an HLL.  My school was contemplating
: > teaching one freshman class Prolog as a first language and another
: > Pascal, then switch in their second year to observe which was more
: > effective.  It's too bad the experiment has practical problems,
: > because it would be interesting to see if it's easier for a "recursive-
: > minded" student to study iteration or the reverse.  In any case,
: > you want to train students in problem solving, not why the OS requires
: > value X in register Y.

: I think you exaggerate the complexity of assembly.  This is like
: saying "we want to train students in problem solving, not where
: the commas go in the syntax."

Where the commas go is important, because it is a prerequisite to a
correct solution.  If we one day can build computer languages that
can understand humans better, perhaps we will not care where we
put commas.  Assembly is not complex per se, but is more complex
than HLLs to learn and read, and as such is counterproductive when
used as a tool to teach algorithms.

You still haven't proven why assembly is necessary, or at least
better.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
       [not found]                               ` <01bb846d$ <Dvtnon.I49@most.fw.hac.com>
@ 1996-08-09  0:00                                 ` Tim Behrendsen
  0 siblings, 0 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-09  0:00 UTC (permalink / raw)



Stephen M O'Shaughnessy <smosha@most.fw.hac.com> wrote in article
<Dvtnon.I49@most.fw.hac.com>...
> 
> In article <01bb84b4$75304ce0$87ee6fce@timpent.airshields.com>, 
> tim@airshields.com says...
> >
> >
> >Easily learned. Later. More important to understand the
> >procedural nature of the computer at first.
> >
> I view my computer(s) more as a standard tool to solve problems.  Like
the drill
> in my garage.  I don't need to know the voltage and current rating of an
electric
> drill to use it,  BECAUSE most house wiring is standardized and the drill
is
> designed to work with house wiring.  The same can be said of HLLs, they
> standardize away (abstract) the details of the computer that I do not
need
> to be concerned with.  This is not to say these details are not
important.  When
> my drill starts tripping circuit breakers I have a problem that needs to
be 
> solved. 
> 
> My point is that I take exception to the Tim's claim that we need to
learn the
> computer "first".

I think a more correct analogy is that the current and voltage
are the underlying hardware, and the tools are assembly.
HLLs are like prefab walls you slap together to build the
house.  Now, you can build a lot of houses that way, and
they will be generally be built well.  But if you want to
understand how houses are put together, you have to go to the
fundamental tools, with an understanding of structural
dynamics.

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-08  0:00                                     ` Teaching sorts [was Re: What's the best language to start with?] Robert I. Eachus
@ 1996-08-09  0:00                                       ` Robert Dewar
  1996-08-10  0:00                                       ` Lawrence Kirby
                                                         ` (3 subsequent siblings)
  4 siblings, 0 replies; 688+ messages in thread
From: Robert Dewar @ 1996-08-09  0:00 UTC (permalink / raw)



Robert Eachus says

"    I managed to do the "fun" experiment once.  Take three students
and have them learn Quicksort, Heapsort, and Bubblesort on "small"
decks.  At even 50 to 60 cards, the students doing Heapsort and
Quicksort are racing each other*, and the Bubblesort victim is still
hard at work well after they have finished."

Try extending your experiment (I have also used this) a bit. Have a fourth
person sort the deck who knows none of these algorithms. That fourth 
person will typically beat the Quicksort and Heapsort guys. Why? Because
the natural way to sort cards is with some physical embodiment of address
calculation sorting, which can have an average time performance that is
order N rather than order N log N.

This can be an instructive addition to your experiment!
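
(An illustrative sketch of the idea for the card case, where every key is
known to lie in 0..51: each card's value *is* its address, so the deck comes
out ordered in two linear passes with no comparisons at all.  Not code from
the original post; names are invented for the example.)

void sort_deck(int *card, int n)        /* card values assumed in 0..51 */
{
    int slot[52] = {0};
    int i, v, k = 0;

    for (i = 0; i < n; i++)
        slot[card[i]]++;                /* each value computes its own slot */
    for (v = 0; v < 52; v++)
        while (slot[v]-- > 0)
            card[k++] = v;              /* collect back in order */
}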





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with
  1996-08-08  0:00                                   ` Tim Behrendsen
@ 1996-08-09  0:00                                     ` Robert Dewar
  0 siblings, 0 replies; 688+ messages in thread
From: Robert Dewar @ 1996-08-09  0:00 UTC (permalink / raw)



Tim said

"I completely disagree!  Someone who is an "algorithm memorizer"
is *much* less likely to go back and think "Is there another
way to do this," because they by definition do not think
about their solutions.  They simply look them up in the book,
and if the book says that's the best, well, no need to go
further."

This seems like a pretty extreme case of NIH syndrome. Any competent
programmer MUST be an "algorithm memorizer" in the sense that if you
have a complex algorithmic problem, then you should at LEAST know where
to go and look up appropriate solutions. In other than trivial cases,
you are very unlikely to be able to cough up on the spot solutions
that compare to the combined knowledge represented by 30 years of
research by thousands of researchers. 

Over and over again I see cases of careful and well written implementations
of completely absurd algorithms for standard problems, where quite
obviously the programmer was simply unaware that there are much better
solutions known. 

Suppose for example you need a fast in place sort with bounded time
behavior (i.e. you are concerned with worst case performance). A good
programmer knows immediately that treesort3 or some variation of it (Knuth
likes to call this algorithm heapsort) is the natural choice. You are NOT
about to invent this on your own. You don't necessarily need to recall
the details, but you should certainly know where to look up this algorithm,
and then be able to code from that description, or in some environments
you can of course short circuit this by using standard libraries. For
instance, if you want to compute eigenvalues of a large matrix, then
you CERTAINLY do not try to invent your algorithm, and you probably don't
even go and look up an algorithm, instead you use a standard library.
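
(For readers who want to see what is being referred to, a textbook-style
sketch of heapsort -- in place, O(n log n) even in the worst case.  Written
from the standard description, not taken from the post; names are mine.)

static void sift_down(int *a, int i, int n)
{
    for (;;) {
        int child = 2 * i + 1;
        if (child >= n)
            break;
        if (child + 1 < n && a[child + 1] > a[child])
            child++;                    /* take the larger child */
        if (a[i] >= a[child])
            break;
        { int t = a[i]; a[i] = a[child]; a[child] = t; }
        i = child;
    }
}

void heap_sort(int *a, int n)
{
    int i;
    for (i = n / 2 - 1; i >= 0; i--)    /* build a max-heap bottom-up */
        sift_down(a, i, n);
    for (i = n - 1; i > 0; i--) {       /* move the max to the end, repeat */
        int t = a[0]; a[0] = a[i]; a[i] = t;
        sift_down(a, 0, i);
    }
}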

Reuse, of both code in standard libraries, and of ideas, in the form of
published algorithms, is a key, perhaps *the* key, tool in a programmers
arsenal.





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-06  0:00                             ` Fergus Henderson
  1996-08-07  0:00                               ` Tim Behrendsen
  1996-08-08  0:00                               ` Stephen M O'Shaughnessy
@ 1996-08-09  0:00                               ` Stephen M O'Shaughnessy
  1996-08-09  0:00                                 ` Tim Behrendsen
       [not found]                               ` <01bb846d$ <Dvtnon.I49@most.fw.hac.com>
  3 siblings, 1 reply; 688+ messages in thread
From: Stephen M O'Shaughnessy @ 1996-08-09  0:00 UTC (permalink / raw)



In article <01bb85af$e83cf2a0$32ee6fce@timhome2>, tim@airshields.com says...

>I think a more correct analogy is the current and voltage
>are the underlying hardware, and the tools are assembly.
>HLLs are like prefab walls you slap together to build the
>house.  Now, you can build a lot of houses that way, and
>they will be generally be built well.  But if you want to
>understand how houses are put together, you have to go to the
>fundamental tools, with an understanding of structural
>dynamics.
>
>-- Tim Behrendsen (tim@airshields.com)

I still have trouble with your definition of assembly.  An assembly language
is just a one-for-one translation of the machine code.  For the 8051

mov 90h,#0ffh is the machine code 75 90 FF

These bytes are directly voltages existing in the hardware (bits).  In this sense
assembly is the hardware.  My position is that I don't need to know 
hardware specifics to learn algorithms and programming.  Certainly I don't
need that understanding FIRST.  And my boss, customer, 
or whomever, does not care about my understanding as long as I can 
deliver a well built product.  Which, as you pointed out above, I can do.






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-08  0:00                                         ` telnet user
@ 1996-08-09  0:00                                           ` Tim Behrendsen
  1996-08-09  0:00                                           ` Ed Hook
  1 sibling, 0 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-09  0:00 UTC (permalink / raw)



telnet user <tim@franck> wrote in article <4udjki$2l8@cnn.Princeton.EDU>...
> Tim Behrendsen (tim@airshields.com) wrote:
> 
> : This is an interesting case, because it is somewhat inefficently
> : implemented.  If you're interested in speed, you would do...
> 
> : int gcd(int x, int y) {
> :     int z;
> :     while (y != 0)
> :         z = y; y = x % y; x = z;
> :     return(y);
> : }
> 
> However if you are interested in correctness, you use braces for the
> loop.  I think this case where you write assembly in C, get it wrong,
> and ...
> : Using my AIX compiler, I get a nominal improvement of about
> : 10%, mostly because the speed of the modulo is much slower
> : than the inefficiency of recursion.
> 
> only achieve a 10% speedup proves everyone else's point, and is an
> appropriate place to end this thread.

Yes, with *this* algorithm, on *this* computer, I only got a
10% improvement.  I rewrote an algorithm to be non-recursive
that was published in CUJ and got a 40% improvement.  This is
the reality of what happens when you let the compiler do the
thinking.

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-09  0:00             ` Szu-Wen Huang
@ 1996-08-09  0:00               ` Tim Behrendsen
  1996-08-10  0:00                 ` Szu-Wen Huang
  0 siblings, 1 reply; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-09  0:00 UTC (permalink / raw)



Szu-Wen Huang <huang@mnsinc.com> wrote in article
<4ue5l3$frr@news1.mnsinc.com>...
> Tim Behrendsen (tim@airshields.com) wrote:
> 
> : That's fine, anybody can look something up in a book.  I want
> : people working for me that can *think*, not just look it
> : up in the book.  Otherwise, how do you know when a
> : particular algorithm is appropriate for a particular problem?
> 
> Because I know the range of performance I can expect from a
> class of algorithms, expressed in time complexities.  In fact,
> I also know what problems have polynomial-time solutions, and
> what *don't*.  None of the above comes from my expert knowledge
> of assembly.  Are you saying if I don't know assembly I can't
> pick an algorithm?

It's not a question of "knowing assembly" or not, it's a
question of understanding algorithms.  There's more to writing
programs than just knowing an algorithm is O(log n).  It
has to be implemented efficiently.

> : When doing large projects, it's not just a matter of reaching
> : into the bag o' algorithms and pulling it out.
> 
> Really?  When you want a sorting algorithm, I suppose you spend
> a few weeks thinking about it first?  When you want compression,
> surely you meditate in the mountains?  The industry has many
> many solved problems, some free, some not free.  It *is*, as
> a matter of fact, just a matter of knowing where to look - and
> when to stop looking.  Again, none of these have to do with
> assembly language.

Maybe you're paid to sit and write sorting algorithms all day,
but where I sit, the world is not that simple.  Algorithms
have to fit into a cohesive whole, and it's often the case
that some algorithms are more appropriate in certain cases
than others.  Otherwise, everyone would always use Quicksort.

But, you're right that this has nothing to do with assembly.
The question is how to have students get a feel for how
algorithms actually work in the real world, and there is no
substitute for working with the real computer.

> : It takes
> : careful analysis, and the best way a student learns *that*
> : is to really go through the process of taking a problem and
> : seeing how the problem translates into data movement, etc.
> 
> Data movement, sir, is not restricted to assembly language.
> Assembly language, in fact, impedes the understanding of data
> movement because of all the extraneous things that have to be
> done to make the program work.

What extraneous things?  I don't understand why you think it's
so enormously complicated to use assembly.  Programmers used
to do it all the time, and they somehow managed to write
programs without suffering mental breakdowns.

> : What we're doing now is *sure* not working.
> 
> Because some students graduate without understanding algorithm
> complexity, not because they graduate without learning assembly
> language.

I agree with the first, but the reason they are not "getting"
algorithm complexity is because we're shrouding it in all
this abstract mystery.

> : > Uhm, theoretically.  Somebody with only in-depth knowledge of run-
> : > length encoding is not very likely to come up spontaneously with
> : > Huffman or arithmetic compression.
> 
> : Perhaps not "spontaneously", but you have them thinking about
> : compression issues.  "Thinking" is the key word, not just
> : memorizing an algorithm that they will forget after the final.
> 
> I never said anything about a student that forgot everything after
> an exam.  I said, it is better to have *general* knowledge than
> specific knowledge in intermediate levels.  My student would've
> remembered that he was taught Huffman, which compressed better
> than RLE.  Now, how Huffman works he may not immediately remember,
> but he would know where to look it up, and that's all he needs.
> This student with general knowledge can use books to his advantage,
> while your RLE expert sits in a cubicle and tries to reinvent the
> wheel.

You're making the assumption that we teach them RLE, and then
hide all the books from them.  My point is that packing
algorithms into their heads is not necessarily going to
teach them how to invent algorithms when they need them.

> : I'm not saying everyone *does* derive all of it, but the
> : best programmers, even if they can't remember the exact
> : details of an algorithm, can rederive it by thinking
> : about the problem.
> 
> Of course, but we're not talking about the best programmers.  We're
> talking about beginners and intermediates.  The best programmers are
> probably experts in several related fields - the language, the
> platform, the algorithms, and the tools.  Now, without comprehensive
> knowledge, it is better to be general than specific.

But we can learn from the best programmers about how their
thought processes work.  Again, the most important thing is
to get the students "thinking like programmers".

> : And I wouldn't say Quicksort is not intuitive, it is a logical
> : extension of recursive tree algorithms.  Not obvious, but
> : not unintuitive.  Lempel-Ziv or even the 'diff' algorithm
> : I might agree.
> 
> How long did it take you to come up with Quicksort armed with
> the knowledge from implementing bubblesort in assembly?

I was introduced to Quicksort before I had the chance to
invent it myself. :)

> : Not only am I asserting, let me say in the strongest possible
> : terms that students are being graduated with CS degrees
> : that are unable to think.
> 
> [snip]
> I tend to blame the TV.  Point, however, is on my side.  The
> students were given tools they could not use properly.  The
> debugger that was so convenient for an expert actually destroyed
> their will to think about a problem ahead of implementation.  Now
> they just hack together some code and trace through it until they
> get it right.  Assembly will have the same ills.  Firstly, it
> distracts the students and forces them into implementation details
> specific to the platform, and secondly it is simply too powerful for
> beginners to use properly.

Don't get me started about debuggers... I think in certain
ways debuggers have lowered productivity rather than
raised it.

>[snip]
> Trivial?  None of the ten or so students considered the fact that
> the table can be expressed as a formula.  *All* of them stored this
> table somewhere.  One particularly creative one even read the table
> from a textfile.
> 
> I don't think this phenomenon was due to the fact that they didn't
> know assembly.

Look, it's not so much that I love assembly so much, but I
think a much more basic step-by-step approach works better
than the overly-abstracted approach that's currently in vogue.
Assembly just seems to be the most straightforward method of
getting at the fundamentals of what's going on.  I wouldn't be completely
opposed to an "idealized" assembly that was interpreted or
something.

I just want students to get a better feel for what the computer
really does.

> : > Fundamental understanding of RLE has little, if not nothing, to do
> : > with Huffman compression.  Fundamental understanding of quicksort
> : > has nothing to do with heapsort either.
> 
> : That is simply not true.  Taking compression: if you have done a
> : thorough analysis of RLE, and understand the issues involved
> : in finding patterns (sequences of common bytes) and replacing
> : them with another encoded value, that is the basis for
> : more complex pattern matching/replacement.  If the student
> : is thinking, they will see that a large majority of the
> : input data is not being compressed, and this leads to wondering
> : if there is a general way to find more complex patterns in the
> : data.
> 
> Even if what you say is true (it's difficult, at best), I still don't
> see how this student can be a more effective engineer than one who
> has vague knowledge of both, but distinctly knows that something *is*
> better than RLE.

Because the student is thinking about compression, rather than
being spoon-fed the information from a book or the teacher.
When they get into the real world, this thought process will
stay with them when they begin to need to break down much
more complex problems into algorithms.
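To make the RLE part of this concrete: the kind of byte-oriented
run-length encoder under discussion fits in a dozen lines of C.  This is
only an illustrative sketch (the function name and the count/value
framing are assumptions, not code from the posts above):

#include <stddef.h>

/* Minimal run-length encoder: each run of a repeated byte becomes a
   (count, value) pair.  'out' must hold at least 2*n bytes. */
size_t rle_encode(const unsigned char *in, size_t n, unsigned char *out)
{
    size_t i = 0, o = 0;

    while (i < n) {
        unsigned char v = in[i];
        size_t run = 1;

        while (i + run < n && in[i + run] == v && run < 255)
            ++run;
        out[o++] = (unsigned char)run;   /* run length, 1..255 */
        out[o++] = v;                    /* the byte that repeats */
        i += run;
    }
    return o;
}

Noticing where such an encoder gains nothing (input with no runs) is
exactly the observation that pushes a student toward more general
pattern-matching schemes.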

> : Sure; how much time do you think it takes to make one library
> : call to output data?  Push the parameters on the stack and
> : call 'printf'.  Big deal.
> 
> Then why not use C instead?  Realize that you are the one trying to
> prove that assembly is better for teaching algorithms.  If you're
> going to use library functions extensively, what's wrong with C
> then?

Using library routines to do simple I/O tasks does not take
away from the learning experience of the algorithm.  Somehow
input and output have to happen, and I see no reason to burden
the student with problems that are not relevant to what they
are trying to learn.

> : I think you exaggerate the complexity of assembly.  This is like
> : saying "we want to train students in problem solving, not where
> : the commas go in the syntax."
> 
> Where the commas go is important, because it is a prerequisite to a
> correct solution.  If we one day can build computer languages that
> can understand humans better, perhaps we will not care where we
> put commas.  Assembly is not complex per se, but is more complex
> than HLLs to learn and read, and as such is counterproductive when
> used as a tool to teach algorithms.
> 
> You still haven't proven why assembly is necessary, or at least
> better.

Again, I go back to the fact that what we're doing now is
not working, based on my experience (and you admitted to
above).  What are we doing now?  We are teaching students at
an extremely high level of abstraction.

Since this does not seem to be giving the students the
critical thinking capabilities, what would?  I think that
more emphasis on the "fetch-and-execute" fundamentals would
force the students to think more procedurally and less
abstractly.

Not to be repetitious with this example, but I look at how EEs
are taught.  They begin by learning the fundamental components;
resistors, capacitors, Ohm's law, etc.  They seem to me to be
much better prepared to do real work than CS graduates, and I
think it's because of the early emphasis on the fundamentals.

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-08  0:00             ` Peter Seebach
@ 1996-08-09  0:00               ` Tim Behrendsen
  1996-08-09  0:00                 ` Peter Seebach
  0 siblings, 1 reply; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-09  0:00 UTC (permalink / raw)



Peter Seebach <seebs@solutions.solon.com> wrote in article
<4ue7tm$onn@solutions.solon.com>...
> In article <01bb8567$4adddbc0$87ee6fce@timpent.airshields.com>,
> Tim Behrendsen <tim@airshields.com> wrote:
> >Peter Seebach <seebs@solutions.solon.com> wrote in article
> ><4ud8oo$61t@solutions.solon.com>...
> >> Which is exactly what will happen if they code them in assembly; they
> >> can't possibly be learning how algorithms are created if they start
with
> >> implementing them in a low level language.  Algorithms are generally
> >> written in natural languages, and written on napkins.
> 
> >What? I guess before HLLs nobody "possibly" learned anything.
> 
> Probably not.  We have *no* evidence that people learned or comprehended
*any*
> algorithms before we developed HLL's.  The one I use for more than 90% of
my
> discussion and analysis of algorithms is called "English", though this is
> arguably a misnomer, as I'm not.

Well, now you're being silly, especially considering Knuth
expressed his books in MIX.

> >I don't know about you, but I think about algorithms in terms of
> >data flow and manipulation.  HLLs or Assembly don't enter into
> >either.
> 
> I think about them in terms of descriptions of rules for behavior.

At first, but what about when you're ready to implement them?

> >We disagree however on the difficulty of assembly.  I think it
> >is *much* easier, albeit more tedious, than a HLL.
> 
> Huh.  Well, see, you're probably comfortable with it.  I can't *imagine*
being
> able to keep track of all of that.

I have full confidence in your abilities. :-)

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-09  0:00                                         ` Mike Rubenstein
@ 1996-08-09  0:00                                           ` Tim Behrendsen
  1996-08-10  0:00                                             ` Mike Rubenstein
  0 siblings, 1 reply; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-09  0:00 UTC (permalink / raw)



Mike Rubenstein <miker3@ix.netcom.com> wrote in article
<320b35a2.43769707@nntp.ix.netcom.com>...
> "Tim Behrendsen" <tim@airshields.com> wrote:
> > Peter Seebach <seebs@solutions.solon.com> wrote in article
> > <4uah1k$b2o@solutions.solon.com>...
> > > In C, we write
> > > 
> > > 	int
> > > 	gcd(int x, int y) {
> > > 		if (y == 0)
> > > 			return x;
> > > 		return gcd(y, (x % y));
> > > 	}
> > > 
> > This is an interesting case, because it is somewhat inefficently
> > implemented.  If you're interested in speed, you would do...
> > 
> > int gcd(int x, int y) {
> >     int z;
> >     while (y != 0)
> >         z = y; y = x % y; x = z;
> >     return(y);
> > }
> 
> I don't understand how going into an infinite loop when y != 0
> improves the speed?  I guess I cut class that day.
> 
> Nor do I see how dividing by 0 (when y == 0) is an improvement?
> Perhaps I just cut too many classes.
> 
> Let's review the bidding.  Peter gives an algorithm.  You claim that
> knowledge of assembly helps one understand it better and produce a
> program that goes into an infinite loop if y is not zero and divides
> by 0 if y is zero.
> 
> Perhaps you meant
> 
> 	int gcd(int x, int y) {
> 	    int z;
> 	    while (y != 0)
> 	        { z = y; y = x % y; x = z; }
> 	    return(y);
> 	}
> 
> This avoids the infinite loop, but can be made even more efficient:
> 
> 	int gcd(int x, int y)
> 	{
> 	  return 0;
> 	}
> 
> Of course this isn't quite equivalent to Peter's function.
> 
> I guess, like Peter, I'm a little thick today.  Please explain again
> how knowing assembly language helps one understand algorithms?  I got
> a little confused when you transformed Peter's correct, but slightly
> inefficient function into an infinite loop.

Yes, you're right; my typo proves all my arguments false.  How can I
have been such a fool?

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-08  0:00                                 ` Thomas Hood
@ 1996-08-09  0:00                                   ` Tim Behrendsen
  0 siblings, 0 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-09  0:00 UTC (permalink / raw)



Thomas Hood <thomas@nicom.com> wrote in article
<320A65F5.65D9@nicom.com>...
> Tim Behrendsen wrote:
> <snip>
> > > or to code than N lines of C.  But if you want the students to
> > > understand say quicksort, it's a lot easier showing them 20 lines
> > > of C than 100 lines of assembler.
> > 
> > Who's talking about showing them?  I would suggest that if
> > they wrote a quicksort in assembler, they will have a much
> > better "feel" for the algorithm, than if they wrote it in C.
> 
> Nonsense.  They will have a much better feel for how to shift stuff from
> one register to another, but will never realize _why_ they are doing it.
> Problem solving in anything but the most trivial of cases is the process
> of moving from the specific (the problem at hand, for which we have no
> solution) to the general (the algorithmic solution, which fits
> the problem domain).  Accomplishing this in assembly language is
> unnecessary and cruel.  Any HOL has the lexical elements necessary to
> encapsulate the concept of a quicksort without the agony of dealing with
> assembler.

Cruel?  Agony?  Assembly is *not* that hard.

> > > Also, we want to get our students into the habit of writing
> > > robust and reusable code, and this is very difficult in assembler.
> > > At least C has malloc()/free(); with assembler, you need to write
> > > the memory management from scratch.
> > 
> > Well, I don't think the student has to rewrite all the library
> > routines!  Nothing precludes you from using them from assembler.
> 
> If you are allowing them to use higher level constructs (abstractions)
> to accomplish lower level tasks (solve specific problems), then you
> are upholding the very concept you are arguing against!

No, I see no reason to have them rewrite the O/S if that's not
the purpose of the assignment.  They can get input from the library
routines, and output using the library routines, but implementation
of the algorithm should be done using assembly.

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-08  0:00                                       ` Tim Behrendsen
  1996-08-08  0:00                                         ` Peter Seebach
@ 1996-08-09  0:00                                         ` Dan Pop
  1996-08-11  0:00                                           ` Tim Behrendsen
  1996-08-11  0:00                                           ` Mark Wooding
  1996-08-18  0:00                                         ` Sam B. Siegel
  2 siblings, 2 replies; 688+ messages in thread
From: Dan Pop @ 1996-08-09  0:00 UTC (permalink / raw)



In <01bb8534$b2718bc0$87ee6fce@timpent.airshields.com> "Tim Behrendsen" <tim@airshields.com> writes:

>Dan Pop <Dan.Pop@cern.ch> wrote in article
><danpop.839450672@news.cern.ch>...
>> In <01bb83f5$923391e0$87ee6fce@timpent.airshields.com> "Tim Behrendsen"
><tim@airshields.com> writes:
>> 
>> >The problem is that we *can't* think purely abstractly,
>> >otherwise we end up with slow crap code.
>> 
>> Care to provide some concrete examples?
>
>Look at the code-bloated and slow-software world we live in,
>particularly on desktop platforms.  I think this is caused by
>people not truly understanding what's *really* going on.

I was asking for a concrete example, like a quicksort implementation
made by someone thinking purely abstractly as opposed to a quicksort
implementation made by someone who understands the low level details.

>For example, look at OOP.  Very naive implementations of OOP
>used a huge amount of dynamic memory allocation, which can cause
>severe performance problems.  That's why I don't use C++ for my
>products; to do it right you have to do a very careful analysis
>of how your classes are going to fit together efficiently.
>
>I've spoken to enough people that have had C++ disasters to
>convince me that the more abstraction there is, the more
>danger there is of inefficient code.  This shouldn't be that
>hard to believe; any time you abstract away details you are
>giving up knowledge of what is going to be efficient.

Nonsense.  The efficiency of an implementation can also be thought of in
abstract terms, you don't need to know how the compiler works or
assembly language in order to implement your application in an efficient
way.  Or you may know all these irrelevant details and still write 
inefficient code, because it's so much easier in many OOPLs.

>I alluded to this in another post, but a good example is Motif
>and X11.  A programmer who only understands Motif, but does not
>understand X11 is going to write slow crap, period.

More nonsense.  Unless your applications spend the bulk of their CPU
time in the user interface, that should be a non-issue.

>Here's an example:
>
>int a[50000],b[50000],c[50000],d[50000],e[50000];
>
>void test1()
>{
>    int i, j;
>    for (j = 0; j < 10; ++j) {
>        for (i = 0; i < 50000; ++i) {
>            ++a[i]; ++b[i]; ++c[i]; ++d[i]; ++e[i];
>        }
>    }
>}
>
>void test2()
>{
>    int i, j;
>    for (j = 0; j < 10; ++j) {
>        for (i = 0; i < 50000; ++i) ++a[i];
>        for (i = 0; i < 50000; ++i) ++b[i];
>        for (i = 0; i < 50000; ++i) ++c[i];
>        for (i = 0; i < 50000; ++i) ++d[i];
>        for (i = 0; i < 50000; ++i) ++e[i];
>    }
>}
>
>On my AIX system, test1 runs in 2.47 seconds, and test2
>runs in 1.95 seconds using maximum optimization (-O3).  The
>reason I knew the second would be faster is because I know
>to limit the amount of context information the optimizer has
>to deal with in the inner loops, and I know to keep memory
>localized.

1. For a marginal speed increase (~25%), you compromised the readability
   of the code.

2. Another compiler, on another system, might generate faster code
   out of the test1.  This is especially true for supercomputers,
   which have no cache memory (and where the micro-optimizations are done
   based on a completely different set of criteria) and where the cpu time
   is really expensive.

3. You used exclusively abstract concepts to justify why test2 is
   faster on your particular platform/compiler combination.  No references
   to the length of a cache line or to the compiler being able to use
   a more efficient instruction or instruction combination in one case
   than in the other.

Let's see what happens on my 486DX33 box:

    ues4:~/afs/tmp 32> cc -O2 -o test1 test1.c
    ues4:~/afs/tmp 33> time ./test1
    1.100u 0.230s 0:01.51 88.0% 0+0k 0+0io 13pf+0w
    ues4:~/afs/tmp 34> cc -O2 -o test2 test2.c
    ues4:~/afs/tmp 35> time ./test2
    1.170u 0.180s 0:01.45 93.1% 0+0k 0+0io 13pf+0w

So, it's 1.10 + 0.23 = 1.33 seconds of cpu time for test1 versus
1.17 + 0.18 = 1.35 seconds for test2.  Conclusions:

1. My 486DX33 is faster than your AIX system (or maybe gcc is faster than
   xlc) :-)

2. Your results are not reproducible.  Your extra "insight" simply "helped"
   you to generate less readable code.

>Now I submit that if I showed the average C programmer
>both programs, they would guess that test1 is faster because
>it has "less code",

And he might be right, both on a CP/M-80 micro and a Cray supercomputer.
Or on most systems with a compiler that can do loop unrolling.

>and that is where abstraction,
>ignorance, and naivete begin to hurt.

The key to proper optimization is profiling.  Additional knowledge about
a certain platform is a lot less important.  
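To make that concrete, the whole comparison can be driven from a small,
portable harness; measure first, then decide.  (This is an illustrative
sketch using standard clock() from <time.h>; the timings quoted above
came from the shell's 'time', not from this code.)

#include <stdio.h>
#include <time.h>

int a[50000], b[50000], c[50000], d[50000], e[50000];

void test1(void)                     /* one pass over all five arrays */
{
    int i, j;
    for (j = 0; j < 10; ++j)
        for (i = 0; i < 50000; ++i) {
            ++a[i]; ++b[i]; ++c[i]; ++d[i]; ++e[i];
        }
}

void test2(void)                     /* five passes, one array at a time */
{
    int i, j;
    for (j = 0; j < 10; ++j) {
        for (i = 0; i < 50000; ++i) ++a[i];
        for (i = 0; i < 50000; ++i) ++b[i];
        for (i = 0; i < 50000; ++i) ++c[i];
        for (i = 0; i < 50000; ++i) ++d[i];
        for (i = 0; i < 50000; ++i) ++e[i];
    }
}

int main(void)
{
    clock_t t0, t1, t2;

    t0 = clock(); test1();
    t1 = clock(); test2();
    t2 = clock();
    printf("test1: %.2f s   test2: %.2f s\n",
           (double)(t1 - t0) / CLOCKS_PER_SEC,
           (double)(t2 - t1) / CLOCKS_PER_SEC);
    return 0;
}

Run it on the platform you actually care about and let the numbers,
not the abstraction argument, settle which variant to keep.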

And if the wrong algorithm has been chosen in the first place, no
amount of micro-optimization will save the code performance.  The guy
who can do a decent algorithm analysis (an entirely abstract operation)
will always beat the one who is an expert assembly programmer but
prefers to spend his time coding instead of dealing with abstractions.

This is an old story (it happened during the early to mid eighties) and
I forgot the details.  One software company had a nice product for the PC,
but it was rather slow.  To get the "best" performance, it was coded in
assembly.  Another company decided to emulate that product and they did it
quite successfully (their version is about 4 times faster).  They 
implemented it in C, but they've carefully chosen their algorithms.

Dan
--
Dan Pop
CERN, CN Division
Email: Dan.Pop@cern.ch 
Mail:  CERN - PPE, Bat. 31 R-004, CH-1211 Geneve 23, Switzerland




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-08  0:00                                         ` telnet user
  1996-08-09  0:00                                           ` Tim Behrendsen
@ 1996-08-09  0:00                                           ` Ed Hook
  1 sibling, 0 replies; 688+ messages in thread
From: Ed Hook @ 1996-08-09  0:00 UTC (permalink / raw)



In article <4udjki$2l8@cnn.Princeton.EDU>, tim@franck (telnet user) writes:
|> Tim Behrendsen (tim@airshields.com) wrote:
|> 
|> : This is an interesting case, because it is somewhat inefficently
|> : implemented.  If you're interested in speed, you would do...
|> 
|> : int gcd(int x, int y) {
|> :     int z;
|> :     while (y != 0)
|> :         z = y; y = x % y; x = z;
|> :     return(y);
|> : }
|> 
|> However if you are interested in correctness, you use braces for the
|> loop.  I think this case where you write assembly in C, get it wrong,
|> and ...
|>
  And ... if, in addition, you're interested in _mathematical_ correctness,
  you'll write instead:

  int gcd(int x, int y)
  {
	int z;
	while ( y != 0 ) {
		z = y;
		y = x % z;
		x = z;
	}
	return x;
   }

   On the other hand, I can greatly speed up the given code; just replace
   it by

   int gcd(int x, int y)
   {
	return 0;
   }

   -- it does the same thing.
 
|> : Using my AIX compiler, I get a nominal improvement of about
|> : 10%, mostly because the speed of the modulo is much slower
|> : than the inefficiency of recursion.
|> 
|> only achieve a 10% speedup proves everyone else's point, and is an
|> appropriate place to end this thread.
|>

  I can't argue with that ...
 
|> ---------------------------------------------------------------------------
|> Tim Hollebeek         | Disclaimer :=> Everything above is a true statement,
|> Electron Psychologist |                for sufficiently false values of true.
|> Princeton University  | email: tim@wfn-shop.princeton.edu
|> ----------------------| http://wfn-shop.princeton.edu/~tim (NEW! IMPROVED!)

-- 
 Ed Hook                              |       Coppula eam, se non posit
 Computer Sciences Corporation        |         acceptera jocularum.
 NASA Langley Research Center         | Me? Speak for my employer?...<*snort*>
 Internet: hook@cscsun3.larc.nasa.gov |        ... Get a _clue_ !!! ...




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-09  0:00               ` Tim Behrendsen
@ 1996-08-09  0:00                 ` Peter Seebach
  1996-08-15  0:00                   ` James_Rogers
  0 siblings, 1 reply; 688+ messages in thread
From: Peter Seebach @ 1996-08-09  0:00 UTC (permalink / raw)



In article <01bb8608$9a12bc00$87ee6fce@timpent.airshields.com>,
Tim Behrendsen <tim@airshields.com> wrote:
>Peter Seebach <seebs@solutions.solon.com> wrote in article
><4ue7tm$onn@solutions.solon.com>...
>> Probably not.  We have *no* evidence that people learned or comprehended
>*any*
>> algorithms before we developed HLL's.  The one I use for more than 90% of
>my
>> discussion and analysis of algorithms is called "English", though this is
>> arguably a misnomer, as I'm not.

>Well, now you're being silly, especially considering Knuth
>expressed his books in MIX.

No, he expressed them in English, then used MIX to show a possible
implementation.  Since then, he's mostly moved to a family of "web" languages
(no relation) in which the source *is* the English text of the documentation,
and the "source code" used by the compiler is generated from small code
fragments which are essentially figures referred to in the text.

Despite the fact that I can't read Pascal (I try, but I always misunderstand
things), I can read a WEB program using pascal as the "base language" easily
and comfortably.  At the end, I have a *solid* understanding of what the
algorithm does in principle, even if I'm not sure I follow the implementation.

I end up knowing it well enough that I could write my own implementation,
despite not having really seen the implementation or tried to understand it.

>At first, but what about when you're ready to implement them?

Generally, I test them out by writing the algorithm in English, and drawing
pictures, and following the rules myself.  I occasionally use C for
complicated ones.  Or perl.

-s
-- 
Peter Seebach - seebs@solon.com - Copyright 1996 - http://www.solon.com/~seebs
Unix/C Wizard - send mail for help, or send money for consulting!
The *other* C FAQ, the hacker FAQ, et al.  See web page above.
Unsolicited email (junk mail and ads) is unwelcome, and will be billed for.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-08  0:00                                         ` Peter Seebach
  1996-08-08  0:00                                           ` Randy Kaelber
  1996-08-09  0:00                                           ` J. Blustein
@ 1996-08-09  0:00                                           ` Chris Sonnack
  1996-08-10  0:00                                             ` Tim Behrendsen
  2 siblings, 1 reply; 688+ messages in thread
From: Chris Sonnack @ 1996-08-09  0:00 UTC (permalink / raw)



Peter Seebach (seebs@solutions.solon.com) wrote:

>> Look at the code-bloated and slow-software world we live in,
>> particularly on desktop platforms.  I think this is caused by
>> people not truly understanding what's *really* going on.
>
> I believe it's caused by people who have a poor understanding of what
> they're trying to do.

Or more likely, both.

>> void test1() {
>>    int i, j;
>>    for (j = 0; j < 10; ++j) {
>>        for (i = 0; i < 50000; ++i) {
>>            ++a[i]; ++b[i]; ++c[i]; ++d[i]; ++e[i];
>>        }
>>    }
>> }
>>void test2() {
>>    int i, j;
>>    for (j = 0; j < 10; ++j) {
>>        for (i = 0; i < 50000; ++i) ++a[i];
>>        for (i = 0; i < 50000; ++i) ++b[i];
>>        for (i = 0; i < 50000; ++i) ++c[i];
>>        for (i = 0; i < 50000; ++i) ++d[i];
>>        for (i = 0; i < 50000; ++i) ++e[i];
>>    }
>> }
>> On my AIX system, test1 runs in 2.47 seconds, and test2
>> runs in 1.95 seconds using maximum optimization (-O3).
>
> I am unconvinced that this example is remotely relevant to the general
> case.

Probably not, but the real world does contain specific cases like this.
I really don't understand the resistance to the idea that programmers
should understand AS MUCH AS POSSIBLE about their craft. I really don't.

It's like an auto mechanic who doesn't want to know as much as possible
about how car engines work. Which mechanic would you rather hire; one
who knows the high-level stuff and can do MOST types of work, or one
who can build an engine from scrap metal?

If you (generic you) don't want to learn about the metal, fine. That
means I can get hired over you when the job entails embedded programming
or device drivers or a lot of other tasks fully rounded programmers may
face.

Do I like assembly? Hell, no. Am I fully capable of being hired for a
job that requires its use? Hell, yes.

> In particular, in the real world, would you rather maintain code more like
> the first or the second?  I'd prefer to maintain the first; it would be
> easier to change all occurances of 50000, if this needed to be done.

They're both pretty clear. And any real programmer knows rule #27: "There
should be no constants in your code except the numbers 1 and 0, and you
should view those with suspicion." A real programmer would use a #define
for the 50000, so changing it would be easy in either event.
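For instance (ARRAY_SIZE and bump_all are illustrative names, not
anything from the posts above):

#define ARRAY_SIZE 50000

int a[ARRAY_SIZE], b[ARRAY_SIZE];

void bump_all(void)
{
    int i;
    for (i = 0; i < ARRAY_SIZE; ++i) {
        ++a[i];
        ++b[i];
    }
}

Change the #define and every loop bound follows, whichever way the loops
are arranged.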

> Abstraction is a trade-off of maintainability and readability vs. code
> speed.

The above example has little to do with abstraction either way. The more
general case would be, how does one code large loop constructs. This is
one case where understanding the underlying concepts does help.

> If, *and only if*, there turned out to be a performance problem, I'd
> probably start by unrolling the loop in test1, possibly getting a
> much greater performance increase.

I'm not sure unrolling a ten-fold iteration would do much. Ten times a
really tiny bit is still a pretty tiny bit. The key here is that a
programmer who already understands BOTH the high- and the low-level
would naturally write it the "right" way first. In the real world, doing
it right the first time counts for a lot.

> Further, it is obvious in test1 that the intent is to do the same thing
> to each array.

As written, it's obvious in both cases.

> The problem with most slow code is not microoptimizations like this; it's
> vast inefficiencies at the basic design level.

Absolutely.

--
Chris Sonnack  <cjsonnack@mmm.com>                  http://eishcq.mmm.com
Engineering Information Services/Information Technology/3M, St.Paul, Minn
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
It takes a long time to understand nothing.

Opinions expressed herein are my own and may not represent those of my employer.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-08  0:00                                         ` Peter Seebach
  1996-08-08  0:00                                           ` Tim Behrendsen
@ 1996-08-09  0:00                                           ` Chris Sonnack
  1 sibling, 0 replies; 688+ messages in thread
From: Chris Sonnack @ 1996-08-09  0:00 UTC (permalink / raw)



Peter Seebach (seebs@solutions.solon.com) wrote:

> I think algorithms should be taught on paper.  *Away* from implementations
> on computer.
>
> Believe me, you understand quicksort much better after *DOING* it than you
> can from writing any reasonable number of implementations.

Absolutely! Without qualification.

And:

> ...once they showed you *the algorithm*, you could implement it however
> you want.

Yep.

> It's worth observing good programmers rewriting.  A good programmer will
> add features and cut the size of a program slightly; a bad programmer will
> bloat it hugely to add the feature.  (Assuming the original program to be
> mediocre.)

This is so true. And it applies to one's own work. Most programs of any
size I write follow a time/size curve, getting larger as the features
are implemented. But at some point I begin to //truly// understand the
solution the program represents. This, to my mind and experience, only
comes from living with the problem and in the code for a time; a long
time. All the design analysis in the world never seems to quite touch
the heart of the matter. It's the wedding between the concrete problem
and the concrete mechanism (language) that births the elegant solution.

Once I hit that point, the code size begins to decrease and the
solution starts to become truly elegant.

ObTopic:

I disagree students should START with assembly (although I did and it's
served me well). I do feel that any programmer who really wants to be
called a true craftsperson should know AS MUCH AS POSSIBLE about their
craft. I think there is much to be gained from looking at problems and
solutions from as many points of view, high and low, as possible.

Hell, I think it's self-evident.

--
Chris Sonnack  <cjsonnack@mmm.com>                  http://eishcq.mmm.com
Engineering Information Services/Information Technology/3M, St.Paul, Minn
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
TODAY'S RULE: No Smoffing or Fnargling!

Opinions expressed herein are my own and may not represent those of my employer.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-07  0:00       ` Stephen M O'Shaughnessy
@ 1996-08-09  0:00         ` Bob Gilbert
  0 siblings, 0 replies; 688+ messages in thread
From: Bob Gilbert @ 1996-08-09  0:00 UTC (permalink / raw)



In article <Dvro5K.MKo@most.fw.hac.com>, smosha@most.fw.hac.com (Stephen M O'Shaughnessy) writes:
> In article <4u7fol$26s@zeus.orl.mmc.com>, rgilbert@unconfigured.xvnews.domain says...
> >
> >Why can't we learn both at the same time?  
> >
> >When it came to learning computer science I think I tended to learn
> >both at the same time.  I took basic EE courses and learned about
> >operating transistors in saturation, how to build flip-flop circuits,
> >and how to implement logic using these circuits, and finally how to
> >design a computer architecture using these circuits (including micro-
> >code design).  At the same time I was learning PL/I programming, how
> >to write bubble sorts, learning about the merits of structured 
> >programming, top-down design methods, various data structures, data 
> >base design, discrete mathmatics, ect.  This was overlapped and 
> >followed with learning assembly, state machine theory, Turing machines,
> >general compiler (language) theory, and so forth.  Somewhere in my 
> >second senior year it all started to come together and make sense, 
> >not necessarily with a sudden turning on of the light of understanding,
> >but a gradual turning up of the dimmer switch.
> >
> That must have been one hell of a lecture.  By *same time* I meant SAME TIME.  Some
> of your above mentioned topics must have been learned, and mastered, before the
> others. 

As I reflect back on the required course work, it seems that there were
probably as many as three distinct threads to be pursued in parallel 
(looking at the upper-level courses).  One started from the hardware 
(EE) viewpoint and worked it's way up (transistors, flip-flops, logic
circuits, computer architecture, micro-code, assembly), one started at 
a high (more abstract) level and worked down (PL/I programming, structured
design, data structures, data base design, etc.), and one started somewhere
in the middle and worked out (assembly, Turing machines, compiler design,
language theory, etc.).  The point is, the curriculum was providing both 
top down and bottom up (and maybe some other, shotgun?) approaches to 
teaching the entire subject matter.  Certainly some courses required that
certain prerequisites be met, but my definition of "same time" is that I was 
taking introductory PL/I programming courses in the same semester that I
was taking digital design courses, the following semester might include 
courses in computer architecture concurrent with a data structures class,
and so on.  In one path you learned about roots, bark, and leaves, and these
came together to make up trees, and you can then take a bunch of trees and 
make a forest.  The other path started out teaching you about the forest 
and showed that it was made up of a bunch of trees.....  Both paths were
pursued in parallel.
 
> Care to comment on what you mean by second senior year?

It could be that there are those of us whose dedication to the pursuit of
higher learning and obtaining a professional degree is not all that great
(or at least wasn't when we were young and easily distracted) or we lacked
the necessary motivation to complete a degree program in the usual time 
frame, or.....  we unwittingly pursued other degrees, like mechanical 
engineering, before discovering our preference for computer science and 
thus spent extra time taking additional course work not required for a 
computer science degree.  I'm not telling which, if either, it is :-) .

-Bob






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-07  0:00                                   ` Tim Behrendsen
                                                       ` (2 preceding siblings ...)
  1996-08-08  0:00                                     ` David Weller
@ 1996-08-09  0:00                                     ` Bob Gilbert
  1996-08-10  0:00                                       ` Tim Behrendsen
  3 siblings, 1 reply; 688+ messages in thread
From: Bob Gilbert @ 1996-08-09  0:00 UTC (permalink / raw)



In article <01bb846c$e51df220$87ee6fce@timpent.airshields.com>, "Tim Behrendsen" <tim@airshields.com> writes:
> 
> Not to get into a debate on the meaning of abstraction, but the
> point is that there is very little hidden from the student when
> they are learning assembly.  This allows them to better concentrate
> on the basics of algorithms, because they are not distracted by syntax.

Yeah, but they are distracted by learning the computer architecture.
When I was first introduced to programming in assembly, the learning
of algorithms was not the purpose of the course.  What was learned
from programming in assembly was the underlying architecture, what
registers were, how memory was addressed, etc.

> Of course, but I'm talking about abstractions of assembly, i.e.,
> HLLs.  Remember, C (or any HLL) does not really exist as far as
> the computer knows.  Assembly is the direct raw instruction set of
> the physical machine.  If the student is learning algorithms in
> assembly, they are unquestionably learning the algorithm, and not
> just some vague concept wrapped in 10 layers of wool.

I thought assembly was an abstraction of the raw instruction set, not
exactly the raw instruction set.  After all, the assembler allows one
to abstract memory locations by assigning names or labels to them, it 
abstracts the instructions by assigning shorthand mnemonics to them, 
many allow the abstraction of code fragments which might perform some
higher level function by allowing the programmer to implement them as
macros, etc.

There were two or three concepts that were emphasised through the teaching
of assembly.  First was learning the underlying computer architecture, its
registers, how it addressed and used memory, how it executed the instructions.
Second was learning the purpose and usefulness of an assembler, and what an
assembler was.  We learned about one/two/and multi-pass assemblers, how 
assemblers made it easier to program a computer through abstraction (try 
generating the bit patterns to be stored in memory for a small program by
hand).  Our primary project for the class was to code in assembly a two pass
assembler which implemented a subset of the instructions for the machine 
(some Harris mini-computer the school had).  The only specific 
algorithms we were taught dealt with some of specific design issues that 
came about when attempting to implement certain aspects of the assembler, 
such as hashing functions for storing/retrieving entries in a symbol table, etc.
Come to think of it, these algorithms were presented to us and studied using 
some higher level pseudo code.  While we did implement them in assembly, I 
really don't think that exercise helped us to understand the algorithms any
better.

-Bob







^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-09  0:00                               ` Stephen M O'Shaughnessy
@ 1996-08-09  0:00                                 ` Tim Behrendsen
  0 siblings, 0 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-09  0:00 UTC (permalink / raw)



Stephen M O'Shaughnessy <smosha@most.fw.hac.com> wrote in article
<DvvG3E.3sy@most.fw.hac.com>...
> In article <01bb85af$e83cf2a0$32ee6fce@timhome2>, tim@airshields.com
says...
> 
> >I think a more correct analogy is the current and voltage
> >are the underlying hardware, and the tools are assembly.
> >HLLs are like prefab walls you slap together to build the
> >house.  Now, you can build a lot of houses that way, and
> >they will be generally be built well.  But if you want to
> >understand how houses are put together, you have to go to the
> >fundamental tools, with an understanding of structural
> >dynamics.
> >
> >-- Tim Behrendsen (tim@airshields.com)
> 
> I still have trouble with your definition of assembly.  An assembly
language
> is just the one for one translation of the machine code.  For the 8051
> 
> mov 90h,#0ffh is the machine code 75 90 FF
> 
> This is directly the voltages existing in the hardware (bits).  In this sense
> assembly is the hardware.  My position is that I don't need to know 
> hardware specifics to learn algorithms and programming.  Certainly I
don't
> need that understanding FIRST.  And my boss, customer, 
> or whomever, does not care about my understanding as long as I can 
> deliver a well built product.  Which, as you pointed out above, I can do.

It's not a question of whether you "need to know" assembly to
learn, it's a question of what's best.  In my experience, CS
students are being graduated without the necessary skills to build
"well built products".  We have the world you're describing,
and it's not working.  CS graduates quite simply are not being
taught to think like programmers, that is, in an efficient
procedural way.

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-08  0:00                                       ` Tim Behrendsen
  1996-08-08  0:00                                         ` Peter Seebach
  1996-08-08  0:00                                         ` telnet user
@ 1996-08-09  0:00                                         ` Mike Rubenstein
  1996-08-09  0:00                                           ` Tim Behrendsen
  2 siblings, 1 reply; 688+ messages in thread
From: Mike Rubenstein @ 1996-08-09  0:00 UTC (permalink / raw)



"Tim Behrendsen" <tim@airshields.com> wrote:

> 
> 
> Peter Seebach <seebs@solutions.solon.com> wrote in article
> <4uah1k$b2o@solutions.solon.com>...
> > In article <01bb83f5$923391e0$87ee6fce@timpent.airshields.com>,
> > Tim Behrendsen <tim@airshields.com> wrote:
> > >I agree; my point is that I think the student learns more if they
> > >are thinking purely in terms of fundamental operations (which are
> > >still abstractions above the raw hardware components), rather
> > >than layers and layers of syntax that hide the essential essence
> > >of what the computer is doing.
> > 
> > But your concept of what the "fundemental operations" are is completely
> tied
> > up in how specific hardware you've seen operates.  Algorithms exist in
> terms
> > of abstract operations, not moves and branches.
> 
> Please name me the hardware that does not use moves and branches.
> 
> > Something like CWEB might be a good language in which to learn
> algorithms.
> > Scheme might too.
> > 
> > Let's look at a specific algorithm; the infamous gcd(x, y).
> > 
> > In C, we write
> > 
> > 	int
> > 	gcd(int x, int y) {
> > 		if (y == 0)
> > 			return x;
> > 		return gcd(y, (x % y));
> > 	}
> > 
> > or something similar.
> > 
> > What's important here is not any theories people may have about where the
> x
> > and y are stored, or how a stack works, but the concept that you can
> define an
> > algorithm in terms of a previous case.  Learning that the if() may be
> > implemented with "a branch" does not help the student understand how the
> > *algorithm* works, it helps the student understand how the *compiler*
> works.
> > These are distinct things.
> 
> This is an interesting case, because it is somewhat inefficently
> implemented.  If you're interested in speed, you would do...
> 
> int gcd(int x, int y) {
>     int z;
>     while (y != 0)
>         z = y; y = x % y; x = z;
>     return(y);
> }

I don't understand how going into an infinite loop when y != 0
improves the speed?  I guess I cut class that day.

Nor do I see how dividing by 0 (when y == 0) is an improvement?
Perhaps I just cut too many classes.

Let's review the bidding.  Peter gives an algorithm.  You claim that
knowledge of assembly helps one understand it better and produce a
program that goes into an infinite loop if y is not zero and divides
by 0 if y is zero.

Perhaps you meant

	int gcd(int x, int y) {
	    int z;
	    while (y != 0)
	        { z = y; y = x % y; x = z; }
	    return(y);
	}

This avoids the infinite loop, but can be made even more efficient:

	int gcd(int x, int y)
	{
	  return 0;
	}

Of course this isn't quite equivalent to Peter's function.

I guess, like Peter, I'm a little thick today.  Please explain again
how knowing assembly language helps one understand algorithms?  I got
a little confused when you transformed Peter's correct, but slightly
inefficient function into an infinite loop.


Michael M Rubenstein




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-08  0:00         ` Szu-Wen Huang
  1996-08-08  0:00           ` Tim Behrendsen
  1996-08-09  0:00           ` some days weren't there at all
@ 1996-08-10  0:00           ` Mike Rubenstein
  1996-08-11  0:00             ` Szu-Wen Huang
  1996-08-17  0:00             ` Richard Chiu
  2 siblings, 2 replies; 688+ messages in thread
From: Mike Rubenstein @ 1996-08-10  0:00 UTC (permalink / raw)



huang@mnsinc.com (Szu-Wen Huang) wrote:

> Tim Behrendsen (tim@airshields.com) wrote:
> : Szu-Wen Huang <huang@mnsinc.com> wrote in article
> : <4ubnhr$714@news1.mnsinc.com>...
> 
> : > 1.  Being able to write algorithm X in assembly doesn't make you
> : >     understand the algorithm better.  It makes you understand whatever
> : >     platform du jour your school uses better.
> : > 2.  Vaguely understanding 5 algorithms is far better than understanding
> : >     one or two fully.  The primary objective of algorithms courses is
> : >     to produce students that can *choose* which algorithm to use,
> : >     not to produce students who can memorize one or two algorithms.
> : >     In fact, there's rarely a programmer more dangerous than one that
> : >     has no broad knowledge of algorithms.
> 
> : My point is someone who has "vaguely" learned five algorithms has
> : simply memorized them, and learned nothing about the general
> : principles of how algorithms are created.
> 
> No, your point was writing an algorithm in assembly helps to understand
> it fully.  I'll be my own case in point.  I remember most sorting
> algorithms vaguely, and I probably could not implement all of them
> off the top of my head.  Does that hamper me?  No, because I know
> which book to look them up in, and that's exactly what books are for.
> The important thing, however, is that I remember that O(n^2) is bad
> for a sorting algorithm, and O(n lg n) is pretty good.

But that is not always true.

A number of years ago I developed a program that had to do a large
number of sorts with the following characteristics:

	1.  The mean number of items to be sorted was about 5.  In a 
	    test sample of a million cases, the larges number of items
	    to be sorted was 35.

	2.  The items were usually in order.  In the test sample, 90% 
	    were in order, and most of the rest were in order except 
	    for a single interchange of adjacent items.  Only 8 were 
	    out of order by more than three interchanges or required 
	    interchanges of nonadjacent items.

Care to try this with quicksort?  or heapsort?  A good old O(n^2)
insertion sort works quite nicely.
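For reference, the kind of straight insertion sort meant here is only a
few lines of C, and on input that is already in order the inner loop
exits immediately, so the cost is close to n comparisons.  (This is a
sketch for illustration, not the code from the program described above.)

/* Straight insertion sort: cheap on short, nearly-sorted input. */
void insertion_sort(int *v, int n)
{
    int i, j, key;

    for (i = 1; i < n; i++) {
        key = v[i];
        for (j = i - 1; j >= 0 && v[j] > key; j--)
            v[j + 1] = v[j];         /* shift larger items up one slot */
        v[j + 1] = key;
    }
}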


Michael M Rubenstein




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-08  0:00         ` Peter Seebach
  1996-08-08  0:00           ` Tim Behrendsen
@ 1996-08-10  0:00           ` Mike Rubenstein
  1996-08-10  0:00             ` Peter Seebach
  1996-08-11  0:00             ` Craig Franck
  1 sibling, 2 replies; 688+ messages in thread
From: Mike Rubenstein @ 1996-08-10  0:00 UTC (permalink / raw)



seebs@solutions.solon.com (Peter Seebach) wrote:

> In article <01bb8536$892ee260$87ee6fce@timpent.airshields.com>,
> Tim Behrendsen <tim@airshields.com> wrote:
> >My point is someone who has "vaguely" learned five algorithms has
> >simply memorized them, and learned nothing about the general
> >principles of how algorithms are created.
> 
> Which is exactly what will happen if they code them in assembly; they
> can't possibly be learning how algorithms are created if they start with
> implementing them in a low level language.  Algorithms are generally written
> in natural languages, and written on napkins.

Are you suggesting that the guy who invented the GCD algorithm you
used didn't know assembly language? :-)


Michael M Rubenstein




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-09  0:00                                           ` Tim Behrendsen
@ 1996-08-10  0:00                                             ` Mike Rubenstein
  1996-08-12  0:00                                               ` Tim Behrendsen
  0 siblings, 1 reply; 688+ messages in thread
From: Mike Rubenstein @ 1996-08-10  0:00 UTC (permalink / raw)



"Tim Behrendsen" <tim@airshields.com> wrote:

> Mike Rubenstein <miker3@ix.netcom.com> wrote in article
> <320b35a2.43769707@nntp.ix.netcom.com>...
> > "Tim Behrendsen" <tim@airshields.com> wrote:
> > > Peter Seebach <seebs@solutions.solon.com> wrote in article
> > > <4uah1k$b2o@solutions.solon.com>...
> > > > In C, we write
> > > > 
> > > > 	int
> > > > 	gcd(int x, int y) {
> > > > 		if (y == 0)
> > > > 			return x;
> > > > 		return gcd(y, (x % y));
> > > > 	}
> > > > 
> > > This is an interesting case, because it is somewhat inefficently
> > > implemented.  If you're interested in speed, you would do...
> > > 
> > > int gcd(int x, int y) {
> > >     int z;
> > >     while (y != 0)
> > >         z = y; y = x % y; x = z;
> > >     return(y);
> > > }
> > 
> > I don't understand how going into an infinite loop when y != 0
> > improves the speed?  I guess I cut class that day.
> > 
> > Nor do I see how dividing by 0 (when y == 0) is an improvement?
> > Perhaps I just cut too many classes.
> > 
> > Let's review the bidding.  Peter gives an algorithm.  You claim that
> > knowledge of assembly helps one understand it better and produce a
> > program that goes into an infinite loop if y is not zero and divides
> > by 0 if y is zero.
> > 
> > Perhaps you meant
> > 
> > 	int gcd(int x, int y) {
> > 	    int z;
> > 	    while (y != 0)
> > 	        { z = y; y = x % y; x = z; }
> > 	    return(y);
> > 	}
> > 
> > This avoids the infinite loop, but can be made even more efficient:
> > 
> > 	int gcd(int x, int y)
> > 	{
> > 	  return 0;
> > 	}
> > 
> > Of course this isn't quite equivalent to Peter's function.
> > 
> > I guess, like Peter, I'm a little thick today.  Please explain again
> > how knowing assembly language helps one understand algorithms?  I got
> > a little confused when you transformed Peter's correct, but slightly
> > inefficient function into an infinite loop.
> 
> Yes, you're right; my typo proves all my arguments false.  How can I
> have been such a fool?

Perhaps because you think like an assembly language programmer --
there's an enormous advantage to clear code.  Efficiency may, or may
not be important.  Yet you assumed that it was worthwhile to make
Peter's code more efficient.  This is the assembly language
programmer's disease.

If I were doing a lot of gcd calculations, I'd certainly try to
optimize the program.  But in most applications very little of the
code is time critical.  Where it is not, clear code wins over
efficient code.

Furthermore, when teaching algorithms Peter's code is what you want,
at least for the first cut.  It shows a general technique of algorithm
design, reducing a problem to a similar but easier one, that your
code, even if written correctly, hides.

It's near September, and I feel safe in predicting that we will soon
see quite a few posts asking how to solve the towers of Hanoi puzzle.
Those who understand Peter's formulation of Euclid's algorithm will do
better than those who only know yours.
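The parallel is worth spelling out.  The standard Hanoi solution is the
same "reduce the problem to a smaller instance of itself" pattern that
Peter's recursive gcd shows; here is a sketch for illustration (not code
from the thread):

#include <stdio.h>

/* Move n disks from peg 'from' to peg 'to', using 'via' as the spare. */
void hanoi(int n, char from, char to, char via)
{
    if (n == 0)
        return;                          /* nothing left to move */
    hanoi(n - 1, from, via, to);         /* clear the n-1 smaller disks */
    printf("move disk %d: %c -> %c\n", n, from, to);
    hanoi(n - 1, via, to, from);         /* stack them back on top */
}

Just as with gcd, an iterative version exists, but it hides the
reduction step that the recursive one makes obvious.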


Michael M Rubenstein




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-05  0:00                                     ` Tim Hollebeek
@ 1996-08-10  0:00                                       ` Mike Rubenstein
  0 siblings, 0 replies; 688+ messages in thread
From: Mike Rubenstein @ 1996-08-10  0:00 UTC (permalink / raw)



tim@franck (Tim Hollebeek) wrote:

> Any C programmer should know a string literal is a *POINTER TO* a real
> object in memory.  I've seen tons of bad C code based on not
> understanding this.  The famous 'if (x == "foo")' is just one example.

This is just not true.  A string literal is an array of char, not a
pointer.  In many situations it is converted to a pointer, but there
are some where it is not.  The "C programmer" who thinks a string
literal is a pointer is not going to understand why

	#include <stdio.h>

	int main(void)
	{
	  char* a;
	  a = "hello world";
	  if (sizeof a == sizeof "hello world")
	    printf("no\n");
	  else
	    printf("yes\n");
	  return 0;
	}

probably prints yes rather than no.

Michael M Rubenstein




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-10  0:00           ` Mike Rubenstein
@ 1996-08-10  0:00             ` Peter Seebach
  1996-08-11  0:00             ` Craig Franck
  1 sibling, 0 replies; 688+ messages in thread
From: Peter Seebach @ 1996-08-10  0:00 UTC (permalink / raw)



In article <320bf84e.9269979@nntp.ix.netcom.com>,
Mike Rubenstein <miker3@ix.netcom.com> wrote:
>Are you suggesting that the guy who invented the GCD algorithm you
>used didn't know assembly language? :-)

Actually, the Greeks had an unfair advantage, in that they learned math as
children.  In fact, *all* of their texts are math texts, consisting of
sequences of variables being multiplied.  At least, that's what I see when
I try to read 'em.  Not sure *why* they never use anything but the sorta
weird ones, but I guess it works for them.  I woulda used more x's and
n's, but hey.

-s
-- 
Peter Seebach - seebs@solon.com - Copyright 1996 - http://www.solon.com/~seebs
Unix/C Wizard - send mail for help, or send money for consulting!
The *other* C FAQ, the hacker FAQ, et al.  See web page above.
Unsolicited email (junk mail and ads) is unwelcome, and will be billed for.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-08  0:00                                     ` Teaching sorts [was Re: What's the best language to start with?] Robert I. Eachus
  1996-08-09  0:00                                       ` Robert Dewar
  1996-08-10  0:00                                       ` Lawrence Kirby
@ 1996-08-10  0:00                                       ` Al Aab
  1996-08-12  0:00                                       ` Steve Heller
  1996-08-14  0:00                                       ` Stephen Baynes
  4 siblings, 0 replies; 688+ messages in thread
From: Al Aab @ 1996-08-10  0:00 UTC (permalink / raw)




i did 
enjoy your post
professor


-- 





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-09  0:00                                           ` Chris Sonnack
@ 1996-08-10  0:00                                             ` Tim Behrendsen
  1996-08-11  0:00                                               ` Dan Pop
  1996-08-11  0:00                                               ` Chris Sonnack
  0 siblings, 2 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-10  0:00 UTC (permalink / raw)



Chris Sonnack <cjsonnack@mmm.com> wrote in article
<4ug5u5$kha@dawn.mmm.com>...

> They're both pretty clear. And any real programmer knows rule #27: "There
> should be no constants in your code except the numbers 1 and 0, and you
> should view those with suspicion."

I would say, "There should be no constants in your code except 0.  Tests
should be less than, equal, greater than, or not equal 0.  Otherwise,
it better involve a symbol."
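Something along these lines (the names are illustrative only):

#define MAX_RETRIES 5                  /* named, so it changes in one place */

int keep_trying(int retries, int count)
{
    if (retries >= MAX_RETRIES)        /* a bare 5 here would break the rule */
        return 0;
    while (count != 0)                 /* comparing against 0 is the exception */
        --count;
    return 1;
}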

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-09  0:00               ` Tim Behrendsen
@ 1996-08-10  0:00                 ` Szu-Wen Huang
  1996-08-11  0:00                   ` Tim Behrendsen
  0 siblings, 1 reply; 688+ messages in thread
From: Szu-Wen Huang @ 1996-08-10  0:00 UTC (permalink / raw)



Tim Behrendsen (tim@airshields.com) wrote:
[snip]
: But, you're right that this has nothing to do with assembly.
: The question is how to have students get a feel for how
: algorithms actually work in the real world, and there is no
: substitute for working with the real computer.

Maybe we could just short-circuit this prolonged discussion to why
HLLs cannot help you teach students and why assembly is necessary.

: > Data movement, sir, is not restricted to assembly language.
: > Assembly language, in fact, impedes the understanding of data
: > movement because of all the extraneous things that have to be
: > done to make the program work.

: What extraneous things?  I don't understand why you think it's
: so enormously complicated to use assembly.  Programmers used
: to do it all the time, and they somehow managed to write
: programs without suffering mental breakdowns.

I didn't say "enormously complicated".  I said, "extraneous".
Knowledge a beginner does not *need* is likely to confuse and
impede more than help.

: > : What we're doing now is *sure* not working.

: > Because some students graduate without understanding algorithm
: > complexity, not because they graduate without learning assembly
: > language.

: I agree with the first, but the reason they are not "getting"
: algorithm complexity is because we're shrouding it in all
: this abstract mystery.

That's your assertion.  An equally valid one is as I explained,
that the least experienced instructors get assigned to beginners.

[snip]
: But we can learn from the best programmers about how their
: thought processes work.  Again, the most important thing is
: to get the students "thinking like programmers".

An expert programmer knows when rules can be broken.  Beginners
do not have the knowledge to make that decision.  You simply cannot
think like an expert until you are one.

[snip]
: Don't get me started about debuggers... I think in certain
: ways debuggers have lowered productivity rather than
: raised it.

Precisely.  Giving the beginner too much impedes them rather than
develops them.

[snip]
: I just want students to get a better feel for what the computer
: really does.

And the risk, as we pointed out, is that the percentage of
attention they spend on the algorithm itself is actually lowered.

: Because the student is thinking about compression, rather than
: being spoon-fed the information from a book or the teacher.
: When they get into the real world, this thought process will
: stay with them when they begin to need to break down much
: more complex problems into algorithms.

Babies are spoon-fed for a very good reason, and that is because
they don't know what to eat.  We don't send babies into the real
world, just as we don't (hopefully) send beginners into the real
world.  A 15 year old might be advised that premarital sex is
"bad" because of responsibilities, diseases, et al, but an 8 year
old that asked is probably just answered with "it's bad."  You
confuse the 8 year old when you start talking about AIDS.

[snip]
: Using library routines to do simple I/O tasks does not take
: away from the learning experience of the algorithm.  Somehow
: input and output have to happen, and I see no reason to burden
: the student with problems that are not relevant to what they
: are trying to learn.

Precisely the reason to stay away from assembly.

[snip]
: > You still haven't proven why assembly is necessary, or at least
: > better.

: Again, I go back to the fact that what we're doing now is
: not working, based on my experience (and you admitted to
: above).  What are we doing now?  We are teaching students at
: an extremely high level of abstraction.

That's a false dichotomy.  Ineffective teaching of HLLs does not
prove that assembly will be effective.

[snip]
: Not to be repetitious with this example, but I look at how EEs
: are taught.  They begin by learning the fundamental components;
: resistors, capacitors, Ohm's law, etc.  They seem to me to be
: much better prepared to do real work than CS graduates, and I
: think it's because of the early emphasis on the fundamentals.

Another false example.  I had two EE courses as a CS undergrad,
and we skipped most of that without hampering my ability to do
logic design.  In fact, my CS education was far more useful to
me when working with gates than my EE education.  I didn't need
to know how transistors made up a gate in order to assemble
several gates into an adder.  In fact, I had forgotten totally
how transistors work, yet I can still build an adder if you tell
me which symbol means NAND and NOR (the fundamentals).
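
(As a toy illustration of that last point, the gates can be modelled as C
functions, with NAND as the only primitive -- a sketch only, not anything
from a course:)

static int nand(int a, int b) { return !(a && b); }

static int xor2(int a, int b)
{
    int t = nand(a, b);
    return nand(nand(a, t), nand(b, t));
}
static int and2(int a, int b) { return nand(nand(a, b), nand(a, b)); }
static int or2(int a, int b)  { return nand(nand(a, a), nand(b, b)); }

/* One-bit full adder; inputs are 0 or 1. */
void full_adder(int a, int b, int cin, int *sum, int *cout)
{
    int s1 = xor2(a, b);

    *sum  = xor2(s1, cin);
    *cout = or2(and2(a, b), and2(s1, cin));
}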




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-09  0:00                                     ` Bob Gilbert
@ 1996-08-10  0:00                                       ` Tim Behrendsen
  1996-08-11  0:00                                         ` Craig Franck
  1996-08-11  0:00                                         ` Peter Seebach
  0 siblings, 2 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-10  0:00 UTC (permalink / raw)



Bob Gilbert <rgilbert@unconfigured.xvnews.domain> wrote in article
<4ug4eh$qn8@zeus.orl.mmc.com>...
> In article <01bb846c$e51df220$87ee6fce@timpent.airshields.com>, "Tim Behrendsen" <tim@airshields.com> writes:
> > 
> > Not to get into a debate on the meaning of abstraction, but the
> > point is that there is very little hidden from the student when
> > they are learning assembly.  This allows them to better concentrate
> > on the basics of algorithms, because they are not distracted by syntax.
> 
> Yeah, but they are distracted by learning the computer architecture.
> When I was first introduced to programming in assembly, the learning
> of algorithms was not the purpose of the course.  What was learned
> from programming in assembly was the underlying architecture, what
> registers were, how memory was addressed, etc.
> 
> > Of course, but I'm talking about abstractions of assembly, i.e.,
> > HLLs.  Remember, C (or any HLL) does not really exist as far as
> > the computer knows.  Assembly is the direct raw instruction set of
> > the physical machine.  If the student is learning algorithms in
> > assembly, they are unquestionably learning the algorithm, and not
> > just some vague concept wrapped in 10 layers of wool.
> 
> I thought assembly was an abstraction of the raw instruction set, not
> exactly the raw instruction set.  After all, the assembler allows one
> to abstract memory locations by assigning names or labels to them, it 
> abstracts the instructions by assigning shorthand mnemonics to them, 
> many allow the abstraction of code fragments which might perform some
> higher level function by allowing the programmer to implement them as
> macros, etc.

Well, technically you're right but there is a one-to-one
correspondence between assembly mnemonics and machine language,
so there is no practical difference between the two, as far
as understanding the machine.

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-08  0:00                                     ` Teaching sorts [was Re: What's the best language to start with?] Robert I. Eachus
  1996-08-09  0:00                                       ` Robert Dewar
@ 1996-08-10  0:00                                       ` Lawrence Kirby
  1996-08-10  0:00                                       ` Al Aab
                                                         ` (2 subsequent siblings)
  4 siblings, 0 replies; 688+ messages in thread
From: Lawrence Kirby @ 1996-08-10  0:00 UTC (permalink / raw)



In article <EACHUS.96Aug8155946@spectre.mitre.org>
           eachus@spectre.mitre.org "Robert I. Eachus" writes:

>    *The race really is to finish the Quicksort before the Heapsorter
>has built a heap. This also shows why quicksort is quick, since the
                                                                 ^^^
>Quicksort step of picking up the cards is O(n), not O(n log n)
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

What does this phrase mean? Quicksort is an (n log n) (average time)
operation. Building a heap is an O(n) operation.
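
(For reference, the O(n) bound comes from building the heap bottom-up.  A
minimal C sketch of that construction -- illustrative only, not code from
any of the posts:)

static void sift_down(int a[], int n, int i)
{
    for (;;) {
        int largest = i, l = 2*i + 1, r = 2*i + 2;

        if (l < n && a[l] > a[largest]) largest = l;
        if (r < n && a[r] > a[largest]) largest = r;
        if (largest == i) break;
        { int t = a[i]; a[i] = a[largest]; a[largest] = t; }
        i = largest;
    }
}

void build_heap(int a[], int n)
{
    int i;

    /* Sift down every internal node, from the last one up to the root. */
    for (i = n/2 - 1; i >= 0; --i)
        sift_down(a, n, i);
}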

-- 
-----------------------------------------
Lawrence Kirby | fred@genesis.demon.co.uk
Wilts, England | 70734.126@compuserve.com
-----------------------------------------




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-08  0:00                   ` William Clodius
@ 1996-08-11  0:00                     ` Dik T. Winter
  1996-08-11  0:00                     ` Fergus Henderson
  1 sibling, 0 replies; 688+ messages in thread
From: Dik T. Winter @ 1996-08-11  0:00 UTC (permalink / raw)



In article <g1sp9xtkmd.fsf@hotspec.lanl.gov> clodius@hotspec.lanl.gov (William Clodius) writes:
 > There is one special case of recursion, fortunately the most commonly
 > encountered case, where it is "ridiculously" easy for a compiler to
 > see that the recursion can be replaced by iteration.
...
Etc.  An interesting thread, but I wish to add that I know of at least
one system where a recursive algorithm was *significantly faster* than
the equivalent one with tail recursion removed.

But talking about that "gcd" thingy.  I tested and found the following
on two different architectures (times are seconds for a complete program):

		recursive	non-recursive

arch1, non-opt	0.52		0.42
arch1, opt	0.43		0.41

arch2, non-opt	2.65		0.44
arch2, opt	0.39		0.42

Now, what does this prove?  Absolutely nothing (except that there is
something strange with the non-optimization on arch2).  And what are
we talking about anyway?  The call was:
	gcd(2971215073LU, 1836311903LU);
(and if you do not understand why I used those numbers, those are the
absolutely worst case for 32-bit numbers; Fibonacci, you know).
That was the call and it was executed 10,000 times.  So we are talking
about a routine taking some tens of microseconds.  If that would form
a large part of a total program (large enough to make a 10% improvement
worthwhile), I would bother more about the program.
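
(The code itself isn't quoted above; for concreteness, the obvious recursive
and iterative forms would be something like the following sketch, using
unsigned long as in the call.)

unsigned long gcd_rec(unsigned long a, unsigned long b)
{
    return b == 0 ? a : gcd_rec(b, a % b);
}

unsigned long gcd_iter(unsigned long a, unsigned long b)
{
    while (b != 0) {
        unsigned long t = a % b;
        a = b;
        b = t;
    }
    return a;
}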

Somebody used the word "microoptimization"; how right he/she was.
-- 
dik t. winter, cwi, kruislaan 413, 1098 sj  amsterdam, nederland, +31205924098
home: bovenover 215, 1025 jn  amsterdam, nederland; http://www.cwi.nl/~dik/




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-10  0:00           ` Mike Rubenstein
  1996-08-10  0:00             ` Peter Seebach
@ 1996-08-11  0:00             ` Craig Franck
  1 sibling, 0 replies; 688+ messages in thread
From: Craig Franck @ 1996-08-11  0:00 UTC (permalink / raw)



miker3@ix.netcom.com (Mike Rubenstein) wrote:
>seebs@solutions.solon.com (Peter Seebach) wrote:
>
>> In article <01bb8536$892ee260$87ee6fce@timpent.airshields.com>,
>> Tim Behrendsen <tim@airshields.com> wrote:
>> >My point is someone who has "vaguely" learned five algorithms has
>> >simply memorized them, and learned nothing about the general
>> >principles of how algorithms are created.
>> 
>> Which is exactly what will happen if they code them in assembly; they
>> can't possibly be learning how algorithms are created if they start with
>> implementing them in a low level language.  Algorithms are generally written
>> in natural languages, and written on napkins.
>
>Are you suggesting that the guy who invented the GCD algorithm you
>used didn't know assembly language? :-)

Euclid was a mathematician, not a programmer :-)

I think the reason you should write algorithms in a natural language or even
pseudo-code (whether on a napkin or not) is that an algorithm is not
tied to any language. You implement the algorithm in a language, but
the algorithm itself has nothing to do with any one language. Our friend
Euclid would probably think it existed in the "realm of ideas". A lot
of books use a sort of pidgin Pascal to describe algorithms. Ideally
you could describe it in math, but of course the concept of an iterative
construct is very weak in mathematics. To understand the creation of
algorithms you need to learn structured reasoning, critical thinking
skills, and get a lot of simple algorithms under your belt before tackling
big ones. Figuring out how to print centered text on a monitor has
nothing to do with assembly language, even though you could code it that way.
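
(Centering a line, for instance, is a few statements in any HLL.  A minimal
C sketch, with the 80-column width purely an assumption:)

#include <stdio.h>
#include <string.h>

/* Print s centered in a field of the given width, e.g. 80 columns. */
void print_centered(const char *s, int width)
{
    int pad = (width - (int)strlen(s)) / 2;

    if (pad < 0) pad = 0;
    printf("%*s%s\n", pad, "", s);
}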

-- 

Craig
-----
clfranck@worldnet.att.net
Manchester, NH
There are no electrons...






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-10  0:00                                       ` Tim Behrendsen
@ 1996-08-11  0:00                                         ` Craig Franck
  1996-08-11  0:00                                           ` Tim Behrendsen
  1996-08-11  0:00                                         ` Peter Seebach
  1 sibling, 1 reply; 688+ messages in thread
From: Craig Franck @ 1996-08-11  0:00 UTC (permalink / raw)



"Tim Behrendsen" <tim@airshields.com> wrote:
>Bob Gilbert <rgilbert@unconfigured.xvnews.domain> wrote in article
>
>> I thought assembly was an abstraction of the raw instruction set, not
>> exactly the raw instruction set.  After all, the assembler allows one
>> to abstract memory locations by assigning names or labels to them, it 
>> abstracts the instructions by assigning shorthand mnemonics to them, 
>> many allow the abstraction of code fragments which might perform some
>> higher level function by allowing the programmer to implement them as
>> macros, etc.
>
>Well, technically you're right but there is a one-to-one
>correspondence between assembly mnemonics and machine language,
>so there is no practical difference between the two, as far
>as understanding the machine.
>

Really? Why stop there? With RISC microprocessors there is almost
a one-to-one correspondence between instructions and hardware, but
with CISC there is not. There are little programs written in
microcode that run when machine language instructions are executed.
Linked lists were primitives in the old VAX architecture. And why
stop there: to really understand computers you need to understand
digital electronics. NAND gates, flip-flops, inverters, etc. To
understand them you need a background in electronics. You need
to understand how transistors work. My point in all this is that you
reach a point of diminishing returns. Starting a class on algorithms
with a discussion of Ohm's law is ridiculous. Needing to know how
to move values into registers rather than just writing a = b + c;
is not as bad, but it's still not needed.
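
(The contrast I have in mind is roughly the one below; the instruction names
in the comment are made up for illustration, not any real machine's.)

int add_example(int b, int c)
{
    int a;

    a = b + c;   /* roughly: LOAD r1,b ; LOAD r2,c ; ADD r3,r1,r2 ; STORE a,r3 */
    return a;
}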


-- 

Craig
-----
clfranck@worldnet.att.net
Manchester, NH
There are no electrons...






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-09  0:00                                           ` J. Blustein
@ 1996-08-11  0:00                                             ` Peter Seebach
  0 siblings, 0 replies; 688+ messages in thread
From: Peter Seebach @ 1996-08-11  0:00 UTC (permalink / raw)



In article <4ufbmd$2e0@falcon.ccs.uwo.ca>,
J. Blustein <jamie@csd.uwo.ca> wrote:
>    Can we say that we should be concerned with issues of effectiveness
>instead of efficiency?  I know that, as programmers, we are pursuing elegance
>and that an important part of that is efficiency, but it is not the most
>important aspect.  Finally, there are outside concerns such as deadlines.

That's an interesting way of approaching it.  Certainly, for any given real
world task, a slow solution is likely to be better than none at all.  One of
the weaknesses of assembly is that it adapts poorly to sudden changes in
requirements.  Even the best written 80x86 assembly program will take a lot of
porting effort to run on an Alpha - more than all but the worst C programs.

In the end, you have to know what your requirements are, including long-term
maintenance, speed, reliability, and due date.  There will often be more than
one way to meet them, but more often there will be no way at all to meet them
all, and you must make sacrifices.

-s
-- 
Peter Seebach - seebs@solon.com - Copyright 1996 - http://www.solon.com/~seebs
Unix/C Wizard - send mail for help, or send money for consulting!
The *other* C FAQ, the hacker FAQ, et al.  See web page above.
Unsolicited email (junk mail and ads) is unwelcome, and will be billed for.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-10  0:00                                       ` Tim Behrendsen
  1996-08-11  0:00                                         ` Craig Franck
@ 1996-08-11  0:00                                         ` Peter Seebach
  1996-08-11  0:00                                           ` Tim Behrendsen
  1 sibling, 1 reply; 688+ messages in thread
From: Peter Seebach @ 1996-08-11  0:00 UTC (permalink / raw)



In article <01bb86f5$f7f8ae40$32ee6fce@timhome2>,
Tim Behrendsen <tim@airshields.com> wrote:
>Well, technically you're right but there is a one-to-one
>correspondence between assembly mnemonics and machine language,
>so there is no practical difference between the two, as far
>as understanding the machine.

Are you quite sure?  I'd assume that assemblers would let you enter constants
in at least the three commonly used bases, and I would be completely
unsurprised if there were occasionally two ways of spelling a given operation,
perhaps allowing for a distinction which no longer exists.

-s
-- 
Peter Seebach - seebs@solon.com - Copyright 1996 - http://www.solon.com/~seebs
Unix/C Wizard - send mail for help, or send money for consulting!
The *other* C FAQ, the hacker FAQ, et al.  See web page above.
Unsolicited email (junk mail and ads) is unwelcome, and will be billed for.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-09  0:00                                         ` Dan Pop
  1996-08-11  0:00                                           ` Tim Behrendsen
@ 1996-08-11  0:00                                           ` Mark Wooding
  1996-08-19  0:00                                             ` James Youngman
  1 sibling, 1 reply; 688+ messages in thread
From: Mark Wooding @ 1996-08-11  0:00 UTC (permalink / raw)



Dan Pop <Dan.Pop@cern.ch> wrote:
> In <01bb8534$b2718bc0$87ee6fce@timpent.airshields.com> "Tim Behrendsen" <tim@airshields.com> writes:
> 
> >For example, look at OOP.  Very naive implementations of OOP
> >used a huge amount of dynamic memory allocation, which can cause
> >severe performance problems.  That's why I don't use C++ for my
> >products; to do it right you have to do a very careful analysis
> >of how your classes are going fit together efficiently.
> >
> >I've spoken to enough people that have had C++ disasters to
> >convince me that the more abstraction there is, the more
> >danger there is of inefficient code.  This shouldn't be that
> >hard to believe; any time you abstract away details you are
> >giving up knowledge of what is going to be efficient.
> 
> Nonsense.  The efficiency of an implementation can be also thought in
> abstract terms, you don't need to know how the compiler works or
> assembly language in order to implement your application in an efficient
> way.  Or you may know all these irrelevant details and still write 
> inefficient code, because it's so much easier in many OOPLs.

Indeed.  But in the same way, if you use (the same) good algorithms in
the non-OOPy language, I suspect you'll end up with it going faster.
It's important to get your algorithms right, but it's also important to
get the right level of abstraction.

> More nonsense.  Unless your applications spend the bulk of their CPU
> time in the user interface, that should be a non-issue.

Applications do tend to be spending a suspiciously long time in their
user interfaces.  You're right that it /should/ be a non-issue, but it
doesn't appear to be.  Responsiveness of interfaces is rather important
as far as producing a good impression is concerned.

On a platform I've done a lot of work on, the main user interface kit
has wandered in the direction of being more abstract and HLL-based.  I
spotted this trend a couple of years ago, wrote a user interface library
entirely in assembler (it took two of us about a year); it ended up
being considerably more responsive, more featureful, and smaller by
about a factor of five.  This scared me quite a lot.

> And if the wrong algorithm has been chosen in the first place, no
> amount of micro-optimization will save the code performance.  The guy
> who can do a decent algorithm analysis (an entirely abstract
> operation) will always beat the one who is an expert assembly
> programmer but prefers to spend his time coding instead of dealing
> with abstractions.

An easy solution presents itself: do both.  Implementing a complicated
algorithm in assembler isn't much harder than implementing it in C (so
I've found), and maybe I'm just a bit odd, but my assembler code is
usually less buggy and easier to debug.  I'd only consider using a HLL
if development time is really tight, quality is totally unimportant,
portability is an issue or the instruction set in question is truly
horrid.

> This is an old story (it happened during the early to mid eighties)
> and I forgot the details.  One software company had a nice product for
> the PC, but it was rather slow.  To get the "best" performance, it was
> coded in assembly.  Another company decided to emulate that product
> and they did it quite successfully (their version is about 4 times
> faster).  They implemented it in C, but they've carefully chosen their
> algorithms.

I shan't stand here and defend assembler programmers who use the wrong
algorithms: that's silly.  But an assembler programmer who uses the
/right/ algorithms, now there's someone to look up to.

Why is it, by the way, that no-one here (apart from me) has mentioned
the issue of code size?  Does no-one care?  If it's accepted that the
performance of 90% of a program is not particularly relevant, why isn't
it written to save space?  Surely /this/ is the reason that programs are
too big.
-- 
[mdw]

`When our backs are against the wall, we shall turn and fight.'
		-- John Major





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-08  0:00                   ` William Clodius
  1996-08-11  0:00                     ` Dik T. Winter
@ 1996-08-11  0:00                     ` Fergus Henderson
  1 sibling, 0 replies; 688+ messages in thread
From: Fergus Henderson @ 1996-08-11  0:00 UTC (permalink / raw)



clodius@hotspec.lanl.gov (William Clodius) writes:

>There is one special case of recursion, fortunately the most commonly
>encountered case, where it is "ridiculously" easy for a compiler to
>see that the recursion can be replaced by iteration. The recursive
>code must call itself directly on only one branch of the control of
>flow and is the last call before returning on that control of
>flow. Conventionally it is placed on the last branch of the control of
>flow for clarity, hence the name "tail" recursion.

That isn't a sufficient condition; in C, it's also necessary for the
compiler to check that the function does not take the address of any local
variables.  (GNU C checks a slightly more general condition -- it only
does tail recursion optimization in functions which don't have any
variables stored on the stack at all.)

The requirement that it be only one branch of the control flow is
unnecessary.  In C, it should be just as "ridiculously" easy for the
compiler to optimize any `return' statements which return the result
of a recursive call to the containing function, regardless of the
control flow.  For example, in

	int f(int x) {
		if (x % 4 == 0) return f(x/3 - 1);
		if (x % 3 == 0) return f(x/2 - 1);
		...
	}

it should be easy enough to replace this with

	int f(int x) {
	entry:
		if (x % 4 == 0) { x = x/3 - 1; goto entry; }
		if (x % 3 == 0) { x = x/2 - 1; goto entry; }
		...
	}

The compiler doesn't need to check that there is only one flow
of control, it just needs to check that the function doesn't
take the address of any local variables (before the recursive tail call).
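
(For instance, a function like the following sketch fails that check -- the
name use_buffer is invented here.  Because the address of a local escapes,
the frame cannot simply be reused for the tail call under the rule above.)

	extern void use_buffer(char *buf);

	void f(int x) {
		char buf[64];

		use_buffer(buf);    /* address of a local escapes */
		if (x > 0)
			f(x - 1);   /* a tail call, but not safely optimizable here */
	}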

--
Fergus Henderson <fjh@cs.mu.oz.au>   |  "I have always known that the pursuit
WWW: <http://www.cs.mu.oz.au/~fjh>   |  of excellence is a lethal habit"
PGP: finger fjh@128.250.37.3         |     -- the last words of T. S. Garp.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-10  0:00           ` Mike Rubenstein
@ 1996-08-11  0:00             ` Szu-Wen Huang
  1996-08-17  0:00             ` Richard Chiu
  1 sibling, 0 replies; 688+ messages in thread
From: Szu-Wen Huang @ 1996-08-11  0:00 UTC (permalink / raw)



Mike Rubenstein (miker3@ix.netcom.com) wrote:
: huang@mnsinc.com (Szu-Wen Huang) wrote:
[snip]
: > The important thing, however, is that I remember that O(n^2) is bad
: > for a sorting algorithm, and O(n lg n) is pretty good.

: But that is not always true.

Of course not, that's why we study best case, average case, and
worst case performances of both time and space of each algorithm.

: A number of years ago I developed a program that had to do a large
: number of sorts with the following characteristics:
[snip]
: Care to try this with quicksort?  or heapsort?  A good old O(n^2)
: insertion sort works quite nicely.

Which is why we examine what causes best- and worst-case behaviors
of each algorithm.  Without assumptions about the inputs (and that is
frequently the case), the general knowledge that "quicksort is
better than insertion sort" holds.  With additional information
(such as your mostly partially sorted data), what we know about the
best case of insertion sort *and* what we know about how quicksort
performs on sorted input leads us to the correct solution.  In fact,
the general knowledge that many practical quicksort algorithms
switch to insertion sort when partitions are small is already a
most obvious hint to help arrive at the proper solution.
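
(For concreteness, the insertion sort in question is the usual one; on input
that is already mostly in order the inner loop does almost no work, which is
what makes it close to O(n) there.  A sketch, not code from any earlier post:)

void insertion_sort(int a[], int n)
{
    int i;

    for (i = 1; i < n; ++i) {
        int key = a[i];
        int j = i - 1;

        while (j >= 0 && a[j] > key) {
            a[j + 1] = a[j];     /* shift larger elements right */
            --j;
        }
        a[j + 1] = key;
    }
}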

All this to say, assembly language still isn't necessary to understand
algorithms.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-10  0:00                                             ` Tim Behrendsen
  1996-08-11  0:00                                               ` Dan Pop
@ 1996-08-11  0:00                                               ` Chris Sonnack
  1 sibling, 0 replies; 688+ messages in thread
From: Chris Sonnack @ 1996-08-11  0:00 UTC (permalink / raw)



Tim Behrendsen (tim@airshields.com) wrote:
> Chris Sonnack <cjsonnack@mmm.com> wrote in article
>
>> They're both pretty clear. And any real programmer knows rule #27: "There
>> should be no constants in your code except the numbers 1 and 0, and you
>> should view those with suspicion."
>
> I would say, "There should be no constants in your code except 0.  Tests
> should be less than, equal, greater than, or not equal 0.  Otherwise,
> it better involve a symbol."

Some languages don't have increment/decrement operators, hence:

    foo = foo + 1

--
Chris Sonnack  <cjsonnack@mmm.com>                  http://eishcq.mmm.com
Engineering Information Services/Information Technology/3M, St.Paul, Minn
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
TODAY'S RULE: No Smoffing or Fnargling!

Opinions expressed herein are my own and may not represent those of my employer.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-10  0:00                                             ` Tim Behrendsen
@ 1996-08-11  0:00                                               ` Dan Pop
  1996-08-12  0:00                                                 ` Tim Behrendsen
  1996-08-12  0:00                                                 ` Chris Sonnack
  1996-08-11  0:00                                               ` Chris Sonnack
  1 sibling, 2 replies; 688+ messages in thread
From: Dan Pop @ 1996-08-11  0:00 UTC (permalink / raw)



In <01bb86f6$d92cf6a0$32ee6fce@timhome2> "Tim Behrendsen" <tim@airshields.com> writes:

>Chris Sonnack <cjsonnack@mmm.com> wrote in article
><4ug5u5$kha@dawn.mmm.com>...
>
>> They're both pretty clear. And any real programmer knows rule #27: "There
>> should be no constants in your code except the numbers 1 and 0, and you
>> should view those with suspicion."
>
>I would say, "There should be no constants in your code except 0.  Tests
>should be less than, equal, greater than, or not equal 0.  Otherwise,
>it better involve a symbol."

This is ludicrous.  When coding a binary search, NO symbol will be better
than the constant 2.  Ditto for the constant 10 when doing binary to 
decimal conversions.  And the list could go on and on.

The rule is: "there should be no _arbitrary_ constants in your code,
with no exceptions".
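
(Something like the sketch below, with invented names, is what I mean: the 2
in a binary search midpoint stays, the arbitrary limit gets a symbol.)

#define MAX_LINE_LEN 512             /* arbitrary limit: give it a name */

char line[MAX_LINE_LEN];             /* the name, not 512, appears at the uses */

int midpoint(int lo, int hi)
{
    return lo + (hi - lo) / 2;       /* the 2 is inherent in "binary": keep it */
}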

Dan
--
Dan Pop
CERN, CN Division
Email: Dan.Pop@cern.ch 
Mail:  CERN - PPE, Bat. 31 R-004, CH-1211 Geneve 23, Switzerland




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-09  0:00                                         ` Dan Pop
@ 1996-08-11  0:00                                           ` Tim Behrendsen
  1996-08-11  0:00                                             ` Dan Pop
  1996-08-12  0:00                                             ` Peter Seebach
  1996-08-11  0:00                                           ` Mark Wooding
  1 sibling, 2 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-11  0:00 UTC (permalink / raw)



Dan Pop <Dan.Pop@cern.ch> wrote in article
<danpop.839594575@news.cern.ch>...
> In <01bb8534$b2718bc0$87ee6fce@timpent.airshields.com> "Tim Behrendsen" <tim@airshields.com> writes:
> 
> >Dan Pop <Dan.Pop@cern.ch> wrote in article
> ><danpop.839450672@news.cern.ch>...
> >
> >I've spoken to enough people that have had C++ disasters to
> >convince me that the more abstraction there is, the more
> >danger there is of inefficient code.  This shouldn't be that
> >hard to believe; any time you abstract away details you are
> >giving up knowledge of what is going to be efficient.
> 
> Nonsense.  The efficiency of an implementation can be also thought in
> abstract terms, you don't need to know how the compiler works or
> assembly language in order to implement your application in an efficient
> way.  Or you may know all these irrelevant details and still write 
> inefficient code, because it's so much easier in many OOPLs.

Certainly the most influential factor to the speed of an
application is the algorithms that are chosen.  No question.

That having been said, why are applications written in OOPLs
often very inefficient?  I think it's because they are built up
with so many abstract black boxes that it's difficult to keep
track of the interactions between the pieces.

Same with HLLs.  They are nothing more than "objects" on top
of the assembly language.  Any time you have generalization
you introduce inefficiency, and knowing the general principles
behind how things are going to compile can only help.

An example from another post is recursion; show me the
computer where recursion is not slower than a non-recursive
implementation of an algorithm.

> >I alluded to this in another post, but a good example is Motif
> >and X11.  A programmer who only understands Motif, but does not
> >understand X11 is going to write slow crap, period.
> 
> More nonsense.  Unless your applications spend the bulk of their CPU
> time in the user interface, that should be a non-issue.

You obviously haven't done much Motif programming.  Example: If
you are not careful, you can introduce a round trip every time
you specify a color.

> >Here's an example:
> >
> >int a[50000],b[50000],c[50000],d[50000],e[50000];
> >
> >void test1()
> >{
> >    int i, j;
> >    for (j = 0; j < 10; ++j) {
> >        for (i = 0; i < 50000; ++i) {
> >            ++a[i]; ++b[i]; ++c[i]; ++d[i]; ++e[i];
> >        }
> >    }
> >}
> >
> >void test2()
> >{
> >    int i, j;
> >    for (j = 0; j < 10; ++j) {
> >        for (i = 0; i < 50000; ++i) ++a[i];
> >        for (i = 0; i < 50000; ++i) ++b[i];
> >        for (i = 0; i < 50000; ++i) ++c[i];
> >        for (i = 0; i < 50000; ++i) ++d[i];
> >        for (i = 0; i < 50000; ++i) ++e[i];
> >    }
> >}
> >
> >On my AIX system, test1 runs in 2.47 seconds, and test2
> >runs in 1.95 seconds using maximum optimization (-O3).  The
> >reason I knew the second would be faster is because I know
> >to limit the amount of context information the optimizer has
> >to deal with in the inner loops, and I know to keep memory
> >localized.
> 
> 1. For a marginal speed increase (~25%), you compromised the readability
>    of the code.

You call 25% a "marginal" increase?  It's attitudes like this
that give us the slow code world we have right now.  And how
is one less readable than the other?

> 2. Another compiler, on another system, might generate faster code
>    out of the test1.  This is especially true for supercomputers,
>    which have no cache memory (and where the micro-optimizations are done
>    based on a completely different set of criteria) and where the cpu time
>    is really expensive.

Show me the computer where test1 comes out faster.  Or shouldn't
we depend on the compiler to optimize this?

> 3. You used exclusively abstract concepts to justify why test2 is
>    faster on your particular platform/compiler combination.  No references
>    to the length of a cache line or to the compiler being able to use
>    a more efficient instruction or instruction combination in one case
>    than in the other.
> 
> Let's see what happens on my 486DX33 box:
> 
> So, it's 1.10 + 0.23 = 1.33 seconds of cpu time for test1 versus
> 1.17 + 0.18 = 1.35 seconds for test2.  Conclusions:
> 
> 1. My 486DX33 is faster than your AIX system (or maybe gcc is faster than
>    xlc) :-)

Oops!  I just realized my AIX program had 100 for the outer loop, not
10 (had me worried for a minute there ...)

> 2. Your results are not reproducible.  Your extra "insight" simply "helped"
>    you to generate less readable code.

Actually, try again with 500 or 5000 in the inner loops; 50000
probably flushed the cache.  In any case, the code is identically
readable either way IMO, and costs you nothing to implement
the efficient way.  So it wasn't on one architecture, big deal.
On the average, it will be, and it costs nothing to do it that way.

> >Now I submit that if I showed the average C programmer
> >both programs, they would guess that test1 is faster because
> >it has "less code",
> 
> And he might be right, both on a CP/M-80 micro and a Cray supercomputer.
> Or on most systems with a compiler that can do loop unrolling.

Not gonna find this mythical bizarre beast, except for trivial
differences because of the extra loop overhead.

> >and that is where abstraction,
> >ignorance, and niavete begin to hurt.
> 
> The key to proper optimization is profiling.  Additional knowledge about
> a certain platform is a lot less important.  

Agreed; optimization shouldn't be to a certain platform, but
optimizations can be made on general basis.

Again, is it the case that I can order my code into any
algorithmically valid sequence and get identical running times?

> And if the wrong algorithm has been chosen in the first place, no
> amount of micro-optimization will save the code performance.  The guy
> who can do a decent algorithm analysis (an entirely abstract operation)
> will always beat the one who is an expert assembly programmer but
> prefers to spend his time coding instead of dealing with abstractions.

Agreed.

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-11  0:00                                         ` Craig Franck
@ 1996-08-11  0:00                                           ` Tim Behrendsen
  0 siblings, 0 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-11  0:00 UTC (permalink / raw)



Craig Franck <clfranck@worldnet.att.net> wrote in article
<4ujdus$7ph@mtinsc01-mgt.ops.worldnet.att.net>...
> "Tim Behrendsen" <tim@airshields.com> wrote:
> >Bob Gilbert <rgilbert@unconfigured.xvnews.domain> wrote in article
> >
> >> I thought assembly was an abstraction of the raw instruction set, not
> >> exactly the raw instruction set.  After all, the assembler allows one
> >> to abstract memory locations by assigning names or labels to them, it 
> >> abstracts the instructions by assigning shorthand mnemonics to them, 
> >> many allow the abstraction of code fragments which might perform some
> >> higher level function by allowing the programmer to implement them as
> >> macros, etc.
> >
> >Well, technically you're right but there is a one-to-one
> >correspondence between assembly mnemonics and machine language,
> >so there is no practical difference between the two, as far
> >as understanding the machine.
> >
> 
> Really? Why stop there? With RISC microprocessors there is almost
> a one-to-one correspondence between instructions and hardware, but
> with CISC there is not. There are little programs written in
> microcode that run when machine language instructions are executed.
> Linked lists were primitives in the old VAX architecture. And why
> stop there: to really understand computers you need to understand
> digital electronics. NAND gates, flip-flops, inverters, etc. To
> understand them you need a background in electronics. You need
> to understand how transistors work. My point in all this is that you
> reach a point of diminishing returns. Starting a class on algorithms
> with a discussion of Ohm's law is ridiculous. Needing to know how
> to move values into registers rather than just writing a = b + c;
> is not as bad, but it's still not needed.

To tell you the truth, I think we *should* teach more hardware theory
than we do.  Maybe then I wouldn't have people ask me if they need
to change the IP address on their X terminal when the monitor is changed
(Yes, this happened).

But to answer your question, the reason is that the microcode is not
programmable in any kind of general way by the compiler.

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-11  0:00                                         ` Peter Seebach
@ 1996-08-11  0:00                                           ` Tim Behrendsen
  1996-08-12  0:00                                             ` Alf P. Steinbach
  1996-08-13  0:00                                             ` Szu-Wen Huang
  0 siblings, 2 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-11  0:00 UTC (permalink / raw)



Peter Seebach <seebs@solutions.solon.com> wrote in article
<4uk67k$ii2@solutions.solon.com>...
> In article <01bb86f5$f7f8ae40$32ee6fce@timhome2>,
> Tim Behrendsen <tim@airshields.com> wrote:
> >Well, technically you're right but there is a one-to-one
> >correspondence between assembly mnemonics and machine language,
> >so there is no practical difference between the two, as far
> >as understanding the machine.
> 
> Are you quite sure?  I'd assume that assemblers would let you enter constants
> in at least the three commonly used bases, and I would be completely
> unsurprised if there were occasionally two ways of spelling a given operation,
> perhaps allowing for a distinction which no longer exists.

Of course you can enter constants in different bases, but
that doesn't affect the code that's generated.  The entire
*point* of assembler is to be a 1-to-1 correspondence.  Otherwise,
it would be a HLL.

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-10  0:00                 ` Szu-Wen Huang
@ 1996-08-11  0:00                   ` Tim Behrendsen
  0 siblings, 0 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-11  0:00 UTC (permalink / raw)



Szu-Wen Huang <huang@mnsinc.com> wrote in article
<4uioj0$ptn@news1.mnsinc.com>...
> Tim Behrendsen (tim@airshields.com) wrote:
> [snip]
> 
> : > Because some students graduate without understanding algorithm
> : > complexity, not because they graduate without learning assembly
> : > language.
> 
> : I agree with the first, but the reason they are not "getting"
> : algorithm complexity is because we're shrouding it in all
> : this abstract mystery.
> 
> That's your assertion.  An equally valid one is as I explained,
> that the least experienced instructors get assigned to beginners.

So, given this theory, why do EE students seem so much better
prepared than CS students?  I just don't see the same level of
mediocrity being complained about from the EE community.

I think it's because EE students are taught exactly the way that
I've been describing, and in fact, most subjects are taught the way
I've been describing.

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-06  0:00                               ` Tim Behrendsen
  1996-08-06  0:00                                 ` Peter Seebach
  1996-08-07  0:00                                 ` What's the best language to start with Ian Ward
@ 1996-08-11  0:00                                 ` Jerone A. Bowers
  2 siblings, 0 replies; 688+ messages in thread
From: Jerone A. Bowers @ 1996-08-11  0:00 UTC (permalink / raw)



Tim Behrendsen (tim@airshields.com) wrote:

: Let me bring it back full-circle where we started.  The reason
: I mention assembly in the first place was the number of graduates
: coming to me for a job that were failing the test I give
: *abysmally*, particularly in the areas of creating an algorithm
: for a problem they've never seen before, and doing logical
: operations.

: I chalked this up to the lack of the fundamentals being taught,
: and the students having their brains filled up so much with
: abstractions that they don't understand how to solve problems
: anymore.

: This is why I think assembly is the better way to teach
: algorithms; it's just you and the algorithm.  It forces them
: to really think about the problem, because they don't have any
: "training wheels" to protect them from the problem.

	An algorithm is an algorithm.  An algorithm is (or should be) programming 
language independent.  It would seem that assembly would teach you more about a 
particular platform than about algorithms.  I agree about the lack of fundamentals 
being taught.  Perhaps just naked algorithm classes and data structure classes.
Perhaps we are concentrating too much on how something works and too little on
why it works.  
  

: Whatever were doing now is *not working*, let me tell you.

: -- Tim Behrendsen (tim@airshields.com)

--
These thoughts are mine and mine alone.    I can say that, right?

EMAIL: jab@bowtech.erinet.com 		|  Our differences should not
SMAIL  9221 N Main St			    |  make us enemies, but should
Dayton, OH 45415-1126			    |  make us friends







^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-11  0:00                                           ` Tim Behrendsen
@ 1996-08-11  0:00                                             ` Dan Pop
  1996-08-13  0:00                                               ` Tim Behrendsen
  1996-08-12  0:00                                             ` Peter Seebach
  1 sibling, 1 reply; 688+ messages in thread
From: Dan Pop @ 1996-08-11  0:00 UTC (permalink / raw)



In <01bb87cf$97ae8e80$87ee6fce@timpent.airshields.com> "Tim Behrendsen" <tim@airshields.com> writes:

>Dan Pop <Dan.Pop@cern.ch> wrote in article
><danpop.839594575@news.cern.ch>...
>> In <01bb8534$b2718bc0$87ee6fce@timpent.airshields.com> "Tim Behrendsen" <tim@airshields.com> writes:
>> 
>> >Here's an example:
>> >
>> >int a[50000],b[50000],c[50000],d[50000],e[50000];
>> >
>> >void test1()
>> >{
>> >    int i, j;
>> >    for (j = 0; j < 10; ++j) {
>> >        for (i = 0; i < 50000; ++i) {
>> >            ++a[i]; ++b[i]; ++c[i]; ++d[i]; ++e[i];
>> >        }
>> >    }
>> >}
>> >
>> >void test2()
>> >{
>> >    int i, j;
>> >    for (j = 0; j < 10; ++j) {
>> >        for (i = 0; i < 50000; ++i) ++a[i];
>> >        for (i = 0; i < 50000; ++i) ++b[i];
>> >        for (i = 0; i < 50000; ++i) ++c[i];
>> >        for (i = 0; i < 50000; ++i) ++d[i];
>> >        for (i = 0; i < 50000; ++i) ++e[i];
>> >    }
>> >}
>> >
>> >On my AIX system, test1 runs in 2.47 seconds, and test2
>> >runs in 1.95 seconds using maximum optimization (-O3).  The
>> >reason I knew the second would be faster is because I know
>> >to limit the amount of context information the optimizer has
>> >to deal with in the inner loops, and I know to keep memory
>> >localized.
>> 
>> 1. For a marginal speed increase (~25%), you compromised the readability
>>    of the code.
>
>You call 25% a "marginal" increase?  It's attitudes like this
>that give us the slow code world we have right now.

Yes, I do call 25% a marginal increase.  Would you be considerably happier
if everything would run 25% faster, at the expense of everything being
harder to develop and maintain by a factor of 2?

>And how is one less readable than the other?

Methinks one loop is easier to read than five.

>> 2. Another compiler, on another system, might generate faster code
>>    out of the test1.  This is especially true for supercomputers,
>>    which have no cache memory (and where the micro-optimizations are done
>>    based on a completely different set of criteria) and where the cpu time
>>    is really expensive.
>
>Show me the computer where test1 comes out faster. 

Already done: my notebook.

>Or shouldn't we depend on the compiler to optimize this?

Exactly, it's compiler's job to optimize it, not mine.

>> Let's see what happens on my 486DX33 box:
>> 
>> So, it's 1.10 + 0.23 = 1.33 seconds of cpu time for test1 versus
>> 1.17 + 0.18 = 1.35 seconds for test2.  Conclusions:
>> 
>> 1. My 486DX33 is faster than your AIX system (or maybe gcc is faster than
>>    xlc) :-)
>
>Oops!  I just realized my AIX program had 100 for the outer loop, not
>10 (had me worried for a minute there ...)
>
>> 2. Your results are not reproducible.  Your extra "insight" simply "helped"
>>    you to generate less readable code.
>
>Actually, try again with 500 or 5000 in the inner loops; 50000
>probably flushed the cache.

Get a clue.  If test1 flushes the cache, test2 will flush it, as well.
Both implementations behave the same way WRT cache utilization: all
5 x 50000 array elements are accessed in a single iteration of the outer
loop, hence the same thing (cache hit or cache miss) will happen at the
next iteration in both versions.  Cache is simply a non-issue in your
example (even if you intended it to be :-)

>In any case, the code is identically
>readable either way IMO, and costs you nothing to implement
>the efficient way.

You forgot to provide a VALID justification for your claim that test2
is the "efficient way".

>So it wasn't on one architecture, big deal.

Can you prove that it will be on any other architecture?  According
to your line of argumentation, I could say: "so it was on one architecture,
big deal" :-)

>On the average, it will be, and costs nothing to do it that way/

I still have to see the proof.  If your time is worth nothing, then you're
right, it costs nothing.

>> The key to proper optimization is profiling.  Additional knowledge about
>> a certain platform is a lot less important.  
>
>Agreed; optimization shouldn't be to a certain platform, but
>optimizations can be made on general basis.

Right.  By selecting the proper algorithm.  Most other kinds of
optimizations will be specific to a certain machine, or, at best, class
of machines.  For example, an optimization which tries to improve the
cache hit rate might hurt on a supercomputer which has no cache but
is adversely affected by certain memory access patterns.

When programming for a single platform/compiler combination, it makes
sense to perform micro-optimizations, _if they're rewarding enough_.
But then, if portability is not an issue, the critical code could be
written in assembly, as well.  When writing portable code in a HLL,
micro-optimizations are usually a waste of effort and programmer time.

Dan
--
Dan Pop
CERN, CN Division
Email: Dan.Pop@cern.ch 
Mail:  CERN - PPE, Bat. 31 R-004, CH-1211 Geneve 23, Switzerland




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-12  0:00                                                 ` Mike Rubenstein
  1996-08-12  0:00                                                   ` Mark Wooding
@ 1996-08-12  0:00                                                   ` Tim Behrendsen
  1996-08-13  0:00                                                     ` Mike Rubenstein
       [not found]                                                     ` <32 <01bb8923$e1d34280$87ee6fce@timpent.airshields.com>
  1 sibling, 2 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-12  0:00 UTC (permalink / raw)



Mike Rubenstein <miker3@ix.netcom.com> wrote in article
<320f14e5.213196860@nntp.ix.netcom.com>...
> "Tim Behrendsen" <tim@airshields.com> wrote:
> > Mike Rubenstein <miker3@ix.netcom.com> wrote in article
> > <320bf032.7193774@nntp.ix.netcom.com>...
> > 
> > No, this is experienced programmer's disease.  It's not as if
> > I go around optimizing everything, I just do it automatically.
> > When you train yourself to think about efficiency, you naturally
> > do it the efficient way the *first time*.  The cumulative effect
> > of this is fast programs, and it doesn't cost any more effort.
> 
> I see.  Experienced programmers automatically optimize slightly
> inefficient, clear code to incorrect unclear code?  Guess I'll have to
> remain a novice.

Either solution is perfectly clear to me, and perhaps you
could share the secret of a mistake-free life, since you appear
unable to forgive my one typo?
 
> Perhaps you need to get some new experiences.
> 
> Changing correct clear code when there is no specific requirement is
> an assembly programmer's, not an experienced programmer's disease.
> Yes, I know that was just a typo (or several typos).  But that
> happens.  Only an assembly language programmer (and, frankly, not a
> very good one) would assume that efficiency was so important that
> clear code had to be changed and risk the problems.

As I said, I would not go back and change working code
unless there was a good reason.  If I was implementing it
the first time, I would do it efficiently the first time.

> > Yes, I agree.  Personally, they both seem equally clear to me,
> > but I could see how someone may prefer the recursive solution.
> > It is certainly the more "beautiful" of the two.
> 
> So, why did you change it when there was no clear requirement for
> better efficiency?  Even if one does not prefer that solution, it
> certainly was clear enough and correct.

Because I was making a point about recursion, not about GCD
implementations.
  
> > I'm not sure if that's good or not.  One of the famous ways to
> > show recursion is a factorial computation.  Does that mean we
> > actually *want* people to implement factorial that way?
> 
> In scheme (and some other languages) it certainly is the way to
> implement it.
>
> But that's not the point.  If one understands the algorithm, one can
> code it using whatever techniques are appropriate.  Recursion is not
> something to be avoided at all costs even in implmeentations where it
> is not efficient.

"Techniques are appropriate"  That's the issue, isn't it?  I would
say that anybody who implements factorial in C using recursion
qualifies as a bonehead and should have their programming license
revoked.

> Assembly language is not needed to understand
> algorityms or efficiency.

It's not a question of whether it's needed or not.  It's a
question of what's best, and what we're doing now is certainly
not working.  The level of incompetence that I see coming
out of CS schools is astounding to me.

I've asked this before; why do EE graduates seem so much
better prepared than CS graduates?  I think it's because they
start with the low level and work their way up.
 
> For the record, I have extensive assembly language experience on IBM
> 360/370, 8080, Z80, and 80x86 and a little on IBM709x, Univac 110x, GE
> 425, Honeywell 6000, and DEC PDP 6/10.  I've outgrown it.

So who's arguing that assembly should be used in production
any more?  You have the benefit of all that background, yet
you argue that it's useless.  I think that it benefits you more
than you realize.

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-11  0:00                                               ` Dan Pop
  1996-08-12  0:00                                                 ` Tim Behrendsen
@ 1996-08-12  0:00                                                 ` Chris Sonnack
  1996-08-15  0:00                                                   ` Bob Hoffmann
  1 sibling, 1 reply; 688+ messages in thread
From: Chris Sonnack @ 1996-08-12  0:00 UTC (permalink / raw)



Dan Pop (Dan.Pop@cern.ch) wrote:

>>> They're both pretty clear. And any real programmer knows rule #27: "There
>>> should be no constants in your code except the numbers 1 and 0, and you
>>> should view those with suspicion."
>>
>> I would say, "There should be no constants in your code except 0.  Tests
>> should be less than, equal, greater than, or not equal 0.  Otherwise,
>> it better involve a symbol."
>
> This is ludicrous.  When coding a binary search, NO symbol will be better
> than the constant 2.  Ditto for the constant 10 when doing binary to 
> decimal conversions.  And the list could go on and on.

Absolutely! (Although generally binary searches don't need to divide by
2 so much as shift right one bit.) Like most "rules", there's always
exceptions. But it's still a very good rule (of thumb).

--
Chris Sonnack  <cjsonnack@mmm.com>                  http://eishcq.mmm.com
Engineering Information Services/Information Technology/3M, St.Paul, Minn
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
An intellectual is someone whose mind watches itself

Opinions expressed herein are my own and may not represent those of my employer.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-11  0:00                                           ` Tim Behrendsen
  1996-08-11  0:00                                             ` Dan Pop
@ 1996-08-12  0:00                                             ` Peter Seebach
  1996-08-13  0:00                                               ` Tim Behrendsen
  1 sibling, 1 reply; 688+ messages in thread
From: Peter Seebach @ 1996-08-12  0:00 UTC (permalink / raw)



In article <01bb87cf$97ae8e80$87ee6fce@timpent.airshields.com>,
Tim Behrendsen <tim@airshields.com> wrote:
>> >int a[50000],b[50000],c[50000],d[50000],e[50000];

>> >void test1()
>> >{
>> >    int i, j;
>> >    for (j = 0; j < 10; ++j) {
>> >        for (i = 0; i < 50000; ++i) {
>> >            ++a[i]; ++b[i]; ++c[i]; ++d[i]; ++e[i];
>> >        }
>> >    }
>> >}

>> >void test2()
>> >{
>> >    int i, j;
>> >    for (j = 0; j < 10; ++j) {
>> >        for (i = 0; i < 50000; ++i) ++a[i];
>> >        for (i = 0; i < 50000; ++i) ++b[i];
>> >        for (i = 0; i < 50000; ++i) ++c[i];
>> >        for (i = 0; i < 50000; ++i) ++d[i];
>> >        for (i = 0; i < 50000; ++i) ++e[i];
>> >    }
>> >}

>Show me the computer where test1 comes out faster.

gcc/SPARC, with *NO* optimization.  test1 runs 10 times in 15.66 seconds,
test2 takes 22.02 seconds.

With optimization, test2 comes out faster on that compiler.

In other words, which is faster depends on the compiler.  And on context.

>Not gonna find this mythical bizarre beast, except for trivial
>differences because of the extra loop overhead.

Trivial, like a factor of 50%.

>Again, is it the case that I can order my code into any
>algorithmically valid sequence and get identical running times?

No.  However, it is the case that the differences simply don't justify the
overhead.  I see the "faster" version as having more code, and no real reason
for it.  The pithy 10% advantage it has when it's better does not justify the
50% disadvantage it has when it's worse.  On my systems.

-s
-- 
Peter Seebach - seebs@solon.com - Copyright 1996 - http://www.solon.com/~seebs
Unix/C Wizard - send mail for help, or send money for consulting!
The *other* C FAQ, the hacker FAQ, et al.  See web page above.
Unsolicited email (junk mail and ads) is unwelcome, and will be billed for.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-12  0:00                                                 ` Mike Rubenstein
@ 1996-08-12  0:00                                                   ` Mark Wooding
  1996-08-13  0:00                                                     ` Mike Rubenstein
  1996-08-15  0:00                                                     ` Richard A. O'Keefe
  1996-08-12  0:00                                                   ` Tim Behrendsen
  1 sibling, 2 replies; 688+ messages in thread
From: Mark Wooding @ 1996-08-12  0:00 UTC (permalink / raw)



Mike Rubenstein <miker3@ix.netcom.com> wrote:
> "Tim Behrendsen" <tim@airshields.com> wrote:
>
> > I'm not sure if that's good or not.  One of the famous ways to
> > show recursion is a factorial computation.  Does that mean we
> > actually *want* people to implement factorial that way?
> 
> In scheme (and some other languages) it certainly is the way to
> implement it.

Are we talking about implementing it the truly stupid way as a single
function, e.g.,

	(define (factorial n)
	  (if (zero? n)
	    1
	    (* n (factorial (- n 1)))))

or properly using another function which tail-recurses, e.g.,

	(define (factorial n) (fact-assist 1 n))
	(define (fact-assist a n)
	  (if (zero? n)
	    a
	    (fact-assist (* a n) (- n 1))))

If the former, then please report for re-education; if the latter, then
well and good: tail recursion is the functional-language equivalent of a
loop in a language like C.  The first version above is similar in style
(and in the sorts of problems it causes) to the traditional 

	int factorial(int n) { return (n ? n*factorial(n-1) : 1); }

recursive C implementation, while the second is similar in spirit to the
iterative version.

(My Scheme isn't terribly good: please don't moan at me about it if it's
wrong or nasty.  Also, all of the routines above assume they get valid
inputs, and will do really bad things if they don't: issues of input
checking aren't really relevant to the discussion in hand.)
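
(For completeness, the iterative C version alluded to above might look like
the following; this is a sketch added for illustration, not part of the
original post, and it assumes n is non-negative and small enough that the
result fits in an int.)

int factorial_iter(int n)
{
    int result = 1;
    while (n > 1) {
        result *= n;
        --n;
    }
    return result;
}
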
-- 
[mdw]

`When our backs are against the wall, we shall turn and fight.'
		-- John Major





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-12  0:00                                               ` Tim Behrendsen
  1996-08-12  0:00                                                 ` Mike Rubenstein
@ 1996-08-12  0:00                                                 ` Bob Kitzberger
  1996-08-22  0:00                                                   ` Patrick Horgan
  1 sibling, 1 reply; 688+ messages in thread
From: Bob Kitzberger @ 1996-08-12  0:00 UTC (permalink / raw)



Tim Behrendsen (tim@airshields.com) wrote:

: No, this is experienced programmer's disease.  It's not as if
: I go around optimizing everything, I just do it automatically.
: When you train yourself to think about efficiency, you naturally
: do it the efficient way the *first time*.  The cumulative effect
: of this is fast programs, and it doesn't cost any more effort.

But efficiency is not the be-all and end-all.  You have to balance
efficiency with ease of use, ease of maintenance, etc.  The
fastest or most efficient solution is not always the "best"
solution.  


--
Bob Kitzberger	      Rational Software Corporation       rlk@rational.com
http://www.rational.com http://www.rational.com/pst/products/testmate.html




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-06  0:00                           ` Tim Behrendsen
                                               ` (2 preceding siblings ...)
  1996-08-06  0:00                             ` Dan Pop
@ 1996-08-12  0:00                             ` Robert I. Eachus
  3 siblings, 0 replies; 688+ messages in thread
From: Robert I. Eachus @ 1996-08-12  0:00 UTC (permalink / raw)



In article <01bb87cf$97ae8e80$87ee6fce@timpent.airshields.com> "Tim Behrendsen" <tim@airshields.com> writes:

  > You call 25% a "marginal" increase?  It's attitudes like this
  > that give us the slow code world we have right now.  And how
  > is one less readable than the other?

   On one hard real time project, I went from 23 minutes to 9.7
milliseconds over six months of improving the algorithms.  On another
system, the initial tests showed that speed had to be improved by a
factor of sixty.  It was, but by system redesign, not by bit-fiddling
improvements.  A twenty-five percent improvement is in the noise.  It
may be necessary noise if you are at 110% of budget, but get the logs
out of your eye first.

--

					Robert I. Eachus

with Standard_Disclaimer;
use  Standard_Disclaimer;
function Message (Text: in Clever_Ideas) return Better_Ideas is...




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-08-06  0:00       ` Ralph Silverman
@ 1996-08-12  0:00         ` Patrick Horgan
  1996-08-13  0:00           ` Darin Johnson
                             ` (8 more replies)
  0 siblings, 9 replies; 688+ messages in thread
From: Patrick Horgan @ 1996-08-12  0:00 UTC (permalink / raw)



In article <4u7hi6$s2b@nntp.seflin.lib.fl.us>, z007400b@bcfreenet.seflin.lib.fl.us (Ralph Silverman) writes:
> 
> 	re.
> 	"...binary sort algorithm..."
> 
> 	i remember such from the
> 		military programming of the 1980's
> 	...
> 
> 	has this reached the textbooks now?
> 
> 	also,
> 	seems pretty hard for a
> 		technical interview question

A binary sort, also known as quicksort, or Hoare's sort is covered extensively
in Knuth's volume three (from 1971) and in every undergraduate data structure
and algorithm course in the world;)...you would expect anyone with a CS degree
to be familiar with it.




-- 

   Patrick J. Horgan    patrick@broadvision.com   Have horse will ride.
   Opinions mine, not my employer's except by most bizarre coincidence.






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-11  0:00                                           ` Tim Behrendsen
@ 1996-08-12  0:00                                             ` Alf P. Steinbach
  1996-08-12  0:00                                               ` Tim Behrendsen
  1996-08-13  0:00                                             ` Szu-Wen Huang
  1 sibling, 1 reply; 688+ messages in thread
From: Alf P. Steinbach @ 1996-08-12  0:00 UTC (permalink / raw)



Tim Behrendsen wrote:
> 
> Peter Seebach <seebs@solutions.solon.com> wrote in article
> > Are you quite sure?  I'd assume that assemblers would let you enter
> constants
> > in at least the three commonly used bases, and I would be completely
> > unsurprised if there were occasionally two ways of spelling a given
> operation,
> > perhaps allowing for a distinction which no longer exists.
> 
> Of course you can enter constants in different bases, but
> that doesn't affect the code that's generated.  The entire
> *point* of assembler is to be a 1-to-1 correspondence.  Otherwise,
> it would be a HLL.


Sorry to intrude again after I opted out when the discussion polarized
(I've been lurking in the shadows, though).  But this is sort of funny.
The assembler language I now know best is x86 for early x86 processors
(386).  Modern x86 assembly does *not* correspond 1-to-1 to machine
code, as P.S. correctly points out.  But for the wrong reasons!

Many high-level constructs and ideas have migrated down to assembler
languages, like the concepts of defining datatypes, symbolic constants,
routines with typed parameter lists, if-then-else and other flow-control
constructs, and so on, including direct support for OOP in Borland's
TASM assembler.  A modern assembler program has a syntactical structure
much the same as an HLL program  --  however, the programmer is free to
write everything as one long, grey mass of pure instructions with the
structure embedded as jump instructions and so on, just as you can in C.
For the first fumbling steps, this gray mass is the relevant subset.

The differences from programming in a HLL are mainly (this is opinion):


  - Limited type-checking.  The student who has really crashed a
    machine a few times will probably understand the point of
    typechecking.  Likewise, he/she may gain a better understanding
    of things like bitsets, direct versus two's-complement binary
    form, and simple static data structures in general.

  - The top level of the memory hierarchy  --  processor registers  --
    is exposed, and under the control of the program.  A high-level
    language hides the registers and allocates them automatically.
    This is good for understanding what "sequential" really means.

  - Memory addressing is exposed and naked to the eye, although some
    assemblers (esp. Micro$oft) try to be confusing about this.  This
    is very good for understanding (A) variables, (B) parameter passing,
    (C) pointers, in order from simple to difficult to the newbie.

  - You *can* have complete control over the instructions and other
    contents ending up in the executable/library/whatever.

  - You have access to absolutely everything the machine can do.  That
    is, you use assembler not for efficiency or code size but to achieve
    something impossible or completely impractical in a HLL.  This
    includes writing linkage stubs, interrupt handlers (esp. requiring
    indirect jumps), detecting 16/32-bit code segment, etc.


Now, as I've stated earlier, I think it's silly to try to keep
knowledge from students.  Anyone who does is not a real educator,
but rather the opposite.  On the other hand, I do not think
students should start learning only assembler.  One argument here
is that students are motivated by what they can achieve of
practical results.  For example, popping up a "Hello, world"
message box in Windows 3.1 required 40-50 lines of assembler,
and not exactly newbie assembler, but only 4 lines in Pascal.

- Alf




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-10  0:00                                             ` Mike Rubenstein
@ 1996-08-12  0:00                                               ` Tim Behrendsen
  1996-08-12  0:00                                                 ` Mike Rubenstein
  1996-08-12  0:00                                                 ` Bob Kitzberger
  0 siblings, 2 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-12  0:00 UTC (permalink / raw)



Mike Rubenstein <miker3@ix.netcom.com> wrote in article
<320bf032.7193774@nntp.ix.netcom.com>...
> "Tim Behrendsen" <tim@airshields.com> wrote:
> > Mike Rubenstein <miker3@ix.netcom.com> wrote in article
> > <320b35a2.43769707@nntp.ix.netcom.com>...
> 
> Perhaps because you think like an assembly language programmer --
> there's an enormous advantage to clear code.  Efficiency may, or may
> not be important.  Yet you assumed that it was worthwhile to make
> Peter's code more efficient.  This is the assembly language
> programmer's disease.

No, this is experienced programmer's disease.  It's not as if
I go around optimizing everything, I just do it automatically.
When you train yourself to think about efficiency, you naturally
do it the efficient way the *first time*.  The cumulative effect
of this is fast programs, and it doesn't cost any more effort.

> If I were doing a lot of gcd calculations, I'd certainly try to
> optimize the program.  But in most applications very little of the
> code is time critical.  Where it is not, clear code wins over
> efficient code.

Yes, I agree.  Personally, they both seem equally clear to me,
but I could see how someone may prefer the recursive solution.
It is certainly the more "beautiful" of the two.
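
(The recursive and iterative flavours being contrasted look roughly like
this in C; the sketch below is added for illustration and is not the code
from the original thread.)

/* Recursive Euclid's algorithm -- the "beautiful" one. */
int gcd_rec(int a, int b)
{
    return b == 0 ? a : gcd_rec(b, a % b);
}

/* Equivalent iterative version. */
int gcd_iter(int a, int b)
{
    while (b != 0) {
        int t = a % b;
        a = b;
        b = t;
    }
    return a;
}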

> Furthermore, when teaching algorithms Peter's code is what you want,
> at least for the first cut.  It shows a general technique of algorithm
> design, reducing a problem to a similar but easier one, that your
> code, even if written correctly, hides.

I'm not sure if that's good or not.  One of the famous ways to
show recursion is a factorial computation.  Does that mean we
actually *want* people to implement factorial that way?

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-11  0:00                                               ` Dan Pop
@ 1996-08-12  0:00                                                 ` Tim Behrendsen
  1996-08-12  0:00                                                 ` Chris Sonnack
  1 sibling, 0 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-12  0:00 UTC (permalink / raw)



Dan Pop <Dan.Pop@cern.ch> wrote in article
<danpop.839791373@news.cern.ch>...
> In <01bb86f6$d92cf6a0$32ee6fce@timhome2> "Tim Behrendsen"
<tim@airshields.com> writes:
> 
> >Chris Sonnack <cjsonnack@mmm.com> wrote in article
> ><4ug5u5$kha@dawn.mmm.com>...
> >
> >> They're both pretty clear. And any real programmer knows rule #27:
"There
> >> should be no constants in your code except the numbers 1 and 0, and
you
> >> should view those with suspicion."
> >
> >I would say, "There should be no constants in your code except 0.  Tests
> >should be less than, equal, greater than, or not equal 0.  Otherwise,
> >it better involve a symbol."
> 
> This is ludicrous.  When coding a binary search, NO symbol will be better
> than the constant 2.  Ditto for the constant 10 when doing binary to 
> decimal conversions.  And the list could go on and on.
> 
> The rule is: "there should be no _arbitrary_ constants in your code,
> with no exceptions".

That's a good point; "natural" constants would be an exception.
I was mostly thinking in terms of testing return values.
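
(A small illustration of the distinction, added here and not from the
thread: arbitrary limits get a symbol, while "natural" constants such as
the 10 of a decimal conversion, and tests against 0, stay literal.)

#define MAX_DIGITS 32                        /* arbitrary limit: name it */

/* Write n in decimal into out[], least significant digit first,
   and return the number of digits produced. */
int to_decimal(unsigned n, char out[MAX_DIGITS])
{
    int len = 0;
    do {
        out[len++] = (char)('0' + n % 10);   /* 10 is "natural" here */
        n /= 10;
    } while (n != 0);                        /* test against 0 */
    return len;
}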

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-12  0:00                                             ` Alf P. Steinbach
@ 1996-08-12  0:00                                               ` Tim Behrendsen
  0 siblings, 0 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-12  0:00 UTC (permalink / raw)



Alf P. Steinbach <alfps@online.no> wrote in article
<320E742A.7288@online.no>...
> Tim Behrendsen wrote:
> > 
> > Of course you can enter constants in different bases, but
> > that doesn't affect the code that's generated.  The entire
> > *point* of assembler is to be a 1-to-1 correspondence.  Otherwise,
> > it would be a HLL.
> 
> Sorry to intrude again after I opted out when the discussion polarized
> (I've been lurking in the shadows, though).  But this is sort of funny.
> The assembler language I now know best is x86 for early x86 processors
> (386).  Modern x86 assembly does *not* correspond 1-to-1 to machine
> code, as P.S. correctly points out.  But for the wrong reasons!
> 
> Many high-level constructs and ideas have migrated down to assembler
> languages, like the concepts of defining datatypes, symbolic constants,
> routines with typed parameter lists, if-then-else and other flow-control
> constructs, and so on, including direct support for OOP in Borland's
> TASM assembler.  A modern assembler program has a syntactical structure
> much the same as HLL program  --  however, the programmer is free to
> write everything as one long, grey mass of pure instructions with the
> structure embedded as jump instructions and so on, just as you can in C.
> For the first fumbling steps, this gray mass is the relevant subset.

Heck, 360 assembler had macros and pseudo-ops; the general point is
that the purpose of assembler is to code machine language without
having to actually look up the codes yourself.

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-08  0:00                                     ` Teaching sorts [was Re: What's the best language to start with?] Robert I. Eachus
                                                         ` (2 preceding siblings ...)
  1996-08-10  0:00                                       ` Al Aab
@ 1996-08-12  0:00                                       ` Steve Heller
  1996-08-12  0:00                                         ` Robert Dewar
  1996-08-14  0:00                                       ` Stephen Baynes
  4 siblings, 1 reply; 688+ messages in thread
From: Steve Heller @ 1996-08-12  0:00 UTC (permalink / raw)



eachus@spectre.mitre.org (Robert I. Eachus) wrote:


>    My favorite way to teach sorts is with cards.  You can use playing
>cards, but a set of index cards numbered from 1 to 25 (or even random
>four digit numbers if you are teaching radix sort) eliminates
>distractions.

>    I managed to do the "fun" experiment once.  Take three students
>and have them learn Quicksort, Heapsort, and Bubblesort on "small"
>decks.  At even 50 to 60 cards, the students doing Heapsort and
>Quicksort are racing each other*, and the Bubblesort victim is still
>hard at work well after they have finished.

>    *The race really is to finish the Quicksort before the Heapsorter
>has built a heap. This also shows why quicksort is quick, since the
>Quicksort step of picking up the cards is O(n), not O(n log n)

  Of course, if any are using radix sort, they're done well before
anyone else. Why this O(n) method of sorting isn't taught more widely
escapes me.



Steve Heller, author and software engineer
http://ourworld.compuserve.com/homepages/steve_heller 





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-12  0:00                                       ` Steve Heller
@ 1996-08-12  0:00                                         ` Robert Dewar
  1996-08-16  0:00                                           ` Steve Heller
  0 siblings, 1 reply; 688+ messages in thread
From: Robert Dewar @ 1996-08-12  0:00 UTC (permalink / raw)



Steve Heller said

"  Of course, if any are using radix sort, they're done well before
anyone else. Why this O(n) method of sorting isn't taught more widely
escapes me."

Radix sorts are not O(n); the analysis is not that simple.  They are
O(kN) where k is the number of radix digits in the numbers, and if
you use left to right (e.g. radix exchange sorting), the early
termination for subsets of 1 results in an O(N log N) behavior after all.

Still it is quite true that a simple radix sort for cards will beat
the heap and quicksort crowds :-)

I agree that both radix sorts and address calculation sorts should be
taught more systematically. The reason that attention tends to focus
on comparison sorts is that these analyze most nicely from an academic
point of view :-)
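
(A minimal LSD radix sort sketch in C, added for illustration and not part
of the original post.  Assuming 32-bit unsigned ints, it makes k = 4
counting passes over the data, which is where the O(kN) bound comes from.)

#include <string.h>

/* Sort a[0..n-1]; caller supplies a scratch buffer tmp of the same size. */
void radix_sort(unsigned *a, unsigned *tmp, int n)
{
    int shift, i, d;
    for (shift = 0; shift < 32; shift += 8) {    /* one byte-digit per pass */
        int count[257];
        memset(count, 0, sizeof count);
        for (i = 0; i < n; i++)                  /* histogram this digit */
            count[((a[i] >> shift) & 0xFF) + 1]++;
        for (d = 0; d < 256; d++)                /* prefix sums -> start offsets */
            count[d + 1] += count[d];
        for (i = 0; i < n; i++)                  /* stable scatter pass */
            tmp[count[(a[i] >> shift) & 0xFF]++] = a[i];
        memcpy(a, tmp, n * sizeof *a);
    }
}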





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-12  0:00                                               ` Tim Behrendsen
@ 1996-08-12  0:00                                                 ` Mike Rubenstein
  1996-08-12  0:00                                                   ` Mark Wooding
  1996-08-12  0:00                                                   ` Tim Behrendsen
  1996-08-12  0:00                                                 ` Bob Kitzberger
  1 sibling, 2 replies; 688+ messages in thread
From: Mike Rubenstein @ 1996-08-12  0:00 UTC (permalink / raw)



"Tim Behrendsen" <tim@airshields.com> wrote:

> Mike Rubenstein <miker3@ix.netcom.com> wrote in article
> <320bf032.7193774@nntp.ix.netcom.com>...
> > "Tim Behrendsen" <tim@airshields.com> wrote:
> > > Mike Rubenstein <miker3@ix.netcom.com> wrote in article
> > > <320b35a2.43769707@nntp.ix.netcom.com>...
> > 
> > Perhaps because you think like an assembly language programmer --
> > there's an enormous advantage to clear code.  Efficiency may, or may
> > not be important.  Yet you assumed that it was worthwhile to make
> > Peter's code more efficient.  This is the assembly language
> > programmer's disease.
> 
> No, this is experienced programmer's disease.  It's not as if
> I go around optimizing everything, I just do it automatically.
> When you train yourself to think about efficiency, you naturally
> do it the efficient way the *first time*.  The cumulative effect
> of this is fast programs, and it doesn't cost any more effort.

I see.  Experienced programmers automatically optimize slightly
inefficient, clear code to incorrect unclear code?  Guess I'll have to
remain a novice.

Perhaps you need to get some new experiences.

Changing correct clear code when there is no specific requirement is
an assembly programmer's, not an experienced programmer's disease.
Yes, I know that was just a typo (or several typos).  But that
happens.  Only an assembly language programmer (and, frankly, not a
very good one) would assume that efficiency was so important that
clear code had to be changed and risk the problems.

> 
> > If I were doing a lot of gcd calculations, I'd certainly try to
> > optimize the program.  But in most applications very little of the
> > code is time critical.  Where it is not, clear code wins over
> > efficient code.
> 
> Yes, I agree.  Personally, they both seem equally clear to me,
> but I could see how someone may prefer the recursive solution.
> It is certainly the more "beautiful" of the two.

So, why did you change it when there was no clear requirement for
better efficiency?  Even if one does not prefer that solution, it
certainly was clear enough and correct.
 
> > Furthermore, when teaching algorithms Peter's code is what you want,
> > at least for the first cut.  It shows a general technique of algorithm
> > design, reducing a problem to a similar but easier one, that your
> > code, even if written correctly, hides.
> 
> I'm not sure if that's good or not.  One of the famous ways to
> show recursion is a factorial computation.  Does that mean we
> actually *want* people to implement factorial that way?

In scheme (and some other languages) it certainly is the way to
implement it.

But that's not the point.  If one understands the algorithm, one can
code it using whatever techniques are appropriate.  Recursion is not
something to be avoided at all costs even in implementations where it
is not efficient.  Assembly language is not needed to understand
algorithms or efficiency.

For the record, I have extensive assembly language experience on IBM
360/370, 8080, Z80, and 80x86 and a little on IBM709x, Univac 110x, GE
425, Honeywell 6000, and DEC PDP 6/10.  I've outgrown it.


Michael M Rubenstein




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-12  0:00                                                   ` Tim Behrendsen
@ 1996-08-13  0:00                                                     ` Mike Rubenstein
  1996-08-13  0:00                                                       ` Tim Behrendsen
       [not found]                                                     ` <32 <01bb8923$e1d34280$87ee6fce@timpent.airshields.com>
  1 sibling, 1 reply; 688+ messages in thread
From: Mike Rubenstein @ 1996-08-13  0:00 UTC (permalink / raw)



"Tim Behrendsen" <tim@airshields.com> wrote:

> Mike Rubenstein <miker3@ix.netcom.com> wrote in article
> <320f14e5.213196860@nntp.ix.netcom.com>...
> > "Tim Behrendsen" <tim@airshields.com> wrote:
> > > Mike Rubenstein <miker3@ix.netcom.com> wrote in article
> > > <320bf032.7193774@nntp.ix.netcom.com>...
> > > 
> > > No, this is experienced programmer's disease.  It's not as if
> > > I go around optimizing everything, I just do it automatically.
> > > When you train yourself to think about efficiency, you naturally
> > > do it the efficient way the *first time*.  The cumulative effect
> > > of this is fast programs, and it doesn't cost any more effort.
> > 
> > I see.  Experienced programmers automatically optimize slightly
> > inefficient, clear code to incorrect unclear code?  Guess I'll have to
> > remain a novice.
> 
> Either solution is perfectly clear to me, and perhaps you
> could share the secret of a mistake-free life, since you appear
> unable to forgive my one typo?

I certainly don't claim to be mistake free -- that's why I don't
change clear working code without reason.

>  
> > Perhaps you need to get some new experiences.
> > 
> > Changing correct clear code when there is no specific requirement is
> > an assembly programmer's, not an experienced programmer's disease.
> > Yes, I know that was just a typo (or several typos).  But that
> > happens.  Only an assembly language programmer (and, frankly, not a
> > very good one) would assume that efficiency was so important that
> > clear code had to be changed and risk the problems.
> 
> As I said, I would not go back and change working code
> unless there was a good reason.  If I was implementing it
> the first time, I would do it efficiently the first time.

But you did go back and change it.
 
> > > Yes, I agree.  Personally, they both seem equally clear to me,
> > > but I could see how someone may prefer the recursive solution.
> > > It is certainly the more "beautiful" of the two.
> > 
> > So, why did you change it when there was no clear requirement for
> > better efficiency?  Even if one does not prefer that solution, it
> > certainly was clear enough and correct.
> 
> Because I was making a point about recursion, not about GCD
> implementations.

If the point was that assembly language knowledge helps, I'm afraid
you blew it.

>   
> > > I'm not sure if that's good or not.  One of the famous ways to
> > > show recursion is a factorial computation.  Does that mean we
> > > actually *want* people to implement factorial that way?
> > 
> > In scheme (and some other languages) it certainly is the way to
> > implement it.
> >
> > But that's not the point.  If one understands the algorithm, one can
> > code it using whatever techniques are appropriate.  Recursion is not
> > something to be avoided at all costs even in implmeentations where it
> > is not efficient.
> 
> "Techniques are appropriate"  That's the issue, isn't it?  I would
> say that anybody who implements factorial in C using recursion
> qualifies as a bonehead and should have their programming license
> revoked.
> 
> > Assembly language is not needed to understand
> > algorityms or efficiency.
> 
> It's not a question of whether it's needed or not.  It's a
> question of what's best, and what we're doing now is certainly
> not working.  The level of incompetence that I see coming
> out of CS schools is astounding to me.
> 
> I've asked this before; why do EE graduates seem so much
> better prepared than CS graduates?  I think it's because they
> start with the low level and work their way up.
>  
> > For the record, I have extensive assembly language experience on IBM
> > 360/370, 8080, Z80, and 80x86 and a little on IBM709x, Univac 110x, GE
> > 425, Honeywell 6000, and DEC PDP 6/10.  I've outgrown it.
> 
> So who's arguing that assembly should be used in production
> any more?  You have the benefit of all that background, yet
> you argue that it's useless.  I think that it benefits you more
> than you realize.

Or you less than you realize.  I, at least, would not have changed
Peter's code unless there was a specific reason.  That is one lesson
assembly language code has taught me.

I've probably been helped more by my knowledge of about 25 higher-level
languages.

Obviously, all other things being equal, it is better to know assembly
language than not to know it.  But all other things are seldom equal.
I suspect that most beginning programmers would gain much more from
learning some different HLLs.  Given a choice, I'd strongly recommend
one learn LISP, APL, Icon, or any of a few dozen other languages to
learning assembly language.

Michael M Rubenstein




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-12  0:00                                                   ` Mark Wooding
@ 1996-08-13  0:00                                                     ` Mike Rubenstein
  1996-08-15  0:00                                                     ` Richard A. O'Keefe
  1 sibling, 0 replies; 688+ messages in thread
From: Mike Rubenstein @ 1996-08-13  0:00 UTC (permalink / raw)



mdw@excessus.demon.co.uk (Mark Wooding) wrote:

> Mike Rubenstein <miker3@ix.netcom.com> wrote:
> > "Tim Behrendsen" <tim@airshields.com> wrote:
> >
> > > I'm not sure if that's good or not.  One of the famous ways to
> > > show recursion is a factorial computation.  Does that mean we
> > > actually *want* people to implement factorial that way?
> > 
> > In scheme (and some other languages) it certainly is the way to
> > implement it.
> 
> Are we talking about implementing it the truly stupid way as a single
> function, e.g.,
> 
> 	(define (factorial n)
> 	  (if (zero? n)
> 	    1
> 	    (* n (factorial (- n 1)))))
> 
> or properly using another function which tail-recurses, e.g.,
> 
> 	(define (factorial n) (fact-assist 1 n))
> 	(define (fact-assist a n)
> 	  (if (zero? n)
> 	    a
> 	    (fact-assist (* a n) (- n 1))))
> 
> If the former, then please report for re-education; if the latter, then
> well and good: tail recursion is the functional-language equivalent of a
> loop in a language like C.  The first version above is similar in style
> (and in the sorts of problems it causes) to the traditional 
> 
> 	int factorial(int n) { return (n ? n*factorial(n-1) : 1); }
> 
> recursive C implementation, while the second is similar in spirit to the
> iterative version.
> 
> (My Scheme isn't terribly good: please don't moan at me about it if it's
> wrong or nasty.  Also, all of the routines above assume they get valid
> inputs, and will do really bad things if they don't: issues of input
> checking aren't really relevant to the discussion in hand.)

Obviously, one should do it right.  Dybvig, for example, explains why
the tail recursive version is better without resorting to assembly
language :-)


Michael M Rubenstein




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-07-28  0:00                 ` J. Christian Blanchette
                                     ` (5 preceding siblings ...)
  1996-08-08  0:00                   ` William Clodius
@ 1996-08-13  0:00                   ` Ole-Hjalmar Kristensen FOU.TD/DELAB
  1996-08-14  0:00                   ` Richard A. O'Keefe
                                     ` (2 subsequent siblings)
  9 siblings, 0 replies; 688+ messages in thread
From: Ole-Hjalmar Kristensen FOU.TD/DELAB @ 1996-08-13  0:00 UTC (permalink / raw)



I have some supporting data for your case. Consider the following
rather different programs which copy data as fast as possible:

::::::::::::::
duff.cc 
::::::::::::::

int from[1000000], to[1000000];

int   main(int ac, char **av)
{
    int count = sizeof(from) / sizeof(int);
    for (int i = 0; i < 100; i++) {
        int * s = from, *d = to;
        
        register int n = (count + 15) / 16;
        switch (count % 16)
            while (--n) {
            case 0: 
                *d++ = *s++; 
            case 15: 
                *d++ = *s++;
            case 14: 
                *d++ = *s++;
            case 13: 
                *d++ = *s++;
            case 12: 
                *d++ = *s++;
            case 11: 
                *d++ = *s++;
            case 10: 
                *d++ = *s++;
            case 9: 
                *d++ = *s++;
            case 8: 
                *d++ = *s++;
            case 7: 
                *d++ = *s++;
            case 6: 
                *d++ = *s++;
            case 5: 
                *d++ = *s++; 
            case 4: 
                *d++ = *s++; 
            case 3: 
                *d++ = *s++; 
            case 2: 
                *d++ = *s++; 
            case 1: 
                *d++ = *s++; 
            }
    }
    return 0;
}

::::::::::::::
loop.cc
::::::::::::::

int from[1000000], to[1000000];

int   main(int ac, char **av)
{
    int count = sizeof(from) / sizeof(int);
    for (int i = 0; i < 100; i++) {
        int * s = from, *d = to;
        for (int j = 0; j < count; j++) {
            *d++ = *s++;
        }
    }
    return 0;
}

::::::::::::::
loop2.cc
::::::::::::::

int from[1000000], to[1000000];

int   main(int ac, char **av)
{
    int count = sizeof(from) / sizeof(int);
    for (int i = 0; i < 100; i++) {
        for (int j = 0; j < 1000000; j++) {
            to[j] = from[j];
        }
    }
    return 0;
}

::::::::::::::
rloop.cc
::::::::::::::

struct buf {
    int i[1000000];
};

buf from;
buf to;

int   main(int ac, char **av)
{
    for (int i = 0; i < 100; i++) {
        to = from;
    }
    return 0;
}


::::::::::::::
mem.cc
::::::::::::::

#include <string.h>   /* for memcpy() */

int from[1000000], to[1000000];

int   main(int ac, char **av)
{

    int count = sizeof(from);
    for (int i = 0; i < 100; i++) {
        memcpy((char *) to, (char *) from,count);
    }
    return 0;
}

All programs are compiled with gcc on a SPARC.

No optimization:

time duff
       76.2 real        65.2 user         1.8 sys  
time loop
       96.0 real        77.1 user         1.7 sys  
time loop2
       89.6 real        72.1 user         1.9 sys  
time rloop
       42.2 real        23.7 user         2.2 sys  
time mem
       45.5 real        24.5 user         1.9 sys  


-O3:

time duff
       42.4 real        23.6 user         1.4 sys  
time loop
       43.5 real        26.0 user         1.8 sys  
time loop2
       40.9 real        24.1 user         1.8 sys  
time rloop
       40.8 real        24.1 user         1.8 sys  
time mem
       43.5 real        23.7 user         1.6 sys  


-O3 -funroll-loops:

time duff
       45.4 real        23.6 user         1.7 sys  
time loop
       45.0 real        23.0 user         1.7 sys  
time loop2
       46.0 real        23.4 user         1.9 sys  
time rloop
       44.3 real        24.3 user         1.9 sys  
time mem
       42.5 real        24.0 user         1.8 sys  


Conclusion: In this case, even manual loop unrolling does not buy you
very much if you are going to compile with an optimizing
compiler. (This example is probably too simple, although the first program
should be obfuscated enough...)

For a loop such as this, using arrays instead of pointers is
marginally faster, unless I turn on the -funroll-loops flag, for some reason.

It is interesting to note that using memcpy does not increase the
speed of copying. This may be different on other CPU's.





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-08-12  0:00         ` Patrick Horgan
  1996-08-13  0:00           ` Darin Johnson
@ 1996-08-13  0:00           ` Ralph Silverman
  1996-08-16  0:00           ` What's the best language to learn? [was Re: Should I learn C or Pascal?] Darin Johnson
                             ` (6 subsequent siblings)
  8 siblings, 0 replies; 688+ messages in thread
From: Ralph Silverman @ 1996-08-13  0:00 UTC (permalink / raw)



Patrick Horgan (patrick@broadvision.com) wrote:
: In article <4u7hi6$s2b@nntp.seflin.lib.fl.us>, z007400b@bcfreenet.seflin.lib.fl.us (Ralph Silverman) writes:
: > 
: > 	re.
: > 	"...binary sort algorithm..."
: > 
: > 	i remember such from the
: > 		military programming of the 1980's
: > 	...
: > 
: > 	has this reached the textbooks now?
: > 
: > 	also,
: > 	seems pretty hard for a
: > 		technical interview question

: A binary sort, also known as quicksort, or Hoare's sort is covered extensively
: in Knuth's volume three (from 1971) and in every undergraduate data structure
: and algorithm course in the world;)...you would expect anyone with a CS degree
: to be familiar with it.




: -- 

:    Patrick J. Horgan    patrick@broadvision.com   Have horse will ride.
:    Opinions mine, not my employer's except by most bizarre coincidence.



--
************begin r.s. response*******************

	are such commonly termed
		binary sort
	?

	in any case these are not the
		intended reference!

************end r.s. response*********************
Ralph Silverman
z007400b@bcfreenet.seflin.lib.fl.us





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-07-30  0:00                     ` Arra Avakian
                                         ` (2 preceding siblings ...)
  1996-08-02  0:00                       ` Tim Behrendsen
@ 1996-08-13  0:00                       ` Chris Sonnack
  1996-08-16  0:00                         ` Steve Heller
  3 siblings, 1 reply; 688+ messages in thread
From: Chris Sonnack @ 1996-08-13  0:00 UTC (permalink / raw)



Arra Avakian (arra@inmet.com) wrote:

> I do recall being mystified and puzzled by the concept of how FORTRAN
> source code could "execute", but could readily "grok" how a dumb machine
> could blindly execute machine code. It wasn't until I understood the
> concepts behind a compiler that the mystery faded and I could accept
> the abstraction of FORTRAN. What is interesting in hindsight was that 
> the concept of an assembler did not cause any mystification - its role
> as a translator was "obvious", but the role of a compiler was definitely
> not obvious.

I had a similar experience in my first days (circa 1970). Early on, I
was able to write an assembler for the Z80 (using Z80 assembler!), but
it was years before I could have written a compiler. Compilers are, to
my mind, much harder than assemblers....for obvious reasons.

> I still to this day need to understand the execution model of an
> abstraction in order to "really" understand it. I guess my character
> is to be suspicious of the mystery, and not be able to take it on "faith".

My OVERWHELMING experience as a teacher is that most students learn a
thing (any thing) faster and better if they learn the "why" and "how"
that's behind it.

--
Chris Sonnack  <cjsonnack@mmm.com>                  http://eishcq.mmm.com
Engineering Information Services/Information Technology/3M, St.Paul, Minn
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
As to those junk mailing lists: Death does not release you, you know.

Opinions expressed herein are my own and may not represent those of my employer.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-11  0:00                                           ` Tim Behrendsen
  1996-08-12  0:00                                             ` Alf P. Steinbach
@ 1996-08-13  0:00                                             ` Szu-Wen Huang
  1 sibling, 0 replies; 688+ messages in thread
From: Szu-Wen Huang @ 1996-08-13  0:00 UTC (permalink / raw)



Tim Behrendsen (tim@airshields.com) wrote:

: Of course you can enter constants in different bases, but
: that doesn't affect the code that's generated.  The entire
: *point* of assembler is to be a 1-to-1 correspondence.  Otherwise,
: it would be a HLL.

An example is in order.  The 80x86 conditional branches are relative,
so it can't jump too far.  At least one x86 assembler will convert:

   ...
   JLE  somewhere_far    ; jump if less than or equal
   ...

into:

   ...
   JG   skip_next        ; jump if greater than
   JMP  somewhere_far
skip_next:
   ...

There goes your one-to-one correspondence, yet it is immensely useful
for the *assembly* language programmer.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-13  0:00                                     ` Robert I. Eachus
@ 1996-08-13  0:00                                       ` Lawrence Kirby
  1996-08-14  0:00                                       ` Robert Dewar
  1 sibling, 0 replies; 688+ messages in thread
From: Lawrence Kirby @ 1996-08-13  0:00 UTC (permalink / raw)



In article <EACHUS.96Aug12201351@spectre.mitre.org>
           eachus@spectre.mitre.org "Robert I. Eachus" writes:

>In article <839708378snz@genesis.demon.co.uk> Lawrence Kirby
> <fred@genesis.demon.co.uk> writes:
>
>   I said:
>
>   >Quicksort step of picking up the cards is O(n), not O(n log n)
>    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
>
>   In article <839708378snz@genesis.demon.co.uk> Lawrence Kirby
> <fred@genesis.demon.co.uk> writes:
>
>  > What does this phrase mean? Quicksort is an O(n log n) (average time)
>  > operation. Building a heap is an O(n) operation.
>
>  It means that when you finish the first stage of the Quicksort, you
>have several piles laid out on the table.  Completing the sort
>requires picking them up in order.

Then you have to sort each pile recursively bringing you back to O(n log n).
Picking up a pile of cards isn't what determines the running time of
quicksort.

>  For heap sort, building the heap
>is also an O(n log n) operation, since you maintain a heap.

Wrong, you don't maintain a heap, you just build it. If you approach
that correctly it is an O(n) operation. Quoting from Sedgewick:

"... its can in fact be proven that the construction process takes
 linear time since so many small heaps are processed".
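
(A minimal C sketch of that bottom-up heap construction, added here for
illustration; it is not part of the original exchange.)

/* Build a max-heap in place by sifting down from the last internal node.
   Most sift-downs act on small subtrees, which is why the total work
   comes out linear. */
static void sift_down(int *a, int n, int i)
{
    for (;;) {
        int largest = i, l = 2 * i + 1, r = 2 * i + 2;
        if (l < n && a[l] > a[largest]) largest = l;
        if (r < n && a[r] > a[largest]) largest = r;
        if (largest == i) break;
        { int t = a[i]; a[i] = a[largest]; a[largest] = t; }
        i = largest;
    }
}

void build_heap(int *a, int n)
{
    int i;
    for (i = n / 2 - 1; i >= 0; i--)
        sift_down(a, n, i);
}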

> However
>picking up the cards is also N log N since you have to maintain the
>heap property, and in any case you pick them up one at a time.

However there is a lot more to quicksort than just "picking up the cards".

>   Try the experiment.  A good heapsorter will finish heapifying and
>pick up the first card first.  A couple seconds later, the quicksorter
>will finish the job, assuming the cards were well shuffled.

I did. It took slightly over twice as long to quicksort a pack of 52
playing cards than it did to create a heap from them. The only real thing
of note is that you can perform the individual operations required for
partitioning much quicker than the careful card repositioning you
need to perform to construct a heap. That does reflect to some extent
the respective computer implementations but the effect is much more
pronounced with cards. Overall building a heap required far fewer operations
than the quicksort.

>  (Of
>course, one of the fun steps is to tell the students to repeat the
>exercise--without shuffling.  Now the race is between quicksort and
>bubblesort for last place.)

With cards it is easy to implement a random or an approximate half-way
pivot selection, which brings up the question of precisely what algorithms
are actually being used.

Further discussion of this ought to go to a newsgroup such as
comp.programming or via email.

-- 
-----------------------------------------
Lawrence Kirby | fred@genesis.demon.co.uk
Wilts, England | 70734.126@compuserve.com
-----------------------------------------




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-08-12  0:00         ` Patrick Horgan
@ 1996-08-13  0:00           ` Darin Johnson
  1996-08-13  0:00             ` What's the best language to learn? [was Re: Should I learn C or Pascal?] Tim Behrendsen
                               ` (3 more replies)
  1996-08-13  0:00           ` Should I learn C or Pascal? Ralph Silverman
                             ` (7 subsequent siblings)
  8 siblings, 4 replies; 688+ messages in thread
From: Darin Johnson @ 1996-08-13  0:00 UTC (permalink / raw)



> A binary sort, also known as quicksort, or Hoare's sort is covered extensively
> in Knuth's volume three (from 1971) and in every undergraduate data structure
> and algorithm course in the world;)...you would expect anyone with a CS degree
> to be familiar with it.

Actually, I learned it freshman year, but didn't understand it.
Entire new concepts of programming were still trying to find niches in
my head, leaving no room for understanding what was going on with
quicksort.  Most later classes assumed I already knew it.  I did get
clued into it later on, but I think many of my classmates kept the
"quicksort is the fastest sort" categorization without really
understanding it.  Too many people fall asleep in algorithms class
(then bitch about the waste of time later).
-- 
Darin Johnson
djohnson@ucsd.edu	O-
  - I'm not a well adjusted person, but I play one on the net.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-07  0:00                                   ` Tim Behrendsen
                                                       ` (2 preceding siblings ...)
  1996-08-08  0:00                                     ` Teaching sorts [was Re: What's the best language to start with?] Robert I. Eachus
@ 1996-08-13  0:00                                     ` Robert I. Eachus
  1996-08-14  0:00                                       ` Robert Dewar
  1996-08-15  0:00                                       ` Tom Payne
  1996-08-13  0:00                                     ` Robert I. Eachus
                                                       ` (5 subsequent siblings)
  9 siblings, 2 replies; 688+ messages in thread
From: Robert I. Eachus @ 1996-08-13  0:00 UTC (permalink / raw)



In article <839939925snz@genesis.demon.co.uk> Lawrence Kirby <fred@genesis.demon.co.uk> writes:

  > Then you have to sort each pile recursively bringing you back to
  > O(n log n).  Picking up a pile of cards isn't what determines the
  > running time of quicksort.

   Sorry if it was confusing, in the quicksort approach you lay down
the top card, then sort the remainder of the (current) deck to the
left and right.  Pick up the piles to the left and right (separately)
and do the same.  Iterate until you pick up a small enough pile to
sort in place.  (Your choice of one, two or three cards.)

   That completes the first part of the sort.  Now pick up the cards
left to right.  That is the second step and obviously O(n).
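
(For reference, a minimal C sketch of the partition-then-recurse structure
being acted out with the cards; added for illustration, not from the
original post.)

/* Quicksort using the first element as pivot, mirroring the
   "lay down the top card, split left/right" description above. */
void quicksort(int *a, int lo, int hi)
{
    int pivot, i, j, t;
    if (lo >= hi)
        return;
    pivot = a[lo];                  /* the "top card" */
    i = lo;
    for (j = lo + 1; j <= hi; j++)  /* partition: smaller cards to the left */
        if (a[j] < pivot) {
            i++;
            t = a[i]; a[i] = a[j]; a[j] = t;
        }
    t = a[lo]; a[lo] = a[i]; a[i] = t;
    quicksort(a, lo, i - 1);        /* sort the left pile */
    quicksort(a, i + 1, hi);        /* sort the right pile */
}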

  (I said:)

  > > For heap sort, building the heap
  > > is also an O(n log n) operation, since you maintain a heap.

  > Wrong, you don't maintain a heap, you just build it. If you approach
  > that correctly it is an O(n) operation. Quoting from Sedgewick:

  > "... its can in fact be proven that the construction process takes
  >  linear time since so many small heaps are processed".

  If you use the canonical heapsort it is O(2n) average, O(n log n)
worst case.  (The worst case is when each card added to the bottom of
the heap propagates to the top, using the usual version of the sort,
or propagates to bottom in the other definition, where you start with
an unsorted list/heap and sort in place.)

  (I said:) 

  > > However
  > >picking up the cards is also N log N since you have to maintain the
  > >heap property, and in any case you pick them up one at a time.

  > However there is a lot more to quicksort than just "picking up the cards".

  Of course there is. But here I was saying that when you get to the
stage of picking up cards, heapsort is slower.  And I also pointed
out that quicksort is usually slower up to that point:

  >>   Try the experiment.  A good heapsorter will finish heapifying
  >> and pick up the first card first.  A couple seconds later, the
  >> quicksorter will finish the job, assuming the cards were well
  >> shuffled.

  > I did. It took slightly over twice as long to quicksort a pack of
  >52 playing cards than it did to create a heap from them. The only
  >real thing of note is that you can perform the individual
  >operations required for partitioning much quicker than the careful
  >card repositioning you need to perform to construct a heap. That
  >does reflect to some extent the respective computer implementations
  >but the effect is much more pronounced with cards. Overall building
  >a heap required far fewer operations than the quicksort.

   We are in violent agreement here.  I said the heapsorter would
finish building the heap before the quicksorter picked up a card.  But
I also said that the next stage for quicksorting is much faster than
converting a heap into a sorted stack.

   >>    (Of course, one of the fun steps is to tell the students to
   >> repeat the exercise--without shuffling.  Now the race is
   >> between quicksort and bubblesort for last place.)

   > With cards it is easy to implement a random or an approximate
   > half way pivot selection. Which prings up the question of
   > precisely what algorithms are actually being used.

   Yes it is, but that is not part of quicksort as invented, and on
average slows quicksort down.  If you are concerned about poor worst
case performance you use a heap or radix sort.  You know that, I know
that.  After this demonstration, the students know that. ;-)

  > Further discussion of this ought to go to a newsgroup such as
  > comp.programming or via email.

  It hasn't been in comp.programming or I would have set followups
accordingly.  But I think we have worn the subject out.  It is a cute
computer lab session, it teaches some important fundamentals, and
those lessons are important.  Dexterity effects and the like are
distractions.  But it actually is nice that the time for quicksort
varies as much as it does.  It not only adds interest to the lab,
but it means that the students aren't embarrassed by the results of
any particular trial.


--

					Robert I. Eachus

with Standard_Disclaimer;
use  Standard_Disclaimer;
function Message (Text: in Clever_Ideas) return Better_Ideas is...




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-11  0:00                                             ` Dan Pop
@ 1996-08-13  0:00                                               ` Tim Behrendsen
  1996-08-13  0:00                                                 ` Giuliano Carlini
  1996-08-14  0:00                                                 ` Dan Pop
  0 siblings, 2 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-13  0:00 UTC (permalink / raw)



Dan Pop <Dan.Pop@cern.ch> wrote in article
<danpop.839807980@news.cern.ch>...
> In <01bb87cf$97ae8e80$87ee6fce@timpent.airshields.com> "Tim Behrendsen"
<tim@airshields.com> writes:
> 
> >Dan Pop <Dan.Pop@cern.ch> wrote in article
> ><danpop.839594575@news.cern.ch>...
> >> In <01bb8534$b2718bc0$87ee6fce@timpent.airshields.com> "Tim
Behrendsen"
> ><tim@airshields.com> writes:
> >> 
> >> >Here's an example:
> >> >
> >> >int a[50000],b[50000],c[50000],d[50000],e[50000];
> >> >
> >> >void test1()
> >> >{
> >> >    int i, j;
> >> >    for (j = 0; j < 10; ++j) {
> >> >        for (i = 0; i < 50000; ++i) {
> >> >            ++a[i]; ++b[i]; ++c[i]; ++d[i]; ++e[i];
> >> >        }
> >> >    }
> >> >}
> >> >
> >> >void test2()
> >> >{
> >> >    int i, j;
> >> >    for (j = 0; j < 10; ++j) {
> >> >        for (i = 0; i < 50000; ++i) ++a[i];
> >> >        for (i = 0; i < 50000; ++i) ++b[i];
> >> >        for (i = 0; i < 50000; ++i) ++c[i];
> >> >        for (i = 0; i < 50000; ++i) ++d[i];
> >> >        for (i = 0; i < 50000; ++i) ++e[i];
> >> >    }
> >> >}
> >> >
> >> >On my AIX system, test1 runs in 2.47 seconds, and test2
> >> >runs in 1.95 seconds using maximum optimization (-O3).  The
> >> >reason I knew the second would be faster is because I know
> >> >to limit the amount of context information the optimizer has
> >> >to deal with in the inner loops, and I know to keep memory
> >> >localized.
> >> 
> >> 1. For a marginal speed increase (~25%), you compromised the
readability
> >>    of the code.
> >
> >You call 25% a "marginal" increase?  It's attitudes like this
> >that give us the slow code world we have right now.
> 
> Yes, I do call 25% a marginal increase.  Would you be considerably
happier
> if everything would run 25% faster, at the expense of everything being
> harder to develop and maintain by a factor of 2?

No, I wouldn't be happier with "a factor of 2", but that's a
very large presumption.  Who's doing anything strange and
arcane?  I just made it easier on the compiler / computer.

> >And how is one less readable than the other?
> 
> Methinks one loop is easier to read than five.

Sometimes, but other times I would rather read five small
loops right in a row than one mammoth loop that I have to
take in all at once.

In this particular case, I think they are both equally
readable.

> >> 2. Another compiler, on another system, might generate faster code
> >>    out of the test1.  This is especially true for supercomputers,
> >>    which have no cache memory (and where the micro-optimizations are
done
> >>    based on a completely different set of criteria) and where the cpu
> >time
> >>    is really expensive.
> >
> >Show me the computer where test1 comes out faster. 
> 
> Already done: my notebook.

Yes, trivially faster.  But how about significantly faster?

> >Or shouldn't we depend on the compiler to optimize this?
> 
> Exactly, it's compiler's job to optimize it, not mine.

Well, the compiler didn't do a very good job, did it?
And that's the whole point.  It is ivory tower naivete to
think that compilers optimize everything perfectly every
time.  Yes, you can just wave your hand and say, "well,
obviously AIX sucks", but in my experience at least, ALL
compilers suck in one way or another.
 
> >> Let's see what happens on my 486DX33 box:
> >> 
> >> So, it's 1.10 + 0.23 = 1.33 seconds of cpu time for test1 versus
> >> 1.17 + 0.18 = 1.35 seconds for test2.  Conclusions:
> >> 1. My 486DX33 is faster than your AIX system (or maybe gcc is faster
than
> >>    xlc) :-)
> >Oops!  I just realized my AIX program had 100 for the outer loop, not
> >10 (had me worried for a minute there ...)
> >
> >> 2. Your results are not reproducible.  Your extra "insight" simply
> >"helped"
> >>    you to generate less readable code.
> >
> >Actually, try again with 500 or 5000 in the inner loops; 50000
> >probably flushed the cache.
> 
> Get a clue.  If test1 flushes the cache, test2 will flush it, as well.
> Both implementations behave the same way WRT cache utilization: all
> 5 x 50000 array elements are accessed in a single iteration of the outer
> loop, hence the same thing (cache hit or cache miss) will happen at the
> next iteration in both versions.  Cache is simply a non-issue in your
> example (even if you intended it to be :-)

- Sigh - I must really think about these posts for more than
the two minutes per thread.  You are right, of course.  The
cache is irrelevant to this example.  It's more likely the
fact that the first case makes better use of page locality.
It may also be that the compiler ran out of address registers
with five arrays (or a combination of both).

Of course, I could restructure the example to take advantage
of the cache, and get even more improvement. :)
 
> >In any case, the code is identically
> >readable either way IMO, and costs you nothing to implement
> >the efficient way.
> 
> You forgot to provide a VALID justification for your claim that test2
> is the "efficient way".

It's probably mostly the paging locality.  Did you run your
test under DOS or Windows?  It would be interesting to see
if there was a difference.

> >So it wasn't on one architecture, big deal.
> 
> Can you prove that it will be on any other architecture?  According
> to your line of argumentation, I could say: "so it was on one
architecture,
> big deal" :-)

I would have to try it on other architectures to be sure, but
can you come up with a scenario in which the second would be
significantly slower?  I would say that on the average paging
architecture, memory locality tends to be quite important to
performance.

> >On the average, it will be, and it costs nothing to do it that way.
> 
> I still have to see the proof.  If your time is worth nothing, then
you're
> right, it costs nothing.

It takes the same amount of time either way.  It's just a matter of
paying attention to what you do.

> >> The key to proper optimization is profiling.  Additional knowledge
about
> >> a certain platform is a lot less important.  
> >
> >Agreed; optimization shouldn't be to a certain platform, but
> >optimizations can be made on general basis.
> 
> Right.  By selecting the proper algorithm.  Most other kinds of
> optimizations will be specific to a certain machine, or, at best, class
> of machines.  For example, an optimization which tries to improve the
> cache hit rate might hurt on a supercomputer which has no cache, but it
> is adversely affected by certain memory access patterns.

Well, if you're using a supercomputer, chances are you *will*
optimize to that architecture, because supercomputer time
tends to be expensive.  For general-purpose software, the
architectures are pretty much the same all around.  They all
have caches, they all have paging.

> When programming for a single platform/compiler combination, it makes
> sense to perform micro-optimizations, _if they're rewarding enough_.
> But then, if portability is not an issue, the critical code could be
> written in assembly, as well.  When writing portable code in a HLL,
> micro-optimizations are usually a waste of effort and programmer time.

I think that most architectures in the real world are close
enough that you can find commonality, such as inefficient
recursion.

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-12  0:00                                             ` Peter Seebach
@ 1996-08-13  0:00                                               ` Tim Behrendsen
  0 siblings, 0 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-13  0:00 UTC (permalink / raw)



Peter Seebach <seebs@solutions.solon.com> wrote in article
<4unn8o$rkl@solutions.solon.com>...
> In article <01bb87cf$97ae8e80$87ee6fce@timpent.airshields.com>,
> Tim Behrendsen <tim@airshields.com> wrote:
> >> >int a[50000],b[50000],c[50000],d[50000],e[50000];
> 
> >> >void test1()
> >> >{
> >> >    int i, j;
> >> >    for (j = 0; j < 10; ++j) {
> >> >        for (i = 0; i < 50000; ++i) {
> >> >            ++a[i]; ++b[i]; ++c[i]; ++d[i]; ++e[i];
> >> >        }
> >> >    }
> >> >}
> 
> >> >void test2()
> >> >{
> >> >    int i, j;
> >> >    for (j = 0; j < 10; ++j) {
> >> >        for (i = 0; i < 50000; ++i) ++a[i];
> >> >        for (i = 0; i < 50000; ++i) ++b[i];
> >> >        for (i = 0; i < 50000; ++i) ++c[i];
> >> >        for (i = 0; i < 50000; ++i) ++d[i];
> >> >        for (i = 0; i < 50000; ++i) ++e[i];
> >> >    }
> >> >}
> 
> >Show me the computer where test1 comes out faster.
> 
> gcc/SPARC, with *NO* optimization.  test1 runs 10 times in 15.66 seconds,
> test2 takes 22.02 seconds.
> 
> With optimization, test2 comes out faster on that compiler.
> 
> In other words, which is faster depends on the compiler.  And on context.
>
> >Not gonna find this mythical bizarre beast, except for trivial
> >differences because of the extra loop overhead.
> 
> Trivial, like a factor of 50%.

I suspect that with no optimization it isn't using pointers/registers.
Try rewriting the subroutines to use pointers both ways and then
try it again with no optimization.
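
A sketch of the pointer rewrite being suggested (mine, not from the
measured code, and assuming the same global arrays as in the earlier
snippets):

    /* test1 with explicit pointers, so even an unoptimized compile
       keeps the working addresses in registers */
    void test1_ptr(void)
    {
        int i, j;

        for (j = 0; j < 10; ++j) {
            register int *pa = a, *pb = b, *pc = c, *pd = d, *pe = e;
            for (i = 0; i < 50000; ++i) {
                ++*pa++; ++*pb++; ++*pc++; ++*pd++; ++*pe++;
            }
        }
    }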

> >Again, is it the case that I can order my code into any
> >algorithmically valid sequence and get identical running times?
> 
> No.  However, it is the case that the differences simply don't justify
the
> overhead.  I see the "faster" version as having more code, and no real
reason
> for it.  The pithy 10% advantage it has when it's better does not justify
the
> 50% disadvantage it has when it's worse.  On my systems.

Only 10% on SPARC?  Crummy paging architecture.

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-13  0:00                                                     ` Mike Rubenstein
@ 1996-08-13  0:00                                                       ` Tim Behrendsen
  1996-08-13  0:00                                                         ` Giuliano Carlini
  1996-08-15  0:00                                                         ` Mike Rubenstein
  0 siblings, 2 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-13  0:00 UTC (permalink / raw)



Mike Rubenstein <miker3@ix.netcom.com> wrote in article
<320fe7f4.7066811@nntp.ix.netcom.com>...
> I've probably been helped more by my knowledge of about 25 higher level
> languages.
> 
> Obviously, all other things being equal, it is better to know assembly
> language than to not know it.  But all other things are seldom equal.
> I suspect that most beginning programmers would gain much more from
> learning some different HLLs.  Given a choice, I'd strongly recommend
> one learn LISP, APL, Icon, or any of a few dozen other languages to
> learning assembly language.

We have the world that you want.  This is CS curriculum today; are
you happy with the level of expertise of the graduates?  I'm not,
based on my experience with trying to hire them.  If you're not
either, what do you think the reason is?

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-13  0:00           ` Darin Johnson
@ 1996-08-13  0:00             ` Tim Behrendsen
  1996-08-14  0:00               ` Gabor Egressy
                                 ` (2 more replies)
  1996-08-15  0:00             ` Should I learn C or Pascal? Richard A. O'Keefe
                               ` (2 subsequent siblings)
  3 siblings, 3 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-13  0:00 UTC (permalink / raw)



Darin Johnson <djohnson@tartarus.ucsd.edu> wrote in article
<qqk9v3ji06.fsf@tartarus.ucsd.edu>...
> > A binary sort, also known as quicksort, or Hoare's sort is covered
extensively
> > in Knuth's volume three (from 1971) and in every undergraduate data
structure
> > and algorithm course in the world;)...you would expect anyone with a CS
degree
> > to be familiar with it.
> 
> Actually, I learned it freshman year, but didn't understand it.
> Entire new concepts of programming were still trying to find niches in
> my head, leaving no room for understanding what was going on with
> quicksort.  Most later classes assumed I already knew it.  I did get
> clued into it later on, but I think many of my classmates kept the
> "quicksort is the fastest sort" categorization without really
> understanding it.  Too many people fall asleep in algorithms class
> (then bitch about the waste of time later).

[Sorry, Darin, for using you as the example of what I'm talking
about on a different thread! Don't take this personally! :) ]

This is a perfect example of how students are being graduated
without fully understanding what programming is all about.  The
phrasing is perfect: "I learned it..., but didn't understand it."

*This is how it happens!*

I interpret this to mean that he was struggling with all the
abstractions while trying to master the concept of "thinking
like a programmer".  Meanwhile, they are packing algorithm after
algorithm into his head when he is not prepared to understand
what they are packing.

Now, what if they had started ol' Darin off with some very
simple concepts in assembly, really showed him the procedural
nature of the computer, data flow, data transformations, etc.,
and *then* moved on to algorithms such as Quicksort.  You
just plain can't fail to understand what's going on!

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-07  0:00                                   ` Tim Behrendsen
                                                       ` (3 preceding siblings ...)
  1996-08-13  0:00                                     ` Robert I. Eachus
@ 1996-08-13  0:00                                     ` Robert I. Eachus
  1996-08-13  0:00                                       ` Lawrence Kirby
  1996-08-14  0:00                                       ` Robert Dewar
  1996-08-14  0:00                                     ` Robert I. Eachus
                                                       ` (4 subsequent siblings)
  9 siblings, 2 replies; 688+ messages in thread
From: Robert I. Eachus @ 1996-08-13  0:00 UTC (permalink / raw)



In article <839708378snz@genesis.demon.co.uk> Lawrence Kirby <fred@genesis.demon.co.uk> writes:

   I said:

   >Quicksort step of picking up the cards is O(n), not O(n log n)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

   In article <839708378snz@genesis.demon.co.uk> Lawrence Kirby <fred@genesis.demon.co.uk> writes:

  > What does this phrase mean? Quicksort is an (n log n) (average time)
  > operation. Building a heap is an O(n) operation.

  It means that when you finish the first stage of the Quicksort, you
have several piles laid out on the table.  Completing the sort
requires picking them up in order.  For heap sort, building the heap
is also an O(n log n) operation, since you maintain a heap.  However
picking up the cards is also N log N since you have to maintain the
heap property, and in any case you pick them up one at a time.

   Try the experiment.  A good heapsorter will finish heapifying and
pick up the first card first.  A couple seconds later, the quicksorter
will finish the job, assuming the cards were well shuffled.  (Of
course, one of the fun steps is to tell the students to repeat the
exercise--without shuffling.  Now the race is between quicksort and
bubblesort for last place.)
--

					Robert I. Eachus

with Standard_Disclaimer;
use  Standard_Disclaimer;
function Message (Text: in Clever_Ideas) return Better_Ideas is...




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-13  0:00                                               ` Tim Behrendsen
@ 1996-08-13  0:00                                                 ` Giuliano Carlini
  1996-08-14  0:00                                                 ` Dan Pop
  1 sibling, 0 replies; 688+ messages in thread
From: Giuliano Carlini @ 1996-08-13  0:00 UTC (permalink / raw)



Tim Behrendsen wrote:
> 
> Dan Pop <Dan.Pop@cern.ch> wrote in article
> <danpop.839807980@news.cern.ch>...
> > In <01bb87cf$97ae8e80$87ee6fce@timpent.airshields.com> "Tim Behrendsen"
> <tim@airshields.com> writes:
> >
> > >Dan Pop <Dan.Pop@cern.ch> wrote in article
> > ><danpop.839594575@news.cern.ch>...
> > >> In <01bb8534$b2718bc0$87ee6fce@timpent.airshields.com> "Tim
> Behrendsen"
> > ><tim@airshields.com> writes:
> > >>
> > >> >Here's an example:
> > >> >
> > >> >int a[50000],b[50000],c[50000],d[50000],e[50000];
> > >> >
> > >> >void test1()
> > >> >{
> > >> >    int i, j;
> > >> >    for (j = 0; j < 10; ++j) {
> > >> >        for (i = 0; i < 50000; ++i) {
> > >> >            ++a[i]; ++b[i]; ++c[i]; ++d[i]; ++e[i];
> > >> >        }
> > >> >    }
> > >> >}
> > >> >
> > >> >void test2()
> > >> >{
> > >> >    int i, j;
> > >> >    for (j = 0; j < 10; ++j) {
> > >> >        for (i = 0; i < 50000; ++i) ++a[i];
> > >> >        for (i = 0; i < 50000; ++i) ++b[i];
> > >> >        for (i = 0; i < 50000; ++i) ++c[i];
> > >> >        for (i = 0; i < 50000; ++i) ++d[i];
> > >> >        for (i = 0; i < 50000; ++i) ++e[i];
> > >> >    }
> > >> >}
> > >> >
> > >> >On my AIX system, test1 runs in 2.47 seconds, and test2
> > >> >runs in 1.95 seconds using maximum optimization (-O3).  The
> > >> >reason I knew the second would be faster is because I know
> > >> >to limit the amount of context information the optimizer has
> > >> >to deal with in the inner loops, and I know to keep memory
> > >> >localized.
> > >>
> > >> 1. For a marginal speed increase (~25%), you compromised the
> readability
> > >>    of the code.
> > >
> > >You call 25% a "marginal" increase?

Actually, I would too. Typically, attacking slow spots by changing
algorithms, or by making small changes to an interface that permit
massive performance speedups, will often result in 50 to 90%
speedups. And these optimizations are portable, whereas depending
on the optimizer is not.

> >
> > Yes, I do call 25% a marginal increase.  Would you be considerably
> happier
> > if everything would run 25% faster, at the expense of everything being
> > harder to develop and maintain by a factor of 2?
> 
> No, I wouldn't be happier with "a factor of 2", but that's a
> very large presumption.  Who's doing anything strange and
> arcane?  I just made it easier on the compiler / computer.

As another data point, it did take me a lot longer to understand
code snippet 2 than snippet 1. Maybe 50%, maybe 200%, I can't say for
sure; my internal clock doesn't act like a stopwatch ;->

Depending on where in the code we're talking about, either of you could
be "right". The only 90/10 rule (90% of the execution time is taken
by 10% of the code) would dictate using this optimization in frequently
executed paths. The original is superior everyplace else due to
comprehension.

> Well, the compiler didn't do a very good job, does it?
> And that's the whole point.  It is ivory tower naivete to
> think that compilers optimize everything perfectly every
> time.

Generally programmers shouldn't be optimizing code by hand
unless they are sure it affects the "bottom line". So, if
a performance analysis shows that you need to contort your
code to get a major speed-up, I'm all for it. Otherwise,
avoid such optimizations like the plague.


g




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-13  0:00                                                       ` Tim Behrendsen
@ 1996-08-13  0:00                                                         ` Giuliano Carlini
  1996-08-14  0:00                                                           ` Tim Behrendsen
  1996-08-15  0:00                                                         ` Mike Rubenstein
  1 sibling, 1 reply; 688+ messages in thread
From: Giuliano Carlini @ 1996-08-13  0:00 UTC (permalink / raw)



Tim Behrendsen wrote:
> 
> Mike Rubenstein <miker3@ix.netcom.com> wrote in article
> <320fe7f4.7066811@nntp.ix.netcom.com>...
> > I've probably been helped more by my knowledge of about 25 higher level
> > languages.
> >
> > Obviously, all other things being equal, it is better to know assembly
> > language than to not know it.  But all other things are seldom equal.
> > I suspect that most beginning programmers would gain much more from
> > learning some different HLLs.  Given a choice, I'd strongly recommend
> > one learn LISP, APL, Icon, or any of a few dozen other languages to
> > learning assembly language.

Every programmer should know assembler as their second or third language.
Okay, not recreational programmers, or those knocking together small
programs, but everyone who puts together programs that are larger than,
say, 10K lines of code.

You don't need to write in it often, but you need it to be able to
debug competently.

There are far too many times I've been called in to help someone debug,
when it turned out to be a stupid compiler or system bug that a good
understanding of assembler would have spotted immediately.

There are times I need to dive into the compiler runtime, or into the
OS to debug my buggy code.

If you don't understand assembler, you're reduced to trying one trivial
change to your program after another. When one finally works, you
have no clue why. Then you can't document why some monstrous section
of code is the way it is. And that causes problems later on.

g




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-14  0:00                   ` Robert Dewar
  1996-08-14  0:00                     ` Tim Behrendsen
@ 1996-08-14  0:00                     ` Dan Pop
  1996-08-14  0:00                       ` Robert Dewar
  1996-08-15  0:00                     ` Joe Foster
  2 siblings, 1 reply; 688+ messages in thread
From: Dan Pop @ 1996-08-14  0:00 UTC (permalink / raw)



In <dewar.840043674@schonberg> dewar@cs.nyu.edu (Robert Dewar) writes:

>Tim says
>
>""abstraction" question, it's a question of breaking it down
>into fundamental steps of data transformations.  Testing a
>number as even is transforming a number into a bit based on
>its evenness property."
>
>Well it is certainly easy to see why you like to teach assembler early,
>and to me, your viewpoint is a good example of why I do NOT like that
>approach, you have an unrelenting low level viewpoint of things, for
>example what on EARTH from a semantic point of view does a test for
>evenness
>have with a "bit", nothing at all!
 
And I could swear Tim's implementation of the evenness test will break
on machines using the one's complement representation :-)

Dan
--
Dan Pop
CERN, CN Division
Email: Dan.Pop@cern.ch 
Mail:  CERN - PPE, Bat. 31 R-004, CH-1211 Geneve 23, Switzerland




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-14  0:00                   ` Peter Seebach
@ 1996-08-14  0:00                     ` Tim Behrendsen
  1996-08-14  0:00                       ` Peter Seebach
                                         ` (2 more replies)
  1996-08-15  0:00                     ` Bob Gilbert
  1996-08-15  0:00                     ` DAVID A MOLNAR
  2 siblings, 3 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-14  0:00 UTC (permalink / raw)



Peter Seebach <seebs@solutions.solon.com> wrote in article
<4ut1sv$ngv@solutions.solon.com>...
> In article <01bb89f1$31be4f60$87ee6fce@timpent.airshields.com>,
> Tim Behrendsen <tim@airshields.com> wrote:
> >>I am not sure I buy your interpretation.  Thinking like a programmer is
> >>largely a matter of learning to find useful abstractions.  How do you
write
> >>code to test whether a number is even?  Well, you do it by looking at
what
> >>evenness is, and looking for what characteristics it has that would
help you.
> 
> >Solving a problem is not an
> >"abstraction" question, it's a question of breaking it down
> >into fundamental steps of data transformations.
> 
> No, that's implementing a solution.  Different stage.  Consider Cantor's
> diagonal.  A perfectly good algorithm which you *cannot* implement.
> 
> >Testing a
> >number as even is transforming a number into a bit based on
> >its evenness property.
> 
> But how you do that has *NOTHING* to do with the question of whether or
not
> you have defined a solution to the problem.
> 
> Whether or not you can do that will answer the question "is my solution
going
> to work on existing machines".  But if you can't, you still have a
solution,
> it's just not one which meets your requirements.

All of that is very interesting in a theoretical sense.  But
if we're talking about educating a student, you have to start
somewhere, and to just hit them full-bore with "abstraction
of algorithm" theory, I think you're going to lose of lot of
them, *and we are!*

It seems to me that focusing on the more practical
aspects of implementation gives them a solid foundation to
build the rest of it on, including building up the concepts
of languages and why they're valuable.

> >So what?  You can always learn languages, but you only have
> >one shot at teaching someone to think like a programmer.
> 
> Nonsense.  People learn new things all the time.

I have to disagree; where somebody starts forms the basis of
what they compare all the rest of new knowledge to.  Once the
patterns are set, it's very difficult to re-center someone into
another frame of reference.

> >*Programming is not languages*.  Programming is breaking problems
> >down into fundamental steps to achieve a desired result.  The
> >way the steps are described is not relevant to the general question
> >of a solution.  When we are talking about teaching a student,
> >the less extraneous issues such as languages and the more
> >experience we can given them of the fundamental procedural
> >nature of the computer, the better.
> 
> I don't really think we are going to convince each other.  I believe that
> the "fundemental procedural nature of the computer" may not be a
permanent
> thing.  I'd rather teach them to think in the general terms, and see that
> procedural nature as just one more current limitation, like computers
which
> have only 2 MB of RAM.  This will better prepare them for techniques that
> depend on *not* thinking of the computer as procedural.  Look at Icon...
> Neat language, but if you think in terms of normal procedural code, it
will
> never work for you.

I think this is like a theoretical physicist arguing with an
experimental physicist.  The theoretician doesn't care about
practical results, and the experimental guy sneers at theories
that have nothing to do with the real world. :)

I am very much a real-world guy, and computers are very much
a real-world device.  In fact, I think that's where at least
some of the problem is; Computer Scientists have been looked
down upon by the self-proclaimed "real sciences" for so long
(famous joke: "Is Computer Science?" HA HA says the physicist),
that they are trying too hard to make themselves into a "real"
science.  You only have to look at a few issues of CACM to
see this in action; the amount of fancy terminology that's invoked
is just amazing.  Other sciences have this problem, but CS just
seems to go out of their way to do it.

> >> You keep using assembly as an example of what the basics of computing
are
> >> like.  Please explain how this is a better model *of the problem
domain*
> >> than lisp or C.  Why should students try to learn machine-level and
> >> program-level at the same time?  If you want them to start with
> >> architectures, don't start them on algorithms, start them on trivial
data
> >> manipulation in assembly.
> 
> >It's a better model because it's real.  C and all languages are
> >artificial constructions intended for the maintainance of large
> >projects.
> 
> So?  They're just as real.  *computers* are artificial constructions.

But they're not.  They are abstractions of machine code, which
is what's really executed.  Granted, in a strict sense assembly
is an abstraction of machine code, but we've been over this
ground in another thread.

> >What gets missed in all this is that *computers are simple*!
> >All they do in the fundamental sense is take a very simple
> >instruction, execute it, and repeat very, very fast.  If this
> >is presented the first day, the student can't help but realize
> >the simple nature of the computer, and it loses its mystery.
> 
> I think you underestimate the idiocy of many college students.  :)
> 
> Still, why wouldn't, say, scheme, work for this?  You can make a *VERY*
> simple system - much simpler than assembly for any platform - and show
> the students how everything is built on it.
> 
> If all you want to do is show them that the computer is simple, teach
them
> in Turing machine assembly.  :)

Well, the thing is that real honest-to-goodness assembly is just
not that difficult to learn, and that has built-in practical
aspects to it.

> >Look at ol' Darin; if he didn't have enough brainpower left
> >to focus on the algorithm at hand, what was he being confused
> >by?  I think it was by all the language and abstraction, which
> >is not what programming is.
> 
> But it *is* what programming is.  Programming is turning a problem
> description into a solution description.  The implementation is merely
> coding.  Turning a solution description into a program can be done
> by trained monkeys.  (Or, emprically, compilers.)

But the solution implementation is the "proof" of functionality.
As Abrash says in his finger, "I want to live in theory.  Everything
works in theory." (I like that quote; don't know where he got it).

A student could describe everything on paper, but nothing cements
it in someone's mind like running the program and seeing the
output.  Not to mention see it fail, and trying to understand
why.

> >If he had been given a solid
> >foundation of the simple nature of the computer, I think he would
> >always have a "comfort zone" he could return to whenever he
> >didn't understand something, because it's *always* implemented
> >in the fundamental terms.
> 
> But why insist on making the comfort zone so far from the nature of the
> task?
> 
> This is like teaching people to drive by starting them out with
metallurgy.
> Sure, it's fundemental to the operation of the car, *but they don't need
to
> know it*.  It can help, but it's not a good basis for an understanding at
> the right level.

No, I disagree.  It would be like teaching people to drive by
starting out teaching them mechanics, and I'm not sure that would
be a bad idea. :)

Metallurgy would be the equivalent of teaching programmers how to
melt the silicon, which even *I* think is a little excessive... :)

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-07  0:00                                   ` Tim Behrendsen
                                                       ` (4 preceding siblings ...)
  1996-08-13  0:00                                     ` Robert I. Eachus
@ 1996-08-14  0:00                                     ` Robert I. Eachus
  1996-08-15  0:00                                       ` Robert Dewar
  1996-08-15  0:00                                     ` Blair Phillips
                                                       ` (3 subsequent siblings)
  9 siblings, 1 reply; 688+ messages in thread
From: Robert I. Eachus @ 1996-08-14  0:00 UTC (permalink / raw)



In article <dewar.839998195@schonberg> dewar@cs.nyu.edu (Robert Dewar) writes:

  > Building a heap is a worst case O(N) process. I have seen it
  > occasionally misdescribed in a form that would be N logN, but if
  > you look at the original treesort3, you will see that the heap
  > building phase is clearly linear (the proof is straightforward, I
  > can share it offline if you like).

   Ah, the joys of being an old fart:

   Williams, J. W. J.  Algorithm 232: Heapsort, Comm. ACM 7:6, 347-348
   Floyd, R. W. Algorithm 245: treesort3, Comm. ACM 7: 12, 701

   There was a time when NO ONE knew how to build a heap in O(n) time,
but it didn't last long.  However, I was very interested in sorting
problems right then.  Robert Dewar must have come along a little
later, when treesort3 had almost completely replaced Heapsort.  But
they are two different (but closely related) algorithms.  This is why
I have been complaining about the mixup in names.

   And for those of you who weren't around then, the reason for using
these sorts, and not quicksort was that you were really doing a
polyphase mergesort from tapes, and the in-memory sort of lots of
"little" files that could fit in memory was a first step.  Among other
things, the elimination of the need for stack space--for quicksort--
meant you could sort a slightly larger file. And that stack was
usually hand coded if you did use Quicksort...

--

					Robert I. Eachus

with Standard_Disclaimer;
use  Standard_Disclaimer;
function Message (Text: in Clever_Ideas) return Better_Ideas is...




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-14  0:00                   ` Robert Dewar
@ 1996-08-14  0:00                     ` Tim Behrendsen
  1996-08-14  0:00                     ` Dan Pop
  1996-08-15  0:00                     ` Joe Foster
  2 siblings, 0 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-14  0:00 UTC (permalink / raw)



Robert Dewar <dewar@cs.nyu.edu> wrote in article
<dewar.840043674@schonberg>...
> Tim says
> 
> ""abstraction" question, it's a question of breaking it down
> into fundamental steps of data transformations.  Testing a
> number as even is transforming a number into a bit based on
> its evenness property."
> 
> Well it is certainly easy to see why you like to teach assembler early,
> and to me, your viewpoint is a good example of why I do NOT like that
> approach, you have an unrelenting low level viewpoint of things, for
> example what on EARTH from a semantic point of view does a test for
> evenness
> have with a "bit", nothing at all!

Well, I mean a "bit" in an abstract unit-of-information way; it either is
or it isn't. Actually, the reality would probably be an AND with 0x01,
which would give you a bit (or the "operation was zero" status bit).

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-14  0:00                     ` Tim Behrendsen
@ 1996-08-14  0:00                       ` Peter Seebach
  1996-08-15  0:00                       ` Robert Dewar
  1996-08-15  0:00                       ` Bob Gilbert
  2 siblings, 0 replies; 688+ messages in thread
From: Peter Seebach @ 1996-08-14  0:00 UTC (permalink / raw)



In article <01bb8a24$88d46fe0$87ee6fce@timpent.airshields.com>,
Tim Behrendsen <tim@airshields.com> wrote:
>No, I disagree.  It would be like teaching people to drive by
>starting out teach them mechanics, and I'm not sure that wouldn't
>be a bad idea. :)

>Metallurgy would be the equivalent of teach programmers how to
>melt the silicon, which even *I* think is a little excessive... :)

I think computers are more abstract than you're giving them credit for.
Melting silicon is to computing what quantum physics is to cars.  Cars
only have a couple of layers of abstraction, so far.

-s
-- 
Peter Seebach - seebs@solon.com - Copyright 1996 - http://www.solon.com/~seebs
Unix/C Wizard - send mail for help, or send money for consulting!
The *other* C FAQ, the hacker FAQ, et al.  See web page above.
Unsolicited email (junk mail and ads) is unwelcome, and will be billed for.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-14  0:00                     ` Dan Pop
@ 1996-08-14  0:00                       ` Robert Dewar
  0 siblings, 0 replies; 688+ messages in thread
From: Robert Dewar @ 1996-08-14  0:00 UTC (permalink / raw)



Dan Pop said

"And I could swear Tim's implementation of the evenness test will break
on machines using the one's complement representation :-)
"

Yes, indeed! I have seen this error (thinking that n&1 can be used for
an evenness test on signed integers) in C programs more than once!
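
For the record, a sketch of the portable form (my wording, not code
from the thread): on a one's complement machine an even negative
number such as -2 has its low bit set, so (n & 1) misreports it,
while the remainder test is well defined for any signed
representation.

    int is_even(int n)
    {
        return n % 2 == 0;        /* portable across signed representations */
        /* (n & 1) == 0 is only safe for non-negative n on all machines */
    }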





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-13  0:00             ` What's the best language to learn? [was Re: Should I learn C or Pascal?] Tim Behrendsen
@ 1996-08-14  0:00               ` Gabor Egressy
  1996-08-15  0:00                 ` Robert Dewar
  1996-08-16  0:00                 ` Mark Wooding
  1996-08-14  0:00               ` What's the best language to learn? [was Re: Should I learn C or Pascal?] Peter Seebach
  1996-08-21  0:00               ` What's the best language to learn? [any language except Ada] Bill Mackay
  2 siblings, 2 replies; 688+ messages in thread
From: Gabor Egressy @ 1996-08-14  0:00 UTC (permalink / raw)



Tim Behrendsen (tim@airshields.com) wrote:
[snip]

: Now, what if they had started ol' Darin off with some very
: simple concepts in assembly, really showed him the procedural

Why oh why would you want to start with assembly? Assembly is great for
writing viruses and small code that needs to be fast but is a pain to write
and maintain. How is assembly going to teach you about "the procedural
nature of computers, data flow, data transformation"? Assembly is OK after
you have some programming in a high-level language under your belt. If you
really want to go back in time, why don't you suggest machine code? After
all, all you need to know are 1's and 0's. Can't get simpler than that.

: nature of the computer, data flow, data transformations, etc.,
: and *then* moved on to algorithms such as Quicksort.  You

Quicksort isn't that hard to understand. Just grab a deck of cards and plow
through it. A deck of cards works for all the sorts I know and care to
know.

: just plain can't fail to understand what's going on!


--
Gabor Egressy       gegressy@uoguelph.ca
Guelph, Ontario     gabor@snowhite.cis.uoguelph.ca
Canada




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-07-28  0:00                 ` J. Christian Blanchette
                                     ` (6 preceding siblings ...)
  1996-08-13  0:00                   ` Ole-Hjalmar Kristensen FOU.TD/DELAB
@ 1996-08-14  0:00                   ` Richard A. O'Keefe
  1996-08-15  0:00                   ` Teaching sorts [was Re: What's the best language to start with?] Norman H. Cohen
  1996-08-19  0:00                   ` Ted Dennison
  9 siblings, 0 replies; 688+ messages in thread
From: Richard A. O'Keefe @ 1996-08-14  0:00 UTC (permalink / raw)



There's a debate full of sound and fury going on.
So far I haven't noticed anyone else make these points:

(a) I personally don't know any really good programmers who don't understand
    "assembler".  Understand != like; understand != can write large volumes
    of code; understand != normally uses assembly level concepts.

    What's really involved here is a grasp of computer _architecture_.
    8086, 68000, ns32000, HP-PA, SPARC, Power-PC, /370, PDP-11, even KA-10
    have a *heck* of a lot in common.  Having used a B6700 in my
    undergraduate years, these machines all look pretty much the same to
    me:  untagged byte addressed register machines with a linear address
    space (in the case of the 8086, there is a linear address space
    _there_ you just have to do strange things to get to it).  But even
    the B6700 fits into the general
	- linear physical address space
	- load, store, operate
	- fixed set of data sizes
	- approximate arithmetic uses (hardware or software) floating
	  point instead of any of the several alternatives that have
	  been proposed.
    mould, well described by the (abstract) RAM cost model within reasonable
    limits.

    People who say "but if you learn an assembler, you think in terms of
    that machine for the rest of your life" are not talking about *good*
    programmers; they are talking about the trained seals who will write
    C for the rest of their lives whatever programming language they use.
    A *good* programmer is no more limited by the first machine s/he learns
    (in my case I learned the IBM 650 from a book shortly before I learned
    the IBM /360 from another book; I have seen a 650 in a museum but never
    programmed one) than s/he is limited by the first programming language
    s/he learns (in my case, Fortran and Algol 60, more or less at the same
    time).

(b) To me, one of the main advantages of knowing about computer
    architecture is that I have rock-solid confidence that all these
    towers of abstraction actually bottom out in things I understand.
    Well, I've forgotten most of the semiconductor physics they taught
    me, but give me enough 74xx chips and I could build a computer if I
    had to.

    The point here is my psychology-of-learning:  I am comfortable with
    abstraction, but I have to know that there is something at the bottom.
    I'm quite happy to work with the lambda calculus, for example, but I
    know how to implement it on hardware that I understand, so I proceed
    in the *confidence* that it is something I could take apart in as much
    detail as I wanted, given time.  Similarly, I was able to learn Prolog
    fast, because I had previously written a theorem prover, in a lisp-
    like language that I had implemented myself.  So I *knew* for certain,
    without any worry, that this was the kind of thing that *could* be
    implemented on a computer.  I didn't have to understand how the Prolog
    system I was using actually worked to be sure that Prolog was the kind
    of thing that _could_ work.

    Let's face it:  how many of you were taught about integration (a
    high level abstraction) before you had some practice with counting
    rods (the natural numbers are an abstraction, but counting rods are
    solid wood that you can hold in your hands and *see* that 1+2 = 2+1 = 3).
    One of the problems with "New Math" was excessive talk about abstractions
    (commutativity and so on) before children had thoroughly mastered the
    concrete experiences (counting, addition) the abstractions are based on.

    So,
	- some people don't handle abstraction well,
	- some people handle abstraction very well, but require an
	  understanding of what's underneath the layers of abstraction
	  for comfort, even if they don't _use_ those layers while
	  manipulating a particular abstraction
	- some people handle abstraction well, and don't mind not
	  understanding what's underneath the abstraction

    What proportion of the general population belongs to each group is an
    empirical question about which I have no information.

    What proportion of the population of presently or potentially good
    programmers is another empirical question about which I have only
    anecdotal information.  My personal experience has been that people
    in the first group do not become good programmers, and that people
    in the third group write papers, not programs.  But of course my
    experience is limited to a few hundred people from a narrow range
    of ages and cultures.

(c) The conclusion we draw from this is that different people are likely
    to require different approaches (most of the PC computer books are
    clearly aimed at people who have no ability with abstraction at all,
    in consequence I find them frustratingly unreadable) and this 
    includes different initial languages.  Some people will learn best by
    starting with Scheme.  Some people will learn best by starting with
    transistors, going on to gates and flip flops, moving on to ALUs, then
    learning say DEC-10 with its assembler, then Pascal, then Lisp, then ML.
    
    Me, I learned bottom up and I'm glad of it.
    (For the record, I use the highest level languages I can get my hands
    on.  But I still have to check assembly code from time to time.)

(d) The *real* problem we have trying to teach some of this stuff is that
    there just isn't enough *time*.  There are so *many* things out students
    "need" to know:  user interfaces, relational data bases, a couple of
    programming languages, software engineering basics, you name it.  RMIT
    actually has a reputation for graduating employable students, and I
    honestly don't know how we manage it when I think of all the knowledge
    I deploy in programming and the tiny fraction of it they get.  I am
    still rather unhappy about letting anyone use floating point arithmetic
    without a semester of numerical analysis so they at least know where
    the pitfalls are and when to call for help.

    There are no royal roads and there isn't enough time.


Now for something rather different.  I am cosupervising a masters student
who is trying to get a handle on *measuring* the effect of first year language.
This is actually part of RMIT's Quality Control program, and Dr Isaac Balbin
came up with the idea of getting someone to see if we could actually _measure_
whether selecting a particular first year language (in our case, Ada) is or
is not having the intended educational results.  The student has conducted a
literature survey, and has found a couple of survey papers, but they seem to
be weak on actually measuring *outcomes* (other than whether the students
_liked_ it or not, which is important, but not everything).

So can anyone point me to some empirical results, where someone has done
some before and after educational measurements, to see what effect a new
first year language (e.g. switching from Pascal to Miranda, or Fortran to
Modula-2, or whatever) has actually had?  Unpublished results would be fine.

-- 
Fifty years of programming language research, and we end up with C++ ???
Richard A. O'Keefe; http://www.cs.rmit.edu.au/~ok; RMIT Comp.Sci.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-08  0:00                                           ` Tim Behrendsen
  1996-08-08  0:00                                             ` Peter Seebach
@ 1996-08-14  0:00                                             ` Richard A. O'Keefe
  1996-08-16  0:00                                               ` Tim Behrendsen
  1 sibling, 1 reply; 688+ messages in thread
From: Richard A. O'Keefe @ 1996-08-14  0:00 UTC (permalink / raw)



Recursive gcd:
    int gcd(int x, int y) {
	if (y == 0) return x;
	return gcd(y, x%y);
    }

Iterative gcd:
    int gcd(int x, int y) {
	int z;
	while (y != 0) z = y, y = x%y, x = z;
	return x;
    }

Peter Seebach <seebs@solutions.solon.com> wrote that
the compiler he uses would generate the equivalent of the iterative
code from the recursive source.

"Tim Behrendsen" <tim@airshields.com> asked incredulously
>Which compiler is that?  I would like to see the compiler that
>unravels recursion into non-recursive algorithms.

GCC 2.7.2 will do it.  So will SPARCompiler C 4.0.
(I mention those versions because they are the versions I have.)
So will any even half-way decent Lisp or Scheme compiler.

>It has *everthing* to do with algorithms.  You showed me a recursive
>algorithm, I rewrote it to be faster.  The fact that in this case
>it's only 10% is irrelevant

Actually, it isn't irrelevant.  The speed up is only a by a constant
factor, and all it tells you is that you are using a compiler that
does a really bad job of compiling function calls.  Any effort spent
speeding up this version of gcd is wasted, because GCD can be implemented
to have the same cost as division.  On a machine like a V7 SPARC, where
there was no division instruction, a well-written formally recursive
gcd could be as fast as a single x%y.

>for another algorithm it could have
>been much more (compare recursive v.s. non-recursive quicksort).

>I consider a recursive solution to be a different algorithm than a
>non-recursive one, although I realize that is arguably an
>implementation issue.

This is a lot like calling a procedure that uses an 'if (...) goto'
a different *algorithm* from the same procedure expressed using 'while'.
We have known since the 70s how to compile tail calls into jumps, and
since the late 70s at least have had a substantial body of theory about
how to automatically translate many uses of recursion into iteration
at the implementation level.  Turning tail calls into jumps is the very
simplest of these techniques.  (And yes, the loop thus uncovered can of
course be unrolled, just like any other loop.)
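
By way of illustration (a sketch of the transformation only, not
either compiler's actual output), the recursive gcd above becomes,
once the tail call is turned into a jump:

    int gcd(int x, int y) {
    top:
        if (y == 0) return x;
        {
            int r = x % y;        /* the arguments of the tail call ...  */
            x = y;                /* ... become reassignments, and the   */
            y = r;                /* call itself becomes a jump          */
        }
        goto top;
    }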

If you encounter any significant difference in speed between recursive
and non-recursive quicksort, all that tells you is that it is time to
change compilers.  Come to that, if you really cared about speed you
wouldn't be using quicksort anyway.

>If I implemented Quicksort in APL, say (which has direct
>manipulation of arrays in one operation), 

Given the grade-up primitive, I can't think why anybody would _want_
to implement quicksort in APL.

>the student would
>not see the movement, because they would just see the computer
>doing the array "in one fell swoop", but wouldn't really
>experience the fact that the computer doesn't really do it
>that way in reality.

Interestingly enough, the APL sorting primitive DOESN'T move the data
at all.  It returns a permutation vector.  The APL idiom for sorting is
	X[.GradeUP X]
where the necessary movement is quite visible.
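
For readers who haven't met the idiom, a rough C analogue (mine;
grade_up and its helper are hypothetical names): grade-up only builds
a permutation vector, and the data movement happens in a separate,
visible indexing step.

    #include <stdlib.h>

    static const int *grade_data;                 /* key array being graded */

    static int by_key(const void *pa, const void *pb)
    {
        int a = grade_data[*(const int *)pa];
        int b = grade_data[*(const int *)pb];
        return (a > b) - (a < b);
    }

    /* fill perm[0..n-1] with the indices that would sort x -- no data moved */
    void grade_up(const int *x, int *perm, int n)
    {
        int i;
        for (i = 0; i < n; ++i) perm[i] = i;
        grade_data = x;
        qsort(perm, n, sizeof perm[0], by_key);
    }
    /* the "X[GradeUp X]" step is then an explicit copy: y[i] = x[perm[i]] */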

-- 
Fifty years of programming language research, and we end up with C++ ???
Richard A. O'Keefe; http://www.cs.rmit.edu.au/~ok; RMIT Comp.Sci.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-13  0:00             ` What's the best language to learn? [was Re: Should I learn C or Pascal?] Tim Behrendsen
  1996-08-14  0:00               ` Gabor Egressy
@ 1996-08-14  0:00               ` Peter Seebach
  1996-08-14  0:00                 ` Tim Behrendsen
  1996-08-21  0:00               ` What's the best language to learn? [any language except Ada] Bill Mackay
  2 siblings, 1 reply; 688+ messages in thread
From: Peter Seebach @ 1996-08-14  0:00 UTC (permalink / raw)



In article <01bb8950$2c8dcc60$87ee6fce@timpent.airshields.com>,
Tim Behrendsen <tim@airshields.com> wrote:
>This is a perfect example of how students are being graduated
>without fully understanding what programming is all about.  The
>phrasing is perfect: "I learned it..., but didn't understand it."

>*This is how it happens!*

We have absolutely no mechanism here, of course.

>I interpret this to mean that he was struggling with all the
>abstractions while trying to master the concept of "thinking
>like a programmer".  Meanwhile, they are packing algorithm after
>algorithm into his head when he is not prepared to understand
>what they are packing.

I am not sure I buy your interpretation.  Thinking like a programmer is
largely a matter of learning to find useful abstractions.  How do you write
code to test whether a number is even?  Well, you do it by looking at what
evenness is, and looking for what characteristics it has that would help you.

>Now, what if they had started ol' Darin off with some very
>simple concepts in assembly, really showed him the procedural
>nature of the computer, data flow, data transformations, etc.,
>and *then* moved on to algorithms such as Quicksort.  You
>just plain can't fail to understand what's going on!

But he would have been completely unprepared for whole families of computer
languages.

If they'd started him off with explaining, in his native language, how to sort
things, and given him sample sets of cards to sort, while following each of a
set of descriptions *in his native language*, he would have understood
quicksort.

You keep using assembly as an example of what the basics of computing are
like.  Please explain how this is a better model *of the problem domain* than
lisp or C.  Why should students try to learn machine-level and program-level
at the same time?  If you want them to start with architectures, don't start
them on algorithms, start them on trivial data manipulation in assembly.

-s
-- 
Peter Seebach - seebs@solon.com - Copyright 1996 - http://www.solon.com/~seebs
Unix/C Wizard - send mail for help, or send money for consulting!
The *other* C FAQ, the hacker FAQ, et al.  See web page above.
Unsolicited email (junk mail and ads) is unwelcome, and will be billed for.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
       [not found]                                                     ` <32 <01bb8923$e1d34280$87ee6fce@timpent.airshields.com>
@ 1996-08-14  0:00                                                       ` Peter Seebach
  1996-08-14  0:00                                                         ` Tim Behrendsen
  0 siblings, 1 reply; 688+ messages in thread
From: Peter Seebach @ 1996-08-14  0:00 UTC (permalink / raw)



In article <01bb8923$e1d34280$87ee6fce@timpent.airshields.com>,
Tim Behrendsen <tim@airshields.com> wrote:
>We have the world that you want.  This is CS curriculum today; are
>you happy with the level of expertise of the graduates?  I'm not,
>based on my experience with trying to hire them.  If you're not
>either, what do you think the reason is?

I don't know for sure what a CS curriculum is today.  If it's anything like
psych, the first problem is that students aren't taking enough philosophy and
math classes, and aren't learning basic analytic skills like decomposition.
If you can't take a problem and break it down into parts, you can't do
anything.

I don't see assembly as a better model than English for learning this.  I see
the problem as being a tendency to focus on concrete problem solving in any
number of languages, and no real study of the art that unifies the languages.
A programmer taught in such a curriculum knows some languages and not others,
but would need a class to learn any new language.  This is probably not ideal.

-s
-- 
Peter Seebach - seebs@solon.com - Copyright 1996 - http://www.solon.com/~seebs
Unix/C Wizard - send mail for help, or send money for consulting!
The *other* C FAQ, the hacker FAQ, et al.  See web page above.
Unsolicited email (junk mail and ads) is unwelcome, and will be billed for.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-13  0:00                                     ` Robert I. Eachus
@ 1996-08-14  0:00                                       ` Robert Dewar
  1996-08-15  0:00                                       ` Tom Payne
  1 sibling, 0 replies; 688+ messages in thread
From: Robert Dewar @ 1996-08-14  0:00 UTC (permalink / raw)



Robert Eachus says, reemphasizing an incorrect point :-)

"
  If you use the canonical heapsort it is O(2n) average, O(n log n)
worst case.  (The worst case is when each card added to the bottom of
the heap propagates to the top, using the usual version of the sort,"

No, absolutely NOT.  Building a heap is worst case O(N) (what the heck
does O (2N) mean -- I would take at least 25% credit off for a student
putting a junk constant like this in a big O formula!)

The proper algorithm for creating a heap is to apply what Knuth calls
siftup successively to nodes N/2, N/2-1, N/2-2 .. 1. This algorithm
is clearly linear.
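
A rough sketch of that construction (my code, under the assumption of
a 1-based max-heap stored in a[1..n]); the total work over all the
sift-downs is O(n):

    static void sift_down(int a[], int n, int i)
    {
        for (;;) {
            int child = 2 * i;                    /* left child of node i */
            if (child > n) break;
            if (child < n && a[child + 1] > a[child]) ++child;
            if (a[i] >= a[child]) break;          /* heap property holds  */
            { int t = a[i]; a[i] = a[child]; a[child] = t; }
            i = child;
        }
    }

    void build_heap(int a[], int n)               /* a[1..n], a[0] unused */
    {
        int i;
        for (i = n / 2; i >= 1; --i)
            sift_down(a, n, i);
    }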





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-13  0:00                                     ` Robert I. Eachus
  1996-08-13  0:00                                       ` Lawrence Kirby
@ 1996-08-14  0:00                                       ` Robert Dewar
  1 sibling, 0 replies; 688+ messages in thread
From: Robert Dewar @ 1996-08-14  0:00 UTC (permalink / raw)



Robert Eachus says

"  It means that when you finish the first stage of the Quicksort, you
have several piles laid out on the table.  Completing the sort
requires picking them up in order.  For heap sort, building the heap
is also an O(n log n) operation, since you maintain a heap.  However
picking up the cards is also N log N since you have to maintain the
heap property, and in any case you pick them up one at a time."

A surprising slip :-)
Building a heap is a worst case O(N) process. I have seen it occasionally
misdescribed in a form that would be N logN, but if you look at the
original treesort3, you will see that the heap building phase is clearly
linear (the proof is straightforward, I can share it offline if you like).





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-08  0:00                                     ` Teaching sorts [was Re: What's the best language to start with?] Robert I. Eachus
                                                         ` (3 preceding siblings ...)
  1996-08-12  0:00                                       ` Steve Heller
@ 1996-08-14  0:00                                       ` Stephen Baynes
  1996-08-14  0:00                                         ` Robert Dewar
  1996-08-14  0:00                                         ` Robert Dewar
  4 siblings, 2 replies; 688+ messages in thread
From: Stephen Baynes @ 1996-08-14  0:00 UTC (permalink / raw)


[-- Warning: decoded text below may be mangled, UTF-8 assumed --]
[-- Attachment #1: Type: text/plain, Size: 1124 bytes --]


Robert I. Eachus (eachus@spectre.mitre.org) wrote:


:     I managed to do the "fun" experiment once.  Take three students
: and have them learn Quicksort, Heapsort, and Bubblesort on "small"

Why do people try to teach students Bubblesort? It may be an intellectually
interesting exercise but it is of no use. An insertion sort is simpler and at
least as fast as a bubble sort. For many practical programming problems an
insertion sort is a sensible solution, it is very compact and for small
datasets as fast as anything (Many quicksort implementations switch to
insertion sort for less than about 7 items). I have lost count of the number
of times I have had to redirect graduates who have tried to write a small sort
using Bubblesort (because it was the simplest sort they were taught) or
Quicksort (because they have been taught it is faster in all cases).
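
A minimal insertion sort, for comparison (a sketch of mine, not code
from the post); it is also the small-array fallback that many
quicksort implementations switch to:

    void insertion_sort(int a[], int n)
    {
        int i, j;
        for (i = 1; i < n; ++i) {
            int key = a[i];
            for (j = i; j > 0 && a[j - 1] > key; --j)
                a[j] = a[j - 1];                  /* shift larger items right */
            a[j] = key;                           /* drop key into its slot   */
        }
    }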


--
Stephen Baynes                              baynes@ukpsshp1.serigate.philips.nl
Philips Semiconductors Ltd
Southampton                                 My views are my own.
United Kingdom
 Are you using ISO8859-1? Do you see © as copyright, ÷ as division and ½ as 1/2?




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-13  0:00                                               ` Tim Behrendsen
  1996-08-13  0:00                                                 ` Giuliano Carlini
@ 1996-08-14  0:00                                                 ` Dan Pop
  1996-08-14  0:00                                                   ` Tim Behrendsen
  1996-08-16  0:00                                                   ` Dik T. Winter
  1 sibling, 2 replies; 688+ messages in thread
From: Dan Pop @ 1996-08-14  0:00 UTC (permalink / raw)



In <01bb88d4$022bf4a0$32ee6fce@timhome2> "Tim Behrendsen" <tim@airshields.com> writes:

>Dan Pop <Dan.Pop@cern.ch> wrote in article
><danpop.839807980@news.cern.ch>...
>> In <01bb87cf$97ae8e80$87ee6fce@timpent.airshields.com> "Tim Behrendsen"
><tim@airshields.com> writes:
>> 
>> >Dan Pop <Dan.Pop@cern.ch> wrote in article
>> ><danpop.839594575@news.cern.ch>...
>> >> In <01bb8534$b2718bc0$87ee6fce@timpent.airshields.com> "Tim
>Behrendsen"
>> ><tim@airshields.com> writes:
>> >> 
>> >> >Here's an example:
>> >> >
>> >> >int a[50000],b[50000],c[50000],d[50000],e[50000];
>> >> >
>> >> >void test1()
>> >> >{
>> >> >    int i, j;
>> >> >    for (j = 0; j < 10; ++j) {
>> >> >        for (i = 0; i < 50000; ++i) {
>> >> >            ++a[i]; ++b[i]; ++c[i]; ++d[i]; ++e[i];
>> >> >        }
>> >> >    }
>> >> >}
>> >> >
>> >> >void test2()
>> >> >{
>> >> >    int i, j;
>> >> >    for (j = 0; j < 10; ++j) {
>> >> >        for (i = 0; i < 50000; ++i) ++a[i];
>> >> >        for (i = 0; i < 50000; ++i) ++b[i];
>> >> >        for (i = 0; i < 50000; ++i) ++c[i];
>> >> >        for (i = 0; i < 50000; ++i) ++d[i];
>> >> >        for (i = 0; i < 50000; ++i) ++e[i];
>> >> >    }
>> >> >}
>> >> >
>> >> >On my AIX system, test1 runs in 2.47 seconds, and test2
>> >> >runs in 1.95 seconds using maximum optimization (-O3).  The
>> >> >reason I knew the second would be faster is because I know
>> >> >to limit the amount of context information the optimizer has
>> >> >to deal with in the inner loops, and I know to keep memory
>> >> >localized.
>> >> 
>> >> 1. For a marginal speed increase (~25%), you compromised the
>readability
>> >>    of the code.
>> >
>> >You call 25% a "marginal" increase?  It's attitudes like this
>> >that give us the slow code world we have right now.
>> 
>> Yes, I do call 25% a marginal increase.  Would you be considerably
>happier
>> if everything would run 25% faster, at the expense of everything being
>> harder to develop and maintain by a factor of 2?
>
>No, I wouldn't be happier with "a factor of 2", but that's a
>very large presumption. 

No, it isn't.  In nontrivial and meaningful examples (i.e. not like yours)
there is quite a difference between the effort to express an algorithm
in the most natural way and in the most efficient way for a certain
compiler/platform combination.  Compare a plain loop with Duff's device
and you'll see what I mean.

>Who's doing anything strange and
>arcane?  I just made it easier on the compiler / computer.

You made it easier on your compiler/computer combination, but not on mine.

>> >And how is one less readable than the other?
>> 
>> Methinks one loop is easier to read than five.
>
>Sometimes, but other times I would rather read five small
>loops right in a row than one mammoth loop that I have to
>take in all at once.

In this particular case, test1 is easier to read.  It also happens to be
faster on some machines.

>In this particular case, I think they are both equally
>readable.

I don't.

>> >> 2. Another compiler, on another system, might generate faster code
>> >>    out of the test1.  This is especially true for supercomputers,
>> >>    which have no cache memory (and where the micro-optimizations are
>done
>> >>    based on a completely different set of criteria) and where the cpu
>> >time
>> >>    is really expensive.
>> >
>> >Show me the computer where test1 comes out faster. 
>> 
>> Already done: my notebook.
>
>Yes, trivially faster.  But how about significantly faster?

There is no machine that I can think of where one of them would be
significantly faster than the other.  Do you have a counterexample?

>> >Or shouldn't we depend on the compiler to optimize this?
>> 
>> Exactly, it's compiler's job to optimize it, not mine.
>
>Well, the compiler didn't do a very good job, does it?

YOUR compiler didn't do a very good job.  By crippling your code for
your compiler you made it less efficient for other platforms.  Hence,
your example was a very bad one.

>And that's the whole point.  It is ivory tower naivete to
>think that compilers optimize everything perfectly every
>time.

It's equally naive to believe that code micro-optimized for one 
platform will be faster on any other platform.

>Yes, you can just wave your hand and say, "well,
>obviously AIX sucks", but in my experience at least, ALL
>compilers suck in one way or another.

And adapting your code to one particular sucking compiler is a great 
idea, if I understood your point right.

>> >> Let's see what happens on my 486DX33 box:
>> >> 
>> >> So, it's 1.10 + 0.23 = 1.33 seconds of cpu time for test1 versus
>> >> 1.17 + 0.18 = 1.35 seconds for test2. 
>> 
>> Get a clue.  If test1 flushes the cache, test2 will flush it, as well.
>> Both implementations behave the same way WRT cache utilization: all
>> 5 x 50000 array elements are accessed in a single iteration of the outer
>> loop, hence the same thing (cache hit or cache miss) will happen at the
>> next iteration in both versions.  Cache is simply a non-issue in your
>> example (even if you intended it to be :-)
>
>- Sigh - I must really think about these posts for more than
>the two minutes per thread.  You are right, of course.  The
>cache is irrelevent to this example.  It's more likely the
>fact that the first case is makes better use of page locality.
>It may also be that the compiler ran out of address registers
>with five arrays (or a combination of both).

I wasn't aware that the POWER architecture has "address registers".
Anyway, my 486 definitely ran out of registers, but this didn't prevent
test1 from being marginally faster.

>Of course, I could restructure the example to take advantage
>of the cache, and get even more improvement. :)

Or make test1 even faster on my notebook, with only 8k of cache :-)
 
>> >In any case, the code is identically
>> >readable either way IMO, and costs you nothing to implement
>> >the efficient way.
>> 
>> You forgot to provide a VALID justification for your claim that test2
>> is the "efficient way".
>
>It's probably mostly the paging locality.

Both versions have excellent paging locality.  Read them again and tell
me which one is likely to produce more page faults than the other.

>Did you run your
>test under DOS or Windows?  It would be interesting to see
>if there was a difference.

I have better uses for my time than to use DOS and Windows.  But the
Linux/gcc combination proved your claim wrong.

>> >So it wasn't on one architecture, big deal.
>> 
>> Can you prove that it will be on any other architecture?  According
>> to your line of argumentation, I could say: "so it was on one
>architecture,
>> big deal" :-)
>
>I would have to try it on other architectures to be sure, but
>can you come up with a scenerio that the second would be
>significantly slower?

Can you come up with a scenario in which the second would be significantly
faster?

>I would say that on the average paging
>architecture, memory locality tends to be quite important to
>performance.

And I have to repeat that both implementations have similar memory
locality.  Unless you can prove otherwise.

>> >Agreed; optimization shouldn't be to a certain platform, but
>> >optimizations can be made on general basis.
>> 
>> Right.  By selecting the proper algorithm.  Most other kinds of
>> optimizations will be specific to a certain machine, or, at best, class
>> of machines.  For example, an optimization which tries to improve the
>> cache hit rate might hurt on a supercomputer which has no cache, but it
>> is adversely affected by certain memory access patterns.
>
>Well, if you're using a super computer, chances are you *will*
>optimize to that architecture, because super computer time
>tends to be expensive.  For general purpose software, the
>architectures are pretty much the same all around.  They all
>have caches, they all have paging.

Are you kidding or what?  Please explain the paging scheme of MSDOS, the
most popular platform in the world (unfortunately).  And cacheless
architectures are common at both ends of the spectrum.

>> When programming for a single platform/compiler combination, it makes
>> sense to perform micro-optimizations, _if they're rewarding enough_.
>> But then, if portability is not an issue, the critical code could be
>> written in assembly, as well.  When writing portable code in a HLL,
>> micro-optimizations are usually a waste of effort and programmer time.
>
>I think that most architectures in the real world are close
>enough that you can find commonality, such as inefficient
>recursion.

There are plenty of algorithms where recursion is the most efficient
implementation.  Just ask any Fortran programmer who had to implement
such an algorithm without using recursion.
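
(To give one concrete illustration -- my own sketch, not from the original
post -- walking a binary tree is naturally recursive in C, whereas an
iterative version has to manage an explicit stack by hand:)

#include <stddef.h>

struct node { struct node *left, *right; };

/* Count the nodes of a binary tree.  The recursion mirrors the shape
   of the data structure itself. */
int count_nodes(const struct node *t)
{
    return t == NULL ? 0 : 1 + count_nodes(t->left) + count_nodes(t->right);
}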

Try to find a better example next time.

Dan
--
Dan Pop
CERN, CN Division
Email: Dan.Pop@cern.ch 
Mail:  CERN - PPE, Bat. 31 R-004, CH-1211 Geneve 23, Switzerland




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-14  0:00               ` What's the best language to learn? [was Re: Should I learn C or Pascal?] Peter Seebach
@ 1996-08-14  0:00                 ` Tim Behrendsen
  1996-08-14  0:00                   ` Peter Seebach
                                     ` (3 more replies)
  0 siblings, 4 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-14  0:00 UTC (permalink / raw)



Peter Seebach <seebs@solutions.solon.com> wrote in article
<4urmvu$dfp@solutions.solon.com>...
> In article <01bb8950$2c8dcc60$87ee6fce@timpent.airshields.com>,
> Tim Behrendsen <tim@airshields.com> wrote:
> >This is a perfect example of how students are being graduated
> >without fully understanding what programming is all about.  The
> >phrasing is perfect: "I learned it..., but didn't understand it."
> 
> >*This is how it happens!*
> 
> We have absolutely no mechanism here, of course.
> 
> >I interpret this to mean that he was struggling with all the
> >abstractions while trying to master the concept of "thinking
> >like a programmer".  Meanwhile, they are packing algorithm after
> >algorithm into his head when he is not prepared to understand
> >what they are packing.
> 
> I am not sure I buy your interpretation.  Thinking like a programmer is
> largely a matter of learning to find useful abstractions.  How do you
write
> code to test whether a number is even?  Well, you do it by looking at
what
> evenness is, and looking for what characteristics it has that would help
you.

But at some point you have to sit down and prepare an
implementation, and that's when you know whether the problem
is truly understood or not.  Solving a problem is not an
"abstraction" question, it's a question of breaking it down
into fundamental steps of data transformations.  Testing a
number as even is transforming a number into a bit based on
its evenness property.
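
(A trivial C illustration of the two viewpoints -- purely my own sketch:)

int is_even_mod(int n)      { return n % 2 == 0; }    /* abstract, arithmetic view  */
int is_even_bit(unsigned n) { return (n & 1) == 0; }  /* bit-level "transformation" */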
 
> >Now, what if they had started ol' Darin off with some very
> >simple concepts in assembly, really showed him the procedural
> >nature of the computer, data flow, data transformations, etc.,
> >and *then* moved on to algorithms such as Quicksort.  You
> >just plain can't fail to understand what's going on!
> 
> But he would have been completely unprepared for whole families of
computer
> languages.

So what?  You can always learn languages, but you only have
one shot at teaching someone to think like a programmer.
*Programming is not languages*.  Programming is breaking problems
down into fundamental steps to achieve a desired result.  The
way the steps are described is not relevant to the general question
of a solution.  When we are talking about teaching a student,
the fewer the extraneous issues such as languages, and the more
experience we can give them of the fundamental procedural
nature of the computer, the better.

> If they'd started him off with explaining, in his native language, how to
sort
> things, and given him sample sets of cards to sort, while following each
of a
> set of descriptions *in his native language*, he would have understood
> quicksort.
> 
> You keep using assembly as an example of what the basics of computing are
> like.  Please explain how this is a better model *of the problem domain*
than
> lisp or C.  Why should students try to learn machine-level and
program-level
> at the same time?  If you want them to start with architectures, don't
start
> them on algorithms, start them on trivial data manipulation in assembly.

It's a better model because it's real.  C and all languages are
artificial constructions intended for the maintenance of large
projects.

What gets missed in all this is that *computers are simple*!
All they do in the fundamental sense is take a very simple
instruction, execute it, and repeat very, very fast.  If this
is presented the first day, the student can't help but realize
the simple nature of the computer, and it loses its mystery.

Look at ol' Darin; if he didn't have enough brainpower left
to focus on the algorithm at hand, what was he being confused
by?  I think it was by all the language and abstraction, which
is not what programming is.  If he had been given a solid
foundation of the simple nature of the computer, I think he would
always have a "comfort zone" he could return to whenever he
didn't understand something, because it's *always* implemented
in the fundamental terms.

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-13  0:00                                                         ` Giuliano Carlini
@ 1996-08-14  0:00                                                           ` Tim Behrendsen
  0 siblings, 0 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-14  0:00 UTC (permalink / raw)



Giuliano Carlini <giuliano@ix.netcom.com> wrote in article
<32115234.7B0@ix.netcom.com>...
> Tim Behrendsen wrote:
> > 
> > Mike Rubenstein <miker3@ix.netcom.com> wrote in article
> > <320fe7f4.7066811@nntp.ix.netcom.com>...
> > > I've probably been helped more by my knowledge of about 25 higer
level
> > > languages.
> > >
> > > Obviously, all other things being equal, it is better to know
assembly
> > > language that to not know it.  But all other things are seldom equal.
> > > I suspect that most beginning programmers would gain much more from
> > > learning some different HLLs.  Given a choice, I'd strongly recommend
> > > one learn LISP, APL, Icon, or any of a few dozen other languages to
> > > learning assembly language.
> 
> Every programmer should know assembler as their 2cnd or third language.
> Okay, not recreational programmers, or those knocking together small
> programs, but every one who puts together programs that are larger than
> say 10K lines of code.
> 
> You don't need to write in it often, but you need it to be able to
> debug competently.
> 
> There are far to many times I'm called in to help someone debug, when
> it turned out to be a stupid compiler or system bug that a good
> understanding of assembler would have spotted immediately.
> 
> There are times I need to dive into the compiler runtime, or into the
> OS to debug my buggy code.
> 
> If you don't understand assembler, your reduced to trying one trivial
> change to your program after another. When one finally works, you've
> have no clue why. Then you can't document why some monstrous section
> of code is the way it is. And that causes problems later on.

Or heck, to examine a core dump in the field.  I list out the assembly all
the time to see what happened.  I'm one of the few people in my company
who can do a post-mortem without source-level debugging turned on,
and I don't even know the (RS/6000) assembly language!  I've just seen
enough of them to be able to feel my way around the instruction set.

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-14  0:00                                                       ` Peter Seebach
@ 1996-08-14  0:00                                                         ` Tim Behrendsen
  1996-08-14  0:00                                                           ` Peter Seebach
  0 siblings, 1 reply; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-14  0:00 UTC (permalink / raw)



Peter Seebach <seebs@solutions.solon.com> wrote in article
<4urn70$dhi@solutions.solon.com>...
> In article <01bb8923$e1d34280$87ee6fce@timpent.airshields.com>,
> Tim Behrendsen <tim@airshields.com> wrote:
> >We have the world that you want.  This is CS curriculum today; are
> >you happy with the level of expertise of the graduates?  I'm not,
> >based on my experience with trying to hire them.  If you're not
> >either, what do you think the reason is?
> 
> I don't know for sure what a CS curriculum is today.  If it's anything
like
> psych, the first problem is that students aren't taking enough philosophy
and
> math classes, and aren't learning basic analytic skills like
decomposition.
> If you can't take a problem and break it down into parts, you can't do
> anything.
> 
> I don't see assembly as a better model than English for learning this.  I
see
> the problem as being a tendancy to focus on concrete problem solving in
any
> number of languages, and no real study to the art that unifies the
languages.
> A programmer taught in such a curriculum knows some languages and not
others,
> but would need a class to learn any new language.  This is probably not
ideal.

I think this is learning bias on your part; you appear to
learn more from an abstract, theoretical basis rather than
a "sit down and try it out" basis.  I would say the latter
is more typical when learning computers than the former.

But I agree with you; "the art that unifies the languages".  I
think this is what I mean when I say "learning to think like
a programmer".  The question is, how to convey this?  It just
seems to me that the more "pure" and undistracted you can make
it, the better.

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-14  0:00                                       ` Stephen Baynes
  1996-08-14  0:00                                         ` Robert Dewar
@ 1996-08-14  0:00                                         ` Robert Dewar
  1 sibling, 0 replies; 688+ messages in thread
From: Robert Dewar @ 1996-08-14  0:00 UTC (permalink / raw)



The one advantage of bubble sort is that it is close to optimal on sorted
or nearly sorted arrays. You have to be very careful how you write insertion
sort not to require more compares in the fully sorted case, and you will
almost certainly find you require more overhead, because of the two nested
loops. Yes, you could add a special test in the outer loop for already
being in the right place, but then you complicate the inner loop if you
want to avoid repeating this comparison. A bubble sort is certainly a
much simpler solution to the problem of optimal sorting of a sorted
list, and simplicity of solutions is interesting if performance is NOT
an issue after all.

So Step




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-14  0:00                                       ` Stephen Baynes
@ 1996-08-14  0:00                                         ` Robert Dewar
  1996-08-16  0:00                                           ` Dik T. Winter
                                                             ` (3 more replies)
  1996-08-14  0:00                                         ` Robert Dewar
  1 sibling, 4 replies; 688+ messages in thread
From: Robert Dewar @ 1996-08-14  0:00 UTC (permalink / raw)



Stephen says

 Why do people try and teach students Bubblesort? It may be an interlectually
 interesting exercise but it is of no use. An insertion sort is simpler and at
 least as fast as a bubble sort. For many practical programing problems an
 insertion sort is a sensible solution, it is very compact and for small
 datasets as fast as anything (Many quicksort implementations switch to
 insertion sort for less than about 7 items). The number of times I have
 had to redirect graduates who have tried to write a small sort using
 Bubblesort (because it was the simplest sort they were taught) or Quicksort
 (because they have been taught it is faster in all cases).

The one advantage of bubble sort is that it is close to optimal on sorted
or nearly sorted arrays. You have to be very careful how you write insertion
sort not to require more compares in the fully sorted case, and you will
almost certainly find you require more overhead, because of the two nested
loops. Yes, you could add a special test in the outer loop for already
being in the right place, but then you complicate the inner loop if you
want to avoid repeating this comparison. A bubble sort is certainly a
much simpler solution to the problem of optimal sorting of a sorted
list, and simplicity of solutions is interesting if performance is NOT
an issue after all.
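
(For reference, a minimal sketch of such a bubble sort in C -- my own
illustration, with arbitrary names; the early-exit test is what makes it
essentially linear on already-sorted input:)

void bubble_sort(int a[], int n)
{
    int i, swapped, tmp;
    do {
        swapped = 0;
        for (i = 1; i < n; i++) {
            if (a[i - 1] > a[i]) {
                tmp = a[i - 1]; a[i - 1] = a[i]; a[i] = tmp;
                swapped = 1;
            }
        }
        n--;                  /* the largest element is now in place */
    } while (swapped);        /* a pass with no swaps means we are done */
}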


For quick sorts, I prefer heapsort to quicksort, because of its bounded
worst case behavior. Note that there is a little-known modification to
heap sort that reduces the number of compares to about NlogN compared
with the normal 2NlogN (the 2 is where Eachus got the O(2N), though of
course constants don't belong in big-O formulas). As far as I know this
is not really properly reported in the literature -- I treat it in detail
in my 1968 thesis, and it is an exercise in Knuth volume 3 (although his
original answer was wrong, I think I kept that $1 Wells Fargo colorful
check somewhere as a souvenir :-)

P.S. I really prefer to call heapsort treesort3, its original name 
bestowed by Floyd, the discoverer of this algorithm. Too many people
think Knuth invented the algorithm, when all he invented (I think?) 
was the name -- though of course, as always Knuth is VERY careful,
almost fanatically careful, to give full credit to original authors,
but people sometimes miss these careful credits in the books!





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-14  0:00                                                 ` Dan Pop
@ 1996-08-14  0:00                                                   ` Tim Behrendsen
  1996-08-16  0:00                                                   ` Dik T. Winter
  1 sibling, 0 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-14  0:00 UTC (permalink / raw)



Dan Pop <Dan.Pop@cern.ch> wrote in article
<danpop.840019072@news.cern.ch>...
> >
> >It's probably mostly the paging locality.
> 
> Both versions have excellent paging locality.  Read them again and tell
> me which one is likely to produce more page faults than the other.

By paging locality, I mean that virtual-to-physical address
translation happens faster when you are dealing with one
page at a time rather than "striping" in the second case.

> >I think that most architectures in the real world are close
> >enough that you can find commonality, such as inefficient
> >recursion.
> 
> There are plenty of algorithms where recursion is the most efficient
> implementation.  Just ask any Fortran programmer who had to implement
> such an algorithm without using recursion.

For example?

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-14  0:00                 ` Tim Behrendsen
@ 1996-08-14  0:00                   ` Peter Seebach
  1996-08-14  0:00                     ` Tim Behrendsen
                                       ` (2 more replies)
  1996-08-14  0:00                   ` Robert Dewar
                                     ` (2 subsequent siblings)
  3 siblings, 3 replies; 688+ messages in thread
From: Peter Seebach @ 1996-08-14  0:00 UTC (permalink / raw)



In article <01bb89f1$31be4f60$87ee6fce@timpent.airshields.com>,
Tim Behrendsen <tim@airshields.com> wrote:
>>I am not sure I buy your interpretation.  Thinking like a programmer is
>>largely a matter of learning to find useful abstractions.  How do you write
>>code to test whether a number is even?  Well, you do it by looking at what
>>evenness is, and looking for what characteristics it has that would help you.

>Solving a problem is not an
>"abstraction" question, it's a question of breaking it down
>into fundamental steps of data transformations.

No, that's implementing a solution.  Different stage.  Consider Cantor's
diagonal.  A perfectly good algorithm which you *cannot* implement.

>Testing a
>number as even is transforming a number into a bit based on
>its evenness property.

But how you do that has *NOTHING* to do with the question of whether or not
you have defined a solution to the problem.

Whether or not you can do that will answer the question "is my solution going
to work on existing machines".  But if you can't, you still have a solution,
it's just not one which meets your requirements.

>So what?  You can always learn languages, but you only have
>one shot at teaching someone to think like a programmer.

Nonsense.  People learn new things all the time.

>*Programming is not languages*.  Programming is breaking problems
>down into fundamental steps to achieve a desired result.  The
>way the steps are described is not relevent to the general question
>of a solution.  When we are talking about teaching a student,
>the less extraneous issues such as languages and the more
>experience we can given them of the fundamental procedural
>nature of the computer, the better.

I don't really think we are going to convince each other.  I believe that
the "fundemental procedural nature of the computer" may not be a permanent
thing.  I'd rather teach them to think in the general terms, and see that
procedural nature as just one more current limitation, like computers which
have only 2 MB of RAM.  This will better prepare them for techniques that
depend on *not* thinking of the computer as procedural.  Look at Icon...
Neat language, but if you think in terms of normal procedural code, it will
never work for you.

>> You keep using assembly as an example of what the basics of computing are
>> like.  Please explain how this is a better model *of the problem domain*
>> than lisp or C.  Why should students try to learn machine-level and
>> program-level at the same time?  If you want them to start with
>> architectures, don't start them on algorithms, start them on trivial data
>> manipulation in assembly.

>It's a better model because it's real.  C and all languages are
>artificial constructions intended for the maintainance of large
>projects.

So?  They're just as real.  *computers* are artificial constructions.

>What gets missed in all this is that *computers are simple*!
>All they do in the fundamental sense is take a very simple
>instruction, execute it, and repeat very, very fast.  If this
>is presented the first day, the student can't help but realize
>the simple nature of the computer, and it loses its mystery.

I think you underestimate the idiocy of many college students.  :)

Still, why wouldn't, say, scheme, work for this?  You can make a *VERY*
simple system - much simpler than assembly for any platform - and show
the students how everything is built on it.

If all you want to do is show them that the computer is simple, teach them
in Turing machine assembly.  :)

>Look at ol' Darin; if he didn't have enough brainpower left
>to focus on the algorithm at hand, what was he being confused
>by?  I think it was by all the language and abstraction, which
>is not what programming is.

But it *is* what programming is.  Programming is turning a problem
description into a solution description.  The implementation is merely
coding.  Turning a solution description into a program can be done
by trained monkeys.  (Or, empirically, compilers.)

>If he had been given a solid
>foundation of the simple nature of the computer, I think he would
>always have a "comfort zone" he could return to whenever he
>didn't understand something, because it's *always* implemented
>in the fundamental terms.

But why insist on making the comfort zone so far from the nature of the
task?

This is like teaching people to drive by starting them out with metallurgy.
Sure, it's fundamental to the operation of the car, *but they don't need to
know it*.  It can help, but it's not a good basis for an understanding at
the right level.

-s
-- 
Peter Seebach - seebs@solon.com - Copyright 1996 - http://www.solon.com/~seebs
Unix/C Wizard - send mail for help, or send money for consulting!
The *other* C FAQ, the hacker FAQ, et al.  See web page above.
Unsolicited email (junk mail and ads) is unwelcome, and will be billed for.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-14  0:00                                                         ` Tim Behrendsen
@ 1996-08-14  0:00                                                           ` Peter Seebach
  0 siblings, 0 replies; 688+ messages in thread
From: Peter Seebach @ 1996-08-14  0:00 UTC (permalink / raw)



In article <01bb89f2$de94b840$87ee6fce@timpent.airshields.com>,
Tim Behrendsen <tim@airshields.com> wrote:
>I think this is learning bias on your part; you appear to
>learn more from an abstract, theoretical basis rather than
>a "sit down and try it out" basis.  I would say the latter
>is more typical when learning computers than the former.

The latter is more typical when learning anything; it's the dominant
form of human learning.  I think it's poorly suited to computers and
math.

>But I agree with you; "the art that unifies the languages".  I
>think this is what I mean when I say "learning to think like
>a programmer".  The question is, how to convey this?  It just
>seems to me that the more "pure" and undistracted you can make
>it, the better.

I agree with this.  I just don't think assembly counts as pure and
undistracted.

Learning to program in assembly strikes me as like trying to learn what red is
by looking at fire trucks.  It'd be easier to comprehend by looking at a patch
of red fabric.

We often use "fire engine red" to name a specific kind of red, but I'm not
sure it's the best way to learn colors.

-s
-- 
Peter Seebach - seebs@solon.com - Copyright 1996 - http://www.solon.com/~seebs
Unix/C Wizard - send mail for help, or send money for consulting!
The *other* C FAQ, the hacker FAQ, et al.  See web page above.
Unsolicited email (junk mail and ads) is unwelcome, and will be billed for.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-14  0:00                 ` Tim Behrendsen
  1996-08-14  0:00                   ` Peter Seebach
@ 1996-08-14  0:00                   ` Robert Dewar
  1996-08-14  0:00                     ` Tim Behrendsen
                                       ` (2 more replies)
  1996-08-16  0:00                   ` Dr. Richard Botting
  1996-08-16  0:00                   ` Bob Gilbert
  3 siblings, 3 replies; 688+ messages in thread
From: Robert Dewar @ 1996-08-14  0:00 UTC (permalink / raw)



Tim says

""abstraction" question, it's a question of breaking it down
into fundamental steps of data transformations.  Testing a
number as even is transforming a number into a bit based on
its evenness property."

Well, it is certainly easy to see why you like to teach assembler early,
and to me your viewpoint is a good example of why I do NOT like that
approach: you have an unrelentingly low-level view of things.  For
example, what on EARTH, from a semantic point of view, does a test for
evenness have to do with a "bit"?  Nothing at all!





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-14  0:00               ` Gabor Egressy
@ 1996-08-15  0:00                 ` Robert Dewar
  1996-08-17  0:00                   ` Lawrence Kirby
  1996-08-16  0:00                 ` Mark Wooding
  1 sibling, 1 reply; 688+ messages in thread
From: Robert Dewar @ 1996-08-15  0:00 UTC (permalink / raw)



Gabor says

"Qicksort isn't that hard to understand. Just grab a deck of cards and plow
through it. A deck of cards works for all the sorts I know and care to
know."

I would disagree with this. The divide and conquer paradigm of QS is of
course trivial to understand if you understand recursion (although for
starting students, that can be a very big if!)

However, the algorithm for the in place partition is quite tricky to
get exactly right, and I have often seen slipups in coding it.
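
(To make the point concrete: here is one common in-place partition in C, in
the simple "Lomuto" style -- an illustrative sketch of my own, not anyone's
code from this thread, and even this form is easy to get subtly wrong at the
boundaries:)

/* Partition a[lo..hi] around the pivot a[hi]; return the pivot's final index. */
int partition(int a[], int lo, int hi)
{
    int pivot = a[hi];
    int i = lo, j, tmp;
    for (j = lo; j < hi; j++) {
        if (a[j] < pivot) {
            tmp = a[i]; a[i] = a[j]; a[j] = tmp;
            i++;
        }
    }
    tmp = a[i]; a[i] = a[hi]; a[hi] = tmp;
    return i;
}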





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-13  0:00                                     ` Robert I. Eachus
  1996-08-14  0:00                                       ` Robert Dewar
@ 1996-08-15  0:00                                       ` Tom Payne
  1 sibling, 0 replies; 688+ messages in thread
From: Tom Payne @ 1996-08-15  0:00 UTC (permalink / raw)



Robert I. Eachus <eachus@spectre.mitre.org> wrote:
: In article <839939925snz@genesis.demon.co.uk> Lawrence Kirby <fred@genesis.demon.co.uk> writes:

:   > > For heap sort, building the heap
:   > > is also an O(n log n) operation, since you maintain a heap.

:   > Wrong, you don't maintain a heap, you just build it. If you approach
:   > that correctly it is an O(n) operation. Quoting from Sedgewick:

:   > "... its can in fact be proven that the construction process takes
:   >  linear time since so many small heaps are processed".

:   If you use the canonical heapsort it is O(2n) average, O(n log n)
: worst case.  (The worst case is when each card added to the bottom of
: the heap propagates to the top, using the usual version of the sort,
: or propagates to bottom in the other definition, where you start with
: an unsorted list/heap and sort in place.)

Keep in mind that the heap is growing as you add items.  Each item
gets propagated a distance that is not greater than its height from
the leaves in the final tree.  There are at most n/(2^i) nodes of height i,
so the total number of steps is
           1*n/2 + 2*n/4 + 3*n/8 + ... + log(n)*n/(2^log(n))
which is bounded by 2n, i.e. O(n).

Tom Payne (thp@cs.ucr.edu)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-07-28  0:00                 ` J. Christian Blanchette
                                     ` (7 preceding siblings ...)
  1996-08-14  0:00                   ` Richard A. O'Keefe
@ 1996-08-15  0:00                   ` Norman H. Cohen
  1996-08-16  0:00                     ` Steve Heller
  1996-08-19  0:00                   ` Ted Dennison
  9 siblings, 1 reply; 688+ messages in thread
From: Norman H. Cohen @ 1996-08-15  0:00 UTC (permalink / raw)



In article <dewar.839593893@schonberg>, dewar@cs.nyu.edu (Robert Dewar) writes: 

|> Robert Eachus says
|>
|> "    I managed to do the "fun" experiment once.  Take three students
|> and have them learn Quicksort, Heapsort, and Bubblesort on "small"
|> decks.  At even 50 to 60 cards, the students doing Heapsort and
|> Quicksort are racing each other*, and the Bubblesort victim is still
|> hard at work well after they have finished."
|>
|> Try extending your experiment (I have also used this) a bit. Have a fourth
|> person sort the deck who knows none of these algorithms. That fourth
|> person will typically beat the Quicksort and Heapsort guys. Why? Because
|> the natural way to sort cards is with some physical embodiment of adress
|> calculation sorting, which can have an average time performance that is
|> order (N) rather than order N log N.
|>
|> This can be an instructive addition to your experiment!

If the set of keys on the cards is dense (e.g. consecutive numbers from 1
to 60), then a radix sort (also O(n)) works nicely for 50-60 cards.
That's how I sort my cancelled checks each month.
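
(A minimal sketch of the idea in C -- my own illustration for keys assumed to
lie in 1..60; really a counting/address-calculation sort:)

#define MAXKEY 60

/* Sort a[0..n-1], each element in 1..MAXKEY, in O(n + MAXKEY) time. */
void count_sort(int a[], int n)
{
    int count[MAXKEY + 1] = {0};
    int i, k, out = 0;
    for (i = 0; i < n; i++)
        count[a[i]]++;                 /* tally each key              */
    for (k = 1; k <= MAXKEY; k++)
        while (count[k]-- > 0)
            a[out++] = k;              /* write keys back in order    */
}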

--
Norman H. Cohen    ncohen@watson.ibm.com




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-02  0:00                   ` Matt Austern
@ 1996-08-15  0:00                     ` Lawrence Kirby
  0 siblings, 0 replies; 688+ messages in thread
From: Lawrence Kirby @ 1996-08-15  0:00 UTC (permalink / raw)



In article <fxtpw59ybul.fsf@isolde.mti.sgi.com>
           austern@isolde.mti.sgi.com "Matt Austern" writes:

>smosha@most.fw.hac.com (Stephen M O'Shaughnessy) writes:
>
>> >Except, IMO assembly should be used *exclusively* for the first 
>> two
>> >years of a CS degree.  The first two years is usually all 
>> algorithmic
>> >analysis, anyway.  There's nothing you can't learn about 
>> algorithms
>> >that you can't learn and learn it better doing it in assembly.
>> 
>> Learn sorting algorithms in assembly?  Are you serious!?
>
>Why not?  Volume 3 of Knuth is all about sorting algorithms, and
>every program in it is written in MIX assembly language.

Knuth doesn't use MIX to describe the algorithms (it is pretty much
useless for that purpose). Rather after describing the algorithms in
text and action sequences he uses MIX to show an example implementation
which he then uses for quantitative timing analysis.

-- 
-----------------------------------------
Lawrence Kirby | fred@genesis.demon.co.uk
Wilts, England | 70734.126@compuserve.com
-----------------------------------------




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-12  0:00                                                 ` Chris Sonnack
@ 1996-08-15  0:00                                                   ` Bob Hoffmann
  0 siblings, 0 replies; 688+ messages in thread
From: Bob Hoffmann @ 1996-08-15  0:00 UTC (permalink / raw)



Chris Sonnack wrote:
> 
> Dan Pop (Dan.Pop@cern.ch) wrote:
> 
> >>> They're both pretty clear. And any real programmer knows rule #27: "There
> >>> should be no constants in your code except the numbers 1 and 0, and you
> >>> should view those with suspicion."
> >>
> >> I would say, "There should be no constants in your code except 0.  Tests
> >> should be less than, equal, greater than, or not equal 0.  Otherwise,
> >> it better involve a symbol."
> >
> > This is ludicrous.  When coding a binary search, NO symbol will be better
> > than the constant 2.  Ditto for the constant 10 when doing binary to
> > decimal conversions.  And the list could go on and on.
> 
> Absolutely! (Although generally binary searches don't need to divide by
> 2 so much as shift right one bit.) Like most "rules", there's always
> exceptions. But it's still a very good rule (of thumb).
> 
> --
> Chris Sonnack  <cjsonnack@mmm.com>                  http://eishcq.mmm.com
> Engineering Information Services/Information Technology/3M, St.Paul, Minn
> ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
> An intellectual is someone whose mind watches itself
> 
> Opinions expressed herein are my own and may not represent those of my employer.


If you really want to start from scratch, so that ALL LANGUAGES ARE
EASY AFTER THAT, get an old HP-41C programmable calculator and do some
"synthetic programming" (something I pioneered with W.C.Wickes and
others).

That is the best!

All other languages are for kids.

BobX




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-14  0:00                   ` Peter Seebach
  1996-08-14  0:00                     ` Tim Behrendsen
  1996-08-15  0:00                     ` Bob Gilbert
@ 1996-08-15  0:00                     ` DAVID A MOLNAR
  2 siblings, 0 replies; 688+ messages in thread
From: DAVID A MOLNAR @ 1996-08-15  0:00 UTC (permalink / raw)



Peter Seebach (seebs@solutions.solon.com) wrote:
: Still, why wouldn't, say, scheme, work for this?  You can make a *VERY*
: simple system - much simpler than assembly for any platform - and show
: the students how everything is built on it.
: 
: If all you want to do is show them that the computer is simple, teach them
: in Turing machine assembly.  :)
	This reminds me of another point from a previous "CS1 education" 
thread I saw about a year ago. A teacher had designed his course 
around a virtual machine of sorts designed to act as a warm and fuzzy way 
to tackle concepts without having to worry about the pitfalls and 
idiosyncrasies of a "real" platform. Unfortunately, the students 
complained that they "weren't learning anything USEFUL". They regarded 
learning CS on a machine that was not used in business (and never would 
be!) as a sort of "waste". Result : poor student performance, and an 
eventual migration to Windows 3.1, which the students enjoyed, but the 
instructor was somewhat less satisfied with.





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-08-13  0:00           ` Darin Johnson
  1996-08-13  0:00             ` What's the best language to learn? [was Re: Should I learn C or Pascal?] Tim Behrendsen
@ 1996-08-15  0:00             ` Richard A. O'Keefe
  1996-08-17  0:00               ` Lawrence Kirby
                                 ` (2 more replies)
  1996-08-16  0:00             ` Dr E. Buxbaum
  1996-08-27  0:00             ` Jeffrey C. Dege
  3 siblings, 3 replies; 688+ messages in thread
From: Richard A. O'Keefe @ 1996-08-15  0:00 UTC (permalink / raw)



djohnson@tartarus.ucsd.edu (Darin Johnson) writes:
>Actually, I learned it freshman year, but didn't understand it.
...
>I think many of my classmates kept the
>"quicksort is the fastest sort" categorization without really
>understanding it.  Too many people fall asleep in algorithms class
>(then bitch about the waste of time later).

The *really* sad thing here is that quicksort is *not* the fastest sort.
Quicksort was specifically designed (see Hoare's paper in Computer
Journal, 1960 or 61, can't remember the issue) for a machine with 256
words of memory.  Not 256M.  Not 256k.  Two hundred and fifty-six words
of main memory.  Backing store was a drum with 16384 words, transferred
in 64 word pages.  Hoare knew at the time that it did more comparisons
than merge sort, but merge sort needs extra memory that simply wasn't there.

Quicksort still has a niche in embedded processors,
although there is a new version of heapsort (also published in the
Computer Journal, but I can't find the reference) which can challenge it:
the modern heapsort has the virtue that its worst case is O(n lg n), which
makes it a better bet for soft-real-time work.

For general use, a well engineered merge sort is as good as a well engineered
quicksort; sometimes better.

For sorting machine integers, a well engineered radix sort (or even count
sort if the range is small enough) is so much faster that it isn't funny.

For most programmers, the main issue is that it is cheaper to re-use
an existing expertly implemented and thoroughly tested sort procedure
than to write their own buggy code.
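
(In C, for example, that usually just means calling the standard library's
qsort() with a comparison function -- a minimal sketch:)

#include <stdlib.h>

static int cmp_int(const void *p, const void *q)
{
    int a = *(const int *)p, b = *(const int *)q;
    return (a > b) - (a < b);          /* -1, 0 or 1 without overflow */
}

/* Sort n ints in place using the library routine. */
void sort_ints(int *v, size_t n)
{
    qsort(v, n, sizeof *v, cmp_int);
}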
-- 
Australian citizen since 14 August 1996.  *Now* I can vote the xxxs out!
Richard A. O'Keefe; http://www.cs.rmit.edu.au/~ok; RMIT Comp.Sci.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-12  0:00                                                   ` Mark Wooding
  1996-08-13  0:00                                                     ` Mike Rubenstein
@ 1996-08-15  0:00                                                     ` Richard A. O'Keefe
  1 sibling, 0 replies; 688+ messages in thread
From: Richard A. O'Keefe @ 1996-08-15  0:00 UTC (permalink / raw)



mdw@excessus.demon.co.uk (Mark Wooding) writes:
>Are we talking about implementing it the truly stupid way as a single
>function, e.g.,

>	(define (factorial n)
>	  (if (zero? n)
>	    1
>	    (* n (factorial (- n 1)))))

>or properly using another function which tail-recurses, e.g.,

>If the former, then please report for re-education;

Ahem.
(a) We have known at least since Burstall and Darlington's work in the 70s
    how to AUTOMATICALLY turn this kind of recursion into iteration.

    F(x) = if P(x) then T(x) else H(G(x), F(R(x)))

    under certain conditions on G and H.  The only thing that prevents most
    Scheme compilers doing it is that they can't be sure that you won't
    assign something different to * or - at run time; a compiler that
    prevents you changing the primitives or that analyses the whole program
    and finds that you _don't_ change them could quite easily transform
    this function to iterative form.  Doing it when H and G are user-defined
    is harder.

(b) In a Scheme system supporting the full numeric tower; this will spend
    much more time chomping away at bignum arithmetic than it will on its
    own procedure calls.  To reduce that cost, you need to unroll, which
    can be done every bit as easily with the recursive form as the
    iterative:

    [substitute value of factorial] =>

	(define (factorial n)
	  (if (zero? n)
	    1
	    (* n ((lambda (n)
                    (if (zero? n)
                        1
                        (* n (factorial (- n 1))) ))
		  (- n 1)) )))
                         
    [Substitute actual parameter for formal in inner lambda] =>

	(define (factorial n)
	  (if (zero? n)
	    1
	    (* n (if (zero? (- n 1))
                     1
                     (* (- n 1) (factorial (- (- n 1) 1)) )) )))

    [Simplify arithmetic forms and distribute * over if]

	(define (factorial n)
	  (if (= n 0)
            1                         
            (if (= n 1)
              1	; was (* n 1) but we know n=1
              (* n (* (- n 1)) (factorial (- n 2)) )) ))

    [Reassociate multiplication to get
     (* (* fixnum fixnum) bignum)
     instead of
     (* fixnum (* fixnum bignum)); rewrite if as case for readability] =>

	(define (factorial n)
	  (case n
            ((0 1) 1)
            (else (* (* n (- n 1)) (factorial (- n 2)) )) ))

    Some measurements, using Gambit 2.2.2 on a Macintosh Centris 660AV.
    All times are in seconds to compute 100!

    recursive, naive:		0.21 sec
    recursive, unrolled:	0.11 sec
    iterative, naive:		0.25 sec (yes, SLOWER than the recursive one!)
    iterative, unrolled:	0.14 sec (yes, SLOWER than the recursive one!)

    The iterative versions were not tail-recursive; they used Scheme's
    direct equivalent of a while loop.  For example, the iterative, naive
    version was
	(define (f3 n)
	  (do ((f 1 (* i f))
	       (i n (- i 1)))
	      ((<= i 1) f)))

    We see in this case that anyone who was sent for "re-education" learned
    something false:  in this compiler on this machine, factorial coded as
    naive recursion is FASTER than factorial coded as a while loop.
    (Actually, the real reason is the order in which the multiplications
    are done.  The recursive version does ((((1*2)*3)*4)* ... n) which
    keeps the intermediate values small as long as possible, while the
    iterative version does (1* ... *(n-2 * (n-1 * n))) which overflows
    into bignum arithmetic earlier.)

    We also see something I first noticed when testing some bignum code in
    C a couple of years back:  replacing iteration by recursion or vice
    versa may save or cost you 10--20%, but unrolling the loop HALVES the
    work.  It would be foolish indeed to spend your time turning recursion
    into iteration in order to buy a 10% speedup (or with this compiler on
    this machine, in order to by a 20% slowdown!) when you could have
    spent the same time unrolling to get a 50% reduction in time.

    The point at issue here is that the use of recursion instead of
    iteration is fully consistent with the "loop unrolling" compiler
    optimisation technique, no less so than iteration.

    Another point is that _scheduling the order in which operations
    are performed_ may have a bigger effect than the control framework
    defining that order.  (The matrix product problem is a well known
    example of this; other examples include merging several lists,
    operating on several sets, &c).

    (I cannot perform this measurement in the current top-of-the-line
    Scheme compiler, Stalin, because Stalin doesn't support bignum
    arithmetic.)

(c) What if you haven't got bignum arithmetic?
    Then turning recursion into iteration won't help with the major problem
    with factorial, which is that 

	13! = 6227020800 > 4294967295 == 2**32 - 1
                         > 2147483647 = 2**31 - 1
                         
    so trying to compute factorials in fixnum (C: int, Ada: Integer)
    arithmetic isn't going to get you very far.  You can do exact integer
    addition, subtraction, comparison, and multiplication with IEEE doubles
    up to 2**53 - 1, which sounds like a lot, but

	19! = 121645100408832000 > 9007199254740991 = 2**53 - 1

    so even flonum (C: double, Ada: Long_Float) arithmetic won't get you
    very far with factorials.

    If you seriously want to compute factorials in a language without
    bignum arithmetic, what you _really_ want to compute is the gamma
    function, which requires completely different techniques and skills.

    What has this got to to with recursion -vs- iteration?
    Well, under a minute typing the naive recursive definition,
    followed by
	(let loop ((x (- (expt 2 31) 1)) (i 1))
	  (if (> (factorial i) x) i (loop x (+ i 1))))
    let me find out how far I could compute factorials using C or Ada
    without a bignum package.  Any time spent optimising the function
    for use (as opposed to optimising it for debate, which is what I've
    been doing) would have been wasted, because it doesn't matter _how_
    you compute the function, the results are too big.

    What matters here, then, is how long it takes you to find this out.
    If the recursive definition gets *you* to *understanding* quicker,
    it's a winner.

-- 
Australian citizen since 14 August 1996.  *Now* I can vote the xxxs out!
Richard A. O'Keefe; http://www.cs.rmit.edu.au/~ok; RMIT Comp.Sci.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-09  0:00                 ` Peter Seebach
@ 1996-08-15  0:00                   ` James_Rogers
  1996-08-17  0:00                     ` Tim Behrendsen
  0 siblings, 1 reply; 688+ messages in thread
From: James_Rogers @ 1996-08-15  0:00 UTC (permalink / raw)



Peter Seebach wrote:
> Tim Behrendsen <tim@airshields.com> wrote:
>>>>int a[50000],b[50000],c[50000],d[50000],e[50000];

>>>>void test1()
>>>>{
>>>>  int i,j;
>>>>  for (j=0;j<10;++j){
>>>>    for (i=0;i<50000;++i){
>>>>      ++a[i];++b[i];++c[i];++d[i];++e[i];
>>>>    }
>>>>  }
>>>>}

>>>>void test2()
>>>>{
>>>>  int i,j;
>>>>  for (j=0;j<10;++j){
>>>>    for (i=0;i<50000;++i) ++a[i];
>>>>    for (i=0;i<50000;++i) ++b[i];
>>>>    for (i=0;i<50000;++i) ++c[i];
>>>>    for (i=0;i<50000;++i) ++d[i];
>>>>    for (i=0;i<50000;++i) ++e[i];
>>>>  }
>>>>}

This example also behaves differently than predicted by
Mr Behrendsen when implemented in Ada 95 using the GNAT
compiler on a SPARC 20.

I translated the above functions into Ada as follows:
   procedure test1 is
   begin
      for J in 1..10 loop
         for I in index_type loop
            a(I) := a(I) + 1;
            b(I) := b(I) + 1;
            c(I) := c(I) + 1;
            d(I) := d(I) + 1;
            e(I) := e(I) + 1;
         end loop;
      end loop;
  end test1;
  
  procedure test2 is
  begin
     for J in 1..10 loop
        for I in index_type loop
           a(I) := a(I) + 1;
        end loop;
        for I in index_type loop
           b(I) := b(I) + 1;
        end loop;
        for I in index_type loop
           c(I) := c(I) + 1;
        end loop;
        for I in index_type loop
           d(I) := d(I) + 1;
        end loop;
        for I in index_type loop
           e(I) := e(I) + 1;
        end loop;
     end loop;
  end test2;

Each of the above loops was run 10 times.  The
timings are given below.

The resulting timings with no optimization are:
Test1 execution time:          12.2815
Test2 execution time:          16.0227

With full optimization and inlining of procedures the 
results are:
Test1 execution time:           4.6333
Test2 execution time:           5.2464

Using Ada the performance is quite comparable to Mr Seebach's
C results using gcc.  The Ada results also do not support Mr
Behrendsen's position.

-- 
Jim Rogers
*************************************************************
Celebrate Diplomacy




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-14  0:00                     ` Tim Behrendsen
  1996-08-14  0:00                       ` Peter Seebach
@ 1996-08-15  0:00                       ` Robert Dewar
  1996-08-16  0:00                         ` Joe Foster
  1996-08-18  0:00                         ` Tim Behrendsen
  1996-08-15  0:00                       ` Bob Gilbert
  2 siblings, 2 replies; 688+ messages in thread
From: Robert Dewar @ 1996-08-15  0:00 UTC (permalink / raw)



i"Well, the thing is that real honest-to-goodness assembly is just
not that difficult to learn, and that has built-in practical
aspects to it."

That seems false for many modern RISC architectures, and as ILP becomes
more and more of a factor, the instruction level semantics will become
more and more complex.





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-14  0:00                                     ` Robert I. Eachus
@ 1996-08-15  0:00                                       ` Robert Dewar
  0 siblings, 0 replies; 688+ messages in thread
From: Robert Dewar @ 1996-08-15  0:00 UTC (permalink / raw)



Robert Eachus said

"   There was a time when NO ONE knew how to build a heap in O(n) time,
but it didn't last long.  However, I was very interesting in sorting
problems right then.  Robert Dewer must have come along a little
later, when treesort3 had almost completely replaced Heapsort.  But
they are two different (but closely related) algorithms.  This is why
I have been complaining about the mixup in names.
"


Well I will not argue further about ancient history, people can look that
up for themselves.

The important point is that it has been well known for 30 years that a heap
can be constructed in O(N) time, so it is misleading to even suggest that
it is an O(N log N) process.

Perhaps the thing that is most important is to be aware that a heap can
be constructed not just in O(N) compares, but if it is done right it
takes just about N compares (i.e. the constant is 1). Furthermore, the
subsequent sorting phase of heap sort can be done in about N log(N)
compares (again with a constant of about 1), so that the approach is
close to optimal. This is significant, because in this optimized form,
heap sort is highly competitive with quicksort for the average case,
and of course far superior for the worst case.
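
(For reference, a plain textbook heapsort in C using Floyd's bottom-up heap
construction, which gives the O(N) build phase; this is only an illustrative
sketch of my own -- it is not the optimized ~N log N compare variant, nor the
GNAT coding mentioned below:)

/* Sift a[i] down within a[0..n-1], maintaining a 0-based max-heap. */
static void sift_down(int a[], int n, int i)
{
    int child, tmp;
    while ((child = 2 * i + 1) < n) {
        if (child + 1 < n && a[child + 1] > a[child])
            child++;                       /* pick the larger child   */
        if (a[i] >= a[child])
            break;                         /* heap property restored  */
        tmp = a[i]; a[i] = a[child]; a[child] = tmp;
        i = child;
    }
}

void heap_sort(int a[], int n)
{
    int i, tmp;
    for (i = n / 2 - 1; i >= 0; i--)       /* Floyd: build heap in O(n) */
        sift_down(a, n, i);
    for (i = n - 1; i > 0; i--) {          /* repeatedly extract the max */
        tmp = a[0]; a[0] = a[i]; a[i] = tmp;
        sift_down(a, i, 0);
    }
}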

And, since this thread hangs around comp.lang.ada, it is relevant to
point out that GNAT provides this optimal heap sort algorithm. Look
at the specs in g-gesora.ads and g-gesorg.ads, which are respectively
access-to-subprogram (shared code) and generic (inlined code) codings
of this algorithm.





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-14  0:00                   ` Robert Dewar
  1996-08-14  0:00                     ` Tim Behrendsen
  1996-08-14  0:00                     ` Dan Pop
@ 1996-08-15  0:00                     ` Joe Foster
  2 siblings, 0 replies; 688+ messages in thread
From: Joe Foster @ 1996-08-15  0:00 UTC (permalink / raw)



In article <dewar.840043674@schonberg>, dewar@cs.nyu.edu (Robert Dewar) writes:

> Tim says

> ""abstraction" question, it's a question of breaking it down
> into fundamental steps of data transformations.  Testing a
> number as even is transforming a number into a bit based on
> its evenness property."

> Well it is certainly easy to see why you like to teach assembler early,
> and to me, your viewpoint is a good example of why I do NOT like that
> approach, you have an unrelenting low level viewpoint of things, for
> example what on EARTH from a semantic point of view does a test for
> evenness
> have with a "bit", nothing at all!

IMHO, the low level view of things provides an excellent
foundation upon which to build higher-level abstractions. A
thorough knowledge of what's really going on inside that beige
box has helped me many times when a higher level tool didn't do
quite what I expected or when tracking down a C/C++ pointer or
compiler bug, or when deciding when to abandon a 4GL in favor of
some good old C code.

Even though I rarely even see assembly code these days, I
wouldn't part with the discipline assembly programming has given
me for anything. For me, using a tool without an understanding of
its workings would be like building on quicksand. For example, an
inadequate understanding of how a particular RDBMS worked led me
to design a report that brought the server to its knees, and only
research into the RDBMS' workings allowed me to redesign the
queries to give the same results with far less demand on the
server.

-- 
Joe Foster (joe@bftsi0.gate.net or joe%bftsi0@uunet.uu.net)
WARNING: I cannot be held responsible for the above        They're   coming  to
because  my cats have  apparently  learned to type.        take me away, ha ha!




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-07  0:00                                   ` Tim Behrendsen
                                                       ` (5 preceding siblings ...)
  1996-08-14  0:00                                     ` Robert I. Eachus
@ 1996-08-15  0:00                                     ` Blair Phillips
  1996-08-27  0:00                                     ` What's the best language to start with? [was: Re: Should I learn C or Pascal?] Tanmoy Bhattacharya
                                                       ` (2 subsequent siblings)
  9 siblings, 0 replies; 688+ messages in thread
From: Blair Phillips @ 1996-08-15  0:00 UTC (permalink / raw)



Robert I. Eachus wrote:
> 
> 
>    Ah, the joys of being an old fart:
>
Yeah, anyone who didn't own 3 volumes of Knuth by 1974 loses! 
> 
>    And for those of you who weren't around then, the reason for using
> these sorts, and not quicksort was that you were really doing a
> polyphase mergesort from tapes, and the inmemory sort of lots of
> "little" files that could fit in memory was a first step.  Among other
> things, the elimination of the need for stack space--for quicksort--
> meant you could sort a slightly larger file. And that stack was
> usually hand coded if you did use Quicksort...
> 

The big win for polyphase sort came from the technique of inserting new
items to replace those emitted from the heap. You end up with initial
runs which average twice the size of your sorting space, a big win in
those days when 4MB was a large memory config.
Knuth Vol 3 Sect 5.4.1 has the details for those who care! 
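
For readers who have not seen the technique, here is a rough C sketch of
replacement selection (my own illustration, not Knuth's code): keep a small
min-heap of records; each time the smallest is written out, read one new record
to take its place, and if the newcomer is smaller than the record just written,
tag it for the next run.  On random input the initial runs come out about twice
the heap size, which is the doubling described above.

    #include <stdio.h>
    #include <stddef.h>

    #define HEAP_SIZE 4           /* tiny, so the effect is easy to see */

    /* One slot of the selection heap: the key plus the run it belongs to. */
    struct slot { int key; int run; };

    static struct slot heap[HEAP_SIZE];
    static size_t count;

    /* Order first by run number, then by key, so records held over for
       the next run sink below everything still in the current run. */
    static int less(struct slot a, struct slot b)
    {
        return a.run != b.run ? a.run < b.run : a.key < b.key;
    }

    static void sift_down(size_t i)
    {
        for (;;) {
            size_t m = i, l = 2*i + 1, r = 2*i + 2;
            if (l < count && less(heap[l], heap[m])) m = l;
            if (r < count && less(heap[r], heap[m])) m = r;
            if (m == i) break;
            struct slot t = heap[i]; heap[i] = heap[m]; heap[m] = t;
            i = m;
        }
    }

    /* Read integers from stdin and emit them as sorted initial runs. */
    int main(void)
    {
        int x, last_run = -1;

        while (count < HEAP_SIZE && scanf("%d", &x) == 1)
            heap[count++] = (struct slot){ x, 0 };
        for (size_t i = count / 2; i-- > 0; )
            sift_down(i);

        while (count > 0) {
            struct slot out = heap[0];
            if (out.run != last_run)
                printf("--- run %d ---\n", last_run = out.run);
            printf("%d\n", out.key);

            if (scanf("%d", &x) == 1)      /* replace the emitted record */
                heap[0] = (struct slot){ x, x < out.key ? out.run + 1 : out.run };
            else
                heap[0] = heap[--count];   /* input exhausted: shrink heap */
            if (count > 0)
                sift_down(0);
        }
        return 0;
    }
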
-- 
Blair Phillips, Email: blairp@spirit.net.au
Canberra, Australia.  Phone: +61 411189724




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-14  0:00                   ` Peter Seebach
  1996-08-14  0:00                     ` Tim Behrendsen
@ 1996-08-15  0:00                     ` Bob Gilbert
  1996-08-18  0:00                       ` Tim Behrendsen
  1996-08-15  0:00                     ` DAVID A MOLNAR
  2 siblings, 1 reply; 688+ messages in thread
From: Bob Gilbert @ 1996-08-15  0:00 UTC (permalink / raw)



In article <4ut1sv$ngv@solutions.solon.com>, seebs@solutions.solon.com (Peter Seebach) writes:
> 
> I don't really think we are going to convince each other.  I believe that
> the "fundemental procedural nature of the computer" may not be a permanent
> thing.  I'd rather teach them to think in the general terms, and see that
> procedural nature as just one more current limitation, like computers which
> have only 2 MB of RAM.  This will better prepare them for techniques that
> depend on *not* thinking of the computer as procedural. 

I certainly agree with this.  When I was first learning to program, computer
time was extremely valuable, and we were taught to use great care when 
writing a program to ensure that it was as correct as possible to avoid 
having to *waste* computer time having the compiler find all of your syntax
errors.  In fact, I even had one professor that deducted points on your
programming assignments for each additional compilation you required past
the first two.  This sort of view has certainly changed today, since
computer time is usually a lot cheaper than a programmer's time.

Another thing I'm seeing is the greater use of field programmable gate arrays
(FPGAs) in embedded systems. As the capability and density of FPGAs continues
to increase, it allows more and more flexibility to program (?) functionality
into the FPGA.  I suspect in a few years we will be talking "virtual hardware",
with new and different programming methods, and subsequently new languages,
that will be developed and learned.

-Bob






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-14  0:00                     ` Tim Behrendsen
  1996-08-14  0:00                       ` Peter Seebach
  1996-08-15  0:00                       ` Robert Dewar
@ 1996-08-15  0:00                       ` Bob Gilbert
  2 siblings, 0 replies; 688+ messages in thread
From: Bob Gilbert @ 1996-08-15  0:00 UTC (permalink / raw)



In article <01bb8a24$88d46fe0$87ee6fce@timpent.airshields.com>, "Tim Behrendsen" <tim@airshields.com> writes:
> 
> All of that is very interesting in a theoretical sense.  But
> if we're talking about educating a student, you have to start
> somewhere, and to just hit them full-bore with "abstraction
> of algorithm" theory, I think you're going to lose of lot of
> them, *and we are!*

Some people are probably more comfortable learning from the bottom
up and do better when they can understand all the little details
that are involved when implementing some algorithm.  However, many
others may find that the reverse is better for them, that they
like to see the big picture first and avoid getting bogged down in
all the little details, with which they can become familiar later.
I would guess that your preference is the bottom up approach,
and that is fine, but it doesn't work for everybody.

> It seems to me that that focusing on the more practical
> aspects of implementation gives them a solid foundation to
> build the rest of it on, including building up of the concepts
> of languages and why they're valuable.

Certainly there needs to be focus on the "aspects of implementation",
but that is separate from learning the concepts of algorithms.
After all, much of the field of discrete and iterative mathematics 
was invented and developed long before there was any machine on
which to implement the algorithm(s).  If everybody was limited to
only thinking about that which could currently be implemented on
available machines, we might never develop better algorithms that
could possibly be implemented on machines that will come in the
future.  In fact, the design of the machines should be driven by
the algorithms we have developed, not the other way around.
Of course we have to be realistic, and many times we have to 
learn to work with what we have, so we still need to be able to 
adapt the algorithm, or even develop algorithms, to the 
constraints of the available machine.

So if we always take a bottom up approach, we will always be 
slaved to the constraints of the available machines which will
serve as the procrustean beds in which our algorithms must fit,
and we won't advance.  If we always take the top down approach,
we may never design algorithms for which we can build a machine
to implement it, and nothing will get done.  I really think the
tunnel needs to be dug from both ends of the mountain so that 
there is a greater chance of meeting somewhere in the middle.

-Bob






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-13  0:00                                                       ` Tim Behrendsen
  1996-08-13  0:00                                                         ` Giuliano Carlini
@ 1996-08-15  0:00                                                         ` Mike Rubenstein
  1 sibling, 0 replies; 688+ messages in thread
From: Mike Rubenstein @ 1996-08-15  0:00 UTC (permalink / raw)



"Tim Behrendsen" <tim@airshields.com> wrote:

> Mike Rubenstein <miker3@ix.netcom.com> wrote in article
> <320fe7f4.7066811@nntp.ix.netcom.com>...
> > I've probably been helped more by my knowledge of about 25 higer level
> > languages.
> > 
> > Obviously, all other things being equal, it is better to know assembly
> > language that to not know it.  But all other things are seldom equal.
> > I suspect that most beginning programmers would gain much more from
> > learning some different HLLs.  Given a choice, I'd strongly recommend
> > one learn LISP, APL, Icon, or any of a few dozen other languages to
> > learning assembly language.
> 
> We have the world that you want.  This is CS curriculum today; are
> you happy with the level of expertise of the graduates?  I'm not,
> based on my experience with trying to hire them.  If you're not
> either, what do you think the reason is?

I think that the main problem is too much emphasis on the language and
not enough on the problems and on general principles of programming.
It's certainly not lack of assembly language.

Michael M Rubenstein




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-14  0:00                 ` Tim Behrendsen
                                     ` (2 preceding siblings ...)
  1996-08-16  0:00                   ` Dr. Richard Botting
@ 1996-08-16  0:00                   ` Bob Gilbert
  1996-08-17  0:00                     ` Tim Behrendsen
  3 siblings, 1 reply; 688+ messages in thread
From: Bob Gilbert @ 1996-08-16  0:00 UTC (permalink / raw)



In article <01bb89f1$31be4f60$87ee6fce@timpent.airshields.com>, "Tim Behrendsen" <tim@airshields.com> writes:
> 
> But at some point you have to sit down and prepare an
> implementation, and that's when you know whether the problem
  ^^^^^^^^^^^^^^

Key word here.  Implementing an algorithm is a separate issue
from creating an algorithm.  This is not to say that having
knowledge of how to implement doesn't or shouldn't affect the
algorithm definition, but they are separate issues.

> is truly understood or not.  Solving a problem is not an
> "abstraction" question, it's a question of breaking it down
> into fundamental steps of data transformations.

Guess it depends on your definition of "solving a problem".
I see it as defining the problem, which usually requires a
certain amount of abstraction, developing an algorithm which
provides a solution (more abstraction), and finally implementing
the algorithm in the machine (coding/programming).  I think you
are placing too much emphasis on the last step.

> So what?  You can always learn languages, but you only have
> one shot at teaching someone to think like a programmer.

You're losing me here...  Why only one shot?

> *Programming is not languages*.

Right (sort of), and languages includes assembly.  
Programming is implementing.

> What gets missed in all this is that *computers are simple*!

Some (perhaps only a few) computers are simple.  While computers are
designed around some basic (simple) concepts, start adding caches, 
pipeline architectures, memory paging schemes, multiple register
sets, any of the various DMA capabilities or co-processors which
provide for some parallelism, etc., and things get very complicated
very quick.  There aren't many computers made these days that don't
employ most (and many more) of the above features.  To effectively
program in assembly one must fully understand these architectural
issues, and to me that is one of the advantages of teaching a HLL
first as it allows the student to study and understand the development
and implementation of algorithms without being distracted by all the
low level architectural details.

> Look at ol' Darin; if he didn't have enough brainpower left
> to focus on the algorithm at hand, what was he being confused
> by?

Perhaps an attractive girl in the class :-). 

> I think it was by all the language and abstraction, 

I always though abstraction was used to avoid confusion by
elevating away from all the little low level details.  Let's 
you see the forest for the trees kind of thing.

> which is not what programming is.

Right, studying algorithms is not programming.  Programming is
implementing algorithms.

>  If he had been given a solid
> foundation of the simple nature of the computer, I think he would
> always have a "comfort zone" he could return to whenever he
> didn't understand something, because it's *always* implemented
> in the fundamental terms.

Sorry, I just don't follow this reasoning.

-Bob






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-15  0:00                       ` Robert Dewar
@ 1996-08-16  0:00                         ` Joe Foster
  1996-08-18  0:00                         ` Tim Behrendsen
  1 sibling, 0 replies; 688+ messages in thread
From: Joe Foster @ 1996-08-16  0:00 UTC (permalink / raw)



In article <dewar.840082355@schonberg>, dewar@cs.nyu.edu (Robert Dewar) writes:

> i"Well, the thing is that real honest-to-goodness assembly is just
> not that difficult to learn, and that has built-in practical
> aspects to it."

> That seems false for many modern RISC architectures, and as ILP becomes
> more and more of a factor, the instruction level semantics will become
> more and more complex.

Even CISC CPUs have this property now. What with the pipelining
in the Pentium and Pentium Pro chips, it's nearly impossible to
write fully optimized assembly, at least with the documentation I
have. (I understand the CPU makers have special tech support for
compiler writers, who probably won't be as patient with merely
curious weekend assembly hackers...) Still, one of the built-in
practical aspects of being familiar with how C and C++ would
typically be translated into 680x0 and 80x86 assembly is fewer
pointer bugs creeping into my code.

-- 
Joe Foster (joe@bftsi0.gate.net or joe%bftsi0@uunet.uu.net)
WARNING: I cannot be held responsible for the above        They're   coming  to
because  my cats have  apparently  learned to type.        take me away, ha ha!




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-08-13  0:00           ` Darin Johnson
  1996-08-13  0:00             ` What's the best language to learn? [was Re: Should I learn C or Pascal?] Tim Behrendsen
  1996-08-15  0:00             ` Should I learn C or Pascal? Richard A. O'Keefe
@ 1996-08-16  0:00             ` Dr E. Buxbaum
  1996-08-16  0:00               ` Mike Rubenstein
                                 ` (2 more replies)
  1996-08-27  0:00             ` Jeffrey C. Dege
  3 siblings, 3 replies; 688+ messages in thread
From: Dr E. Buxbaum @ 1996-08-16  0:00 UTC (permalink / raw)



djohnson@tartarus.ucsd.edu (Darin Johnson) wrote:
>> A binary sort, also known as quicksort, or Hoare's sort is covered extensively

>"quicksort is the fastest sort" categorization without really
>understanding it.  

A common misconception. Quicksort is fast for UNSORTED data. For data 
which are largely presorted (a common occurrence if a bunch of additional 
data has to be added to a list), Quicksort becomes Slowsort. 

Resume: Spend some time on checking your data first, then decide on the 
proper sorting algorithm!
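
To be fair to quicksort, the slowdown described here belongs to the textbook
variant that always takes the first element as the pivot; later posts point out
that a decent implementation chooses pivots more carefully.  A small
illustrative C sketch of that naive variant makes the failure mode visible: on
input that is already sorted, every partition peels off a single element, so
the compares grow as O(n^2) and the recursion depth as O(n).

    #include <stddef.h>

    /* Naive quicksort, first element as pivot; call with lo = 0, hi = n - 1.
       Fine on random data, degenerate on presorted data.  (A sketch for
       illustration, not anybody's production code.) */
    static void naive_quicksort(int a[], ptrdiff_t lo, ptrdiff_t hi)
    {
        if (lo >= hi)
            return;

        int pivot = a[lo];            /* leaves a "hole" at position lo */
        ptrdiff_t i = lo, j = hi;

        while (i < j) {
            while (i < j && a[j] >= pivot) j--;
            a[i] = a[j];              /* move small element into the hole */
            while (i < j && a[i] <= pivot) i++;
            a[j] = a[i];              /* move large element into the hole */
        }
        a[i] = pivot;                 /* pivot lands in its final place */

        naive_quicksort(a, lo, i - 1);
        naive_quicksort(a, i + 1, hi);
    }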





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-14  0:00                                             ` Richard A. O'Keefe
@ 1996-08-16  0:00                                               ` Tim Behrendsen
  1996-08-20  0:00                                                 ` Richard A. O'Keefe
  0 siblings, 1 reply; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-16  0:00 UTC (permalink / raw)



Richard A. O'Keefe <ok@goanna.cs.rmit.edu.au> wrote in article
<4urqam$r9u@goanna.cs.rmit.edu.au>...
> 
> If you encounter any significant difference in speed between recursive
> and non-recursive quicksort, all that tells you is that it is time to
> change compilers.  Come to that, if you really cared about speed you
> wouldn't be using quicksort anyway.

Yes, but you can use the "get a better compiler" argument to
justify anything.  Real programs run on real computers using
real compilers.  The "Super-Duper Ivory Tower 9000 Compiler"
just doesn't exist.

> >If I implemented Quicksort in APL, say (which has direct
> >manipulation of arrays in one operation), 
> 
> Given the grade-up primitive, I can't think why anybody would _want_
> to implement quicksort in APL.

Well, I realize that APL has a built-in sort, but the point
was in learning quicksort through an implementation in APL.

> >the student would
> >not see the movement, because they would just see the computer
> >doing the array "in one fell swoop", but wouldn't really
> >experience the fact that the computer doesn't really do it
> >that way in reality.
> 
> Interestingly enough, the APL sorting primitive DOESN'T move the data
> at all.  It returns a permutation vector.  The APL idiom for sorting is
> 	X[.GradeUP X]
> where the necessary movement is quite visible.

But the reality is *not* visible.  What has the student really
learned?

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-12  0:00         ` Patrick Horgan
  1996-08-13  0:00           ` Darin Johnson
  1996-08-13  0:00           ` Should I learn C or Pascal? Ralph Silverman
@ 1996-08-16  0:00           ` Darin Johnson
  1996-08-16  0:00             ` Robert Dewar
  1996-08-16  0:00             ` system
  1996-08-16  0:00           ` Should I learn C or Pascal? Darin Johnson
                             ` (5 subsequent siblings)
  8 siblings, 2 replies; 688+ messages in thread
From: Darin Johnson @ 1996-08-16  0:00 UTC (permalink / raw)



> Assembly is OK after
> you have some programming in a high level language under your belt.

Actually, some assemblers aren't so basic and low level.  VAX MACRO
was almost an HLL in itself.
-- 
Darin Johnson
djohnson@ucsd.edu	O-
  - I'm not a well adjusted person, but I play one on the net.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-08-12  0:00         ` Patrick Horgan
                             ` (2 preceding siblings ...)
  1996-08-16  0:00           ` What's the best language to learn? [was Re: Should I learn C or Pascal?] Darin Johnson
@ 1996-08-16  0:00           ` Darin Johnson
  1996-08-20  0:00           ` Darin Johnson
                             ` (4 subsequent siblings)
  8 siblings, 0 replies; 688+ messages in thread
From: Darin Johnson @ 1996-08-16  0:00 UTC (permalink / raw)



> Resume: Spend some time on checking your data first, then decide on the 
> proper sorting algorithm!

And that's the whole point.  If a student learns "quicksort is
fastest" they won't ever think about the issues involved.  I ran
across one program where an item was added to a sorted list by adding it
to the end and then calling qsort().  Yes, very abstract with code
re-use, but it ignored efficiency altogether (it appeared though that
this was written quickly to get things working, and the author never
came back later to improve it).
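
For contrast, a short illustrative C sketch of the cheaper alternative to
append-plus-qsort(): since the array is already in order, binary-search the
insertion point and shift the tail.  The function name is made up for the
example, and the caller is assumed to have room for one more element.

    #include <stddef.h>
    #include <string.h>

    /* Insert x into the sorted array a[0..*n-1], keeping it sorted.
       O(log n) compares plus one memmove, instead of a full re-sort. */
    void sorted_insert(int a[], size_t *n, int x)
    {
        size_t lo = 0, hi = *n;

        while (lo < hi) {                  /* find the first element > x */
            size_t mid = lo + (hi - lo) / 2;
            if (a[mid] <= x)
                lo = mid + 1;
            else
                hi = mid;
        }
        memmove(&a[lo + 1], &a[lo], (*n - lo) * sizeof a[0]);
        a[lo] = x;
        (*n)++;
    }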


-- 
Darin Johnson
djohnson@ucsd.edu	O-
	My shoes are too tight, and I have forgotten how to dance - Babylon 5




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-12  0:00                                         ` Robert Dewar
@ 1996-08-16  0:00                                           ` Steve Heller
  1996-08-16  0:00                                             ` Adam Beneschan
                                                               ` (2 more replies)
  0 siblings, 3 replies; 688+ messages in thread
From: Steve Heller @ 1996-08-16  0:00 UTC (permalink / raw)



dewar@cs.nyu.edu (Robert Dewar) wrote:

>Radix sorts are not O(n), the analysis is not that simple. They are
>O(kN) where k is the number of radix digits in the numbers, 
  As I understand it, the O() notation indicates how the time to
process varies with the number of elements. E.g., if 1000 elements
take 1 second and 2000 elements take 2 seconds to sort, the sorting
algorithm is O(n). According to this definition, the distribution
counting sort is O(n).

>and if you use left to right (e.g. radix exchange sorting), the early
>termination for subsets of 1 results in an ONlogN behavior after all.
  I generally use right to left, as it is easier to explain. Also, I
don't recall if the left to right version is stable; I know right to
left is.

>Still it is quite true that a simple radix sort for cards will beat
>the heap and quicksort crowds :-)
  Yes.

>I agree that both radix sorts and address calculation sorts should be
>taught more systematically. The reason that attention tends to focus
>on comparison sorts is that these analyze most nicely from an academic
>point of view :-)
  They're more complex to analyze; is that what you mean?

Steve Heller, author and software engineer
http://ourworld.compuserve.com/homepages/steve_heller 





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-15  0:00                   ` Teaching sorts [was Re: What's the best language to start with?] Norman H. Cohen
@ 1996-08-16  0:00                     ` Steve Heller
  0 siblings, 0 replies; 688+ messages in thread
From: Steve Heller @ 1996-08-16  0:00 UTC (permalink / raw)



ncohen@watson.ibm.com (Norman H. Cohen) wrote:

>If the set of keys on the card is dense (e.g. consecutive numbers from 1
>to 60), then a radix sort (also O(n)) works nicely for 50-60 cards.
>That's how I sort my cancelled checks each month.
  I like that sort quite a bit, and have explained its close relative,
distribution counting (as Knuth refers to it) both in my book on
efficient programming and to my students. Interestingly enough, none
of these students so far had ever heard of it, even though they are
seniors in CS!
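
For readers who have not met it, here is a small illustrative C sketch of
distribution counting for keys known to lie in a small fixed range; the range
of 60 merely echoes the cancelled-checks example above and is not prescribed
by anything.

    #include <stddef.h>

    #define RANGE 60                       /* keys assumed to be 0..RANGE-1 */

    /* Distribution counting (counting sort): two passes over the data plus
       one over the counts, so O(n + RANGE), and stable. */
    void counting_sort(const int in[], int out[], size_t n)
    {
        size_t count[RANGE] = {0};

        for (size_t i = 0; i < n; i++)     /* tally each key */
            count[in[i]]++;

        for (size_t k = 1; k < RANGE; k++) /* running totals = final slots */
            count[k] += count[k - 1];

        for (size_t i = n; i-- > 0; )      /* place records, back to front, */
            out[--count[in[i]]] = in[i];   /* which keeps equal keys stable */
    }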

Steve Heller, author and software engineer
http://ourworld.compuserve.com/homepages/steve_heller 





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-16  0:00           ` What's the best language to learn? [was Re: Should I learn C or Pascal?] Darin Johnson
  1996-08-16  0:00             ` Robert Dewar
@ 1996-08-16  0:00             ` system
  1 sibling, 0 replies; 688+ messages in thread
From: system @ 1996-08-16  0:00 UTC (permalink / raw)



 djohnson@tartarus.ucsd.edu (Darin Johnson) writes:
>> Assembly is OK after
>> you have some programming in a high level language under your belt.
>
>Actually, some assemblers aren't so basic and low level.  VAX MACRO
>was almost an HLL in itself.
 ^^^

ahem, I think you want the present tense :)

>Darin Johnson

Morphis@physics.niu.edu

Real Men change diapers




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-16  0:00                                           ` Dik T. Winter
@ 1996-08-16  0:00                                             ` Joe Foster
  0 siblings, 0 replies; 688+ messages in thread
From: Joe Foster @ 1996-08-16  0:00 UTC (permalink / raw)



In article <Dw7KD5.8vF@cwi.nl>, dik@cwi.nl (Dik T. Winter) writes:

> In article <dewar.840034153@schonberg> dewar@cs.nyu.edu (Robert Dewar) writes:

>  >                                          A bubble sort is certainly a
>  > much simpler solution to the problem of optimal sorting of a sorted
>  > list, and simplicity of solutions is interesting if performance is NOT
>  > an issue after all.

> Indeed.  I once used it in an algorithm that calculated the singular
> values of a matrix.  At the end the values were sorted with a bubble
> sort.  Why not?  The most expensive (n^3) part had been done.  And it
> was only a few lines of code in a routine that was already very complex.
> Think about adding a heapsort or quicksort within (and think about the
> overhead when most likely the maximum number of elements to be sorted
> was around 20).

Couldn't you have used the qsort standard library function? That
would be even fewer lines of code for you to write! Or maybe
shellsort, if you weren't writing in C/C++? Even shellsort is
faster than bubblesort except when the data are already very
nearly sorted.

[chomp]

-- 
Joe Foster (joe@bftsi0.gate.net or joe%bftsi0@uunet.uu.net)
WARNING: I cannot be held responsible for the above        They're   coming  to
because  my cats have  apparently  learned to type.        take me away, ha ha!




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-13  0:00                       ` Chris Sonnack
@ 1996-08-16  0:00                         ` Steve Heller
  1996-08-16  0:00                           ` John Hobson
  0 siblings, 1 reply; 688+ messages in thread
From: Steve Heller @ 1996-08-16  0:00 UTC (permalink / raw)



cjsonnack@mmm.com (Chris Sonnack) wrote:

>My OVERWHELMING experience as a teacher is that most students learn a
>thing (any thing) faster and better if they learn the "why" and "how"
>that's behind it.
  My experience as a teacher and author also supports this.


Steve Heller, author and software engineer
http://ourworld.compuserve.com/homepages/steve_heller 





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-16  0:00                                           ` Steve Heller
  1996-08-16  0:00                                             ` Adam Beneschan
@ 1996-08-16  0:00                                             ` Szu-Wen Huang
  1996-08-17  0:00                                               ` Robert Dewar
                                                                 ` (4 more replies)
  1996-08-16  0:00                                             ` Robert Dewar
  2 siblings, 5 replies; 688+ messages in thread
From: Szu-Wen Huang @ 1996-08-16  0:00 UTC (permalink / raw)



Steve Heller (heller@utdallas.edu) wrote:
[snip]
:   As I understand it, the O() notation indicates how the time to
: process varies with the number of elements. E.g., if 1000 elements
: take 1 second and 2000 elements take 2 seconds to sort, the sorting
: algorithm is O(n). According to this definition, the distribution
: counting sort is O(n).
[snip]

Wrong order.  An O(n) program where n=1,000 that takes 1 second to
complete should be able to finish n=2,000 in 2 seconds, not the other
way around.  In other words, you can't derive time complexity by
timing an algorithm, *especially* with only two samples.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-08-16  0:00             ` Dr E. Buxbaum
  1996-08-16  0:00               ` Mike Rubenstein
@ 1996-08-16  0:00               ` Lawrence Kirby
  1996-08-17  0:00                 ` Paul Hsieh
  1996-08-20  0:00               ` Paul Schlyter
  2 siblings, 1 reply; 688+ messages in thread
From: Lawrence Kirby @ 1996-08-16  0:00 UTC (permalink / raw)



In article <4v1r2a$gh6@falcon.le.ac.uk> EB15@le.ac.uk "Dr E. Buxbaum" writes:

>djohnson@tartarus.ucsd.edu (Darin Johnson) wrote:
>>> A binary sort, also known as quicksort, or Hoare's sort is covered
> extensively
>
>>"quicksort is the fastest sort" categorization without really
>>understanding it.  
>
>A common misconception. Quicksort is fast for UNSORTED data. For data 
>which are largely presorted (a common occurrence if a bunch of additional 
>data has to be added to a list), Quicksort becomes Slowsort. 

Quicksort is faster for sorted data than unsorted data: any passable
implementation will use at the very least median-of-three pivot selection.
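
For concreteness, a small C sketch of one common way to do that median-of-three
selection (an illustration, not any particular library's code): order the
first, middle and last elements among themselves and take the middle value as
the pivot, which keeps the partitions balanced on already-sorted input.

    #include <stddef.h>

    static void swap_int(int *x, int *y) { int t = *x; *x = *y; *y = t; }

    /* Sort a[lo], a[mid], a[hi] among themselves; afterwards a[mid] holds
       the median of the three and is a reasonable pivot for partitioning. */
    static size_t median_of_three(int a[], size_t lo, size_t hi)
    {
        size_t mid = lo + (hi - lo) / 2;

        if (a[mid] < a[lo])  swap_int(&a[mid], &a[lo]);
        if (a[hi]  < a[lo])  swap_int(&a[hi],  &a[lo]);
        if (a[hi]  < a[mid]) swap_int(&a[hi],  &a[mid]);
        return mid;
    }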

-- 
-----------------------------------------
Lawrence Kirby | fred@genesis.demon.co.uk
Wilts, England | 70734.126@compuserve.com
-----------------------------------------




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-16  0:00                         ` Steve Heller
@ 1996-08-16  0:00                           ` John Hobson
  0 siblings, 0 replies; 688+ messages in thread
From: John Hobson @ 1996-08-16  0:00 UTC (permalink / raw)



Steve Heller wrote:
> cjsonnack@mmm.com (Chris Sonnack) wrote:
> >My OVERWHELMING experience as a teacher is that most students learn a
> >thing (any thing) faster and better if they learn the "why" and "how"
> >that's behind it.
>   My experience as a teacher and author also supports this.

"Lurgan taught him how the members of a certain caste ate or sat or
spat and, since the hows of this world matter little, why they did
them in that way." -- Rudyard Kipling, "Kim"

-- 
John Hobson             |Whenever someone says to me,
Unix Support Group      |"Have a nice day", I reply,
ComEd, Chicago, IL, USA |"Sorry, I've made other plans."
jhobson@ceco.ceco.com   |	-- Sir Peter Ustinov




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-16  0:00           ` What's the best language to learn? [was Re: Should I learn C or Pascal?] Darin Johnson
@ 1996-08-16  0:00             ` Robert Dewar
  1996-08-16  0:00             ` system
  1 sibling, 0 replies; 688+ messages in thread
From: Robert Dewar @ 1996-08-16  0:00 UTC (permalink / raw)



Darin says

"Actually, some assemblers aren't so basic and low level.  VAX MACRO
was almost an HLL in itself."

That seems a bit stretched. The VAX instruction set is register oriented
and pretty conventional. Yes, it has a few CISC style instructions (most
of which you should never use if you are interested in efficiency) that
do things analogous to HLL semantics, but to say it was almost an HLL
in itself is going too far.

On the other hand, there definitely ARE machines where that characterization
makes more sense. The B5500 and the KDF-9 are examples, and I would think
even the Inmos Transputer is a better example than the VAX.





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-16  0:00                                           ` Steve Heller
  1996-08-16  0:00                                             ` Adam Beneschan
  1996-08-16  0:00                                             ` Szu-Wen Huang
@ 1996-08-16  0:00                                             ` Robert Dewar
  1996-08-18  0:00                                               ` Steve Heller
  2 siblings, 1 reply; 688+ messages in thread
From: Robert Dewar @ 1996-08-16  0:00 UTC (permalink / raw)



Steve Heller said

">I agree that both radix sorts and address calculation sorts should be
>taught more systematically. The reason that attention tends to focus
>on comparison sorts is that these analyze most nicely from an academic
>point of view :-)
  They're more complex to analyze; is that what you mean?"

No, they are much simpler to analyze; analyzing the left to right
radix sort (the only reasonable implementation, the right to left sort
is never a good choice), or the address calculation sort, is not easy.





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-14  0:00               ` Gabor Egressy
  1996-08-15  0:00                 ` Robert Dewar
@ 1996-08-16  0:00                 ` Mark Wooding
  1996-08-17  0:00                   ` Dan Pop
  1 sibling, 1 reply; 688+ messages in thread
From: Mark Wooding @ 1996-08-16  0:00 UTC (permalink / raw)



Gabor Egressy <gegressy@uoguelph.ca> wrote:

> Why oh why would you want to start with assembly? Assembly is great
> for writing viruses and small code that needs to be fast but is a pain
> to write and maintain.

Eeek.  It's all evil lies.  It's a vicious rumour put about by a secret
cabal for their own nefarious purposes.

I'll admit that I've written assembler code which is almost utterly
illegible to me now.  I've also written some awful C code, so that
doesn't mean much.  However, I have /lots/ more beautiful looking and
instantly readable assembler code.

Just because it's low-level stuff doesn't mean it has to be hard to
understand.  Just like any other language, if you take a bit of care to
present your code nicely, it will be readable; if you don't, it will be
ghastly.

I'll go off to comp.lang.assembler.hackers.therapy now and take some
more of the pills.
-- 
[mdw]

`When our backs are against the wall, we shall turn and fight.'
		-- John Major





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-14  0:00                 ` Tim Behrendsen
  1996-08-14  0:00                   ` Peter Seebach
  1996-08-14  0:00                   ` Robert Dewar
@ 1996-08-16  0:00                   ` Dr. Richard Botting
  1996-08-18  0:00                     ` Tim Behrendsen
  1996-08-16  0:00                   ` Bob Gilbert
  3 siblings, 1 reply; 688+ messages in thread
From: Dr. Richard Botting @ 1996-08-16  0:00 UTC (permalink / raw)



Tim Behrendsen (tim@airshields.com) wrote:
: So what?  You can always learn languages, but you only have
: one shot at teaching someone to think like a programmer.
: *Programming is not languages*.  
Yes!!!   (most of the time!)
: Programming is breaking problems
: down into fundamental steps to achieve a desired result.  The
: way the steps are described is not relevent to the general question
: of a solution.
This, as stated, is a very labor intensive way of programming.
Ultimately we may end up with a structure that is full of
fundamental steps.
However you can produce code faster and more reliably by looking
for cliches that you already know how to code... like
	SORT STUDENTS ASCENDING GRADE.

: When we are talking about teaching a student,
: the less extraneous issues such as languages and the more
: experience we can given them of the fundamental procedural
: nature of the computer, the better.
But we should also be helping them to develop
a portfolio of standard solutions to standard problems.

Problem is they'll need to have them in a language of
some kind or other.  I would suggest it is better for
students to see the same algorithm in several languages
than to only ever have them in one language.  And one
of these should be a simple low level language, and perhaps
one should be the most abstract and powerful language
around (like the one I used above :-)

[...]
: > If they'd started him off with explaining, in his native language, how
: > to sort things, and given him sample sets of cards to sort, while
: > following each of a set of descriptions *in his native language*, he
: > would have understood quicksort.
I first saw this done by a colleague with UK freshmen in their
first programming class in 1974.   It worked very well.  I stole the
idea and still use it.

Works with searching algorithms as well. I'm still looking for
a similar thing for linked data.

--
dick botting     http://www.csci.csusb.edu/dick/signature.html
Disclaimer:      CSUSB may or may not agree with this message.
Copyright(1996): Copy freely but say where it came from.
	I have nothing to sell, and I'm giving it away.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-16  0:00                                           ` Steve Heller
@ 1996-08-16  0:00                                             ` Adam Beneschan
  1996-08-18  0:00                                               ` Steve Heller
  1996-08-16  0:00                                             ` Szu-Wen Huang
  1996-08-16  0:00                                             ` Robert Dewar
  2 siblings, 1 reply; 688+ messages in thread
From: Adam Beneschan @ 1996-08-16  0:00 UTC (permalink / raw)



heller@utdallas.edu (Steve Heller) writes:
>dewar@cs.nyu.edu (Robert Dewar) wrote:

 >>Radix sorts are not O(n), the analysis is not that simple. They are
 >>O(kN) where k is the number of radix digits in the numbers, 
 
 >  As I understand it, the O() notation indicates how the time to
 >process varies with the number of elements. E.g., if 1000 elements
 >take 1 second and 2000 elements take 2 seconds to sort, the sorting
 >algorithm is O(n). According to this definition, the distribution
 >counting sort is O(n).

The following definition of O-notation comes from Knuth's _Art of
Computer Programming_, volume 1.  To quote (and I've had to fudge the
notation to get it into ASCII):

    In general, the notation O(f(n)) may be used whenever f(n) is
    a function of the positive integer n; it stands for _a
    quantity which is not explicitly known_, except that its
    magnitude isn't too large.*  Every appearance of O(f(n))
    means precisely this: there is a positive constant M such
    that the number Xn represented by O(f(n)) satisfies the
    condition abs(Xn) <= M * abs(f(n)). . . .

* This last phrase makes more sense in the context of asymptotic
representations, in which it appears in Knuth's book.

More succinctly, from Thomas A. Standish's _Data Structure Techniques_: 

    We say that g(n) = O(f(n)) if there exist two constants K and
    n0 such that abs(g(n)) <= K * abs(f(n)) for all n >= n0.

These definitions only allow for functions of one integer variable;
however, there is no reason they can't be extended to functions of more
than one variable.  I have no idea whether this has been done in
mathematical literature, but I don't see why one can't say

    We say that g(m,n) = O(f(m,n)) if there exist three constants K,
    m0, and n0 such that abs(g(m,n)) <= K * abs(f(m,n)) whenever 
    n >= n0 and m >= m0.

So, going back to Robert's statement, if "k", the number of radix digits
in the numbers, and "N", the number of elements, are allowed to vary
over the entire space of positive integers, then it would be wrong to
say the sort time is O(N).  This is because no value of K that satisfies
the above definition could be found---you could always make "k" large
enough to make the inequality false.  So O(kN) is correct, as Robert
said.  Steve's understanding of O-notation is fine for most practical
purposes, but it's an oversimplification.

                                -- Adam





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-14  0:00                                         ` Robert Dewar
@ 1996-08-16  0:00                                           ` Dik T. Winter
  1996-08-16  0:00                                             ` Joe Foster
  1996-08-18  0:00                                           ` Glenn Rhoads
                                                             ` (2 subsequent siblings)
  3 siblings, 1 reply; 688+ messages in thread
From: Dik T. Winter @ 1996-08-16  0:00 UTC (permalink / raw)



In article <dewar.840034153@schonberg> dewar@cs.nyu.edu (Robert Dewar) writes:
 >                                          A bubble sort is certainly a
 > much simpler solution to the problem of optimal sorting of a sorted
 > list, and simplicity of solutions is interesting if performance is NOT
 > an issue after all.

Indeed.  I once used it in an algorithm that calculated the singular
values of a matrix.  At the end the values were sorted with a bubble
sort.  Why not?  The most expensive (n^3) part had been done.  And it
was only a few lines of code in a routine that was already very complex.
Think about adding a heapsort or quicksort within (and think about the
overhead when most likely the maximum number of elements to be sorted
was around 20).
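
For the record, the "few lines of code" in question need be no more than
something like this (an illustrative sketch, not the original routine); a
plain bubble sort is perfectly adequate for a couple of dozen values once the
O(n^3) work is already done:

    #include <stddef.h>

    /* Straight bubble sort: repeatedly swap adjacent out-of-order pairs. */
    void bubble_sort(double a[], size_t n)
    {
        for (size_t pass = 0; pass + 1 < n; pass++)
            for (size_t i = 0; i + 1 < n - pass; i++)
                if (a[i] > a[i + 1]) {
                    double t = a[i]; a[i] = a[i + 1]; a[i + 1] = t;
                }
    }
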
-- 
dik t. winter, cwi, kruislaan 413, 1098 sj  amsterdam, nederland, +31205924098
home: bovenover 215, 1025 jn  amsterdam, nederland; http://www.cwi.nl/~dik/




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-14  0:00                                                 ` Dan Pop
  1996-08-14  0:00                                                   ` Tim Behrendsen
@ 1996-08-16  0:00                                                   ` Dik T. Winter
  1 sibling, 0 replies; 688+ messages in thread
From: Dik T. Winter @ 1996-08-16  0:00 UTC (permalink / raw)



In article <danpop.840019072@news.cern.ch> Dan.Pop@cern.ch (Dan Pop) writes:
 > >> >> >    for (j = 0; j < 10; ++j) {
 > >> >> >        for (i = 0; i < 50000; ++i) {
 > >> >> >            ++a[i]; ++b[i]; ++c[i]; ++d[i]; ++e[i];
 > >> >> >        }
 > >> >> >    }
...
 > Both versions have excellent paging locality.  Read them again and tell
 > me which one is likely to produce more page faults than the other.

Dan, obviously on a machine with only four pages the code above produces
an enormous amount of page faults!  Ah, a Commodore C64 with virtual
memory.
-- 
dik t. winter, cwi, kruislaan 413, 1098 sj  amsterdam, nederland, +31205924098
home: bovenover 215, 1025 jn  amsterdam, nederland; http://www.cwi.nl/~dik/




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-08-16  0:00             ` Dr E. Buxbaum
@ 1996-08-16  0:00               ` Mike Rubenstein
  1996-08-16  0:00               ` Lawrence Kirby
  1996-08-20  0:00               ` Paul Schlyter
  2 siblings, 0 replies; 688+ messages in thread
From: Mike Rubenstein @ 1996-08-16  0:00 UTC (permalink / raw)



"Dr E. Buxbaum" <EB15@le.ac.uk> wrote:

> djohnson@tartarus.ucsd.edu (Darin Johnson) wrote:
> >> A binary sort, also known as quicksort, or Hoare's sort is covered extensively
> 
> >"quicksort is the fastest sort" categorization without really
> >understanding it.  
> 
> A common misconception. Quicksort is fast for UNSORTED data. For data 
> which are largely presorted (a common occurance if a bunch of additional 
> data has to be added to a list), Quicksort becomes Slowsort. 

Only if it is implemented by an idiot.  Techniques that make O(n^2)
behavior extremely unlikely are well known and are covered in any
decent book on algorithms.  Hoare's 1962 paper points out one of them.

If the data is known to be very close to being sorted, quicksort
(properly implemented) isn't the sort of choice, but it's not
terrible.

Michael M Rubenstein




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-08-15  0:00             ` Should I learn C or Pascal? Richard A. O'Keefe
  1996-08-17  0:00               ` Lawrence Kirby
@ 1996-08-17  0:00               ` Mike Rubenstein
  1996-08-17  0:00               ` Alexander J Russell
  2 siblings, 0 replies; 688+ messages in thread
From: Mike Rubenstein @ 1996-08-17  0:00 UTC (permalink / raw)



ok@goanna.cs.rmit.edu.au (Richard A. O'Keefe) wrote:

> djohnson@tartarus.ucsd.edu (Darin Johnson) writes:
> >Actually, I learned it freshman year, but didn't understand it.
> ...
> >I think many of my classmates kept the
> >"quicksort is the fastest sort" categorization without really
> >understanding it.  Too many people fall asleep in algorithms class
> >(then bitch about the waste of time later).
> 
> The *really* sad thing here is that quicksort is *not* the fastest sort.
> Quicksort was specifically designed (see Hoare's paper in Computer
> Journal, 1960 or 61, can't remember the issue) for a machine with 256
> words of memory.  Not 256M.  Not 256k.  Two hundred and fifty-six words
> of main memory.  Backing store was a drum with 16384 words, transferred
> in 64 word pages.  Hoare knew at the time that it did more comparisons
> than merge sort, but merge sort need extra memory that simply wasn't there.

Where did you get this idea?  Certainly not from the Hoare's paper.
In the paper he reports testing on such a machine, but he does not say
that quicksort was designed for such a machine.  In fact, he concludes
that

	Quicksort is a sorting method ideally adapted for sorting in 
	the random access store of a computer.  ...  Quicksort is 
	likely to recommend itself as the standard sorting method on
	most computers with a large enough random access store to make
	internal sorting worthwhile.

He includes the case where the random access store is disk or drum.
Not unreasonable on the computers of the early 60s, but not realistic
today.  In general, today quicksort is not suitable unless the data to
be sorted can pretty much be held in real memory.

> 
> Quicksort still has a niche in embedded processors,
> although there is a new version of heapsort (also published in the
> Computer Journal, but I can't find the reference) which can challenge it:
> the modern heapsort has the virtue that its worst case is O(nlgn) which
> makes it a better bet for soft-real-time work.

It's not that new -- Knuth discusses heapsort with guaranteed O(N log
N) performance in his 1973 book.  I was unaware that there was ever a
version that does not guarantee O(N log N) behavior.  But it's very
hard to get the constant as low as that for quicksort and quite easy
to make O(N log N) behavior almost certain in quicksort.  Sometimes
heapsort is better for soft real time work, but not always.

In fact, if the maximum number of items to be sorted is fixed in
advance, insertion sort is often very good if the number is very small
and Shell sort for several hundred items.
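
A minimal C sketch of the straight insertion sort being recommended for such
cases (illustrative only): O(n^2) in the worst case, but on data that is
already nearly in order it does little more than one comparison per element.

    #include <stddef.h>

    /* Straight insertion sort: shift larger elements right until the
       slot for a[i] is found.  Nearly-sorted input costs about n compares. */
    void insertion_sort(int a[], size_t n)
    {
        for (size_t i = 1; i < n; i++) {
            int key = a[i];
            size_t j = i;

            while (j > 0 && a[j - 1] > key) {
                a[j] = a[j - 1];
                j--;
            }
            a[j] = key;
        }
    }
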

> 
> For general use, a well engineered merge sort is as good as a well engineered
> quicksort; sometimes better.
> 
> For sorting machine integers, a well engineered radix sort (or even count
> sort if the range is small enough) is so much faster that it isn't funny.

Of course, but how often do you sort machine integers?  In 35 years of
programming, I can count on my fingers how often that problem came up
and in most of those cases the amount of data was so small that it
would be hard to choose a sort that isn't adequate.

> 
> For most programmers, the main issue is that it is cheaper to re-use
> an existing expertly implemented and thoroughly tested sort procedure
> than to write their own buggy code.

Sometimes.  It pays to think.  A number of years ago I did a billing
program for a large retailer.  Transactions were stored in the order
in which they were received and had to be sorted in order of
transaction date.  For the most part, the data was already sorted
correctly.  The main exception was that billing transactions for items
shipped were batched and processed overnight.  In some cases they were
not processed for a day or two.  If a person placed an order the day
after an item was shipped, the data could be out of order.

Aside from the fact that the data was almost in order, the number of
items to be sorted was quite small (for a variety of reasons it was
not desirable to presort all the transactions, so the transactions for
each customer were sorted separately).  That alone made any sort
package undesirable.

Here C and C++ are much better than most languages I know.  qsort() is
generally decent for small sorts (though I'd not use it in the case I
just mentioned).  I've not done much work in them recently, but when I
programmed in PL/I and COBOL their sort routines were completely
unsuitable if one was doing a large number of small sorts (unlike C's,
they were extremely good for very large sorts).

Michael M Rubenstein




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-16  0:00                 ` Mark Wooding
@ 1996-08-17  0:00                   ` Dan Pop
  1996-08-17  0:00                     ` Tim Behrendsen
  1996-08-21  0:00                     ` Tanmoy Bhattacharya
  0 siblings, 2 replies; 688+ messages in thread
From: Dan Pop @ 1996-08-17  0:00 UTC (permalink / raw)



In <slrn517h1d.483.mdw@excessus.demon.co.uk> mdw@excessus.demon.co.uk (Mark Wooding) writes:

>Gabor Egressy <gegressy@uoguelph.ca> wrote:
>
>> Why oh why would you want to start with assembly? Assembly is great
>> for writing viruses and small code that needs to be fast but is a pain
>> to write and maintain.
>
>Eeek.  It's all evil lies.  It's a vicious rumour put about by a secret
>cabal for their own nefarious purposes.
>
>I'll admit that I've written assembler code which is almost utterly
>illegible to me now.  I've also written some awful C code, so that
>doesn't mean much.  However, I have /lots/ more beautiful looking and
>instantly readable assembler code.

Instantly readable to whom?  Definitely not to another assembly programmer
who doesn't know your particular assembly language(s).

>Just because it's low-level stuff doesn't mean it has to be hard to
>understand.  Just like any other language, if you take a bit of care to
>present your code nicely, it will be readable; if you don't, it will be
>ghastly.

Readable or not, it's still a hell to port it to another architecture.
Porting it even to another OS running on the same architecture might
not be exactly a piece of cake.

Maintaining a piece of assembly code which has been already optimized
for a certain modern processor is also a royal pain in the ass if you
want the result to be still optimal.  The days when the only concern
was to get it right from the logical point of view are long gone: if
your assembly code is slower than the compiler output, what is the point
in using assembly in the first place?  And the cases when the assembly
code is faster than the compiler output are fewer and fewer and farther
between on the current CISC and RISC architectures.

Dan
--
Dan Pop
CERN, CN Division
Email: Dan.Pop@cern.ch 
Mail:  CERN - PPE, Bat. 31 R-004, CH-1211 Geneve 23, Switzerland




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-10  0:00           ` Mike Rubenstein
  1996-08-11  0:00             ` Szu-Wen Huang
@ 1996-08-17  0:00             ` Richard Chiu
  1996-09-04  0:00               ` Lawrence Kirby
  1 sibling, 1 reply; 688+ messages in thread
From: Richard Chiu @ 1996-08-17  0:00 UTC (permalink / raw)



miker3@ix.netcom.com (Mike Rubenstein) writes:
>But that is not always true.

>A number of years ago I developed a program that had to do a large
>number of sorts with the following characteristics:

>	1.  The mean number of items to be sorted was about 5.  In a 
>	    test sample of a million cases, the larges number of items
>	    to be sorted was 35.

>	2.  The items were usually in order.  In the test sample, 90% 
>	    were in order, and most of the rest were in order except 
>	    for a single interchange of adjacent items.  Only 8 were 
>	    out of order by more than three interchanges or required 
>	    interchanges of nonadjacent items.

>Care to try this with quicksort?  or heapsort?  A good old O(n^2)
>insertion sort works quite nicely.

By the same thread of reasoning, a program that does nothing but
return the inputs will be the best algorithm for sorting if the 
numbers are already sorted. (Which is true!)

Richard




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-08-16  0:00               ` Lawrence Kirby
@ 1996-08-17  0:00                 ` Paul Hsieh
  1996-08-17  0:00                   ` Mike Rubenstein
  0 siblings, 1 reply; 688+ messages in thread
From: Paul Hsieh @ 1996-08-17  0:00 UTC (permalink / raw)



Lawrence Kirby wrote:
> 
> In article <4v1r2a$gh6@falcon.le.ac.uk> EB15@le.ac.uk "Dr E. Buxbaum" writes:
> 
> >djohnson@tartarus.ucsd.edu (Darin Johnson) wrote:
> >>> A binary sort, also known as quicksort, or Hoare's sort is covered
> > extensively
> >
> >>"quicksort is the fastest sort" categorization without really
> >>understanding it.
> >
> >A common misconception. Quicksort is fast for UNSORTED data. For data
> >which are largely presorted (a common occurrence if a bunch of additional
> >data has to be added to a list), Quicksort becomes Slowsort.
> 
> Quicksort is faster for sorted data than unsorted data: any passable
> implementation will use at the very least median-of-three pivot selection.

Geez ... pay attention in class next time folks:

On a reversed list, quicksort==bubble sort.
On a shifted list, quicksort==bubble sort.
On most other arrangements quicksort = O(n ln n)

Merge sort is O(n ln n) in all situations and hence is better when reliable
running time is required.

-- 
Paul Hsieh
qed@chromatic.com
http://www.geocities.com/SiliconValley/9498
Graphics Programmer
Chromatic Research

What I say and what my company says are not always the same thing




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-08-15  0:00             ` Should I learn C or Pascal? Richard A. O'Keefe
@ 1996-08-17  0:00               ` Lawrence Kirby
  1996-08-18  0:00                 ` Ken Pizzini
  1996-08-19  0:00                 ` Richard A. O'Keefe
  1996-08-17  0:00               ` Mike Rubenstein
  1996-08-17  0:00               ` Alexander J Russell
  2 siblings, 2 replies; 688+ messages in thread
From: Lawrence Kirby @ 1996-08-17  0:00 UTC (permalink / raw)



In article <4uu9v3$hrp@goanna.cs.rmit.edu.au>
           ok@goanna.cs.rmit.edu.au "Richard A. O'Keefe" writes:

>Quicksort still has a niche in embedded processors,
>although there is a new version of heapsort (also published in the
>Computer Journal, but I can't find the reference) which can challenge it:

My experience of heapsort is that while it can beat quicksort in number
of comparisons it requires more data movement and has more algorithmic
overhead which makes it slower.

>the modern heapsort has the virtue that its worst case is O(nlgn) which
>makes it a better bet for soft-real-time work.

Heapsort has always been O(n log n). You're right that a guaranteed
reasonable worst case is sometimes useful.

>For general use, a well engineered merge sort is as good as a well engineered
>quicksort; sometimes better.

It is close but very rare indeed for a heapsort to run faster than
a quicksort.

>For sorting machine integers, a well engineered radix sort (or even count
>sort if the range is small enough) is so much faster that it isn't funny.
>
>For most programmers, the main issue is that it is cheaper to re-use
>an existing expertly implemented and thoroughly tested sort procedure
>than to write their own buggy code.

As far as the C language is concerned merge sort and radix sort aren't good
choices for qsort().

-- 
-----------------------------------------
Lawrence Kirby | fred@genesis.demon.co.uk
Wilts, England | 70734.126@compuserve.com
-----------------------------------------




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-07  0:00                               ` Tim Behrendsen
  1996-08-08  0:00                                 ` Thomas Hood
@ 1996-08-17  0:00                                 ` Lawrence Kirby
  1996-08-17  0:00                                   ` Tim Behrendsen
  1 sibling, 1 reply; 688+ messages in thread
From: Lawrence Kirby @ 1996-08-17  0:00 UTC (permalink / raw)



In article <01bb846d$c6c01780$87ee6fce@timpent.airshields.com>
           tim@airshields.com "Tim Behrendsen" writes:

>Who's talking about showing them?  I would suggest that if
>they wrote a quicksort in assembler, they will have a much
>better "feel" for the algorithm, than if they wrote it in C.

They might have a good feel for how to implement quicksort in that
particular machine code; however, it would be so wrapped up in
implementation details that they wouldn't have a good understanding of
the algorithm itself, at least not in the same timescale.

-- 
-----------------------------------------
Lawrence Kirby | fred@genesis.demon.co.uk
Wilts, England | 70734.126@compuserve.com
-----------------------------------------




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-17  0:00                   ` Dan Pop
@ 1996-08-17  0:00                     ` Tim Behrendsen
  1996-08-17  0:00                       ` Robert Dewar
                                         ` (2 more replies)
  1996-08-21  0:00                     ` Tanmoy Bhattacharya
  1 sibling, 3 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-17  0:00 UTC (permalink / raw)



Dan Pop <Dan.Pop@cern.ch> wrote in article
<danpop.840244373@news.cern.ch>...
> In <slrn517h1d.483.mdw@excessus.demon.co.uk> mdw@excessus.demon.co.uk
(Mark Wooding) writes:
> 
> >
> >I'll admit that I've written assembler code which is almost utterly
> >illegible to me now.  I've also written some awful C code, so that
> >doesn't mean much.  However, I have /lots/ more beautiful looking and
> >instantly readable assembler code.
> 
> Instantly readable to whom?  Definitely not to another assembly programmer
> who doesn't know your particular assembly language(s).

Well, what sense does that make?  C is not instantly readable to
someone who only knows Fortran.  If you don't know the language,
then it's not going to be instantly readable.

> >Just because it's low-level stuff doesn't mean it has to be hard to
> >understand.  Just like any other language, if you take a bit of care to
> >present your code nicely, it will be readable; if you don't, it will be
> >ghastly.
> 
> Readable or not, it's still a hell to port it to another architecture.
> Porting it even to another OS running on the same architecture might
> not be exactly a piece of cake.

Which is true, but irrelevant to the fact that assembly can be
quite maintainable.

> Maintaining a piece of assembly code which has been already optimized
> for a certain modern processor is also a royal pain in the ass if you
> want the result to be still optimal.  The days when the only concern
> was to get it right from the logical point of view are long gone: if
> your assembly code is slower than the compiler output, what is the point
> in using assembly in the first place?  And the cases when the assembly
> code is faster than the compiler output are fewer and fewer and farther
> between on the current CISC and RISC architectures.

It's extremely rare that hand coded assembly is slower than
compiler output.  Compilers are *extremely* stupid; anyone who
thinks otherwise has either 1) not coded in assembly, or 2) not
viewed the assembly output from compilers.  They are certainly
less stupid than they used to be, but to imagine that on the
average they beat even the average human assembly programmer is
just nonsense.

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-16  0:00                                             ` Szu-Wen Huang
@ 1996-08-17  0:00                                               ` Robert Dewar
  1996-08-20  0:00                                                 ` Szu-Wen Huang
  1996-08-21  0:00                                                 ` Tanmoy Bhattacharya
  1996-08-17  0:00                                               ` Robert Dewar
                                                                 ` (3 subsequent siblings)
  4 siblings, 2 replies; 688+ messages in thread
From: Robert Dewar @ 1996-08-17  0:00 UTC (permalink / raw)



"Wrong order.  An O(n) program where n=1,000 that takes 1 second to
complete should be able to finish n=2,000 in 2 seconds, not the other
way around.  In other words, you can't derive time complexity by
timing an algorithm, *especially* with only two samples."

This statement is wrong (big-O seems to be generating a lot of confusion).
big-O notation is about asymptotic behavior. If you know an algorithm
is O(N) and you know that the time for n=1000 is one second, you know
ABSOLUTELY NOTHING about the time for n=2000, NOTHING AT ALL!

For example, suppose the behavior of an algorithm is

  for n up to 1000, time is 1 second
  for n greater than 1000, time is 1000*n seconds

that's clearly O(N), but the time for 2000 items will be 2_000_000 seconds.

A good reference for big-O notation is Mike Feldman's data structure book.
It emphasizes big-O more than some other books at this level, and has a
clear explanation and some nice examples. If everyone reads this we will
have a clearer discussion here (and avoid things like O(2N) :-)
Of  course what it has to do with the origins of the thread is obscure :-)





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-17  0:00                     ` Tim Behrendsen
  1996-08-17  0:00                       ` Robert Dewar
  1996-08-17  0:00                       ` Dan Pop
@ 1996-08-17  0:00                       ` Peter Seebach
  1996-08-18  0:00                         ` Tim Behrendsen
  2 siblings, 1 reply; 688+ messages in thread
From: Peter Seebach @ 1996-08-17  0:00 UTC (permalink / raw)



In article <01bb8c6d$c62d44c0$87ee6fce@timpent.airshields.com>,
Tim Behrendsen <tim@airshields.com> wrote:
>Well, what sense does that make?  C is not instantly readable to
>someone who only knows Fortran.  If you don't know the language,
>then it's not going to be instantly readable.

I object to this.  I found C instantly readable.  When I was a kid,
I had the source (in C) to a game called hack.  I had never even heard
of C; I didn't know what it was, I just knew that it supposedly was
"the source" to hack.  (My understanding of source had to do with an
unnamed language in which every statement had a number, and there were
52 variables, half of which were strings, and half of which were
numbers.)

But when I wanted to know how the game worked, I paged through printouts.
Nothing hard about it.  What's tricky (to someone who knows that "HP: 12/12"
means you have 12 out of a possible 12 "hit points") about
	u.hp = u.maxhp = 12;
?

>It's extremely rare that hand coded assembly is slower than
>compiler output.

Depends.  Recently, I saw someone looking for a good way to speed something
up.  The C version he ended up using (admittedly, non-portable C) was
actually about 2 instructions *faster* than a previous hand-tuned assembly
implementation.  Sure, he could have done it in assembly, too, but it would
have been exactly the same code.

However, I suspect that the realization that led to the improvement would have
been less obvious in assembly.

>Compilers are *extremely* stupid; anyone who
>thinks otherwise has either 1) not coded in assembly, and 2) not
>viewed the assembly output from compilers.  They are certainly
>less stupid than they used to be, but to imagine that on the
>average they beat even the average human assembly programmer is
>just nonsense.

I don't know that much about assembly programmers, but assuming that the
average assembly programmer has a level of skill at least comparable to the
average human C programmer, I'd expect a pretty pathetic compiler to beat the
assembly programmer about half the time.

Most programmers of any language appear to be dodgy at best.

-s
-- 
Peter Seebach - seebs@solon.com - Copyright 1996 - http://www.solon.com/~seebs
Unix/C Wizard - send mail for help, or send money for consulting!
The *other* C FAQ, the hacker FAQ, et al.  See web page above.
Unsolicited email (junk mail and ads) is unwelcome, and will be billed for.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-08-15  0:00             ` Should I learn C or Pascal? Richard A. O'Keefe
  1996-08-17  0:00               ` Lawrence Kirby
  1996-08-17  0:00               ` Mike Rubenstein
@ 1996-08-17  0:00               ` Alexander J Russell
  2 siblings, 0 replies; 688+ messages in thread
From: Alexander J Russell @ 1996-08-17  0:00 UTC (permalink / raw)



In article <4uu9v3$hrp@goanna.cs.rmit.edu.au>, ok@goanna.cs.rmit.edu.au says...
>
>djohnson@tartarus.ucsd.edu (Darin Johnson) writes:
>>Actually, I learned it freshman year, but didn't understand it.
>...
>>I think many of my classmates kept the
>>"quicksort is the fastest sort" categorization without really
>>understanding it.  Too many people fall asleep in algorithms class
>>(then bitch about the waste of time later).
>
>The *really* sad thing here is that quicksort is *not* the fastest sort.
>Quicksort was specifically designed (see Hoare's paper in Computer
>Journal, 1960 or 61, can't remember the issue) for a machine with 256
>words of memory.  Not 256M.  Not 256k.  Two hundred and fifty-six words
>of main memory.  Backing store was a drum with 16384 words, transferred
>in 64 word pages.  Hoare knew at the time that it did more comparisons
>than merge sort, but merge sort need extra memory that simply wasn't there.
>
>Quicksort still has a niche in embedded processors,
>although there is a new version of heapsort (also published in the
>Computer Journal, but I can't find the reference) which can challenge it:
>the modern heapsort has the virtue that its worst case is O(nlgn) which
>makes it a better bet for soft-real-time work.
>
>For general use, a well engineered merge sort is as good as a well engineered
>quicksort; sometimes better.
>
>For sorting machine integers, a well engineered radix sort (or even count
>sort if the range is small enough) is so much faster that it isn't funny.
>
>For most programmers, the main issue is that it is cheaper to re-use
>an existing expertly implemented and thoroughly tested sort procedure
>than to write their own buggy code.
>-- 
>Australian citizen since 14 August 1996.  *Now* I can vote the xxxs out!
>Richard A. O'Keefe; http://www.cs.rmit.edu.au/~ok; RMIT Comp.Sci.

heap sort: average N log N, worst case N log N
quick sort: average N log N, worst case N^2

But heap sort is more complicated, requires more code and generally runs slower 
than quick-sort for random data.

Quick-Sort is more fragile, and is really bad for sorting sorted data.

So, heap sort CAN be faster than quick-sort, but it is false to claim that it 
is ALWAYS faster.

In general it is good to know a number of sorting algorithms and use the one 
best suited to the data being sorted.
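
For anyone following along at home, a compact heapsort is short enough to
quote whole (a sketch for an int array, not from the original post):

    #include <stddef.h>

    /* Restore the max-heap property below `root', for a heap whose last
       valid index is `end'. */
    static void sift_down(int a[], size_t root, size_t end)
    {
        size_t child;
        int t;
        while (root * 2 + 1 <= end) {
            child = root * 2 + 1;                 /* left child */
            if (child < end && a[child] < a[child + 1])
                child++;                          /* right child is larger */
            if (a[root] >= a[child])
                return;
            t = a[root]; a[root] = a[child]; a[child] = t;
            root = child;
        }
    }

    void heap_sort(int a[], size_t n)
    {
        size_t i;
        int t;
        if (n < 2)
            return;
        for (i = n / 2; i-- > 0; )                /* build the heap */
            sift_down(a, i, n - 1);
        for (i = n - 1; i > 0; i--) {             /* repeatedly pull off the max */
            t = a[0]; a[0] = a[i]; a[i] = t;
            sift_down(a, 0, i - 1);
        }
    }

The extra code is mostly sift_down(), and the data movement it does on every
extraction is much of why quicksort usually wins on random data in practice.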
-- 
The AnArChIsT! Anarchy! Not Chaos!
aka
Alex Russell
alexad3@iceonline.com





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-17  0:00                   ` Lawrence Kirby
@ 1996-08-17  0:00                     ` Robert Dewar
  1996-08-20  0:00                       ` Lawrence Kirby
  0 siblings, 1 reply; 688+ messages in thread
From: Robert Dewar @ 1996-08-17  0:00 UTC (permalink / raw)



What on earth is a "list quicksort"? As soon as you are not trying to do
things in place, why on earth would you use quicksort, when mergesort is
obviously as efficient and has a worst case performance of NlogN.
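
(For what it's worth, a list merge sort really is only a handful of lines --
a sketch with a hypothetical node type, not from the original post:)

    #include <stddef.h>

    struct node { int key; struct node *next; };

    /* Merge two already-sorted lists into one. */
    static struct node *merge(struct node *a, struct node *b)
    {
        struct node head, *tail = &head;
        while (a != NULL && b != NULL) {
            if (a->key <= b->key) { tail->next = a; a = a->next; }
            else                  { tail->next = b; b = b->next; }
            tail = tail->next;
        }
        tail->next = (a != NULL) ? a : b;
        return head.next;
    }

    /* Split the list in half with slow/fast pointers, sort each half, merge. */
    struct node *list_merge_sort(struct node *list)
    {
        struct node *slow, *fast, *back;
        if (list == NULL || list->next == NULL)
            return list;
        slow = list;
        fast = list->next;
        while (fast != NULL && fast->next != NULL) {
            slow = slow->next;
            fast = fast->next->next;
        }
        back = slow->next;
        slow->next = NULL;
        return merge(list_merge_sort(list), list_merge_sort(back));
    }

No extra memory beyond the recursion, stable, and NlogN even in the worst case.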





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-17  0:00                     ` Tim Behrendsen
@ 1996-08-17  0:00                       ` Robert Dewar
  1996-08-17  0:00                       ` Dan Pop
  1996-08-17  0:00                       ` Peter Seebach
  2 siblings, 0 replies; 688+ messages in thread
From: Robert Dewar @ 1996-08-17  0:00 UTC (permalink / raw)



Tim says

"It's extremely rare that hand coded assembly is slower than
compiler output.  Compilers are *extremely* stupid; anyone who
thinks otherwise has either 1) not coded in assembly, or 2) not
viewed the assembly output from compilers.  They are certainly
less stupid than they used to be, but to imagine that on the
average they beat even the average human assembly programmer is
just nonsense.
"

This is a reasonable statement for 1975, but not for today, and makes me
think you have not coded for modern processors with reasonable levels of
ILP. Coding for such machines by hand is extremely difficult, and it is
often the case that compilers can do better.

At the very least you need an optimizing assembler that will do the
scheduling. Trace scheduling and speculative execution are not something
that are easy to deal with by hand. Register renaming can also immensely
complicate hand coding, but is something a compiler can do reasonably
well.






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-17  0:00                                 ` Lawrence Kirby
@ 1996-08-17  0:00                                   ` Tim Behrendsen
  1996-08-19  0:00                                     ` Bob Gilbert
  1996-08-22  0:00                                     ` Bengt Richter
  0 siblings, 2 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-17  0:00 UTC (permalink / raw)



Lawrence Kirby <fred@genesis.demon.co.uk> wrote in article
<840288278snz@genesis.demon.co.uk>...
> In article <01bb846d$c6c01780$87ee6fce@timpent.airshields.com>
>            tim@airshields.com "Tim Behrendsen" writes:
> 
> >Who's talking about showing them?  I would suggest that if
> >they wrote a quicksort in assembler, they will have a much
> >better "feel" for the algorithm, than if they wrote it in C.
> 
> They might have a good feel for how to implement quicksort in that
> particular machine code however it would be so wrapped up in
> implementation details that they wouldn't have a good understanding of
> the algorithm itself, at last not in the same timescale.

And they don't get wrapped up in implementation
details if they write it in C?  My primary point is
when it's implemented in assembly, and you have to
manually move the bytes from one location to another
rather than have the compiler "carry them for you",
you get a better feel for not only what the
algorithm is doing, but what the computer itself is
doing.

The most important thing a student can learn about
computers is the fundamentally simple nature.  The
mystery of execution *must* be broken down in order for
the student to begin to think like a programmer.
Once they can think like a programmer, all the rest
of the knowledge they learn becomes trivial.

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-17  0:00                     ` Tim Behrendsen
  1996-08-17  0:00                       ` Robert Dewar
@ 1996-08-17  0:00                       ` Dan Pop
  1996-08-18  0:00                         ` Mark Wooding
  1996-08-17  0:00                       ` Peter Seebach
  2 siblings, 1 reply; 688+ messages in thread
From: Dan Pop @ 1996-08-17  0:00 UTC (permalink / raw)



In <01bb8c6d$c62d44c0$87ee6fce@timpent.airshields.com> "Tim Behrendsen" <tim@airshields.com> writes:

>It's extremely rare that hand coded assembly is slower than
>compiler output.  Compilers are *extremely* stupid; anyone who
>thinks otherwise has either 1) not coded in assembly, or 2) not
>viewed the assembly output from compilers.  They are certainly
>less stupid than they used to be, but to imagine that on the
>average they beat even the average human assembly programmer is
>just nonsense.

Try to test your assertions with a nontrivial piece of code on a P6
or any modern RISC processor.  The human mind is very badly adapted to
the task of generating optimal code for these architectures, while
compilers could be "trained" for this job a lot more easily and
successfully.

There are "pathological" cases, when an algorithm cannot be suitably
implemented in a HLL but can be implemented in only a few assembly lines
(this is especially true for some data conversion or multiple precision
arithmetic algorithms), but these are the exceptions, not the rule.  

The days of Z80 and i8086, when a relatively inexperienced assembly
programmer could beat the best compiler of the day single handed are
long gone.

Dan
--
Dan Pop
CERN, CN Division
Email: Dan.Pop@cern.ch 
Mail:  CERN - PPE, Bat. 31 R-004, CH-1211 Geneve 23, Switzerland




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-15  0:00                   ` James_Rogers
@ 1996-08-17  0:00                     ` Tim Behrendsen
  0 siblings, 0 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-17  0:00 UTC (permalink / raw)





James_Rogers <jrogers@velveeta.apdev.cs.mci.com> wrote in article
<32134490.41C6@velveeta.apdev.cs.mci.com>...
> Peter Seebach wrote:
> > Tim Behrendsen <tim@airshields.com> wrote:
> >>>>int a[50000],b[50000],c[50000],d[50000],e[50000];
> 
> >>>>void test1()
> >>>>{
> >>>>  int i,j;
> >>>>  for (j=0;j<10;++j){
> >>>>    for (i=0;i<50000;++i){
> >>>>      ++a[i];++b[i];++c[i];++d[i];++e[i];
> >>>>    }
> >>>>  }
> >>>>}
> 
> >>>>void test2()
> >>>>{
> >>>>  int i,j;
> >>>>  for (j=0;j<10;++j){
> >>>>    for (i=0;i<50000;++i) ++a[i];
> >>>>    for (i=0;i<50000;++i) ++b[i];
> >>>>    for (i=0;i<50000;++i) ++c[i];
> >>>>    for (i=0;i<50000;++i) ++d[i];
> >>>>    for (i=0;i<50000;++i) ++e[i];
> >>>>  }
> >>>>}
> 
> This example also behaves differently than predicted by
> Mr Behrendsen when implemented in Ada 95 using the GNAT
> compiler on a SPARC 20.
> 
> I translated the above functions into Ada as follows:
>    procedure test1 is
>    begin
>       for J in 1..10 loop
>          for I in index_type loop
>             a(I) := a(I) + 1;
>             b(I) := b(I) + 1;
>             c(I) := c(I) + 1;
>             d(I) := d(I) + 1;
>             e(I) := e(I) + 1;
>          end loop;
>       end loop;
>   end test1;
>   
>   procedure test2 is
>   begin
>      for J in 1..10 loop
>         for I in index_type loop
>            a(I) := a(I) + 1;
>         end loop;
>         for I in index_type loop
>            b(I) := b(I) + 1;
>         end loop;
>         for I in index_type loop
>            c(I) := c(I) + 1;
>         end loop;
>         for I in index_type loop
>            d(I) := d(I) + 1;
>         end loop;
>         for I in index_type loop
>            e(I) := e(I) + 1;
>         end loop;
>      end loop;
>   end test2;
> 
> Each of the above loops was run 10 times.  The
> timings are given below.
> 
> The resulting timings with no optimization are:
> Test1 execution time:          12.2815
> Test2 execution time:          16.0227
> 
> With full optimization and inlining of procedures the 
> results are:
> Test1 execution time:           4.6333
> Test2 execution time:           5.2464
> 
> Using Ada the performance is quite comparable to Mr Seebach's
> C results using gcc.  The Ada results also do not support Mr
> Behrendsen's position.

My original timings were off by a factor of 10, BTW.  The
original implementation I ran used 100 for the outer loop,
not 10.

But this *is* interesting; that implies to me that the Ada
compiler is not using pointer arithmetic when optimizing, and
is calculating the indexes every time.  This would make the
overhead of the extra looping constructs overwhelm the
gain in page locality.

Just for some Ada to C performance comparison, it would be
interesting to compile the C code on your SPARC 20 and see
what the difference is.  Since Ada has runtime array boundary
checking, it makes me wonder if the optimizer is smart enough
to know to get rid of the boundary checking when it's within
a known loop like this.

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-15  0:00                 ` Robert Dewar
@ 1996-08-17  0:00                   ` Lawrence Kirby
  1996-08-17  0:00                     ` Robert Dewar
  0 siblings, 1 reply; 688+ messages in thread
From: Lawrence Kirby @ 1996-08-17  0:00 UTC (permalink / raw)



In article <dewar.840108016@schonberg> dewar@cs.nyu.edu "Robert Dewar" writes:

>Gabor says
>
>"Quicksort isn't that hard to understand. Just grab a deck of cards and plow
>through it. A deck of cards works for all the sorts I know and care to
>know."
>
>I would disagree with this. The divide and conquer paradigm of QS is of
>course trivial to understand if you understand recursion (although for
>starting students, that can be a very big if!)
>
>However, the algorithm for the in place partition is quite tricky to
>get exactly right, and I have often seen slipups in coding it.

One problem is that there are several possible partitioning algorithms,
even for in place partitioning. Also, in place partitioning isn't
necessarily the easiest or best approach for cards (something closer
to a list quicksort is probably better).
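
To illustrate how little code is involved once you pick one: here is the
simpler (Lomuto-style) in-place partition, a sketch rather than anything from
the original posts; the two-index Hoare-style scheme is the one that is
trickier to get exactly right:

    #include <stddef.h>

    /* Partition a[lo..hi] around the pivot a[hi]; elements less than the
       pivot end up to its left.  Returns the pivot's final position. */
    size_t partition(int a[], size_t lo, size_t hi)
    {
        int pivot = a[hi], t;
        size_t i = lo, j;

        for (j = lo; j < hi; j++) {
            if (a[j] < pivot) {
                t = a[i]; a[i] = a[j]; a[j] = t;
                i++;
            }
        }
        t = a[i]; a[i] = a[hi]; a[hi] = t;
        return i;
    }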

-- 
-----------------------------------------
Lawrence Kirby | fred@genesis.demon.co.uk
Wilts, England | 70734.126@compuserve.com
-----------------------------------------




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-16  0:00                                             ` Szu-Wen Huang
  1996-08-17  0:00                                               ` Robert Dewar
@ 1996-08-17  0:00                                               ` Robert Dewar
  1996-08-18  0:00                                               ` Steve Heller
                                                                 ` (2 subsequent siblings)
  4 siblings, 0 replies; 688+ messages in thread
From: Robert Dewar @ 1996-08-17  0:00 UTC (permalink / raw)



Szu-Wen said

"Wrong order.  An O(n) program where n=1,000 that takes 1 second to
complete should be able to finish n=2,000 in 2 seconds, not the other
way around.  In other words, you can't derive time complexity by
timing an algorithm, *especially* with only two samples."

Well this thread has wandered far from its subject, and also far from
the expertise of the posters to the thread I fear. The above statement
is quite wrong, as is the one to which it responds.

big-O notation is about asymptotic behavior. If you know an algorithm
is O(N) and you know that the time for n=1000 is one second, you know
ABSOLUTELY NOTHING about the time for n=2000, NOTHING AT ALL! 

For example, suppose the behavior of an algorithm is

  for n up to 1000, time is 1 second
  for n greater than 1000, time is 1000*n seconds

that's clearly O(N), but the time for 2000 items will be 2_000_000 seconds.

A good reference for big-O notation is Mike Feldman's data structure book.
It emphasizes big-O more than some other books at this level, and has a
clear explanation and some nice examples. If everyone reads this we will
have a clearer discussion here (and avoid howlers like O(2N) :-)
Of  course what it has to do with the origins of the thread is obscure :-)





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-16  0:00                   ` Bob Gilbert
@ 1996-08-17  0:00                     ` Tim Behrendsen
  1996-08-18  0:00                       ` Robert Dewar
  1996-08-19  0:00                       ` John Hobson
  0 siblings, 2 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-17  0:00 UTC (permalink / raw)



Bob Gilbert <rgilbert@unconfigured.xvnews.domain> wrote in article
<4v1pnf$8du@zeus.orl.mmc.com>...
> In article <01bb89f1$31be4f60$87ee6fce@timpent.airshields.com>, "Tim Behrendsen" <tim@airshields.com> writes:

> > is truly understood or not.  Solving a problem is not an
> > "abstraction" question, it's a question of breaking it down
> > into fundamental steps of data transformations.
> 
> Guess it depends on your definition of "solving a problem".
> I see it as defining the problem, which usually requires a
> certain amount of abstraction, developing an algorithm which
> provides a solution (more abstraction), and finally implementing
> the algorithm in the machine (coding/programming).  I think you
> are placing too much emphasis on the last step.

Perhaps, but CS has gone so abstraction-happy that I think
it has wandered away from the basic tenet that computers
are supposed to *do* something.

> > So what?  You can always learn languages, but you only have
> > one shot at teaching someone to think like a programmer.
> 
> You're losing me here..  Why only one shot?

Because I've found that people tend to stick with the first
[dare I use the word] paradigm that they are introduced to.
Everything else they learn will be compared against the first
thing they learn, and if the first thing they learn is the
step-by-step nature of the computer through a simple assembly
language (they don't have to start with the most sophisticated RISC
chip on the planet), then they'll always have that safe, simple
place to go back to when trying to understand more sophisticated
concepts.

> > What gets missed in all this is that *computers are simple*!
> 
> Some (perhaps only a few) computers are simple.  While computers are
> designed around some basic (simple) concepts, start adding caches, 
> pipeline architectures, memory paging schemes, multiple register
> sets, any of the various DMA capabilities or co-processor's which
> provide for some parallelism, etc., and things get very complicated
> very quick.  There aren't many computers made these days that don't
> employ most (and many more) of the above features.  To effectively
> program in assembly one must fully understand these architectural
> issues, and to me that is one of the advantages of teaching a HLL
> first as it allows the student to study and understand the development
> and implementation of algorithms without being distracted by all the
> low level architectural details.

But all the caching, etc. are performance optimizations.  That
is irrelevant to the fact that they're still "load and execute"
cycles in the simple case.

You don't *have* to start them out with all this.  They can learn
just as well on a simple processor.

> > Look at ol' Darin; if he didn't have enough brainpower left
> > to focus on the algorithm at hand, what was he being confused
> > by?
> 
> Perhaps an attractive girl in the class :-). 
> 
> > I think it was by all the language and abstraction, 
> 
> I always thought abstraction was used to avoid confusion by
> elevating away from all the little low level details.  Lets 
> you see the forest for the trees kind of thing.

That is the theory, and that theory *has failed* based on my
testing of newly graduated students.  What I find interesting
is that nobody is defending that newly graduated students are
well prepared to program real programs.  They just plain aren't.

IMO languages such as C are far, far more difficult than
assembly language.  We've had a few posts during this discussion
from people (including myself) who learned assembly early on as
their primary language.  They all felt that assembly was what
crystallized the concept of computers in their mind.  This
shouldn't be that hard to believe; there aren't that many
fundamental instructions in assembly, and nothing about the
computer's nature is hidden (or abstracted) away.

I'm not sure how this thought that "assembly is hard" came
about.  The only thing I can think of is that so few people learn
assembly nowadays that they think it is all black magic, and
anything with mystery is assumed to be hard.

> > which is not what programming is.
> 
> Right, studying algorithms is not programming.  Programming is
> implementing algorithms.

Agreed.  If I had my way, I would give students one very
simple program a day to implement in assembly.  One after
another, just to get them used to implementing an algorithm
and *watching it run*.  That latter is the most important when
learning about the procedural nature of the computer.  The
most important thing they can learn is that there is nothing
"magic" about the computer, it's all take an instruction,
execute it, and move on.
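
That "take an instruction, execute it, and move on" loop is small enough to
show in its entirety (a toy sketch, not from the original post):

    /* A toy machine: one accumulator, a few opcodes, and the fetch-decode-
       execute cycle that every real CPU is an elaboration of. */
    enum { HALT, LOADI, ADDI, STORE };

    void run(const int prog[], int mem[])
    {
        int pc = 0, acc = 0;

        for (;;) {
            int op = prog[pc++];               /* fetch */
            switch (op) {                      /* decode and execute */
            case LOADI: acc  = prog[pc++]; break;
            case ADDI:  acc += prog[pc++]; break;
            case STORE: mem[prog[pc++]] = acc; break;
            case HALT:  return;
            }
        }
    }

    /* e.g. { LOADI, 2, ADDI, 3, STORE, 0, HALT } leaves 5 in mem[0]. */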

> >  If he had been given a solid
> > foundation of the simple nature of the computer, I think he would
> > always have a "comfort zone" he could return to whenever he
> > didn't understand something, because it's *always* implemented
> > in the fundamental terms.
> 
> Sorry, I just don't follow this reasoning.

Well, look at OOP.  This is another thing that has developed
a mythical high level of difficult.  What is OOP?  It is the
most simple concept on earth:

Non-OOP: Execute operation on area of memory.
OOP:     Execute operation abstractly bound to an area of
         memory (AKA object).

That is *all* of it.  All of the rest of it (inheritance, etc.)
are all extensions of the fundamental theory of OOP that we are
assigning a set of operations to an area of memory.  Now
granted, this could be considered an implementation of OOP,
but so what?  The point is that it's simple to understand, and
from there you can go on to the "grand generalizations".
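
To put that in concrete terms, here is roughly what "an operation bound to
an area of memory" amounts to, sketched in plain C (illustrative names, not
from the original post):

    /* Non-OOP: a free function operating on some area of memory. */
    struct counter { int value; };
    void counter_incr(struct counter *c) { c->value++; }

    /* "OOP": the same area of memory, but the operation travels with it. */
    struct obj_counter {
        int value;
        void (*incr)(struct obj_counter *self);
    };
    void obj_incr(struct obj_counter *self) { self->value++; }

    /* usage:  struct obj_counter c = { 0, obj_incr };  c.incr(&c); */

Everything else -- inheritance, virtual dispatch, and so on -- is machinery
built on top of that binding.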

Now, how is OOP taught now?  As far as I can see, they shower
a dump-truck full of terms onto the students, and give them
a whirlwind of abstractions.  If they would focus on the
reality of the implementation, we would see far fewer confused
students.

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-08-17  0:00                 ` Paul Hsieh
@ 1996-08-17  0:00                   ` Mike Rubenstein
  1996-08-19  0:00                     ` Richard A. O'Keefe
  0 siblings, 1 reply; 688+ messages in thread
From: Mike Rubenstein @ 1996-08-17  0:00 UTC (permalink / raw)



Paul Hsieh <qed@xenon.chromatic.com> wrote:

> Lawrence Kirby wrote:
> > 
> > In article <4v1r2a$gh6@falcon.le.ac.uk> EB15@le.ac.uk "Dr E. Buxbaum" writes:
> > 
> > >djohnson@tartarus.ucsd.edu (Darin Johnson) wrote:
> > >>> A binary sort, also known as quicksort, or Hoare's sort is covered extensively
> > >
> > >>"quicksort is the fastest sort" categorization without really
> > >>understanding it.
> > >
> > >A common misconception. Quicksort is fast for UNSORTED data. For data
> > >which are largely presorted (a common occurrence if a bunch of additional
> > >data has to be added to a list), Quicksort becomes Slowsort.
> > 
> > Quicksort is faster for sorted data than unsorted data: any passable
> > implementation will use at the very least median-of-three pivot selection.
> 
> Geez ... pay attention in class next time folks:
> 
> On a reversed list, quicksort==bubble sort.
> On a shifted list, quicksort==bubble sort.
> On most other arrangements quicksort = O(n ln n)
> 
> Merge sort is O(n ln n) in all situations and hence is better when reliable
> running time is required.

You should not have cut class the day they covered the techniques for
making O(N^2) performance very unlikely.  In fact, properly
implemented, quicksort will give O( N log N) performance on both a
reversed list and a shifted list.
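
For the record, the median-of-three pivot selection mentioned above is one of
those techniques, and it is only a few lines (a sketch, not from either post):

    /* Return the median of three candidate pivots.  Using
       median_of_three(a[lo], a[lo + (hi - lo) / 2], a[hi]) makes sorted and
       reverse-sorted inputs split roughly evenly instead of degenerating. */
    int median_of_three(int x, int y, int z)
    {
        if (x < y) {
            if (y < z) return y;           /* x < y < z     */
            return (x < z) ? z : x;        /* x < y, z <= y */
        } else {
            if (x < z) return x;           /* y <= x < z    */
            return (y < z) ? z : y;        /* z <= x, y <= x */
        }
    }

Randomly chosen pivots are the other common defence; either way the quadratic
cases become vanishingly unlikely rather than impossible.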

Michael M Rubenstein




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-16  0:00                                             ` Adam Beneschan
@ 1996-08-18  0:00                                               ` Steve Heller
  1996-08-18  0:00                                                 ` Jeff Dege
  0 siblings, 1 reply; 688+ messages in thread
From: Steve Heller @ 1996-08-18  0:00 UTC (permalink / raw)



adam@irvine.com (Adam Beneschan) wrote:

>So, going back to Robert's statement, if "k", the number of radix digits
>in the numbers, and "N", the number of elements, are allowed to vary
>over the entire space of positive integers, then it would be wrong to
>say the sort time is O(N).  This is because no value of K that satisfies
>the above definition could be found---you could always make "k" large
>enough to make the inequality false.  So O(kN) is correct, as Robert
>said.  Steve's understanding of O-notation is fine for most practical
>purposes, but it's an oversimplification.
  I'll accept your analysis as being correct theoretically. However, I
have never seen a real application where the maximum key length was
not both:
 1. known in advance and 
 2. relatively small relative to the number of elements to be sorted.
  Under those conditions, I believe O(n) is appropriate. Of course, I
welcome corrections if I have misstated something here.




Steve Heller, author and software engineer
http://ourworld.compuserve.com/homepages/steve_heller 





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-16  0:00                                             ` Szu-Wen Huang
  1996-08-17  0:00                                               ` Robert Dewar
  1996-08-17  0:00                                               ` Robert Dewar
@ 1996-08-18  0:00                                               ` Steve Heller
  1996-08-21  0:00                                               ` Matt Austern
  1996-08-23  0:00                                               ` Tanmoy Bhattacharya
  4 siblings, 0 replies; 688+ messages in thread
From: Steve Heller @ 1996-08-18  0:00 UTC (permalink / raw)



huang@mnsinc.com (Szu-Wen Huang) wrote:

>Wrong order.  An O(n) program where n=1,000 that takes 1 second to
>complete should be able to finish n=2,000 in 2 seconds, not the other
>way around.  In other words, you can't derive time complexity by
>timing an algorithm, *especially* with only two samples.
  How about by looking at the algorithm and determining that it makes
two passes through the keys for each key position, which is therefore
O(n) given a fixed maximum key length?


Steve Heller, author and software engineer
http://ourworld.compuserve.com/homepages/steve_heller 





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-18  0:00                                               ` Steve Heller
@ 1996-08-18  0:00                                                 ` Jeff Dege
  1996-08-18  0:00                                                   ` Robert Dewar
  0 siblings, 1 reply; 688+ messages in thread
From: Jeff Dege @ 1996-08-18  0:00 UTC (permalink / raw)



On Sun, 18 Aug 1996 03:43:52 GMT, Steve Heller (heller@utdallas.edu) wrote:
:   I'll accept your analysis as being correct theoretically. However, I
: have never seen a real application where the maximum key length was
: not both:
:  1. known in advance and 
:  2. relatively small relative to the number of elements to be sorted.
:   Under those conditions, I believe O(n) is appropriate. Of course, I
: welcome corrections if I have misstated something here.

I've written code that was intended to work on keys of arbitrary length.
True, in most actual uses of the code, the keys had fixed length, but
the code couldn't depend upon that.  I've had occasions to sort a
couple of dozen strings of 80 characters, on occasion, as well.

What I would consider to be closer to the truth is that if the maximum
key length isn't both:
 1. known in advance and
 2. relatively small relative to the number of elements to be sorted,
nobody would even consider radix sort.

And so it falls quietly out of the set of cases you envision when you
think about radix sorting.

-- 
    Nearly every electrical engineer believes deep in his heart that he
is better at writing computer software than any computer programmer,
and can show as proof the fact that he has written a number of small
applications, each of which was done quickly, easily, and exactly met
his needs.





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-14  0:00                                         ` Robert Dewar
  1996-08-16  0:00                                           ` Dik T. Winter
@ 1996-08-18  0:00                                           ` Glenn Rhoads
  1996-08-19  0:00                                           ` Stephen Baynes
  1996-08-19  0:00                                           ` Richard A. O'Keefe
  3 siblings, 0 replies; 688+ messages in thread
From: Glenn Rhoads @ 1996-08-18  0:00 UTC (permalink / raw)



dewar@cs.nyu.edu (Robert Dewar) writes:

> Why do people try and teach students Bubblesort? It may be an intellectually
> interesting exercise but it is of no use. An insertion sort is simpler and at
> least as fast as a bubble sort. For many practical programming problems an
> insertion sort is a sensible solution, it is very compact and for small
> datasets as fast as anything (Many quicksort implementations switch to
> insertion sort for less than about 7 items). The number of times I have
> had to redirect graduates who have tried to write a small sort using
> Bubblesort (because it was the simplest sort they were taught) or Quicksort
> (because they have been taught it is faster in all cases).

>The one advantage of bubble sort is that it is close to optimal on sorted
>or nearly sorted arrays. You have to be very careful how you write insertion
>sort not to require more compares in the fully sorted case, and you will
>almost certainly find you require more overhead, because of the two nested
>loops. Yes, you could add a special test in the outer loop for already
>being in the right place, but then you complicate the inner loop if you
>want to avoid repeating this comparison. A bubble sort is certainly a
>much simpler solution to the problem of optimal sorting of a sorted
>list, and simplicity of solutions is interesting if performance is NOT
>an issue after all.

I believe you are confused as to which sort is insertion sort and which is
bubble sort.  If you replace every occurrence of bubble sort with insertion
sort and vice-versa in the above paragraph, then you are correct (as is
the poster you are responding to).
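
For the record, here is the straight insertion sort in question (a sketch, not
from either post); on input that is already sorted the inner loop's test fails
immediately, so a full pass costs only n-1 comparisons and essentially no
data movement:

    #include <stddef.h>

    void insertion_sort(int a[], size_t n)
    {
        size_t i, j;
        int key;

        for (i = 1; i < n; i++) {
            key = a[i];
            for (j = i; j > 0 && a[j - 1] > key; j--)
                a[j] = a[j - 1];           /* shift larger elements right */
            a[j] = key;
        }
    }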

-- Glenn Rhoads














^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-17  0:00                     ` Tim Behrendsen
@ 1996-08-18  0:00                       ` Robert Dewar
  1996-08-18  0:00                         ` Tim Behrendsen
  1996-08-26  0:00                         ` Patrick Horgan
  1996-08-19  0:00                       ` John Hobson
  1 sibling, 2 replies; 688+ messages in thread
From: Robert Dewar @ 1996-08-18  0:00 UTC (permalink / raw)



Tim says

"Because I've found that people tend to stick with the first
[dare I use the word] paradigm that they are introduced to.
Everything else they learn will be compared against the first
thing they learn"

How true is this? Certainly true to some extent, and is of course
the fundamental reason why it is a huge mistake to teach assembly
to begin with.

Still, luckily, this can remain a moot issue; no one I know of seriously
proposes teaching CS this way, and no school I know of teaches CS this
way, so we do not have to worry about it.





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-15  0:00                       ` Robert Dewar
  1996-08-16  0:00                         ` Joe Foster
@ 1996-08-18  0:00                         ` Tim Behrendsen
  1996-08-20  0:00                           ` James Youngman
  1996-08-21  0:00                           ` Szu-Wen Huang
  1 sibling, 2 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-18  0:00 UTC (permalink / raw)



Robert Dewar <dewar@cs.nyu.edu> wrote in article
<dewar.840082355@schonberg>...
> "Well, the thing is that real honest-to-goodness assembly is just
> not that difficult to learn, and that has built-in practical
> aspects to it."
> 
> That seems false for many modern RISC architectures, and as ILP becomes
> more and more of a factor, the instruction level semantics will become
> more and more complex.

Actually, RISC is usually easier to learn, just because the
instruction sets are more orthogonal.  Now, optimizing for RISC
machines may be harder, but that's a different issue.

The point is not to learn assembly because they will be using
it every day, the point is to learn it so they can better
understand what computers really are, and how they work.  You
simply can't get the same "Ah-HA!  *Now* I understand!"
experience from programming in a HLL that you can from
programming assembly.

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-18  0:00                                                 ` Robert Dewar
@ 1996-08-18  0:00                                                   ` Steve Heller
  1996-08-18  0:00                                                     ` Robert Dewar
  0 siblings, 1 reply; 688+ messages in thread
From: Steve Heller @ 1996-08-18  0:00 UTC (permalink / raw)



dewar@cs.nyu.edu (Robert Dewar) wrote:

>Steve said

>">No, they are much simpler to analyze, analyzing the left to right
>>radix sort (the only reasonable implementation, the right to left sort
>>is never a good choice), or the address calculation sort, is not easy.
>  Never? That's interesting, because it has very predictable behavior
>and runs very quickly (given enough real memory)."

>Well if the key is one digit, then of course you just have a counting
>sort and it is fine, but if the key is any significant number of
>digits, then NlogN < KN, e.g. 32 bit numbers with less than 2**32
>of them to sort!

  Well, let's see. If we have 2**24 elements of 32 bits each to sort,
and we sort the keys two bytes at a time (64K elements in the counting
table), then we have two passes to count and two to sort, so K is
four. On the other hand, logN, for base 2, is 24. The latter is
clearly 6 times the former, if the same time is required per
operation, and I suspect that the operations in the distribution
counting sort (especially the counting passes) are less complex than
those in quicksort or heapsort.
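
In outline, that two-bytes-at-a-time scheme looks something like this (a
sketch for unsigned keys that fit in 32 bits, not from the original post; the
64K-entry counting table is the one described above):

    #include <stdlib.h>
    #include <string.h>

    void radix_sort_u32(unsigned long a[], size_t n)
    {
        static size_t count[65536];        /* the 64K counting table */
        unsigned long *tmp, *src, *dst, *t;
        size_t i;
        int pass, shift;

        tmp = malloc(n * sizeof *tmp);
        if (tmp == NULL)
            return;                        /* allocation failed; give up */
        src = a;
        dst = tmp;
        for (pass = 0; pass < 2; pass++) {
            shift = pass * 16;
            memset(count, 0, sizeof count);
            for (i = 0; i < n; i++)        /* counting pass */
                count[(src[i] >> shift) & 0xFFFF]++;
            for (i = 1; i < 65536; i++)    /* turn counts into end positions */
                count[i] += count[i - 1];
            for (i = n; i-- > 0; )         /* stable distribution pass */
                dst[--count[(src[i] >> shift) & 0xFFFF]] = src[i];
            t = src; src = dst; dst = t;
        }
        free(tmp);                         /* two passes: result is back in a[] */
    }

Two counting passes plus two distribution passes, exactly the K of four in
the arithmetic above, and no comparisons anywhere.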


Steve Heller, author and software engineer
http://ourworld.compuserve.com/homepages/steve_heller 





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-17  0:00                       ` Peter Seebach
@ 1996-08-18  0:00                         ` Tim Behrendsen
  0 siblings, 0 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-18  0:00 UTC (permalink / raw)



Peter Seebach <seebs@solutions.solon.com> wrote in article
<4v5aji$qmo@solutions.solon.com>...
> In article <01bb8c6d$c62d44c0$87ee6fce@timpent.airshields.com>,
> Tim Behrendsen <tim@airshields.com> wrote:
> >Well, what sense does that make?  C is not instantly readable to
> >someone who only knows Fortran.  If you don't know the language,
> >then it's not going to be instantly readable.
> 
> I object to this.  I found C instantly readable.  When I was a kid,
> I had the source (in C) to a game called hack.  I had never even heard
> of C; I didn't know what it was, I just knew that it supposedly was
> "the source" to hack.  (My understanding of source had to do with an
> unnamed language in which every statement had a number, and there were
> 52 variables, half of which were strings, and half of which were
> numbers.)

Well, there's muddling through, and then there's "instantly readable".
I can muddle through POWER assembly on my RS/6000 box when I'm
decoding a core dump, but I couldn't sit down and start writing
a real program.
 
> But when I wanted to know how the game worked, I paged through printouts.
> Nothing hard about it.  What's tricky (to someone who knows that "HP:
12/12"
> means you have 12 out of a possible 12 "hit points") about
> 	u.hp = u.maxhp = 12;

The tricky part is understanding the structure notation.  If I
was a FORTRAN programmer, for example, that would be completely
alien to me.  It's not I couldn't figure it out eventually, but
it wouldn't be immediately apparent.  Actually, the only FORTRAN
I remember is 66; did they add structures to any later revisions?

OK, how about I've only used BASIC my whole life, and you
shove an APL program in front of me?  :)

-- Tim Behrendsen (tim@airshields.com)





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-16  0:00                   ` Dr. Richard Botting
@ 1996-08-18  0:00                     ` Tim Behrendsen
  1996-08-21  0:00                       ` Szu-Wen Huang
  0 siblings, 1 reply; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-18  0:00 UTC (permalink / raw)



Dr. Richard Botting <dick@silicon.csci.csusb.edu> wrote in article
<4v26j4$hkg@news.csus.edu>...
> [...]
> : > If they'd started him off with explaining, in his native language,
how to
> : sort
> : > things, and given him sample sets of cards to sort, while following
each
> : of a
> : > set of descriptions *in his native language*, he would have
understood
> : > quicksort.
> I first saw this done by a colleague with UK Freshman in their
> first programming class in 1974.   It worked very well.  I stole the
> idea and still use it.
> 
> Works with searching algorithms as well. I still looking for
> a similar thing for linked data.

Hmmm... moving a deck of cards around to learn a sorting
technique.  Reducing the problem to a very low-level set of
movement operations to help in understanding procedurally
what the computer is doing.  <s>Naaah, couldn't work.  Much easier
to focus on the high-level C abstraction of the sorting
algorithm. </s> ;->

-- Tim Behrendsen (tim@airshields.com)





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-18  0:00                       ` Robert Dewar
@ 1996-08-18  0:00                         ` Tim Behrendsen
  1996-08-26  0:00                         ` Patrick Horgan
  1 sibling, 0 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-18  0:00 UTC (permalink / raw)



Robert Dewar <dewar@cs.nyu.edu> wrote in article
<dewar.840342288@schonberg>...
> Tim says
> 
> "Because I've found that people tend to stick with the first
> [dare I use the word] paradigm that they are introduced to.
> Everything else they learn will be compared against the first
> thing they learn"
> 
> How true is this? Certainly true to some extent, and is of course
> the fundamental reason why it is a huge mistake to teach assembly
> to begin with.
> 
> Still, luckily, this can remain a moot issue; no one I know of seriously
> proposes teaching CS this way, and no school I know of teaches CS this
> way, so we do not have to worry about it.

I suppose I should be happy about this, since I can continue
to test for the bright students, and my competition can hire
my rejects.

But I can't imagine that you're happy with the level of
competence of CS graduates.  If EE students can be taught
from the bottom up, and seem to be so much better prepared than
CS students (I don't hear the same level of complaints, in
any case), isn't it time to try something that is working
elsewhere?

- SIGH - I need to open my own CS academy. :-)

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-15  0:00                     ` Bob Gilbert
@ 1996-08-18  0:00                       ` Tim Behrendsen
  0 siblings, 0 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-18  0:00 UTC (permalink / raw)



Bob Gilbert <rgilbert@unconfigured.xvnews.domain> wrote in article
<4uv3ef$do2@zeus.orl.mmc.com>...
> In article <4ut1sv$ngv@solutions.solon.com>, seebs@solutions.solon.com (Peter Seebach) writes:
> > 
> > I don't really think we are going to convince each other.  I believe that
> > the "fundamental procedural nature of the computer" may not be a permanent
> > thing.  I'd rather teach them to think in the general terms, and see that
> > procedural nature as just one more current limitation, like computers which
> > have only 2 MB of RAM.  This will better prepare them for techniques that
> > depend on *not* thinking of the computer as procedural. 
> 
> I certainly agree with this.  When I was first learning to program, computer
> time was extremely valuable, and we were taught to use great care when 
> writing a program to insure that it was as correct as possible to avoid 
> having to *waste* computer time having the compiler find all of your syntax
> errors.  In fact, I even had one professor that deducted points on your
> programming assignments for each additional compilation you required past
> the first two.  This sort of view has certainly changed today, since
> computer time is usually a lot cheaper than a programmer's time.
> 
> Another thing I'm seeing is the greater use of field programmable gate arrays
> (FPGA's) in embedded systems. As the capability and density of FPGA's continues
> to increase, it allows more and more flexibility to program (?) functionality
> into the FPGA.  I suspect in a few years we will be talking "virtual hardware",
> with new and different programming methods, and subsequently new languages, 
> that will be developed and learned.

Certainly there will be new methods, and new architectures, but
computers will *always* be procedural.  What is a computer, except
an engine that does data transformations over time?  Even our
brains work this way in a simplified sense; pattern transformations
from sensory input patterns to muscular output patterns.

Just because you start out by teaching someone a linear point of
view as a foundation doesn't preclude them from understanding more
complex transformation models.

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-18  0:00                                                   ` Steve Heller
@ 1996-08-18  0:00                                                     ` Robert Dewar
  1996-08-20  0:00                                                       ` Steve Heller
  0 siblings, 1 reply; 688+ messages in thread
From: Robert Dewar @ 1996-08-18  0:00 UTC (permalink / raw)



Steve Heller said

"  Well, let's see. If we have 2**24 elements of 32 bits each to sort,
and we sort the keys two bytes at a time (64K elements in the counting
table), then we have two passes to count and two to sort, so K is
four. On the other hand, logN, for base 2, is 24. The latter is
clearly 6 times the former, if the same time is required per
operation, and I suspect that the operations in the distribution
counting sort (especially the counting passes) are less complex than
those in quicksort or heapsort.

"

The distribution counting sort is quite different from a right to left
radix sort (the latter being the sort you do on a sorting machine for
80 column cards, if people still remember!) My comment was on the latter,
so Steve's comment on the former is quite right, but they are two
different algorithms.





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-17  0:00                       ` Dan Pop
@ 1996-08-18  0:00                         ` Mark Wooding
  1996-08-20  0:00                           ` Peter Seebach
  1996-08-21  0:00                           ` Szu-Wen Huang
  0 siblings, 2 replies; 688+ messages in thread
From: Mark Wooding @ 1996-08-18  0:00 UTC (permalink / raw)



Dan Pop <Dan.Pop@cern.ch> wrote:
> In <01bb8c6d$c62d44c0$87ee6fce@timpent.airshields.com> "Tim Behrendsen" <tim@airshields.com> writes:
>
> > ... to imagine that on the average they beat even the average human
> > assembly programmer is just nonsense.
> 
> Try to test your assertions with a nontrivial piece of code on a P6
> or any modern RISC processor.

Can I play?  Or don't you consider ARMs to be `modern' (or RISC) enough?
Or maybe I'm just above average ;-).

Oh, back to the point (or what passes for a point) of this thread.  A
while ago, while traipsing merrily through a disassembly, I discovered
an atrocity, which could only have been generated from something very
similar to the following C code:

	char buf[...];
	char *p;

	...

	while (buf[0]==' ')
	{
	  for (p=buf;p[0]=p[1];p++)
	    ;
	}

	while (buf[strlen(buf)-1]==' ')
	  buf[strlen(buf)-1]=0;

I can't believe that anyone with an understanding of what goes on `under
the covers' would possibly write anything like this without feeling ill.
An inkling of what this would be translated into by any implementation
would surely avoid horrors like this.

Anyone who asks `what's wrong with that' will be shot.
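
(For anyone risking the bullet: a single pass over the buffer does the same
job -- a sketch, not from the original post.)

    #include <string.h>

    /* Strip leading and trailing spaces from buf in place.  One shift and
       one scan, instead of re-copying the whole buffer for every leading
       space and calling strlen() twice per trailing space. */
    void trim_spaces(char *buf)
    {
        size_t start = 0, len;

        while (buf[start] == ' ')
            start++;
        len = strlen(buf + start);
        memmove(buf, buf + start, len + 1);    /* include the '\0' */
        while (len > 0 && buf[len - 1] == ' ')
            buf[--len] = '\0';
    }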
-- 
[mdw]

`When our backs are against the wall, we shall turn and fight.'
		-- John Major





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-08-17  0:00               ` Lawrence Kirby
@ 1996-08-18  0:00                 ` Ken Pizzini
  1996-08-19  0:00                 ` Richard A. O'Keefe
  1 sibling, 0 replies; 688+ messages in thread
From: Ken Pizzini @ 1996-08-18  0:00 UTC (permalink / raw)



In article <840279292snz@genesis.demon.co.uk>,
Lawrence Kirby  <fred@genesis.demon.co.uk> wrote:
>As far as the C language is concerned merge sort and radix sort aren't good
>choices for qsort().

The function prototype for qsort() indeed makes radix sort a poor
choice.  I also see that a "traditional" merge sort's memory
requirements make that a questionable implementation, but there are
in-place merge sort algorithms with reasonable performance; I don't
recall off hand if any actually can compete with quicksort for the
average case (quicksort has a mean inner loop), but I recall that
some of the variants are quite respectable in their worst case
behavior.

My memory is a little fuzzy on this because I tend to use the
implementation offered by the system; it is rarely worth my
time to try and get an application-specific improvement in
my sorts.

		--Ken Pizzini




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-08  0:00                                       ` Tim Behrendsen
  1996-08-08  0:00                                         ` Peter Seebach
  1996-08-09  0:00                                         ` Dan Pop
@ 1996-08-18  0:00                                         ` Sam B. Siegel
  1996-08-19  0:00                                           ` Dan Pop
  2 siblings, 1 reply; 688+ messages in thread
From: Sam B. Siegel @ 1996-08-18  0:00 UTC (permalink / raw)



I have to agree with Tim on this one.  Good working knowledge of
assembler and machine architecture makes it painfully obvious how
memory and data are manipulated.
Sam
"Tim Behrendsen" <tim@airshields.com> wrote:

>Dan Pop <Dan.Pop@cern.ch> wrote in article
><danpop.839450672@news.cern.ch>...
>> In <01bb83f5$923391e0$87ee6fce@timpent.airshields.com> "Tim Behrendsen"
><tim@airshields.com> writes:
>> 
>> >The problem is that we *can't* think purely abstractly,
>> >otherwise we end up with slow crap code.
>> 
>> Care to provide some concrete examples?

>Look at the code-bloated and slow-software world we live in,
>particularly on desktop platforms.  I think this is caused by
>people not truly understanding what's *really* going on.

>For example, look at OOP.  Very naive implementations of OOP
>used a huge amount of dynamic memory allocation, which can cause
>severe performance problems.  That's why I don't use C++ for my
>products; to do it right you have to do a very careful analysis
>of how your classes are going fit together efficiently.

>I've spoken to enough people that have had C++ disasters to
>convince me that the more abstraction there is, the more
>danger there is of inefficient code.  This shouldn't be that
>hard to believe; any time you abstract away details you are
>giving up knowledge of what is going to be efficient.

>I alluded to this in another post, but a good example is Motif
>and X11.  A programmer who only understands Motif, but does not
>understand X11 is going to write slow crap, period.

>> >It is simply not
>> >possible to ignore the way code is structured, and completely
>> >depend on the compiler to save us.
>> 
>> This doesn't make any sense to me.  Could you be a little bit more
>> explicit?
>> 
>> The compiler definitely won't save my ass if I choose to use bubblesort
>> instead of quicksort on a large dataset, but the selection between the
>> two algorithms is made based on an abstraction (algorithm analysis) not
>> on how the compiler generates code for one or another.  It's very likely
>> that quicksort will be better, no matter the compiler and the underlying 
>> platform.
>> 
>> Once you put micro-optimizations based on knowledge about the
>> compiler and/or hardware into the code, you impair both the 
>> readability/maintainability/portability of the code and the opportunities
>> of another compiler, on another platform, to generate optimal code.
>> There are situations when this _has_ to be done, but they are isolated
>> exceptions, not the norm.

>I gave this example in another post, but nobody responded.  I think
>it was too good. :-)  I'll try again ...

>I can prove that your statement is wrong.

>Let's say I have the perfect optimizer that takes C code and
>provides the absolute most efficient translation possible.  Given
>that's the case, it won't improve an O(n) algorithm to an O(n^2)
>algorithm.

>Now, that should mean that I can order my C code into any
>algorithmically valid sequence and end up with exactly the same
>running time, because the optimizer is always perfect.

>Now, we know that this does not reflect the real world.  The
>question is, how does a programmer learn the efficient
>implementations that the optimizer can deal with effectively
>from the boneheaded ones?

>Here's an example:

>int a[50000],b[50000],c[50000],d[50000],e[50000];

>void test1()
>{
>    int i, j;
>    for (j = 0; j < 10; ++j) {
>        for (i = 0; i < 50000; ++i) {
>            ++a[i]; ++b[i]; ++c[i]; ++d[i]; ++e[i];
>        }
>    }
>}

>void test2()
>{
>    int i, j;
>    for (j = 0; j < 10; ++j) {
>        for (i = 0; i < 50000; ++i) ++a[i];
>        for (i = 0; i < 50000; ++i) ++b[i];
>        for (i = 0; i < 50000; ++i) ++c[i];
>        for (i = 0; i < 50000; ++i) ++d[i];
>        for (i = 0; i < 50000; ++i) ++e[i];
>    }
>}

>On my AIX system, test1 runs in 2.47 seconds, and test2
>runs in 1.95 seconds using maximum optimization (-O3).  The
>reason I knew the second would be faster is because I know
>to limit the amount of context information the optimizer has
>to deal with in the inner loops, and I know to keep memory
>localized.

>Now I submit that if I showed the average C programmer
>both programs, they would guess that test1 is faster because
>it has "less code", and that is where abstraction,
>ignorance, and naivete begin to hurt.

>-- Tim Behrendsen (tim@airshields.com)








^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-16  0:00                                             ` Robert Dewar
@ 1996-08-18  0:00                                               ` Steve Heller
  1996-08-18  0:00                                                 ` Robert Dewar
  0 siblings, 1 reply; 688+ messages in thread
From: Steve Heller @ 1996-08-18  0:00 UTC (permalink / raw)



dewar@cs.nyu.edu (Robert Dewar) wrote:

>No, they are much simpler to analyze, analyzing the left to right
>radix sort (the only reasonable implementation, the right to left sort
>is never a good choice), or the address calculation sort, is not easy.
  Never? That's interesting, because it has very predictable behavior
and runs very quickly (given enough real memory).



Steve Heller, author and software engineer
http://ourworld.compuserve.com/homepages/steve_heller 





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-18  0:00                                               ` Steve Heller
@ 1996-08-18  0:00                                                 ` Robert Dewar
  1996-08-18  0:00                                                   ` Steve Heller
  0 siblings, 1 reply; 688+ messages in thread
From: Robert Dewar @ 1996-08-18  0:00 UTC (permalink / raw)



Steve said

">No, they are much simpler to analyze, analyzing the left to right
>radix sort (the only reasonable implementation, the right to left sort
>is never a good choice), or the address calculation sort, is not easy.
  Never? That's interesting, because it has very predictable behavior
and runs very quickly (given enough real memory)."

Well if the key is one digit, then of course you just have a counting
sort and it is fine, but if the key is any significant number of
digits, then NlogN < KN, e.g. 32 bit numbers with less than 2**32
of them to sort!
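
(In concrete terms: with K = 32 binary digits and N = 1,000,000 keys,
lg N is about 20, so N lg N is roughly 2*10**7 comparisons against
K*N = 3.2*10**7 digit inspections; the radix cost only wins once N
gets up toward 2**K.)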





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-18  0:00                                                 ` Jeff Dege
@ 1996-08-18  0:00                                                   ` Robert Dewar
  0 siblings, 0 replies; 688+ messages in thread
From: Robert Dewar @ 1996-08-18  0:00 UTC (permalink / raw)



Jeff says

"What I would consider to be closer to the truth is that if the maximum
key length isn't both:
 1. known in advance and
 2. relatively small relative to the number of elements to be sorted,
nobody would even consider radix sort.
"

No, that's wrong, left to right radix exchange sorts are quite appropriate
for this situation.
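
To make the idea concrete, here is a minimal C sketch of a left-to-right
(radix exchange) sort; it is only an illustration, assuming fixed-width
32-bit unsigned keys to keep it short, and is not code from any
particular library:

#include <stddef.h>

/* MSD radix exchange sort: partition on one bit, starting with the
   most significant, then recurse on each half with the next bit.
   hi is one past the last element; pass bit = 1u << 31 for 32-bit keys. */
static void radix_exchange(unsigned *a, size_t lo, size_t hi, unsigned bit)
{
    size_t i = lo, j = hi;

    if (hi - lo < 2 || bit == 0)
        return;
    while (i < j) {
        if (a[i] & bit) {           /* belongs in the 1s half: swap to end */
            unsigned t = a[i];
            a[i] = a[--j];
            a[j] = t;
        } else {
            i++;
        }
    }
    radix_exchange(a, lo, i, bit >> 1);   /* keys with this bit clear */
    radix_exchange(a, i, hi, bit >> 1);   /* keys with this bit set */
}

A call like radix_exchange(a, 0, n, 1u << 31) sorts n such keys; the
recursion stops as soon as a partition has fewer than two elements, so
keys that differ in their leading digits are separated cheaply.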





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-19  0:00                       ` John Hobson
@ 1996-08-19  0:00                         ` Tim Behrendsen
  1996-08-19  0:00                           ` John Hobson
  1996-08-23  0:00                           ` Alan Bowler
  0 siblings, 2 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-19  0:00 UTC (permalink / raw)



John Hobson <jhobson@ceco.ceco.com> wrote in article
<32187C76.41C67EA6@ceco.ceco.com>...
> Tim Behrendsen wrote:
> <snip>
> 
> > I'm not sure how this thought that "assembly is hard" came
> > about.  The only thing I can think of is that so few people learn
> > assembly nowadays that they think it is all black magic, and
> > anything with mystery is assumed to be hard.
> <more snippage>
> 
> I can certainly tell you why I think that assembler is hard -- it's
> because assembler is so long winded.  If you want to write the
> equivalent of a = b / c; in assembler, it takes about 5 or 6 lines.
> In *The Mythical Man-Month*, Frederick Brooks says that the best gauge
> of programmer productivity is lines of code written per <time period>.
> Since assembler takes so many more lines to do something than (say) C
> does, it at least appears harder because it takes longer.

I think Brooks' point is to measure relative programmer
productivity in the same language, not different languages.  What
is it the average programmer produces; like, 100 lines/month
of fully debugged code or some low number?  Obviously, it's
not the amount of typing that influences productivity.

In fact, let's *dramatically* increase productivity and move
to APL!  Heck, I remember a friend of mine wrote a "moon rocket
lander" game in 1 line of APL on a dare.  A very *long* line,
mind you...

> As Tim also says:
> > Because I've found that people tend to stick with the first
> > [dare I use the word] paradigm that they are introduced to.
> > Everything else they learn will be compared against the first
> > thing they learn, and if the first thing they learn is the
> > step-by-step nature of the computer through a simple assembly
> > language (they don't have to start with the sophisticated RISC
> > chip on the planet), then they'll always have that safe, simple
> > place to go back to when trying to understand more sophisticated
> > concepts.
> 
> I have had exposure to assembler (IBM BAL on a 370 -- BAL standing for
> Basic Assembly Language), and I feel towards it much as Martin Luther
> felt towards monasticism.  After he broke with Rome, Luther damned
> monasteries and all that they stood for, but he also said that he was
> glad that he had had the experience.  IMHO, I have found it very useful
> at times to have some sort of idea of what the computer is actually
> doing.  I will also say that there are times when the best way to debug
> programs is to read the assembly listing.  However, I am very glad that
> I don't actually have to write assembler.  The first language I learnt
> was FORTRAN II, which was really just one step up from assembler
> (e.g., the FORTRAN arithmetic IF can easily be seen as a macro) and
> this gave me a good insight into what the machine was doing as it
> executed my code.  It also gave me an appreciation of the ease that
> things like true IF-THEN-ELSE structures gave the programmer.

I don't think anyone is advocating a return to assembly for
"real world" large-project purposes, only for educational
purposes.  I'll add you to the list of people who found it
to be of value early on.  :-)

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-19  0:00                         ` Tim Behrendsen
@ 1996-08-19  0:00                           ` John Hobson
  1996-08-20  0:00                             ` Szu-Wen Huang
  1996-08-23  0:00                           ` Alan Bowler
  1 sibling, 1 reply; 688+ messages in thread
From: John Hobson @ 1996-08-19  0:00 UTC (permalink / raw)
  To: Tim Behrendsen


Tim Behrendsen wrote:
> 
> John Hobson <jhobson@ceco.ceco.com> wrote in article
> <32187C76.41C67EA6@ceco.ceco.com>...
> > Tim Behrendsen wrote:

> > In *The Mythical Man-Month*, Frederick Brooks says that the best gauge
> > of programmer productivity is lines of code written per <time period>.
> > Since assembler takes so many more lines to do something than (say) C
> > does, it at least appears harder because it takes longer.
> 
> I think Brooks' point is to measure relative programmer
> productivity in the same language, not different languages.  What
> is it the average programmer produces; like, 100 lines/month
> of fully debugged code or some low number?  Obviously, it's
> not the amount of typing that influences productivity.

Actually, Brooks says specifically that it is a measure of programmer
productivity REGARDLESS of language.  One of his points is
that this is an argument in favour of HLLs.

> I don't think anyone is advocating a return to assembly for
> "real world" large-project purposes, only for educational
> purposes.  I'll add you to the list of people who found it
> to be of value early on.  :-)

Yes, you can put me down as one of the people who is in favour of
learning assembler.  You can also put me down as one who is grateful
that he does not actually have to program in it.

-- 
John Hobson             |Whenever someone says to me,
Unix Support Group      |"Have a nice day", I reply,
ComEd, Chicago, IL, USA |"Sorry, I've made other plans."
jhobson@ceco.ceco.com   |	-- Sir Peter Ustinov




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-07-28  0:00                 ` J. Christian Blanchette
                                     ` (8 preceding siblings ...)
  1996-08-15  0:00                   ` Teaching sorts [was Re: What's the best language to start with?] Norman H. Cohen
@ 1996-08-19  0:00                   ` Ted Dennison
  1996-08-23  0:00                     ` Richard A. O'Keefe
  9 siblings, 1 reply; 688+ messages in thread
From: Ted Dennison @ 1996-08-19  0:00 UTC (permalink / raw)



> In article <dewar.839593893@schonberg>, dewar@cs.nyu.edu (Robert Dewar) writes:
> 
> |> Robert Eachus says
> |>
> |> "    I managed to do the "fun" experiment once.  Take three students
> |> and have them learn Quicksort, Heapsort, and Bubblesort on "small"
> |> decks.  At even 50 to 60 cards, the students doing Heapsort and
> |> Quicksort are racing each other*, and the Bubblesort victim is still
> |> hard at work well after they have finished."
> |>
> |> Try extending your experiment (I have also used this) a bit. Have a fourth
> |> person sort the deck who knows none of these algorithms. That fourth
> |> person will typically beat the Quicksort and Heapsort guys. Why? Because
> |> the natural way to sort cards is with some physical embodiment of adress
> |> calculation sorting, which can have an average time performance that is
> |> order (N) rather than order N log N.
> |>
> |> This can be an instructive addition to your experiment!

I had some time on my hands this weekend, so I tried this myself (On vacation
at the beach with a broken arm, what else was I to do?) With no practice,
quicksort on a deck of playing cards took me less than 5 minutes. After a bit
of practice, heapsort still took me more than 15 minutes (and a LOT of table
space).

If you think about this exercise, comparisons take WAY less time than swaps
(especially with one arm). So it would seem to me that you would need several
decks of playing cards (each) to make it much of a race.

-- 
T.E.D.          
                |  Work - mailto:dennison@escmail.orl.mmc.com  |
                |  Home - mailto:dennison@iag.net              |
                |  URL  - http://www.iag.net/~dennison         |




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-19  0:00                                     ` Bob Gilbert
@ 1996-08-19  0:00                                       ` Tim Behrendsen
  1996-08-19  0:00                                         ` Tim Hollebeek
                                                           ` (7 more replies)
  0 siblings, 8 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-19  0:00 UTC (permalink / raw)



Bob Gilbert <rgilbert@unconfigured.xvnews.domain> wrote in article
<4v9nei$kjf@zeus.orl.mmc.com>...
> In article <01bb8c89$9023e3e0$87ee6fce@timpent.airshields.com>, "Tim
Behrendsen" <tim@airshields.com> writes:
> > 
> > And they don't get wrapped up in implementation
> > details if they write it in C? 
> 
> As compared to assembly, I'd say they certainly won't get wrapped
> up in the details (I'd prefer Ada over C though, C can be a bit 
> cryptic).  I can write in a HOL and not worry about memory addressing
> modes (darn, forgot to load that data page pointer!), I don't have
> to worry about internal data representation (now can't I just shift
> right to divide by two, even if it is a float?), I don't have to
> worry about pipeline conflicts (I just read X into register R1, so
> why can't I test R1 in the very next instruction?), and whole host
> of other architectural details that obfuscate the lesson to be
> learned (sort algorithm, or whatever).  And what happens next term
> when the school replaces the system on which they teach assembly
> now for some new and different system?  Now your students have to
> go through the learning curve (of assembly) all over again.  At least
> HOL's are fairly portable and stable across platforms.

Well, all that is certainly true if you intentionally go out
and find the most complex, arcane architecture you can find.
If you start with a nice, simple one like a 6809 or 68000 or
something, then implementation details are practically nil.

> > My primary point is
> > when it's implemented in assembly, and you have to
> > manually move the bytes from one location to another
> > rather than have the compiler "carry them for you",
> > you get a better feel for not only what the
> > algorithm is doing, but what the computer itself is
> > doing.
> 
> I agree that learning some sort of assembly and implementing
> a number of programs (algorithms) is a valuable lesson, but it 
> is a lesson in computer architecture much more so than a 
> lesson in learning algorithms or problem solving.

Yes, exactly.  I understand that theoretically computers
can be defined in terms of any architectural abstraction,
but that's not the point.  The point is to give the students
a foundation to build on, and a very simple computer
architecture is much simpler than complex language syntax.
 
> > The most important thing a student can learn about
> > computers is the fundamentally simple nature.  The
> > mystery of execution *must* be broken down in order for
> > the student to begin to think like a programmer.
> > Once they can think like a programmer, all the rest
> > of the knowledge they learn becomes trivial.
> 
> I think your view of the scope of computer science is a tad narrow.
> I'd like you to show how learning assembly as a first language
> will help a student better understand the design of a relational 
> database, or solve any of the class of problems in mapping theory
> (find the shortest or fastest route from Miami to Seattle), or deal
> with networking and concurrency issues.  And it seems to me that 
> one of the largest areas in the study of computer science is language
> theory and compiler design, and it is hard to teach that if the 
> student isn't introduced to a HOL early (perhaps first) in their 
> curriculum.

Focusing on individual problems is a narrow view of learning.

All computer problems have certain things in common; they have an
input, they have an output, and they have a set of procedures
in between.  Teaching the student how to think in terms of
breaking problems into procedures is by far the most difficult
task in teaching someone to program.  Yes, you can pack algorithms
into their head, but that doesn't mean they are learning why
the algorithm is the way that it is.  I'm reminded of the one guy
on another thread that said "I learned Quicksort, but I didn't
really understand it."  This is what happens when you focus on
packing knowledge, without teaching understanding.

Let's take the RDB example.  If I take the normal CS graduate,
sit them down in front of a terminal, and say "whip up an SQL
parser that's comparable in performance to Oracle", would
they be able to do it based on the knowledge they have learned
in the normal CS program?  I don't think it's an exaggeration
to say "hell no."  Why?  Because they haven't been taught The
Algorithm that will compare to Oracle.  They would have no
clue where to start.  This is my experience with testing new
graduates.

Now what if that same student had gone to the Tim Behrendsen
Academy of Programming, and solved new problems every day in
low level programming, and left with a degree in thinking, but
was relatively light on HLLs and his Bag O' Algorithms was
heavy on the basics, but light on the specifics of "mapping
theory" or whatever?  I think my student would get way
farther than this standard CS student, because they would
know how to take complex problems such as SQL optimization
that hadn't necessarily been seen before, and could make
a reasonable go of it.  Not that either would be able to
reproduce a multi-hundred man year product like Oracle. :)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
       [not found]                                             ` <dewar.840491732@schonberg>
@ 1996-08-19  0:00                                               ` Robert Dewar
  1996-08-22  0:00                                                 ` Stephen Baynes
  1996-08-27  0:00                                                 ` Richard A. O'Keefe
  0 siblings, 2 replies; 688+ messages in thread
From: Robert Dewar @ 1996-08-19  0:00 UTC (permalink / raw)



(slightly corrected version, I noticed a minor typo in the first version,
which I canceled).

Richard O'Keefe says

        void insertion_sort(elt *a, int n) {
            int i, j;

            /* invariant: a[0..i-1] is a sorted permutation of old a[0..i-1] */
            for (i = 1; i < n; i++) {
                elt const t = a[i];
                for (j = i; j > 0 && t < a[j-1]; j--)
                    a[j] = a[j-1];
                a[j] = t;
            }
        }

   If a is already sorted, this does N-1 comparisons, which is optimal.
   I don't see any need for extreme care here.

Well the extreme here is your word not mine, I simply said "very careful".
In fact the critical thing to preserve the number of compares is to run
the inner loop backwards. That comes naturally in C, but not in Ada,
where the preferable code is to find the insertion point and then do a
slice assignment to shuffle the data (the slice assignment is not only
cleaner because it is higher level, but with a good optimizing compiler
it has the potential of generating much more efficient code on some
architectures than the individual moves).
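
Roughly the same idea can be expressed in C by doing the shuffle as one
block move with memmove; a minimal sketch of my own, purely for
illustration:

#include <string.h>

/* Insertion sort where each shuffle is a single block move, the C
   analogue of doing the move as one slice assignment. */
void insertion_sort_blockmove(int *a, size_t n)
{
    size_t i, j;

    for (i = 1; i < n; i++) {
        int t = a[i];

        /* find the insertion point for a[i] in the sorted prefix */
        for (j = i; j > 0 && a[j-1] > t; j--)
            ;
        if (j != i) {
            memmove(&a[j+1], &a[j], (i - j) * sizeof a[0]);
            a[j] = t;
        }
    }
}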

So, you ran the inner loop backwards and indeed got the number of
compares right, but you still have the overhead of both loops testing
the termination condition, whereas a bubble sort written in the normal
manner will have only one loop, and therefore executes only half the
number of loop termination tests; that's what I meant by extra overhead
even if you are careful to ensure no extra comparisons.

I find Hillam's initial bubble sort algorithm curious. I certainly don't
call this bubble sort. To me you repeat the outer loop only if you have
found that the inner loop made at least one exchange. Well anyone can
call any algorithm by any name; *I* use the word bubble sort to include
the notion of early termination of the outer loop if no exchanges have
been made in the inner loop. To avoid too much overhead in setting this
flag, it is useful to use the PC to encode the binary state of this flag.

The modified bubble sort from Hillam has two "optimizations" compared to
his original. First he uses the flag SORTED in the manner I suggest above,
but encoding this as a boolean flag is inefficient, since it keeps getting
set; it is better to encode it in the program counter, so that the cost of
setting it is almost free (one extra jump, executed once per inner loop).

The second is the IN_PLACE optimization, which is reasonable, and helps
sometimes, but is not necessary for the point of view of this discussion.

  "Now the funny thing here is that Robert Dewar wrote
  >you will almost certainly find you require more overhead
  [for insertion sort than bubble sort]
  >because of the two nested loops.
  But both versions of bubble sort have two nested loops as well!"

You missed the point. In the bubble sort, the outer loop is executed only
once for a sorted list. For your insertion sort, the outer loop runs a full
number of times. Your insertion sort has 2*N loop termination tests, while
a properly written bubble sort has N loop termination tests.

I do not know any way to write the insertion sort so that it has only N
loop termination tests, certainly your example has 2*N tests.

Here is a simple example of bubble sort the way I would code it

   procedure Sort (N : Positive; Exch : Exch_Procedure; Lt : Lt_Function) is
      Switched : Boolean;
   begin
      loop
         Switched := False;

         for J in 1 .. N - 1 loop
            if Lt (J + 1, J) then
               Exch (J, J + 1);
               Switched := True;
            end if;
         end loop;

         exit when not Switched;
      end loop;
   end Sort;

I do not seem to count five variables here! More like two, one of which
is a loop control constant. True, it uses an exit, but I find the attempt
to eliminate this leads to less readable, not more readable, code (I judge
code by its readability, not by whether it satisfies someone's idea of what
structured code should be :-)

Now this uses a flag to encode the switch. If we want to encode this switch
in the PC, which is more efficient, but a little less clear:

   procedure Sort (N : Positive; Exch : Exch_Procedure; Lt : Lt_Function) is
   begin
      for J in 1 .. N - 1 loop
         if Lt (J + 1, J) then
            Exch (J, J + 1);

            for K in J + 1 .. N - 1 loop
               if Lt (K + 1, K) then
                  Exch (K, K + 1);
               end if;
            end loop;

            Sort (N, Exch, Lt);
         end if;
      end loop;
   end Sort;

If you don't trust your compiler to eliminate the tail recursion, then
you probably should replace the recursive call with a goto!
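
For what it is worth, the same "flag in the program counter" structure
can be written directly in C with a goto; this is only my own sketch of
the technique, not code from elsewhere in the thread:

#include <stddef.h>

/* Bubble sort where the "an exchange happened" flag is encoded in the
   flow of control: a pass over already-sorted data costs N-1 compares
   and never touches a flag variable. */
void bubble_sort_goto(int *a, size_t n)
{
    size_t i;

restart:
    for (i = 1; i < n; i++) {
        if (a[i] < a[i-1]) {
            /* first out-of-order pair: finish this pass doing the
               exchanges, then start another full pass */
            for (; i < n; i++) {
                if (a[i] < a[i-1]) {
                    int t = a[i];
                    a[i] = a[i-1];
                    a[i-1] = t;
                }
            }
            goto restart;
        }
    }
}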





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-19  0:00                                       ` Tim Behrendsen
@ 1996-08-19  0:00                                         ` Tim Hollebeek
  1996-08-20  0:00                                           ` Tim Behrendsen
  1996-08-20  0:00                                         ` Bob Gilbert
                                                           ` (6 subsequent siblings)
  7 siblings, 1 reply; 688+ messages in thread
From: Tim Hollebeek @ 1996-08-19  0:00 UTC (permalink / raw)



Tim Behrendsen (tim@airshields.com) wrote:

: Well, all that is certainly true if you intentionally go out
: and find the most complex, arcane architecture you can find.
: If you start with a nice, simple one like a 6809 or 68000 or
: something, then implementation details are practically nil.

Of course, any modern CPU is going to have tons of things like
register windows, delay slots, and pipelines.  I'd hate to manage that
stuff directly.  The future appears to be RISC and/or superscalar, and
both are a pain to program by hand.  Of course, I could be wrong about
that.  But I do know one thing:  Whatever chip will be in my computer
10 years from now, it will run my C code without modification.  Can
you say that about your assembly?

BTW, where do you find the time for 10 posts a day?  I don't have time
to *read* everything you post :-)

---------------------------------------------------------------------------
Tim Hollebeek         | Disclaimer :=> Everything above is a true statement,
Electron Psychologist |                for sufficiently false values of true.
Princeton University  | email: tim@wfn-shop.princeton.edu
----------------------| http://wfn-shop.princeton.edu/~tim (NEW! IMPROVED!)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-19  0:00                                           ` Stephen Baynes
@ 1996-08-19  0:00                                             ` Robert Dewar
  1996-08-19  0:00                                             ` Robert Dewar
  1 sibling, 0 replies; 688+ messages in thread
From: Robert Dewar @ 1996-08-19  0:00 UTC (permalink / raw)



Stephen, in your comparison of the two sorting methods, you used code
that has an important difference: in the insertion sort you run the
loop backwards, which generates a more efficient termination test, but
in the bubble sort you run the loop forward, which results in a slower
termination test. This kind of accidental difference can cloud the results,
since we are looking for differences in loop overhead management here!

You *really* have to be careful in writing the code if you are going to
base conclusions on your measurements!






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-19  0:00                                           ` Stephen Baynes
  1996-08-19  0:00                                             ` Robert Dewar
@ 1996-08-19  0:00                                             ` Robert Dewar
  1 sibling, 0 replies; 688+ messages in thread
From: Robert Dewar @ 1996-08-19  0:00 UTC (permalink / raw)



Stephen says

"Having just done some testing, I don't think I can agree with your statement
about bubble sort being optimal on nearly sorted arrays. On the cases I tested
insertion sort seems to be much quicker, unless the two arrays are so nearly
sorted that I can't easily measure the time to sort them. I have not used any
special cases of the sort you suggest in the insertion sort. The bubble sort
function is also over 50% more code than the insertion sort so I can't call it
simpler."

But! the cases in your "unless" case are exactly the ones where the 
difference is appreciable, so you are not measuring the interesting cases.

What do you mean, you
"can't easily measure the time to sort them"

Of course you can: use a completely sorted vector and just do the sort
enough times so you can measure it.
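
A minimal C harness along those lines, purely for illustration ("isort"
here stands for whichever routine is being timed, and the copy overhead
can be measured the same way and subtracted):

#include <stdio.h>
#include <string.h>
#include <time.h>

#define N    100000
#define REPS 1000

extern void isort(int *a, size_t n);    /* routine under test */

int main(void)
{
    static int sorted[N], work[N];
    clock_t t0;
    int i, r;

    for (i = 0; i < N; i++)
        sorted[i] = i;                  /* completely sorted input */

    t0 = clock();
    for (r = 0; r < REPS; r++) {
        memcpy(work, sorted, sizeof work);
        isort(work, N);
    }
    printf("%.6f seconds per sort\n",
           (double)(clock() - t0) / CLOCKS_PER_SEC / REPS);
    return 0;
}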

Of course it may well be that you get an anomalous result due to
compiler optimizations applying in one case and not the other,
so it is best to test this using carefully optimized hand written
assembler, to minimize difficulties of this type, but that of course
is much harder work!





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-18  0:00                                         ` Sam B. Siegel
@ 1996-08-19  0:00                                           ` Dan Pop
  0 siblings, 0 replies; 688+ messages in thread
From: Dan Pop @ 1996-08-19  0:00 UTC (permalink / raw)



In <4v62h3$33c4@news-s01.ny.us.ibm.net> sam0001@ibm.net (Sam B. Siegel) writes:

>I have to agree with Tim on this one.  Good working knowledge of
>assembler and machine acretecture (sp?) makes it painfully obvious
>how memory and data are manipulated.

What happens to the code when it is ported to a machine with a different
architecture (this is the correct spelling) and assembly language?
Chances are that, if it was correctly written on the first machine, it
will run just fine on the second one.  So, the knowledge about the
assembly and architecture of the first machine was not that important.

What is really important when learning programming in a HLL is a good
understanding of the basics of digital computers, not of the specific
details of one or another.

Dan
--
Dan Pop
CERN, CN Division
Email: Dan.Pop@cern.ch 
Mail:  CERN - PPE, Bat. 31 R-004, CH-1211 Geneve 23, Switzerland




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-08-17  0:00               ` Lawrence Kirby
  1996-08-18  0:00                 ` Ken Pizzini
@ 1996-08-19  0:00                 ` Richard A. O'Keefe
  1996-08-23  0:00                   ` Joe Keane
  1 sibling, 1 reply; 688+ messages in thread
From: Richard A. O'Keefe @ 1996-08-19  0:00 UTC (permalink / raw)



I wrote:
>>the modern heapsort has the virtue that its worst case is O(nlgn) which
>>makes it a better bet for soft-real-time work.

Lawrence Kirby <fred@genesis.demon.co.uk> writes:

>Heapsort has always been O(n log n). You're right that a guaranteed
>reasonable worst case is sometimes useful.

Mea culpa.  What I meant, of course, was that it has now been known for
several years (from my known knowedge) [since 1968 at least, according
to Robert Dewar] how to do heapsort with 1.0 N.lg N + cruft element
comparisons.

>>For general use, a well engineered merge sort is as good as a well engineered
>>quicksort; sometimes better.

>It is close but very rare indeed for a heapsort to run faster than
>a quicksort.

Perhaps, but I wrote "a well engineered MERGE sort" at this point, and
that's what I meant.  It is quite common for a well engineered merge sort
to beat quick sort; I've beaten UNIX qsort by 20% on occasion. (It is
*much* harder to beat the Bentley & someone "engineered" quick sort in
Software Practice & Experience a year or two ago, but it isn't at all hard
to *match* it).

>As far as the C language is concerned merge sort and radix sort aren't good
>choices for qsort().

That was not proposed by anyone.
It is so ruddy obvious that a sorting interface which accepts an
arbitrary comparison function cannot use radix sort that surely Kirby
didn't need to mention it.

As for whether merge sort is a good choice for qsort():  if you have the
spare memory (which in most cases on workstations and mainframes you _have_)
and if you care about performance (which again one usually does) merge sort
makes an *excellent* implementation of qsort(), if you are as good a
hacker as whoever would have written the quicksort.  UNIX on a PDP-11 *had*
to make use of a less efficient sort than merge sort because in 64k you
didn't have the memory to spare for anything.
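
For concreteness, here is a rough sketch of a merge sort behind a
qsort()-style interface; the name merge_qsort and the failure
convention are invented for the illustration:

#include <stdlib.h>
#include <string.h>

/* Stable merge of two sorted runs of "size"-byte elements into dst. */
static void merge(char *dst, const char *a, size_t na,
                  const char *b, size_t nb, size_t size,
                  int (*cmp)(const void *, const void *))
{
    while (na > 0 && nb > 0) {
        if (cmp(a, b) <= 0) { memcpy(dst, a, size); a += size; na--; }
        else                { memcpy(dst, b, size); b += size; nb--; }
        dst += size;
    }
    if (na > 0) memcpy(dst, a, na * size);
    if (nb > 0) memcpy(dst, b, nb * size);
}

static void msort(char *base, char *tmp, size_t nmemb, size_t size,
                  int (*cmp)(const void *, const void *))
{
    size_t half = nmemb / 2;

    if (nmemb < 2)
        return;
    msort(base, tmp, half, size, cmp);
    msort(base + half * size, tmp + half * size, nmemb - half, size, cmp);
    memcpy(tmp, base, nmemb * size);
    merge(base, tmp, half, tmp + half * size, nmemb - half, size, cmp);
}

/* Same calling convention as qsort(), plus an O(N) scratch buffer;
   returns -1 if the buffer cannot be allocated. */
int merge_qsort(void *base, size_t nmemb, size_t size,
                int (*cmp)(const void *, const void *))
{
    char *tmp;

    if (nmemb < 2)
        return 0;
    tmp = malloc(nmemb * size);
    if (tmp == NULL)
        return -1;
    msort(base, tmp, nmemb, size, cmp);
    free(tmp);
    return 0;
}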

-- 
Australian citizen since 14 August 1996.  *Now* I can vote the xxxs out!
Richard A. O'Keefe; http://www.cs.rmit.edu.au/~ok; RMIT Comp.Sci.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-08-17  0:00                   ` Mike Rubenstein
@ 1996-08-19  0:00                     ` Richard A. O'Keefe
  1996-08-20  0:00                       ` Mike Rubenstein
  0 siblings, 1 reply; 688+ messages in thread
From: Richard A. O'Keefe @ 1996-08-19  0:00 UTC (permalink / raw)



miker3@ix.netcom.com (Mike Rubenstein) writes:
>You should not have cut class the day they covered the techniques for
>making O(N^2) performance very unlikely.  In fact, properly
>implemented, quicksort will give O( N log N) performance on both a
>reversed list and a shifted list.

Yes, well I suppose all of us missed _something_ in our CS educations,
such as the fact that the "proof" that O(N**2) is unlikely assumes that
every possible permutation of the input is equally likely, which is not
true in the real world.  Check the paper by Bentley and someone (McIlroy?)
in Software Practice and Experience on "Engineering a Sort":  the UNIX
qsort() routine *did* yield O(N**2) behaviour in actual use by people who
were trying to solve a real problem, not break the sorting routine.

The likelihood of quicksort doing badly on *your* data depends on the
actual probability distribution of *your* data.

This leads to the odd observation that if you don't know what the
distribution of your data is, and you would like to be confident that
bad things are genuinely unlikely, a good first step is to *scramble*
your array before sorting it, so that the assumption of this proof is
known to be true!  (Permuting an array randomly is only O(N).)
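
A sketch of that first step in C, just to show how little it costs
(rand() is used only to keep the illustration short):

#include <stdlib.h>

/* Fisher-Yates shuffle: O(N), and every permutation is equally likely
   if the random numbers are good. */
void scramble(int *a, size_t n)
{
    size_t i;

    for (i = n; i > 1; i--) {
        size_t j = (size_t)rand() % i;   /* 0 <= j < i */
        int t = a[i-1];
        a[i-1] = a[j];
        a[j] = t;
    }
}

Calling scramble(a, n) and then qsort(a, n, sizeof *a, cmp) makes the
"all permutations equally likely" assumption behind the average-case
analysis true for whatever data you actually have.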

-- 
Australian citizen since 14 August 1996.  *Now* I can vote the xxxs out!
Richard A. O'Keefe; http://www.cs.rmit.edu.au/~ok; RMIT Comp.Sci.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-14  0:00                                         ` Robert Dewar
                                                             ` (2 preceding siblings ...)
  1996-08-19  0:00                                           ` Stephen Baynes
@ 1996-08-19  0:00                                           ` Richard A. O'Keefe
       [not found]                                             ` <dewar.840491732@schonberg>
  3 siblings, 1 reply; 688+ messages in thread
From: Richard A. O'Keefe @ 1996-08-19  0:00 UTC (permalink / raw)



dewar@cs.nyu.edu (Robert Dewar) writes:

>The one advantage of bubble sort is that it is close to optimal on sorted
>or nearly sorted arrays. You have to be very careful how you write insertion
>sort not to require more compares in the fully sorted case, and you will
>almost certainly find you require more overhead, because of the two nested
>loops.

Hmm.  
Here's C code to sort an N-element array a[0..N-1].

	void insertion_sort(elt *a, int n) {
	    int i, j;

	    /* invariant: a[0..i-1] is a sorted permutation of old a[0..i-1] */
	    for (i = 1; i < n; i++) {
		elt const t = a[i];
		for (j = i; j > 0 && t < a[j-1]; j--)
		    a[j] = a[j-1];
		a[j] = t;
	    }
	}

If a is already sorted, this does N-1 comparisons, which is optimal.
I don't see any need for extreme care here.

Let's put that into Ada:

	generic
	    type Element is private;
	    with function "<"(Left, Right: Element) return Boolean;
	    type Index is (<>);
	    type Vector is array (Index) of Element;
	procedure Insertion_Sort(A: in out Vector);

	procedure Insertion_Sort(A: in out Vector) is
	begin
	    for I in Index'Succ(A'First) .. A'Last loop
		-- invariant: A(A'First .. I) is a sorted permutation
		-- of old A(A'First .. I).
		declare
		    T: constant Element := A(I);
		begin
	Insert:	    for J in reverse A'First .. I loop
			if T < A(Index'Pred(J)) then
			    A(J) := A(Index'Pred(J));
			else
			    A(J) := T;
			    exit Insert;
			end if;
		    end loop Insert;
		end;
	    end loop;
	end Insertion_Sort;


Now let's see bubble-sort, from "Introduction to Abstract Data Types
using Ada" by Hillam.  It's figure 11.1.2 on p380.

	generic
	  type ITEM_TYPE is private;
	  type VECTOR is array (integer range < >) of ITEM_TYPE;
	  with function "<"(LEFT, RIGHT : ITEM_TYPE) return boolean;  
	procedure BUBBLE_SORT (V : in out VECTOR);

	procedure BUBBLE_SORT (V : in out VECTOR) is
	  TEMP_ITEM : ITEM_TYPE;
	begin
	  for OUTER IN V'first .. V'last-1 loop
	  -- note same number of outer loop iterations as insertion sort
	    for INNER in V'first + 1 .. V'last loop
	    -- note no early exit
	      if V(INNER) < V(INNER - 1) then
		TEMP_ITEM := V(INNER);
		V(INNER) := V(INNER - 1);
		V(INNER - 1) := TEMP_ITEM;
	      end if;
	    end loop;
	  end loop;
	end BUBBLE_SORT;

This clearly cannot be anywhere near optimal for sorted or
nearly sorted arrays, because it always does 1/2N**2 + O(N) element
comparisons.  The version of bubble sort that does well in those
cases is called "modified bubble sort" in Hillam, and is in his
figure 11.1.4

	generic
	  type ITEM_TYPE is private;
	  type VECTOR is array (integer range < >) of ITEM_TYPE;
	  with function "<"(LEFT, RIGHT : ITEM_TYPE) return boolean;  
	procedure BUBBLE_SORT (V : in out VECTOR);

	procedure BUBBLE_SORT (V : in out VECTOR) is
	  SORTED : boolean false;			-- sic!
	  TEMP_ITEM : ITEM_TYPE;
	  IN_PLACE : integer := 0; -- keeps track of number of items known
                                   -- to be in their final place at the
                                   -- beginning of each phase
	  INDEX : integer := V'first;
	begin
	  while not SORTED and then INDEX < V'last loop
	    SORTED := true;
	    INDEX := INDEX + 1;
	    for INNER in V'first + 1 .. V'last - IN_PLACE loop
	      if (V(INNER) < V(INNER - 1) then		-- sic!
		TEMP_ITEM := V(INNER);
		V(INNER) := V(INNER - 1);
		V(INNER - 1) := TEMP_ITEM;
		SORTED := false;
	      end if;
	    end loop;
	    IN_PLACE := IN_PLACE + 1;
	  end loop;
	end BUBBLE_SORT;

This is 20 non-comment lines for the body of "modified bubble sort",
compared with 17 for the body of my Ada insertion sort.  But that
could have been shortened if I hadn't declared T as locally as possible so
that I could declare it as a constant.  Let's eliminate that block,
and while we're at it, let's eliminate the loop exit in the name of
structured programming purity.

	procedure Insertion_Sort(A: in out Vector) is
	    T: Element;
	    J: Index;
	begin
	    for I in Index'Succ(A'First) .. A'Last loop
		T := A(I);
		J := I;
		while J > A'First and then T < A(Index'Pred(J)) loop
		    A(J) := A(Index'Pred(J));
		    J := Index'Pred(J);
		end loop;
		A(J) := T;
	    end loop;
	end Insertion_Sort;

Now the funny thing here is that Robert Dewar wrote
>you will almost certainly find you require more overhead
[for insertion sort than bubble sort]
>because of the two nested loops.
But both versions of bubble sort have two nested loops as well!

>A bubble sort is certainly a much simpler solution to the problem
>of optimal sorting of a sorted list,

I do not call       20 lines with two loops and 5 variables
"much simpler" than 14 lines with two loops and 3 variables.

This leaves no apparent use for bubble sort at all.

>For quick sorts, I prefer heapsort to quicksort, because of its bounded
>worst case behavior. Note that there is a little-known modification to
>heap sort that reduces the number of compares to about NlogN compared
>with the normal 2NlogN (the 2 is where Eachus got the O(2N), though of
>course constants don't belong in big-O formulas). As far as I know this
>is not really properly reported in the literature -- I treat it in detail
>in my 1968 thesis, and it is an exercise in Knuth volume 3 (although his
>original answer was wrong, I think I kept that $1 Wells Fargo colorful
>check somewhere as a souvenir :-)

Papers were still appearing in The Computer Journal well after 1968
with improvements on heapsort; I feel so *stupid* for not including
the proper citation in the source code I have.  I don't suppose your
thesis is on the Web anywhere (mine certainly isn't).
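
One published way of getting close to N lg N comparisons is a sift-down
that walks straight to a leaf choosing the larger child (one comparison
per level) and then climbs back up to place the displaced element; here
is a rough C sketch of that idea, which may or may not match the variant
in the thesis:

#include <stddef.h>

/* Comparison-saving sift-down for a max-heap a[i..n-1] rooted at i. */
static void sift(int *a, size_t i, size_t n)
{
    int t = a[i];
    size_t j = i;

    /* walk down to a leaf, promoting the larger child at each level */
    for (;;) {
        size_t c = 2*j + 1;
        if (c >= n) break;
        if (c + 1 < n && a[c+1] > a[c]) c++;
        a[j] = a[c];
        j = c;
    }
    /* climb back up until t fits; usually only a step or two */
    while (j > i && a[(j-1)/2] < t) {
        a[j] = a[(j-1)/2];
        j = (j-1)/2;
    }
    a[j] = t;
}

void heap_sort(int *a, size_t n)
{
    size_t i;

    if (n < 2) return;
    for (i = n/2; i-- > 0; )            /* build the heap */
        sift(a, i, n);
    for (i = n; --i > 0; ) {            /* repeatedly extract the maximum */
        int t = a[0]; a[0] = a[i]; a[i] = t;
        sift(a, 0, i);
    }
}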

-- 
Australian citizen since 14 August 1996.  *Now* I can vote the xxxs out!
Richard A. O'Keefe; http://www.cs.rmit.edu.au/~ok; RMIT Comp.Sci.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-11  0:00                                           ` Mark Wooding
@ 1996-08-19  0:00                                             ` James Youngman
  0 siblings, 0 replies; 688+ messages in thread
From: James Youngman @ 1996-08-19  0:00 UTC (permalink / raw)



In article <4ul191$38j@excessus.demon.co.uk>, mdw@excessus.demon.co.uk says...

>Why is it, by the way, that no-one here (apart from me) has mentioned
>the issue of code size?  Does no-one care?  

Perhaps because it's orthogonal....

-- 
James Youngman                               VG Gas Analysis Systems
The trouble with the rat-race is, even if you win, you're still a rat.





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-17  0:00                                   ` Tim Behrendsen
@ 1996-08-19  0:00                                     ` Bob Gilbert
  1996-08-19  0:00                                       ` Tim Behrendsen
  1996-08-22  0:00                                     ` Bengt Richter
  1 sibling, 1 reply; 688+ messages in thread
From: Bob Gilbert @ 1996-08-19  0:00 UTC (permalink / raw)



In article <01bb8c89$9023e3e0$87ee6fce@timpent.airshields.com>, "Tim Behrendsen" <tim@airshields.com> writes:
> 
> And they don't get wrapped up in implementation
> details if they write it in C? 

As compared to assembly, I'd say they certainly won't get wrapped
up in the details (I'd prefer Ada over C though, C can be a bit 
cryptic).  I can write in a HOL and not worry about memory addressing
modes (darn, forgot to load that data page pointer!), I don't have
to worry about internal data representation (now can't I just shift
right to divide by two, even if it is a float?), I don't have to
worry about pipeline conflicts (I just read X into register R1, so
why can't I test R1 in the very next instruction?), and a whole host
of other architectural details that obfuscate the lesson to be
learned (sort algorithm, or whatever).  And what happens next term
when the school replaces the system on which they teach assembly
now for some new and different system?  Now your students have to
go through the learning curve (of assembly) all over again.  At least
HOL's are fairly portable and stable across platforms.

> My primary point is
> when it's implemented in assembly, and you have to
> manually move the bytes from one location to another
> rather than have the compiler "carry them for you",
> you get a better feel for not only what the
> algorithm is doing, but what the computer itself is
> doing.

I agree that learning some sort of assembly and implementing
a number of programs (algorithms) is a valuable lesson, but it 
is a lesson in computer architecture much more so than a 
lesson in learning algorithms or problem solving.

> The most important thing a student can learn about
> computers is the fundamentally simple nature.  The
> mystery of execution *must* be broken down in order for
> the student to begin to think like a programmer.
> Once they can think like a programmer, all the rest
> of the knowledge they learn becomes trivial.

I think your view of the scope of computer science is a tad narrow.
I'd like you to show how learning assembly as a first language
will help a student better understand the design of a relational 
database, or solve any of the class of problems in mapping theory
(find the shortest or fastest route from Miami to Seattle), or deal
with networking and concurrency issues.  And it seems to me that 
one of the largest areas in the study of computer science is language
theory and compiler design, and it is hard to teach that if the 
student isn't introduced to a HOL early (perhaps first) in their 
curriculum.

-Bob






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-14  0:00                                         ` Robert Dewar
  1996-08-16  0:00                                           ` Dik T. Winter
  1996-08-18  0:00                                           ` Glenn Rhoads
@ 1996-08-19  0:00                                           ` Stephen Baynes
  1996-08-19  0:00                                             ` Robert Dewar
  1996-08-19  0:00                                             ` Robert Dewar
  1996-08-19  0:00                                           ` Richard A. O'Keefe
  3 siblings, 2 replies; 688+ messages in thread
From: Stephen Baynes @ 1996-08-19  0:00 UTC (permalink / raw)


[-- Warning: decoded text below may be mangled, UTF-8 assumed --]
[-- Attachment #1: Type: text/plain, Size: 4809 bytes --]


Robert Dewar (dewar@cs.nyu.edu) wrote:
: Stephen says

:  Why do people try and teach students Bubblesort? It may be an intellectually
:  interesting exercise but it is of no use. An insertion sort is simpler and at
:  least as fast as a bubble sort. For many practical programing problems an
:  insertion sort is a sensible solution, it is very compact and for small
:  datasets as fast as anything (Many quicksort implementations switch to
:  insertion sort for less than about 7 items). The number of times I have
:  had to redirect graduates who have tried to write a small sort using
:  Bubblesort (because it was the simplest sort they were taught) or Quicksort
:  (because they have been taught it is faster in all cases).

: The one advantage of bubble sort is that it is close to optimal on sorted
: or nearly sorted arrays. You have to be very careful how you write insertion
: sort not to require more compares in the fully sorted case, and you will
: almost certainly find you require more overhead, because of the two nested
: loops. Yes, you could add a special test in the outer loop for already
: being in the right place, but then you complicate the inner loop if you
: want to avoid repeating this comparison. A bubble sort is certainly a
: much simpler solution to the problem of optimal sorting of a sorted
: list, and simplicity of solutions is interesting if performance is NOT
: an issue after all.

Having just done some testing, I don't think I can agree with your statement
about bubble sort being optimal on nearly sorted arrays. On the cases I tested
insertion sort seems to be much quicker, unless the two arrays are so nearly
sorted that I can't easily measure the time to sort them. I have not used any
special cases of the sort you suggest in the insertion sort. The bubble sort
function is also over 50% more code than the insertion sort so I can't call it
simpler.

Case 1: Take an array of 2000000 ints already sorted (the largest
    I could malloc). Reverse the order of elements [0]..[9], 
        reverse the order [10]..[19],...
    Insertion sort 1 seconds, bubble sort 2 seconds

Case 2: Take an array of 100000 ints already sorted.
    Swap the entry 1/3 of the way through with the one 2/3 of the way through.
    Insertion sort <1 seconds, bubble sort about 400 seconds

HPUX A.09.05 using HP's ANSI C compiler with -O optimization.

If you want to try it, here is the code (setup for case 2, change the two 
'#if 0's to '#if 1' to get case 1):

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

void isort( int *a, size_t l )
{
    size_t i;
    for( i = 1; i < l ; i ++  ){
        /* Insert a[i] into a[0]..a[i-1] */
        int t = a[i];
        size_t j = i;
        while( j > 0 && a[j-1] > t ){
            a[j] = a[j-1];
            j--;
        }       
        a[j] = t;
    }
}

void bsort( int *a, size_t l )
{
    size_t i;
    int change = 1;
    for( i = l; change && (i > 0) ; i--  ){
        size_t j;
        change = 0; /* Clear change flag */

        /* one pass of bubbles */
        for( j = 1; j < i ; j ++ ){
            if( a[j-1] > a[j] ){
                /*Swap */
                int t;
                t = a[j];
                a[j] = a[j-1];
                a[j-1] = t;
                change = 1;
             }
        }
    }
}


void check( int const *a, size_t l )
{
    size_t i;
    for( i = 0; i < l ; i ++  ){
        if( a[i] != i ) printf( "Error at entry %lu\n", (unsigned long)i );
    }
}

void init( int *a, size_t l )
{
    size_t i;
#if 0
    size_t i1, i2;
    int t;
    for( i = 0; i < l ; i ++  ){
        a[i] = i;
    }
    /* Swap elements at 1/3rds */
    i1 = l/3;
    i2 = (2*l)/3;
    t = a[i1];
    a[i1] = a[i2];
    a[i2] = t;
#else
    /* Assumes l is multiple of 10 */
    for( i = 0; i < l ; i += 10  ){
        size_t j;
        for( j = 0; j < 10; j++ ){
            a[i+j] = i+9-j;
        }
    }
#endif
}
#if 0
#define SZ (100000)
#else
#define SZ (2000000)
#endif

int main( void )
{
    /* Note assume unix - so time_t is integral seconds */
    int *a = malloc( sizeof( int ) * SZ );
    time_t t;

    init( a, SZ );
    t = time(NULL);    
    isort( a, SZ );
    printf( "Time for Insertion sort of %ld ints is %ld\n",
        (long) SZ, (long)time(NULL) - (long)t );
    check( a, SZ );

    init( a, SZ );
    t = time(NULL);    
    bsort( a, SZ );
    printf( "Time for Bubble sort of %ld ints is %ld\n",
        (long) SZ, (long)time(NULL) - (long)t );
    check( a, SZ );

    return 0;
}

--
Stephen Baynes                              baynes@ukpsshp1.serigate.philips.nl
Philips Semiconductors Ltd
Southampton                                 My views are my own.
United Kingdom
 Are you using ISO8859-1? Do you see © as copyright, ÷ as division and ½ as 1/2?




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-17  0:00                     ` Tim Behrendsen
  1996-08-18  0:00                       ` Robert Dewar
@ 1996-08-19  0:00                       ` John Hobson
  1996-08-19  0:00                         ` Tim Behrendsen
  1 sibling, 1 reply; 688+ messages in thread
From: John Hobson @ 1996-08-19  0:00 UTC (permalink / raw)



Tim Behrendsen wrote:
<snip>

<snip> 

> IMO languages such as C are far, far more difficult than
> assembly language.  We've had a few posts during this discussion
> from people (including myself) who learned assembly early on as
> their primary language.  They all felt that assembly was what
> crystallized the concept of computers in their mind.  This
> shouldn't be that hard to believe; there aren't that many
> fundamental instructions in assembly, and nothing about the
> computer's nature is hidden (or abstracted) away.
> 
> I'm not sure how this thought that "assembly is hard" came
> about.  The only thing I can think of is that so few people learn
> assembly nowadays that they think it is all black magic, and
> anything with mystery is assumed to be hard.
<more snippage>

I can certainly tell you why I think that assembler is hard -- it's
because assembler is so long winded.  If you want to write the
equivalent of a = b / c; in assembler, it takes about 5 or 6 lines.
In *The Mythical Man-Month*, Frederick Brooks says that the best gauge
of programmer productivity is lines of code written per <time period>.
Since assembler takes so many more lines to do something than (say) C
does, it at least appears harder because it takes longer.

As Tim also says:
> Because I've found that people tend to stick with the first
> [dare I use the word] paradigm that they are introduced to.
> Everything else they learn will be compared against the first
> thing they learn, and if the first thing they learn is the
> step-by-step nature of the computer through a simple assembly
> language (they don't have to start with the sophisticated RISC
> chip on the planet), then they'll always have that safe, simple
> place to go back to when trying to understand more sophisticated
> concepts.

I have had exposure to assembler (IBM BAL on a 370 -- BAL standing for
Basic Assembly Language), and I feel towards it much as Martin Luther
felt towards monasticism.  After he broke with Rome, Luther damned
monasteries and all that they stood for, but he also said that he was
glad that he had had the experience.  IMHO, I have found it very useful
at times to have some sort of idea of what the computer is actually
doing.  I will also say that there are times when the best way to debug
programs is to read the assembly listing.  However, I am very glad that
I don't actually have to write assembler.  The first language I learnt
was FORTRAN II, which was really just one step up from assembler
(e.g., the FORTRAN arithmetic IF can easily be seen as a macro) and
this gave me a good insight into what the machine was doing as it
executed my code.  It also gave me an appreciation of the ease that
things like true IF-THEN-ELSE structures gave the programmer.

My favourite language?  Actually, it's C.
-- 
John Hobson             |Whenever someone says to me,
Unix Support Group      |"Have a nice day", I reply,
ComEd, Chicago, IL, USA |"Sorry, I've made other plans."
jhobson@ceco.ceco.com   |	-- Sir Peter Ustinov




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-19  0:00                                         ` Tim Hollebeek
@ 1996-08-20  0:00                                           ` Tim Behrendsen
  0 siblings, 0 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-20  0:00 UTC (permalink / raw)



Tim Hollebeek <tim@franck> wrote in article
<4vard9$9rj@cnn.Princeton.EDU>...
> Tim Behrendsen (tim@airshields.com) wrote:
> 
> : Well, all that is certainly true if you intentionally go out
> : and find the most complex, arcane architecture you can find.
> : If you start with a nice, simple one like a 6809 or 68000 or
> : something, then implementation details are practically nil.
> 
> Of course, any modern CPU is going to have tons of things like
> register windows, delay slots, and pipelines.  I'd hate to manage that
> stuff directly.  The future appears to be RISC and/or superscalar, and
> both are a pain to program by hand.  Of course, I could be wrong about
> that.  But I do know one thing:  Whatever chip will be in my computer
> 10 years from now, it will run my C code without modification.  Can
> you say that about your assembly?

Nope, which is why I don't do my application programming in
assembly.  Doesn't change the fact that architectures are
still implemented in assembly, and the fact that assembly
is a much better tool to learn from because of its simple
nature (even though performance optimizations can be complex).

> BTW, where do you find the time for 10 posts a day?  I don't have time
> to *read* everything you post :-)

Well, it helps to be able to read fast and type fast.  It's
one of the benefits of being naturally impatient. :-)

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-17  0:00                     ` Robert Dewar
@ 1996-08-20  0:00                       ` Lawrence Kirby
  0 siblings, 0 replies; 688+ messages in thread
From: Lawrence Kirby @ 1996-08-20  0:00 UTC (permalink / raw)



In article <dewar.840313069@schonberg> dewar@cs.nyu.edu "Robert Dewar" writes:

>What on earth is a "list quicksort".

It is a quicksort algorithm for lists. Partitioning implements very easily
on lists.
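
A rough sketch in C, just to show how naturally the partition falls out
on a linked list (illustration only, not anybody's production code):

#include <stddef.h>

struct node { int key; struct node *next; };

/* append list b after the last node of list a */
static struct node *concat(struct node *a, struct node *b)
{
    struct node *p;

    if (a == NULL) return b;
    for (p = a; p->next != NULL; p = p->next)
        ;
    p->next = b;
    return a;
}

/* quicksort on a singly linked list: partition around the head's key */
struct node *list_qsort(struct node *head)
{
    struct node *pivot, *less = NULL, *more = NULL, *p, *next;

    if (head == NULL || head->next == NULL)
        return head;
    pivot = head;
    for (p = head->next; p != NULL; p = next) {
        next = p->next;
        if (p->key < pivot->key) { p->next = less; less = p; }
        else                     { p->next = more; more = p; }
    }
    pivot->next = list_qsort(more);
    return concat(list_qsort(less), pivot);
}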

> As soon as you are not trying to do
>things in place, why on earth would you use quicksort, when mergesort is
>obviously as efficient and has a worst case performance of NlogN.

I agree and I was in no way advocating the use of list quicksorts; I was
simply pointing out that the best way to quicksort a pack of cards equates
most closely to a list quicksort. In place algorithms don't tend to
work well on cards (larger constant factor in the individual operations).

-- 
-----------------------------------------
Lawrence Kirby | fred@genesis.demon.co.uk
Wilts, England | 70734.126@compuserve.com
-----------------------------------------




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-18  0:00                         ` Mark Wooding
@ 1996-08-20  0:00                           ` Peter Seebach
  1996-08-21  0:00                           ` Szu-Wen Huang
  1 sibling, 0 replies; 688+ messages in thread
From: Peter Seebach @ 1996-08-20  0:00 UTC (permalink / raw)



In article <slrn51ekt2.5rj.mdw@excessus.demon.co.uk>,
Mark Wooding <mdw@excessus.demon.co.uk> wrote:
>	char buf[...];
>	char *p;

>	...

>	while (buf[0]==' ')
>	{
>	  for (p=buf;p[0]=p[1];p++)
>	    ;
>	}

>	while (buf[strlen(buf)-1]==' ')
>	  buf[strlen(buf)-1]=0;

>I can't believe that anyone with an understanding of what goes on `under
>the covers' would possibly write anything like this without feeling ill.
>An inkling of what this would be translated into by any implementation
>would surely avoid horrors like this.

I am told at least one compiler is clever enough to handle the second part
usefully.

I must disagree with your assertion.  Not that long ago, someone posted a bit
of disassembled code from a major vendor's library, in which strchr()
was implemented roughly as
	memchr(s, c, strlen(s));
- the key being that it was implemented in assembly.  From the usage, my
understanding is that it had been written in assembly to avoid the overhead of
the function call to strlen().
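
For contrast, a single-pass version is only a few lines.  This is
just a sketch (the name my_strchr is made up, and it is certainly
not the vendor code being described); note that it also finds the
terminating '\0', which the memchr/strlen version cannot:

    #include <stddef.h>

    char *my_strchr(const char *s, int c)
    {
        for (;; s++) {
            if (*s == (char)c)
                return (char *)s;      /* found it (possibly the '\0') */
            if (*s == '\0')
                return NULL;           /* hit the end without a match */
        }
    }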

There are idiots programming at every level of the machine.

(Curiously, if you look closely at the above code, you'll notice that
it will write past the beginning of at least one obvious string.)

-s
>Anyone who asks `what's wrong with that' will be shot.
>-- 
>[mdw]
>
>`When our backs are against the wall, we shall turn and fight.'
>		-- John Major
>


-- 
Peter Seebach - seebs@solon.com - Copyright 1996 - http://www.solon.com/~seebs
Unix/C Wizard - send mail for help, or send money for consulting!
The *other* C FAQ, the hacker FAQ, et al.  See web page above.
Unsolicited email (junk mail and ads) is unwelcome, and will be billed for.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-20  0:00                                                 ` Richard A. O'Keefe
@ 1996-08-20  0:00                                                   ` Alan Bowler
  1996-08-21  0:00                                                   ` Tim Behrendsen
  1 sibling, 0 replies; 688+ messages in thread
From: Alan Bowler @ 1996-08-20  0:00 UTC (permalink / raw)



In article <4vbbf6$g0a@goanna.cs.rmit.edu.au> ok@goanna.cs.rmit.edu.au (Richard A. O'Keefe) writes:
>"Tim Behrendsen" <tim@airshields.com> writes:
>
>>Yes, but you can use the "get a better compiler" argument to
>>justify anything.  Real programs run on real computers using
>>real compilers.  The "Super-Duper Ivory Tower 9000 Compiler"
>>just doesn't exist.
>
>This is a bogus argument, because the better compilers *I* was talking
>about ACTUALLY EXIST.  As a particular example, GCC does self-tail-call
>optimisation and SPARCompiler C 4.0 does general tail-call optimisation.

The compiler implementation may have chosen not to implement this
optimization.  There are still systems where there are limits on
the resources that the compiler itself has available, and so the
total amount of code in the compiler has a practical limit.  Also,
tail-call optimizations may conflict with other useful extensions
such as the old nargs() function.

While you should not spend excessive time prematurely optimizing code,
you also should not bloat your code with junk that depends on a fancy
optimizer simply to get decent performance.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-20  0:00                                                 ` Szu-Wen Huang
@ 1996-08-20  0:00                                                   ` Dann Corbit
  1996-08-21  0:00                                                     ` Tim Behrendsen
  1996-08-21  0:00                                                   ` Dik T. Winter
  1 sibling, 1 reply; 688+ messages in thread
From: Dann Corbit @ 1996-08-20  0:00 UTC (permalink / raw)





Szu-Wen Huang <huang@mnsinc.com> wrote in article
<4vd05f$rt5@news1.mnsinc.com>...
{snip}
> I disagree.  I expect a linear algorithm to display linear behavior
> unless otherwise stated.  What you cite is a case where it needs to
> be explicitly stated, because calling that algorithm "O(n)" is next
> to useless in predicting its behavior without knowing this peculiar
> behavior at n=1,000.  IOW, this prediction holds if I'm predicting
> for n in (0, 1,000] and (1,000, infinity), and will hold for other
> algorithms that do not have this peculiarity.

Nonetheless, it only has to be true in the limiting case.
Try, for instance, some O(n*log(n)) algorithm like heapsort
on data that is random, ordered, reverse ordered, sawtooth, etc.,
and you will find that it takes different amounts of time even for
the same number of input values.

Or take quicksort, which is worst case O(n*n) and you will
see that it usually plots as approximately O(n*log(n)), but
can even appear to be linear for some special data sets and 
algorithm implementations.

Definitions:
n = number of elements
m = slope in seconds per element
t = time in seconds
b = some constant number of seconds

What is really demanded for an algorithm O(n) is that

	t <= m*n + b

for some constants m and b.
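
For example, an algorithm whose measured time is roughly
t = 0.001*n + 0.2 seconds meets this bound with m = 0.001 and
b = 0.2, so it is O(n); Robert Dewar's 1000*n example meets it
too, with m = 1000 and b = 0.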




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-08-12  0:00         ` Patrick Horgan
                             ` (3 preceding siblings ...)
  1996-08-16  0:00           ` Should I learn C or Pascal? Darin Johnson
@ 1996-08-20  0:00           ` Darin Johnson
  1996-08-21  0:00           ` What's the best language to learn? [was Re: Should I learn C or Pascal?] Darin Johnson
                             ` (3 subsequent siblings)
  8 siblings, 0 replies; 688+ messages in thread
From: Darin Johnson @ 1996-08-20  0:00 UTC (permalink / raw)



> Why not just pick the pivot randomly?  This was suggested by Hoare in
> 1962.

I seem to recall a paper or method that basically said that the better
you picked your pivot, the better the quicksort, and then gave some
alternatives.  My recollection was that you could pick a quick set of
points (first, last, middle; or all random) and then choose the median
of that set.
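
A minimal sketch of the median-of-three idea in C (the function
name and the use of plain int keys are invented here, not taken
from any particular paper):

    /* Return the index of the median of a[lo], a[mid], a[hi]. */
    static size_t median_of_three(const int *a, size_t lo, size_t hi)
    {
        size_t mid = lo + (hi - lo) / 2;

        if (a[lo] > a[mid]) { size_t t = lo; lo = mid; mid = t; }
        if (a[mid] > a[hi]) mid = hi;   /* mid now indexes min(max, hi) */
        if (a[lo] > a[mid]) mid = lo;   /* final median index */
        return mid;
    }

The chosen element is then swapped into the pivot position before
partitioning.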

And of course, you can always sort in O(n) as long as you have a
finite set of things to sort :-)  (actually, in O(1) theoretically
but real world implications say that a finite 'n' is still an 'n')
-- 
Darin Johnson
djohnson@ucsd.edu	O-
    "Floyd here now!"




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-17  0:00                                               ` Robert Dewar
@ 1996-08-20  0:00                                                 ` Szu-Wen Huang
  1996-08-20  0:00                                                   ` Dann Corbit
  1996-08-21  0:00                                                   ` Dik T. Winter
  1996-08-21  0:00                                                 ` Tanmoy Bhattacharya
  1 sibling, 2 replies; 688+ messages in thread
From: Szu-Wen Huang @ 1996-08-20  0:00 UTC (permalink / raw)



Robert Dewar (dewar@cs.nyu.edu) wrote:
: "Wrong order.  An O(n) program where n=1,000 that takes 1 second to
: complete should be able to finish n=2,000 in 2 seconds, not the other
: way around.  In other words, you can't derive time complexity by
: timing an algorithm, *especially* with only two samples."

: This statement is wrong (big-O seems to be generating a lot of confusion).
: big-O notation is about asymptotic behavior. If you know an algorithm
: is O(N) and you know that the time for n=1000 is one second, you know
: ABSOLUTELY NOTHING about the time for n=2000, NOTHING AT ALL!

: For example, suppose the behavior of an algorithm is

:   for n up to 1000, time is 1 second

"time is *n* seconds", I presume.  Otherwise this would be O(1).

:   for n greater than 1000, time is 1000*n seconds

: that's clearly O(N), but the time for 2000 items will be 2_000_000
: seconds.
[snip]

I disagree.  I expect a linear algorithm to display linear behavior
unless otherwise stated.  What you cite is a case where it needs to
be explicitly stated, because calling that algorithm "O(n)" is next
to useless in predicting its behavior without knowing this peculiar
behavior at n=1,000.  IOW, this prediction holds if I'm predicting
for n in (0, 1,000] and (1,000, infinity), and will hold for other
algorithms that do not have this peculiarity.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-19  0:00                                       ` Tim Behrendsen
  1996-08-19  0:00                                         ` Tim Hollebeek
@ 1996-08-20  0:00                                         ` Bob Gilbert
  1996-08-21  0:00                                           ` Tim Behrendsen
  1996-09-06  0:00                                         ` Robert I. Eachus
                                                           ` (5 subsequent siblings)
  7 siblings, 1 reply; 688+ messages in thread
From: Bob Gilbert @ 1996-08-20  0:00 UTC (permalink / raw)



In article <01bb8df1$2e19d420$87ee6fce@timpent.airshields.com>, "Tim Behrendsen" <tim@airshields.com> writes:
> Bob Gilbert <rgilbert@unconfigured.xvnews.domain> wrote in article
> <4v9nei$kjf@zeus.orl.mmc.com>...
> > In article <01bb8c89$9023e3e0$87ee6fce@timpent.airshields.com>, "Tim Behrendsen" <tim@airshields.com> writes:
> > > 
> > > And they don't get wrapped up in implementation
> > > details if they write it in C? 
> > 
> > As compared to assembly, I'd say they certainly won't get wrapped
> > up in the details (I'd prefer Ada over C though, C can be a bit 
> > cryptic).  I can write in a HOL and not worry about memory addressing
> > modes (darn, forgot to load that data page pointer!), I don't have
> > to worry about internal data representation (now can't I just shift
> > right to divide by two, even if it is a float?), I don't have to
> > worry about pipeline conflicts (I just read X into register R1, so
> > why can't I test R1 in the very next instruction?), and whole host
> > of other architectural details that obfuscate the lesson to be
> > learned (sort algorithm, or whatever).  And what happens next term
> > when the school replaces the system on which they teach assembly
> > now for some new and different system?  Now your students have to
> > go through the learning curve (of assembly) all over again.  At least
> > HOL's are fairly portable and stable across platforms.

> Well, all that is certainly true if you intentionally go out
> and find the most complex, arcane architecture you can find.
> If you start with a nice, simple one like a 6809 or 68000 or
> something, then implementation details are practically nil.

There aren't many processors being made today that don't employ
many (and many more) of the architectural features I mentioned
above.  

> > > My primary point is
> > > when it's implemented in assembly, and you have to
> > > manually move the bytes from one location to another
> > > rather than have the compiler "carry them for you",
> > > you get a better feel for not only what the
> > > algorithm is doing, but what the computer itself is
> > > doing.
> > I agree that learning some sort of assembly and implementing
> > a number of programs (algorithms) is a valuable lesson, but it 
> > is a lesson in computer architecture much more so than a 
> > lesson in learning algorithms or problem solving.

> Yes, exactly.  I understand that theoretically computers
> can be defined in terms of any architectural abstraction,
> but that's not the point.  The point is to give the students
> a foundation to build on, and a very simple computer
> architecture is much simpler than complex language syntax.

Apples and oranges.  Learning problem solving and understanding
basic computer architecture are two different things.  Although I
do agree that understanding the underlying architecture helps in
developing solutions (it gives a target to shoot for), it can
also hinder the process by prejudicing the student's thinking
into terms of a limited set of known targets.

> > > The most important thing a student can learn about
> > > computers is the fundamentally simple nature.  The
> > > mystery of execution *must* be broken down in order for
> > > the student to being to think like a programmer.
> > > Once they can think like a programmer, all the rest
> > > of the knowledge they learn becomes trivial.

> > I think your viewed scope of computer science is tad narrow.
> > I'd like you to show how learning assembly as a first language
> > will help a student better understand the design of a relational 
> > database, or solve any of the class of problems in mapping theory
> > (find the shortest or fastest route from Miami to Seattle), or deal
> > with networking and concurrency issues.  And it seems to me that 
> > one of the largest areas in the study of computer science is language
> > theory and compiler design, and it is hard to teach that if the 
> > student isn't introduced to a HOL early (perhaps first) in their 
> > curriculum.

> Focusing on individual problems is a narrow view of learning.

I was only offering some examples of other rather large and broad subject
areas in the field of computer science for which an understanding of the 
basic architecture (via assembly language) is of limited use, and where 
introduction of a HOL early on or first would be advisable.

> All computer problems have certain things in common; they have an
> input, they have an output, and they have a set of procedures
> in between.  Teaching the student how to think in terms of
> breaking problems into procedures is by far the most difficult
> task in teaching someone to program. 

A very procedural point of view.  Many of the proponents of object
oriented design might have a problem with this view, and demonstrates
my point about allowing the details of implementation to obscure the
higher level problem solving process.

>   Yes, you can pack algorithms
> into their head, but that doesn't mean they are learning why
> the algorithm is the way that it is.

Not at all what I am suggesting (blind memorization of algorithms).  
I am suggesting that they look at various problem domains and see 
some of the more classic solutions as examples of how they might
attack a new problem, or as is more often the case, recognize a
problem as belonging to an existing domain for which mature solutions
may already exist, and avoid re-inventing the wheel.  They are
then free to choose or modify one of the many available solutions,
or even invent a new one, based on the specific architecture they
are dealing with.

>  I'm reminded of the one guy
> on another thread that said "I learned Quicksort, but I didn't
> really understand it."  This is what happens when you focus on
> packing knowledge, without teaching understanding.

Agreed.  Sorting is a good example of a classic problem domain for
which there exists many mature solutions.  I still fail to see how
implementing one or more of these solutions in assembly vs a HOL
improves the understanding of the problem domain or even the specific
solution (Quicksort, Bubble sort, whatever).

> Let's take the RDB example.  If I take the normal CS graduate,
> sit them down in front of terminal, and say "whip up an SQL
> parser that's comparable in performance to Oracle", would
> they be able to do it based on the knowledge they have learned
> in the normal CS program?  I don't think it's an exaggeration
> to say "hell no."  Why?  Because the haven't been taught The
> Algorithm that will compare to Oracle.  They would have no
> clue where to start.  This is my experience with testing new
> graduates.

> Now what if that same student had gone to the Tim Behrendsen
> Academy of Programming, and solved new problems every day in
> low level programming, and left with a degree in thinking, but
> was relatively light on HLLs and his Bag O' Algorithms was
> heavy on the basics, but light on the specifics of "mapping
> theory" or whatever?

Mapping problems comprise a rather large and broad problem domain
(I gave the example of finding the shortest route between two cities,
but there are many, many problems that fall into this category).
Many mapping problems exist for which solutions still have not been
developed, and many of the solutions that do exist have somewhat
large orders of computational complexity.

>  I think my student would get way
> farther than this standard CS student, because they would
> know how to take complex problems such as SQL optimization
> that hadn't necessarily been seen before, and could make
> a reasonable go of it.  Not that either would be able to
> reproduce a multi-hundred man year product like Oracle. :)

I really rather doubt that Oracle used a bottom up approach in
the design of their database products.

-Bob




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-19  0:00                           ` John Hobson
@ 1996-08-20  0:00                             ` Szu-Wen Huang
  1996-08-27  0:00                               ` Richard A. O'Keefe
  0 siblings, 1 reply; 688+ messages in thread
From: Szu-Wen Huang @ 1996-08-20  0:00 UTC (permalink / raw)



John Hobson (jhobson@ceco.ceco.com) wrote:
: Tim Behrendsen wrote:
[snip]
: > I don't think anyone is advocating a return to assembly for
: > "real world" large-project purposes, only for educational
: > purposes.  I'll add you to the list of people who found it
: > to be of value early on.  :-)

: Yes, you can put me down as one of the people who is in favour of
: learning assembler.  You can also put me down as one who is grateful
: that he does not actually have to program in it.

Make sure you understand which list you're getting into :).  Note that
Tim's list has a key phrase "to be of value EARLY ON".  Others are not
as sure as he is about the value of assembly language to a beginner
who is learning algorithms, which is the whole point of the thread now.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-18  0:00                                                     ` Robert Dewar
@ 1996-08-20  0:00                                                       ` Steve Heller
  0 siblings, 0 replies; 688+ messages in thread
From: Steve Heller @ 1996-08-20  0:00 UTC (permalink / raw)



dewar@cs.nyu.edu (Robert Dewar) wrote:

>The distribution counting sort is quite different from a right to left
>radix sort (the latter being the sort you do on a sorting machine for
>80 column cards, if people still remember!) My comment was on the latter,
>so Steve's comment on the former is quite right, but they are two
>different algorithms.
  I suspected we were talking at cross purposes. Thanks for verifying
that and  clearing up the confusion.

Steve Heller, author and software engineer
http://ourworld.compuserve.com/homepages/steve_heller 





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-16  0:00                                               ` Tim Behrendsen
@ 1996-08-20  0:00                                                 ` Richard A. O'Keefe
  1996-08-20  0:00                                                   ` Alan Bowler
  1996-08-21  0:00                                                   ` Tim Behrendsen
  0 siblings, 2 replies; 688+ messages in thread
From: Richard A. O'Keefe @ 1996-08-20  0:00 UTC (permalink / raw)



"Tim Behrendsen" <tim@airshields.com> writes:

>Yes, but you can use the "get a better compiler" argument to
>justify anything.  Real programs run on real computers using
>real compilers.  The "Super-Duper Ivory Tower 9000 Compiler"
>just doesn't exist.

This is a bogus argument, because the better compilers *I* was talking
about ACTUALLY EXIST.  As a particular example, GCC does self-tail-call
optimisation and SPARCompiler C 4.0 does general tail-call optimisation.
Compilers for other languages doing very well indeed with procedures
include Mercury and Stalin.

It does nobody any good to pretend that good compilers do not exist.
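
As a concrete illustration (a made-up example, not taken from any
compiler's documentation), a self-tail-call like the one below is
the kind of thing such compilers can typically turn into a loop, so
it runs in constant stack space rather than O(n) stack:

    /* Tail-recursive sum of 1..n; the recursive call is in tail
       position, so a self-tail-call optimiser can reuse the frame. */
    unsigned long sum_to(unsigned long n, unsigned long acc)
    {
        if (n == 0)
            return acc;
        return sum_to(n - 1, acc + n);
    }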

>> Interestingly enough, the APL sorting primitive DOESN'T move the data
>> at all.  It returns a permutation vector.  The APL idiom for sorting is
>> 	X[.GradeUP X]
>> where the necessary movement is quite visible.

>But the reality is *not* visible.  What has the student really
>learned?

Scuse please?  WHAT reality?  The reality in this case is that you can
in fact in just about any reasonable programming language sort data
WITHOUT moving it.  Indeed, for a number of statistical and numeric
calculations, the permutation vector is more useful than moving the
data would be.  The reality could well be a sorting network made of
comparators and wires.  On the machine I'm posting from, sorting
_could_ be done using parallel processes (not concurrent, parallel;
the machine has more than one CPU).  And of course there have been
at least two machines built that directly executed APL; APL _was_
their assembly language.
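
A rough C rendering of the permutation-vector idea, using the
standard qsort (the names grade_up and g_data are invented here,
and the file-scope pointer makes this sketch non-reentrant):

    #include <stdlib.h>

    static const double *g_data;        /* the values being "graded" */

    static int cmp_idx(const void *a, const void *b)
    {
        size_t i = *(const size_t *)a, j = *(const size_t *)b;
        if (g_data[i] < g_data[j]) return -1;
        if (g_data[i] > g_data[j]) return  1;
        return 0;
    }

    /* Fill perm[0..n-1] with the permutation that sorts data[],
       without moving data[] itself -- a "grade up". */
    void grade_up(const double *data, size_t *perm, size_t n)
    {
        size_t i;
        for (i = 0; i < n; i++)
            perm[i] = i;
        g_data = data;
        qsort(perm, n, sizeof *perm, cmp_idx);
    }

Reading data[perm[0]], data[perm[1]], ... then gives the values in
sorted order, without a single element having been moved.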

In the case of APL, the student has learned to express his/her intent
in a concise notation with a clear mathematical semantics (this applies
to APL primitives only, alas; the rest of APL is rather murkier) which
permits effective reasoning at a much higher level.

"Mathematical skills" came into it somewhere.

Consider the insertion sort and bubble sort procedures posted in
comp.lang.ada recently.  I think it is easier to see the potential
optimisation in
	X .IN A[.GRADE_UP A]
(written as 6 characters) than to see the potential optimisation in
	subtype Element is Float;
	subtype Index is Integer range 1..N;
	type Vector is array (Index) of Element;
	A : Vector := ...;
	X : Element := ...;
	I : Index;

	procedure Sort is new Insertion_Sort;
	function Find is new Linear_Search;

    begin
	Sort(A);
	I := Find(A, X);

This does not establish the superiority of APL over Ada as a general
programming language, only that it makes a much better *design* language
for some (but important) parts of some (but important) applications.

The ability to do "algebraic" reasoning about a program is quite as
important as the ability to do assembly level thinking (which I agree
is important).  When you really need big improvements in performance,
you get it from the high level thinking.


-- 
Australian citizen since 14 August 1996.  *Now* I can vote the xxxs out!
Richard A. O'Keefe; http://www.cs.rmit.edu.au/~ok; RMIT Comp.Sci.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-08-16  0:00             ` Dr E. Buxbaum
  1996-08-16  0:00               ` Mike Rubenstein
  1996-08-16  0:00               ` Lawrence Kirby
@ 1996-08-20  0:00               ` Paul Schlyter
  1996-08-20  0:00                 ` Mike Rubenstein
  1996-08-21  0:00                 ` James Youngman
  2 siblings, 2 replies; 688+ messages in thread
From: Paul Schlyter @ 1996-08-20  0:00 UTC (permalink / raw)



In article <4v1r2a$gh6@falcon.le.ac.uk>, Dr E. Buxbaum <EB15@le.ac.uk> wrote:
 
> djohnson@tartarus.ucsd.edu (Darin Johnson) wrote:
>>> A binary sort, also known as quicksort, or Hoare's sort is
>>> covered extensively
> 
>>"quicksort is the fastest sort" categorization without really
>>understanding it.  
> 
> A common misconception. Quicksort is fast for UNSORTED data. For data 
> which are largely presorted (a common occurrence if a bunch of additional 
> data has to be added to a list), Quicksort becomes Slowsort. 
> 
> Resume: Spend some time on checking your data first, then decide on the 
> proper sorting algorithm!
 
The performance of Quicksort on already sorted, or almost-sorted, data
can be dramatically improved by first scrambling the data(!) somewhat.
 
For instance the first element can be exchanged with some element
near the middle, and then that new first element can be used as a
pivot.  This matters little on unsorted data, but ensures that on
almost-sorted data the pivot will each time split the data set into
two approximately equally large parts, which will yield near-optimum
performance of Quicksort.
 
Of course, this trick still does not ensure that Quicksort never will
perform miserably -- it still may do so if the input data is ordered in
some particular way.  But at least it's much less likely to encounter
this "worst performance" initial order, and it is quite different
from almost-sorted order.
 
-- 
----------------------------------------------------------------
Paul Schlyter,  Swedish Amateur Astronomer's Society (SAAF)
Grev Turegatan 40,  S-114 38 Stockholm,  SWEDEN
e-mail:  pausch@saaf.se        psr@home.ausys.se




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-18  0:00                         ` Tim Behrendsen
@ 1996-08-20  0:00                           ` James Youngman
  1996-08-21  0:00                           ` Szu-Wen Huang
  1 sibling, 0 replies; 688+ messages in thread
From: James Youngman @ 1996-08-20  0:00 UTC (permalink / raw)



I've just read what Donald Knuth has to say on this subject, at:-

http://www-cs-staff.stanford.edu/~uno/mmix.html

-- 
James Youngman                               VG Gas Analysis Systems
The trouble with the rat-race is, even if you win, you're still a rat.





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-08-19  0:00                     ` Richard A. O'Keefe
@ 1996-08-20  0:00                       ` Mike Rubenstein
  1996-08-22  0:00                         ` Richard A. O'Keefe
  0 siblings, 1 reply; 688+ messages in thread
From: Mike Rubenstein @ 1996-08-20  0:00 UTC (permalink / raw)



ok@goanna.cs.rmit.edu.au (Richard A. O'Keefe) wrote:

> miker3@ix.netcom.com (Mike Rubenstein) writes:
> >You should not have cut class the day they covered the techniques for
> >making O(N^2) performance very unlikely.  In fact, properly
> >implemented, quicksort will give O( N log N) performance on both a
> >reversed list and a shifted list.
> 
> Yes, well I suppose all of us missed _something_ in our CS educations,
> such as the fact that the "proof" that O(N**2) is unlikely assumes that
> every possible permutation of the input is equally likely, which is not
> true in the real world.  Check the paper by Bentley and someone (McIlroy?)
> in Software Practice and Experience on "Engineering a Sort":  the UNIX
> qsort() routine *did* yield O(N**2) behaviour in actual use by people who
> were trying to solve a real problem, not break the sorting routine.

No.  The assumption that O(N^2) performance is unlikely assumes that
quicksort is properly implemented.  The distribution of the data has
nothing to do with it.

The fact that some versions of UNIX qsort() were badly implemented
doesn't change the fact that Hoare gave a simple method in his
original paper for making O(N^2) performance very unlikely for any
data distribution.

> 
> The likelihood of quicksort doing badly on *your* data depends on the
> actual probability distribution of *your* data.

Again, not if quicksort is implemented properly.
 
> This leads to the odd observation that if you don't know what the
> distribution of your data is, and you would like to be confident that
> bad things are genuinely unlikely, a good first step is to *scramble*
> your array before sorting it, so that the assumption of this proof is
> known to be true!  (Permuting an array randomly is only O(N).)

There is no need to permute the array.  Just follow Hoare's suggestion
for avoiding O(N^2) performance.

Hoare suggested choosing the pivot randomly.  This has the same effect
on overall performance as permuting the data.
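
A minimal sketch of Hoare's random-pivot suggestion in C (the
function name and int keys are invented here, and rand()'s slight
modulo bias is ignored for simplicity):

    #include <stdlib.h>

    /* Swap a randomly chosen element of a[lo..hi] into the pivot
       slot a[lo] before partitioning. */
    static void choose_random_pivot(int *a, int lo, int hi)
    {
        int r = lo + rand() % (hi - lo + 1);
        int t = a[lo];

        a[lo] = a[r];
        a[r]  = t;
    }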

Choosing the median of the first, middle, and last elements for the
pivot also works well in practice.  In particular, it will prevent
O(N^2) performance if the data is nearly in order or reverse order.


Michael M Rubenstein




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-08-20  0:00               ` Paul Schlyter
@ 1996-08-20  0:00                 ` Mike Rubenstein
  1996-08-21  0:00                 ` James Youngman
  1 sibling, 0 replies; 688+ messages in thread
From: Mike Rubenstein @ 1996-08-20  0:00 UTC (permalink / raw)



pausch@electra.saaf.se (Paul Schlyter) wrote:

> In article <4v1r2a$gh6@falcon.le.ac.uk>, Dr E. Buxbaum <EB15@le.ac.uk> wrote:
>  
> > djohnson@tartarus.ucsd.edu (Darin Johnson) wrote:
> >>> A binary sort, also known as quicksort, or Hoare's sort is
> >>> covered extensively
> > 
> >>"quicksort is the fastest sort" categorization without really
> >>understanding it.  
> > 
> > A common misconception. Quicksort is fast for UNSORTED data. For data 
> > which are largely presorted (a common occurance if a bunch of additional 
> > data has to be added to a list), Quicksort becomes Slowsort. 
> > 
> > Resume: Spend some time on checking your data first, then decide on the 
> > proper sorting algorithm!
>  
> The performane of Quicksort on already sorted, or almost-sorted, data
> can be dramatically improved by first scrambling the data(!) somewhat.
>  
> For instance the first element can be exchanged with some element
> near the middle, and then that new first element can be used as a
> pivot.  This matters little on unsorted data, but ensures that on
> almost-sorted data the pivot will each time split the data set into
> two approximately equally large parts, which will yield near-optimum
> performance of Quicksort.
>  
> Of course, this trick still does not ensure that Quicksort never will
> perform miserably -- it still may do if the input data is ordered in
> some particular way.  But at least it's much less likely to encounter
> this "wost performance" initial order, and it's is quite different
> from almost-sorted order.

Why not just pick the pivot randomly?  This was suggested by Hoare in
1962.

Michael M Rubenstein




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-21  0:00                           ` Szu-Wen Huang
  1996-08-21  0:00                             ` Adam Beneschan
@ 1996-08-21  0:00                             ` Tim Behrendsen
  1 sibling, 0 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-21  0:00 UTC (permalink / raw)



Szu-Wen Huang <huang@mnsinc.com> wrote in article
<4vdnod$5i8@news1.mnsinc.com>...
> Mark Wooding (mdw@excessus.demon.co.uk) wrote:
> [snip]
> : 	char buf[...];
> : 	char *p;
> 
> : 	...
> 
> : 	while (buf[0]==' ')
> : 	{
> : 	  for (p=buf;p[0]=p[1];p++)
> : 	    ;
> : 	}
> 
> : 	while (buf[strlen(buf)-1]==' ')
> : 	  buf[strlen(buf)-1]=0
> 
> : I can't believe that anyone with an understanding of what goes on `under
> : the covers' would possibly write anything like this without feeling ill.
> : An inkling of what this would be translated into by any implementation
> : would surely avoid horrors like this.
> 
> [snip]
> What exactly do I need under the cover?  This atrocity was committed by
> somebody who doesn't even know C and algorithms.  You're not proving
> your point (if that indeed is your point) that assembly/architecture is
> *required* to understand algorithms at all.  To require it you must prove
> that there isn't any other way to effectively teach algorithms.  The
> state of education today can be attributed to many, many factors, not
> necessarily abstractions!
> 
> : Anyone who asks `what's wrong with that' will be shot.
> 
> This code works, by the way, as far as I can tell.  It's even portable.
> I suggest you show us somebody with equivalent experience in general 
> computing as this individual but started on assembly language and see how
> this person fares with the problem.  Oh, and give them the same amount
> of time.

You're missing the point.  The point is how can *anyone* commit such
a monument to inefficient coding?  Yet, someone did.  That someone
obviously sees four lines of code, and therefore it must be OK.
Now, if that someone had an assembly background, and could see how
these statements were going to compile, they would instantly feel
the horror that a good and moral programmer should feel over this
code.

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-21  0:00                                                   ` Dik T. Winter
@ 1996-08-21  0:00                                                     ` Tim Behrendsen
  1996-08-21  0:00                                                       ` Pete Becker
                                                                         ` (3 more replies)
  1996-08-22  0:00                                                     ` Tanmoy Bhattacharya
  1 sibling, 4 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-21  0:00 UTC (permalink / raw)



Dik T. Winter <dik@cwi.nl> wrote in article <DwGoHq.6n7@cwi.nl>...
> In article <4vd05f$rt5@news1.mnsinc.com> huang@mnsinc.com (Szu-Wen Huang) writes:
>  > : For example, suppose the behavior of an algorithm is
>  > 
>  > :   for n up to 1000, time is 1 second
>  > 
>  > "time is *n* seconds", I presume.  Otherwise this would be O(1).
> 
> Nope.  Robert Dewar wrote 1 second and intended 1 second, you can not
> conclude big-oh behaviour from a finite number of samples.
>  > 
>  > :   for n greater than 1000, time is 1000*n seconds
>  > 
>  > : that's clearly O(N), but the time for 2000 items will be 2_000_000
>  > : seconds.
>  > [snip]
>  > 
>  > I disagree.
> 
> You may disagree, but that is what the definition is!  Big-oh notation
> is about asymptotic behaviour, i.e. what is expected to happen for n
> very large.
> 
>  >              I expect a linear algorithm to display linear behavior
>  > unless otherwise stated.
> 
> It will, for large enough n.  And "enough" is explicitly not specified.
> 
>  >                           What you cite is a case where it needs to
>  > be explicitly stated, because calling that algorithm "O(n)" is next
>  > to useless in predicting its behavior without knowing this peculiar
>  > behavior at n=1,000.
> 
> But big-oh notation is about prediction for large n.  You can not use
> the notation to really predict the running time for a particular value,
> only to estimate it; and your estimation may be way off.  If an
> algorithm runs in N^2 + 10^100 N seconds, it is still O(N^2), although
> you never will experience the quadratic behaviour of the algorithm.
> (Actually, of course, you will never see the algorithm come to
completion.)

I have to admit, I have to side with Szu-Wen.  I've never really
thought about this case for the O() notation, but it seems from
a purely mathematical standpoint that O(f(n)) means that f(n)
is *the* function.  If f(n) happens to be a non-continuous function,
then so be it.

If the running time is

    kn for n < 1000
    k(n^2) for n >= 1000

then f(n) =
    O(n) for n < 1000
    O(n^2) for n >= 1000

then the big-Oh function should be (pardon my syntax)

    O( (n < 1000) ? n : n^2 )

The latter is not only more useful, but absolutely accurate.
AFAIK, there are no limits on the complexity of the Big-Oh
function, and no requirement that it be a continuous
function.

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-18  0:00                         ` Mark Wooding
  1996-08-20  0:00                           ` Peter Seebach
@ 1996-08-21  0:00                           ` Szu-Wen Huang
  1996-08-21  0:00                             ` Adam Beneschan
  1996-08-21  0:00                             ` Tim Behrendsen
  1 sibling, 2 replies; 688+ messages in thread
From: Szu-Wen Huang @ 1996-08-21  0:00 UTC (permalink / raw)



Mark Wooding (mdw@excessus.demon.co.uk) wrote:
[snip]
: 	char buf[...];
: 	char *p;

: 	...

: 	while (buf[0]==' ')
: 	{
: 	  for (p=buf;p[0]=p[1];p++)
: 	    ;
: 	}

: 	while (buf[strlen(buf)-1]==' ')
: 	  buf[strlen(buf)-1]=0

: I can't believe that anyone with an understanding of what goes on `under
: the covers' would possibly write anything like this without feeling ill.
: An inkling of what this would be translated into by any implementation
: would surely avoid horrors like this.

Let's see.  for loop nested in a while loop.  O(n^2).  The job to do
is to strip leading spaces.  Which means find first non-space and copy
the rest of the string from there to the start.  Which is O(n) + O(n).

strlen() is O(n), nested in a while, makes it O(n^2).  The job to do is
to strip trailing spaces.  Which means find last non-space and put a
null terminator after that.  Which is, naively, O(n) to find the end of
the string and O(n) to walk back till we hit a non-space.  A little
more thought and we realize that just O(n) is quite easy to achieve if
we keep a running pointer to the last scanned non-space character.
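
A rough C sketch of that O(n) approach (the name trim_spaces is
invented here):

    #include <string.h>

    void trim_spaces(char *buf)
    {
        char  *start = buf;
        char  *last  = NULL;              /* last non-space seen */
        size_t i;

        while (*start == ' ')             /* find first non-space */
            start++;
        memmove(buf, start, strlen(start) + 1);

        for (i = 0; buf[i] != '\0'; i++)  /* one pass; remember last non-space */
            if (buf[i] != ' ')
                last = &buf[i];
        if (last != NULL)
            last[1] = '\0';
        else
            buf[0] = '\0';                /* string was all spaces */
    }

Each character is examined a small constant number of times, so the
whole job stays O(n).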

What exactly do I need under the cover?  This atrocity was committed by
somebody who doesn't even know C and algorithms.  You're not proving
your point (if that indeed is your point) that assembly/architecture is
*required* to understand algorithms at all.  To require it you must prove
that there isn't any other way to effectively teach algorithms.  The
state of education today can be attributed to many, many factors, not
necessarily abstractions!

: Anyone who asks `what's wrong with that' will be shot.

This code works, by the way, as far as I can tell.  It's even portable.
I suggest you show us somebody with equivalent experience in general 
computing as this individual but started on assembly language and see how 
this person fares with the problem.  Oh, and give them the same amount
of time.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-18  0:00                         ` Tim Behrendsen
  1996-08-20  0:00                           ` James Youngman
@ 1996-08-21  0:00                           ` Szu-Wen Huang
  1 sibling, 0 replies; 688+ messages in thread
From: Szu-Wen Huang @ 1996-08-21  0:00 UTC (permalink / raw)



Tim Behrendsen (tim@airshields.com) wrote:
: Robert Dewar <dewar@cs.nyu.edu> wrote in article
: <dewar.840082355@schonberg>...
[snip]
: > That seems false for many modern RISC architectures, and as ILP becomes
: > more and more of a factor, the instruction level semantics will become
: > more and more complex.

: Actually, RISC is usually easier to learn, just because the
: instruction sets are more orthogonal.  Now, optimizing RISC
: machines may be harder, but that's a different issue.

Ahh, so learn the "architecture" without knowing why the instruction
after the branch is always executed, why branching slows down the
machine, why moving this unrelated instruction up here avoids a
pipeline stall, because it's a "different issue"?

Of course it's easier to learn, once you've deleted the harder parts
as a "different issue"!

: The point is not to learn assembly because they will be using
: it every day, the point is to learn it so they can better
: understand what computers really are, and how they work.  You
: simply can't get the same "Ah-HA!  *Now* I understand!"
: experience from programming in a HLL that you can from
: programming assembly.

Key question is, "now I understand" *WHAT*?  Machine architecture,
or the damn sorting algorithm?




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-18  0:00                     ` Tim Behrendsen
@ 1996-08-21  0:00                       ` Szu-Wen Huang
  1996-08-21  0:00                         ` Tim Behrendsen
                                           ` (2 more replies)
  0 siblings, 3 replies; 688+ messages in thread
From: Szu-Wen Huang @ 1996-08-21  0:00 UTC (permalink / raw)



Tim Behrendsen (tim@airshields.com) wrote:
[snip]
: Hmmm... moving a deck of cards around to learn a sorting
: technique.  Reducing the problem to a very low-level set of
: movement operations to help in understanding procedurally
: what the computer is doing.  <s>Naaah, couldn't work.  Much easier
: to focus on the high-level C abstraction of the sorting
: algorithm. </s> ;->

Lofty, ungraspable concepts like:

  void swap(int *a, int *b)
  {
    int c = *a;

    *a = *b;
    *b = c;
  }
  ...
  swap(&a, &b);
  ...

?  swap(), by the way, is a primitive for just about every algorithms
text I've ever read.  Does knowing the computer must:

  LOAD r1, a
  ADD  something_else_totally_unrelated
  LOAD r2, b
  INC  some_counter_from_another_place
  STOR r1, b
  INC  that_same_counter_but_we_just_need_to_fill_the_slot
  STOR r2, a

in order to swap two integers aid in the understanding of swap()?
I agree that we need to break an algorithm down to primitives, but
are you actually saying swap(), for instance, isn't primitive enough?




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-20  0:00                                                 ` Richard A. O'Keefe
  1996-08-20  0:00                                                   ` Alan Bowler
@ 1996-08-21  0:00                                                   ` Tim Behrendsen
  1996-08-22  0:00                                                     ` Bengt Richter
  1996-08-26  0:00                                                     ` Richard A. O'Keefe
  1 sibling, 2 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-21  0:00 UTC (permalink / raw)



Richard A. O'Keefe <ok@goanna.cs.rmit.edu.au> wrote in article
<4vbbf6$g0a@goanna.cs.rmit.edu.au>...
> "Tim Behrendsen" <tim@airshields.com> writes:
> 
> >Yes, but you can use the "get a better compiler" argument to
> >justify anything.  Real programs run on real computers using
> >real compilers.  The "Super-Duper Ivory Tower 9000 Compiler"
> >just doesn't exist.
> 
> This is a bogus argument, because the better compilers *I* was talking
> about ACTUALLY EXIST.  As a particular example, GCC does self-tail-call
> optimisation and SPARCompiler C 4.0 does general tail-call optimisation.
> Compilers for other languages doing very well indeed with procedures
> include Mercury and Stalin.
> 
> It does nobody any good to pretend that good compilers do not exist.

They may exist, but do they exist universally?  The point is that going
out of your way to rely on left-field optimizations is just bad
portability practice.  For example, if someone does this:

void sub(char *s)
{
    ...
    if (strlen(s) == 0) {     /* check for null string */
        ...
    }
}

and they know it's stupid, but they also know the compiler
just happens to have optimization for it, should they
still be shot?  "Diana, get me my rifle."
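
For comparison, the check that needs no optimizer at all is simply
(with the same elision as above):

    if (s[0] == '\0') {     /* check for empty string, O(1) everywhere */
        ...
    }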
 
> >> Interestingly enough, the APL sorting primitive DOESN'T move the data
> >> at all.  It returns a permutation vector.  The APL idiom for sorting
is
> >> 	X[.GradeUP X]
> >> where the necessary movement is quite visible.
> 
> >But the reality is *not* visible.  What has the student really
> >learned?
> 
> Scuse please?  WHAT reality?  The reality in this case is that you can
> in fact in just about any reasonable programming language sort data
> WITHOUT moving it.  Indeed, for a number of statistical and numeric
> calculations, the permutation vector is more useful than moving the
> data would be.  The reality could well be a sorting network made of
> comparators and wires.  On the machine I'm posting from, sorting
> _could_ be done using parallel processes (not concurrent, parallel;
> the machine has more than one CPU).  And of course there have been
> at least two machines built that directly executed APL; APL _was_
> their assembly language.

Wait, hold the phone!  "Sorts the data without moving it"?  What,
is APL's sorting algorithm O(1)?  Yes, it may not actually get
sorted until it gets printed, but that's irrelevant to the fact
that it eventually gets sorted.

> In the case of APL, the student has learned to express his/her intent
> in a concise notation with a clear mathematical semantics (this applies
> to APL primitives only, alas; the rest of APL is rather murkier) which
> permits effective reasoning at a much higher level.
> 
> "Mathematical skills" came into it somewhere.

Indeed, and I think APL is kind of a neat language to learn.  But
it hides so much about the internals of the computer, that I think
it would give a student too many false impressions about how
things really work.

> [snip] 
> The ability to do "algebraic" reasoning about a program is quite as
> important as the ability to do assembly level thinking (which I agree
> is important).  When you really need big improvements in performance,
> you get it from the high level thinking.

I actually quite agree with this; it's how you get the "high level
thinking" that I think is an issue.  Even in a mathematical proof,
you are talking about a sequence of micro-steps.  Yes, most
proofs are built of other proofs, but I think this is more of
a "proof macro language" than a "high-level mathematical
language" (whatever that means).  My "proof" of this is the
fact that when a proof is used within another proof, you can
pretty much do a straight insertion of the lower-level
"macro proof" into the text of the proof-to-be-proven.  There
is no "proof compilation", so to speak (I don't know if that
last made *any* sense :-) )

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-20  0:00                                         ` Bob Gilbert
@ 1996-08-21  0:00                                           ` Tim Behrendsen
  1996-08-22  0:00                                             ` Bob Gilbert
  1996-09-04  0:00                                             ` Lawrence Kirby
  0 siblings, 2 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-21  0:00 UTC (permalink / raw)



Bob Gilbert <rgilbert@unconfigured.xvnews.domain> wrote in article
<4vcac4$gm6@zeus.orl.mmc.com>...
> In article <01bb8df1$2e19d420$87ee6fce@timpent.airshields.com>, "Tim Behrendsen" <tim@airshields.com> writes:
> > Bob Gilbert <rgilbert@unconfigured.xvnews.domain> wrote in article
> 
> > Yes, exactly.  I understand that theoretically computers
> > can be defined in terms of any architectural abstraction,
> > but that's not the point.  The point is to give the students
> > a foundation to build on, and a very simple computer
> > architecture is much simpler than complex language syntax.
> 
> Apples and oranges.  Learning problem solving and understanding
> basic computer architecture are two different things, although
> I do agree that understanding the underlying architecture helps
> in developing solutions (gives a target to shoot for), it can 
> also hinder the process by prejudicing the student's thinking
> into terms of the limited set of known targets.

Perhaps, but you have to risk prejudice somewhere.  The first
language you pick is going to give them some kind of
abstractional prejudice.
 
> > All computer problems have certain things in common; they have an
> > input, they have an output, and they have a set of procedures
> > in between.  Teaching the student how to think in terms of
> > breaking problems into procedures is by far the most difficult
> > task in teaching someone to program. 
> 
> A very procedural point of view.  Many of the proponents of object
> oriented design might have a problem with this view, and demonstrates
> my point about allowing the details of implementation to obscure the
> higher level problem solving process.

There is no other view than the procedural view.  There is no
such thing as an algorithm that exists in zero time.  Even if
there is only one operation, or (n) operations that happen
simultaneously, it is still (n) data transformations that take
place over time.

> >  I'm reminded of the one guy
> > on another thread that said "I learned Quicksort, but I didn't
> > really understand it."  This is what happens when you focus on
> > packing knowledge, without teaching understanding.
> 
> Agreed.  Sorting is a good example of a classic problem domain for
> which there exists many mature solutions.  I still fail to see how
> implementing one or more of these solutions in assembly vs a HOL
> improves the understanding of the problem domain or even the specific
> solution (Quicksort, Bubble sort, whatever).

How can someone implement *any* sort in assembly language,
and "learn it but not really understand it"?  To implement it,
you have to do it in great detail, and you simply can't do the
"push and prod until it works" approach to programming, which
is what I think a lot of students do.

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [any language except Ada]
  1996-08-13  0:00             ` What's the best language to learn? [was Re: Should I learn C or Pascal?] Tim Behrendsen
  1996-08-14  0:00               ` Gabor Egressy
  1996-08-14  0:00               ` What's the best language to learn? [was Re: Should I learn C or Pascal?] Peter Seebach
@ 1996-08-21  0:00               ` Bill Mackay
  1996-08-22  0:00                 ` Stephen M O'Shaughnessy
                                   ` (2 more replies)
  2 siblings, 3 replies; 688+ messages in thread
From: Bill Mackay @ 1996-08-21  0:00 UTC (permalink / raw)




A core unit of a post-grad course I'm doing was 100% Ada and a 100% waste 
of time!  A rotten language, only used by the US military - enough 
said!
-- 
Bill Mackay
Suffolk Park
Australia

"I'm a Marxist of the Harpo kind"






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-17  0:00                   ` Dan Pop
  1996-08-17  0:00                     ` Tim Behrendsen
@ 1996-08-21  0:00                     ` Tanmoy Bhattacharya
  1996-08-30  0:00                       ` Goto considered really harmful Patrick Horgan
  1 sibling, 1 reply; 688+ messages in thread
From: Tanmoy Bhattacharya @ 1996-08-21  0:00 UTC (permalink / raw)



In article <01bb8f12$6dbb5f00$32ee6fce@timhome2>
"Tim Behrendsen" <tim@airshields.com> writes:
<snip>
B: > : 	while (buf[0]==' ')
B: > : 	{
B: > : 	  for (p=buf;p[0]=p[1];p++)
B: > : 	    ;
B: > : 	}
B: > 
B: > : 	while (buf[strlen(buf)-1]==' ')
B: > : 	  buf[strlen(buf)-1]=0
<snip>
B: You're missing the point.  The point is how can *anyone* commit such
B: a monument to inefficient coding?  Yet, someone did.  That someone
B: obviously sees four lines of code, and therefore it must be OK.
B: Now, if that someone had an assembly background, and could see how
B: these statements were going to compile, they would instantly feel
B: the horror that a good and moral programmer should feel over this
B: code.

I think it is not much `longer' in assembly than in C. (I mean the
ratio of the length of assembly code to C is probably the same as any
other piece of code). The double loop is as easy to write in assembly
as in C: and a call to strlen is equally easy.

Lack of ability or willingness to think cannot be corrected by
changing languages.

Cheers
Tanmoy

--
tanmoy@qcd.lanl.gov(128.165.23.46) DECNET: BETA::"tanmoy@lanl.gov"(1.218=1242)
Tanmoy Bhattacharya O:T-8(MS B285)LANL,NM87545 H:#9,3000,Trinity Drive,NM87544
Others see <gopher://yaleinfo.yale.edu:7700/00/Internet-People/internet-mail>,
<http://alpha.acast.nova.edu/cgi-bin/inmgq.pl>or<ftp://csd4.csd.uwm.edu/pub/
internetwork-mail-guide>. -- <http://nqcd.lanl.gov/people/tanmoy/tanmoy.html>
fax: 1 (505) 665 3003   voice: 1 (505) 665 4733    [ Home: 1 (505) 662 5596 ]




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-21  0:00                                                 ` Tanmoy Bhattacharya
@ 1996-08-21  0:00                                                   ` Adam Beneschan
  1996-08-22  0:00                                                     ` Andrew Koenig
  1996-08-22  0:00                                                     ` Christian Bau
  1996-08-21  0:00                                                   ` Tim Behrendsen
  1 sibling, 2 replies; 688+ messages in thread
From: Adam Beneschan @ 1996-08-21  0:00 UTC (permalink / raw)



tanmoy@qcd.lanl.gov (Tanmoy Bhattacharya) writes:
 
 >In article <01bb8f1b$ce59c820$32ee6fce@timhome2>
 >"Tim Behrendsen" <tim@airshields.com> writes:
 ><snip>
 >B: I have to admit, I have to side with Szu-Wen.  I've never really
 >B: thought about this case for the O() notation, but it seems from
 >B: a purely mathematical standpoint that O(f(n)) means that f(n)
 >B: is *the* function.  If f(n) happens to be a non-continuous function,
 > 
 >It is not a question of convenience or usefulness: it is a question of
 >definitions. As is almost universally defined defined, O(n^2+n) and
 >O(n^2) mean exactly the same, as does
 >
 ><snip>
 >B: 
 >B:     O( (n < 1000) ? n : n^2 )
 >B: 
 >
 >and hence when you write it instead of the more easily read O(n^2), it
 >tells people that you do not know the meaning of O().
 >
 >B: The latter is not only more useful, but absolutely accurate.
 >
 >Which is irrelevant. Use, a different notation if you want to invent a
 >new concept: a concept that is more useful in what you want to do. 
 >
 >B: AFAIK, there are no limits on the complexity of the Big-Oh
 >B: function, and no requirement that it be a continuous
 >B: function.
 >
 >No, but a complex Big-Oh function has *exactly* the same meaning as a
 >simpler one in most cases. Calling something a O(f(n)) function means
 >saying that (This may not be the technical definition: but this
 >expresses the meaning commonly used):
 >
 >  1) there exists k and N depending on k such that for all n > N, the
 >     function in question is bounded in absolute value by k |f(n)|. 
 >  2) The minimum such k is non-zero. (This is to prevent a O(n)
 >     function also being classified as a O(nlogn) for example).

I've never seen a definition of O(f(n)) that includes anything like
(2).  In the mathematical definitions I've seen, O(f(n)) provides only
an upper bound on the value expressed by O-notation, not a lower
bound.  Under these definitions, it is perfectly correct (although not
too useful) to say that a linear algorithm is O(n**2) or O(n**3).

Perhaps someone else can provide a different definition?  Tanmoy's
statement as expressed above isn't precise enough to be a mathematical
definition.  Perhaps it means that there are two constants, kl and ku,
such that the value represented by O(f(n)) is between kl |f(n)| and ku
|f(n)| for all n > N.  This still may not be enough for people who
want O(f(n)) to give an approximation of proportional running time, so
those people may want to add the restriction that kl/ku >= 0.95 or
something to keep the running time from varying too much.  And Tim
wants a notation that puts bounds on the running time for *all* n, not
just "large enough" n.

The more I follow this thread, the more I'm convinced that our use of
O-notation is an abuse; we've twisted the original mathematical
purpose of the notation beyond recognition.  The first place I saw
this notation used was in Knuth's _Art of Computer Programming_, and
he only used it to express part of a formula that approaches zero as n
gets large.  For example:

P(n) = sqrt(pi*n/2) - 2/3 + (11/24)*sqrt(pi/(2*n)) + 4/(135*n)
         - (71/1152)*sqrt(pi/(2*n**3)) + O(n**-2)

Here, O(n**-2) refers to terms in the sum that eventually go to zero
as n gets large.  From what I could find, Knuth *never* uses it to
describe the running time of an algorithm. 

Following this viewpoint, when we computer scientists speak of an
algorithm's running time as O(n**2), mathematicians might say

    Running time = K * n**2 * (1 + O(1/n))

for some constant K.  The point here is that the proportional
difference between the running time and (K * n**2) tends to disappear
as n gets large (hence the O(1/n) term).  

In contrast to what I believe was the original purpose of O-notation,
computer scientists are using the notation as a shorthand for
"approximately proportional to."  The original mathematical
definition, which suited the purpose of expressing a quantity that
tended toward zero, doesn't suit our purposes of classifying
algorithms by approximate running time, and for predicting how long
our programs will run for.  Sure, we can augment the definition to
meet our needs, but then we all have arguments about what O-notation
really means, since there's no standard any more.  (Furthermore, the
changed definitions may mean that O-notation no longer has some of the
mathematical properties that mathematicians make use of when working
with O-notation.)

So maybe it's time we all admit that our use of O-notation conforms to
neither the original purpose nor the original definition of the
notation.  With that in mind, perhaps we should invent our own
notation or notations, with definitions we can all standardize on.

\end{overly-academic-nitpicking}

                                -- Adam

P.S. My theories about how mathematicians view and use O-notation may
be incorrect.  I might ask sci.math to enlighten me here.   --ajb





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-21  0:00                                                     ` Tim Behrendsen
  1996-08-21  0:00                                                       ` Pete Becker
@ 1996-08-21  0:00                                                       ` Matt Austern
  1996-08-21  0:00                                                         ` Tim Behrendsen
  1996-08-21  0:00                                                       ` Tanmoy Bhattacharya
  1996-08-22  0:00                                                       ` Robert Dewar
  3 siblings, 1 reply; 688+ messages in thread
From: Matt Austern @ 1996-08-21  0:00 UTC (permalink / raw)



adam@irvine.com (Adam Beneschan) writes:

> The more I follow this thread, the more I'm convinced that our use of
> O-notation is an abuse; we've twisted the original mathematical
> purpose of the notation beyond recognition.  The first place I saw
> this notation used was in Knuth's _Art of Computer Programming_, and
> he only used it to express part of a formula that approaches zero as n
> gets large.  For example:
> 
> P(n) = sqrt(pi*n/2) - 2/3 + (11/24)*sqrt(pi/(2*n)) + 4/(135*n)
>          - (71/1152)*sqrt(pi/(2*n**3)) + O(n**-2)
> 
> Here, O(n**-2) refers to terms in the sum that eventually go to zero
> as n gets large.  From what I could find, Knuth *never* uses it to
> describe the running time of an algorithm. 

Yes he does.  Theorem P, for example, in section 5.2.1, says that the
running time of shell sort (using a specific increment sequence) is
O(N^(3/2)).  This statement is correct and precise: it means that
there exists some positive constant C such that, for sufficiently
large N, the running time of shell sort is less than C N^(3/2).

Part of this confusion might be that some people say O(N) when they
really mean Theta(N).  See section 2.1 of Cormen, Leiserson, and
Rivest for the distinction between O(N), Theta(N), and Omega(N).
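
For reference, a rough statement of the textbook definitions in
question (in LaTeX notation; this follows the Cormen/Leiserson/Rivest
convention and is added here only for clarity):

    f(n) = O(g(n))      \iff  \exists c > 0, n_0 : 0 \le f(n) \le c\,g(n)  \text{ for all } n \ge n_0
    f(n) = \Omega(g(n)) \iff  \exists c > 0, n_0 : 0 \le c\,g(n) \le f(n)  \text{ for all } n \ge n_0
    f(n) = \Theta(g(n)) \iff  f(n) = O(g(n))  \text{ and }  f(n) = \Omega(g(n))

Under these definitions O() by itself gives only an upper bound; the
"roughly proportional to" reading people usually intend corresponds to
Theta().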




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-21  0:00                                                       ` Matt Austern
@ 1996-08-21  0:00                                                         ` Tim Behrendsen
  0 siblings, 0 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-21  0:00 UTC (permalink / raw)





Matt Austern <austern@isolde.mti.sgi.com> wrote in article
<fxtu3tw1pi0.fsf@isolde.mti.sgi.com>...
> adam@irvine.com (Adam Beneschan) writes:
> 
> > The more I follow this thread, the more I'm convinced that our use of
> > O-notation is an abuse; we've twisted the original mathematical
> > purpose of the notation beyond recognition.  The first place I saw
> > this notation used was in Knuth's _Art of Computer Programming_, and
> > he only used it to express part of a formula that approaches zero as n
> > gets large.  For example:
> > 
> > P(n) = sqrt(pi*n/2) - 2/3 + (11/24)*sqrt(pi/(2*n)) + 4/(135*n)
> >          - (71/1152)*sqrt(pi/(2*n**3)) + O(n**-2)
> > 
> > Here, O(n**-2) refers to terms in the sum that eventually go to zero
> > as n gets large.  From what I could find, Knuth *never* uses it to
> > describe the running time of an algorithm. 
> 
> Yes he does.  Theorem P, for example, in section 5.2.1, says that the
> running time of shell sort (using a specific increment sequence) is
> O(N^(3/2)).  This statement is correct and precise: it means that
> there exists some positive constant C such that, for sufficiently
> large N, the running time of shell sort is less than C N^(3/2).
> 
> Part of this confusion might be that some people say O(N) when they
> really mean Theta(N).  See section 2.1 of Cormen, Leiserson, and
> Rivest for the distinction between O(N), Theta(N), and Omega(N).

Interestingly, in section 5.2.2, he uses O() notation in the
actual running time computation for bubble sort:

"... so the MIX running time is 8A + 7B + 8C + 1 =
(min 8N + 1, ave 5.75N^2 + O(N ln N), max 7.5N^2 + 0.5N + 1)."

Reading on to the section on Asymptotic Methods, he goes into
some very heavy duty math involving the asymptotic behavior
of bubble sort.  Egad!  I never thought you could generate
so much calculus from lowly bubble sort.

I think you're right, however.  The O() notation is all over
his analysis in non-trivial ways, such as my favorite:

O( N^(1/2) * e^(-pi*N/2) * integral(-1.5 -> M, N^t dt) ), if
    2^iN != 1.

Obviously, O() is being used in a precise mathematical way,
rather than the informal use that it's normally put to.

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-08-20  0:00               ` Paul Schlyter
  1996-08-20  0:00                 ` Mike Rubenstein
@ 1996-08-21  0:00                 ` James Youngman
  1996-08-22  0:00                   ` TRAN PHAN ANH
  1 sibling, 1 reply; 688+ messages in thread
From: James Youngman @ 1996-08-21  0:00 UTC (permalink / raw)



In article <4vblm8$df9@electra.saaf.se>, pausch@electra.saaf.se says...

>The performance of Quicksort on already sorted, or almost-sorted, data
>can be dramatically improved by first scrambling the data(!) somewhat.

It all depends exactly on the Quicksort implementation chosen.  For example, 
the implementation in "The Standard C Library" by PJ Plauger runs fastest on 
fully-sorted input.
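
Much of the difference between implementations comes down to pivot
selection.  The following is a minimal sketch of the common
median-of-three choice (illustrative only: it is not Plauger's actual
code, and the function name is made up here):

    #include <stddef.h>     /* for size_t */

    /* Return the index of the median of a[i], a[j], a[k].  Using this
       element as the pivot makes already-sorted input split roughly in
       half, instead of degenerating to O(n^2) the way a "take the
       first element" pivot does. */
    static size_t median_of_three(const int a[], size_t i, size_t j, size_t k)
    {
        if (a[i] < a[j]) {
            if (a[j] < a[k]) return j;         /* a[i] < a[j] < a[k] */
            return (a[i] < a[k]) ? k : i;      /* a[j] is the largest */
        } else {
            if (a[i] < a[k]) return i;         /* a[j] <= a[i] < a[k] */
            return (a[j] < a[k]) ? k : j;      /* a[i] is the largest */
        }
    }

A quicksort that swaps the chosen element into place and then
partitions as usual behaves well on sorted and reverse-sorted input,
which is consistent with the observation above.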

-- 
James Youngman                               VG Gas Analysis Systems
The trouble with the rat-race is, even if you win, you're still a rat.





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-21  0:00                       ` Szu-Wen Huang
@ 1996-08-21  0:00                         ` Tim Behrendsen
  1996-08-22  0:00                         ` Mark Wooding
  1996-08-23  0:00                         ` Clayton Weaver
  2 siblings, 0 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-21  0:00 UTC (permalink / raw)



Szu-Wen Huang <huang@mnsinc.com> wrote in article
<4vdon9$8nt@news1.mnsinc.com>...
> Tim Behrendsen (tim@airshields.com) wrote:
> [snip]
> : Hmmm... moving a deck of cards around to learn a sorting
> : technique.  Reducing the problem to a very low-level set of
> : movement operations to help in understanding procedurally
> : what the computer is doing.  <s>Naaah, couldn't work.  Much easier
> : to focus on the high-level C abstraction of the sorting
> : algorithm. </s> ;->
> 
> Lofty, ungraspable concepts like:
> 
>   void swap(int *a, int *b)
>   {
>     int c = *a;
> 
>     *a = *b;
>     *b = c;
>   }
>   ...
>   swap(&a, &b);
>   ...
> 
> ?  swap(), by the way, is a primitive for just about every algorithms
> text I've ever read.  Does knowing the computer must:
> 
>   LOAD r1, a
>   ADD  something_else_totally_unrelated
>   LOAD r2, b
>   INC  some_counter_from_another_place
>   STOR r1, b
>   INC  that_same_counter_but_we_just_need_to_fill_the_slot
>   STOR r2, a
> 
> in order to swap two integers aid in the understanding of swap()?
> I agree that we need to break an algorithm down to primitives, but
> are you actually saying swap(), for instance, isn't primitive enough?

I'm not quite following your assembly translation, but let me
roll with it.  Yes, I think this is a *great* example of how the
assembly would be much simpler than the C.  What are they looking
at when they look at the C code?  Look at all the syntax involved
in that thing.  a and b exist somewhere, but where?  Never mind,
they're variables.  Argument passing, indirection, automatic
variables, on and on.  Can you honestly say an assembly
translation with the memory map right beside wouldn't be
easier to understand?  It would be completely obvious, and you
wouldn't be bogged down with all that syntax.

-- Tim Behrendsen (tim@airshields.com)

 




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-21  0:00                                                     ` Tim Behrendsen
  1996-08-21  0:00                                                       ` Pete Becker
  1996-08-21  0:00                                                       ` Matt Austern
@ 1996-08-21  0:00                                                       ` Tanmoy Bhattacharya
  1996-08-22  0:00                                                         ` Mike Rubenstein
  1996-08-22  0:00                                                         ` Dann Corbit
  1996-08-22  0:00                                                       ` Robert Dewar
  3 siblings, 2 replies; 688+ messages in thread
From: Tanmoy Bhattacharya @ 1996-08-21  0:00 UTC (permalink / raw)



In article <4vfk6b$i6h@krusty.irvine.com>
adam@irvine.com (Adam Beneschan) writes:
<snip>
AB:  >No, but a complex Big-Oh function has *exactly* the same meaning as a
AB:  >simpler one in most cases. Calling something a O(f(n)) function means
AB:  >saying that (This may not be the technical definition: but this
AB:  >expresses the meaning commonly used):
AB:  >
AB:  >  1) there exists k and N depending on k such that for all n > N, the
AB:  >     function in question is bounded in absolute value by k |f(n)|. 
AB:  >  2) The minimum such k is non-zero. (This is to prevent an O(n)
AB:  >     function also being classified as an O(nlogn) for example).
AB: 
AB: I've never seen a definition of O(f(n)) that includes anything like
AB: (2).  In the mathematical definitions I've seen, O(f(n)) provides only
AB: an upper bound on the value expressed by O-notation, not a lower
AB: bound.  Under these definitions, it is perfectly correct (although not
AB: too useful) to say that a linear algorithm is O(n**2) or O(n**3).

Correct. Number (2) actually exists only to get it closer to common
usage without giving up the asymptotic nature of the definition.

AB: 
AB: Perhaps someone else can provide a different definition?  Tanmoy's
AB: statement as expressed above isn't precise enough to be a mathematical
AB: definition.  Perhaps it means that there are two constants, kl and ku,

The (1) above is obviously rigorous.

The (2) above is a completely rigorous statement expressing common
usage. The `minimum such k' refers to the infimum of the set of allowed
k's (i.e. values of k for which an N exists): that infimum may not
actually belong to the set.
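
One way to make that reading precise (a sketch in LaTeX notation, not a
standard definition):

    K(f, g) = \inf \{\, k > 0 : \exists N\ \forall n > N,\ |f(n)| \le k\,|g(n)| \,\}

Condition (1) says the set is non-empty, so K(f, g) is finite;
condition (2) says K(f, g) > 0, i.e. f is O(g) but not o(g), which is
what rules out calling a linear function O(n log n).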

AB: such that the value represented by O(f(n)) is between kl |f(n)| and ku
AB: |f(n)| for all n > N.  This still may not be enough for people who

No. That does not capture the common meaning: if an algorithm is
trivial for odd N, but goes as N^2 for even N, it is still called
O(N^2). But, naturally, you can't find kl in this case.

AB: want O(f(n)) to give an approximation of proportional running time, so
AB: those people may want to add the restriction that kl/ku >= 0.95 or
AB: something to keep the running time from varying too much.  And Tim
AB: wants a notation that puts bounds on the running time for *all* n, not
AB: just "large enough" n.

Yes if you want O() to be an approximation, you may want to use your
kl/ku solution.

AB: 
AB: The more I follow this thread, the more I'm convinced that our use of
AB: O-notation is an abuse; we've twisted the original mathematical
AB: purpose of the notation beyond recognition.  The first place I saw
AB: this notation used was in Knuth's _Art of Computer Programming_, and
AB: he only used it to express part of a formula that approaches zero as n
AB: gets large.  For example:
AB: 
AB: P(n) = sqrt(pi*n/2) - 2/3 + (11/24)*sqrt(pi/(2*n)) + 4/(135*n)
AB:          - (71/1152)*sqrt(pi/(2*n**3)) + O(n**-2)
AB: 
AB: Here, O(n**-2) refers to terms in the sum that eventually go to zero
AB: as n gets large.  From what I could find, Knuth *never* uses it to
AB: describe the running time of an algorithm. 
AB: 
AB: Following this viewpoint, when we computer scientists speak of an
AB: algorithm's running time as O(n**2), mathematicians might say
AB: 
AB:     Running time = K * n**2 * (1 + O(1/n))
AB: 
AB: for some constant K.  The point here is that the proportional
AB: difference between the running time and (K * n**2) tends to disappear
AB: as n gets large (hence the O(1/n) term).  
AB: 

And as you can trivially show, the K in the above formula is the
infimum of the set of k's that I was talking about. The two points of
view are identical. Except, when I (or almost anyone) calls something
O(N^2), the correction term need not be O(N): it may as well be
O(NlogN). 

AB: In contrast to what I believe was the original purpose of O-notation,
AB: computer scientists are using the notation as a shorthand for
AB: "approximately proportional to."  The original mathematical

No. It is used mainly in the theory of algorithmic complexity. Its use
as an approximation is only a loose usage: which is often, but not
always, correct.

AB: definition, which suited the purpose of expressing a quantity that
AB: tended toward zero, doesn't suit our purposes of classifying

The O notation is used for both small and large quantities. Its
original definition was that f(x) and g(x) were the same `order' if
lim f(x)/g(x) was neither 0 nor infinite as x went to something. This
definition can be easily modified if the limit does not exist, and I
think my definition is identical to that variation as long as we talk
about O(f) only for functions f that are monotonic for sufficiently
large N. 

<snip>
AB: P.S. My theories about how mathematicians view and use O-notation may
AB: be incorrect.  I might ask sci.math to enlighten me here.   --ajb

Yes, the topic has gone far beyond the purview of most of the groups:
let us take this discussion elsewhere.

Cheers
Tanmoy
--
tanmoy@qcd.lanl.gov(128.165.23.46) DECNET: BETA::"tanmoy@lanl.gov"(1.218=1242)
Tanmoy Bhattacharya O:T-8(MS B285)LANL,NM87545 H:#9,3000,Trinity Drive,NM87544
Others see <gopher://yaleinfo.yale.edu:7700/00/Internet-People/internet-mail>,
<http://alpha.acast.nova.edu/cgi-bin/inmgq.pl>or<ftp://csd4.csd.uwm.edu/pub/
internetwork-mail-guide>. -- <http://nqcd.lanl.gov/people/tanmoy/tanmoy.html>
fax: 1 (505) 665 3003   voice: 1 (505) 665 4733    [ Home: 1 (505) 662 5596 ]




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-17  0:00                                               ` Robert Dewar
  1996-08-20  0:00                                                 ` Szu-Wen Huang
@ 1996-08-21  0:00                                                 ` Tanmoy Bhattacharya
  1996-08-21  0:00                                                   ` Adam Beneschan
  1996-08-21  0:00                                                   ` Tim Behrendsen
  1 sibling, 2 replies; 688+ messages in thread
From: Tanmoy Bhattacharya @ 1996-08-21  0:00 UTC (permalink / raw)



In article <01bb8f1b$ce59c820$32ee6fce@timhome2>
"Tim Behrendsen" <tim@airshields.com> writes:
<snip>
B: I have to admit, I have to side with Szu-Wen.  I've never really
B: thought about this case for the O() notation, but it seems from
B: a purely mathematical standpoint that O(f(n)) means that f(n)
B: is *the* function.  If f(n) happens to be a non-continuous function,

It is not a question of convenience or usefulness: it is a question of
definitions. As is almost universally defined defined, O(n^2+n) and
O(n^2) mean exactly the same, as does

<snip>
B: 
B:     O( (n < 1000) ? n : n^2 )
B: 

and hence when you write it instead of the more easily read O(n^2), it
tells people that you do not know the meaning of O().

B: The latter is not only more useful, but absolutely accurate.

Which is irrelevant. Use a different notation if you want to invent a
new concept: a concept that is more useful in what you want to do. 

B: AFAIK, there are no limits on the complexity of the Big-Oh
B: function, and no requirement that it be a continuous
B: function.

No, but a complex Big-Oh function has *exactly* the same meaning as a
simpler one in most cases. Calling something a O(f(n)) function means
saying that (This may not be the technical definition: but this
expresses the meaning commonly used):

  1) there exists k and N depending on k such that for all n > N, the
     function in question is bounded in absolute value by k |f(n)|. 
  2) The minimum such k is non-zero. (This is to prevent an O(n)
     function also being classified as an O(nlogn) for example).

The notation works quite well in practice: generally what happens is
that for very small n (for n < N in the definition), the `main loop'
is not dominant: there is other stuff like initialization which
dominates the time taken. The big O notation says that we are
interested in the behaviour of the function when such subdominant
pieces are unimportant. In fact, such a division is important because
there is no reason why the ratio of time taken in the loop to the time
taken outside it should be independent of the machine in use: if you
wanted big O notation to be a property of the algorithm and not the
hardware (as long as we stick to a reasonable architecture), you
cannot include subdominant terms in it. Incidentally, the
constant k simply means that it is not important what the units of
measurement are, again serves to make the notation independent of the
precise hardware on which it is being run.

However, there are pathological cases: one such pathological case is
when the initialization (or something similar) takes a large amount of
time. Thus for example, we can think of an algorithm that takes
0.00001 n (main loop) + 1000 (initialization) amount of time. Now, for
n >> 100000000, the initialization is unimportant: and big O notation
will latch on to that case and call it an O(n) problem. However, in
practice, all that you will ever see for a reasonable n is the
initialization time, and hence, in practice, it will look much like an
O(1) problem.

Similarly, sometimes the time taken to really do a problem (say adding
two numbers) is proportional to the number of digits in the number:
i.e. to log n. However, on most hardware, the time taken to do this is
a constant, albeit with a limit to the size of numbers it can add. In
such cases, again, the observed complexity will be O(1), whereas
truly, the complexity is O(logn).

But, these caveats about order O are the penalty we pay for using a
useful concept. The definition you called more useful, most will find
*less* useful: to use such a concept for a sorting routine, one would
need to know the relative time for a swap and compare on the
particular hardware for example. Giving up the machine independence in
a concept designed to measure abstract algorithmic efficiency is a
stiff price to pay for a perceived usefulness.

Most people instead choose to give you the O() of the algorithm with a
statement pointing out the caveats. I have seen statements that
`addition is O(1) with the caveat that if the numbers really became
big, it would become O(logn)'. Such statements are necessarily
mathematically imprecise, but we know what they mean. And, if, someone
were to question me as to what the precise meaning is, I could of
course phrase it trivially: It would be a fomalization of `what I mean
is that if we enlarge the turing machine with an oracle that can add
arbitrarily large numbers in one move, the algorithm would become O(1)
instead of O(n)'. 

Calling something O(n^2+n) is still an abomination!

Cheers
Tanmoy
--
tanmoy@qcd.lanl.gov(128.165.23.46) DECNET: BETA::"tanmoy@lanl.gov"(1.218=1242)
Tanmoy Bhattacharya O:T-8(MS B285)LANL,NM87545 H:#9,3000,Trinity Drive,NM87544
Others see <gopher://yaleinfo.yale.edu:7700/00/Internet-People/internet-mail>,
<http://alpha.acast.nova.edu/cgi-bin/inmgq.pl>or<ftp://csd4.csd.uwm.edu/pub/
internetwork-mail-guide>. -- <http://nqcd.lanl.gov/people/tanmoy/tanmoy.html>
fax: 1 (505) 665 3003   voice: 1 (505) 665 4733    [ Home: 1 (505) 662 5596 ]




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-20  0:00                                                   ` Dann Corbit
@ 1996-08-21  0:00                                                     ` Tim Behrendsen
  1996-08-21  0:00                                                       ` Dann Corbit
  1996-08-22  0:00                                                       ` Richard A. O'Keefe
  0 siblings, 2 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-21  0:00 UTC (permalink / raw)



Dann Corbit <a-cnadc@microsoft.com> wrote in article
<01bb8eeb$11e06d00$2bac399d@v-cnadc1>...
> 
> Szu-Wen Huang <huang@mnsinc.com> wrote in article
> <4vd05f$rt5@news1.mnsinc.com>...
> {snip}
> > I disagree.  I expect a linear algorithm to display linear behavior
> > unless otherwise stated.  What you cite is a case where it needs to
> > be explicitly stated, because calling that algorithm "O(n)" is next
> > to useless in predicting its behavior without knowing this peculiar
> > behavior at n=1,000.  IOW, this prediction holds if I'm predicting
> > for n in (0, 1,000] and (1,000, infinity), and will hold for other
> > algorithms that do not have this peculiarity.
> 
> Nonetheless, it only has to be true in the limiting case.
> Try, for instance, some O(n*log(n) ) algorithm like heapsort
> on data that is random, ordered, reverse ordered, sawtooth, etc,
> and you will find that it takes different amounts of time even for
> the same number of input values.
> 
> Or take quicksort, which is worst case O(n*n) and you will
> see that it usually plots as approximately O(n*log(n)), but
> can even appear to be linear for some special data sets and 
> algorithm implementations.

Well, but look at your own terminology.  You describe quicksort
as "worst case O(n*n)" and (paraphrase) "average case O(n*log(n))".
When the Big-Oh is given for a particular algorithm without
any qualification, it is usually assumed to be the average
behavior over the data set.  The point is not that O() makes
absolute time predictions, but that it gives *behavior*
predictions.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-16  0:00                                             ` Szu-Wen Huang
                                                                 ` (2 preceding siblings ...)
  1996-08-18  0:00                                               ` Steve Heller
@ 1996-08-21  0:00                                               ` Matt Austern
  1996-08-23  0:00                                               ` Tanmoy Bhattacharya
  4 siblings, 0 replies; 688+ messages in thread
From: Matt Austern @ 1996-08-21  0:00 UTC (permalink / raw)



"Tim Behrendsen" <tim@airshields.com> writes:

> By "useful", I mean conveying information about the algorithm.  I
> agree with you that in practice the less significant terms may
> be left off the function, i.e. O(n*n + n) is effectively equivalent
> to O(n*n).  But the particular case that started all this was the
> non-continuous case, where one range of (n) produces O(n)
> behavior, and another range produces O(n*n).  In practice, this
> would normally be described exactly as I have just done, but I
> don't think it's either accurate or complete to only declare
> the algorithm O(n*n) if both ranges are possible input data.

Not just "effectively equivalent": identical.  By definition (see, for
example, section 1.2.11 of Knuth) a function f(n) is O(g(n)) if there
exists some positive constant C and some index n0 such that, for all 
n >= n0, |f(n)| <= C |g(n)|.

Given this definition, it's easy to prove that any function f that is
O(n^2 + n) must also be O(n^2).  
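
A quick sketch of that proof, for completeness:

    If |f(n)| \le C\,(n^2 + n) for all n \ge n_0, then for
    n \ge \max(n_0, 1) we have n \le n^2, hence
    |f(n)| \le C\,(n^2 + n^2) = 2C\,n^2,
    so f(n) is O(n^2) with constant 2C.

The other direction is immediate, since n^2 \le n^2 + n for n \ge 0,
so the two classes coincide.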




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-21  0:00                                                 ` Tanmoy Bhattacharya
  1996-08-21  0:00                                                   ` Adam Beneschan
@ 1996-08-21  0:00                                                   ` Tim Behrendsen
  1996-08-22  0:00                                                     ` Mike Rubenstein
  1996-08-22  0:00                                                     ` Robert Dewar
  1 sibling, 2 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-21  0:00 UTC (permalink / raw)



Tanmoy Bhattacharya <tanmoy@qcd.lanl.gov> wrote in article
<TANMOY.96Aug21083507@qcd.lanl.gov>...
> In article <01bb8f1b$ce59c820$32ee6fce@timhome2>
> "Tim Behrendsen" <tim@airshields.com> writes:
> <snip>
> B: I have to admit, I have to side with Szu-Wen.  I've never really
> B: thought about this case for the O() notation, but it seems from
> B: a purely mathematical standpoint that O(f(n)) means that f(n)
> B: is *the* function.  If f(n) happens to be a non-continuous function,
> 
> It is not a question of convenience or usefulness: it is a question of
> definitions. As almost universally defined, O(n^2+n) and
> O(n^2) mean exactly the same, as does
> 
> <snip>
> B: 
> B:     O( (n < 1000) ? n : n^2 )
> B: 
> 
> and hence when you write it instead of the more easily read O(n^2), it
> tells people that you do not know the meaning of O().

Perhaps.  My understanding of O() notation is that it does
not give a prediction of running time, it gives a rough
approximation of behavior based on data set size.

IOW, O(n*n) has quadratic behavior relative to O(n), but the
O() function does not imply an absolute k*n*n running time.

> B: The latter is not only more useful, but absolutely accurate.
> 
> Which is irrelevant. Use a different notation if you want to invent a
> new concept: a concept that is more useful in what you want to do. 

Well, *somebody* has to fix the concepts that may be dumbly
defined! :)
 
> B: AFAIK, there are no limits on the complexity of the Big-Oh
> B: function, and no requirement that it be a continuous
> B: function.
> 
> No, but a complex Big-Oh function has *exactly* the same meaning as a
> simpler one in most cases. Calling something a O(f(n)) function means
> saying that (This may not be the technical definition: but this
> expresses the meaning commonly used):
> 
>   1) there exists k and N depending on k such that for all n > N, the
>      function in question is bounded in absolute value by k |f(n)|. 
>   2) The minimum such k is non-zero. (This is to prevent an O(n)
>      function also being classified as an O(nlogn) for example).
> 
> The notation works quite well in practice: generally what happens is
> that for very small n (for n < N in the definition), the `main loop'
> is not dominant: there is other stuff like initialization which
> dominates the time taken. The big O notation says that we are
> interested in the behaviour of the function when such subdominant
> pieces are unimportant. In fact, such a division is important because
> there is no reason why the ratio of time taken in the loop to the time
> taken outside it should be independent of the machine in use: if you
> wanted big O notation to be a property of the algorithm and not the
> hardware (as long as we stick to a reasonable architecture), you
> cannot include subdominant terms in it. Incidentally, the
> constant k simply means that it is not important what the units of
> measurement are, again serves to make the notation independent of the
> precise hardware on which it is being run.
> 
> However, there are pathological cases: one such pathological case is
> when the initialization (or something similar) takes a large amount of
> time. Thus for example, we can think of an algorithm that takes
> 0.00001 n (main loop) + 1000 (initialization) amount of time. Now, for
> n >> 100000000, the initialization is unimportant: and big O notation
> will latch on to that case and call it an O(n) problem. However, in
> practice, all that you will ever see for a reasonable n is the
> initialization time, and hence, in practice, it will look much like an
> O(1) problem.
> 
> Similarly, sometimes the time taken to really do a problem (say adding
> two numbers) is proportional to the number of digits in the number:
> i.e. to log n. However, on most hardware, the time taken to do this is
> a constant, albeit with a limit to the size of numbers it can add. In
> such cases, again, the observed complexity will be O(1), whereas
> truly, the complexity is O(logn).

Agreed; the O() function is meant to be used in context, not
as an absolute statement of algorithm efficiency.

> But, these caveats about order O are the penalty we pay for using a
> useful concept. The definition you called more useful, most will find
> *less* useful: to use such a concept for a sorting routine, one would
> need to know the relative time for a swap and compare on the
> particular hardware for example. Giving up the machine independence in
> a concept designed to measure abstract algorithmic efficiency is a
> stiff price to pay for a perceived usefulness.

By "useful", I mean conveying information about the algorithm.  I
agree with you that in practice the less significant terms may
be left off the function, i.e. O(n*n + n) is effectively equivalent
to O(n*n).  But the particular case that started all this was the
non-continuous case, where one range of (n) produces O(n)
behavior, and another range produces O(n*n).  In practice, this
would normally be described exactly as I have just done, but I
don't think it's either accurate or complete to only declare
the algorithm O(n*n) if both ranges are possible input data.

If the definition of O() notation calls for the linear case to
simply be ignored, then the definition should be changed,
because it is simply not accurate.  Note I'm not expecting
precision, I'm expecting accuracy.

> Most people instead choose to give you the O() of the algorithm with a
> statement pointing out the caveats. I have seen statements that
> `addition is O(1) with the caveat that if the numbers really became
> big, it would become O(logn)'. Such statements are necessarily
> mathematically imprecise, but we know what they mean. And, if, someone
> were to question me as to what the precise meaning is, I could of
> course phrase it trivially: It would be a fomalization of `what I mean
> is that if we enlarge the turing machine with an oracle that can add
> arbitrarily large numbers in one move, the algorithm would become O(1)
> instead of O(n)'. 

The reason we can safely ignore the O(n) case is that the large
number case is not part of our input domain.  If it was, then
we would necessarily need to have a non-continuous O() function.

> Calling something O(n^2+n) is still an abomination!

Agreed.

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-21  0:00                           ` Szu-Wen Huang
@ 1996-08-21  0:00                             ` Adam Beneschan
  1996-08-21  0:00                             ` Tim Behrendsen
  1 sibling, 0 replies; 688+ messages in thread
From: Adam Beneschan @ 1996-08-21  0:00 UTC (permalink / raw)



huang@mnsinc.com (Szu-Wen Huang) writes:
 >Mark Wooding (mdw@excessus.demon.co.uk) wrote:
 >[snip]
 >: 	char buf[...];
 >: 	char *p;
 >
 >: 	...
 >
 >: 	while (buf[0]==' ')
 >: 	{
 >: 	  for (p=buf;p[0]=p[1];p++)
 >: 	    ;
 >: 	}
 >
 >: 	while (buf[strlen(buf)-1]==' ')
 >: 	  buf[strlen(buf)-1]=0
 >
 >: I can't believe that anyone with an understanding of what goes on `under
 >: the covers' would possibly write anything like this without feeling ill.
 >: An inkling of what this would be translated into by any implementation
 >: would surely avoid horrors like this.
 
[snip]
 
 >This code works, by the way, as far as I can tell.  

No, it doesn't.  Try it when "buf" consists entirely of spaces.

Then, try to debug a C program that contains this code, when "buf"
happens to be preceded in memory by a pointer whose value at that
point just happens to be 0x40001820 or something else that ends in
0x20.  At least in Ada, if you tried something this stupid, your
program would get an exception during testing.

                                -- Adam









^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-21  0:00                                                     ` Tim Behrendsen
@ 1996-08-21  0:00                                                       ` Pete Becker
  1996-08-22  0:00                                                         ` Szu-Wen Huang
  1996-08-21  0:00                                                       ` Matt Austern
                                                                         ` (2 subsequent siblings)
  3 siblings, 1 reply; 688+ messages in thread
From: Pete Becker @ 1996-08-21  0:00 UTC (permalink / raw)



In article <01bb8f1b$ce59c820$32ee6fce@timhome2>, tim@airshields.com says...
>
>> 
>> But big-oh notation is about prediction for large n.  You can not use
>> the notation to really predict the running time for a particular value,
>> only to estimate it; and your estimation may be way off.  If an
>> algorithm runs in N^2 + 10^100 N seconds, it is still O(N^2), although
>> you never will experience the quadratic behaviour of the algorithm.
>> (Actually, of course, you will never see the algorithm come to
>completion.)
>
>I have to admit, I have to side with Szu-Wen.  I've never really
>thought about this case for the O() notation, but it seems from
>a purely mathematical standpoint that O(f(n)) means that f(n)
>is *the* function.  If f(n) happens to be a non-continuous function,
>then so be it.
>
>If the running time is
>
>    kn for n < 1000
>    k(n^2) for n >= 1000
>
>then f(n) =
>    O(n) for n < 1000
>    O(n^2) for n >= 1000
>
>then the big-Oh function should be (pardon my syntax)
>
>    O( (n < 1000) ? n : n^2 )
>
>The latter is not only more useful, but absolutely accurate.
>AFAIK, there are no limits on the complexity of the Big-Oh
>function, and no requirement that it be a continuous
>function.

No. The point is that this notation talks about asymptotic behavior, not 
behavior for specific cases. If you want to talk about more complex notions, 
feel free, but don't use O() notation to talk about them, because you will 
confuse people when you misuse this notation. In particular, note that

f(x) = 100000*x + x*x

is linear for small values of x, and quadratic for large values. O(f(x)) is 
O(x^2), however, because eventually the x^2 term dominates.
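
A quick check of where the crossover sits (simple arithmetic, added for
illustration):

    x = 10^3:  100000\,x = 10^8,     x^2 = 10^6       (linear term dominates)
    x = 10^5:  100000\,x = 10^{10},  x^2 = 10^{10}    (the two terms are equal)
    x = 10^7:  100000\,x = 10^{12},  x^2 = 10^{14}    (quadratic term dominates)

In particular, for x \ge 10^5 we have 100000\,x \le x^2 and therefore
f(x) \le 2\,x^2, which is exactly why f is O(x^2) even though small
inputs never see the quadratic behaviour.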





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-21  0:00                                                     ` Tim Behrendsen
@ 1996-08-21  0:00                                                       ` Dann Corbit
  1996-08-22  0:00                                                       ` Richard A. O'Keefe
  1 sibling, 0 replies; 688+ messages in thread
From: Dann Corbit @ 1996-08-21  0:00 UTC (permalink / raw)





Tim Behrendsen <tim@airshields.com> wrote in article
<01bb8f6f$46bc2da0$87ee6fce@timpent.airshields.com>...
{snip}
> Well, but look at your own terminology.  You describe quicksort
> as "worst case O(n*n)" and (paraphrase) "average case O(n*log(n))".
> When the Big-Oh is given for a particular algorithm without
> any qualification, it is usually assumed to be the average
> behavior over the data set.  The point is not that O() makes
> absolute time predictions, but that it gives *behavior*
> predictions.
> 
True, but only in the limiting case.  An algorithm
that is O(n) has **some** line, with finite slope and
intercept, that lies above all time values for the
function.  And an O(n*n) algorithm has **some**
parabola that lies above the function for all values
of n.   It is, of course, possible for linear algorithms
to be unusable if the slope is nearly infinite, or
the intercept is nearly infinite, or for exponential
algorithms or even factorial algorithms to be
usable if there is a constant multiplier small enough
to pull the curve down into the usable range for
a useful domain.

In short, O(f(n)) does not tell the whole story, even
when we know what f(n) is.  It's still a good idea to
benchmark the algorithm for your domain values
against competing algorithms, unless someone
has already done it for you.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-12  0:00         ` Patrick Horgan
                             ` (4 preceding siblings ...)
  1996-08-20  0:00           ` Darin Johnson
@ 1996-08-21  0:00           ` Darin Johnson
  1996-08-22  0:00           ` What's the best language to learn? [any language except Ada] Jon S Anthony
                             ` (2 subsequent siblings)
  8 siblings, 0 replies; 688+ messages in thread
From: Darin Johnson @ 1996-08-21  0:00 UTC (permalink / raw)



mdw@excessus.demon.co.uk (Mark Wooding) writes:
> 	while (buf[strlen(buf)-1]==' ')
> 	  buf[strlen(buf)-1]=0
> 
> I can't believe that anyone with an understanding of what goes on `under
> the covers' would possibly write anything like this without feeling ill.

Actually, when I first started my current job, most of the code looked
a bit like that.  The author is an old MVS programmer and likes C
(when compared to assembler which is what MVS is usually programmed
in), but hasn't used C that long.  Here's a real sample (indentation
preserved).

        static char ablank[] = " ";
    while(0 !=( strncmp(buffp,ablank,1)) & 0 !=( strncmp(buffp,"\n",1)))
              {
               strncpy(destp,buffp,1);
               buffp +=1;
               destp +=1;
              }
         strncpy(destp,"\0",1);
-- 
Darin Johnson
djohnson@ucsd.edu	O-
	"Look here.  There's a crop circle in my ficus!"  -- The Tick




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-20  0:00                                                 ` Szu-Wen Huang
  1996-08-20  0:00                                                   ` Dann Corbit
@ 1996-08-21  0:00                                                   ` Dik T. Winter
  1996-08-21  0:00                                                     ` Tim Behrendsen
  1996-08-22  0:00                                                     ` Tanmoy Bhattacharya
  1 sibling, 2 replies; 688+ messages in thread
From: Dik T. Winter @ 1996-08-21  0:00 UTC (permalink / raw)



In article <4vd05f$rt5@news1.mnsinc.com> huang@mnsinc.com (Szu-Wen Huang) writes:
 > : For example, suppose the behavior of an algorithm is
 > 
 > :   for n up to 1000, time is 1 second
 > 
 > "time is *n* seconds", I presume.  Otherwise this would be O(1).

Nope.  Robert Dewar wrote 1 second and intended 1 second; you can not
conclude big-oh behaviour from a finite number of samples.
 > 
 > :   for n greater than 1000, time is 1000*n seconds
 > 
 > : that's clearly O(N), but the time for 2000 items will be 2_000_000
 > : seconds.
 > [snip]
 > 
 > I disagree.

You may disagree, but that is what the definition is!  Big-oh notation
is about asymptotic behaviour, i.e. what is expected to happen for n
very large.

 >              I expect a linear algorithm to display linear behavior
 > unless otherwise stated.

It will, for large enough n.  And "enough" is explicitly not specified.

 >                           What you cite is a case where it needs to
 > be explicitly stated, because calling that algorithm "O(n)" is next
 > to useless in predicting its behavior without knowing this peculiar
 > behavior at n=1,000.

But big-oh notation is about prediction for large n.  You can not use
the notation to really predict the running time for a particular value,
only to estimate it; and your estimation may be way off.  If an
algorithm runs in N^2 + 10^100 N seconds, it is still O(N^2), although
you never will experience the quadratic behaviour of the algorithm.
(Actually, of course, you will never see the algorithm come to completion.)
-- 
dik t. winter, cwi, kruislaan 413, 1098 sj  amsterdam, nederland, +31205924098
home: bovenover 215, 1025 jn  amsterdam, nederland; http://www.cwi.nl/~dik/




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-21  0:00                                                   ` Tim Behrendsen
@ 1996-08-22  0:00                                                     ` Mike Rubenstein
  1996-08-22  0:00                                                     ` Robert Dewar
  1 sibling, 0 replies; 688+ messages in thread
From: Mike Rubenstein @ 1996-08-22  0:00 UTC (permalink / raw)



"Tim Behrendsen" <tim@airshields.com> wrote:

> Tanmoy Bhattacharya <tanmoy@qcd.lanl.gov> wrote in article
> <TANMOY.96Aug21083507@qcd.lanl.gov>...
> > In article <01bb8f1b$ce59c820$32ee6fce@timhome2>
> > "Tim Behrendsen" <tim@airshields.com> writes:
> > <snip>
> > B: I have to admit, I have to side with Szu-Wen.  I've never really
> > B: thought about this case for the O() notation, but it seems from
> > B: a purely mathematical standpoint that O(f(n)) means that f(n)
> > B: is *the* function.  If f(n) happens to be a non-continuous function,
> > 
> > It is not a question of convenience or usefulness: it is a question of
> > definitions. As almost universally defined, O(n^2+n) and
> > O(n^2) mean exactly the same, as does
> > 
> > <snip>
> > B: 
> > B:     O( (n < 1000) ? n : n^2 )
> > B: 
> > 
> > and hence when you write it instead of the more easily read O(n^2), it
> > tells people that you do not know the meaning of O().
> 
> Perhaps.  My understanding of O() notation is that it does
> not give a prediction of running time, it gives a rough
> approximation of behavior based on data set size.
> 
> IOW, O(n*n) has quadratic behavior relative to O(n), but the
> O() function does not imply an absolute k*n*n running time.
> 
> > B: The latter is not only more useful, but absolutely accurate.
> > 
> > Which is irrelevant. Use a different notation if you want to invent a
> > new concept: a concept that is more useful in what you want to do. 
> 
> Well, *somebody* has to fix the concepts that may be dumbly
> defined! :)
>  
> > B: AFAIK, there are no limits on the complexity of the Big-Oh
> > B: function, and no requirement that it be a continuous
> > B: function.
> > 
> > No, but a complex Big-Oh function has *exactly* the same meaning as a
> > simpler one in most cases. Calling something a O(f(n)) function means
> > saying that (This may not be the technical definition: but this
> > expresses the meaning commonly used):
> > 
> >   1) there exists k and N depending on k such that for all n > N, the
> >      function in question is bounded in absolute value by k |f(n)|. 
> >   2) The minimum such k is non-zero. (This is to prevent an O(n)
> >      function also being classified as an O(nlogn) for example).
> > 
> > The notation works quite well in practice: generally what happens is
> > that for very small n (for n < N in the definition), the `main loop'
> > is not dominant: there is other stuff like initialization which
> > dominates the time taken. The big O notation says that we are
> > interested in the behaviour of the function when such subdominant
> > pieces are unimportant. In fact, such a division is important because
> > there is no reason why the ratio of time taken in the loop to the time
> > taken outside it should be independent of the machine in use: if you
> > wanted big O notation to be a property of the algorithm and not the
> > hardware (as long as we stick to a reasonable architecture), you
> > cannot include subdominant terms in it. Incidentally, the
> > constant k simply means that it is not important what the units of
> > measurement are, again serves to make the notation independent of the
> > precise hardware on which it is being run.
> > 
> > However, there are pathological cases: one such pathological case is
> > when the initialization (or something similar) takes a large amount of
> > time. Thus for example, we can think of an algorithm that takes
> > 0.00001 n (main loop) + 1000 (initialization) amount of time. Now, for
> > n >> 100000000, the initialization is unimportant: and big O notation
> > will latch on to that case and call it an O(n) problem. However, in
> > practice, all that you will ever see for a reasonable n is the
> > initialization time, and hence, in practice, it will look much like an
> > O(1) problem.
> > 
> > Similarly, sometimes the time taken to really do a problem (say adding
> > two numbers) is proportional to the number of digits in the number:
> > i.e. to log n. However, on most hardware, the time taken to do this is
> > a constant, albeit with a limit to the size of numbers it can add. In
> > such cases, again, the observed complexity will be O(1), whereas
> > truly, the complexity is O(logn).
> 
> Agreed; the O() function is meant to be used in context, not
> as an absolute statement of algorithm efficiency.
> 
> > But, these caveats about order O are the penalty we pay for using a
> > useful concept. The definition you called more useful, most will find
> > *less* useful: to use such a concept for a sorting routine, one would
> > need to know the relative time for a swap and compare on the
> > particular hardware for example. Giving up the machine independence in
> > a concept designed to measure abstract algorithmic efficiency is a
> > stiff price to pay for a perceived usefulness.
> 
> By "useful", I mean conveying information about the algorithm.  I
> agree with you that in practice the less significant terms may
> be left off the function, i.e. O(n*n + n) is effectively equivalent
> to O(n*n).  But the particular case that started all this was the
> non-continuous case, where one range of (n) produces O(n)
> behavior, and another range produces O(n*n).  In practice, this
> would normally be described exactly as I have just done, but I
> don't think it's either accurate or complete to only declare
> the algorithm O(n*n) if both ranges are possible input data.
> 
> If the definition of O() notation calls for the linear case to
> simply be ignored, then the definition should be changed,
> because it is simply not accurate.  Note I'm not expecting
> precision, I'm expecting accuracy.

What do you mean, "if"?  The definition of O() notation is a given.
It's been around much too long for us to consider changing it just
because you don't like it.  O(n^2) and O((n < 1000) ? n : n^2) mean
exactly the same thing.
 
> > Most people instead choose to give you the O() of the algorithm with a
> > statement pointing out the caveats. I have seen statements that
> > `addition is O(1) with the caveat that if the numbers really became
> > big, it would become O(logn)'. Such statements are necessarily
> > mathematically imprecise, but we know what they mean. And, if, someone
> > were to question me as to what the precise meaning is, I could of
> > course phrase it trivially: It would be a fomalization of `what I mean
> > is that if we enlarge the turing machine with an oracle that can add
> > arbitrarily large numbers in one move, the algorithm would become O(1)
> > instead of O(n)'. 
> 
> The reason we can safely ignore the O(n) case is that the large
> number case is not part of our input domain.  If it was, then
> we would necessarily need to have a non-continuous O() function.

Huh?  This doesn't make any sense at all.  As long as we are talking
about integers and the standard definition of continuity, all
functions are continuous.
 
> > Calling something O(n^2+n) is still an abomination!
> 
> Agreed.

It is exactly the same abomination as calling something O((n < 1000) ?
n : n^2).  Both mean that the writer was too ignorant or too lazy to
reduce it to the equivalent O(n^2).


Michael M Rubenstein




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-21  0:00                                                       ` Tanmoy Bhattacharya
@ 1996-08-22  0:00                                                         ` Mike Rubenstein
  1996-08-22  0:00                                                         ` Dann Corbit
  1 sibling, 0 replies; 688+ messages in thread
From: Mike Rubenstein @ 1996-08-22  0:00 UTC (permalink / raw)



tanmoy@qcd.lanl.gov (Tanmoy Bhattacharya) wrote:

> In article <4vfk6b$i6h@krusty.irvine.com>
> adam@irvine.com (Adam Beneschan) writes:
> <snip>
> AB:  >No, but a complex Big-Oh function has *exactly* the same meaning as a
> AB:  >simpler one in most cases. Calling something a O(f(n)) function means
> AB:  >saying that (This may not be the technical definition: but this
> AB:  >expresses the meaning commonly used):
> AB:  >
> AB:  >  1) there exists k and N depending on k such that for all n > N, the
> AB:  >     function in question is bounded in absolute value by k |f(n)|. 
> AB:  >  2) The minimum such k is non-zero. (This is to prevent an O(n)
> AB:  >     function also being classified as an O(nlogn) for example).
> AB: 
> AB: I've never seen a definition of O(f(n)) that includes anything like
> AB: (2).  In the mathematical definitions I've seen, O(f(n)) provides only
> AB: an upper bound on the value expressed by O-notation, not a lower
> AB: bound.  Under these definitions, it is perfectly correct (although not
> AB: too useful) to say that a linear algorithm is O(n**2) or O(n**3).
> 
> Correct. Number (2) actually exists only to get it closer to common
> usage without giving up the asymptotic nature of the definition.

FYI, some authors (e.g., Hardy and Wright in "An Introduction to the
Theory of Numbers") use a special notation that, unfortunately, does
not render well in ASCII to say that two functions have the same order
of magnitude (an equal sign with the bars curved and the convex sides
facing each other).  f and g have the same order of magnitude if there
are positive constants A and B such that Af < g < Bf for all arguments
in question.


Michael M Rubenstein




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-08-22  0:00                         ` Richard A. O'Keefe
@ 1996-08-22  0:00                           ` Mike Rubenstein
  0 siblings, 0 replies; 688+ messages in thread
From: Mike Rubenstein @ 1996-08-22  0:00 UTC (permalink / raw)



ok@goanna.cs.rmit.edu.au (Richard A. O'Keefe) wrote:

> miker3@ix.netcom.com (Mike Rubenstein) writes:
> >No.  The assumption that O(N^2) performance is unlikely assumes that
> >quicksort is properly implemented.  The distribution of the data has
> >nothing to do with it.
> 
> The point I was trying to make is that the allegedly "unlikely"
> behaviour is already *known* to have occurred much more often than
> the theorems you see in data structures books would have you
> believe.  Given any non-randomising version of quicksort, I can
> construct a pessimal input for that version.  If the distribution
> of the input I supply is such that pessimal problems occur with
> probability 1, I will get so-called "unlikely" behaviour with
> probability 1.  So *in the absence of built-in randomisation*
> the distribution has a lot to do with it.
> 
> Again, some of the proofs of the unlikelihood of bad behaviour
> tacitly assume that all elements are distinct.  Even randomisation
> won't help unless you use a fat pivot or some other technique that
> handles repeated elements well.
> 
> So if Mike Rubenstein means by "properly implemented" a quicksort
> using randomisation and a fat pivot, then I am forced to agree with
> him.  On the other hand, while the quicksort I use (the "Engineered"
> one by Bentley & McIlroy) uses a fat pivot, it doesn't use randomisation,
> so for a recent version of quicksort designed and tested by experts
> and published in a refereed journal, which gets impressive performance
> in practice (it is the first quicksort I've ever come across which can
> routinely compete with a well implemented merge sort), it is possible
> to find an input distribution for which "Engineered" qsort will behave
> badly with probability 1.  So either Mike Rubenstein is wrong, or
> Bentley & McIlroy's qsort is not "properly implemented".

I certainly mean that a properly implemented quicksort (or any other
algorithm) is one that is also properly applied and is implemented
with due consideration to the problem at hand.  For a general purpose
quicksort, this certainly means randomization and a fat pivot (or some
other technique for handling multiple elements).
 
> >> The likelihood of quicksort doing badly on *your* data depends on the
> >> actual probability distribution of *your* data.
> 
> >Again, not if quicksort is implemented properly.
> 
> Which either means "using randomisation and a fat pivot" or is false.
> 
> >There is no need to permute the array.  Just follow Hoare's suggestion
> >for avoiding O(N^2) performance.
> 
> >Hoare suggested choosing the pivot randomly.  This has the same effect
> >on overall performance as permuting the data.
> 
> Permuting the whole array is simpler and faster.  There is a phase where
> the random number generator is in the cache and the sorting code is not,
> then there is a phase when the sorting code is in the cache and the
> random number generator is not.  What's more, depending on the random
> number generator, it may be possible to vectorise the random number
> generation and fuse that loop with the permutation loop.  The separated
> approach *looks* as though it ought to be easier to make fast.  (For
> example, if using AS183, the three integers it maintains can be kept in
> registers during the permutation phase instead of being repeatedly
> stored back into memory and reloaded.)

What nonsense.  If you mean the instruction cache, it's unlikely that
the code for quicksort will fit in the cache.  If you mean high speed
memory (e.g., the typical memory cache on an 80x86 or pentium), both
will almost certainly fit.

Using a random pivot requires no movement of the data.  Rearranging
the data requires moving the data or constructing an array of pointers
or indices that will slow down the comparison.
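
A minimal sketch of Hoare's suggestion as described above (the helper
name is made up, and rand() is used only for illustration; its slight
modulo bias and weak randomness may matter in adversarial settings):

    #include <stdlib.h>

    /* Pick a pivot position in [lo, hi] at random and move it to a[lo],
       so an ordinary partition step can use a[lo] as the pivot.  Only
       one swap is done per partition; the rest of the data stays put. */
    static void choose_random_pivot(int a[], size_t lo, size_t hi)
    {
        size_t p = lo + (size_t)rand() % (hi - lo + 1);
        int t = a[lo];
        a[lo] = a[p];
        a[p] = t;
    }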
 
> There is also the practical point that if you want to use someone else's
> quicksort and don't fully trust it, you can permute the array yourself
> before calling qsort, without being able to modify the qsort code.

If you are using untrusted code when timing is critical, you deserve
everything that happens to you.

If timing is really critical, functions like C's qsort() are
unsuitable anyways since they rely on inefficient comparison and,
possibly, swapping code.

I've never seen a general sort program that is always suitable.
 
> >Choosing the median of the first, middle, and last elements for the
> >pivot also works well in practice.
> 
> Ahem.  This is the approach that has yielded bad behaviour IN PRACTICE.
> 
> >In particular, it will prevent
> >O(N^2) performance if the data is nearly in order or reverse order.
> 
> Only if (a) repeated elements are rare or (b) you use a fat pivot.

Actually, repeated elements don't have to be rare.  What matters is
whether a few distinct values account for a large share of the elements.
For example, if every element appears twice, repeated elements are
common, but performance is still O(n log n).  If there are likely to be
items that are repeated many
times, a fat pivot should certainly be used.  My experience is that
this is not a very common situation, but your experience may differ.
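
For reference, a "fat pivot" here means a three-way partition that
collects every key equal to the pivot into a middle band that is never
sorted again.  A minimal sketch (Dutch-national-flag style; this is not
Bentley & McIlroy's code, and the names are made up):

    /* Partition a[lo..hi] around the value v so that, on return,
       a[lo..*lt-1] < v,  a[*lt..*gt] == v,  a[*gt+1..hi] > v.
       Runs of duplicate keys all land in the middle band, which is why
       a fat-pivot quicksort stays O(n log n) on inputs with many
       repeated elements. */
    static void partition3(int a[], long lo, long hi, int v,
                           long *lt, long *gt)
    {
        long i = lo, lo_eq = lo, hi_eq = hi;
        while (i <= hi_eq) {
            if (a[i] < v) {
                int t = a[i]; a[i] = a[lo_eq]; a[lo_eq] = t;
                lo_eq++; i++;
            } else if (a[i] > v) {
                int t = a[i]; a[i] = a[hi_eq]; a[hi_eq] = t;
                hi_eq--;
            } else {
                i++;
            }
        }
        *lt = lo_eq;
        *gt = hi_eq;
    }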
 
> The only ways to make bad runtime cost unlikely for any algorithm are
> (a) use an algorithm for which bad cost can *never* happen
> (b) randomise so that you *know* what the distribution is.

More nonsense.  In practice one often knows a great deal about the
input.  Randomising may be the worst thing to do.  It's much better to
think than to blindly follow rules.

Elsewhere, I've mentioned a problem I ran into of doing a lot of sorts
on almost-in-order data.  Insertion sort, despite its terrible known
worst and average case performance, works quite well in such a
situation.  Randomizing and using someone else's quicksort would almost
certainly have been disastrous.
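
For reference, a minimal sketch of the straight insertion sort being
described (illustrative code, with int keys assumed): on almost-in-order
input the inner loop exits almost immediately, which is why the quadratic
worst case rarely bites here.

    /* Straight insertion sort: O(n^2) worst case, close to O(n) on
       nearly sorted input. */
    void insertion_sort(int *v, int n)
    {
        int i, j, key;

        for (i = 1; i < n; i++) {
            key = v[i];
            for (j = i - 1; j >= 0 && v[j] > key; j--)
                v[j + 1] = v[j];          /* shift larger elements right */
            v[j + 1] = key;
        }
    }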


Michael M Rubenstein




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: (topic change on) Teaching sorts
  1996-08-22  0:00                                                     ` Christian Bau
  1996-08-22  0:00                                                       ` Larry Kilgallen
@ 1996-08-22  0:00                                                       ` Marcus H. Mendenhall
  1996-08-27  0:00                                                         ` Ralph Silverman
  1996-08-23  0:00                                                       ` Teaching sorts [was Re: What's the best language to start with?] Andrew Koenig
  2 siblings, 1 reply; 688+ messages in thread
From: Marcus H. Mendenhall @ 1996-08-22  0:00 UTC (permalink / raw)
  To: Christian Bau


Christian Bau wrote:
-> On a real computer (PowerMac, no virtual memory, no background
-> processes, nothing that would interfere with execution time), the
-> _number of instructions per second_ did reproducibly vary by a factor
-> of up to _seven_ when going from n to n+1 (for example, case n = 128
-> took seven times longer than cases n = 127 and n = 129). So for this
-> computer, and this

Isn't caching fun?  I have observed many bizarre effects on the 
PowerMacs when one is doing work which involves thrashing memory (FFTs, 
matrix multiplies, etc.).

In effect, one can usually assume that the total number of CPU cycles 
actually used for floating point arithmetic in these cases is 0.  
Counting real memory hits due to cache reloads gives a much more 
accurate measure of time.

In the case of testing your matrix multiply, you could use a trick I did 
to investigate timing for FFTs: I took out all pointer increments from 
the loop, so that the algorithm proceeded as usual, but carried out all 
its operations on the same few bytes of memory.  It yields nonsense for 
the result, but gives an idea of how many CPU cycles are spent on 
everything except fetching.  The speed increase is sometimes quite 
shocking (more than a factor of 10).
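
As an illustration of that trick (hypothetical code, not the original FFT
experiment): the second loop does the same arithmetic as the first but
touches only the same few bytes, so timing it shows roughly the arithmetic
plus loop overhead with the fetch traffic removed.

    /* Normal loop: walks through memory, so cache misses are included. */
    double dot_walk(const double *a, const double *b, int n)
    {
        double s = 0.0;
        int i;
        for (i = 0; i < n; i++)
            s += a[i] * b[i];
        return s;
    }

    /* Same operations on the same few bytes; the result is nonsense,
       but the time is (roughly) arithmetic-only.  volatile keeps the
       compiler from hoisting the loads out of the loop. */
    double dot_fixed(const volatile double *a, const volatile double *b, int n)
    {
        double s = 0.0;
        int i;
        for (i = 0; i < n; i++)
            s += a[0] * b[0];
        return s;
    }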

In your case, with the problem at 128 elements, I suspect this was 
because of the way the PowerPC chips (some of them at least) choose 
which cache line to fill with new data, and the 1024-byte offset between 
successive data points probably meant that each fetch required a 
complete cache line reload.
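
To make the stride concrete (an assumed row-major layout, purely for
illustration): in a 128 x 128 array of doubles, walking down a column
touches addresses exactly 1024 bytes apart, so each access can map to the
same small set of cache lines and force a reload.

    /* Illustrative only: column-wise walk over a row-major array. */
    double column_sum(void)
    {
        static double a[128][128];   /* a[i][j] and a[i+1][j] are     */
        double sum = 0.0;            /* 128 * sizeof(double) = 1024   */
        int i, j;                    /* bytes apart                   */

        for (j = 0; j < 128; j++)
            for (i = 0; i < 128; i++)
                sum += a[i][j];      /* 1024-byte stride per access */
        return sum;
    }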

Marcus Mendenhall




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-19  0:00                                               ` Robert Dewar
@ 1996-08-22  0:00                                                 ` Stephen Baynes
  1996-08-27  0:00                                                 ` Richard A. O'Keefe
  1 sibling, 0 replies; 688+ messages in thread
From: Stephen Baynes @ 1996-08-22  0:00 UTC (permalink / raw)


[-- Warning: decoded text below may be mangled, UTF-8 assumed --]
[-- Attachment #1: Type: text/plain, Size: 2674 bytes --]


Robert Dewar (dewar@cs.nyu.edu) wrote:
: (slightly corrected version, I noticed a minor typo in the first version,
: which I canceled).

: Richard O'Keefe says

:   "Now the funny thing here is that Robert Dewar wrote
:   >you will almost certainly find you require more overhead
:   [for insertion sort than bubble sort]
:   >because of the two nested loops.
:   But both versions of bubble sort have two nested loops as well!"

: You missed the point. In the bubble sort, the outer loop is executed only
: once for a sorted list. For your insertion sort, the outer loop runs a full
: number of times. Your insertion sort has 2*N loop termination tests, while
: a properly written bubble sort has N loop termination tests.

But on a sorted list the bubble sort's inner loop also involves an extra
comparison test, which for the insertion sort is part of the second
termination test. So it is more complex than you make out. Does it cost more
to compare two elements than to test a loop counter? For many data items it
is much more. (Consider for example the speed of strcmp vs '>' on ints.)

If you need to do a sort at all then at least some of the time you are trying
to sort an unsorted list. Unless this is very rare, the cost of sorting an
already sorted list is not a very good guide to actual behaviour in practice.

Once you get one element out of place you have to take into account the cost
of moving elements too. This could be expensive, and bubble sorts can do a lot
more moving than insertion sorts (depending on the coding). So to decide on the
best sort you may have to benchmark several sorts and several codings and
decide how to trade off speeding up special cases vs code complexity.

: I do not know any way to write the insertion sort so that it has only N
: loop termination tests, certainly your example has 2*N tests.

: Here is a simple example of bubble sort the way I would code it

:    procedure Sort (N : Positive; Exch : Exch_Procedure; Lt : Lt_Function) is
:       Switched : Boolean;
:    begin
:       loop
:          Switched := False;

:          for J in 1 .. N - 1 loop

Note that for each iteration of the outer loop at least one element will
have bubbled to the correct position, so you can decrease the range of
the inner loop by one for each iteration of the outer loop. This makes
the code more complex but may double the speed on unsorted input data.
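
Putting the two points together (the Switched early exit from the code
quoted above plus the shrinking inner range), a sketch in C, with plain
int keys assumed in place of the generic Exch/Lt parameters:

    /* Bubble sort with early exit and a range that shrinks each pass. */
    void bubble_sort(int *v, int n)
    {
        int limit, j, t, switched = 1;

        for (limit = n - 1; switched && limit > 0; limit--) {
            switched = 0;
            for (j = 0; j < limit; j++) {
                if (v[j + 1] < v[j]) {            /* out of order: swap */
                    t = v[j]; v[j] = v[j + 1]; v[j + 1] = t;
                    switched = 1;
                }
            }
            /* v[limit..n-1] is now in its final position. */
        }
    }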

--
Stephen Baynes                              baynes@ukpsshp1.serigate.philips.nl
Philips Semiconductors Ltd
Southampton                                 My views are my own.
United Kingdom
 Are you using ISO8859-1? Do you see © as copyright, ÷ as division and ½ as 1/2?




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-08-21  0:00                 ` James Youngman
@ 1996-08-22  0:00                   ` TRAN PHAN ANH
  1996-08-22  0:00                     ` Dr E. Buxbaum
  0 siblings, 1 reply; 688+ messages in thread
From: TRAN PHAN ANH @ 1996-08-22  0:00 UTC (permalink / raw)



Just a side comment: by the time this thread and its offspring end, the 
original author will have learned both C and Pascal. :-)

Anh

In article <4veqrn$9qi@halon.vggas.com>, JYoungman@vggas.com (James Youngman) writes:
> In article <4vblm8$df9@electra.saaf.se>, pausch@electra.saaf.se says...
> 
>>The performane of Quicksort on already sorted, or almost-sorted, data
>>can be dramatically improved by first scrambling the data(!) somewhat.
> 
> It all depends exactly on the Quicksort implementation chosen.  For example, 
> the implementation in "The Standard C Library" by PJ Plauger runs fastest on 
> fully-sorted input.
> 
> -- 
> James Youngman                               VG Gas Analysis Systems
> The trouble with the rat-race is, even if you win, you're still a rat.
> 




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-22  0:00                                     ` Bengt Richter
@ 1996-08-22  0:00                                       ` Frank Manning
  1996-08-31  0:00                                         ` Bengt Richter
  1996-08-22  0:00                                       ` Tim Behrendsen
  1 sibling, 1 reply; 688+ messages in thread
From: Frank Manning @ 1996-08-22  0:00 UTC (permalink / raw)



In article <4vgs4p$evl@news.accessone.com> bokr@accessone.com (Bengt
Richter) writes:

> What if computers are totally irrelevant? 

When you give students a programming assignment, are the students
required to run the program on actual, physical, real-life computers?
If so, that presents certain...ah...practical problems.

Should the students know how to turn on the computer? Do you expect
them to know where the on/off switch is? Suppose they flip the switch
and nothing happens? Are they expected to know that the computer is
supposed to get electrical power somewhere -- is it plugged into a
wall socket, or does it have internal batteries, or what? What if the
fuse is blown? Are they supposed to sit there helplessly?

>         "'... it is important not to lose sight of the fact that
> there is a difference between training and education. If computer
> science is a fundamental discipline, then university education in
> this field should emphasize enduring fundamental principles rather
> than transient current technology.'
>         Peter Wegner, Three Computer Cultures"

Ah, yes. The eternal training-vs-education dilemma.

I totally agree that university education should emphasize fundamentals,
but not to the exclusion of hands-on experience. What happens if you
want to do research, especially if experiments are required? Who's going
to run the experiments? Technicians? You can't get technicians to do
everything. At large research universities, for example, students are
relied on for quite a lot.

And if the students know only abstract theory, how are professors going
to explore new theories if the students are incapable of running
experiments, as the students most certainly will be if they don't know
the details of how the hardware actually works? There are countless ways
an experiment can go wrong or give misleading results. It's difficult to
prevent those problems without a thorough understanding of hardware
details.

-- Frank Manning
-- Chair, AIAA-Tucson Section




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-22  0:00                                                     ` Bengt Richter
@ 1996-08-22  0:00                                                       ` Tim Behrendsen
  1996-08-31  0:00                                                         ` Bengt Richter
  0 siblings, 1 reply; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-22  0:00 UTC (permalink / raw)



Bengt Richter <bokr@accessone.com> wrote in article
<4vgs4j$evl@news.accessone.com>...
> "Tim Behrendsen" <tim@airshields.com> wrote:
> [...]
> >Wait, hold the phone!  "Sorts the data without moving it"?  What,
> >is APL's sorting algorithm O(1)?  Yes, it may not actually get
> >sorted until it gets printed, but that's irrelevant to the fact
> >that it eventually gets sorted.
> 	Does that mean that your concept of the essential aspect of
> sorting is putting the data into sort order, rather than establishing
> the order itself? The respective timings say otherwise to me ;-)

I'm not sure what you're trying to say, but if I have a data set,
and I set a bit that says "this data set has the property of
being ordered", then technically I have an ordered data set in
O(1) time.  Now, if I do an add reduction (terminology?), the
sorting doesn't actually have to be done, and I've saved some
CPU cycles.

But all that's not the point.  If my point is to take a vector as
input, order it, and then display it on the screen, the vector
will be sorted.  And sorting something requires moving it into
a sorted order.

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-08-20  0:00                       ` Mike Rubenstein
@ 1996-08-22  0:00                         ` Richard A. O'Keefe
  1996-08-22  0:00                           ` Mike Rubenstein
  0 siblings, 1 reply; 688+ messages in thread
From: Richard A. O'Keefe @ 1996-08-22  0:00 UTC (permalink / raw)



miker3@ix.netcom.com (Mike Rubenstein) writes:
>No.  The assumption that O(N^2) performance is unlikely assumes that
>quicksort is properly implemented.  The distribution of the data has
>nothing to do with it.

The point I was trying to make is that the allegedly "unlikely"
behaviour is already *known* to have occurred much more often than
the theorems you see in data structures books would have you
believe.  Given any non-randomising version of quicksort, I can
construct a pessimal input for that version.  If the distribution
of the input I supply is such that pessimal problems occur with
probability 1, I will get so-called "unlikely" behaviour with
probability 1.  So *in the absence of built-in randomisation*
the distribution has a lot to do with it.

Again, some of the proofs of the unlikelihood of bad behaviour
tacitly assume that all elements are distinct.  Even randomisation
won't help unless you use a fat pivot or some other technique that
handles repeated elements well.

So if Mike Rubenstein means by "properly implemented" a quicksort
using randomisation and a fat pivot, then I am forced to agree with
him.  On the other hand, while the quicksort I use (the "Engineered"
one by Bentley & McIlroy) uses a fat pivot, it doesn't use randomisation,
so for a recent version of quicksort designed and tested by experts
and published in a refereed journal, which gets impressive performance
in practice (it is the first quicksort I've ever come across which can
routinely compete with a well implemented merge sort), it is possible
to find an input distribution for which "Engineered" qsort will behave
badly with probability 1.  So either Mike Rubenstein is wrong, or
Bentley & McIlroy's qsort is not "properly implemented".

>> The likelihood of quicksort doing badly on *your* data depends on the
>> actual probability distribution of *your* data.

>Again, not if quicksort is implemented properly.

Which either means "using randomisation and a fat pivot" or is false.

>There is no need to permute the array.  Just follow Hoare's suggestion
>for avoiding O(N^2) performance.

>Hoare suggested choosing the pivot randomly.  This has the same effect
>on overall performance as permuting the data.

Permuting the whole array is simpler and faster.  There is a phase where
the random number generator is in the cache and the sorting code is not,
then there is a phase when the sorting code is in the cache and the
random number generator is not.  What's more, depending on the random
number generator, it may be possible to vectorise the random number
generation and fuse that loop with the permutation loop.  The separated
approach *looks* as though it ought to be easier to make fast.  (For
example, if using AS183, the three integers it maintains can be kept in
registers during the permutation phase instead of being repeatedly
stored back into memory and reloaded.)

There is also the practical point that if you want to use someone else's
quicksort and don't fully trust it, you can permute the array yourself
before calling qsort, without being able to modify the qsort code.
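
A minimal sketch of that pre-permutation step, assuming C's rand() as the
random source and int elements (illustrative only):

    #include <stdlib.h>

    /* Randomly permute v[0..n-1] (Fisher-Yates) before handing it to qsort. */
    void shuffle(int *v, size_t n)
    {
        size_t i, j;
        int t;

        for (i = n; i > 1; i--) {
            j = (size_t) rand() % i;              /* 0 <= j <= i-1 */
            t = v[i - 1]; v[i - 1] = v[j]; v[j] = t;
        }
    }

One would then call qsort(v, n, sizeof *v, cmp) as usual; the shuffle needs
no access to, or trust in, the qsort implementation itself.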

>Choosing the median of the first, middle, and last elements for the
>pivot also works well in practice.

Ahem.  This is the approach that has yielded bad behaviour IN PRACTICE.

>In particular, it will prevent
>O(N^2) performance if the data is nearly in order or reverse order.

Only if (a) repeated elements are rare or (b) you use a fat pivot.

The only ways to make bad runtime cost unlikely for any algorithm are
(a) use an algorithm for which bad cost can *never* happen
(b) randomise so that you *know* what the distribution is.

-- 
Australian citizen since 14 August 1996.  *Now* I can vote the xxxs out!
Richard A. O'Keefe; http://www.cs.rmit.edu.au/~ok; RMIT Comp.Sci.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-22  0:00                                     ` Bengt Richter
  1996-08-22  0:00                                       ` Frank Manning
@ 1996-08-22  0:00                                       ` Tim Behrendsen
  1996-08-23  0:00                                         ` Larry J. Elmore
  1 sibling, 1 reply; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-22  0:00 UTC (permalink / raw)



Bengt Richter <bokr@accessone.com> wrote in article
<4vgs4p$evl@news.accessone.com>...
> "Tim Behrendsen" <tim@airshields.com> wrote:
> 
> >Lawrence Kirby <fred@genesis.demon.co.uk> wrote in article
> ><840288278snz@genesis.demon.co.uk>...
> >> 
> >> They might have a good feel for how to implement quicksort in that
> >> particular machine code however it would be so wrapped up in
> >> implementation details that they wouldn't have a good understanding of
>> the algorithm itself, at least not in the same timescale.
> 
> >And they don't get wrapped up in implementation
> >details if they write it in C?  My primary point is
> >when it's implemented in assembly, and you have to
> >manually move the bytes from one location to another
> >rather than have the compiler "carry them for you",
> >you get a better view feel for not only what the
> >algorithm is doing, but what the computer itself is
> >doing.
> 	What if computers are totally irrelevant? If
> an algorithm is an abstract procedure for manipulating
> symbols, or accomplishing some mathematical purpose?
> Then any computerized implementation will just be an
> instance of a broad class of implementation possibilities.
> And assembler would be just one of a number of possible
> intermediate code representations in translating to
> an executable representation for a particular platform/
> OS environment.

So what?  Remember who we're talking about.  We're talking
about Joe Schmoe off the street in CS 101.  Are we going
to hit him full-bore with the entire universe of possible
computer architectures?  You have to start somewhere, and
you have to try and convey the procedural nature of the
computer in the simplest possible terms.

Also, manipulating symbols or doing mathematics still amounts to
data transformations over time.

> 	If you are more fluent in assembler than in
> the native terms/notation of the original abstraction,
> then you may look at the assembler code and say, "Aha,
> that's what that meant," and be helped in understanding
> the original algorithm, ...if the latter was relatively
> trivial. But I don't think looking at assembler is a good
> way to understand system-level workings of things, even
> though it may help with a given primitive. Even C can
> be too low a level. How do you see object hierarchies in
> assembler? Or even C++ source code -- you need a way of
> seeing the woods as well as the trees. E.g., the important
> aspects of OpenGL are not the language it's been ported
> to, though you can be sure there are assembler-coded
> primitives supporting any given implementation. An annotated
> dataflow diagram supplemented with some equations etc. may
> be one of the best helps in arriving at understanding.
> A riveter sees the importance of rivets everywhere, but
> it's a limited view.

Indeed, all this is important ... later.  They are simply
not capable of understanding any of this completely until
they have grasped the procedural nature of the computer.

> >The most important thing a student can learn about
> >computers is the fundamentally simple nature.  The
> 	I think the "fundamentally simple nature" you
> are talking about is a simplified *view/abstraction* of an
> older-architecture computer. If you focus on small enough
> a detail, it becomes "simple," e.g., a bus transaction
> that moves a word from memory to cpu chip, but to keep it
> simple you may have to ignore caches, burst modes, split
> address/data transactions, multiple bus masters, etc. etc.
> not to mention multiple CPUs each with separate multiple
> execution units, pipelines, etc. Perhaps what is needed
> is a standard virtual machine for teaching purposes.

I agree.  I think I mentioned this a long time ago, but
it got lost in the noise.  I've suggested a 68000 or 6809
processor; nice orthogonal instruction set.  It's not that
I want everyone to learn assembler so they'll use it
everyday on the job, I want them to get a "feel" for
data movement/flow.

> >mystery of execution *must* be broken down in order for
> 	This mystery sounds suspiciously like step-step,
> i.e., a rule of sequential evaluation. I think a lot of these
> things benefit from being considered in the abstract (assuming
> one is comfortable in that domain). A nice book is "Anatomy
> of Lisp" by John Allen, (c) 1978 by McGraw-Hill, Inc.
> ISBN-0-07-001115-X. I don't think they'll mind if I quote a
> quotation the author put at the head of the preface:
> 
> 	"'... it is important not to lose sight of the fact that
> there is a difference between training and education. If computer
> science is a fundamental discipline, then university education in
> this field should emphasize enduring fundamental principles rather
> than transient current technology.'
> 	Peter Wegner, Three Computer Cultures"

EXACTLY!  I just think that we don't spend enough time on trying
to establish fundamental thinking skills, rather than
immediately jumping into the "grand abstractions".

> As true now as then, I think. A particular assembler for a particular
> CPU/OS environment is "transient current technology." A well-chosen
> VM might not be. An object-oriented implementation of an emulator
> for such a VM might be an interesting multi-perspective project...
> 
> >the student to begin to think like a programmer.
> Of Smalltalk/LISP/CLOS or C++/Delphi or assembler/Fortran/C? ;-)
> >Once they can think like a programmer, all the rest
> >of the knowledge they learn becomes trivial.
> A terrifying prospect!
> After 70k+- hrs of OJT as programmer, I still run into
> stuff that seems non-trivial to me, thank goodness.
> But then I have been trying to get beyond my assembler origins ;-)
> 
> BTW, what about thinking like a mathematician ;-) Or a
> problem-solver? Or an artist?

What about them?  Take an artist: thinking like an artist
means learning to see.  Reminds me of when I learned to draw.
I was a hopeless failure until I figured out that the problem
wasn't with my hands; it was with the way I physically
looked at the world.

If we're talking about programming, then we are talking
about thinking in terms of data.
 
> If you are teaching programming, how much "transient current
> technology" is it worth while to teach, projecting trends to
> graduation time? Just enough to use it for lab work (education),
> or in depth as case studies exemplifying the principles you
> are trying to teach, or focused on a job market (training)?
> I don't think there is a one-size-fits-all answer.

I would much rather have someone who can think
very well, with minimal knowledge, than someone who
has merely memorized a large amount of knowledge but
has no clue how to apply it.  The latter is by far the
norm, based on my attempts to hire new graduates.

> If you want students to know about *computers*, why not
> teach them a little logical design and electronics?

Good! Yes!

> So they have an idea of how data actually moves around
> in that minitower. Hm. Should we leave caches out of it?
> ISA vs VLB vs PCI? VME as example of non-synchronous
> alternatives? Do they need to know how to write a BIOS?
> What does the power-on reset do to all those chips on
> all those boards that it reaches? Oh, and how about the
> OS? After all, we don't usually run an assembler on the
> bare metal. So, to have a decent concept of what really
> is going on, don't they have to know how an OS boots up
> and gets to the point where it's able to run a shell?
> And how that shell gets its input and interacts with OS
> services to allocate memory and load and start execution
> of the assembler, and how that's going to get your code
> from a file and translate it to object form in another file.
> How about the linker and that whole story? Should we be
> satisfied with knowing how this works in DOS as OS? "Puh-leez!"
> It's maybe a little much to go into both Unix and WinXX
> (and system 7?), so what to choose?

All of that notwithstanding, you have to start somewhere.
Why pick anything?  Maybe we should lock them in a room
away from all the distractions of reality, because we
might have to teach one thing at the expense of everything
else.

Bottom line, you pick a nice, simple architecture!  Heck,
CP/M on a Z80!

> Ok, when are they going to be ready to learn about continuations?
> Or meta-object protocol? Or unification? or Corba?
> What is helpful? What is a scenic detour?
> 
> IMHO, over-emphasis on the sequential-execution aspect of programming
> represented by assembler may well leave students so used to one view
> of the optical illusion that they cannot easily snap their mental
> image into another just-as-valid form. I.e., they may not "get"
> OO design very easily.

I think they will get it much easier, because they will be
able to see the simple nature of it.

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-21  0:00                                                   ` Dik T. Winter
  1996-08-21  0:00                                                     ` Tim Behrendsen
@ 1996-08-22  0:00                                                     ` Tanmoy Bhattacharya
  1 sibling, 0 replies; 688+ messages in thread
From: Tanmoy Bhattacharya @ 1996-08-22  0:00 UTC (permalink / raw)



In article <01bb8fbf$3c21e380$2bac399d@v-cnadc1>
"Dann Corbit" <a-cnadc@microsoft.com> writes:

C: 
C: 
C: Tanmoy Bhattacharya <tanmoy@qcd.lanl.gov> wrote in article
C: <TANMOY.96Aug21164211@qcd.lanl.gov>...
C: {snips}
C: > The (1) above is obviously rigorous.
C: {1 from above}
C: > AB:  >  1) there exists k and N depending on k such that for all n > N,
C: the
C: > AB:  >     function in question is bounded in absolute value by k
C: |f(n)|. 
C: 
C: Shouldn't that read:
C: 1) there exists k and N and C depending on k such that for all n > N, 
C:     the function in question is bounded in absolute value by k |f(n)| +
C: C
C:     where C is some numeric constant.
C: ?
C: Functions may call some start up routines, etc, so that the line never 
C: passes through the origin.
C: 

You don't need it for cases where f(n) increases without bound
with n, because if something is bounded by k |f(n)| + C for n > N, it
is also bounded by (k+epsilon) |f(n)| for every positive value of
epsilon, for all n > M, where M depends on epsilon, C, and f(n).
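
Spelled out (a sketch of the step, with g the function being bounded and
assuming |f(n)| grows without bound), in LaTeX notation:

    |g(n)| \;\le\; k\,|f(n)| + C
           \;\le\; k\,|f(n)| + \epsilon\,|f(n)|
           \;=\;  (k + \epsilon)\,|f(n)|
           \qquad \text{for all } n > M,

where M \ge N is chosen large enough that |f(n)| \ge C/\epsilon whenever
n > M, for whatever \epsilon > 0 one wants.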

Cheers
Tanmoy
--
tanmoy@qcd.lanl.gov(128.165.23.46) DECNET: BETA::"tanmoy@lanl.gov"(1.218=1242)
Tanmoy Bhattacharya O:T-8(MS B285)LANL,NM87545 H:#9,3000,Trinity Drive,NM87544
Others see <gopher://yaleinfo.yale.edu:7700/00/Internet-People/internet-mail>,
<http://alpha.acast.nova.edu/cgi-bin/inmgq.pl>or<ftp://csd4.csd.uwm.edu/pub/
internetwork-mail-guide>. -- <http://nqcd.lanl.gov/people/tanmoy/tanmoy.html>
fax: 1 (505) 665 3003   voice: 1 (505) 665 4733    [ Home: 1 (505) 662 5596 ]




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [any language except Ada]
  1996-08-21  0:00               ` What's the best language to learn? [any language except Ada] Bill Mackay
@ 1996-08-22  0:00                 ` Stephen M O'Shaughnessy
  1996-08-22  0:00                 ` Robert Dewar
  1996-08-24  0:00                 ` Alan Brain
  2 siblings, 0 replies; 688+ messages in thread
From: Stephen M O'Shaughnessy @ 1996-08-22  0:00 UTC (permalink / raw)



In article <4vephc$8aa@hobyah.cc.uq.oz.au>, wmackay@om.com.au 
says...
>
>
>A core unit of a post-grad course i'm doing was 100% Ada and 
100% waste 
>of time! A rotten language and only used by the US military - 
enough 
>said!
>-- 
>Bill Mackay
>Suffolk Park
>Australia
>
>"I'm a Marxist of the Harpo kind"
>
>

I am sorry you feel that way.  Why waste our time by posting this
to comp.lang.ada?





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-21  0:00                                           ` Tim Behrendsen
@ 1996-08-22  0:00                                             ` Bob Gilbert
  1996-08-22  0:00                                               ` Tim Behrendsen
  1996-09-04  0:00                                             ` Lawrence Kirby
  1 sibling, 1 reply; 688+ messages in thread
From: Bob Gilbert @ 1996-08-22  0:00 UTC (permalink / raw)



In article <01bb8f19$9a89d820$32ee6fce@timhome2>, "Tim Behrendsen" <tim@airshields.com> writes:
> 
> There is no other view than the procedural view.

Don't think so.

>  There is no
> such thing as an algorithm that exists in zero time.  Even if
> there is only one operation, or (n) operations that happen
> simultaneously, it is still (n) data transformations that take
> place over time.

And this relates to procedural vs non-procedural views how????

> How can someone implement *any* sort in assembly language,
> and "learn it but not really understand it"?

Easy.  Understanding the how and why an algorithm works has little
to do with implementing it, regardless of the language used.  How 
does being able to implement a sort in assembly provide a student
with the ability to prove that the given sort algorithm will work for
all cases?  Learning to implement, and learning sound implementation
techniques (regardless of language) is a separate issue from learning
algorithm development and analysis.

>  To implement it,
> you have to do it in great detail, and you simply can't do the
> "push and prod until it works" approach to programming, which
> is what I think a lot of students do.

I think anyone can "push and prod" in any language.

-Bob







^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-12  0:00                                                 ` Bob Kitzberger
@ 1996-08-22  0:00                                                   ` Patrick Horgan
  1996-08-23  0:00                                                     ` Steve Heller
  0 siblings, 1 reply; 688+ messages in thread
From: Patrick Horgan @ 1996-08-22  0:00 UTC (permalink / raw)



In article <4uodcb$8hr@rational.rational.com>, rlk@rational.com (Bob Kitzberger) writes:

> 
> But efficiency is not the be-all and end-all.  You have to balance
> efficiency with ease of use, ease of maintenance, etc.  The
> fastest or most efficient solution is not always the "best"
> solution.  
>

That's true but Tim had a good point.  As the years have gone by I've found myself
searching more and more for (and hitting occasionally) elegant solutions.

I know that's an inexact term, but let me see if I can flounder around a bit
and make it more confusing.;)

When I find an elegant solution much of the code falls out.  The solution gets
simpler.  It seems to come from a better understanding of the problem.  An
elegant solution seems to arise out of the problem domain with a life of its
own, demanding to be implemented.  It's a best-fit solution.

When I find one (and I think years of thinking really hard and learning hard
lead to finding more of them), it almost always ends up being the best of
all worlds.

It's efficient.
It's easy to use.
It's clear and easy to maintain.

I don't think you get here without a lot of experience in good code and bad code,
good ideas and bad ideas, and I know that this will offend some, but really,
really, low level code like assembler code is one of the ingredients that
go into your palette to bring you to a place where you can do this.

Add in some intuition and some other right brain stuff, and you start getting
developers that are head and shoulders above the rest of the folks.  You
start getting people that come up with the elegant solutions.

I've seen some of them trying to communicate in this (too) long discussion and
being frustrated because people didn't understand what they were saying.

It seemed so obvious to them, but they were talking in some cases to people 
without the background to understand their arguments.

Most people in this discussion that said you needed to learn assembler didn't
mean that there was a particular reason that assembler led you to the holy
grail of development.

They meant that assembler, and object oriented design, and algorithms, and 
math, and poetry (too strange?;) and a lot of other stuff are the pieces from
which they're building their gestalts.  It's the things from which an elegant
design and resultant elegant implementation falls out.  It takes a lot of
knowledge and a lot of experience.

Sure, what you learn from becoming a good assembler programmer is only part of
the puzzle, and if that was all you had you wouldn't be a good developer
overall.  It IS part of the puzzle though, and people who think it's not aren't
as good as they would be if they had the breadth of experience.

It all helps, it's all important.
-- 

   Patrick J. Horgan    patrick@broadvision.com   Have horse will ride.
   Opinions mine, not my employer's except by most bizarre coincidence.






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-21  0:00                                                   ` Tim Behrendsen
@ 1996-08-22  0:00                                                     ` Bengt Richter
  1996-08-22  0:00                                                       ` Tim Behrendsen
  1996-08-26  0:00                                                     ` Richard A. O'Keefe
  1 sibling, 1 reply; 688+ messages in thread
From: Bengt Richter @ 1996-08-22  0:00 UTC (permalink / raw)



"Tim Behrendsen" <tim@airshields.com> wrote:
[...]
>Wait, hold the phone!  "Sorts the data without moving it"?  What,
>is APL's sorting algorithm O(1)?  Yes, it may not actually get
>sorted until it gets printed, but that's irrelevant to the fact
>that it eventually gets sorted.
	Does that mean that your concept of the essential aspect of
sorting is putting the data into sort order, rather than establishing
the order itself? The respective timings say otherwise to me ;-)
[...]

Regards,
Bengt Richter






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-17  0:00                                   ` Tim Behrendsen
  1996-08-19  0:00                                     ` Bob Gilbert
@ 1996-08-22  0:00                                     ` Bengt Richter
  1996-08-22  0:00                                       ` Frank Manning
  1996-08-22  0:00                                       ` Tim Behrendsen
  1 sibling, 2 replies; 688+ messages in thread
From: Bengt Richter @ 1996-08-22  0:00 UTC (permalink / raw)


[-- Warning: decoded text below may be mangled, UTF-8 assumed --]
[-- Attachment #1: Type: text/plain, Size: 6705 bytes --]


"Tim Behrendsen" <tim@airshields.com> wrote:

>Lawrence Kirby <fred@genesis.demon.co.uk> wrote in article
><840288278snz@genesis.demon.co.uk>...
>> In article <01bb846d$c6c01780$87ee6fce@timpent.airshields.com>
>>            tim@airshields.com "Tim Behrendsen" writes:
>> 
>> >Who's talking about showing them?  I would suggest that if
>> >they wrote a quicksort in assembler, they will have a much
>> >better "feel" for the algorithm, than if they wrote it in C.
>> 
>> They might have a good feel for how to implement quicksort in that
>> particular machine code however it would be so wrapped up in
>> implementation details that they wouldn't have a good understanding of
>> the algorithm itself, at least not in the same timescale.

>And they don't get wrapped up in implementation
>details if they write it in C?  My primary point is
>when it's implemented in assembly, and you have to
>manually move the bytes from one location to another
>rather than have the compiler "carry them for you",
>you get a better view feel for not only what the
>algorithm is doing, but what the computer itself is
>doing.
	What if computers are totally irrelevant? If
an algorithm is an abstract procedure for manipulating
symbols, or accomplishing some mathematical purpose?
Then any computerized implementation will just be an
instance of a broad class of implementation possibilities.
And assembler would be just one of a number of possible
intermediate code representations in translating to
an executable representation for a particular platform/
OS environment.
	If you are more fluent in assembler than in
the native terms/notation of the original abstraction,
then you may look at the assembler code and say, "Aha,
that's what that meant," and be helped in understanding
the original algorithm, ...if the latter was relatively
trivial. But I don't think looking at assembler is a good
way to understand system-level workings of things, even
though it may help with a given primitive. Even C can
be too low a level. How do you see object hierarchies in
assembler? Or even C++ source code -- you need a way of
seeing the woods as well as the trees. E.g., the important
aspects of OpenGL are not the language it's been ported
to, though you can be sure there are assembler-coded
primitives supporting any given implementation. An annotated
dataflow diagram supplemented with some equations etc. may
be one of the best helps in arriving at understanding.
A riveter sees the importance of rivets everywhere, but
it's a limited view.

>The most important thing a student can learn about
>computers is the fundamentally simple nature.  The
	I think the "fundamentally simple nature" you
are talking about is a simplified *view/abstraction* of an
older-architecture computer. If you focus on small enough
a detail, it becomes "simple," e.g., a bus transaction
that moves a word from memory to cpu chip, but to keep it
simple you may have to ignore caches, burst modes, split
address/data transactions, multiple bus masters, etc. etc.
not to mention multiple CPUs each with separate multiple
execution units, pipelines, etc. Perhaps what is needed
is a standard virtual machine for teaching purposes.
>mystery of execution *must* be broken down in order for
	This mystery sounds suspiciously like step-step,
i.e., a rule of sequential evaluation. I think a lot of these
things benefit from being considered in the abstract (assuming
one is comfortable in that domain). A nice book is "Anatomy
of Lisp" by John Allen, (c) 1978 by McGraw-Hill, Inc.
ISBN-0-07-001115-X. I don't think they'll mind if I quote a
quotation the author put at the head of the preface:

	"'... it is important not to lose sight of the fact that
there is a difference between training and education. If computer
science is a fundamental discipline, then university education in
this field should emphasize enduring fundamental principles rather
than transient current technology.'
	Peter Wegner, Three Computer Cultures"

As true now as then, I think. A particular assembler for a particular
CPU/OS environment is "transient current technology." A well-chosen
VM might not be. An object-oriented implementation of an emulator
for such a VM might be an interesting multi-perspective project...

>the student to begin to think like a programmer.
Of Smalltalk/LISP/CLOS or C++/Delphi or assembler/Fortran/C? ;-)
>Once they can think like a programmer, all the rest
>of the knowledge they learn becomes trivial.
A terrifying prospect!
After 70k+- hrs of OJT as programmer, I still run into
stuff that seems non-trivial to me, thank goodness.
But then I have been trying to get beyond my assembler origins ;-)

BTW, what about thinking like a mathematician ;-) Or a
problem-solver? Or an artist?

If you are teaching programming, how much "transient current
technology" is it worth while to teach, projecting trends to
graduation time? Just enough to use it for lab work (education),
or in depth as case studies exemplifying the principles you
are trying to teach, or focused on a job market (training)?
I don't think there is a one-size-fits-all answer.

If you want students to know about *computers*, why not
teach them a little logical design and electronics?
So they have an idea of how data actually moves around
in that minitower. Hm. Should we leave caches out of it?
ISA vs VLB vs PCI? VME as example of non-synchronous
alternatives? Do they need to know how to write a BIOS?
What does the power-on reset do to all those chips on
all those boards that it reaches? Oh, and how about the
OS? After all, we don't usually run an assembler on the
bare metal. So, to have a decent concept of what really
is going on, don't they have to know how an OS boots up
and gets to the point where it's able to run a shell?
And how that shell gets its input and interacts with OS
services to allocate memory and load and start execution
of the assembler, and how that's going to get your code
from a file and translate it to object form in another file.
How about the linker and that whole story? Should we be
satisfied with knowing how this works in DOS as OS? "Puh-leez!"
It's maybe a little much to go into both Unix and WinXX
(and system 7?), so what to choose?

Ok, when are they going to be ready to learn about continuations?
Or meta-object protocol? Or unification? or Corba?
What is helpful? What is a scenic detour?

IMHO, over-emphasis on the sequential-execution aspect of programming
represented by assembler may well leave students so used to one view
of the optical illusion that they cannot easily snap their mental
image into another just-as-valid form. I.e., they may not "get"
OO design very easily.

My 2¢

Regards,
Bengt Richter






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [any language except Ada]
  1996-08-12  0:00         ` Patrick Horgan
                             ` (5 preceding siblings ...)
  1996-08-21  0:00           ` What's the best language to learn? [was Re: Should I learn C or Pascal?] Darin Johnson
@ 1996-08-22  0:00           ` Jon S Anthony
  1996-08-23  0:00           ` Darin Johnson
  1996-08-24  0:00           ` Jon S Anthony
  8 siblings, 0 replies; 688+ messages in thread
From: Jon S Anthony @ 1996-08-22  0:00 UTC (permalink / raw)



In article <4vephc$8aa@hobyah.cc.uq.oz.au> Bill Mackay <wmackay@om.com.au> writes:

> A core unit of a post-grad course i'm doing was 100% Ada and 100% waste 
> of time! A rotten language and only used by the US military - enough 
> said!

Almost as much a waste of time as your "insightful" troll.

What a loser...

/Jon
-- 
Jon Anthony
Organon Motives, Inc.
1 Williston Road, Suite 4
Belmont, MA 02178

617.484.3383
jsa@organon.com





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-22  0:00                                             ` Bob Gilbert
@ 1996-08-22  0:00                                               ` Tim Behrendsen
  0 siblings, 0 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-22  0:00 UTC (permalink / raw)



Bob Gilbert <rgilbert@unconfigured.xvnews.domain> wrote in article
<4vhgjk$8kc@zeus.orl.mmc.com>...
> In article <01bb8f19$9a89d820$32ee6fce@timhome2>, "Tim Behrendsen"
<tim@airshields.com> writes:
> > 
> > There is no other view than the procedural view.
> 
> Don't think so.
> 
> >  There is no
> > such thing as an algorithm that exists in zero time.  Even if
> > there is only one operation, or (n) operations that happen
> > simultaneously, it is still (n) data transformations that take
> > place over time.
> 
> And this relates to procedural vs non-procedural views how????

We may be having a terminology crisis; when I say "procedure",
I mean the computer doing work, by whatever method it uses,
over time.  Exactly how can a computer do work, either without
procedures, or without time?

> > How can someone implement *any* sort in assembly language,
> > and "learn it but not really understand it"?
> 
> Easy.  Understanding the how and why an algorithm works has little
> to do with implementing it, regardless of the language used.  How 
> does being able to implement a sort in assembly provide a student
> with the ability to prove that the given sort algorithm will work for
> all cases?  Learning to implement, and learning sound implementation
> techniques (regardless of language) is a separate issue from learning
> algorithm development and analysis.

You are right, in a mechanistic, literal view of the world.  From
a human, real-world view, implementation has a *lot* to do with
learning algorithm development and analysis.  You don't just
plug a 'dev and ana' disk into the student's ear.  They
learn through trial and error, and what cements concepts
into their head is the implementation of the concepts.
 
> >  To implement it,
> > you have to do it in great detail, and you simply can't do the
> > "push and prod until it works" approach to programming, which
> > is what I think a lot of students do.
> 
> I think anyone can "push and prod" in any language.

It's a hell of a lot harder to push and prod assembly.

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-22  0:00                                                       ` Richard A. O'Keefe
@ 1996-08-22  0:00                                                         ` Szu-Wen Huang
  1996-08-23  0:00                                                           ` Richard A. O'Keefe
  1996-08-25  0:00                                                         ` Robert Dewar
  1 sibling, 1 reply; 688+ messages in thread
From: Szu-Wen Huang @ 1996-08-22  0:00 UTC (permalink / raw)



Richard A. O'Keefe (ok@goanna.cs.rmit.edu.au) wrote:
: "Tim Behrendsen" <tim@airshields.com> writes:

: >Well, but look at your own terminology.  You describe quicksort
: >as "worst case O(n*n)" and (paraphrase) "average case O(n*log(n))".
: >When the Big-Oh is given for a particular algorithm without
: >any qualification, it is usually assumed to be the average
: >behavior over the data set.

: This is news to me.  In fact, it shocked me so much that I went back to
: a recent textbook (Cormen, Leiserson, and Rivest) to check.

: They say
:         What we mean when we say "the running time is O(n^2)"
: 	is that the WORST-CASE running time (which is a function of n)
:         is O(n^2), or equivalently, no matter what particular input of
:         size n is chosen for each value of n, the running time on that
:         set of inputs is O(n^2).

O() has nothing to do with the cases.  You can express best,
average, or worst case scenarios using the notation.  I believe Tim is
arguing that f(n) in O(f(n)) is assumed to be continuous unless
specified.

[snip]




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-21  0:00                                                   ` Adam Beneschan
@ 1996-08-22  0:00                                                     ` Andrew Koenig
  1996-08-24  0:00                                                       ` Robert Dewar
  1996-08-22  0:00                                                     ` Christian Bau
  1 sibling, 1 reply; 688+ messages in thread
From: Andrew Koenig @ 1996-08-22  0:00 UTC (permalink / raw)



In article <4vfk6b$i6h@krusty.irvine.com> adam@irvine.com (Adam Beneschan) writes:

> Here, O(n**-2) refers to terms in the sum that eventually go to zero
> as n gets large.  From what I could find, Knuth *never* uses it to
> describe the running time of an algorithm. 

`The Art of Computer Programming,' Vol. 3, page 90:

	Thus we can make a substantial improvement over straight
	insertion, from O(N^2) to O(N^1.667), just by using Shell's
	method with two increments.

Page 142:

	This idea is called quadratic selection; its total
	execution time is O(N*sqrt(N)), which is substantially
	better than order N^2.

And so on.
-- 
				--Andrew Koenig
				  ark@research.att.com




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-21  0:00                       ` Szu-Wen Huang
  1996-08-21  0:00                         ` Tim Behrendsen
@ 1996-08-22  0:00                         ` Mark Wooding
  1996-08-23  0:00                           ` Bengt Richter
  1996-08-23  0:00                         ` Clayton Weaver
  2 siblings, 1 reply; 688+ messages in thread
From: Mark Wooding @ 1996-08-22  0:00 UTC (permalink / raw)



Szu-Wen Huang <huang@mnsinc.com> wrote:

> Lofty, ungraspable concepts like:
> 
>   void swap(int *a, int *b)
>   {
>     int c = *a;
> 
>     *a = *b;
>     *b = c;
>   }
>   ...
>   swap(&a, &b);
>   ...
> 
> ?  swap(), by the way, is a primitive for just about every algorithms
> text I've ever read.  Does knowing the computer must:
> 
>   LOAD r1, a
>   ADD  something_else_totally_unrelated
>   LOAD r2, b
>   INC  some_counter_from_another_place
>   STOR r1, b
>   INC  that_same_counter_but_we_just_need_to_fill_the_slot
>   STOR r2, a
> 
> in order to swap two integers aid in the understanding of swap()?
> I agree that we need to break an algorithm down to primitives, but
> are you actually saying swap(), for instance, isn't primitive enough?

I think you're deliberately trying to choose a nasty architecture.
We're talking about teaching, I think, so why not choose an easy one?

; --- swap ---
;
; Just like void swap(int *a,int *b)

swap		LDR	a3,[a1]
		LDR	a4,[a2]
		STR	a3,[a2]
		STR	a4,[a1]
		MOV	pc,lr

Simple.  (And now there's a 200MHz version of this processor, so it's
not /that/ old-fashioned.)
-- 
[mdw]

`When our backs are against the wall, we shall turn and fight.'
		-- John Major





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-22  0:00                                                     ` Christian Bau
@ 1996-08-22  0:00                                                       ` Larry Kilgallen
  1996-08-23  0:00                                                         ` Tim Hollebeek
  1996-08-24  0:00                                                         ` Robert Dewar
  1996-08-22  0:00                                                       ` (topic change on) Teaching sorts Marcus H. Mendenhall
  1996-08-23  0:00                                                       ` Teaching sorts [was Re: What's the best language to start with?] Andrew Koenig
  2 siblings, 2 replies; 688+ messages in thread
From: Larry Kilgallen @ 1996-08-22  0:00 UTC (permalink / raw)



In article <christian.bau-2208961046370001@christian-mac.isltd.insignia.com>, christian.bau@isltd.insignia.com (Christian Bau) writes:

> Some time ago I did some experiments to find out how valuable benchmarks
> are. All I did was to program the standard method for multiplying two
> square matrices filled with double values. The number of assembly language
> instructions executed for matrices of size n x n was something like
> k1*n*n*n + k2*n*n + k3*n + k4, so you would expect that execution time is
> a monotonic function of n...
> 
> On a real computer (PowerMac, no virtual memory, no background processes,
> nothing that would interfere with execution time), the _number of
> instructions per second_ did reproducibly vary by a factor of up to _seven_
> when going from n to n+1 (for example, case n = 128 took seven times
> longer than cases n = 127 and n = 129). So for this computer, and this
> problem, an execution time of O (n**3) means "less than k*n**3, for some
> k>0, and for all n >= some n0", and nothing more

That sounds like you hit a resonant point for cache collisions,
which certainly can "interfere with execution time".

Larry Kilgallen




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-22  0:00                                                         ` Szu-Wen Huang
  1996-08-22  0:00                                                           ` Pete Becker
@ 1996-08-22  0:00                                                           ` Robert Dewar
  1 sibling, 0 replies; 688+ messages in thread
From: Robert Dewar @ 1996-08-22  0:00 UTC (permalink / raw)



Szu-Wen says

"I understand your point, and I agree with you in the strictest technical
sense.  However, it doesn't help at all if you tell somebody your
algorithm is O(n) when it's actually O(1000n) in a certain range of
n.  Think about it - what do we use time complexity for?  To predict
behavior as n increases from our test case (usually smaller) to our
real case (usually bigger).  If the statement of the algorithm does not
make it obvious that f(x) in O(f(x)) is not a continuous function, I
believe it is the failing of the statement."

O(n) is the same as O(1000n). I trust this is clear to you now if you
have followed this thread.

If you naively use big-O notation to predict behavior, you will get into
trouble!
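
Directly from the definition (a sketch, with g the running time being
bounded), in LaTeX notation:

    g(n) = O(1000\,n)
    \iff \exists\, k, N:\ |g(n)| \le (1000\,k)\,n \ \text{for all } n > N
    \iff g(n) = O(n),

since 1000 k is just another constant; the factor is absorbed, which is
also why big-O by itself cannot be used to predict absolute running times.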





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [any language except Ada]
  1996-08-21  0:00               ` What's the best language to learn? [any language except Ada] Bill Mackay
  1996-08-22  0:00                 ` Stephen M O'Shaughnessy
@ 1996-08-22  0:00                 ` Robert Dewar
  1996-08-23  0:00                   ` Larry J. Elmore
  1996-08-24  0:00                 ` Alan Brain
  2 siblings, 1 reply; 688+ messages in thread
From: Robert Dewar @ 1996-08-22  0:00 UTC (permalink / raw)



Bill McKay said

"A core unit of a post-grad course i'm doing was 100% Ada and 100% waste
of time! A rotten language and only used by the US military - enough
said!"

It is not clear that your evaluation of Ada as a "rotten language" is
useful, given the clear nonsense of your second claim, but still, you
would contribute more to the discussion if you said *what* you found
rotten about it ....





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-08-22  0:00                   ` TRAN PHAN ANH
@ 1996-08-22  0:00                     ` Dr E. Buxbaum
  0 siblings, 0 replies; 688+ messages in thread
From: Dr E. Buxbaum @ 1996-08-22  0:00 UTC (permalink / raw)



anh@kuhub.cc.ukans.edu (TRAN PHAN ANH) wrote:
>Just a side comment: by the time this thread and its offspring end, the 
>original author will have learned both C and Pascal. :-)

The discussion of which programming language is best is as old as 
programming. Any choice reflects the needs and personality of the 
programmer at least as much as anything else (apart, of course, from the 
needs and personality of your employer).

First, I think, anybody commenting on this issue should state which 
languages he/she knows. A lot of people advocate the only language they 
know, not because it is best, but because it is the only language they 
know. I myself started with Fortran on a Cyber mainframe computer, but 
when I started to learn Pascal, it was like a revelation and I have not 
written a single line of Fortran ever since. My second revelation came 
when I first started working on a microcomputer: getting fast responses, 
graphical output, a user-friendly interface and no hassle with user 
numbers has thoroughly convinced me of their value. Over the last 10 
years I have looked at a few other programming languages, like C, APL, 
Forth, Basic and Prolog. None of them has had the same 'Aha' effect on 
me as Pascal did. C in particular is too cryptic for my taste, although I 
know it well enough to port the odd interesting routine to Pascal. 

Part of the problem is of course the kind of programs you write. My own 
work is mainly concerned with the handling and evaluation of scientific 
data (I am a biochemist). I like a language which allows me to come back 
to my own programs (or that of other people) after a couple of years and 
see immediately how things work. Short utility programs, which work close 
to the hardware, may be a different kettle of fish. Of course, I work 
exclusively in the MS-DOS world, if I had to port programs between 
different operating systems, Pascal might not be such a good choice 
(although with the new Pascal standard and Gnu-Pascal being available for 
different systems, this may change). 

On a more partisan note, I have always wondered whether the quality of 
a programming language should be reflected somehow in the final 
product. Questions of maintainability, readability, compiler complexity  
and so on should leave traces in the programs for the end user to see. 
This would require 2 programs, serving the same purpose, written in 
different languages. These programs should be generally available and 
have a function common enough that the comparison can be done without 
specialist knowledge.

I know of only one pair of programs that meet these standards: 
COMMAND.COM, the shell of the MS-DOS operating system, is written in C and 
Assembler. 4DOS.EXE, its replacement, is written in Pascal, with some C 
and Assembler. So, if you want to know which language is better, go 
ahead and compare these two programs. May I add that my own computer runs 
4DOS?






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-22  0:00                                                         ` Szu-Wen Huang
@ 1996-08-22  0:00                                                           ` Pete Becker
  1996-08-22  0:00                                                           ` Robert Dewar
  1 sibling, 0 replies; 688+ messages in thread
From: Pete Becker @ 1996-08-22  0:00 UTC (permalink / raw)



In article <4vhm8t$hdg@news1.mnsinc.com>, huang@mnsinc.com says...
>
>Pete Becker (pete@borland.com) wrote:
>[snip]
>: No. The point is that this notation talks about asymptotic behavior, not 
>: behavior for specific cases. If you want to talk about more complex notions, 
>: feel free, but don't use O() notation to talk about them, because you will 
>: confuse people when you misuse this notation. In particular, note that
>
>: f(x) = 100000*x + x*x
>
>: is linear for small values of x, and quadratic for large values. O(f(x)) is 
>: O(x^2), however, because eventually the x^2 term dominates.
>
>I understand your point, and I agree with you in the strictest technical
>sense.  However, it doesn't help at all if you tell somebody your
>algorithm is O(n) when it's actually O(1000n) in a certain range of
>n.  

O(n) and O(1000n) are EXACTLY THE SAME THING.
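
To spell out why (a paraphrase of the usual definition; T, c and n0 are just
names used for this illustration): if an implementation's running time
satisfies

    T(n) <= 1000*n          for all n >= n0,

then it also satisfies T(n) <= c*n with c = 1000, which is exactly what
T(n) = O(n) asserts; the factor of 1000 is absorbed into the constant the
definition already allows.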

>Think about it - what do we use time complexity for?  

As I said earlier, if you want to devise a different notation for talking about 
algorithmic complexity, feel free to do so. Do not distort existing notation to 
mean something different from what it has traditionally meant in CS and from 
what it means in mathematics.

>To predict
>behavior as n increases from our test case (usually smaller) to our
>real case (usually bigger).  If the statement of the algorithm does not
>make it obvious that f(x) in O(f(x)) is not a continuous function, I
>believe it is the failing of the statement.
>
>IOW, if I test your algorithm for 1,000 inputs and find that it works
>fine, then install it for use on my real world 10,000,000 input database,
>*and* your documentation doesn't tell me about this "little kink" when
>n > 1000, let's just say I probably won't pay you.  ;)
>

If you try to extrapolate the behavior of an implementation of an algorithm to 
10,000,000 inputs from its behavior for 1,000 inputs, you 
need something other than O().

>Based on the same reason why we study best, average, and worst case
>time complexities of each algorithm, I believe it is important, if not
>necessary, that these behaviors are documented along with the time
>complexity.  Put simply, I expect "O(n) with nothing else" to imply
>continuous linear behavior, and special cases such as yours to say
>"O(n) from here to there, O(n^2) from here to there".

What you "expect" is irrelevant if it is different from the way O(n) is 
defined. O(n) says that eventually the behavior is linear, not that it is 
linear everywhere. The idea is to simplify discussions of time complexity in 
order to improve understanding. Simplifications cannot be relied on to describe 
full behavior, precisely because they are simplifications.
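
To make the f(x) = 100000*x + x*x example quoted above concrete, here is a
minimal C sketch (illustrative only; the sample sizes are arbitrary and not
from the original post) that prints both terms, so you can see the linear
term dominate below x = 100000 and the quadratic term dominate above it:

  #include <stdio.h>

  int main(void)
  {
      /* f(x) = 100000*x + x*x: show which term dominates at each size */
      double sizes[] = { 1e2, 1e4, 1e5, 1e6, 1e8 };
      int i;

      for (i = 0; i < 5; i++) {
          double x = sizes[i];
          printf("x = %9.0f   linear = %.3e   quadratic = %.3e\n",
                 x, 100000.0 * x, x * x);
      }
      return 0;
  }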





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-21  0:00                                                     ` Tim Behrendsen
                                                                         ` (2 preceding siblings ...)
  1996-08-21  0:00                                                       ` Tanmoy Bhattacharya
@ 1996-08-22  0:00                                                       ` Robert Dewar
  1996-08-24  0:00                                                         ` Joe Keane
  3 siblings, 1 reply; 688+ messages in thread
From: Robert Dewar @ 1996-08-22  0:00 UTC (permalink / raw)



Tim says

"I have to admit, I have to side with Szu-Wen.  I've never really
thought about this case for the O() notation, but it seems from
a purely mathematical standpoint that O(f(n)) means that f(n)
is *the* function.  If f(n) happens to be a non-continuous function,
then so be it.

If the running time is

    kn for n < 1000
    k(n^2) for n >= 1000

then f(n) =
    O(n) for n < 1000
    O(n^2) for n >= 1000

then the big-Oh function should be (pardon my syntax)

    O( (n < 1000) ? n : n^2 )


Well you can side with whoever you like, but big-O notation has a very
well accepted meaning: it is about asymptotic behavior, not actual
behavior for any specific finite set of values. If you think that O(N)
means that you will see behavior of kN for all N, you are simply mistaken
as to the accepted meaning of this notation. If you don't find the
accepted meaning useful, fine, but don't try to make up your own
personal idiosyncratic definition.

Your final suggestion is just complete nonsense, since that formula
is asymptotically equal to n^2.

Asymptotic behavior is an important aspect of analysis of algorithms.
It is frequently misunderstood, and lots of people think that O(N)
means that you will see linear behavior, but it's just wrong, sorry!





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-21  0:00                                                   ` Tim Behrendsen
  1996-08-22  0:00                                                     ` Mike Rubenstein
@ 1996-08-22  0:00                                                     ` Robert Dewar
  1 sibling, 0 replies; 688+ messages in thread
From: Robert Dewar @ 1996-08-22  0:00 UTC (permalink / raw)



Tim says

"Perhaps.  My understanding of O() notation is that it does
not give a prediction of running time, it gives a rough
approximation of behavior based on data set size."

OK, anyone can have a misunderstanding, but that's what this is: the big-O
behavior does NOT give a prediction of running time. For instance you
cannot say that an O(N^2) algorithm is worse than an O(N) algorithm, because
it might be the case that N has to be too large, or the constant is too great.

Here is an example. It is possible to multiply two N digit numbers using
an algorithm whose behavior is

  O (N ** (1 + epsilon))

for any value of epsilon, which sounds as though you can get as close to
linear as you like, which is true, but only in the asymptotically limiting
case. In practice, if you choose a very small epsilon, then the constant
is so large that this is a very inefficient algorithm. In practice
you will prefer even an O(N^2) algorithm to an O(N^1.0000000000001)
algorithm constructed in this general manner (although in practice hybrid
algorithms like the Toom-Cook algorithm -- I hope I spell this right, it
is an ancient memory -- dynamically adjust epsilon to get optimal
performance for a given value of N, and result in a performance of
something like O(N log log N) which doesn't look as good as
O(N^1.00000000001) but is in fact faster for any practical data set size).

Once again, I refer people to Mike Feldman's book; I think he has a very
clear treatment of big-O notation, as well as a good explanation of its
importance (despite hints in this thread that it is broken and needs
fixing :-)





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-21  0:00                                                     ` Tim Behrendsen
  1996-08-21  0:00                                                       ` Dann Corbit
@ 1996-08-22  0:00                                                       ` Richard A. O'Keefe
  1996-08-22  0:00                                                         ` Szu-Wen Huang
  1996-08-25  0:00                                                         ` Robert Dewar
  1 sibling, 2 replies; 688+ messages in thread
From: Richard A. O'Keefe @ 1996-08-22  0:00 UTC (permalink / raw)



"Tim Behrendsen" <tim@airshields.com> writes:

>Well, but look at your own terminology.  You describe quicksort
>as "worst case O(n*n)" and (paraphrase) "average case O(n*log(n))".
>When the Big-Oh is given for a particular algorithm without
>any qualification, it is usually assumed to be the average
>behavior over the data set.

This is news to me.  In fact, it shocked me so much that I went back to
a recent textbook (Cormen, Leiserson, and Rivest) to check.

They say
        What we mean when we say "the running time is O(n^2)"
	is that the WORST-CASE running time (which is a function of n)
        is O(n^2), or equivalently, no matter what particular input of
        size n is chosen for each value of n, the running time on that
        set of inputs is O(n^2).

>The point is not that O() makes absolute time predictions, but that it
>gives *behavior* predictions.

The normal use of big-Oh when discussing running time is to give
 - an asymptotic
 - upper bound
 - for the worst case.

Since "average" running time for any nontrivial algorithm (such as a sort)
is totally meaningless unless you specify the distribution of the inputs,
"average" times are the linguistically "marked" case and by the usual maxims
are the case which gets specially highlighted.  That is, if you want to talk
about average running times, you have to
 - SPECIFY A DISTRIBUTION and
 - *say* AVERAGE run time.
-- 
Australian citizen since 14 August 1996.  *Now* I can vote the xxxs out!
Richard A. O'Keefe; http://www.cs.rmit.edu.au/~ok; RMIT Comp.Sci.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-21  0:00                                                       ` Tanmoy Bhattacharya
  1996-08-22  0:00                                                         ` Mike Rubenstein
@ 1996-08-22  0:00                                                         ` Dann Corbit
  1 sibling, 0 replies; 688+ messages in thread
From: Dann Corbit @ 1996-08-22  0:00 UTC (permalink / raw)





Tanmoy Bhattacharya <tanmoy@qcd.lanl.gov> wrote in article
<TANMOY.96Aug21164211@qcd.lanl.gov>...
{snips}
> The (1) above is obviously rigorous.
{1 from above}
> AB:  >  1) there exists k and N depending on k such that for all n > N, the
> AB:  >     function in question is bounded in absolute value by k |f(n)|.

Shouldn't that read:
1) there exists k and N and C depending on k such that for all n > N,
    the function in question is bounded in absolute value by k |f(n)| + C,
    where C is some numeric constant.
?
Functions may call some start up routines, etc, so that the line never 
passes through the origin.






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-21  0:00                                                   ` Adam Beneschan
  1996-08-22  0:00                                                     ` Andrew Koenig
@ 1996-08-22  0:00                                                     ` Christian Bau
  1996-08-22  0:00                                                       ` Larry Kilgallen
                                                                         ` (2 more replies)
  1 sibling, 3 replies; 688+ messages in thread
From: Christian Bau @ 1996-08-22  0:00 UTC (permalink / raw)



In article <4vfk6b$i6h@krusty.irvine.com>, adam@irvine.com (Adam
Beneschan) wrote (and I quote him out of context):

> Following this viewpoint, when we computer scientists speak of an
> algorithm's running time as O(n**2), mathematicians might say
> 
>     Running time = K * n**2 * (1 + O(1/n))
> 
> for some constant K.  The point here is that the proportional
> difference between the running time and (K * n**2) tends to disappear
> as n gets large (hence the O(1/n) term).  

Some time ago I did some experiments to find out how valuable benchmarks
are. All I did was to program the standard method for multiplying two
square matrices filled with double values. The number of assembly language
instructions executed for matrices of size n x n was something like
k1*n*n*n + k2*n*n + k3*n + k4, so you would expect that execution time is
a monotonic function of n...

On a real computer (PowerMac, no virtual memory, no background processes,
nothing that would interfere with execution time), the _number of
instructions per second_ did reproducibly vary by a factor of up to _seven_
when going from n to n+1 (for example, case n = 128 took seven times
longer than cases n = 127 and n = 129). So for this computer, and this
problem, an execution time of O (n**3) means "less than k*n**3, for some
k>0, and for all n >= some n0", and nothing more.
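
For reference, a minimal sketch of what the "standard method" presumably
looked like (an assumption -- the original benchmark code is not shown in
the post).  The stride-n walk over the second matrix is the kind of access
pattern that later replies attribute to cache effects at sizes like n = 128:

  /* Hypothetical reconstruction, not the original benchmark code:
     the usual triple-loop product of two n x n matrices of doubles. */
  void matmul(int n, const double *a, const double *b, double *c)
  {
      int i, j, k;

      for (i = 0; i < n; i++)
          for (j = 0; j < n; j++) {
              double sum = 0.0;
              for (k = 0; k < n; k++)
                  sum += a[i*n + k] * b[k*n + j];   /* b is walked with stride n */
              c[i*n + j] = sum;
          }
  }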




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-21  0:00                                                       ` Pete Becker
@ 1996-08-22  0:00                                                         ` Szu-Wen Huang
  1996-08-22  0:00                                                           ` Pete Becker
  1996-08-22  0:00                                                           ` Robert Dewar
  0 siblings, 2 replies; 688+ messages in thread
From: Szu-Wen Huang @ 1996-08-22  0:00 UTC (permalink / raw)



Pete Becker (pete@borland.com) wrote:
[snip]
: No. The point is that this notation talks about asymptotic behavior, not 
: behavior for specific cases. If you want to talk about more complex notions, 
: feel free, but don't use O() notation to talk about them, because you will 
: confuse people when you misuse this notation. In particular, note that

: f(x) = 100000*x + x*x

: is linear for small values of x, and quadratic for large values. O(f(x)) is 
: O(x^2), however, because eventually the x^2 term dominates.

I understand your point, and I agree with you in the strictest technical
sense.  However, it doesn't help at all if you tell somebody your
algorithm is O(n) when it's actually O(1000n) in a certain range of
n.  Think about it - what do we use time complexity for?  To predict
behavior as n increases from our test case (usually smaller) to our
real case (usually bigger).  If the statement of the algorithm does not
make it obvious that f(x) in O(f(x)) is not a continuous function, I
believe it is the failing of the statement.

IOW, if I test your algorithm for 1,000 inputs and find that it works
fine, then install it for use on my real world 10,000,000 input database,
*and* your documentation doesn't tell me about this "little kink" when
n > 1000, let's just say I probably won't pay you.  ;)

Based on the same reason why we study best, average, and worst case
time complexities of each algorithm, I believe it is important, if not
necessary, that these behaviors are documented along with the time
complexity.  Put simply, I expect "O(n) with nothing else" to imply
continuous linear behavior, and special cases such as yours to say
"O(n) from here to there, O(n^2) from here to there".




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-19  0:00                         ` Tim Behrendsen
  1996-08-19  0:00                           ` John Hobson
@ 1996-08-23  0:00                           ` Alan Bowler
  1 sibling, 0 replies; 688+ messages in thread
From: Alan Bowler @ 1996-08-23  0:00 UTC (permalink / raw)



In article <01bb8ded$c55ea3a0$87ee6fce@timpent.airshields.com> "Tim Behrendsen" <tim@airshields.com> writes:
>I think Brooks' point is to measure relative programmer
>productivity in the same language, not different languages.  What
>is it the average programmer produces; like, 100 lines/month
>of fully debugged code or some low number?  Obviously, it's
>not the amount of typing that influences productivity.

Almost; however, it has been found that the number of lines
produced by the same programmer is somewhat language independent.
I.e. the same guy will write 100 lines of assembler in about the
same time as he writes a hundred lines of Fortran.
>
>In fact, let's *dramatically* increase productivity and move
>to APL!  Heck, I remember a friend of mine wrote a "moon rocket
>lander" game in 1 line of APL on a dare.  A very *long* line,
>mind you...

APL is an exception to the line count rules.  Someone else once observed
that for most languages you measure size in terms of lines.  For APL
you measure it in square inches.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-22  0:00                                                     ` Christian Bau
  1996-08-22  0:00                                                       ` Larry Kilgallen
  1996-08-22  0:00                                                       ` (topic change on) Teaching sorts Marcus H. Mendenhall
@ 1996-08-23  0:00                                                       ` Andrew Koenig
  2 siblings, 0 replies; 688+ messages in thread
From: Andrew Koenig @ 1996-08-23  0:00 UTC (permalink / raw)



In article <christian.bau-2208961046370001@christian-mac.isltd.insignia.com> christian.bau@isltd.insignia.com (Christian Bau) writes:

> On a real computer (PowerMac, no virtual memory, no background processes,
> nothing that would interfere with execution time), the _number of
> instructions per second_ did reproducably vary by a factor up to _seven_
> when going from n to n+1 (for example, case n = 128 took seven times
> longer than cases n = 127 and n = 129).

Aha!  Your machine had interleaved memory!
-- 
				--Andrew Koenig
				  ark@research.att.com




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-22  0:00                                       ` Tim Behrendsen
@ 1996-08-23  0:00                                         ` Larry J. Elmore
  0 siblings, 0 replies; 688+ messages in thread
From: Larry J. Elmore @ 1996-08-23  0:00 UTC (permalink / raw)





Tim Behrendsen <tim@airshields.com> wrote in article
<01bb903e$bee0db80$87ee6fce@timpent.airshields.com>...
> Bengt Richter <bokr@accessone.com> wrote in article
> <4vgs4p$evl@news.accessone.com>...
> So what?  Remember who we're talking about.  We're talking
> about Joe Schmoe off the street in CS 101.  Are we going
> to hit him full-bore with the entire universe of possible
> computer architectures?  You have to start somewhere, and
> you have to try and convey the procedural nature of the
> computer in the simplest possible terms.
 
Speaking as a sophomore in the CS program at Montana State University, I
must agree most wholeheartedly with Tim. The program here starts out
teaching simple program design in a generic PDL with a lab in C, Fortran or
Ada. The rest of the program is mostly Ada 83, though Ada 95 is coming Real
Soon Now. Most of the students with little prior experience with computers
beyond playing games and using a word processor appear to have very little
understanding of what they are actually doing. It seems a lot like learning
English without learning English grammar. Yes, you can write fairly
decently, but you'll never really command the language or communicate as
clearly as someone who does understand the grammar.

While I'm only a sophomore in college, I've been learning about and
programming computers since 1981 on my own, starting with a TRS-80 Model I
with 48k RAM and Basic. I've taken a handful of college classes before I
started college "for real" this time, things like basic EE, digital logic,
Fortran, plus a 6-month electronics tech school in the Air Force. I don't
think there's any substitute for learning what's "really happening" in the
machine at an early stage.
 
> > trivial. But I don't think looking at assembler is a good
> > way to understand system-level workings of things, even
> > though it may help with a given primitive. Even C can
> > be too low a level. How do you see object hierarchies in
> > assembler? Or even C++ source code -- you need a way of
> > seeing the woods as well as the trees.

> Indeed, all this is important ... later.  They are simply
> not capable of understanding any of this completely until
> they have grasped the procedural nature of the computer.

Absolutely correct! One can't put up a building without a proper
foundation...

> > execution units, pipelines, etc. Perhaps what is needed
> > is a standard virtual machine for teaching purposes.
> 
> I agree.  I think I mentioned this a long time ago, but
> it got lost in the noise.  I've suggested a 68000 or 6809
> processor; nice orthogonal instruction set.  It's not that
> I want everyone to learn assembler so they'll use it
> everyday on the job, I want them to get a "feel" for
> data movement/flow.

Yes, definitely use a simple machine! The other stuff (caches, virtual
memory, pipelines, superscalar or parallel processors) can all be taught
later! It can also be used to illustrate the history of computer design as
all these features were developed decades ago for mainframes and now are
being used by microprocessors.

> Bottom line, you pick a nice, simple architecture!  Heck,
> CP/M on a Z80!

That's what I learned on, and it's quite good for learning the basics,
especially why Intel processors have the architecture they do now. Also
teach the 6809, for comparison with another simple but very different architecture. (And
a much more elegant one, in my opinion. I really liked the 6809 (and OS-9)
in the Tandy Color Computer way back when). This is a great way to
demonstrate simple operating systems, and how programs make calls to them.
I strongly believe such a basic beginning is vital to really understanding
what one is doing, even if you never actually use it again. There's more
than a few students in CS here that _don't_ understand exactly what it is
they're doing or why... Going immediately on to OO will just confuse them
more, and unnecessarily, IMHO.

Larry Elmore
ljelmore@montana.campus.mci.net




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-22  0:00                                                         ` Szu-Wen Huang
@ 1996-08-23  0:00                                                           ` Richard A. O'Keefe
  0 siblings, 0 replies; 688+ messages in thread
From: Richard A. O'Keefe @ 1996-08-23  0:00 UTC (permalink / raw)



huang@mnsinc.com (Szu-Wen Huang) writes:

>O() has nothing to do with the cases.

You have one end of the stick.
  The end you have is "you can correctly and idiomatically use big-Oh
  when discussing the asymptotic cost of any case."

But it was the OTHER end we were talking about.
  The end you have missed is "when you say that an algorithm is big-Oh of
  something WITHOUT SAYING WHICH CASE YOU ARE TALKING ABOUT, the normal
  implicature is that you are talking about the WORST case."

>You can express either best, average, or worst case scenarios
>using the notation.

Nobody has disputed this.  You _can_.  The question was which case is
normally *implied* by a mention of big-Oh without explicit mention of
the case.  For example, "chairman" *CAN* be used to refer to a man
(Mr Chairman) or a woman (Madam Chairman) but it is today argued that
since it is normally taken to imply a man *if you don't specify which*
it is therefore rude to use it.

It's just like the fact that you *can* say
	f(x) = O(g(x)) as x -> infinity
or	f(x) = O(g(x)) as x -> 0
but if you don't *say explicitly* where x is headed, the conventional
implication is that x is headed for +infinity.

>I believe Tim arguing that f(n) in O(f(n)) is assumed to be continuous
>unless specified.

Eh?  Big-Oh notation comes from the calculus, so when one talks about
O(f(x)) continuity might (or might not, the definition of big-Oh doesn't
care in the least) be relevant.  But when you are talking about O(f(n)),
the conventional implication is that n is an integer, and I'm not quite
sure what you mean by a continuous function from the (discrete!) natural
numbers.  Trivial topology?  Scott topology?  What?

-- 
Australian citizen since 14 August 1996.  *Now* I can vote the xxxs out!
Richard A. O'Keefe; http://www.cs.rmit.edu.au/~ok; RMIT Comp.Sci.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-22  0:00                                                       ` Larry Kilgallen
@ 1996-08-23  0:00                                                         ` Tim Hollebeek
  1996-08-24  0:00                                                           ` Robert Dewar
  1996-08-24  0:00                                                         ` Robert Dewar
  1 sibling, 1 reply; 688+ messages in thread
From: Tim Hollebeek @ 1996-08-23  0:00 UTC (permalink / raw)




In article <1996Aug22.165831.1@eisner>, kilgallen@eisner.decus.org writes:
> In article <christian.bau-2208961046370001@christian-mac.isltd.insignia.com>, christian.bau@isltd.insignia.com (Christian Bau) writes:
> 
> > Some time ago I did some experiments to find out how valuable benchmarks
> > are. All I did was to program the standard method for multiplying two
> > square matrices filled with double values. The number of assembly language
> > instructions executed for matrices of size n x n was something like
> > k1*n*n*n + k2*n*n + k3*n + k4, so you would expect that execution time is
> > a monotonous function of n...
> > 
> > On a real computer (PowerMac, no virtual memory, no background processes,
> > nothing that would interfere with execution time), the _number of
> > instructions per second_ did reproducably vary by a factor up to _seven_
> > when going from n to n+1 (for example, case n = 128 took seven times
> > longer than cases n = 127 and n = 129). So for this computer, and this
> > problem, an execution time of O (n**3) means "less than k*n**3, for some
> > k>0, and for all n >= some n0", and nothing more

This sort of behavior isn't that rare at all; I've seen it in quite a
number of my benchmarks.  An interesting one I found is that the time
required to memcpy() 34 bytes is 5 times *less* than the time required
to memcpy 30 bytes on my system.  From a graph of time vs n, it
appears that memcpy() switches algorithms at 32 bytes (I believe from
a straightforward copy to something along the lines of Duff's device,
since asymptotically it is very close to my routine using the latter
method).
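
For readers who haven't met it, here is a minimal sketch of a Duff's-device
style copy loop (illustrative only -- this is not the actual memcpy()
implementation being benchmarked, and the eight-way unrolling is just the
classic choice):

  /* Classic Duff's device: copy `count' bytes, eight per loop pass,
     by jumping into the middle of the unrolled loop on the first pass. */
  void duff_copy(char *to, const char *from, int count)
  {
      if (count > 0) {
          int n = (count + 7) / 8;

          switch (count % 8) {
          case 0: do { *to++ = *from++;
          case 7:      *to++ = *from++;
          case 6:      *to++ = *from++;
          case 5:      *to++ = *from++;
          case 4:      *to++ = *from++;
          case 3:      *to++ = *from++;
          case 2:      *to++ = *from++;
          case 1:      *to++ = *from++;
                  } while (--n > 0);
          }
      }
  }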

---------------------------------------------------------------------------
Tim Hollebeek         | Disclaimer :=> Everything above is a true statement,
Electron Psychologist |                for sufficiently false values of true.
Princeton University  | email: tim@wfn-shop.princeton.edu
----------------------| http://wfn-shop.princeton.edu/~tim (NEW! IMPROVED!)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [any language except Ada]
  1996-08-22  0:00                 ` Robert Dewar
@ 1996-08-23  0:00                   ` Larry J. Elmore
  0 siblings, 0 replies; 688+ messages in thread
From: Larry J. Elmore @ 1996-08-23  0:00 UTC (permalink / raw)





> Bill Mackay said
> 
> "A core unit of a post-grad course i'm doing was 100% Ada and 100% waste
> of time! A rotten language and only used by the US military - enough
> said!"

I have to admit that when I was first introduced to Ada 83 one year ago, I
was rather disgusted. I felt the language was too big, too complicated, too
picky and WAY too verbose. If a camel was a horse designed by a committee,
I felt Ada was a Pascal designed by a government committee.

As I used it more and more, and learned more of its features, I gradually
began to grudgingly respect it. Then I got GNAT Ada 95 for my Win95 home
system (and largely escaped Ada 83 under VMS on the school's computer) and
I really began to appreciate Ada. There are a few details I disagree with
(like why couldn't they have specified square brackets ( '[', ']' ) for
arrays (like Pascal) instead of parens? It makes it easier for me to
distinguish between array references and function/procedure calls...), and
while it still feels verbose, I actually prefer Ada 95 to C or C++ now and can
really understand why some people argue that Ada's the only real language
choice for many very large projects.

But for small programs and utilities, I still greatly favor Forth...

Larry J. Elmore
ljelmore@montana.campus.mci.net




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-08-19  0:00                 ` Richard A. O'Keefe
@ 1996-08-23  0:00                   ` Joe Keane
  0 siblings, 0 replies; 688+ messages in thread
From: Joe Keane @ 1996-08-23  0:00 UTC (permalink / raw)



In article <4v99re$g3s@goanna.cs.rmit.edu.au>
Richard A. O'Keefe <ok@goanna.cs.rmit.edu.au> writes:
>As for whether merge sort is a good choice for qsort():  if you have the
>spare memory (which in most cases on workstations and mainframes you _have_)
>and if you care about performance (which again one usually does) merge sort
>makes an *excellent* implementation of qsort(), if you are as good a
>hacker as whoever would have written the quicksort.  UNIX on a PDP-11 *had*
>to make use of a less efficient sort than merge sort because in 64k you
>didn't have the memory to spare for anything.

These days, with virtual memory, asking for a bit of extra space is
probably a good idea if it gives you a faster algorithm.  I'd say that
once you have the `qsort' interface, calling a function for comparison,
quicksort has lost its biggest advantage, the tight inner loops.

Indeed the GNU `qsort' is actually a merge sort, pretty much textbook,
and it beats any quick sort I've seen.  The FreeBSD `mergesort' function
is considerably more funky, but it seems to work very well.

More specifically, what the GNU `qsort' function does is this: if the
extra space needed is small, allocate it off the stack, otherwise call
malloc, and if that fails, fall back on a quicksort.  It's attention to
detail like this that distinguishes robust library functions from some
code that someone was playing around with.
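
As a rough illustration of the strategy just described -- a hedged sketch,
not the actual GNU or FreeBSD code, specialized to ints, and with the C
library's qsort() standing in for the in-place quicksort fallback:

  #include <stdlib.h>
  #include <string.h>

  static int cmp_int(const void *a, const void *b)
  {
      int x = *(const int *)a, y = *(const int *)b;
      return (x > y) - (x < y);
  }

  /* Top-down merge sort of a[0..n), using tmp[0..n) as scratch space. */
  static void merge_sort(int *a, int *tmp, size_t n)
  {
      size_t mid = n / 2, i = 0, j = mid, k = 0;

      if (n < 2)
          return;
      merge_sort(a, tmp, mid);
      merge_sort(a + mid, tmp, n - mid);
      while (i < mid && j < n)
          tmp[k++] = (a[i] <= a[j]) ? a[i++] : a[j++];
      while (i < mid) tmp[k++] = a[i++];
      while (j < n)   tmp[k++] = a[j++];
      memcpy(a, tmp, n * sizeof *a);
  }

  void sort_ints(int *a, size_t n)
  {
      int stack_buf[256];              /* small inputs: scratch off the stack */
      int *tmp = (n <= 256) ? stack_buf : malloc(n * sizeof *a);

      if (tmp == NULL) {               /* malloc failed: sort in place instead */
          qsort(a, n, sizeof *a, cmp_int);
          return;
      }
      merge_sort(a, tmp, n);
      if (tmp != stack_buf)
          free(tmp);
  }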

--
Joe Keane, amateur mathematician




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-22  0:00                                                   ` Patrick Horgan
@ 1996-08-23  0:00                                                     ` Steve Heller
  0 siblings, 0 replies; 688+ messages in thread
From: Steve Heller @ 1996-08-23  0:00 UTC (permalink / raw)



patrick@broadvision.com (Patrick Horgan) wrote:

>When I find an elegant solution much of the code falls out.  The solution gets
>simpler.  It seems to come from a better understanding of the problem.  An
>elegant solution seems to arise out of the problem domain with a life of its
>own demanding to be implemented.  It's a best-fit solution.

>When I find one (and I think years of thinking really hard and learning hard
>lead to finding more of them), it almost always ends up being the best of
>all worlds.

>It's efficient.
>It's easy to use.
>It's clear and easy to maintain.

>I don't think you get here without a lot of experience in good code and bad code,
>good ideas and bad ideas, and I know that this will offend some, but really,
>really, low level code like assembler code is one of the ingredients that
>go into your palette to bring you to a place where you can do this.

>Add in some intuition and some other right brain stuff, and you start getting
>developers that are head and shoulders above the rest of the folks.  You
>start getting people that come up with the elegant solutions.

>I've seen some of them trying to communicate in this (too) long discussion and
>being frustrated because people didn't understand what they were saying.

>It seemed so obvious to them, but they were talking in some cases to people 
>without the background to understand their arguments.

  It's often very difficult for someone who knows a subject very well
to communicate their insights to those who are relatively
inexperienced. That's probably why good technical writers can make a
living (in some cases, a very good living) writing books that explain
such things clearly and simply.




Steve Heller, author and software engineer
http://ourworld.compuserve.com/homepages/steve_heller 





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-22  0:00                         ` Mark Wooding
@ 1996-08-23  0:00                           ` Bengt Richter
  0 siblings, 0 replies; 688+ messages in thread
From: Bengt Richter @ 1996-08-23  0:00 UTC (permalink / raw)



mdw@excessus.demon.co.uk (Mark Wooding) wrote:

>Szu-Wen Huang <huang@mnsinc.com> wrote:

>> Lofty, ungraspable concepts like:
>> 
>>   void swap(int *a, int *b)
>>   {
>>     int c = *a;
>> 
>>     *a = *b;
>>     *b = c;
>>   }
>>   ...
>>   swap(&a, &b);
>>   ...
>> 
>> ?  swap(), by the way, is a primitive for just about every algorithms
>> text I've ever read.  Does knowing the computer must:
>> 
>>   LOAD r1, a
>>   ADD  something_else_totally_unrelated
>>   LOAD r2, b
>>   INC  some_counter_from_another_place
>>   STOR r1, b
>>   INC  that_same_counter_but_we_just_need_to_fill_the_slot
>>   STOR r2, a
>> 
>> in order to swap two integers aid in the understanding of swap()?
>> I agree that we need to break an algorithm down to primitives, but
>> are you actually saying swap(), for instance, isn't primitive enough?

>I think you're deliberately trying to choose a nasty architecture.
>We're talking about teaching, I think, so why not choose an easy one?

>; --- swap ---
>;
>; Just like void swap(int *a,int *b)

>swap		LDR	a3,[a1]
>		LDR	a4,[a2]
>		STR	a3,[a2]
>		STR	a4,[a1]
>		MOV	pc,lr

>Simple.  (And now there's a 200MHz version of this processor, so it's
>not /that/ old-fashioned.)
>-- 
>[mdw]

>`When our backs are against the wall, we shall turn and fight.'
>		-- John Major
 ^^^^^^^^^^^^^^^^^^^^^^^^--I chuckle every time. Is that a real quote?

Here's what BCC32 -v -O2 shows via TD32 on a Pentium:
-----------------------------------------------------
Turbo Debugger Log
CPU Pentium
swap: void _fastcall swap(int *a, int *b){                 
:0040107C 53             push   ebx                        
#tswap#7:  temp = *a;                                      
:0040107D 8B08           mov    ecx,[eax]                  
#tswap#8:  *a = *b;                                        
:0040107F 8B1A           mov    ebx,[edx]                  
:00401081 8918           mov    [eax],ebx                  
#tswap#9:  *b = temp;                                      
:00401083 890A           mov    [edx],ecx                  
#tswap#10: }                                               
:00401085 5B             pop    ebx                        
:00401086 C3             ret                               
:00401087 90             nop                               
-----------------------------------------------------
Or, taking advantage of 386+ architecture by assembler:
Turbo Debugger Log
CPU Pentium
#swapx#11:  mov ecx,[eax]                     
:00401150 8B08           mov    ecx,[eax]     
#swapx#12:  xchg ecx,[edx]                    
:00401152 870A           xchg   [edx],ecx     
#swapx#13:  mov [eax],ecx                     
:00401154 8908           mov    [eax],ecx     
#swapx#14:  ret                               
:00401156 C3             ret                  
-----------------------------------------------------
Either way, there are additional architecture and compiler/
assembler-specific things you have to understand beyond the
C code representation, which in turn has things extraneous
to the essential algorithm in the abstract. I view the imperfect
match of expression language and idea as a necessary evil you have
to put up with to communicate. Ideally the language will be
appropriate and adequate, and both parties will be fluent.
But some algorithmic poetry will probably not survive translation
to assembler, so a language may have to be learned -- nay, mastered --
before the original can truly be appreciated.
Assembler is only one language, with numerous dialects. But
whatever language one uses to express something, it can be
said of the expression, "Ceci n'est pas une pipe."

Regards,
Bengt Richter
I wish I knew Latin. I could pontificate so much more elegantly ;-)







^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-19  0:00                   ` Ted Dennison
@ 1996-08-23  0:00                     ` Richard A. O'Keefe
  1996-08-23  0:00                       ` Ted Dennison
  1996-08-24  0:00                       ` Robert Dewar
  0 siblings, 2 replies; 688+ messages in thread
From: Richard A. O'Keefe @ 1996-08-23  0:00 UTC (permalink / raw)



Ted Dennison <dennison@escmail.orl.mmc.com> writes:

>I had some time on my hands this weekend, so I tried this myself (On vacation
>at the beach with a broken arm, what else was I to do?) With no practice,
>quicksort on a deck of playing cards took me less than 5 minutes. After a bit
>of practice, heapsort still took me more than 15 minutes (and a LOT of table
>space).

I would expect a two-handed swap to be about 3 times as fast as
a one-handed swap.  I just tried it with CDs, having a stack of them
handy, and I can swap two CDs with two hands a little bit *more* than
three times as fast as I can with one hand.  (I don't know why it's
more than 3x faster, it may just be the awkwardness of doing anything
one-handed.)  

So one-handed -vs- two-handed might make a very big difference.

There is another factor:  heap sort requires moving one's attention from
card [i] to card [i/2]; this is a very expensive operation for people.
The only way I would be able to do it with any reasonable speed is to
write the indices on the table and scan for a match instead of counting.
In contrast, insertion sort requires _no_ human counting, just move left
one and detect end of sequence, and depending on the size of the cards,
can benefit from a "block move".  (With a standard card deck, I could
probably move >5 cards up a position at once.)

The moral of the story is that the cost of the elementary operations
scales differently for humans, so that the "card sorting" trial cannot
be relied on to tell you very much about how various sorts behave inside
a very different mechanism.  (Polynomial -> polynomial; that's about all.)

For what it's worth, if I am sorting a shuffled deck of cards, I lay
out four rows of Ace..King and simply move each card straight to its
right place.  This is a bucket sort of the numbers 1..52 into 52 buckets.
You can't do any better than that.  But what does that tell you about
sorting very large numbers of items, or items that are not numbers?

-- 
Australian citizen since 14 August 1996.  *Now* I can vote the xxxs out!
Richard A. O'Keefe; http://www.cs.rmit.edu.au/~ok; RMIT Comp.Sci.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-16  0:00                                             ` Szu-Wen Huang
                                                                 ` (3 preceding siblings ...)
  1996-08-21  0:00                                               ` Matt Austern
@ 1996-08-23  0:00                                               ` Tanmoy Bhattacharya
  1996-08-23  0:00                                                 ` Adam Beneschan
  4 siblings, 1 reply; 688+ messages in thread
From: Tanmoy Bhattacharya @ 1996-08-23  0:00 UTC (permalink / raw)



In article <dewar.840757915@schonberg>
dewar@cs.nyu.edu (Robert Dewar) writes:
<snip>
RD: performance for a given value of N, and result in a performance of
RD: something like O(N log log N) which doesn't look as good as 
RD: O(N^1.00000000001) but is in fact faster for any practical data set size.
RD: 

I am confused: O(N log log N) is of course faster than
O(N^(1+epsilon)) for every epsilon>0. Proof: lim log N/N^epsilon = lim
1/(epsilon N^epsilon) = 0, and log log N is even slower because by the
same proof, lim log log N / log N = 0.

So, what was that comment that `it doesn't look as good'?

Cheers
Tanmoy
--
tanmoy@qcd.lanl.gov(128.165.23.46) DECNET: BETA::"tanmoy@lanl.gov"(1.218=1242)
Tanmoy Bhattacharya O:T-8(MS B285)LANL,NM87545 H:#9,3000,Trinity Drive,NM87544
Others see <gopher://yaleinfo.yale.edu:7700/00/Internet-People/internet-mail>,
<http://alpha.acast.nova.edu/cgi-bin/inmgq.pl>or<ftp://csd4.csd.uwm.edu/pub/
internetwork-mail-guide>. -- <http://nqcd.lanl.gov/people/tanmoy/tanmoy.html>
fax: 1 (505) 665 3003   voice: 1 (505) 665 4733    [ Home: 1 (505) 662 5596 ]




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-23  0:00                     ` Richard A. O'Keefe
@ 1996-08-23  0:00                       ` Ted Dennison
  1996-08-24  0:00                       ` Robert Dewar
  1 sibling, 0 replies; 688+ messages in thread
From: Ted Dennison @ 1996-08-23  0:00 UTC (permalink / raw)



Richard A. O'Keefe wrote:
> 
> Ted Dennison <dennison@escmail.orl.mmc.com> writes:
> 
> >I had some time on my hands this weekend, so I tried this myself (On vacation
> >at the beach with a broken arm, what else was I to do?) With no practice,
> >quicksort on a deck of playing cards took me less than 5 minutes. After a bit
> >of practice, heapsort still took me more than 15 minutes (and a LOT of table
> >space).
> 
> There is another factor:  heap sort requires moving one's attention from
> card [i] to card [i/2]; this is a very expensive operation for people.
> The only way I would be able to do it with any reasonable speed is to
> write the indices on the table and scan for a match instead of counting.

The way I did it was to actually build a (pyramid-shaped) heap. Anything 
else would just be too error prone, given a time to beat. That's why I
said it took a LOT of table space. 21 cards on the bottom, 16 on the next 
level, 8 on the next, etc.

> The moral of the story is that the cost of the elementary operations
> scales differently for humans, so that the "card sorting" trial cannot
> be relied to tell you very much about how various sorts behave inside
> a very different mechanism.  (Polynomial -> polynomial; that's about all.)

Exactly. For humans, comparisons are several factors faster than moves. So
comparisons would have to seriously outnumber moves before you'd see the kind
of time-behavior predicited by comparison-based analysis.

-- 
T.E.D.          (who just passed his algorithms course with a B)
                |  Work - mailto:dennison@escmail.orl.mmc.com  |
                |  Home - mailto:dennison@iag.net              |
                |  URL  - http://www.iag.net/~dennison         |




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-23  0:00                                               ` Tanmoy Bhattacharya
@ 1996-08-23  0:00                                                 ` Adam Beneschan
  0 siblings, 0 replies; 688+ messages in thread
From: Adam Beneschan @ 1996-08-23  0:00 UTC (permalink / raw)



tanmoy@qcd.lanl.gov (Tanmoy Bhattacharya) writes:
 >In article <dewar.840757915@schonberg>
 >dewar@cs.nyu.edu (Robert Dewar) writes:
 ><snip>
 >RD: performance for a given value of N, and result in a performance of
 >RD: something like O(N log log N) which doesn't look as good as 
 >RD: O(N^1.00000000001) but is in fact faster for any practical data set size.
 >RD: 
 >
 >I am confused: O(N log log N) is of course faster than
 >O(N^(1+epsilon)) for every epsilon>0. Proof: lim log N/N^epsilon = lim
 >1/(epsilon N^epsilon) = 0, and log log N is even slower because by the
 >same proof, lim log log N / log N = 0.
 >
 >So, what was that comment that `it doesn't look as good'?
 >
 >Cheers
 >Tanmoy

You're right that (N log log N) grows slower than (N^1.00000000001).
This means that there is a constant M such that, for all N > M, 
(N log log N) < (N^1.00000000001).  By doing some calculations, I've
determined M to be somewhere around e^3361885464550.  That's right,
e to the (3.36-trillion)th power.  So if we assumed that the two
constants of proportionality used for O-notation were the same, the 
O(N log log N) algorithm would only be faster if we were planning on
multiplying numbers with at least e^3361885464550 digits.  I suspect
it will be at least ten years before any manufacturers come out with a
computer even capable of storing that many digits in memory.  :-)
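
A hedged sketch of where a constant of that size comes from (the exact
figure depends on which epsilon and which logarithm base you plug in): the
crossover M is where the two growth rates meet, and taking logarithms twice
gives

    M log log M = M^(1+eps)              (crossover point)
    =>  log log M = M^eps
    =>  log log log M = eps * log M

Since log log log M is only about 3 for numbers this large, log M works out
to a few times 1/eps -- hundreds of billions to a few trillion, depending on
the epsilon used -- so M is e raised to a number of that size.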

I'm just trying to point out how silly this discussion can become when
we try to be strict about the mathematics.  Sure, O(N log log N) is
smaller than O(N^1.00000000001), according to the mathematical
definitions, but (as Robert was trying to say) this tells us
absolutely nothing about which algorithm is better for our purposes.  

Just to continue the silliness, what we've all failed to realize is
that ALL sort algorithms are really O(1).  This is because, for any
sort algorithm, there is a constant M such that

    for N < M,      running time is roughly proportional to N^2 or
                    N^(3/2) or (N log N) or whatever
    for N >= M,     running time is constant (i.e. the time it takes
                    to display "Memory capacity exceeded" and abort)

So according to the mathematical definition of O-notation, the running
time is always O(1).  

                                -- Adam









^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-21  0:00                       ` Szu-Wen Huang
  1996-08-21  0:00                         ` Tim Behrendsen
  1996-08-22  0:00                         ` Mark Wooding
@ 1996-08-23  0:00                         ` Clayton Weaver
  2 siblings, 0 replies; 688+ messages in thread
From: Clayton Weaver @ 1996-08-23  0:00 UTC (permalink / raw)



On 21 Aug 1996, Szu-Wen Huang wrote:

> Tim Behrendsen (tim@airshields.com) wrote:

> : Hmmm... moving a deck of cards around to learn a sorting
> : technique.  Reducing the problem to a very low-level set of
> : movement operations to help in understanding procedurally
> : what the computer is doing.  <s>Naaah, couldn't work.  Much easier
> : to focus on the high-level C abstraction of the sorting
> : algorithm. </s> ;->

> Lofty, ungraspable concepts like:

>   void swap(int *a, int *b)
>   {
>     int c = *a;
> 
>     *a = *b;
>     *b = c;
>   }
>   ...
>   swap(&a, &b);
>   ...

The cards are more obvious in cs101. You have to learn C to demo the
algorithm in C. At that point you are asking students to memorize C
syntax, and only some will make the leap to understanding it before seeing
algorithmic examples that don't depend on C's particular symbols, which
seem quite obscure at first glance even to a Basic programmer.

> ?  swap(), by the way, is a primitive for just about every algorithms
> text I've ever read.  Does knowing the computer must:

>   LOAD r1, a
>   ADD  something_else_totally_unrelated
>   LOAD r2, b
>   INC  some_counter_from_another_place
>   STOR r1, b
>   INC  that_same_counter_but_we_just_need_to_fill_the_slot
>   STOR r2, a

> in order to swap two integers aid in the understanding of swap()?
> I agree that we need to break an algorithm down to primitives, but
> are you actually saying swap(), for instance, isn't primitive enough?

This interleaved register example is actually good. I agree with Elmore
that beginning education should include assembly. In the above example,
you would map out the algorithm in assembly statements as if the register
had a single pipeline, to show the algorithm in register terms, and then
show the interleave to show how you (or more likely a compiler) arranges
code to keep more than one pipeline full.

I would take it a bit further than a z80 or 6809, however. Use *more than
one* 32-bit processor with protected address spaces, and teach the
assembly and some common languages (C; C++; Ada; Basic; Modula-2,
Modula-3, or Oberon; Smalltalk; Forth) in *parallel*. Step 2 can be
functional languages and logic languages (Prolog, Lisp), still showing how
the same code looks in assembler, and perhaps how it looks in each of the
languages covered in the previous step. Step 3 is 4gl languages,
interpreted prototyping languages, very high level tools like actor
languages. Step 4 is specialization, advanced study in whichever of the
above looked most interesting or most marketable, depending on the
student.

So you get a comparative look at cisc, risc, and maybe a hybrid processor,
a comprehension of data flow in assembly terms for each architecture, an
exposure to the syntax of procedural, OO, logic, functional, and 4gl
languages, an insight into how the typical syntax and essential semantics
of each actually gets implemented on the hardware, some practical
experience in a variety of each type of language in current everyday use
in academia and in the programming job market, and a comprehension of
algorithms independent of any particular architecture, language type
paradigm, or particular language syntax. 

And the summer after they graduate, you will find any that didn't get
hired on the day after graduation to do something else programming in C on
linux. Why? Free compiler on a free multi-tasking os that runs on cheap
hardware that they already own with a complete set of development tools.

Ok, maybe they'll actually use Gnat, Gnu Smalltalk, the Prolog frontend,
g++, Oberon for linux, f2c, Modula-3 or some other gcc frontend (or
tcl/tk, perl, or java). I can't speak for Gnat, but these others all have
some bugs, while the C and libc bugs get rooted out right away by the
linux system programmers.  So gcc C ends up the most practical choice for
most of these programmers, because 99 times out of 100 you can pin down
a bug to your code and not to the compiler or runtime library (i.e., one
place to check instead of 3 to determine the cause of an unexpected
result). 

The reality of C and gcc has nothing to do with the theoretical benefits
of either educational curriculum choices or language design optimization
for programmer convenience. Even if some other language would be more
convenient for expressing the solution to a particular programming
problem, "if you can do it in C you can do it for free."

It will be a boon to type-safe programming if Gnat develops
reliability comparable to Gnu C.

Regards, 

Clayton Weaver                              Transparent Words
cgweav@eskimo.com      (Seattle)      http://www.eskimo.com/~cgweav/











^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [any language except Ada]
  1996-08-12  0:00         ` Patrick Horgan
                             ` (6 preceding siblings ...)
  1996-08-22  0:00           ` What's the best language to learn? [any language except Ada] Jon S Anthony
@ 1996-08-23  0:00           ` Darin Johnson
  1996-08-25  0:00             ` Robert Dewar
  1996-08-24  0:00           ` Jon S Anthony
  8 siblings, 1 reply; 688+ messages in thread
From: Darin Johnson @ 1996-08-23  0:00 UTC (permalink / raw)



> I have to admit that when I was first introduced to Ada 83 one year ago, I
> was rather disgusted.

I'll admit too that this is the only Ada I've ever seen, and I bet
it's the only Ada most other people have seen, too.  Unfortunately, back
then there were few good implementations, and even the standard
packages were difficult to use even if they worked (I even had
trouble getting hello-world to work).  If the packages have gotten
standardized and are easy to use then that in itself would eliminate
most of the problems I had with it (unlike C++ where every vendor
claims to have a different industry-wide standard class library, with a
tendency to base things off of templates to ensure slow bulky apps).
-- 
Darin Johnson
djohnson@ucsd.edu	O-
    The full name of the compiler is "Compiler Language With No Pronounceable
    Acronym", which is, for obvious reasons, abbreviated "INTERCAL".




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [any language except Ada]
  1996-08-12  0:00         ` Patrick Horgan
                             ` (7 preceding siblings ...)
  1996-08-23  0:00           ` Darin Johnson
@ 1996-08-24  0:00           ` Jon S Anthony
  8 siblings, 0 replies; 688+ messages in thread
From: Jon S Anthony @ 1996-08-24  0:00 UTC (permalink / raw)



In article <01bb9102$1865afa0$506700cf@ljelmore.montana> "Larry J. Elmore" <ljelmore@montana.campus.mci.net> writes:

> I really began to appreciate Ada. There's a few details I disagree with
> (like why couldn't they have specified square brackets ( '[', ']' ) for
> arrays (like Pascal) instead of parens? It makes it easier for me to
> distinguish between array references and function/procedure calls...),

This seems to be one of those "religious" things.  The reason why it was
done is that from one perspective an array is just a function.  You give
it an input, it returns an output.  There's no reason why it should look
different from a function call.  Formally speaking, I think you can make
a very good case for this, but it is also true that arrays in Ada don't
really adhere to this.  You can create array types, you can slice and dice
them, you can assign them or parts of them, etc.

A related reason is one of maintenance.  You can shift from a simple
array "function" to a more complex computation based on the same input
without having to change any of the code referencing the instances.
This again is more useful and convincing in the case where you have,
say, an anonymous array and not a bunch of instances of an array
type.

Basically, I'm swayed more by the second reason and the fact that I like
parens more than the (subjective view here) "clunky" looking square
brackets.  Shrug...


> it still feels verbose, I actually prefer Ada 95 to C or C++ now and can
> really understand why some people argue that Ada's the only real language
> choice for many very large projects.
> 
> But for small programs and utilities, I still greatly favor Forth...
> 
> Larry J. Elmore
> ljelmore@montana.campus.mci.net


/Jon
-- 
Jon Anthony
Organon Motives, Inc.
1 Williston Road, Suite 4
Belmont, MA 02178

617.484.3383
jsa@organon.com





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [any language except Ada]
  1996-08-21  0:00               ` What's the best language to learn? [any language except Ada] Bill Mackay
  1996-08-22  0:00                 ` Stephen M O'Shaughnessy
  1996-08-22  0:00                 ` Robert Dewar
@ 1996-08-24  0:00                 ` Alan Brain
  2 siblings, 0 replies; 688+ messages in thread
From: Alan Brain @ 1996-08-24  0:00 UTC (permalink / raw)



Bill Mackay <wmackay@om.com.au> wrote:
>
>A core unit of a post-grad course i'm doing was 100% Ada and 100% waste 
>of time! A rotten language and only used by the US military - enough 
>said!

Sorry that you didn't have a good teacher.

1. Ada is used in many places, not just the US Military. Like when you ride in an 
Ansett or Qantas 737 or Airbus. If this is the impression you got, either your 
teacher is less than perfect, or you didn't listen. (Be Honest...) I might add that 
a lot of Academics who use Ada concentrate on the bells n whistles - arrays of Task 
Types etc. which are almost never used in practice, due to being impossible to debug 
(this may change with Ada-95). But I digress.

2. If you'd like a copy of the internal training notes I've done for GEC-Marconi (an 
intro to Ada for C programmers) just E-mail me.

Regards, AEB (Canberra, Oz)





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-22  0:00                                                     ` Andrew Koenig
@ 1996-08-24  0:00                                                       ` Robert Dewar
  0 siblings, 0 replies; 688+ messages in thread
From: Robert Dewar @ 1996-08-24  0:00 UTC (permalink / raw)



Adam said

"> Here, O(n**-2) refers to terms in the sum that eventually go to zero
> as n gets large.  From what I could find, Knuth *never* uses it to
> describe the running time of an algorithm.
"

Are we talking about the same Knuth and the same book? (Art of Computer
Programming vols 1-3)? DK uses O(f(N)) notation consistently throughout
the book, and indeed one cannot imagine an algorithms book that did not!





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-22  0:00                                                       ` Larry Kilgallen
  1996-08-23  0:00                                                         ` Tim Hollebeek
@ 1996-08-24  0:00                                                         ` Robert Dewar
  1 sibling, 0 replies; 688+ messages in thread
From: Robert Dewar @ 1996-08-24  0:00 UTC (permalink / raw)



Larry said

"That sounds like you hit a resonant point for cache collisions,
which certainly can "interfere with execution time".
"

Sounds more like a simple case of cache misses to me ...





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-23  0:00                     ` Richard A. O'Keefe
  1996-08-23  0:00                       ` Ted Dennison
@ 1996-08-24  0:00                       ` Robert Dewar
  1996-08-27  0:00                         ` Richard A. O'Keefe
  1 sibling, 1 reply; 688+ messages in thread
From: Robert Dewar @ 1996-08-24  0:00 UTC (permalink / raw)



Richard said

"There is another factor:  heap sort requires moving one's attention from
card [i] to card [i/2]; this is a very expensive operation for people.
The only way I would be able to do it with any reasonable speed is to
write the indices on the table and scan for a match instead of counting.
In contrast, insertion sort requires _no_ human counting, just move left
one and detect end of sequence, and depending on the size of the cards,
can benefit from a "block move".  (With a standard card deck, I could
probably move >5 cards up a position at once.)
"

Oh gosh no!!!

Arrange the cards in a heap layed out as a binary tree, nothing else
makes sense if you are using heap sort on cards. Remember that the
[i] to [i/2] business is just a trick for mapping the underlying binary
tree!
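
As a reminder of the mapping being called a trick here (a sketch, assuming the
usual 1-based array layout of a complete binary tree; the macro names are
invented for the example):

	/* 1-based array encoding of a complete binary tree: the root lives at
	   index 1 and each node's relatives are a constant-time computation.  */
	#define PARENT(i)       ((i) / 2)
	#define LEFT_CHILD(i)   (2 * (i))
	#define RIGHT_CHILD(i)  (2 * (i) + 1)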





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-22  0:00                                                       ` Robert Dewar
@ 1996-08-24  0:00                                                         ` Joe Keane
  0 siblings, 0 replies; 688+ messages in thread
From: Joe Keane @ 1996-08-24  0:00 UTC (permalink / raw)



At the risk of repeating what others have said well, the O() notation
has been around for a long time and it has a consistent, agreed-on
definition.  It sets an asymptotic upper bound and that is all.

There's also Theta() notation, which is what's needed in many cases.
Both are useful and it makes no sense to argue which one is better.
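
For reference, the two definitions being contrasted (written here in LaTeX
notation; this is only the standard textbook usage, nothing specific to this
thread):

	f(n) = O(g(n))      \iff \exists\, c > 0,\ n_0 : 0 \le f(n) \le c \cdot g(n) \ \ \forall\, n \ge n_0
	f(n) = \Theta(g(n)) \iff \exists\, c_1, c_2 > 0,\ n_0 : c_1 \cdot g(n) \le f(n) \le c_2 \cdot g(n) \ \ \forall\, n \ge n_0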

It's true that some people say O() when they mean Theta(), but that's
just wrong usage, simple as that.  Why don't they use the right term?
Either they don't know it, or they're just generally sloppy.

--
Joe Keane, amateur mathematician




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-23  0:00                                                         ` Tim Hollebeek
@ 1996-08-24  0:00                                                           ` Robert Dewar
  0 siblings, 0 replies; 688+ messages in thread
From: Robert Dewar @ 1996-08-24  0:00 UTC (permalink / raw)



Tim said

"This sort of behavior isn't that rare at all; I've seen it in quite a
number of my benchmarks.  An interesting one I found is that the time
required to memcpy() 34 bytes is 5 times *less* than the time required
to memcpy 30 bytes on my system.  From a graph of time vs n, it
appears that memcpy() switches algorithms at 32 bytes (I believe from
a straightforward copy to something along the lines of Duff's device,
since assymptotically it is very close to my routine using the latter
method).
"


Still, you may simply be seeing cache effects; it is very hard to
control for these.
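
For readers who have not met it, the switch-based unrolled copy Tim alludes to
looks roughly like this (a sketch only; whether any particular memcpy() really
switches to such a routine at 32 bytes is Tim's measurement, not something
shown here):

	#include <stddef.h>

	/* Unrolled byte copy in the style of Duff's device: the switch jumps
	   into the middle of the loop body to dispose of count % 8 bytes,
	   then the do/while handles the remaining full groups of eight.      */
	void copy_bytes(char *to, const char *from, size_t count)
	{
	    size_t n = (count + 7) / 8;

	    if (count == 0)
	        return;
	    switch (count % 8) {
	    case 0: do { *to++ = *from++;
	    case 7:      *to++ = *from++;
	    case 6:      *to++ = *from++;
	    case 5:      *to++ = *from++;
	    case 4:      *to++ = *from++;
	    case 3:      *to++ = *from++;
	    case 2:      *to++ = *from++;
	    case 1:      *to++ = *from++;
	            } while (--n > 0);
	    }
	}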





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-22  0:00                                                       ` Richard A. O'Keefe
  1996-08-22  0:00                                                         ` Szu-Wen Huang
@ 1996-08-25  0:00                                                         ` Robert Dewar
  1 sibling, 0 replies; 688+ messages in thread
From: Robert Dewar @ 1996-08-25  0:00 UTC (permalink / raw)



Richard says

"The normal use of big-Oh when discussing running time is to give
 - an asymptotic
 - upper bound
 - for the worst case."

In practice, this is not always true; for instance, you will often
see people refer to quicksort as O(N log N) without qualifying it with
"average case".  So I think the best thing is (a) when using big-O notation,
always qualify it with worst or average case, and (b) when reading big-O
notation without such a qualification, do not assume that it is necessarily
worst or average case, but reserve judgment!





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [any language except Ada]
  1996-08-23  0:00           ` Darin Johnson
@ 1996-08-25  0:00             ` Robert Dewar
  0 siblings, 0 replies; 688+ messages in thread
From: Robert Dewar @ 1996-08-25  0:00 UTC (permalink / raw)



Darin says

"I'll admit too that this is the only Ada I've ever seen, and I bet
it's the only Ada most people have seen either.  Unfortunately, back
then there were few good implementations, and even the standard
packages were difficult to use even if they worked (I even had
troubles getting hello-world to work).  If the packages have gotten
standardized and are easy to use then that in itself would eliminate
most of the problems I had with it (unlike C++ where every vendor
claims to have a different industry wide standard class libary, with a
tendency to base things off of templates to ensure slow bulky apps).
--

If you had trouble getting hello world to work, then there are only three
possible explanations

1. You did not know what you were doing
2. The Ada compiler you were using was completely broken
3. The Ada compiler you were using was incorrectly installed

Solid Ada 83 compilers in which the standard packages work just fine
(the I/O packages have always been standardized) have been around
for a long time.





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-26  0:00                                                     ` Richard A. O'Keefe
  1996-08-26  0:00                                                       ` Mark Wooding
  1996-08-26  0:00                                                       ` Tim Behrendsen
@ 1996-08-26  0:00                                                       ` madscientist
  1996-08-29  0:00                                                         ` Richard A. O'Keefe
  1996-08-31  0:00                                                       ` Tanmoy Bhattacharya
  1996-09-04  0:00                                                       ` Patrick Horgan
  4 siblings, 1 reply; 688+ messages in thread
From: madscientist @ 1996-08-26  0:00 UTC (permalink / raw)



In <4vroh3$17f@goanna.cs.rmit.edu.au>, ok@goanna.cs.rmit.edu.au (Richard A. O'Keefe) writes:
>"Tim Behrendsen" <tim@airshields.com> writes:

.snip..

>standardisers say "that's a quality of implementation issue which is
>outside our scope".
>
>For an *exact* parallel, consider function inlining.
>These days, I expect a good compiler to at the very minimum support
>programmer-supplied inlining hints, and the Fortran, Pascal, and C
>compilers I normally use do it _automatically_.  This is another
>of those optimisations which has been known and used for decades in
>the Lisp community, which lets you write your program using a function
>call instead of a macro.  I guess inline functions are another
>"left-field optimisation".
>
>> For example, if someone does this:
>
>>void sub(char *s)
>>{
>>    ...
>>    if (strlen(s) == 0) {     /* check for null string */
>>        ...
>>    }
>>}
>
>>and they know it's stupid, but they also know the compiler
>>just happens to have optimization for it, should they
>>still be shot?  "Diana, get me my rifle."
>
>You have prejudged the issue by stipulating that the programmer
>doing this KNOWS it is stupid.  To start with, can I just make the
>obvious point that in a very large number of currently used languages,
>the direct equivalent of this would be the very best thing to do.
>There are a lot of *good* programmers out there (definition: in a
>reasonable amount of time they can write readable code that works)
>who would *not* "know that it's stupid".  It is, for example, much
>less stupid than writing
>	if (s == "") 
>
>Let's see what is good about this code:
> - it is correct (not a NULL test, not a "" test, but a length test)
> - it is clear (much clearer to people not immersed in C than "!*s")
>Doesn't look stupid to me yet.
>What's bad about it?
> - it takes O(|s|) time.
>BUT
> - standard optimisations like common subexpression elimination, can be
>   applied, so that any number of calls to strlen(s) without any
>   intervening potential change to s are no more expensive than a
>   single call
> - in many applications, all or a large part of s will be processed
>   if it is not empty, so the cost of sub() will be O(|s|) anyway
>so it's only "stupid" if it occurs in a loop and you know your
>compiler _doesn't_ move loop invariants out.

I just wanted to point out that this ridiculous conversation has taken
reality right out the window!

if(strlen(s)==0) 

is *NOT* functionally equivalent to 

if(s=="")

Wanna know why??

* It would only work if s could only refer to string constants,
and then only if the compiler removed duplicate constants! If a null string
were allocated with strdup("") then its address would be somewhere else
entirely and the expression would fail.

The second expression compares the pointer s to the address of the string
constant "" (null string). If that constant resides in more than one place
in memory (because the compiler didn't remove redundant copies) then the
expression would fail even if it were "essentially" true.
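
A minimal, self-contained illustration of the point (the variable name and the
strdup() call are just for this example):

	#include <stdio.h>
	#include <stdlib.h>
	#include <string.h>

	int main(void)
	{
	    char *s = strdup("");              /* freshly allocated empty string */

	    if (s == "")                       /* compares two addresses: almost */
	        puts("pointer test fired");    /* certainly NOT taken            */

	    if (strlen(s) == 0)                /* compares lengths: taken */
	        puts("length test fired");

	    if (*s == '\0')                    /* idiomatic O(1) test: taken */
	        puts("first-byte test fired");

	    free(s);
	    return 0;
	}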

-Pete

BTW: I find it good practice to put constant expressions on the left side of
comparisons to point out situations where I might forget one of the '='.





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-26  0:00                                                     ` Richard A. O'Keefe
@ 1996-08-26  0:00                                                       ` Mark Wooding
  1996-08-30  0:00                                                         ` Kaz Kylheku
  1996-08-30  0:00                                                         ` Richard A. O'Keefe
  1996-08-26  0:00                                                       ` Tim Behrendsen
                                                                         ` (3 subsequent siblings)
  4 siblings, 2 replies; 688+ messages in thread
From: Mark Wooding @ 1996-08-26  0:00 UTC (permalink / raw)



Richard A. O'Keefe <ok@goanna.cs.rmit.edu.au> wrote:

>  - standard optimisations like common subexpression elimination, can be
>    applied, so that any number of calls to strlen(s) without any
>    intervening potential change to s are no more expensive than a
>    single call

and then later:

> strlen() is a pure function and its argument does not change, so the
> strlen(s) computation can be hoisted out of the loop.

Erk!  No it isn't.  A pure function is one whose value depends only on
its arguments.  sin() is pure.  strlen() isn't.

The argument to strlen() is a pointer to a string whose length we want.
It's the address, not the string itself.  Because the string can change
between calls to strlen(), it might give different results given the
same string address.  So the compiler can't just use its general `pure'
function mechanism for common-subexpression-optimising strlen().
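
A short demonstration of why the argument value alone does not determine
strlen()'s result (the buffer here is invented for the example):

	#include <stdio.h>
	#include <string.h>

	int main(void)
	{
	    char buf[] = "hello";
	    size_t before = strlen(buf);    /* 5 */

	    buf[2] = '\0';                  /* the string changes; the pointer
	                                       we pass to strlen() does not   */
	    size_t after = strlen(buf);     /* 2: same argument, new result */

	    printf("%lu %lu\n", (unsigned long)before, (unsigned long)after);
	    return 0;
	}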

Now, can a compiler do clever things and optimise strlen() all by
itself?  You comment that it might spot assignments to the string.  This
is true, but not all such assignments are visible to the compiler.  For
automatic buffers, this /is/ true, but (in my experience) calls to
strlen() and similar functions are comparatively rarely used on
locally allocated buffers.  If the buffer is not local to the function,
there's no guarantee that (in an extreme case) its address hasn't been
made available to a signal handler which maliciously changes the
string's length.

Prove me wrong.
-- 
[mdw]

`When our backs are against the wall, we shall turn and fight.'
		-- John Major





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-21  0:00                                                   ` Tim Behrendsen
  1996-08-22  0:00                                                     ` Bengt Richter
@ 1996-08-26  0:00                                                     ` Richard A. O'Keefe
  1996-08-26  0:00                                                       ` Mark Wooding
                                                                         ` (4 more replies)
  1 sibling, 5 replies; 688+ messages in thread
From: Richard A. O'Keefe @ 1996-08-26  0:00 UTC (permalink / raw)



"Tim Behrendsen" <tim@airshields.com> writes:

>They may exist, but does it exist universally?  The point is that to
>go out of your way to use left-field optimizations is just bad
>portability practice.

"Effen it wur good enough fur granpaw, it's good enough fur me."
in other words.  By the way, I find it EXTREMELY odd to call a technique
that has been known and used for nearly 20 years a "left-field" one.
Behrendsen is putting things exactly backwards;
*he* is advocating "do not use a technique in your programming language,
no matter how natural or readable it is, unless you know that every
compiler you use will implement it reasonably".  Well, I *can't* know
that.  Heck, I can't even know that
	X := X+1;
will take constant or even small time; almost all programming language
standardisers say "that's a quality of implementation issue which is
outside our scope".

For an *exact* parallel, consider function inlining.
These days, I expect a good compiler to at the very minimum support
programmer-supplied inlining hints, and the Fortran, Pascal, and C
compilers I normally use do it _automatically_.  This is another
of those optimisations which has been known and used for decades in
the Lisp community, which lets you write your program using a function
call instead of a macro.  I guess inline functions are another
"left-field optimisation".

> For example, if someone does this:

>void sub(char *s)
>{
>    ...
>    if (strlen(s) == 0) {     /* check for null string */
>        ...
>    }
>}

>and they know it's stupid, but they also know the compiler
>just happens to have optimization for it, should they
>still be shot?  "Diana, get me my rifle."

You have prejudged the issue by stipulating that the programmer
doing this KNOWS it is stupid.  To start with, can I just make the
obvious point that in a very large number of currently used languages,
the direct equivalent of this would be the very best thing to do.
There are a lot of *good* programmers out there (definition: in a
reasonable amount of time they can write readable code that works)
who would *not* "know that it's stupid".  It is, for example, much
less stupid than writing
	if (s == "") 

Let's see what is good about this code:
 - it is correct (not a NULL test, not a "" test, but a length test)
 - it is clear (much clearer to people not immersed in C than "!*s")
Doesn't look stupid to me yet.
What's bad about it?
 - it takes O(|s|) time.
BUT
 - standard optimisations like common subexpression elimination, can be
   applied, so that any number of calls to strlen(s) without any
   intervening potential change to s are no more expensive than a
   single call
 - in many applications, all or a large part of s will be processed
   if it is not empty, so the cost of sub() will be O(|s|) anyway
so it's only "stupid" if it occurs in a loop and you know your
compiler _doesn't_ move loop invariants out.

I would go so far as to say that

	int count_blanks(char const * const s) {
	    int i, r;

	    r = 0;
	    for (i = 0; i < strlen(s); i++)
		if (s[i] == ' ')
		    r++;
	    return r;
	}

is a _good_ function.  The ONLY strlen() optimisation that is needed here
is one that is completely general:  strlen() is a pure function and its
argument does not change, so the strlen(s) computation can be hoisted out
of the loop.  Since the optimisation in question is one which has been in
use since the very early Fortran days, it can hardly be called "left-field".
A compiler that _doesn't_ do this is indeed a poor compiler, these days.

Here I am actually arguing against my own inclination, because I strongly
dislike seeing strlen() used this way, and will take advantage of any
excuse to "fix" it.  I certainly tell students why not to do this.  But I
cannot in good conscience call clear correct code "stupid".
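
For comparison, here is what the hand-hoisted form looks like (a sketch of
what the optimisation effectively produces, not any particular compiler's
output):

	#include <string.h>

	/* Same job as count_blanks() above, but strlen() runs once,
	   outside the loop, instead of once per iteration.           */
	int count_blanks_hoisted(char const * const s) {
	    size_t i, n;
	    int r;

	    r = 0;
	    n = strlen(s);
	    for (i = 0; i < n; i++)
		if (s[i] == ' ')
		    r++;
	    return r;
	}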

>> Scuse please?  WHAT reality?  The reality in this case is that you can
>> in fact in just about any reasonable programming language sort data
>> WITHOUT moving it.

>Wait, hold the phone!  "Sorts the data without moving it"?  What,
>is APL's sorting algorithm O(1)?

No of course not.  I didn't say that.  I said that
	in just about any reasonable programming language
[I did not say APL]
	you can sort data WITHOUT moving it
[I did not say O(1)].

My claim is that YOU CAN SORT THE DATA WITHOUT MOVING *THE DATA*.
I don't see how any sane reader could translate that into O(1),
unless perhaps said reader was under the delusion that sorting
involves nothing but movement!

>Yes, it may not actually get
>sorted until it gets printed, but that's irrelevent to the fact
>that it eventually gets sorted.

Printing is a complete irrelevancy indeed, dragged in solely by
Behrendsen.  It is another of the straw men I could well do without.

Sorting is not about movement.  It is about *determining a permutation*.
Depending on the type of the keys, it may do divisions (as in bucket
sort and radix sort) or comparisons (as in merge sort).
When you are sorting large or variable length records, it is of
practical importance that you can sort the data by returning a
permutation vector WITHOUT MOVING THE ORIGINAL DATA.  This is not
an academic point, because the permutation vector may be 10s or 100s
of times smaller than the original data.

My reference to the APL grade-up primitive made all clear:  grade-up
returns a permutation vector.  You can use a permutation vector to
access the original elements in ascending order, descending order, or
by binary search, or anything you want, without having to pay the
price of moving large records.  One of the things which made the
Bentley & McIlroy "Engineered" version of qsort() complicated was
having to work around the fact that qsort() is set up to sort large
fixed length records; writing the code to take advantage when the
elements turn out to fit into machine registers is tricky, and in
fact not portable.
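
A small C sketch of the permutation-vector idea (the names grade_up and
cmp_idx are invented for this example; APL's grade-up primitive does the
equivalent in a single operation):

	#include <stdlib.h>

	static const double *g_data;    /* data being ordered; it never moves */

	static int cmp_idx(const void *a, const void *b)
	{
	    size_t i = *(const size_t *)a, j = *(const size_t *)b;
	    return (g_data[i] > g_data[j]) - (g_data[i] < g_data[j]);
	}

	/* Fill perm[0..n-1] so that data[perm[0]] <= data[perm[1]] <= ...
	   Only the small index vector is rearranged; the records stay put. */
	void grade_up(const double *data, size_t *perm, size_t n)
	{
	    size_t i;

	    for (i = 0; i < n; i++)
	        perm[i] = i;
	    g_data = data;
	    qsort(perm, n, sizeof *perm, cmp_idx);
	}

The file-scope pointer is only there because ISO C's qsort() comparator takes
no context argument; with large or variable-length records the win is that
qsort() shuffles word-sized indices rather than the records themselves.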

>Indeed, and I think APL is kind of a neat language to learn.  But
>it hides so much about the internals of the computer, that I think
>it would give a student too many false impressions about how
>things really work.

Of course APL hides the internals of the computer.
That's what a high level language is FOR!
I've done enough programming in ESPOL (B6700 only) and BLISS-10 (DEC-10
only) to be grateful for languages that hide machine details.
I have manuals for
 - System/360
 - Z80 (somewhere)
 - 8051
 - 8086
 - 80960
 - Intergraph Clipper
 - Motorola 68000
 - Motorola 88000
 - SPARC
 - PA-RISC
 - (part of) MIPS 3000 (in Hennessy & someone).
and have studied them.  I have not written assembly code for the Z80, 8051,
80960, or PA-RISC, though I have debugged other people's code for the
Z80 and 8051.  I have written assembler for the others, and for the
DEC-10, PDP-11, VAX, Pyramid, IBM RT-PC, and NS32532 (oh yes, and the
Computer Automation Alpha LSI 2 and the PRIME 400).  I know more about
the internals of computers than I really want to, and the better a job
the language I'm using does of hiding them, the better I like it.
(There is a *lot* to like about Oberon, but I cannot like Wirth's
reversion to C-style integral types.)

>I actually quite agree with this; it's how you get the "high level
>thinking" that I think is an issue.

There is an Edinburgh paper from the 70s that I want to quote from.
It came from the EMAS project.  The EMAS operating system was designed
and written at Edinburgh and written an Algol-like language called
IMP 77.  They observed that the operating system modules written by
people skilled in assembly language (and who even went to the extreme
of checking the assembly code produced by the compiler to make sure it
was efficient) tended to be
 - bigger,
 - less readable, and
 - SLOWER
than that produced by people who had a "high level language" perspective.
I'll try to remember to bring the paper in tomorrow so I can quote it
exactly.

>Even in a mathematical proof,
>you are talking about a sequence of micro-steps.  Yes, most
>proofs are built of other proofs, but I think this is more of
>a "proof macro language" than a "high-level mathematical
>language" (whatever that means).

I don't quite see proofs by duality or symmetry as simple macro
substitution.  I think your example here actually gives more comfort
to the anti-assembly camp than to your own.

I once took a masters level course on the Spinor Calculus.
The lecturer (who came from the Sorbonne) wanted to make sure we
understood the whole thing thoroughly, so he started with metamathematics
and constructed everything from the ground up.  We were nearly half way
through the course before we got to the rational numbers.  We ended up
with only two lectures that were actually about Spinors, and  I _still_
don't know what they are.  (I've started reading Penrose & Rindler to
find out.)  In that case at least, over-emphasis on "fundamentals" was
a major impediment to the high level understanding I really wanted.

For what it's worth, students here *do* get taught about computer
architecture, and they *do* get taught a bit of 68000 assembly code,
and they *do* work with little single board machines so they can try 
their 68000 code out.  But this is understood as a computer architecture
course, not as a data structures and algorithms course.  (The course
is not confined to the 68000, but that's all _they_ have to program at
that level.)

-- 
Australian citizen since 14 August 1996.  *Now* I can vote the xxxs out!
Richard A. O'Keefe; http://www.cs.rmit.edu.au/~ok; RMIT Comp.Sci.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-18  0:00                       ` Robert Dewar
  1996-08-18  0:00                         ` Tim Behrendsen
@ 1996-08-26  0:00                         ` Patrick Horgan
  1996-08-27  0:00                           ` Alan Peake
  1996-08-29  0:00                           ` Darin Johnson
  1 sibling, 2 replies; 688+ messages in thread
From: Patrick Horgan @ 1996-08-26  0:00 UTC (permalink / raw)



In article <dewar.840342288@schonberg>, dewar@cs.nyu.edu (Robert Dewar) writes:
> Tim says
> 
> "Because I've found that people tend to stick with the first
> [dare I use the word] paradigm that they are introduced to.
> Everything else they learn will be compared against the first
> thing they learn"
> 
> How true is this? Certainly true to some extent, and is of course
> the fundamental reason why it is a huge mistake to teach assembly
> to begin with.

Maybe I'm an anomaly, but this isn't true for me at all.  My first language
was BASIC (the old kind with line-numbers;) my second 6502 assembler, and
then I learned small C well enough to get it running on my machine so I
could start learning C.  This bizarre start exposed me to a lot of paradigms
none of which predominate in how I approach a task now.  Rather as the years
go by I learn more and more paradigms, synethise combinations of them, make
up others, and apply everything applicable to every problem across my desk.
It was all important to my learning process and I've certainly come along
way since my idea of good code was to leave spaces in the line numbers so
I wouldn't have to renumber to insert lines.  

I certainly didn't imprint on any particular paradigm like I did on an editor.
(No I'm not telling, we've got too many religious wars going on now;)

-- 

   Patrick J. Horgan    patrick@broadvision.com   Have horse will ride.
   Opinions mine, not my employer's except by most bizarre coincidence.






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-26  0:00                                                     ` Richard A. O'Keefe
  1996-08-26  0:00                                                       ` Mark Wooding
@ 1996-08-26  0:00                                                       ` Tim Behrendsen
  1996-08-29  0:00                                                         ` Richard A. O'Keefe
  1996-08-26  0:00                                                       ` madscientist
                                                                         ` (2 subsequent siblings)
  4 siblings, 1 reply; 688+ messages in thread
From: Tim Behrendsen @ 1996-08-26  0:00 UTC (permalink / raw)



Richard A. O'Keefe <ok@goanna.cs.rmit.edu.au> wrote in article
<4vroh3$17f@goanna.cs.rmit.edu.au>...
> "Tim Behrendsen" <tim@airshields.com> writes:
> 
> >They may exist, but does it exist universally?  The point is that to
> >go out of your way to use left-field optimizations is just bad
> >portability practice.
> 
> "Effen it wur good enough fur granpaw, it's good enough fur me."
> in other words.  By the way, I find it EXTREMELY odd to call a technique
> that has been known and used for nearly 20 years a "left-field" one.
> Behrendsen is putting things exactly backwards;
> *he* is advocating "do not use a technique in your programming language,
> no matter how natural or readable it is, unless you know that every
> compiler you use will implement it reasonably".  Well, I *can't* know
> that.  Heck, I can't even know that
> 	X := X+1;
> will take constant or even small time; almost all programming language
> standardisers say "that's a quality of implementation issue which is
> outside our scope".

Just because it is known doesn't mean it's implemented everywhere,
or even that people bother to put it in everywhere. I live in
what's really everywhere, not what's theoretically everywhere.

> For an *exact* parallel, consider function inlining.
> These days, I expect a good compiler to at the very minimum support
> programmer-supplied inlining hints, and the Fortran, Pascal, and C
> compilers I normally use do it _automatically_.  This is another
> of those optimisations which has been known and used for decades in
> the Lisp community, which lets you write your program using a function
> call instead of a macro.  I guess inline functions are another
> "left-field optimisation".
> 
> > For example, if someone does this:
> 
> >void sub(char *s)
> >{
> >    ...
> >    if (strlen(s) == 0) {     /* check for null string */
> >        ...
> >    }
> >}
> 
> >and they know it's stupid, but they also know the compiler
> >just happens to have optimization for it, should they
> >still be shot?  "Diana, get me my rifle."
> 
> You have prejudged the issue by stipulating that the programmer
> doing this KNOWS it is stupid.  To start with, can I just make the
> obvious point that in a very large number of currently used languages,
> the direct equivalent of this would be the very best thing to do.

We aren't talking about every language, we are talking about C.

> There are a lot of *good* programmers out there (definition: in a
> reasonable amount of time they can write readable code that works)
> who would *not* "know that it's stupid".  It is, for example, much
> less stupid than writing
> 	if (s == "") 

This is not correct code, so of course the former is "less stupid",
but does that make it non-stupid?

> Let's see what is good about this code:
>  - it is correct (not a NULL test, not a "" test, but a length test)
>  - it is clear (much clearer to people not immersed in C than "!*s")
> Doesn't look stupid to me yet.
> What's bad about it?
>  - it takes O(|s|) time.
> BUT
>  - standard optimisations like common subexpression elimination, can be
>    applied, so that any number of calls to strlen(s) without any
>    intervening potential change to s are no more expensive than a
>    single call
>  - in many applications, all or a large part of s will be processed
>    if it is not empty, so the cost of sub() will be O(|s|) anyway
> so it's only "stupid" if it occurs in a loop and you know your
> compiler _doesn't_ move loop invariants out.

Yes, yes, I've heard this before.  The compiler knows all, sees all,
fixes all.  If my strings are a few thousand bytes long, you
don't think it may be *slightly* inefficient?

> I would go so far as to say that
> 
> 	int count_blanks(char const * const s) {
> 	    int i, r;
> 
> 	    r = 0;
> 	    for (i = 0; i < strlen(s); i++)
> 		if (s[i] == ' ')
> 		    r++;
> 	    return r;
> 	}
> 
> is a _good_ function.  The ONLY strlen() optimisation that is needed here
> is one that is completely general:  strlen() is a pure function and its
> argument does not change, so the strlen(s) computation can be hoisted out
> of the loop.  Since the optimisation in question is one which has been in
> use since the very early Fortran days, it can hardly be called
"left-field".
> A compiler that _doesn't_ do this is indeed a poor compiler, these days.
> 
> Here I am actually arguing against my own inclination, because I strongly
> dislike seeing strlen() used this way, and will take advantage of any
> excuse to "fix" it.  I certainly tell students why not to do this.  But I
> cannot in good conscience call clear correct code "stupid".

I can easily call clear correct code stupid if it is blatantly
inefficient.  I don't expect the compiler to fix bad code, and
thus I am never disappointed.
 
> >> Scuse please?  WHAT reality?  The reality in this case is that you can
> >> in fact in just about any reasonable programming language sort data
> >> WITHOUT moving it.
> 
> >Wait, hold the phone!  "Sorts the data without moving it"?  What,
> >is APL's sorting algorithm O(1)?
> 
> No of course not.  I didn't say that.  I said that
> 	in just about any reasonable programming language
> [I did not say APL]
> 	you can sort data WITHOUT moving it
> [I did not say O(1)].
> 
> My claim is that YOU CAN SORT THE DATA WITHOUT MOVING *THE DATA*.
> I don't see how any sane reader could translate that into O(1),
> unless perhaps said reader was under the delusion that sorting
> involves nothing but movement!

OK, fine.  Of course you can keep pointers to the data and
sort the pointers, but that is irrelevant to the fact that some
data somewhere must be moved in order to produce a sort.
 

> >Indeed, and I think APL is kind of a neat language to learn.  But
> >it hides so much about the internals of the computer, that I think
> >it would give a student too many false impressions about how
> >things really work.
> 
> Of course APL hides the internals of the computer.
> That's what a high level language is FOR!
> I've done enough programming in ESPOL (B6700 only) and BLISS-10 (DEC-10
> only) to be grateful for languages that hide machine details.
> I have manuals for
>  - System/360
>  - Z80 (somewhere)
>  - 8051
>  - 8086
>  - 80960
>  - Intergraph Clipper
>  - Motorola 68000
>  - Motorola 88000
>  - SPARC
>  - PA-RISC
>  - (part of) MIPS 3000 (in Hennesey & someone).
> and have studied them.  I not written assembly code for the Z80, 8051,
> 80960, or PA-RISC, though I have debugged other people's code for the
> Z80 and 8051.  I have written assembler for the others, and for the
> DEC-10, PDP-11, VAX, Pyramid, IBM RT-PC, and NS32532 (oh yes, and the
> Computer Automation Alpha LSI 2 and the PRIME 400).  I know more about
> the internals of computers than I really want to, and the better a job
> the language I'm using does of hiding them, the better I like it.
> (There is a *lot* to like about Oberon, but I cannot like Wirth's
> reversion to C-style integral types.)

I am grateful that most of the details are hidden from me, too,
but we are talking about education.  We don't want to hide the
details from the students, we want them to learn.

You take all this for granted because you've been doing it
long enough to where you understand how to think.  I don't
think you are able to appreciate the difference between you and
someone who has never done it before.  There is a significant
amount of "learning to think" that has to be done.

> >I actually quite agree with this; it's how you get the "high level
> >thinking" that I think is an issue.
> 
> There is an Edinburgh paper from the 70s that I want to quote from.
> It came from the EMAS project.  The EMAS operating system was designed
> and written at Edinburgh and written an Algol-like language called
> IMP 77.  They observed that the operating system modules written by
> people skilled in assembly language (and who even went to the extreme
> of checking the assembly code produced by the compiler to make sure it
> was efficient) tended to be
>  - bigger,
>  - less readable, and
>  - SLOWER
> than that produced by people who had a "high level language" perspective.
> I'll try to remember to bring the paper in tomorrow so I can quote it
> exactly.

Uh, the people who coded in assembly -- AND CHECKED THE COMPILER
OUTPUT -- produced programs that were bigger and slower?  I think
that there were other factors involved other than just "assembly
vs HLL" style learning.

> >Even in a mathematical proof,
> >you are talking about a sequence of micro-steps.  Yes, most
> >proofs are built of other proofs, but I think this is more of
> >a "proof macro language" than a "high-level mathematical
> >language" (whatever that means).
> 
> I don't quite see proofs by duality or symmetry as simple macro
> substitution.  I think your example here actually gives more comfort
> to the anti-assembly camp than to your own.
> 
> I once took a masters level course on the Spinor Calculus.
> The lecturer (who came from the Sorbonne) wanted to make sure we
> understood the whole thing thoroughly, so he started with metamathematics
> and constructed everything from the ground up.  We were nearly half way
> through the course before we got to the rational numbers.  We ended up
> with only two lectures that were actually about Spinors, and  I _still_
> don't know what they are.  (I've started reading Penrose & Rindler to
> find out.)  In that case at least, over-emphasis on "fundamentals" was
> a major impediment to the high level understanding I really wanted.
> 
> For what it's worth, students here *do* get taught about computer
> architecture, and they *do* get taught a bit of 68000 assembly code,
> and they *do* work with little single board machines so they can try 
> their 68000 code out.  But this is understood as a computer architecture
> course, not as a data structures and algorithms course.  (The course
> is not confined to the 68000, but that's all _they_ have to program at
> that level.)

Again, I have to go back to the fact that *we have the world
that you want*.  And it doesn't work.

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-08-27  0:00             ` Jeffrey C. Dege
  1996-08-27  0:00               ` Bob Cousins
  1996-08-27  0:00               ` Steve Heller
@ 1996-08-27  0:00               ` Craig Franck
  1996-08-27  0:00                 ` Ted Dennison
  1996-08-27  0:00               ` Ted Dennison
                                 ` (2 subsequent siblings)
  5 siblings, 1 reply; 688+ messages in thread
From: Craig Franck @ 1996-08-27  0:00 UTC (permalink / raw)
  To: jdege


jdege@jdege.visi.com (Jeffrey C. Dege) wrote:
>On 13 Aug 1996 10:44:56 -0700, Darin Johnson <djohnson@tartarus.ucsd.edu> wrote:
>>Too many people fall asleep in algorithms class
>>(then bitch about the waste of time later).
>
>It's odd how little things can bring back memories.
>
>Sitting there at nine o'clock at night, because I couldn't fit the
>day class into my schedule, listening to Sartaj Sahni drone:
>
>    And in step 27, we set temp.prev.next to temp.next.
>
>Then watching him erase the arrow connecting bubble C to bubble B,
>then watching him draw an array connecting bubble C to bubble A.
>
>    And in step 28, we set temp.next.prev to temp.prev.
>
>Then watching him erase the arrow connecting bubble A to bubble B,
>then watching him draw an array connecting bubble A to bubble C.
>
>    And in step 30, we...
>
>Has _anyone_ had an instructer who brought any excitement to this stuff,
>or is it inherently impossible to teach without becoming dull and tedious?
>
>-- 
>Anyone who cannot cope with mathematics is not fully human.  At best he
>is a tolerable subhuman who has learned to wear shoes, bathe and not
>make messes in the house.
>                -- Lazarus Long, "Time Enough for Love"
>

Robert Heinlein, "Time Enough for Algorithms". It's a story about a 
liberal arts student who gets his brain transplanted into the body of 
a famous mathematician. So of course everyone takes him seriously, and 
he then goes on to describe the "World's Most Important Algorithm".

I had a professor who used to say "I am here to teach, not entertain!".
Why the two had to become separated is a mystery to me...

-- 
Craig  
clfranck@worldnet.att.net 
Manchester, NH
"You see all around you people engaged in making others live
lives which are not their own, while they themselves care
nothing for their own real lives -- men who hate life though
they fear death". -- William Morris, "News from Nowhere" (1891)






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-08-13  0:00           ` Darin Johnson
                               ` (2 preceding siblings ...)
  1996-08-16  0:00             ` Dr E. Buxbaum
@ 1996-08-27  0:00             ` Jeffrey C. Dege
  1996-08-27  0:00               ` Bob Cousins
                                 ` (5 more replies)
  3 siblings, 6 replies; 688+ messages in thread
From: Jeffrey C. Dege @ 1996-08-27  0:00 UTC (permalink / raw)



On 13 Aug 1996 10:44:56 -0700, Darin Johnson <djohnson@tartarus.ucsd.edu> wrote:
>Too many people fall asleep in algorithms class
>(then bitch about the waste of time later).

It's odd how little things can bring back memories.

Sitting there at nine o'clock at night, because I couldn't fit the
day class into my schedule, listening to Sartaj Sahni drone:

    And in step 27, we set temp.prev.next to temp.next.

Then watching him erase the arrow connecting bubble C to bubble B,
then watching him draw an arrow connecting bubble C to bubble A.

    And in step 28, we set temp.next.prev to temp.prev.

Then watching him erase the arrow connecting bubble A to bubble B,
then watching him draw an arrow connecting bubble A to bubble C.

    And in step 30, we...

Has _anyone_ had an instructor who brought any excitement to this stuff,
or is it inherently impossible to teach without becoming dull and tedious?

-- 
Anyone who cannot cope with mathematics is not fully human.  At best he
is a tolerable subhuman who has learned to wear shoes, bathe and not
make messes in the house.
                -- Lazarus Long, "Time Enough for Love"





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-26  0:00                         ` Patrick Horgan
@ 1996-08-27  0:00                           ` Alan Peake
  1996-08-27  0:00                             ` Steve Heller
                                               ` (2 more replies)
  1996-08-29  0:00                           ` Darin Johnson
  1 sibling, 3 replies; 688+ messages in thread
From: Alan Peake @ 1996-08-27  0:00 UTC (permalink / raw)



In article <4vqt9t$3jk@ns.broadvision.com> patrick@broadvision.com (Patrick Horgan) writes:
>From: patrick@broadvision.com (Patrick Horgan)
>Subject: Re: What's the best language to learn? [was Re: Should I learn C or
>Pascal?]
>Date: 26 Aug 1996 01:06:05 GMT

>In article <dewar.840342288@schonberg>, dewar@cs.nyu.edu (Robert Dewar) writes:
>> Tim says
>> 
>> "Because I've found that people tend to stick with the first
>> [dare I use the word] paradigm that they are introduced to.
>> Everything else they learn will be compared against the first
>> thing they learn"
>> 
>> How true is this? Certainly true to some extent, and is of course
>> the fundamental reason why it is a huge mistake to teach assembly
>> to begin with.

>Maybe I'm an anomaly, but this isn't true for me at all.  My first language
>was BASIC (the old kind with line-numbers;) my second 6502 assembler, and
>then I learned small C well enough to get it running on my machine so I
>could start learning C. 

Not true for me either. My first language was Fortran 77 but I haven't used it 
in 20 years. Next was Basic which gets an occasional run. Various assemblers 
still get used from time to time but just about everything now gets done in 
C. I started on MSVC++ a few months ago thinking that this would be the next 
stage up but I'm not so sure now. The learning curve has been much steeper 
than I imagined (maybe due to advancing age, too!).  The handbooks form a pile 
about a foot high! 
However, I don't believe that assembly should be taught to CS students 
initially, if at all. Those that need to know it will pick it up later (on 
the job training etc.) but most CS graduates will never use it or even need 
it. Whatever advantages they lose by not learning assembler will be more 
than offset by the increased productivity of a higher level language.
As far as learning the basics of algorithms goes, what's wrong with the good old 
flow chart?  I still use them for complicated routines.

>I certainly didn't imprint on any particular paradigm like I did on an 
>editor.
>(No I'm not telling, we've got too many religious wars going on now;)

Come on - you can tell us. It wasn't WordStar, was it?  Edlin?

Alan






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-19  0:00                                               ` Robert Dewar
  1996-08-22  0:00                                                 ` Stephen Baynes
@ 1996-08-27  0:00                                                 ` Richard A. O'Keefe
  1 sibling, 0 replies; 688+ messages in thread
From: Richard A. O'Keefe @ 1996-08-27  0:00 UTC (permalink / raw)



dewar@cs.nyu.edu (Robert Dewar) writes:
>In fact the critical thing to preserve the number of compares is to run
>the inner loop backwards. That comes naturally in C, but not in Ada,
>where the preferable code is to find the insertion point and then do a
>slice assignment to shuffle the data (the slice assignment is not only
>at a preferable higher level, but with a good optimizing compiler has
>the potential of generating much more efficient code on some architectures
>than the individual moves.

Whether the language is C or Ada has nothing to do with it.
The key step in insertion sort is moving from

	+--------+-+--------+
    A:	|   ab   |x|    w   |
	+--------+-+--------+
	 L        I        U

where	A[L..I-1] is-permutation-of old A[L..I-1]
  and	A[L..I-1] is-sorted
  and	A[I..U] = old A[I..U]

to

	+----+-+----+--------+
    A:	|  a |x| b  |    w   |
	+----+-+----+--------+
	 L    J    I        U

where	all(A[L..J-1] <= x)
  and	all(x < A[J+1..I])

This can be viewed as two sub-tasks:
    - find J
    - rotate A[J..I] one place

I would expect any good programmer to think "hmm, I *have* to touch
the `b' elements in order to move them; do I really have to touch the
`a' elements as well to find J, or can I fuse these two tasks to do
both while touching only the `b' elements?"

As for the slice being faster, here are the figures I get from
Gnat 3.04 using
	gnatmake -O4 -gnatp -gnatr insert.adb
I already had several versions of insertion sort coded, amongst other
things to explore the question of whether slice assignment helped.
Thanks for your bubble sort procedures.  I have retyped them like this
to make the comparison more direct:

    procedure bubbl1(A: in out Sequence) is
	Switched: Boolean;
	T: Element;
    begin
	loop
	    Switched := False;
	    for J in A'First .. A'Last - 1 loop
		if A(J+1) < A(J) then
		    T := A(J);
		    A(J) := A(J+1);
		    A(J+1) := T;
		    Switched := True;
		end if;
	    end loop;
	    exit when not Switched;
	end loop;
    end bubbl1;

    procedure bubbl2(A: in out Sequence) is
	T: Element;
    begin
	<<Restart>>
	    for J in A'First .. A'Last - 1 loop
		if A(J+1) < A(J) then
		    T := A(J);
		    A(J) := A(J+1);
		    A(J+1) := T;
		    for K in J+1 .. A'Last-1 loop
			if A(K+1) < A(K) then
			    T := A(K);
			    A(K) := A(K+1);
			    A(K+1) := T;
			end if;
		    end loop;
		    goto Restart;
		end if;
	    end loop;
    end bubbl2;

Now, here are the times I got on a SPARCstation-10 (sun4m).
All times are in nanoseconds.  The best case times should not be
taken very seriously; all the methods go so fast it is hard to time
them.

Best	Average  	Worst	Method
400*N	 45*N**2    90*N**2   Naive insertion sort
300*N	110*N**2   240*N**2   Slice assignment insertion sort
200*N	180*N**2   200*N**2   bubbl1 (bubble sort with boolean variable)
200*N	170*N**2   200*N**2   bubbl2 (bubble sort with jump)

Result 1:  I was unable to obtain any evidence that bubbl2 is faster
	   than bubbl1.  (Remember that these times are _not_ precise.)
Result 2:  Using GNAT 3.04 with array bounds checking switched off and
	   optimisation high, the slice assignment does not pay off.
Result 3:  Insertion sort is much faster than bubble sort, except for
	   the case of an array that is already in sort order.

>So, you ran the inner loop backwards and indeed got the number of
>compares right, but you still have the overhead of both loops testing
>the termination condition,

and the evidence *does* show this happening in the best case.

>I find Hillam's initial bubble sort algorithm curious. I certainly don't
>call this bubble sort.

I call your code bubble sort too, and was rather surprised by Hillam's
book.  I used his code merely because it happened to be handy.  I guess
this is evidence that Hillam's book should be moved to the "Sturgeon's Law"
pile, which would not surprise me greatly.

>You missed the point. In the bubble sort, the outer loop is executed only
>once for a sorted list. For your insertion sort, the outer loop runs a full
>number of times. Your insertion sort has 2*N loop termination tests, while
>a properly written bubble sort has N loop termination tests.

Of course, the times I reported above apply to sorting an array of Integer.
I would say that bubble sort does N index comparisons and N element
comparisons in the best case, while insertion sort does 2N index comparisons
and N element comparisons, so we're comparing 2N with 3N, rather than 1N with
2N.  In the cases I have cared about in the past, the element comparisons have
been much more costly than index comparisons.

>I do not seem to count five variables here! More like two, one of which
>is a loop control constant.

Expanding Exch() inline, I count three variables for bubble sort:
	- a boolean
	- an index
	- an element
which is pretty close to insertion sort's two indices and one element.

>I do not know any way to write the insertion sort so that it has only N
>loop termination tests, certainly your example has 2*N tests.

Well, there are several pretty obvious things to try.

(1) Do one pass of selection sort, to find the left-most smallest element,
    and rotate the array so that it goes first.  Now we have a sentinel,
    and the index comparison can be dropped from the inner loop (a sketch
    appears after this list).

(2) Insert a "check for sorted prefix".  Something like this:

	void insertion_sort(elt *a, int n) {
	    int i, j;

	    /* find sorted prefix */
	    for (i = 1; i < n && !(a[i] < a[i-1]); i++) {}
	    /* now resume normal insertion sort */
	    for (; i < n; i++) {
		elt const t = a[i];
		for (j = i; j > 0 && t < a[j-1]; j--)
		    a[j] = a[j-1];
		a[j] = t;
	    }
	}

    If the array is already sorted, the first loop will do N index
    comparisons and N-1 element comparisons.

(3) We can actually write insertion sort as
	<<initialise i>>
	loop
	    <<search forward for out of place element>>
	    exit when <<there isn't any>>
	    <<fused find J and rotate>>
	end loop;
    so that the cost is O(N + X.N) where X is the number of out-of-order
    elements encountered.
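
Variant (1) above, written out (a sketch; "elt" is the same element type the
code above already assumes):

	/* One selection pass puts the left-most smallest element at a[0];
	   it then serves as a sentinel, so the insertion loop below needs
	   no "j > 0" index test.                                           */
	void insertion_sort_sentinel(elt *a, int n)
	{
	    int i, j, min;
	    elt t;

	    if (n < 2)
	        return;

	    min = 0;                          /* selection pass */
	    for (i = 1; i < n; i++)
	        if (a[i] < a[min])
	            min = i;
	    t = a[min];
	    for (j = min; j > 0; j--)         /* rotate a[0..min] right once */
	        a[j] = a[j-1];
	    a[0] = t;

	    for (i = 1; i < n; i++) {         /* insertion sort proper */
	        t = a[i];
	        for (j = i; t < a[j-1]; j--)  /* a[0] stops the scan */
	            a[j] = a[j-1];
	        a[j] = t;
	    }
	}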

This actually bears on another thread in this group:  you do NOT need to
think in assembly code terms to think of improvements such as this.

-- 
Australian citizen since 14 August 1996.  *Now* I can vote the xxxs out!
Richard A. O'Keefe; http://www.cs.rmit.edu.au/~ok; RMIT Comp.Sci.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-24  0:00                       ` Robert Dewar
@ 1996-08-27  0:00                         ` Richard A. O'Keefe
  1996-09-02  0:00                           ` Lawrence Kirby
  0 siblings, 1 reply; 688+ messages in thread
From: Richard A. O'Keefe @ 1996-08-27  0:00 UTC (permalink / raw)



dewar@cs.nyu.edu (Robert Dewar) writes:

>Richard said
>"There is another factor:  heap sort requires moving one's attention from
>card [i] to card [i/2]; this is a very expensive operation for people".

>Oh gosh no!!!

>Arrange the cards in a heap layed (sic) out as a binary tree, nothing else
>makes sense if you are using heap sort on cards. Remember that the
>[i] to [i/2] business is just a trick for mapping the underlying binary
>tree!

A very fair point, which I must concede.
It does, however, support my contention, which is that the costs for people
may differ from the costs for computers, and different spatial layouts of
the cards may make a very large difference to human costs, thus making manual card
sorting a rather poor way to gain intuition about *computer* costs.
Animations such as those on the Cormen Leiserson & Rivest CD may perhaps
be a better way to go.

-- 
Australian citizen since 14 August 1996.  *Now* I can vote the xxxs out!
Richard A. O'Keefe; http://www.cs.rmit.edu.au/~ok; RMIT Comp.Sci.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-20  0:00                             ` Szu-Wen Huang
@ 1996-08-27  0:00                               ` Richard A. O'Keefe
  0 siblings, 0 replies; 688+ messages in thread
From: Richard A. O'Keefe @ 1996-08-27  0:00 UTC (permalink / raw)



Here's the reference I promised.

	The IMP Language and Compiler
	P. D. Stephens
	EMAS report 6
	Department of Computer Science, University of Edinburgh

My copy says "Reprinted 1977"; it was also apparently published in
The Computer Journal.

IMP is basically a sort of Algol with dynamic arrays, records, pointers,
bitwise operations, strings, and nested procedures.  (Pointers to
procedures were at one time provided, but dropped because they felt that
the scope rules -- similar to Ada 95's -- "emasculated" the feature.)
It's rather like C, but somewhat higher level.

Here's the quotation:

	The EMAS programmers who had previous experience of high-level
	languages adapted easily to system programming in IMP.  They
	produced compact, highly structured programs, which were easy
	to maintain or amend despite defects in commentary and/or
	documentation.  They seldom worried about the efficiency of
	object code produced by the compiler, but their programs
	generally performed well.  This group included the most productive
	programmers working on the [EMAS operating system] project.
	
	Programmers with a background of assembly language were less
	happy with IMP and seldom used its more advanced features such
	as recursion.  They produced well commented and documented
	programs that nevertheless proved difficult to maintain since they
	lacked structure.  This group worried about the efficiency of
	object code produced by the compiler to the extent of examining
	the listings of code produced, yet their programs were often
	large in size and slow in execution.  Some of the least productive
	programmers were included in this group.

Earlier in the report, they say that

	Two routines in EMAS and twenty in the compiler have been hand-coded
	[in assembler].  The gains in performance or reductions in size
	have varied from an encouraging 2% to a rather discouraging 40%.
	The majority have fallen into the 10% to 20% range.  One routine
	in EMAS -- the interrupt analysis routine -- was originally written
	in assembly code.  This routine has recently been rewritten in IMP
	and this time the IMP version is smaller by 11% and presumably
	faster by a like amount.
...
	The effect of [improvements to the code generator, most importantly
	the register allocator] is that the Release 8 compiler produces
	about half the amount of object code that Release 1 produced for
	the same program.  Release 8 object code is rather more than twice
	as fast as Release 1 object code.  Since the compiler is written
	in IMP, compiling speeds have increased similarly.  These figures
	enable the 4 months spent hand coding analysis routines to be seen
	as the waste of effort it undoubtedly was.

The bottom line is that thinking in assembly language terms, whether the
programmers actually _wrote_ in assembly language or an Algol-like language,
in the end led to _less_ efficient code.  Draw your own conclusions about
implications for teaching.

-- 
Australian citizen since 14 August 1996.  *Now* I can vote the xxxs out!
Richard A. O'Keefe; http://www.cs.rmit.edu.au/~ok; RMIT Comp.Sci.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-08-27  0:00             ` Jeffrey C. Dege
                                 ` (2 preceding siblings ...)
  1996-08-27  0:00               ` Craig Franck
@ 1996-08-27  0:00               ` Ted Dennison
  1996-08-28  0:00               ` Robert Dewar
  1996-09-01  0:00               ` Patrick Horgan
  5 siblings, 0 replies; 688+ messages in thread
From: Ted Dennison @ 1996-08-27  0:00 UTC (permalink / raw)



Jeffrey C. Dege wrote:
> 
> On 13 Aug 1996 10:44:56 -0700, Darin Johnson <djohnson@tartarus.ucsd.edu> wrote:
> >Too many people fall asleep in algorithms class
> >(then bitch about the waste of time later).
> 
> It's odd how little things can bring back memories.
> 
> Sitting there at nine o'clock at night, because I couldn't fit the
> day class into my schedule, listening to Sartaj Sahni drone:
> 
>     And in step 27, we set temp.prev.next to temp.next.
> 

> Has _anyone_ had an instructer who brought any excitement to this stuff,
> or is it inherently impossible to teach without becoming dull and tedious?
> 

My graduate Algorithms instructor at UCF was pretty interesting. He even
made sure to point out the relationship between the FindIth algorithm, 
tennis matches, and Alice in Wonderland.

Of course, I've had some of my fellow students violently disagree with me
on this. But I think they were reacting to the course's heavy math content.


-- 
T.E.D.          
                |  Work - mailto:dennison@escmail.orl.mmc.com  |
                |  Home - mailto:dennison@iag.net              |
                |  URL  - http://www.iag.net/~dennison         |




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-08-27  0:00               ` Craig Franck
@ 1996-08-27  0:00                 ` Ted Dennison
  1996-08-27  0:00                   ` John Hobson
  0 siblings, 1 reply; 688+ messages in thread
From: Ted Dennison @ 1996-08-27  0:00 UTC (permalink / raw)



Craig Franck wrote:
> 
> 
> I had a professor who used to say "I am here to teach, not entertain!".
> Why the two had to become separated is a mystery to me...
> 

(S)He's a moron. The best teachers I ever had were the MOST entertaining. 

One was even a former stand-up comedian, and it sure showed. Imagine a
prob/stat class being so interesting that people who aren't even taking
the course come to class!

-- 
T.E.D.          
                |  Work - mailto:dennison@escmail.orl.mmc.com  |
                |  Home - mailto:dennison@iag.net              |
                |  URL  - http://www.iag.net/~dennison         |




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-08-27  0:00             ` Jeffrey C. Dege
  1996-08-27  0:00               ` Bob Cousins
@ 1996-08-27  0:00               ` Steve Heller
  1996-08-27  0:00               ` Craig Franck
                                 ` (3 subsequent siblings)
  5 siblings, 0 replies; 688+ messages in thread
From: Steve Heller @ 1996-08-27  0:00 UTC (permalink / raw)



jdege@jdege.visi.com (Jeffrey C. Dege) wrote:

>Has _anyone_ had an instructor who brought any excitement to this stuff,
>or is it inherently impossible to teach without becoming dull and tedious?
  I don't believe it's inherently impossible, and I suspect my
students would agree with me. In fact, I have had some of them doubled
over with laughter in class (and not at my inability to express
myself, either). For a sample of my writing and teaching style, as
well as information on my books, you might want to visit my web site.
I'd appreciate any feedback you might have to give me on it.


Steve Heller, author and software engineer
http://ourworld.compuserve.com/homepages/steve_heller 





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-27  0:00                           ` Alan Peake
@ 1996-08-27  0:00                             ` Steve Heller
  1996-08-28  0:00                             ` Robert Dewar
  1996-08-28  0:00                             ` Tom Watson
  2 siblings, 0 replies; 688+ messages in thread
From: Steve Heller @ 1996-08-27  0:00 UTC (permalink / raw)



peake@dstos3.dsto.gov.au (Alan Peake) wrote:

> I started on MSVC++ a few months ago thinking that this would be the next 
>stage up but I'm not so sure now. The learning curve has been much steeper 
>than I imagined (maybe due to advancing age too !) The handbooks form a pile 
>about a foot high! 
  If you're trying to learn Windows programming at the same time as
C++, I advise against it. A much better plan is to learn C++ well
before getting involved with Windows. To learn C++, you might want to
take a look at my book, "Who's Afraid of C++?" (ISBN 0-12-339097-4),
which comes with a 32-bit C++ compiler for DOS. It's designed for
self-study by people with any (or no) level of prior programming
experience.
  I'd be interested to hear what you think of the book if you have a 
chance to look at it.

Steve Heller, author and software engineer
http://ourworld.compuserve.com/homepages/steve_heller 





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: (topic change on) Teaching sorts
  1996-08-22  0:00                                                       ` (topic change on) Teaching sorts Marcus H. Mendenhall
@ 1996-08-27  0:00                                                         ` Ralph Silverman
  0 siblings, 0 replies; 688+ messages in thread
From: Ralph Silverman @ 1996-08-27  0:00 UTC (permalink / raw)



Marcus H. Mendenhall (mendenmh@nashville.net) wrote:
: Christian Bau wrote:
: -> On a real computer (PowerMac, no virtual memory, no background 
: processes,
: -> nothing that would interfere with execution time), the _number of
: -> instructions per second_ did reproducably vary by a factor up to 
: _seven_
: -> when going from n to n+1 (for example, case n = 128 took seven times
: -> longer than cases n = 127 and n = 129). So for this computer, and 
: this

: Isn't caching fun?  I have observed many bizarre effects on the 
: PowerMacs when one is doing work which involves thrashing memory (FFT's, 
: matrix multiplies, etc.).

: In effect, one can usually assume that the total number of cpu cycles 
: actually used for floating point arithmetic in these cases is 0.  
: Counting real memory hits due to cache reloads gives a much more 
: accurate measure of time.

: In the case of testing your matrix multiply, you could use a trick I did 
: to investigate timing for FFT's: I took out all pointer increments from 
: the loop, so that the algorithm proceeded as usual, but carried out all 
: its operations on the same few bytes of memory.  It yields nonsense for 
: the result, but gives an idea of how many cpu cycles are spent on 
: everything except fetching. It is sometimes quite shocking (> factor of 
: 10) the speed increase.
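
To make the trick concrete, here is a rough sketch (hypothetical, not from the post): the second loop does the same arithmetic but never advances through memory, so the gap between the two timings approximates the cost of the fetches alone.

    #include <stdio.h>
    #include <time.h>

    #define N (1 << 20)
    static double data[N];

    int main(void)
    {
        double sum = 0.0;
        long i;
        clock_t t0, t1, t2;

        t0 = clock();
        for (i = 0; i < N; i++)      /* normal loop: walks through memory */
            sum += data[i];
        t1 = clock();
        for (i = 0; i < N; i++)      /* same operations on the same bytes */
            sum += data[0];          /* -- arithmetic cost only           */
        t2 = clock();

        printf("with fetches: %ld ticks, without: %ld ticks (sum=%g)\n",
               (long)(t1 - t0), (long)(t2 - t1), sum);
        return 0;
    }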

: In your case, with the problem at 128 elements, I suspect this was 
: because of the way the PowerPC chips (some of them at least) choose 
: which cache line to fill with new data, and the 1024 byte offset between 
: successive data points probably meant that each fetch required a 
: complete cache line reload.

: Marcus Mendenhall

--
*****************begin r.s. response*******************

	yes...the beauty of old,  simple
	computers for this!!!!!

	think about an old 286 running old
	dos (c.3.31) such as my compaq 2551!

	a) much less in the way of these problems,
	you can be sure!!!!!

	b) so cheap it seems like a joke!!!!!

	c) also...shareware development systems
	(or freeware) you can get and try (or use) for
	nothing...
		chasm 
		a86
		fmodula2
		desmet pcc
		micro c (dave dunfield)
		small c
		(hi-tech) pacific ppd
		c--

		(interpreted languages too!)
		ubasic
		xlisp
		pc-lisp
		icon (may not have tried this myself)
		snobol (may not have tried this myself)
		abc (used this small amount)

		etc.

*****************end r.s. response*********************
Ralph Silverman
z007400b@bcfreenet.seflin.lib.fl.us





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-08-27  0:00             ` Jeffrey C. Dege
@ 1996-08-27  0:00               ` Bob Cousins
  1996-08-27  0:00               ` Steve Heller
                                 ` (4 subsequent siblings)
  5 siblings, 0 replies; 688+ messages in thread
From: Bob Cousins @ 1996-08-27  0:00 UTC (permalink / raw)



jdege@jdege.visi.com (Jeffrey C. Dege) wrote:

>Has _anyone_ had an instructor who brought any excitement to this stuff,
>or is it inherently impossible to teach without becoming dull and tedious?

If you want entertainment go and watch a movie.


-- 
Bob Cousins, Software Engineer.
http://www.demon.co.uk/sirius-cybernetics/





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-08-27  0:00                 ` Ted Dennison
@ 1996-08-27  0:00                   ` John Hobson
  0 siblings, 0 replies; 688+ messages in thread
From: John Hobson @ 1996-08-27  0:00 UTC (permalink / raw)



Ted Dennison wrote:
> 
> Craig Franck wrote:
> > I had a professor who used to say "I am here to teach, not entertain!".
> > Why the two had to become separated is a mystery to me...
> 
> (S)He's a moron. The best teachers I ever had were the MOST entertaining.
> 
> One was even a former stand-up comedian, and it sure showed. Imagine a
> prob/stat class being so interesting that people who aren't even taking
> the course come to class!

As an undergraduate, I had a professor who was a Southern Baptist
minister.
His lectures were spellbinding.

I also once took a physics class from a Nobel Prize winner.  They missed
a sure bet by not taping his lectures and selling them as cures for
insomnia.

--
John Hobson             |Whenever someone says to me,
Unix Support Group      |"Have a nice day", I reply,
ComEd, Chicago, IL, USA |"Sorry, I've made other plans."
jhobson@ceco.ceco.com   |	-- Sir Peter Ustinov




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-07  0:00                                   ` Tim Behrendsen
                                                       ` (6 preceding siblings ...)
  1996-08-15  0:00                                     ` Blair Phillips
@ 1996-08-27  0:00                                     ` Tanmoy Bhattacharya
  1996-08-29  0:00                                     ` Robert I. Eachus
  1996-08-30  0:00                                     ` Tanmoy Bhattacharya
  9 siblings, 0 replies; 688+ messages in thread
From: Tanmoy Bhattacharya @ 1996-08-27  0:00 UTC (permalink / raw)



In article <slrn523tpg.ce.mdw@excessus.demon.co.uk>
mdw@excessus.demon.co.uk (Mark Wooding) writes:
<snip>
MW: locally allocated buffers.  If the buffer is not local to the function,
MW: there's no guarantee that (in an extreme case) its address hasn't been
MW: made available to a signal handler which maliciously changes the
MW: string's length.

In C, a signal handler is not allowed to change anything whose type
isn't volatile sig_atomic_t. So, technically, a C compiler can ignore this
situation.

The problem with strlen actually is that any data type can be treated
as an array of character type. So, if you do 

if(strlen(a) > 0) { changed = 1; a[strlen(a)-1] = '\0'; }

the compiler has to make sure that a != (char*)&changed before it can
decide to call strlen only once. 
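
Spelled out as a function (the surrounding declarations are hypothetical, not from the post), the aliasing worry and the usual way to sidestep it look like this:

    #include <string.h>

    int changed;

    void chop(char *a)
    {
        /* If a happened to equal (char *)&changed, the store to
           'changed' below would alter the bytes strlen() examines,
           so the compiler must prove a != (char *)&changed before
           it can reuse the first length for the second call. */
        if (strlen(a) > 0) {
            changed = 1;
            a[strlen(a) - 1] = '\0';
        }
    }

    /* Saving the length in a local removes the question entirely: */
    void chop_once(char *a)
    {
        size_t n = strlen(a);
        if (n > 0) {
            changed = 1;
            a[n - 1] = '\0';
        }
    }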

Cheers
Tanmoy
--
tanmoy@qcd.lanl.gov(128.165.23.46) DECNET: BETA::"tanmoy@lanl.gov"(1.218=1242)
Tanmoy Bhattacharya O:T-8(MS B285)LANL,NM87545 H:#9,3000,Trinity Drive,NM87544
Others see <gopher://yaleinfo.yale.edu:7700/00/Internet-People/internet-mail>,
<http://alpha.acast.nova.edu/cgi-bin/inmgq.pl>or<ftp://csd4.csd.uwm.edu/pub/
internetwork-mail-guide>. -- <http://nqcd.lanl.gov/people/tanmoy/tanmoy.html>
fax: 1 (505) 665 3003   voice: 1 (505) 665 4733    [ Home: 1 (505) 662 5596 ]




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-27  0:00                           ` Alan Peake
  1996-08-27  0:00                             ` Steve Heller
@ 1996-08-28  0:00                             ` Robert Dewar
  1996-08-28  0:00                             ` Tom Watson
  2 siblings, 0 replies; 688+ messages in thread
From: Robert Dewar @ 1996-08-28  0:00 UTC (permalink / raw)



Alan says

"As far as learning the basics of algorithms, what's wrong with the good old
flow chart?  I still use them for complicated routines."

Well anyone is free to use anything they find helpful, but the danger
with flow charts is that they are not at a pleasant semantic level, and
one would like to teach people to think more abstractly. I find flow
charts totally useless, *especially* when things are complex. 





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-08-27  0:00             ` Jeffrey C. Dege
                                 ` (3 preceding siblings ...)
  1996-08-27  0:00               ` Ted Dennison
@ 1996-08-28  0:00               ` Robert Dewar
  1996-09-01  0:00               ` Patrick Horgan
  5 siblings, 0 replies; 688+ messages in thread
From: Robert Dewar @ 1996-08-28  0:00 UTC (permalink / raw)



Jeffrey asks

"Has _anyone_ had an instructer who brought any excitement to this stuff,
or is it inherently impossible to teach without becoming dull and tedious?
"

Well I have had plenty of students who did not find this dull and tedious,
but of course that may be because of them rather than me :-)





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-28  0:00                             ` Tom Watson
@ 1996-08-28  0:00                               ` Robert Dewar
  1996-08-30  0:00                               ` Alan Peake
  1 sibling, 0 replies; 688+ messages in thread
From: Robert Dewar @ 1996-08-28  0:00 UTC (permalink / raw)



Tom Watson said

"This reminds me of a ad wanting a person with 5+ years Java experience."

Maybe this is someone who believes that really Ada is close enough to Java
anyway, so five years experience with Ada would do :-)





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-27  0:00                           ` Alan Peake
  1996-08-27  0:00                             ` Steve Heller
  1996-08-28  0:00                             ` Robert Dewar
@ 1996-08-28  0:00                             ` Tom Watson
  1996-08-28  0:00                               ` Robert Dewar
  1996-08-30  0:00                               ` Alan Peake
  2 siblings, 2 replies; 688+ messages in thread
From: Tom Watson @ 1996-08-28  0:00 UTC (permalink / raw)



In article <peake.211.0029C4C8@dstos3.dsto.gov.au>,
peake@dstos3.dsto.gov.au (Alan Peake) wrote:

> 
> Not true for me either. My first language was Fortran 77 but I haven't used it
> in 20 years. 

It seems to me that you have a serious bug in your math here.  Fortran 77
didn't exist until 1978 (publication dates and the like).  Given that the
language didn't exist 20 years ago (this being 1996, and 20 years before
would make it 1976) you seem to be "fluffing" a resume, or something.

This reminds me of an ad wanting a person with 5+ years Java experience.

Get a clue!!

-- 
Tom Watson
tsw@3do.com         (Home: tsw@johana.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-29  0:00                                                         ` Richard A. O'Keefe
@ 1996-08-29  0:00                                                           ` Craig Franck
  1996-08-30  0:00                                                           ` system
                                                                             ` (2 subsequent siblings)
  3 siblings, 0 replies; 688+ messages in thread
From: Craig Franck @ 1996-08-29  0:00 UTC (permalink / raw)



ok@goanna.cs.rmit.edu.au (Richard A. O'Keefe) wrote:
>"Tim Behrendsen" <tim@airshields.com> writes:
>
>>Uh, the people who coded in assembly -- AND CHECKED THE COMPILER
>>OUTPUT -- produced programs that were bigger and slower?
>
>Yes, the report says that the people who were thinking in assembly
>terms (but writing IMP) and checked the compiler output DID produce
>bigger slower programs than the people who were thinking in high
>level (well, PL/I-ish) terms.
>
>>I think that there were other factors involved other than just
>>"assembly vs HLL" style learning.
>
>This was a project at one of the top computer departments in one of the
>top Universities in Britain, and they were desperately keen to get
>working software.  In fact, when UNIX V7 first started to become popular,
>IMP had been ported to more machines than C (including 16bit, 32-bit,
>36-bit, and 48-bit ones) and EMAS had been ported to more machines than
>UNIX.  When the University of Kent ditched ICL's operating system for
>the ICL 2900 and adopted EMAS, they wanted and got an *increase* in
>performance and reliability.
>
>When I met EMAS, in 1980, it was running on ICL 2900s and serving a
>large undergraduate population.  At the time I sneered at it because
>it wasn't UNIX, but dynamic loading, memory mapped files, archiving,
>and a bunch of other things were there and darned solid.
>
>In short, I think you will find that the people the report was talking
>about were good experienced 70s-style programmers.
>
>>Again, I have to go back to the fact that *we have the world
>>that you want*.  And it doesn't work.
>
>You don't even BEGIN to know what world I want.
>I can tell you for certain sure that we don't even begin to approach
>hailing distance of the shadow of the world I want.
>To start with, I would like students at entry
>
> - to have a really thorough grasp of their native language
>   (I could count the number of 1st year students here who can tell me
>    the difference between a count noun and a mass noun on the fingers
>    of one ear)
>
> - to have an adequate grasp of English if that is not their native language
>
> - to take *pleasure* in reading
>
> - to be able to use the index in a book
>
> - to be able to write a short essay on a topic that interests them
>
> - to have a reasonable grasp of the elements of algebra and calculus
>   (I learned calculus from "Teach Yourself Calculus"; a *great* little book)
>   [I *know* what the high-school curriculum has in this state; I'd take
>   that and say thank you but that the students don't actually -know- it]
>
> - to have a reasonable grasp of the really elementary points about statistics;
>   I don't care if they know what a standard deviation is, but I _do_ wish
>   they understood that in the presence of variation one measurement tells
>   you very little
>   [again, the Victorian high school curriculum has everything I want and
>   more; it is or was a good curriculum]
>
> - to have some grasp of reasoning; ideally the notion of formal proof,
>   but at the very least the idea that a plausible argument might be wrong
>   [there is some very good material produced for schools these days.]
>
> - to be able to play at least one musical instrument (including the human
>   voice and drums as musical instruments) or knit or crochet or weave;
>   what I have in mind here is enjoyable "humane" or "arts" activities
>   that concern quasi-periodic patterns with a notation that you learn
>   to read, so that you learn to take pleasure in reading things that
>   are not text, but are still in some sense "stories".
>
>I'm sorry, but it is FARCICAL to argue about whether students should learn
>assembly code early on or not when I get students who cannot divide
>1000 by 10 without a calculator (this really happened) and cannot spell
>their own name (this happened too).  And we don't get the worst university
>entrants here either, far from it.
>
>In the world I want, it would be impossible to stop students learning
>about assembly code anyway; they would be willing and able to pick it
>up from a book.  In the world I want, it would not be dangerous to
>teach students about assembly early on, because they would have already
>caught the ideas of patterns and transformations and reasoning.  I wrote
>my first (IBM/360) assembly code program while still at high school; but
>I had learned matrix algebra before that, so it did me no harm.
>
>Prioritising the thousand things we need to teach, to students
> - who find reading and writing English difficult and unpleasant
> - who loathe and dread mathematics and anything that looks like it
> - who are much more interested in producing flashy GUIs than in having
>   a working algorithm for the U to I with
> - who cannot listen and take notes at the same time (preprinted lecture
>   notes are now being demanded as a right)
> - ah, you get the idea
>is not easy.  
>
>Tim Behrendsen, if you think assembler is more urgent than remedial
>English and elementary mathematics, we shall just have to disagree.
>

Well, Tim will have to answer that last question, but I will say
that being a well-educated, moral, civic-minded person is far more 
important than having a good grasp of any programming language. It
is just that a discussion of assembler vs. HLL is far more topical 
for a comp. group, all other concerns notwithstanding. :-)

-- 
Craig  
clfranck@worldnet.att.net 
Manchester, NH
"You see all around you people engaged in making others live
lives which are not their own, while they themselves care
nothing for their own real lives -- men who hate life though
they fear death". -- William Morris, "News from Nowhere" (1891)






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-26  0:00                                                       ` madscientist
@ 1996-08-29  0:00                                                         ` Richard A. O'Keefe
  0 siblings, 0 replies; 688+ messages in thread
From: Richard A. O'Keefe @ 1996-08-29  0:00 UTC (permalink / raw)



SYS0.MICRO-NEIL.COM@   (madscientist) writes:
>I Just wanted to point out that this rediculous conversation has taken
>reality right out the window!

>if(strlen(s)==0) 

>is *NOT* functionally equivelent to 

>if(s=="")

No, the conversation has *not* taken reality right out the window.
I said that 'if (strlen(s) == 0)' is better than 'if (s == "")'
precisely BECAUSE both of them are syntactically legal C, but
the first one is right and the second one is wrong.
THAT WAS THE FLIPPING *POINT*.

>If a null string
>were allocated with strdup("") then it's address would be somewhere else
>entirely and the expression would fail.

(a) You have nicely illustrated my point about English being more urgent
    than assembler.
(b) There is no strdup("") in ANSI C.  (It's in the System V ABI.)
(c) ANSI C has null characters, null pointers, null preprocessing
    directives, null statements, and "strings with zero length".
   'Null string' is an exceptionally confusing phrase to use for
    empty strings, because it suggests null pointers, which are not
    empty strings.  strdup(""), if strdup() exists, may return a
    null (string) pointer, or a pointer to an empty string.

-- 
Australian citizen since 14 August 1996.  *Now* I can vote the xxxs out!
Richard A. O'Keefe; http://www.cs.rmit.edu.au/%7Eok; RMIT Comp.Sci.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-26  0:00                         ` Patrick Horgan
  1996-08-27  0:00                           ` Alan Peake
@ 1996-08-29  0:00                           ` Darin Johnson
  1 sibling, 0 replies; 688+ messages in thread
From: Darin Johnson @ 1996-08-29  0:00 UTC (permalink / raw)



> This reminds me of an ad wanting a person with 5+ years Java experience.

This seems to be standard sort of stuff for job ads.  2 years ago they
all wanted 5+ years in Windows NT.  My theory is that the person
wanting a job says "give me someone who knows Java", and the HR person
who fills out the job ad translates that to
    5+ years Java
    5+ years HTML
    Windows programming experience required
    OLE and ODBC programming experience recommended

Of course, on the other hand by saying "I haven't used Fortran 77 in
twenty years" might actually mean it hasn't been used.  Or that using
it physically aged the person enough so that the statement is true in
perspective.
-- 
Darin Johnson
djohnson@ucsd.edu	O-
    Gravity is a harsh mistress - The Tick




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-26  0:00                                                       ` Tim Behrendsen
@ 1996-08-29  0:00                                                         ` Richard A. O'Keefe
  1996-08-29  0:00                                                           ` Craig Franck
                                                                             ` (3 more replies)
  0 siblings, 4 replies; 688+ messages in thread
From: Richard A. O'Keefe @ 1996-08-29  0:00 UTC (permalink / raw)



"Tim Behrendsen" <tim@airshields.com> writes:

>Uh, the people who coded in assembly -- AND CHECKED THE COMPILER
>OUTPUT -- produced programs that were bigger and slower?

Yes, the report says that the people who were thinking in assembly
terms (but writing IMP) and checked the compiler output DID produce
bigger slower programs than the people who were thinking in high
level (well, PL/I-ish) terms.

>I think that there were other factors involved other than just
>"assembly vs HLL" style learning.

This was a project at one of the top computer departments in one of the
top Universities in Britain, and they were desperately keen to get
working software.  In fact, when UNIX V7 first started to become popular,
IMP had been ported to more machines than C (including 16bit, 32-bit,
36-bit, and 48-bit ones) and EMAS had been ported to more machines than
UNIX.  When the University of Kent ditched ICL's operating system for
the ICL 2900 and adopted EMAS, they wanted and got an *increase* in
performance and reliability.

When I met EMAS, in 1980, it was running on ICL 2900s and serving a
large undergraduate population.  At the time I sneered at it because
it wasn't UNIX, but dynamic loading, memory mapped files, archiving,
and a bunch of other things were there and darned solid.

In short, I think you will find that the people the report was talking
about were good experienced 70s-style programmers.

>Again, I have to go back to the fact that *we have the world
>that you want*.  And it doesn't work.

You don't even BEGIN to know what world I want.
I can tell you for certain sure that we don't even begin to approach
hailing distance of the shadow of the world I want.
To start with, I would like students at entry

 - to have a really thorough grasp of their native language
   (I could count the number of 1st year students here who can tell me
    the difference between a count noun and a mass noun on the fingers
    of one ear)

 - to have an adequate grasp of English if that is not their native language

 - to take *pleasure* in reading

 - to be able to use the index in a book

 - to be able to write a short essay on a topic that interests them

 - to have a reasonable grasp of the elements of algebra and calculus
   (I learned calculus from "Teach Yourself Calculus"; a *great* little book)
   [I *know* what the high-school curriculum has in this state; I'd take
   that and say thank you but that the students don't actually -know- it]

 - to have a reasonable grasp of the really elementary points about statistics;
   I don't care if they know what a standard deviation is, but I _do_ wish
   they understood that in the presence of variation one measurement tells
   you very little
   [again, the Victorian high school curriculum has everything I want and
   more; it is or was a good curriculum]

 - to have some grasp of reasoning; ideally the notion of formal proof,
   but at the very least the idea that a plausible argument might be wrong
   [there is some very good material produced for schools these days.]

 - to be able to play at least one musical instrument (including the human
   voice and drums as musical instruments) or knit or crochet or weave;
   what I have in mind here is enjoyable "humane" or "arts" activities
   that concern quasi-periodic patterns with a notation that you learn
   to read, so that you learn to take pleasure in reading things that
   are not text, but are still in some sense "stories".

I'm sorry, but it is FARCICAL to argue about whether students should learn
assembly code early on or not when I get students who cannot divide
1000 by 10 without a calculator (this really happened) and cannot spell
their own name (this happened too).  And we don't get the worst university
entrants here either, far from it.

In the world I want, it would be impossible to stop students learning
about assembly code anyway; they would be willing and able to pick it
up from a book.  In the world I want, it would not be dangerous to
teach students about assembly early on, because they would have already
caught the ideas of patterns and transformations and reasoning.  I wrote
my first (IBM/360) assembly code program while still at high school; but
I had learned matrix algebra before that, so it did me no harm.

Prioritising the thousand things we need to teach, to students
 - who find reading and writing English difficult and unpleasant
 - who loathe and dread mathematics and anything that looks like it
 - who are much more interested in producing flashy GUIs than in having
   a working algorithm for the U to I with
 - who cannot listen and take notes at the same time (preprinted lecture
   notes are now being demanded as a right)
 - ah, you get the idea
is not easy.  

Tim Behrendsen, if you think assembler is more urgent than remedial
English and elementary mathematics, we shall just have to disagree.

-- 
Australian citizen since 14 August 1996.  *Now* I can vote the xxxs out!
Richard A. O'Keefe; http://www.cs.rmit.edu.au/%7Eok; RMIT Comp.Sci.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-07  0:00                                   ` Tim Behrendsen
                                                       ` (7 preceding siblings ...)
  1996-08-27  0:00                                     ` What's the best language to start with? [was: Re: Should I learn C or Pascal?] Tanmoy Bhattacharya
@ 1996-08-29  0:00                                     ` Robert I. Eachus
  1996-08-30  0:00                                       ` Steve Heller
  1996-08-30  0:00                                     ` Tanmoy Bhattacharya
  9 siblings, 1 reply; 688+ messages in thread
From: Robert I. Eachus @ 1996-08-29  0:00 UTC (permalink / raw)




    (I don't usually post this wide and not restrict followups, but
this seems to be a very general discussion of interest to all sorts of
software professionals.)

In article <503bq0$js@goanna.cs.rmit.edu.au> ok@goanna.cs.rmit.edu.au (Richard A. O'Keefe) writes:

  > You don't even BEGIN to know what world I want.
  > I can tell you for certain sure that we don't even begin to approach
  > hailing distance of the shadow of the world I want.
  > To start with, I would like students at entry...

  > I'm sorry, but it is FARCICAL to argue about whether students
  > should learn assembly code early on or not when I get students who
  > cannot divide 1000 by 10 without a calculator (this really
  > happened) and cannot spell their own name (this happened too).
  > And we don't get the worst university entrants here either, far
  > from it.

   We have all seen it happening and the slide is getting faster and
faster.  The world is rapidly dividing into those who try to
understand math, science, and technology, and those who have given up,
often at an early age.

   We joke about "McJobs" and read Dilbert, but the best reality we
can hope for is one where those who are willing to confront technology
on the terms it requires are allowed and encouraged to do so.

   Back to Richard for a second:

   > - to have a really thorough grasp of their native language
   >   (I could count the number of 1st year students here who can tell me
   >    the difference between a count noun and a mass noun on the fingers
   >    of one ear)

   > - to have an adequate grasp of English if that is not their
       native language 

   > - to take *pleasure* in reading

   I always used to complain about Snow's "Two Cultures" that he got
it fundamentally wrong.  The two classes are those who understand
logic and enjoy literature, and those who don't.  Science is a side
issue.  If you can't use your "mother tongue" as a tool, you can't
succeed at many things, including science, mathematics, history,
medicine, literature, the performing arts, and politics.

   When I was in high school (in the early 60's), the National Science
Foundation had a number of summer programs for bright students.  I
went to one for Mathematics.  The real treat was that most of the
participants, from all sorts of backgrounds, could argue.  It didn't
matter what we were discussing, students framed their arguments
cogently, avoided ad hominem attacks, and graciously accepted facts
when introduced.

   My brother later took a college course which used one of the
textbooks from that summer.  (Abstract Algebra by Andre'.)  After one
semester, the college moved it from a lower-level (freshman) course
to an upper-level (Junior or Senior) one.  The reason?  The students needed
much more exposure to non-mathematical courses (read English and
Philosophy) as prerequisites.

   Can we fix the problem?  No, but we can do a lot to ameliorate it.
Human society is changing way too fast for human evolution (genetic and
social) to keep up.  We first have to reach the point where every
child and adult who has the intelligence and inclination to master the
"three R's" is given every bit of assistance and encouragement to do
so.  Then we can worry about the arrangement of courses when they
reach college, but we won't have to.  Once bright students have
mastered the three Rs, they will usually have dined on such an
eclectic smorgasbord of self selected material, that we can't pretend
to dictate an order of approach.  We have all run into (and been) such
students.  We desperately need to create more.

   If you can't think of anything else you can do to help, found, if
necessary, a debating club/team at your child's school, and help run
it.  Those kids may love computers, but they won't make it through
college and out into the workplace unless they learn to use language
as a tool. 

   I now take you back to your regularly scheduled newsgroup.
--

					Robert I. Eachus

with Standard_Disclaimer;
use  Standard_Disclaimer;
function Message (Text: in Clever_Ideas) return Better_Ideas is...




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Goto considered really harmful
  1996-08-21  0:00                     ` Tanmoy Bhattacharya
@ 1996-08-30  0:00                       ` Patrick Horgan
  1996-09-04  0:00                         ` Dennison
  0 siblings, 1 reply; 688+ messages in thread
From: Patrick Horgan @ 1996-08-30  0:00 UTC (permalink / raw)



Forwarded by: patrick@broadvision.com (Patrick Horgan)
Forwarded-by: spaf@cs.purdue.edu (Gene "Chief Yuckster" Spafford)
Forwarded-by: Patrick Tufts <zippy@cs.brandeis.edu>

From: rad@via.East.Sun.COM ( Bob Doolittle - Sun Parallel Open Systems)

Along these lines, when I was at UC Santa Cruz (late 70s), we still had
some crusty profs who insisted that punched cards built character. Since
it was an ivory tower, the dogma of "goto's considered harmful" was
religiously taught. So some grad students decided that negative
reinforcement was needed, and modified the PASCAL compiler's listing
generator, so that upon detecting a goto statement it would generate a
full 132 columns of '-' characters, followed by a carriage return w/o
linefeed, and print this line about 50 times, on the same line of the
output. This had the effect of chopping right through the lineprinter
output at the goto statement, cutting the paper in half, which jammed the
lineprinter, requiring operator intervention. This *really* pissed the
operators off, and generally resulted in a high-decibel stream of abuse
directed at the poor slob of an undergrad who submitted the job.

Pretty effective, all in all. You certainly could hear the change in
lineprinter melody when one of these listings was being generated, and
that was a good time to find some forgotten errands that needed running
elsewhere.
-- 

   Patrick J. Horgan    patrick@broadvision.com   Have horse will ride.
   Opinions mine, not my employer's except by most bizarre coincidence.






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-28  0:00                             ` Tom Watson
  1996-08-28  0:00                               ` Robert Dewar
@ 1996-08-30  0:00                               ` Alan Peake
  1996-08-31  0:00                                 ` Robert Dewar
  1996-09-07  0:00                                 ` .
  1 sibling, 2 replies; 688+ messages in thread
From: Alan Peake @ 1996-08-30  0:00 UTC (permalink / raw)




>> Not true for me either. My first language was Fortran 77 but I haven't
oops! Fortran IV - been so long, I forgot the version :)
Alan





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-29  0:00                                     ` Robert I. Eachus
@ 1996-08-30  0:00                                       ` Steve Heller
  0 siblings, 0 replies; 688+ messages in thread
From: Steve Heller @ 1996-08-30  0:00 UTC (permalink / raw)



eachus@spectre.mitre.org (Robert I. Eachus) wrote:

>   When I was in high school (in the early 60's), the National Science
>Foundation had a number of summer programs for bright students.  I
>went to one for Mathematics.  The real treat was that most of the
>participants, from all sorts of backgrounds, could argue.  It didn't
>matter what we were discussing, students framed their arguments
>cogently, avoided ad hominem attacks, and graciously accepted facts
>when introduced.
  I was also lucky enough to participate in that program, only mine
was in programming. I then went to Shimer College, which was very
similar except that it lasted 4 years (and they didn't have any
computers!) Most of the best friends I ever had were classmates of
mine at Shimer. By the way, if anyone is interested, I've written an
essay on that topic which is accessible via my home page.

>   If you can't think of anything else you can do to help, found, if
>necessary, a debating club/team at your child's school, and help run
>it.  Those kids may love computers, but they won't make it through
>college and out into the workplace unless they learn to use language
>as a tool. 
  Yes, I think I will do some of these things that you suggest. Thank
you for an important, well-written and obviously heartfelt post.


Steve Heller, author and software engineer
http://ourworld.compuserve.com/homepages/steve_heller 





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-26  0:00                                                       ` Mark Wooding
  1996-08-30  0:00                                                         ` Kaz Kylheku
@ 1996-08-30  0:00                                                         ` Richard A. O'Keefe
  1996-08-30  0:00                                                           ` Peter Seebach
                                                                             ` (2 more replies)
  1 sibling, 3 replies; 688+ messages in thread
From: Richard A. O'Keefe @ 1996-08-30  0:00 UTC (permalink / raw)



I wrote:
>> strlen() is a pure function and its argument does not change, so the
>> strlen(s) computation can be hoisted out of the loop.

mdw@excessus.demon.co.uk (Mark Wooding) writes:

>Erk!  No it isn't.  A pure function is one whose value depends only on
>its arguments.  sin() is pure.  strlen() isn't.

We are in dispute about words, not facts.
MOST annoyingly, my copy of the C standard has walked, so I can't refer
to the official rule book.

>The argument to strlen() is a pointer to a string whose length we want.

This is where I was using sloppy language.
The way I was using language, the argument *is* the "null-terminated
byte string" (to use C++ draft standard terminology), and the pointer
is merely the mechanism used to refer to it.

In the program fragment under discussion, it was not merely the pointer
that was constant, but the NTBS itself.

In what sense is strlen() pure?  In the sense that it depends only on
the NTBS variable which is the intended argument and not on the time
of day, the phase of the moon, the colour of the programmer's socks,
or any storage _other_ than the NTBS.

If I do
	n = strlen(s);
	/* much code that does not change s */
	/* NOT the NTBS that s refers to */
	m = strlen(s);
it is necessarily the case that m and n will be the same size_t value.

>It's the address, not the string itself.  Because the string can change
>between calls to strlen(), it might give different results given the
>same string address.  So the compiler can't just use its general `pure'
>function mechanism for common-subexpression-optimising strlen()

Yes it can:  in the sense in which I intended it (and I apologise for
not using the terminology in the New Testament aka ISO C standard)
the argument is NTBS(s), and detecting whether NTBS(s) has changed is
admittedly very difficult in the presence of aliasing, but it isn't
difficult *all* the time.

>Now, can a compiler do clever things and optimise strlen() all by
>itself?

The answer is unequivocally "Yes it CAN."  The standard allows it to,
and nothing in the code fragment I presented suggested that s
referred to a global buffer.  In fact, most of my uses of strlen()
have been either to local buffers (which I have just filled from a
file) or to storage which I can be damn sure hasn't been touched
since last time.  (And I then save the result of strlen() and never
ever ask for it again.  We're talking about what makes sense, not
what I do.)

>You comment that it might spot assignments to the string.  This
>is true, but not all such assignments are visible to the compiler.  For
>automatic buffers, this /is/ true, but (in my experience) calls to
>strlen() and similar functions are comparitively rarerely used on
>locally allocated buffers.

In my experience, people spell comparatively with only one "i"
and "rarely" with only two "r"s.  Clearly our experience differs.

The question was not about what you do or what I do or what your
experience was or what my experience was but about whether repeated
calls are ever sensible.

>If the buffer is not local to the function,
[which makes an assumption not grounded in my code fragment]
>there's no guarantee that (in an extreme case) its address hasn't been
>made available to a signal handler which maliciously changes the
>string's length.

Well actually, such a guarantee is easy to come by:
 - if the program contains no calls to signal(), it can't happen.
 - if the buffer is local to a file, and its address doesn't leak
   outside, and no function in the file is passed to signal(),
   again, it can't happen.
 - a program that conforms to the C standard (we were talking about
   standard C, no?) may not legally set any variable external to the
   handler if that variable is not of type sig_atomic_t.  Now on _this_
   system, sig_atomic_t is int, so a *legal* signal handler, however
   malicious, may not change a char buffer.  (This obviously won't
   apply to systems in which sig_atomic_t is char, but it _does_
   apply to this system, and the compiler is entitled to rely on it.)

Let's take an example which _doesn't_ involve function-local buffers.
Consider

	char *dupcat(char const *a, char const *b) {
	    char *result = malloc(strlen(a) + strlen(b) + 1);
	    assert(result != 0);
	    memcpy(result, a, strlen(a));
	    memcpy(result + strlen(a), b, strlen(b));
	    memcpy(result + strlen(a) + strlen(b), "", 1);
	    return result;
	}

I repeat, this is NOT the way I would normally write it.
I contend that it is _inefficient_ but not "stupid".
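
For contrast, a sketch of the hand-hoisted version one would "normally write" (the name dupcat2 and the exact shape are assumptions, not code from the post); the two express the same intent, the hoisted form merely refuses to rely on the optimiser:

	#include <assert.h>
	#include <stdlib.h>
	#include <string.h>

	char *dupcat2(char const *a, char const *b) {
	    size_t la = strlen(a), lb = strlen(b);
	    char *result = malloc(la + lb + 1);
	    assert(result != 0);
	    memcpy(result, a, la);
	    memcpy(result + la, b, lb + 1);   /* +1 copies b's trailing '\0' */
	    return result;
	}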

If a malicious (and not strictly conforming) signal handler smashes
NTBS(a) or NTBS(b), then you have worse problems than just possible
changes to strlen().  Manually saving the numbers won't protect you
from the fact that the null-terminated byte strings have changed,
so that the function result (as an NTBS) is not the concatenation
of its NTBS arguments.

My understanding of the standard is that a compiler is entitled to
assume that NTBS(a) and NTBS(b) will not change (because a legal
standard C program _can't_ undergo such changes) and therefore to
optimise the calls to strlen().

If anyone wants to argue that, take it to comp.std.c and ask the
experts.

The point I am concerned to defend is solely this:
the function dupcat() may be inefficient,
but it clearly and correctly expresses the programmer's intent,
and is therefore not "stupid".

Surely nobody but an assembly hacker would dispute the point
that there are times when it is RATIONAL to write suboptimal
code (in order to devote your resources to activities with a
higher payoff)?

-- 
Australian citizen since 14 August 1996.  *Now* I can vote the xxxs out!
Richard A. O'Keefe; http://www.cs.rmit.edu.au/%7Eok; RMIT Comp.Sci.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-30  0:00                                                         ` Richard A. O'Keefe
@ 1996-08-30  0:00                                                           ` Peter Seebach
  1996-09-03  0:00                                                             ` Lawrence Kirby
  1996-09-01  0:00                                                           ` Joe Keane
  1996-09-03  0:00                                                           ` Arkady Belousov
  2 siblings, 1 reply; 688+ messages in thread
From: Peter Seebach @ 1996-08-30  0:00 UTC (permalink / raw)



In article <50650h$rek@goanna.cs.rmit.edu.au>,
Richard A. O'Keefe <ok@goanna.cs.rmit.edu.au> wrote:
> - a program that conforms to the C standard (we were talking about
>   standard C, no?) may not legally set any variable external to the
>   handler if that variable is not of type sig_atomic_t.  Now on _this_
>   system, sig_atomic_t is int, so a *legal* signal handler, however
>   malicious, may not change a char buffer.  (This obviously won't
>   apply to systems in which sig_atomic_t is char, but it _does_
>   apply to this system, and the compiler is entitled to rely on it.)

>Let's take an example which _doesn't_ involve function-local buffers.
>Consider

>	char *dupcat(char const *a, char const *b) {
>	    char *result = malloc(strlen(a) + strlen(b) + 1);
>	    assert(result != 0);
>	    memcpy(result, a, strlen(a));
>	    memcpy(result + strlen(a), b, strlen(b));
>	    memcpy(result + strlen(a) + strlen(b), "", 1);
>	    return result;
>	}

>I repeat, this is NOT the way I would normally write it.
>I contend that it is _inefficient_ but not "stupid".

Maybe.

There's a possible weakness.  Assume sig_atomic_t is an int.  Assume we
have an array of sig_atomic_t somewhere.  Assume we decide to, as we
are permitted by the standard, treat it as an array of characters.  The
first sig_atomic_t in the array may be overwritten by a signal handler
we have installed.  We make sure the 2nd s_a_t in the array contains at least
one null byte, so strlen((char *) sat_array) will be zero, and pass
the array to dupcat as both a and b.  (Suitably cast to (char *).)

It is entirely possible that, even though a and b are the same pointer,
a signal handler will change the NTBS they point to between a
strlen(a) and a strlen(b) in the above, or between statements.

The problem is unique to (char *), which may point to things other than
arrays of characters, because you may treat any object as an array of
characters.
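
A sketch of the setup being described (the names are hypothetical, the array is declared volatile so the handler's store is legal, and whether a given compiler actually miscompiles it is a separate question):

    #include <signal.h>

    /* the handler may rewrite element 0; element 1 stays zero so a
       terminating null byte always exists within the array           */
    static volatile sig_atomic_t sat_array[2] = { 0, 0 };

    static void handler(int sig)
    {
        (void)sig;
        sat_array[0] = 127;   /* changes at least one byte in the region
                                 strlen() would scan (which byte depends
                                 on the machine's byte order)            */
    }

    /* After signal(SIGINT, handler), a call such as
     *     dupcat((char *)sat_array, (char *)sat_array)
     * can observe different lengths for a and b if the handler runs
     * between the two strlen() calls, even though a == b.
     */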

I still don't think it's stupid, merely (probably) inefficient.

On the other hand, using strlen() on objects which were not intended to be
strings is legitimately stupid.

On the other hand, using assert() for the return from malloc is, IMHO, stupid.
assert() should catch programming errors, not resource limitations.

>The point I am concerned to defend is solely this:
>the function dupcat() may be inefficient,
>but it clearly and correctly expresses the programmer's intent,
>and is therefore not "stupid".

This, I would agree with.

>Surely nobody but an assembly hacker would dispute the point
>that there are times when it is RATIONAL to write suboptimal
>code (in order to devote your resources to activities with a
>higher payoff)?

Assembly hackers spend most of their time and effort writing suboptimal code;
it ports horribly, so it's clearly not *optimal*.

-s
-- 
Peter Seebach - seebs@solon.com - Copyright 1996 - http://www.solon.com/~seebs
Unix/C Wizard - send mail for help, or send money for consulting!
The *other* C FAQ, the hacker FAQ, et al.  See web page above.
Unsolicited email (junk mail and ads) is unwelcome, and will be billed for.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
       [not found]                                                           ` <01bb95ba$9dfed580$496700cf@ljelmore.montana>
@ 1996-08-30  0:00                                                             ` Steve Heller
  1996-08-31  0:00                                                             ` Clayton Weaver
  1 sibling, 0 replies; 688+ messages in thread
From: Steve Heller @ 1996-08-30  0:00 UTC (permalink / raw)



"Larry J. Elmore" <ljelmore@montana.campus.mci.net> wrote:


>The problem is not with the colleges, it is with the public school
>system from elementary to high school. (It strikes me as incredible that
>the only proposed remedy not ridiculed by the mass media and politicians
>calls for handing even more money and power over to the very system and
>people that have *presided* over the collapse of education in this country.
>And why most people swallow that bilge is beyond me...)
  That's easy. They were brainwashed by that same public school
system!

>"Anyone who isn't a socialist by the age of twenty doesn't have a heart.
> Anyone who isn't a conservative by the age of forty doesn't have a brain."
>-- Winston Churchill
  "Anyone who isn't a libertarian by the age of 30 doesn't have a
clue."
--Steve Heller

Steve Heller, author and software engineer
http://ourworld.compuserve.com/homepages/steve_heller 





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-07  0:00                                   ` Tim Behrendsen
                                                       ` (8 preceding siblings ...)
  1996-08-29  0:00                                     ` Robert I. Eachus
@ 1996-08-30  0:00                                     ` Tanmoy Bhattacharya
  9 siblings, 0 replies; 688+ messages in thread
From: Tanmoy Bhattacharya @ 1996-08-30  0:00 UTC (permalink / raw)



In article <50650h$rek@goanna.cs.rmit.edu.au>
ok@goanna.cs.rmit.edu.au (Richard A. O'Keefe) writes:
<snip>
RAO:  - a program that conforms to the C standard (we were talking about
RAO:    standard C, no?) may not legally set any variable external to the
RAO:    handler if that variable is not of type sig_atomic_t.  Now on _this_
RAO:    system, sig_atomic_t is int, so a *legal* signal handler, however
RAO:    malicious, may not change a char buffer.  (This obviously won't
RAO:    apply to systems in which sig_atomic_t is char, but it _does_
RAO:    apply to this system, and the compiler is entitled to rely on it.)

It can't change a variable declared `sig_atomic_t' either: it has to
be `volatile sig_atomic_t'. And, if your buffer is volatile, it can
change anyway: signal handler or no.
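
The usual idiom, for reference (a minimal sketch; the names are illustrative):

    #include <signal.h>

    static volatile sig_atomic_t got_signal = 0;

    static void handler(int sig)
    {
        (void)sig;
        got_signal = 1;   /* fine: the object is volatile sig_atomic_t.
                             Assigning to an object of any other type
                             (even plain sig_atomic_t) from here would
                             make the behaviour undefined.              */
    }

    int main(void)
    {
        signal(SIGINT, handler);
        while (!got_signal)
            ;             /* spin until the signal arrives */
        return 0;
    }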

Cheers
Tanmoy
--
tanmoy@qcd.lanl.gov(128.165.23.46) DECNET: BETA::"tanmoy@lanl.gov"(1.218=1242)
Tanmoy Bhattacharya O:T-8(MS B285)LANL,NM87545 H:#9,3000,Trinity Drive,NM87544
Others see <gopher://yaleinfo.yale.edu:7700/00/Internet-People/internet-mail>,
<http://alpha.acast.nova.edu/cgi-bin/inmgq.pl>or<ftp://csd4.csd.uwm.edu/pub/
internetwork-mail-guide>. -- <http://nqcd.lanl.gov/people/tanmoy/tanmoy.html>
fax: 1 (505) 665 3003   voice: 1 (505) 665 4733    [ Home: 1 (505) 662 5596 ]




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-29  0:00                                                         ` Richard A. O'Keefe
  1996-08-29  0:00                                                           ` Craig Franck
@ 1996-08-30  0:00                                                           ` system
  1996-08-31  0:00                                                             ` Kenneth Mays
       [not found]                                                           ` <01bb95ba$9dfed580$496700cf@ljelmore.montana>
  1996-09-01  0:00                                                           ` Tim Behrendsen
  3 siblings, 1 reply; 688+ messages in thread
From: system @ 1996-08-30  0:00 UTC (permalink / raw)



"Larry J. Elmore" <ljelmore@montana.campus.mci.net> writes:
>
>Richard A. O'Keefe <ok@goanna.cs.rmit.edu.au> wrote in article
><503bq0$js@goanna.cs.rmit.edu.au>...

>> Prioritising the thousand things we need to teach, to students
>> is not easy.  

>I believe Tim was referring to qualified students actually ready for
>college, Richard. 

>The problem is not with the colleges, it is with the public school
>system from elementary to high school. 

>(It strikes me as incredible that
>the only proposed remedy not ridiculed by the mass media and politicians
>calls for handing even more money and power over to the very system

Being a bleeding heart liberal I generally listen to NPR rather than
the mass media.

As such I have heard of several possible remedies, a number of which are
being tried out with the support of politicians.

I certainly believe that children _should_ be educated in the basics before 
they come to college.  The reasons they are not are multisided and can not
all be laid at the doorstep of the public education system.

Robert

>Larry J. Elmore

Morphis@physics.niu.edu

Real Men change diapers




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-26  0:00                                                       ` Mark Wooding
@ 1996-08-30  0:00                                                         ` Kaz Kylheku
  1996-08-30  0:00                                                         ` Richard A. O'Keefe
  1 sibling, 0 replies; 688+ messages in thread
From: Kaz Kylheku @ 1996-08-30  0:00 UTC (permalink / raw)



In article <slrn523tpg.ce.mdw@excessus.demon.co.uk>,
Mark Wooding <mdw@excessus.demon.co.uk> wrote:

>Erk!  No it isn't.  A pure function is one whose value depends only on
>its arguments.  sin() is pure.  strlen() isn't.
>
>The argument to strlen() is a pointer to a string whose length we want.
>It's the address, not the string itself.  Because the string can change
>between calls to strlen(), it might give different results given the
>same string address.  So the compiler can't just use its general `pure'
>function mechanism for common-subexpression-optimising strlen()

So what, the value of X can change in between calls to sin(X) just the same way.

The strlen() call can indeed be eliminated via CSE. It has no side effects,
and its result depends only on the object being passed to it.

The value of strlen() can be readily reused if no pointer was dereferenced for
the purposes of storing a value since the last time strlen() was called with
the same subexpression, and if no constituent of the subexpression has
changed.

>Now, can a compiler do clever things and optimise strlen() all by
>itself?  You comment that it might spot assignments to the string.  This
>is true, but not all such assignments are visible to the compiler.  For

Yes they are all visible. The compiler can readily decide whether the
expression which is an argument to strlen() _could_ have changed by looking
at the statements that have occurred since the invocation of strlen. There are
well known techniques for doing CSE in conjunction with pointers. The most
pessimistic approach will invalidate the register-cached results of all prior
expressions when a pointer has been dereferenced. From this assumption, you
can make finer adjustments. For example, you can deduce that if a pointer
was explicitly assigned to a particular object, only that object can be
modified via that pointer. That's because the rules of C state that pointer
arithmetic is not allowed to take a pointer outside of the object it is
pointed to (with the exception of pointing one element past the end of an
array, in which case a dereference is illegal so it amounts to the same thing
from the optimizer's point of view). Thus when a pointer is dereferenced,
and thanks to careful analysis you know that it cannot affect any constituent
objects named in the subexpression that was previously passed to strlen(), you can
safely use the register copy of that previous result (given that all the
other requirements for eliminating the subexpression have been met, of
course).
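
To make the above concrete, here is a minimal compilable sketch (an
illustration added for reference, not from the original post; all the
names are invented) of when a compiler may reuse a cached strlen()
result and when it must discard it:

    #include <string.h>

    /* s points at some string; p may point anywhere into writable memory. */
    size_t demo(const char *s, char *p)
    {
        char local[16] = "scratch";  /* local buffer that cannot alias *s   */
        size_t n1 = strlen(s);

        local[0] = 'x';              /* store confined to the local buffer  */
        size_t n2 = strlen(s);       /* a compiler may reuse n1 here        */

        *p = '\0';                   /* store through a pointer that might
                                        refer into the string s points at   */
        size_t n3 = strlen(s);       /* must be re-evaluated in general     */

        return n1 + n2 + n3;
    }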

>automatic buffers, this /is/ true, but (in my experience) calls to
>strlen() and similar functions are comparatively rarely used on
>locally allocated buffers.  If the buffer is not local to the function,
>there's no guarantee that (in an extreme case) its address hasn't been
>made available to a signal handler which maliciously changes the
>string's length.
>
>Prove me wrong.

That's easy: if you modify the data in the signal handler, you are violating
the rules for writing a conforming C program. Read what the C standard has to
say about signal handlers and global variables! 

The buffer could be modified by a function call, but in that case, you can do
the same invalidation as you would for a pointer dereference.

One pesky problem is pointers that are never explicitly initialized in the
function, but are obtained as parameters. These can potentially refer to
anything that is not local to the function, and the objects they point to can
overlap. Here is where languages with more powerful parameter passing
techniques and stronger typing have a win over C.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-30  0:00                                                           ` system
@ 1996-08-31  0:00                                                             ` Kenneth Mays
  0 siblings, 0 replies; 688+ messages in thread
From: Kenneth Mays @ 1996-08-31  0:00 UTC (permalink / raw)



Making a small statement in addition to this thread: it is not so 
much the colleges' fault that the public school systems do a bad job 
in most cases. Colleges and universities spend the first two years 
reeducating the student on 9th-12th grade subjects. It was hard to 
believe I had to waste tuition money on College Algebra when I 
finished Trigonometry in 11th grade!!!! Some of the greatest minds on 
this earth never attended one college class while others were college 
dropouts. I would not focus so much on the school systems, but more 
on the minds of the people we are dealing with. If someone asked me 
what the best language for game development is, I would wisely tell 
them C/C++. If they asked me about embedded controllers, I would say 
Ada95 (easy to maintain). This, to me, doesn't take much thought. 
Why is it so hard to get it across to those college students 
who had to pass the PSAT/SAT/GRE/GMAT/CLEP and whatever else they 
take for exams today?

Did we miss the mark? Why are companies like 3DO (Studio 3DO), 
Lockheed-Martin, Microsoft, and Intel/Motorola able to push 
technology with some of the brightest (and not so bright) people in 
this country and around the globe? You will find out that a lot of the 
ingenious designs and inventions come from people who are NOT 
scientists and engineers - but hobbyists or "visionaries". You don't 
need a scientific calculator with slide rule and compass to 
understand transputers or build advanced rocket systems. You just 
have to have a brain, a will to learn, and the ability to accomplish 
your goals.

Imagine what you would think if you found out your environmental 
engineering supervisor had a degree in molecular biology. Your boss 
didn't belong to the IEEE or ACM, but could cure cancer and many 
ailments caused by pollution. Would you say your boss is not 
competent in engineering concepts because you have an engineering 
degree but your boss doesn't???? Read the fine print AGAIN!!!

You have to understand that it doesn't matter what language you wish 
to start with. You have to understand the CONCEPTS of programming. 
What is the thought process involved in programming, the correct ways 
of thinking to solve a problem and put it into a programming model. 
The programming language is just a tool and you have to know what 
tools to use correctly to solve your problem efficiently. 

Now if you're asking me or others what programming language you should 
learn so you can get a job in the real world, well, you run into the 
same problem: what jobs are out there hiring for the language you are 
learning? C/C++, Smalltalk, MUMPS, COBOL, RPG, Ada, Query languages, 
Database programming (XTALK),  PERL, Java, FORTRAN, BASIC/Visual 
BASIC, PASCAL/Delphi, and the list goes on. Would you learn Dylan? 
Many students are studying it for OOP/OOD work. Just a small 
comment: look before you leap.


Ken Mays, Ada95 Researcher
kmays@msn.com

NASA quiz:
You've got one million people on a space station circling planet Earth 
and the life support system suddenly explodes and wipes out all 
backup reserves. There isn't enough time to evacuate, a large air 
leak has developed in the cafeteria area, radiation is spreading 
rapidly through the air conditioning system, and because of the 
explosion the space station is out of orbit and falling towards Earth 
fast. Your personnel have less than an hour to live. What WILL you do? 
What WILL... you do?




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-22  0:00                                                       ` Tim Behrendsen
@ 1996-08-31  0:00                                                         ` Bengt Richter
  1996-09-01  0:00                                                           ` Maurice M. Carey IV
  0 siblings, 1 reply; 688+ messages in thread
From: Bengt Richter @ 1996-08-31  0:00 UTC (permalink / raw)



"Tim Behrendsen" <tim@airshields.com> wrote:

>Bengt Richter <bokr@accessone.com> wrote in article
><4vgs4j$evl@news.accessone.com>...
>> "Tim Behrendsen" <tim@airshields.com> wrote:
>> [...]
>> >Wait, hold the phone!  "Sorts the data without moving it"?  What,
>> >is APL's sorting algorithm O(1)?  Yes, it may not actually get
>> >sorted until it gets printed, but that's irrelevant to the fact
>> >that it eventually gets sorted.
>> 	Does that mean that your concept of the essential aspect of
>> sorting is putting the data into sort order, rather than establishing
>> the order itself? The respective timings say otherwise to me ;-)

>I'm not sure what you're trying to say, but if I have a data set,
>and I set a bit that says "this data set has the property of
>being ordered", then technically I have an ordered data set in
>O(1) time.  Now, if I do an add reduction (terminology?), the
>sorting doesn't actually have to be done, and I've saved some
>CPU cycles.

>But all that's not the point.  If my point is to take a vector as
>input, order it, and then display it on the screen, the vector
>will be sorted.  And sorting something requires moving it into
>a sorted order.

If you have (please overlook syntax bloopers)
  struct tBigComplexThing {
    ... blah, blah
  } aBigArrayOfThem[kBig];
you could sort them by moving the data around, and then print it
by a loop such as for(i=0;i<kBig;i++)
   printf("theformat",aBigArrayOfThem[i].anItem, etc..);
You could also sort by using an array
  int aiPermute[kBig];
initialized by 
  for(i=0;i<kBig;i++) aiPermute[i]=i;
and then use a quicksort or whatever so that the
integer elements of aiPermute[] get moved around instead
of the tBigComplexThing elements of aBigArrayOfThem[].
The comparison function just has to take two indices into
aiPermute[], e.g., j and k, and compare
  aBigArrayOfThem[aiPermute[j]].someField
and
  aBigArrayOfThem[aiPermute[k]].someField
and operate on aiPermute[] as if it had compared
  aiPermute[j] with aiPermute[k].
I take it something like this is involved in the APL
use of a "permutation vector" cited in a previous post.

Anyway, now you can do your print by the loop
  for(i=0;i<kBig;i++)
    printf("theformat",aBigArrayOfThem[aiPermute[i]].anItem, etc..);
and expensive movements of the original data don't have to be part
of the sorting process, even though you could say that the data
eventually gets "moved into sorted order" into the output stream.
But the latter happens in linear time, and is not relevant to the
sorting algorithm executed beforehand.
	To summarize what I was getting at, I think
"sorting data without moving it" is a fair description of sorting
data by reordering references to it, since the sorting does not
involve moving the data. I don't think subsequent extraction of
the data itself into an ordered stream can be called sorting.
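
For reference, here is a compilable version of the sketch above (a
filling-in of the acknowledged "syntax bloopers", with invented sample
data; it is an illustration, not code from the thread):

    #include <stdio.h>
    #include <stdlib.h>

    #define kBig 5

    struct tBigComplexThing { int someField; };

    static struct tBigComplexThing aBigArrayOfThem[kBig] =
        { {42}, {7}, {19}, {3}, {28} };

    /* Compare two permutation-vector entries by the records they index. */
    static int cmpPermute(const void *pj, const void *pk)
    {
        int j = *(const int *) pj, k = *(const int *) pk;
        return (aBigArrayOfThem[j].someField > aBigArrayOfThem[k].someField)
             - (aBigArrayOfThem[j].someField < aBigArrayOfThem[k].someField);
    }

    int main(void)
    {
        int aiPermute[kBig], i;

        for (i = 0; i < kBig; i++) aiPermute[i] = i;   /* identity permutation */
        qsort(aiPermute, kBig, sizeof aiPermute[0], cmpPermute);

        for (i = 0; i < kBig; i++)                     /* the data never moves */
            printf("%d\n", aBigArrayOfThem[aiPermute[i]].someField);
        return 0;
    }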

Regards,
Bengt Richter






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-22  0:00                                       ` Frank Manning
@ 1996-08-31  0:00                                         ` Bengt Richter
  1996-08-31  0:00                                           ` Frank Manning
  0 siblings, 1 reply; 688+ messages in thread
From: Bengt Richter @ 1996-08-31  0:00 UTC (permalink / raw)



frank@bigdog.engr.arizona.edu (Frank Manning) wrote:

>In article <4vgs4p$evl@news.accessone.com> bokr@accessone.com (Bengt
>Richter) writes:

>> What if computers are totally irrelevant? 

>When you give students a programming assignment, are the students
>required to run the program on actual, physical, real-life computers?
>If so, that presents certain...ah...practical problems

	First, I agree with you. But the topic I was trying to comment
on was the efficacy of assembler code as a vehicle for getting an
understanding of an algorithm. And I was trying to point out that an
algorithm may be fundamentally abstract, making computers
*essentially* irrelevant. That is not to say that the practical
ability to use computers is irrelevant to a programming assignment
involving the algorithm, nor is it to say that such an assignment
could not help a student understand the algorithm.
	I felt the value of assembler level concerns was being
evangelized to such an extent that I counter-reacted. If I had
encountered my own post out of context, emphasizing the abstract,
I would probably have felt the urge to point out that I wouldn't
want to do without a CPU/machine language view in my debugger, as
things stand now.
[...]	
>And if the students know only abstract theory, how are professors going
>to explore new theories if the students are incapable of running
>experiments, as the students most certainly will be if they don't know
>the details of how the hardware actually works? There are countless ways
>an experiment can go wrong or give misleading results. It's difficult to
>prevent those problems without a thorough understanding of hardware
>details.
	Certainly I don't think students should know only abstract theory,
even if they're math majors. But I think you could play with
algorithms in Scheme and not have to worry too much about how the
hardware works. I would think a central goal in choosing or
constructing an experimental environment would be to eliminate
extraneous concerns. 
	Of course what's extraneous for some is central to others.
There can be bugs in Scheme interpreters or compilers, but those
are presumably in a different domain from the domain of the algorithm.
Should driving school students be able to fix the transmission if that
breaks during a lesson? It's a worthy subject, but out of scope.

Regards,
Bengt Richter
BTW, did I detect a hint of exclusivity in "...professors going to
explore new theories...?" I think many students will quickly outrun
their coaches when allowed on the same track :-)





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-26  0:00                                                     ` Richard A. O'Keefe
                                                                         ` (2 preceding siblings ...)
  1996-08-26  0:00                                                       ` madscientist
@ 1996-08-31  0:00                                                       ` Tanmoy Bhattacharya
  1996-09-04  0:00                                                         ` Tom Payne
  1996-09-04  0:00                                                       ` Patrick Horgan
  4 siblings, 1 reply; 688+ messages in thread
From: Tanmoy Bhattacharya @ 1996-08-31  0:00 UTC (permalink / raw)



In article <506v0c$ne8@solutions.solon.com>
seebs@solutions.solon.com (Peter Seebach) writes:

PS: There's a possible weakness.  Assume sig_atomic_t is an int.  Assume we
PS: have an array of sig_atomic_t somewhere.  Assume we decide to, as we
PS: are permitted by the standard, treat it as an array of characters.  The
PS: first sig_atomic_t in the array may be overwritten by a signal handler
PS: we have installed.  We make sure the 2nd s_a_t in the array
PS: contains at least 
PS: one null byte, so strlen((char *) sat_array) will be zero, and pass
PS: the array to dupcat as both a and b.  (Suitably cast to (char *).)
PS: 
PS: It is entirely possible that, even though a and b are the same pointer,
PS: that a signal handler will change the NTBS they point to between a
PS: strlen(a) and a strlen(b) in the above, or between statements.
PS: 

Not quite: Only volatile sig_atomic_t can be handled portably by a
signal handler. Of course, one can cast away the volatile, but *that*
is bad anyway.
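
For reference, a minimal sketch (added for illustration, not from the
post) of the one pattern the standard does bless here: a handler that
only writes a volatile sig_atomic_t flag:

    #include <signal.h>

    static volatile sig_atomic_t got_signal = 0;

    static void handler(int sig)
    {
        (void) sig;
        got_signal = 1;   /* essentially the only portable side effect:
                             writing a volatile sig_atomic_t object      */
        /* writing into an ordinary char buffer here would make the
           program's behaviour undefined                                 */
    }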

Cheers
Tanmoy
--
tanmoy@qcd.lanl.gov(128.165.23.46) DECNET: BETA::"tanmoy@lanl.gov"(1.218=1242)
Tanmoy Bhattacharya O:T-8(MS B285)LANL,NM87545 H:#9,3000,Trinity Drive,NM87544
Others see <gopher://yaleinfo.yale.edu:7700/00/Internet-People/internet-mail>,
<http://alpha.acast.nova.edu/cgi-bin/inmgq.pl>or<ftp://csd4.csd.uwm.edu/pub/
internetwork-mail-guide>. -- <http://nqcd.lanl.gov/people/tanmoy/tanmoy.html>
fax: 1 (505) 665 3003   voice: 1 (505) 665 4733    [ Home: 1 (505) 662 5596 ]




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
       [not found]                                                           ` <01bb95ba$9dfed580$496700cf@ljelmore.montana>
  1996-08-30  0:00                                                             ` Steve Heller
@ 1996-08-31  0:00                                                             ` Clayton Weaver
  1 sibling, 0 replies; 688+ messages in thread
From: Clayton Weaver @ 1996-08-31  0:00 UTC (permalink / raw)



On 29 Aug 1996, Larry J. Elmore wrote:

> Richard A. O'Keefe <ok@goanna.cs.rmit.edu.au> wrote in article
> <503bq0$js@goanna.cs.rmit.edu.au>...

> > Prioritising the thousand things we need to teach, to students
> >  - who find reading and writing English difficult and unpleasant
> >  - who loathe and dread mathematics and anything that looks like it
> >  - who are much more interested in producing flashy GUIs than in having
> >    a working algorithm for the U to I with
> >  - who cannot listen and take notes at the same time (preprinted lecture
> >    notes are now being demanded as a right)
> >  - ah, you get the idea
> > is not easy.  

> > Tim Behrendsen, if you think assembler is more urgent than remedial
> > English and elementary mathematics, we shall just have to disagree.

> I believe Tim was referring to qualified students actually ready for
> college, Richard. The fact is, remedial programs should NOT be offered by
> colleges--unqualified students should never be admitted in the first
> place!!! The problem is not with the colleges, it is with the public school
> system from elementary to high school. (It strikes me as incredible that
> the only proposed remedy not ridiculed by the mass media and politicians
> calls for handing even more money and power over to the very system and
> people that have *presided* over the collapse of education in this country.
> And why most people swallow that bilge is beyond me...)

You don't suppose this could be a question of student maturity rather than
a lack of educator dedication?

(Although I find the basic mathematics problems strange. As a student, I
always viewed math class as "less homework, more time for
extra-curricular" and thus less of a chore than some other subjects. "Do
the math first, because it only takes a short time to be current on it.
I can consider doing the rest of this stuff after dinner or on the
weekend.")

> Larry J. Elmore
> ljelmore@montana.campus.mci.net

Regards, 

Clayton Weaver                              Transparent Words
cgweav@eskimo.com      (Seattle)      http://www.eskimo.com/~cgweav/









^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-31  0:00                                         ` Bengt Richter
@ 1996-08-31  0:00                                           ` Frank Manning
  1996-08-31  0:00                                             ` Frank Manning
  1996-09-02  0:00                                             ` deafen
  0 siblings, 2 replies; 688+ messages in thread
From: Frank Manning @ 1996-08-31  0:00 UTC (permalink / raw)



In article <5085r7$ra7@kanga.accessone.com> bokr@accessone.com (Bengt
Richter) writes:

   [...]
>         I felt the value of assembler level concerns was being
> evangelized to such an extent that I counter-reacted. If I had
> encountered my own post out of context, emphasizing the abstract,
> I would probably have felt the urge to point out that I wouldn't
> want to do without a CPU/machine language view in my debugger, as
> things stand now.

I see your point. Personally, I think Tim is going too far in
evangelizing assembly language. I agree with him that students should
get some exposure, but I personally would be satisfied with the bare
minimum for the students to get a vague idea about what actually goes
on inside the machine.

Such as -- move a couple of words from memory into registers, do a math
operation on the pair, then move the result back into memory. Maybe add
simple I/O and a jump instruction -- just enough for them to get a hint
of what's actually going on inside the magic box.

I think Tim has the right idea, but teaching sorting in assembly goes a
bit too far, IMHO.

[...]
> Should driving school students be able to fix the transmission if that
> breaks during a lesson? It's a worthy subject, but out of scope.

Good point. I agree that transmission repair doesn't belong in a driving
school. On the other hand, I'd hope the students have at least a vague
idea that transmissions usually have gears and some kind of oil or
transmission fluid inside.

"What am I gonna do? There's this weird red puddle in my driveway and
my cousin Vinny's fiance isn't around to help!"

I look at the problem from the following point of view -- in a former
life I worked at a university, and one thing I did was assist professors
in research or running labs in an engineering department.

Engineering professors, especially older ones, tend to be rather hostile
toward or intimidated by computers, especially when computers are used
for data acquisition during experiments. They're worried that students
won't learn anything if a computer reads a sensor. Some professors tend
to think that *students* need to read thermometers, pressure gages and
voltmeters themselves in order to understand physically what's going on.
Otherwise the computer does everything and magically spits out an answer,
and the students supposedly won't learn anything.

In my humble opinion, the solution to that problem is to teach the
students what actually goes on inside the computer, how a transducer
signal wends its way through an ADC into the computer memory, and how to
write the software that controls all that stuff. Students also need to
understand that things can go wrong and the computer will tell you a
direct lie if it can possibly get away with it.

-- Frank Manning
-- Chair, AIAA-Tucson Section




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-31  0:00                                           ` Frank Manning
@ 1996-08-31  0:00                                             ` Frank Manning
  1996-09-02  0:00                                             ` deafen
  1 sibling, 0 replies; 688+ messages in thread
From: Frank Manning @ 1996-08-31  0:00 UTC (permalink / raw)



I wrote:

> Engineering professors, especially older ones, tend to be rather hostile
> toward or intimidated by computers [...]

In retrospect, I think my statement is overgeneralizing. Many
engineering professors are of course experts when it comes to
computers.

-- Frank Manning
-- Chair, AIAA-Tucson Section




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-30  0:00                               ` Alan Peake
@ 1996-08-31  0:00                                 ` Robert Dewar
  1996-09-03  0:00                                   ` Alan Peake
  1996-09-07  0:00                                 ` .
  1 sibling, 1 reply; 688+ messages in thread
From: Robert Dewar @ 1996-08-31  0:00 UTC (permalink / raw)



Alan said

" Fortran IV - been so long, I forgot the version :)"

Really you mean Fortran 66, which is the name of the standardized
language, to which Fortran IV, a particular implementation, is a close
approximation.





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-30  0:00                                                         ` Richard A. O'Keefe
  1996-08-30  0:00                                                           ` Peter Seebach
@ 1996-09-01  0:00                                                           ` Joe Keane
  1996-09-04  0:00                                                             ` Richard A. O'Keefe
  1996-09-03  0:00                                                           ` Arkady Belousov
  2 siblings, 1 reply; 688+ messages in thread
From: Joe Keane @ 1996-09-01  0:00 UTC (permalink / raw)



In article <4vroh3$17f@goanna.cs.rmit.edu.au>
Richard A. O'Keefe <ok@goanna.cs.rmit.edu.au> writes:
>I would go so far as to say that
>
>	int count_blanks(char const * const s) {
>	    int i, r;
>
>	    r = 0;
>	    for (i = 0; i < strlen(s); i++)
>		if (s[i] == ' ')
>		    r++;
>	    return r;
>	}
>
>is a _good_ function.

That's just bad code.  It specifies a *quadratic* algorithm for a very
simple operation.  My impression would be that the coder doesn't
understand some basic things about C programming, or they're just
inconsiderate.  How hard is it to put the length in a variable?

It's a basic fact that, in the idiomatic `for' loop, the test is
evaluated every time through the loop.  If the upper bound is fixed,
it's really good practice to put it in a local variable.  This makes it
perfectly clear to the compiler *and the reader* that it won't change.

Conversely, calling a function on every iteration connotes that the
value may change.  In a less simple example, there may be a genuine
question about what in the loop causes the bound to change.
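
A minimal sketch (an added illustration, not code from the thread) of
the rewrite being advocated, with the bound hoisted into a local
variable:

    #include <string.h>

    int count_blanks(char const *const s) {
        size_t i, len = strlen(s);   /* bound computed once, before the loop */
        int r = 0;

        for (i = 0; i < len; i++)
            if (s[i] == ' ')
                r++;
        return r;
    }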

>The ONLY strlen() optimisation that is needed here
>is one that is completely general:  strlen() is a pure function and its
>argument does not change, so the strlen(s) computation can be hoisted out
>of the loop.

It's not a pure function.  Whether the argument changes isn't the point;
it's a local variable and it's clear to everyone that it's not changed.

In article <01bb9360$21d0dbe0$87ee6fce@timpent.airshields.com>
"Tim Behrendsen" <tim@airshields.com> writes:
>Yes, yes, I've heard this before.  The compiler knows all, sees all,
>fixes all.  If my strings are a few thousand bytes long, you
>don't think it may be *slightly* inefficient?

>I can easily call clear correct code stupid if it is blatantly
>inefficient.  I don't expect the compiler to fix bad code, and
>thus I am never disappointed.

Really.  I expect an optimizing compiler to do a decent job of low-level
things, such as allocating registers, selecting instructions, scheduling
instructions, and so on.  If optimization is turned off, the code will
get slower, maybe two to five times, but a constant factor.

I don't expect the compiler to work miracles, to turn crap into gold.
That would include changing a quadratic algorithm to a linear one.  Some
people may think that it's nifty, but i think it's just a bit disturbing
and i'd rather have compiler writers concentrate on compiling good code.

In article <slrn523tpg.ce.mdw@excessus.demon.co.uk>
Mark Wooding <mdw@excessus.demon.co.uk> writes:
>Erk!  No it isn't.  A pure function is one whose value depends only on
>its arguments.  sin() is pure.  strlen() isn't.

In article <50650h$rek@goanna.cs.rmit.edu.au>
Richard A. O'Keefe <ok@goanna.cs.rmit.edu.au> writes:
>This is where I was using sloppy language.
>The way I was using language, the argument *is* the "null-terminated
>byte string" (to use C++ draft standard terminology), and the pointer
>is merely the mechanism used to refer to it.

You can't blow off a level of indirection so easily.  If we're talking
about some expression that just involves the value of the pointer, we'd
agree that it's just CSE and there's nothing controversial about it.

>In the program fragment under discussion, it was not merely the pointer
>that was constant, but the NTBS itself.

I think that you misunderstand the `const' keyword.

>If I do
>	n = strlen(s);
>	/* much code that does not change s */
>	/* NOT the NTBS that s refers to */
>	m = strlen(s);
>it is necessarily the case that m and n will be the same size_t value.

I'm unclear on that comment, but if there are pointer stores in there,
the length may be changed, even if `s' isn't changed.

>The answer is unequivocally "Yes it CAN."

In your example, it is possible.  But it's based on a lot of assumptions
and it's not something you should depend on.  In many similar examples,
the compiler *must not* remove the duplicated calls.

--
Joe Keane, amateur mathematician




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-29  0:00                                                         ` Richard A. O'Keefe
                                                                             ` (2 preceding siblings ...)
       [not found]                                                           ` <01bb95ba$9dfed580$496700cf@ljelmore.montana>
@ 1996-09-01  0:00                                                           ` Tim Behrendsen
  3 siblings, 0 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-09-01  0:00 UTC (permalink / raw)



Richard A. O'Keefe <ok@goanna.cs.rmit.edu.au> wrote in article
<503bq0$js@goanna.cs.rmit.edu.au>...
> 
> You don't even BEGIN to know what world I want.
> I can tell you for certain sure that we don't even begin to approach
> hailing distance of the shadow of the world I want.
> To start with, I would like students at entry
> 
> [lots of crying out for thinking, engaged students]
> 
> Tim Behrendsen, if you think assembler is more urgent than remedial
> English and elementary mathematics, we shall just have to disagree.

Well, now you're getting into a whole other can o' worms.  I completely
agree with you that today's students are woefully unprepared.  Part of
it is (at least here in the US) the complete collapse of the public
school system, and part of the reason is the explosion of knowledge
of the last 50 years means that nothing can be taught in a complete
way.

I agree that education must get away from knowledge packing and
get back to the basics of learning to think.  To tell you the
truth, I'm not sure education has *ever* been devoted to learning
to think, but "knowledge packing"-style education will simply
collapse because there is too much knowledge.

But to bring it back to learning to program; don't underestimate
the specialized skills required to learn to "think like a programmer".
Programming is a quite unique skill blending abstract problem
solving with concrete data movement.  Yet, the data movement is
still abstract.

I believe that the more you can build a programming foundation
on concrete principles, the more likely a student will "get it."
Why do many people teach sorting using a deck of cards?  It's
because the student can see the cards moving around in a concrete
way.

When a student is taught using a *simple* assembly, they experience
the bytes moving around in a nice flat map.  This makes the
concepts concrete, because it's as if they are moving marbles
around in little boxes.  They can experience the data moving
around, without the confusion of the syntax.

I think one of the biggest myths in computer science is that
"HLL syntax is simple to understand."

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-31  0:00                                                         ` Bengt Richter
@ 1996-09-01  0:00                                                           ` Maurice M. Carey IV
  0 siblings, 0 replies; 688+ messages in thread
From: Maurice M. Carey IV @ 1996-09-01  0:00 UTC (permalink / raw)



So.

Bengt Richter <bokr@accessone.com> wrote in article
<5085ou$ra7@kanga.accessone.com>...
> "Tim Behrendsen" <tim@airshields.com> wrote:
> 
> >Bengt Richter <bokr@accessone.com> wrote in article
> ><4vgs4j$evl@news.accessone.com>...
> >> "Tim Behrendsen" <tim@airshields.com> wrote:
> >> [...]
> >> >Wait, hold the phone!  "Sorts the data without moving it"?  What,
> >> >is APL's sorting algorithm O(1)?  Yes, it may not actually get
> >> >sorted until it gets printed, but that's irrelevant to the fact
> >> >that it eventually gets sorted.
> >> 	Does that mean that your concept of the essential aspect of
> >> sorting is putting the data into sort order, rather than establishing
> >> the order itself? The respective timings say otherwise to me ;-)
> 
> >I'm not sure what you're trying to say, but if I have a data set,
> >and I set a bit that says "this data set has the property of
> >being ordered", then technically I have an ordered data set in
> >O(1) time.  Now, if I do an add reduction (terminology?), the
> >sorting doesn't actually have to be done, and I've saved some
> >CPU cycles.
> 
> >But all that's not the point.  If my point is to take a vector as
> >input, order it, and then display it on the screen, the vector
> >will be sorted.  And sorting something requires moving it into
> >a sorted order.
> 
> If you have (please overlook syntax bloopers)
>   struct tBigComplexThing {
>     ... blah, blah
>   } aBigArrayOfThem[kBig];
> you could sort them by moving the data around, and then print it
> by a loop such as for(i=0;i<kBig;i++)
>    printf("theformat",aBigArrayOfThem[i].anItem, etc..);
> You could also sort by using an array
>   int aiPermute[kBig];
> initialized by 
>   for(i=0;i<kBig;i++) aiPermute[i]=i;
> and then use a quicksort or whatever so that the
> integer elements of aiPermute[] get moved around instead
> of the tBigComplexThing elements of aBigArrayOfThem[].
> The comparison function just has to take two indices into
> aiPermute[], e.g., j and k, and compare
>   aBigArrayOfThem[aiPermute[j]].someField
> and
>   aBigArrayOfThem[aiPermute[k]].someField
> and operate on aiPermute[] as if it had compared
>   aiPermute[j] with aiPermute[k].
> I take it something like this is involved in the APL
> use of a "permutation vector" cited in a previous post.
> 
> Anyway, now you can do your print by the loop
>   for(i=0;i<kBig;i++)
>     printf("theformat",aBigArrayOfThem[aiPermute[i]].anItem, etc..);
> and expensive movements of the original data don't have to be part
> of the sorting process, even though you could say that the data
> eventually gets "moved into sorted order" into the output stream.
> But the latter happens in linear time, and is not relevant to the
> sorting algorithm executed beforehand.
> 	To summarize what I was getting at, I think
> "sorting data without moving it" is a fair description of sorting
> data by reordering references to it, since the sorting does not
> involve moving the data. I don't think subsequent extraction of
> the data itself into an ordered stream can be called sorting.
> 
> Regards,
> Bengt Richter
> 
> 
> 




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Should I learn C or Pascal?
  1996-08-27  0:00             ` Jeffrey C. Dege
                                 ` (4 preceding siblings ...)
  1996-08-28  0:00               ` Robert Dewar
@ 1996-09-01  0:00               ` Patrick Horgan
  1996-09-12  0:00                 ` Delete - Don't Bother to Read This Charles H. Sampson
  5 siblings, 1 reply; 688+ messages in thread
From: Patrick Horgan @ 1996-09-01  0:00 UTC (permalink / raw)



In article <slrn524jju.7hg.jdege@jdege.visi.com>, jdege@jdege.visi.com (Jeffrey C. Dege) writes:
> 
> Has _anyone_ had an instructor who brought any excitement to this stuff,
> or is it inherently impossible to teach without becoming dull and tedious?

I think I've done a pretty good job of it with pictures and arrows and yelling,
and drawing faces on things and jokes.  If you're interesting and you keep a
close lookout for glazed eyes and fix it before it's a problem, you can teach
things pretty well.  I suspect that I can do a good job of it because I like
it.  I get excited about cool algorithms and I like to show other people how
cool they are.

That's not to say that I uniformly teach well to all though.  One of the
hardest things about teaching (and the reason I like tutoring better) is
that everyone brings something different to the classroom and nothing you
can do will be right for everyone...sigh...

-- 

   Patrick J. Horgan    patrick@broadvision.com   Have horse will ride.
   Opinions mine, not my employer's except by most bizarre coincidence.






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-31  0:00                                           ` Frank Manning
  1996-08-31  0:00                                             ` Frank Manning
@ 1996-09-02  0:00                                             ` deafen
  1996-09-03  0:00                                               ` Frank Manning
                                                                 ` (4 more replies)
  1 sibling, 5 replies; 688+ messages in thread
From: deafen @ 1996-09-02  0:00 UTC (permalink / raw)



Frank Manning (frank@bigdog.engr.arizona.edu) wrote:
   [...]

: In my humble opinion, the solution to that problem is to teach the
: students what actually goes on inside the computer, how a transducer
: signal wends its way through an ADC into the computer memory, and how to
: write the software that controls all that stuff. Students also need to
: understand that things can go wrong and the computer will tell you a
: direct lie if it can possibly get away with it.

Frank, please allow me to speak from a student's point of view.  I'm a
senior in CIS, so I'm not required to do any assembly coding.  (For that
matter, I'm not required to learn much programming at all; I consider
myself the "odd man out" because I actually enjoy doing it and learning
about it.)

I do have a point.  Please be patient.  :)

Computer science students entering college today face a dilemma.  While 
we understand that the fundamentals (how the "transducer signal wends its 
way...into the computer memory") are important, we need to keep up with 
the state of the art, as well.

Those who have been in computing for twenty (or even ten) years have an 
advantage over us; they've had the time to learn all of the fundamentals 
and grow with the technology and paradigm shifts.  We are faced with the 
daunting task of learning all of these things in a few years, rather than 
a decade or two.

If I were to specialize in "how to write the software that controls all 
that stuff," I'd be left with no time at all to learn the more 
sophisticated things -- the MS Windows API, for example.

I can't speak much for the academic or research communities.  However, if 
a programmer is going to be of much use at all to the business community, 
it's far more important *to the employer* that s/he know the current 
state of the art than the underlying foundation.

I'm not saying that it shouldn't be taught.  However, before decrying the 
lack of fundamental knowledge in current CS graduates, it's important to 
take these things into consideration.

Colleges are stuck with the unenviable task of ensuring that their 
graduates will be employable and proficient.  An employer (in the 
business community) doesn't much care if a programmer knows what object 
code is created from their source code.  They care that the programmer 
can program well *in the environment of that particular organization*.

As to the statement that students should learn that the computer will lie 
at every opportunity, be assured that this IS being taught.  Perhaps it's 
not being explained in terms of the actual machine language, but we're 
learning it.  Believe me.  (Actually, I think that that knowledge comes 
from hands-on programming experience, which we've all got to do.)

Side note: I'm glad that this is the first thread I picked up in 
c.l.c++.  Nice to see intelligent discourse on Usenet for once.  :)

--
Hal Haygood :: PC/Network/Unix technician, Suntek Integrated Technologies
hal@suncap.com  ::  deafen@asu.edu  ::  hagar@mail.hvs.com
My opinions and those of my employer rarely, if ever, coincide.





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Teaching sorts [was Re: What's the best language to start with?]
  1996-08-27  0:00                         ` Richard A. O'Keefe
@ 1996-09-02  0:00                           ` Lawrence Kirby
  0 siblings, 0 replies; 688+ messages in thread
From: Lawrence Kirby @ 1996-09-02  0:00 UTC (permalink / raw)



In article <4vtto9$77n@goanna.cs.rmit.edu.au>
           ok@goanna.cs.rmit.edu.au "Richard A. O'Keefe" writes:

>dewar@cs.nyu.edu (Robert Dewar) writes:
>
>>Richard said
>>"There is another factor:  heap sort requires moving one's attention from
>>card [i] to card [i/2]; this is a very expensive operation for people".
>
>>Oh gosh no!!!
>
>>Arrange the cards in a heap layed (sic) out as a binary tree, nothing else
>>makes sense if you are using heap sort on cards. Remember that the
>>[i] to [i/2] business is just a trick for mapping the underlying binary
>>tree!

That's how I approached it originally. The problem with this is that the
cards near the apex of the heap are physically a long way apart and it
takes longer to move them.

>A very fair point, which I must concede.
>It does, however, support my contention, which is that the costs for people
>may differ from the costs for computers, and different spatial layouts of
>the cards may very large difference to human costs, thus making manual card
>sorting a rather poor way to gain intuition about *computer* costs.

However it is still a good way of getting to know the algorithm. If you
can apply an algorithm to different situations where different operations
are available with different costs then you know the algorithm well.

>Animations such as those on the Cormen Leiserson & Rivest CD may perhaps
>be a better way to go.

Possibly although I haven't seen them. The important thing for learning
is to perform the steps yourself.
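
For reference, a minimal sketch (an added illustration, not from the
thread) of the array-as-tree index trick under discussion, using
1-based indices so that the parent of node i is i/2 and its children
are 2*i and 2*i+1:

    #include <stddef.h>

    /* Sift element i down through the heap stored in a[1..n]. */
    static void sift_down(int a[], size_t i, size_t n)
    {
        while (2 * i <= n) {
            size_t child = 2 * i;                     /* left child        */
            if (child < n && a[child + 1] > a[child])
                child++;                              /* larger of the two */
            if (a[i] >= a[child])
                break;
            int tmp = a[i]; a[i] = a[child]; a[child] = tmp;
            i = child;                                /* follow the swap   */
        }
    }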

-- 
-----------------------------------------
Lawrence Kirby | fred@genesis.demon.co.uk
Wilts, England | 70734.126@compuserve.com
-----------------------------------------




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-02  0:00                                             ` deafen
                                                                 ` (3 preceding siblings ...)
  1996-09-03  0:00                                               ` Tim Behrendsen
@ 1996-09-03  0:00                                               ` Phil Barnett
  4 siblings, 0 replies; 688+ messages in thread
From: Phil Barnett @ 1996-09-03  0:00 UTC (permalink / raw)



In article <50fmsm$3o2@news.asu.edu>, deafen@imap2.asu.edu says...

>I can't speak much for the academic or research communities.  However, if 
>a programmer is going to be of much use at all to the business community, 
>it's far more important *to the employer* that s/he know the current 
>state of the art than the underlying foundation.

This is a total crock. If you don't know the foundations, you will build 
shaky apps with your state-of-the-art technology with completely good 
intentions. You have it exactly backwards.

If you know the foundations, you can create excellent, stable and robust code 
in ANY language. 

If you don't know the foundations, you will be beaten to death by your 
mistakes or will build as many beautiful pieces of useless or wrong code as 
you will good ones.

>I'm not saying that it shouldn't be taught.  However, before decrying the 
>lack of fundamental knowledge in current CS graduates, it's important to 
>take these things into consideration.

No, after being in the business community for many years, I can tell you by 
experience that this is simply not the case.

>Colleges are stuck with the unenviable task of ensuring that their 
>graduates will be employable and proficient.  An employer (in the 
>business community) doesn't much care if a programmer knows what object 
>code is created from their source code.  They care that the programmer 
>can program well *in the environment of that particular organization*.

College graduates are seldom useful for the large organization until they 
have spent a few years learning how it all actually works. (Of course, there 
are exceptions.)

I work for a large multinational corporation with a huge IS department, and 
college graduates are not placed in the planning or management of projects 
until they have many years on the job. There is a reason for this. They do 
not know the foundations of THIS PARTICULAR business.

Only experience teaches what an employee needs to know to become valuable.

Foundations make the trip quicker.

Foundations allow you to know where the mistakes and errors lie without 
actually committing them.

Foundations keep you from going down the wrong path in the first place.

State of the Art technology provides none of this. 

---
              Phil Barnett  mailto:philb@iag.net
                       WWW  http://www.iag.net/~philb/
                  FTP Site  ftp://ftp.iag.net/pub/clipper





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-02  0:00                                             ` deafen
                                                                 ` (2 preceding siblings ...)
  1996-09-03  0:00                                               ` Bob Kitzberger
@ 1996-09-03  0:00                                               ` Tim Behrendsen
  1996-09-03  0:00                                               ` Phil Barnett
  4 siblings, 0 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-09-03  0:00 UTC (permalink / raw)



deafen@imap2.asu.edu wrote in article <50fmsm$3o2@news.asu.edu>...
> Frank Manning (frank@bigdog.engr.arizona.edu) wrote:
>    [...]
> 
> : In my humble opinion, the solution to that problem is to teach the
> : students what actually goes on inside the computer, how a transducer
> : signal wends its way through an ADC into the computer memory, and how
to
> : write the software that controls all that stuff. Students also need to
> : understand that things can go wrong and the computer will tell you a
> : direct lie if it can possibly get away with it.
> 
> Frank, please allow me to speak from a student's point of view.  I'm a
> senior in CIS, so I'm not required to do any assembly coding.  (For that
> matter, I'm not required to learn much programming at all; I consider
> myself the "odd man out" because I actually enjoy doing it and learning
> about it.)
> 
> I do have a point.  Please be patient.  :)
> 
> Computer science students entering college today face a dilemma.  While 
> we understand that the fundamentals (how the "transducer signal wends its

> way...into the computer memory") are important, we need to keep up with 
> the state of the art, as well.
> 
> Those who have been in computing for twenty (or even ten) years have an 
> advantage over us; they've had the time to learn all of the fundamentals 
> and grow with the technology and paradigm shifts.  We are faced with the 
> daunting task of learning all of these things in a few years, rather than

> a decade or two.
> 
> If I were to specialize in "how to write the software that controls all 
> that stuff," I'd be left with no time at all to learn the more 
> sophisticated things -- the MS Windows API, for example.
>
> I can't speak much for the academic or research communities.  However, if

> a programmer is going to be of much use at all to the business community,

> it's far more important *to the employer* that s/he know the current 
> state of the art than the underlying foundation.

I think this is a sub-point of this thread: which is more
important, learning the general concepts of programming, or
learning specific knowledge such as the MS Windows API?

With all due respect, I have to tell you that you know almost
nothing that is useful to me as an employer.  What I mean by
that is you have smattering of this, and a smattering of that,
but all that pales compared to all the knowledge that you need
to be a truly useful asset.

When I hire new people, I look for the person that has the best
understanding of the fundamental concepts, because any specific
subject you will have learned (such as the Windows API) is so
shallow as to be useless.  I often say that "I would rather have
smart and ignorant, rather than stupid and knowledgeable."
 
> I'm not saying that it shouldn't be taught.  However, before decrying the

> lack of fundamental knowledge in current CS graduates, it's important to 
> take these things into consideration.
> 
> Colleges are stuck with the unenviable task of ensuring that their 
> graduates will be employable and proficient.  An employer (in the 
> business community) doesn't much care if a programmer knows what object 
> code is created from their source code.  They care that the programmer 
> can program well *in the environment of that particular organization*.

The problem is, the programmer can't program well.  They just
plain don't have enough experience from the toy projects in the
typical CS curriculum. The question is, how well prepared are they to
learn new environments?  In other words, how well prepared are you
to "think like a programmer"?  Any API can be learned; the question
is, how good is your fundamental understanding of computer
programming so that you can have some perspective in order to
understand the API?

> As to the statement that students should learn that the computer will lie

> at every opportunity, be assured that this IS being taught.  Perhaps it's

> not being explained in terms of the actual machine language, but we're 
> learning it.  Believe me.  (Actually, I think that that knowledge comes 
> from hands-on programming experience, which we've all got to do.)

Are you...?  Not the students that I get.  The majority of the
ones that I get are not capable of taking a programming problem
that they've never seen before and coming up with an algorithm
for a solution.  In one hiring cycle I went through, I got 1 guy
out of 50 applicants that could do my testing process.  I hired
him, and the rest went elsewhere.

I'm not trying to downplay the value of your CS degree, but the
world of programming is far, far larger than the tunnel-visioned
view you've been given at school.

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-02  0:00                                             ` deafen
@ 1996-09-03  0:00                                               ` Frank Manning
  1996-09-03  0:00                                               ` Steve Howard
                                                                 ` (3 subsequent siblings)
  4 siblings, 0 replies; 688+ messages in thread
From: Frank Manning @ 1996-09-03  0:00 UTC (permalink / raw)



In article <50fmsm$3o2@news.asu.edu> deafen@imap2.asu.edu writes:

> Computer science students entering college today face a dilemma.  While 
> we understand that the fundamentals (how the "transducer signal wends its 
> way...into the computer memory") are important, we need to keep up with 
> the state of the art, as well.

I see I haven't expressed myself very well. My fault.

I was questioning how *engineering* students were being educated about
computers -- aerospace and mechanical students, in particular, which I
should have mentioned. Obviously electrical & computer engineering
students are being taught about ADC's and stuff (at least I assume
they are).

I don't have any particular complaints about CS education per se. For
one thing, I don't know much about the CS curriculum, other than the
bits and pieces I pick up in c.l.a from Mike Feldman et al.

I've seen many aero/mech research projects get bogged down in software
problems because neither the students nor professors -- with some
exceptions -- know much about software development. Examples:

   (1)  "Why should I put a program in more than one file? If I'm
	looking for something, I know it's either above or below the
	cursor."

   (2)  Professor writes GOSUB 5000 and expects everybody else on
	the team to know what that means.

   (3)  "Local variable? What's that?"

   (4)  Wind tunnel software written in HP Basic runs on machine
	that was new when Fred Flintstone was in diapers. Porting
	to somewhat newer PC in QuickBasic is nightmare.

   (5)  Computerized vehicle is field tested. PC with floppy
	drives is used as an embedded computer. Rather than store
	test data on floppies for later analysis, a printer is set
	up next to the test track. At the end of each run,
	professor drives to printer, plugs in printer and does
	screen dump.

> Those who have been in computing for twenty (or even ten) years have an 
> advantage over us; they've had the time to learn all of the fundamentals 
> and grow with the technology and paradigm shifts.  

Don't worry about it. You can replace the experienced boneheads I've
just alluded to. There are plenty of them around.

-- Frank Manning
-- Chair, AIAA-Tucson Section




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-02  0:00                                             ` deafen
  1996-09-03  0:00                                               ` Frank Manning
  1996-09-03  0:00                                               ` Steve Howard
@ 1996-09-03  0:00                                               ` Bob Kitzberger
  1996-09-03  0:00                                               ` Tim Behrendsen
  1996-09-03  0:00                                               ` Phil Barnett
  4 siblings, 0 replies; 688+ messages in thread
From: Bob Kitzberger @ 1996-09-03  0:00 UTC (permalink / raw)




deafen@imap2.asu.edu wrote:

: An employer (in the 
: business community) doesn't much care if a programmer knows what object 
: code is created from their source code.  They care that the programmer 
: can program well *in the environment of that particular organization*.

Not all companies have that point of view.  Around here, we assume
that a strength in the fundamentals, plus a proven above-the-rim
level of competence allows our developers to switch from today's
environment to tomorrow's.  One thing that's certain in this industry
is rapid change.  If you spend all of your college years learning
the technology du jour, how can a company be assured that you can
quickly shift to tomorrow's technology?

: Side note: I'm glad that this is the first thread I picked up in 
: c.l.c++.  Nice to see intelligent discourse on Usenet for once.  :)

Clearly because it's cross-posted to comp.lang.ada :-)

--
Bob Kitzberger	      Rational Software Corporation       rlk@rational.com
http://www.rational.com http://www.rational.com/pst/products/testmate.html




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-31  0:00                                 ` Robert Dewar
@ 1996-09-03  0:00                                   ` Alan Peake
  1996-09-07  0:00                                     ` Robert Dewar
  0 siblings, 1 reply; 688+ messages in thread
From: Alan Peake @ 1996-09-03  0:00 UTC (permalink / raw)




>Alan said

>" Fortran IV - been so long, I forgot the version :)"

>Really you mean Fortran 66, which is the name of the standardized
>language, to which Fortran IV, a particular implementation, is a close
>approximation.

You're probably right. The lecture notes didn't mention Fortran 66 - only IV 
and as I recall, the names Flair and George II came into it. One was the 
compiler I think. The machine (which I never saw as it resided at a different 
campus about 15 miles away) that we learnt on was an ICL of about 1972 vintage 
or earlier. All our programming was done on punch-cards so if you made a 
mistake, you had to wait a day or two to find out. Imagine the luxury when we 
got to use the Electrical Engineering dept's Nova with Basic and an ASR33 
terminal!  How times have changed!

Alan






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-30  0:00                                                         ` Richard A. O'Keefe
  1996-08-30  0:00                                                           ` Peter Seebach
  1996-09-01  0:00                                                           ` Joe Keane
@ 1996-09-03  0:00                                                           ` Arkady Belousov
  2 siblings, 0 replies; 688+ messages in thread
From: Arkady Belousov @ 1996-09-03  0:00 UTC (permalink / raw)



Hello!

Richard A. O'Keefe wrote:

> Consider
>         char *dupcat(char const *a, char const *b) {
>             char *result = malloc(strlen(a) + strlen(b) + 1);
>             assert(result != 0);
>             memcpy(result, a, strlen(a));
>             memcpy(result + strlen(a), b, strlen(b));
>             memcpy(result + strlen(a) + strlen(b), "", 1);
>             return result;
>         }
> I repeat, this is NOT the way I would normally write it.
> I contend that it is _inefficient_ but not "stupid".
> If a malicious (and not strictly conforming) signal handler smashes
> NBTS(a) or NBTS(b), then you have worse problems than just possible
> changes to strlen().  Manually saving the numbers won't protect you
> from the fact that the null-terminated-byte-strings have changed,
> so that the function result (as an NBTS) is not the concatenation
> of its NBTS arguments.

     For ease of controlling such situations the standard has the "volatile"
keyword. So you are right, I think.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-02  0:00                                             ` deafen
  1996-09-03  0:00                                               ` Frank Manning
@ 1996-09-03  0:00                                               ` Steve Howard
  1996-09-03  0:00                                               ` Bob Kitzberger
                                                                 ` (2 subsequent siblings)
  4 siblings, 0 replies; 688+ messages in thread
From: Steve Howard @ 1996-09-03  0:00 UTC (permalink / raw)



deafen@imap2.asu.edu wrote:
> 
> Computer science students entering college today face a dilemma.  While
> we understand that the fundamentals (how the "transducer signal wends its
> way...into the computer memory") are important, we need to keep up with
> the state of the art, as well.
> 

This is not unique to students. Those of us in the "business" have to do this as well. This is, in 
fact, true with almost any profession (law, medicine, etc.) The state of the art is constantly 
changing, and the most valuable professionals are those who know both how things are currently done and 
how the industry is progressing. This is one of the reasons that there are so many professional 
programming journals and magazines.

> Those who have been in computing for twenty (or even ten) years have an
> advantage over us; they've had the time to learn all of the fundamentals
> and grow with the technology and paradigm shifts.  We are faced with the
> daunting task of learning all of these things in a few years, rather than
> a decade or two.
> 

I don't think students have a unique claim to this, either. I believe that it was as hard for me to 
change from what I learned in college to what was done on my first job 10+ years ago as it will be 
for someone graduating today. The same will probably be true 10 years hence for the kids graduating then.

> If I were to specialize in "how to write the software that controls all
> that stuff," I'd be left with no time at all to learn the more
> sophisticated things -- the MS Windows API, for example.
> 

Learning the Windows API may be important to get a specific job, but what if the only jobs available 
are for Unix or Mac developers? The basic concepts for programming a Unix GUI or the Mac GUI are 
essentially the same as for Windows, but vastly different in the details. 

> I can't speak much for the academic or research communities.  However, if
> a programmer is going to be of much use at all to the business community,
> it's far more important *to the employer* that s/he know the current
> state of the art than the underlying foundation.
> 

Which state of the art would that be? Operating system design for the newest parallel processor? Java 
development for Internet applications? C++ for Windows development? Assembly or Ada development for 
embedded microprocessor systems?

All of these tasks are going on today, as well as a million more. To teach all of these things to every 
student so that s/he could walk into a job and start right away would be impossible. All of these tasks 
share some common base of "fundamental" concepts. Teaching these fundamentals would allow a new hire to 
adapt to them quickly. Familiarity with the more specific concepts might be desirable, but IMHO the 
details will be out of date an hour after they are taught.

In addition, I suspect that much (most?) of the work out there in the computer field is somewhat 
distant from the state of the art. A lot of it is integrating data from existing systems with new 
systems.... maintaining legacy applications and code... adapting systems to new operating 
environments... etc.

> I'm not saying that it shouldn't be taught.  However, before decrying the
> lack of fundamental knowledge in current CS graduates, it's important to
> take these things into consideration.
> 
> Colleges are stuck with the unenviable task of ensuring that their
> graduates will be employable and proficient.  An employer (in the
> business community) doesn't much care if a programmer knows what object
> code is created from their source code.  They care that the programmer
> can program well *in the environment of that particular organization*.
> 

The problem as I see it is that the new stuff changes far too fast to teach. It is a moving target, and 
if the wrong "state of the art" is taught, you have a bunch of grads who lack both the fundamentals of 
software development, and lack training in the development methods currently in vogue. Teaching the 
fundamentals will rarely miss the mark, since it is the historical basis for all of the state of the 
art. 
-- 
E. Steve Howard                | Lockheed Martin
                               | Ocean, Radar, & Sensor Systems        
mailto:howard@mtm.syr.lmco.com | Syracuse, NY




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-30  0:00                                                           ` Peter Seebach
@ 1996-09-03  0:00                                                             ` Lawrence Kirby
  0 siblings, 0 replies; 688+ messages in thread
From: Lawrence Kirby @ 1996-09-03  0:00 UTC (permalink / raw)



In article <506v0c$ne8@solutions.solon.com>
           seebs@solon.com "Peter Seebach" writes:

...

>There's a possible weakness.  Assume sig_atomic_t is an int.  Assume we
>have an array of sig_atomic_t somewhere.  Assume we decide to, as we
>are permitted by the standard, treat it as an array of characters.  The
>first sig_atomic_t in the array may be overwritten by a signal handler
>we have installed.  We make sure the 2nd s_a_t in the array contains at least
>one null byte, so strlen((char *) sat_array) will be zero, and pass
>the array to dupcat as both a and b.  (Suitably cast to (char *).)
>
>It is entirely possible that, even though a and b are the same pointer,
>that a signal handler will change the NTBS they point to between a
>strlen(a) and a strlen(b) in the above, or between statements.

However the compiler doesn't have to worry about this. If you access
a volatile object through a non-volatile lvalue you get undefined
behaviour (6.5.3). Since signal handlers can only access static variables
of type volatile sig_atomic_t and strlen does not treat the character array
as volatile the compiler doesn't have to make the strlen code signal safe.

Consider the situation where the signal handler changed the string while
strlen was running, it could easily miss the null character terminator
completely. This is simply not a sensible thing for code to do.

>The problem is unique to (char *), which may point to things other than
>arrays of characters, because you may treat any object as an array of
>characters.

That is certainly a general aliasing problem. On the other hand it may be
that a lot of those situations can't occur in strictly conforming programs
so, again, the compiler is not obliged to deal with them to be conforming,
although the compiler writer may not want the trouble of convincing irate
customers about this.

-- 
-----------------------------------------
Lawrence Kirby | fred@genesis.demon.co.uk
Wilts, England | 70734.126@compuserve.com
-----------------------------------------




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-01  0:00                                                           ` Joe Keane
@ 1996-09-04  0:00                                                             ` Richard A. O'Keefe
  0 siblings, 0 replies; 688+ messages in thread
From: Richard A. O'Keefe @ 1996-09-04  0:00 UTC (permalink / raw)



Joe Keane <jgk@jgk.org> writes:
>That's just bad code.  It specifies a *quadratic* algorithm for a very
>simple operation.  My impression would be that the coder doesn't
>understand some basic things about C programming, or they're just
>inconsiderate.  How hard is it to put the length in a variable?

Perhaps we have different perspectives.
Would a *professional* programmer produce code like that?
No way!

Would it be "bad" if a 3rd semester student produced it?
In my view, no way!

At the moment, I'm bogged down in marking.  Last semester, when the
students were in their 3rd semester, using C, I would have wept for
joy to see code that good.  This semester, now they have had another
semester under their belts, and understand something about big-Oh,
I would expect better code.  In fact, to my joy I *am* seeing better
code, much better code.  But the students are writing Ada this
semester, and I find that our students *do* write much better Ada
than they do C.

>It's a basic fact that, in the idiomatic `for' loop, the test is
>evaluated every time through the loop.

You have forgotten the "as if" rule.
The outcome of the program must be the same AS IF the expression had
been evaluated every time, but when part of an expression manifestly
does not change, the compiler is under no obligation to re-evaluate it.

>If the upper bound is fixed,
>it's really good practice to put it in a local variable.

We *agree* that a *professional* programmer, if using a language in which
the _only_ for-loop is like C's (which I am not knocking; I use "do" in
Lisp and Scheme all the time, and it's like C's "for" only better), would
arrange it cheaply.  By the way, a *professional* C programmer would not
call strlen() at all in such an example.

But coming from an educational perspective, I am unwilling to call
*clear*, *correct* code "stupid", or even "bad".  A student who is
capable of writing code that _good_ is one we can help to improve.

>>The ONLY strlen() optimisation that is needed here
>>is one that is completely general:  strlen() is a pure function and its
>>argument does not change, so the strlen(s) computation can be hoisted out
>>of the loop.

>It's not a pure function.  Whether the argument changes isn't the point;
>it's a local variable and it's clear to everyone that it's not changed.

In the article you quote, I explained that by "argument" I meant the
NTBS that the actual parameter points to, AND IN THIS EXAMPLE THAT
NTBS DOES NOT CHANGE, and that *IS* the point.

>Really.  I expect an optimizing compiler to do a decent job of low-level
>things, such as allocating registers, selecting instructions, scheduling
>instructions, and so on.  If optimization is turned off, the code will
>get slower, maybe two to five times, but a constant factor.

>I don't expect the compiler to work miracles, to turn crap into gold.
>That would include changing a quadratic algorithm to a linear one.  Some
>people may think that it's nifty, but i think it's just a bit disturbing
>and i'd rather have compiler writers concentrate on compiling good code.

Again, we obviously have different perspectives.
I have one foot in the "declarative programming" community (and if you
think Haskell and Mercury are esoteric, think "SQL" which has _exactly_
the same kinds of optimisation requirements and opportunities).
For example, there is a well-known transformation which routinely
turns a class of algorithms with exponential cost if taken naively
into implementations with polynomial cost.

Do I have to explain that the strlen() optimisation is one that has been
legal for Fortran compilers since the first Fortran standard?  Or that
strlen() is a C library function, and that the C standard very carefully
allows a compiler to take advantage of special knowledge about library
functions?

>You can't blow off a level of indirection so easily.

"Blow off" iks rather offensive language.
The NTBS that the pointer pointers to does not change.
True, that is "a level of indirection", but I *explained* how that
can be handled.

>>In the program fragment under discussion, it was not merely the pointer
>>that was constant, but the NTBS itself.

>I think that you misunderstand the `const' keyword.

Bullshit.  I said nothing about the 'const' keyword.
I said that the pointer and the NTBS were *constant*.
That doesn't mean they are declared "read-only".
It means THEY DIDN'T CHANGE.  That's all.
I defy anyone to show where the pointer or the NTBS changed in my
example.  I didn't mention the 'const' keyword because it wasn't relevant.

>>If I do
>>	n = strlen(s);
>>	/* much code that does not change s */
>>	/* NOT the NTBS that s refers to */
>>	m = strlen(s);
>>it is necessarily the case that m and n will be the same size_t value.

>I'm unclear on that comment, but if there are pointer stores in there,
>the length may be changed, even if `s' isn't changed.

The comment (which should have read NOR, not NOT) *says* there are no
such stores.  You are like someone who is told

	"There are no dinosaurs in this room."

and replies

	"I'm unclear on that remark, but if there are any dinosaurs
	 in this room, one of them is about to bite your head off."

>In your example, it is possible.  But it's based on a lot of assumptions
>and it's not something you should depend on.

You have mistaken just about everything I was trying to say.
I suppose this must be my fault.
I *did* say explicitly that *I* know why naive compilers generate
costly code for this, and that it is not something that I would myself
write.  I have been using C since V6+ UNIX, and have had to live with
naive C compilers for years.

>In many similar examples,
>the compiler *must not* remove the duplicated calls.

Yes, but we aren't *talking* about similar examples.  Again, you are
shifting the argument to attack a position I am not stating or defending.
The strlen() example is a very very specific one; it is *ONLY* in the
specific case of strlen() and *ONLY* in the specific case where the
NTBS is not changing that C suckers students into writing

	for (i = 0; i < strlen(s); i++) ... process s[i] ...

All I am claiming, all I have ever claimed, is that a beginner who
writes *that* code to iterate over an unchanging NTBS is not
writing "stupid" or "bad" code, merely code which is _inefficient_
under _some_ C compilers (admittedly the majority of them).
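
(For concreteness, the two forms side by side -- a sketch only, with an
invented "process" step; nothing here changes the NTBS:)

    #include <string.h>

    /* the form beginners write: strlen() may be re-evaluated on
       every iteration, quadratic under a naive compiler */
    void walk_naive(const char *s)
    {
        size_t i;
        for (i = 0; i < strlen(s); i++)
            (void)s[i];            /* ... process s[i] ... */
    }

    /* the "arranged cheaply" form: hoist the length once */
    void walk_hoisted(const char *s)
    {
        size_t i, n = strlen(s);
        for (i = 0; i < n; i++)
            (void)s[i];            /* ... process s[i] ... */
    }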

I refuse to call clear correct *working* *readable* code "stupid"
or "bad".  That doesn't mean I am going to slobber over the author
with praise, or that I am not going to put a fair bit of red ink
on such an assignment.  It just means I'm not going to call such a
student nasty names, and am going to take it as an encouraging sign
that the student has learned how to do some important things _well_.

-- 
Australian citizen since 14 August 1996.  *Now* I can vote the xxxs out!
Richard A. O'Keefe; http://www.cs.rmit.edu.au/%7Eok; RMIT Comp.Sci.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-31  0:00                                                       ` Tanmoy Bhattacharya
@ 1996-09-04  0:00                                                         ` Tom Payne
  0 siblings, 0 replies; 688+ messages in thread
From: Tom Payne @ 1996-09-04  0:00 UTC (permalink / raw)



Tanmoy Bhattacharya <tanmoy@qcd.lanl.gov> wrote:

: Not quite: Only volatile sig_atomic_t can be handled portably by a
: signal handler. Of course, one can cast away the volatile, but *that*
: is bad anyways.

And, for the record, the only portable handling is writing.  Even
reading provokes the goblin, "undefined behavior."

Tom Payne




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-26  0:00                                                     ` Richard A. O'Keefe
                                                                         ` (3 preceding siblings ...)
  1996-08-31  0:00                                                       ` Tanmoy Bhattacharya
@ 1996-09-04  0:00                                                       ` Patrick Horgan
  1996-09-05  0:00                                                         ` Richard A. O'Keefe
  4 siblings, 1 reply; 688+ messages in thread
From: Patrick Horgan @ 1996-09-04  0:00 UTC (permalink / raw)



In article <4vroh3$17f@goanna.cs.rmit.edu.au>, ok@goanna.cs.rmit.edu.au (Richard A. O'Keefe) writes:
 
> IMP 77.  They observed that the operating system modules written by
> people skilled in assembly language (and who even went to the extreme
> of checking the assembly code produced by the compiler to make sure it
> was efficient) tended to be
>  - bigger,
>  - less readable, and
>  - SLOWER
> than that produced by people who had a "high level language" perspective.
> I'll try to remember to bring the paper in tomorrow so I can quote it
> exactly.

You don't state any conclusion from this, but I might imply that you think
that somehow this is an implication that people that have an assembler
language background are bad programmers in high level languages.  I hope
you quote the paper in somewhat more detail, because I'd like to know how
much experience they had as programmers.  Generalizing from one small pool
of people with assembly language skills to any conclusions about the 
relative worth of knowing or not knowing assembler, or whether people with
those skills do or do not write good code is suspect logically.


-- 

   Patrick J. Horgan    patrick@broadvision.com   Have horse will ride.
   Opinions mine, not my employer's except by most bizarre coincidence.






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-17  0:00             ` Richard Chiu
@ 1996-09-04  0:00               ` Lawrence Kirby
  0 siblings, 0 replies; 688+ messages in thread
From: Lawrence Kirby @ 1996-09-04  0:00 UTC (permalink / raw)



In article <rchiu.840255046@popeye.cs.iastate.edu>
           rchiu@cs.iastate.edu "Richard Chiu" writes:

>miker3@ix.netcom.com (Mike Rubenstein) writes:
>>But that is not always true.
>
>>A number of years ago I developed a program that had to do a large
>>number of sorts with the following characteristics:
>
>>       1.  The mean number of items to be sorted was about 5.  In a 
>>           test sample of a million cases, the largest number of items
>>           to be sorted was 35.
>
>>       2.  The items were usually in order.  In the test sample, 90% 
>>           were in order, and most of the rest were in order except 
>>           for a single interchange of adjacent items.  Only 8 were 
>>           out of order by more than three interchanges or required 
>>           interchanges of nonadjacent items.
>
>>Care to try this with quicksort?  or heapsort?  A good old O(n^2)
>>insertion sort works quite nicely.
>
>By the same thread of reasoning, a program that does nothing but
>return the inputs will be the best algorithm for sorting if the 
>numbers are already sorted. (Which is true!)

However it wasn't stated that the numbers were already in order, just
*usually* in order. It may also mean that in cases where items aren't
exactly in order they are still nearly ordered.
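
(For reference, the sort in question is roughly this -- a sketch; on
data that is already in order the inner loop exits immediately, so a
whole pass costs only O(n) comparisons:)

    /* straight insertion sort: O(n^2) worst case, close to O(n)
       when the items are already (or nearly) in order */
    void insertion_sort(int a[], int n)
    {
        int i, j, key;

        for (i = 1; i < n; i++) {
            key = a[i];
            for (j = i - 1; j >= 0 && a[j] > key; j--)
                a[j + 1] = a[j];      /* shift larger items right */
            a[j + 1] = key;
        }
    }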

-- 
-----------------------------------------
Lawrence Kirby | fred@genesis.demon.co.uk
Wilts, England | 70734.126@compuserve.com
-----------------------------------------




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-21  0:00                                           ` Tim Behrendsen
  1996-08-22  0:00                                             ` Bob Gilbert
@ 1996-09-04  0:00                                             ` Lawrence Kirby
  1996-09-04  0:00                                               ` Tim Behrendsen
  1996-09-05  0:00                                               ` Mark Wooding
  1 sibling, 2 replies; 688+ messages in thread
From: Lawrence Kirby @ 1996-09-04  0:00 UTC (permalink / raw)



In article <01bb8f19$9a89d820$32ee6fce@timhome2>
           tim@airshields.com "Tim Behrendsen" writes:

>Bob Gilbert <rgilbert@unconfigured.xvnews.domain> wrote in article
><4vcac4$gm6@zeus.orl.mmc.com>...

...

>> A very procedural point of view.  Many of the proponents of object
>> oriented design might have a problem with this view, and demonstrates
>> my point about allowing the details of implementation to obscure the
>> higher level problem solving process.
>
>There is no other view than the procedural view.

Some functional language programmers might take issue with that
statement. Prologgers may have a thought or two also.

...

>How can someone implement *any* sort in assembly language,
>and "learn it but not really understand it"?  To implement it,
>you have to do it in great detail, and you simply can't do the
>"push and prod until it works" approach to programming, which
>is what I think a lot of students do.

Example: once I had an assignment to implement a sort on IBM 360 assembler.
What I learnt to do was convert a sequence of individual operations into
assembler instructions. It didn't teach me a thing about the sort algorithm
(which was merge sort).

I can't imagine a worse way of teaching an algorithm than using assembly.
It is essential to separate the algorithm from the implementation, and
assembly ties you up in knots with implementation-specific details,
especially when you are just learning 'computing'. Once you've learnt
an algorithm it makes a lot of sense to practice that algorithm by
writing implementations, even in assembly, but you should understand the
algorithm to a reasonable extent *before* attempting that.

-- 
-----------------------------------------
Lawrence Kirby | fred@genesis.demon.co.uk
Wilts, England | 70734.126@compuserve.com
-----------------------------------------




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-04  0:00                                             ` Lawrence Kirby
@ 1996-09-04  0:00                                               ` Tim Behrendsen
  1996-09-06  0:00                                                 ` Bob Gilbert
  1996-09-05  0:00                                               ` Mark Wooding
  1 sibling, 1 reply; 688+ messages in thread
From: Tim Behrendsen @ 1996-09-04  0:00 UTC (permalink / raw)



Lawrence Kirby <fred@genesis.demon.co.uk> wrote in article
<841797763snz@genesis.demon.co.uk>...
> In article <01bb8f19$9a89d820$32ee6fce@timhome2>
>            tim@airshields.com "Tim Behrendsen" writes:
> 
> >Bob Gilbert <rgilbert@unconfigured.xvnews.domain> wrote in article
> ><4vcac4$gm6@zeus.orl.mmc.com>...
> 
> ...
> 
> >> A very procedural point of view.  Many of the proponents of object
> >> oriented design might have a problem with this view, and demonstrates
> >> my point about allowing the details of implementation to obscure the
> >> higher level problem solving process.
> >
> >There is no other view than the procedural view.
> 
> Some functional language programmers might take issue with that
> statement. Prologgers may have a thought or two also.

What I mean by that is, for work to get done, the computer
must perform transformations of data over time.  You can call
that an implementation detail if you want, but there simply is
no such thing as "instantaneous" algorithms.  You can look
at a mathematical proof as existing without procedures, in the
sense that it simply "exists", as a statement of truth, but
algorithms are different.  They are, by their nature, a
process, and processes require a time axis.

Now, there are certain rule-based languages, but these describe
a process just like any other language.  The difference is, rather
than describing the process as a set of linear paths, they describe
it the way a pachinko machine does.  You have a set of pins, and
when you drop a marble at a certain angle, it bounces along the
rules defined by the pins, and comes out a certain position.  I
haven't described a linear procedure for every angle, but I have
still defined the process by which data (the marble) flows
through the algorithm (the pins).  In other words, the purpose of
the rules is data flowing through them.

> ...
> 
> >How can someone implement *any* sort in assembly language,
> >and "learn it but not really understand it"?  To implement it,
> >you have to do it in great detail, and you simply can't do the
> >"push and prod until it works" approach to programming, which
> >is what I think a lot of students do.
> 
> Example: once I had an assignment to implement a sort on IBM 360 assembler.
> What I learnt to do was convert a sequence of individual operations into
> assembler instructions. It didn't teach me a thing about the sort algorithm
> (which was merge sort).

I find it difficult to believe that you implemented a merge sort
in assembly without understanding the merge sort algorithm.

> I can't imagine a worse way of teaching an algorithm than using assembly.
> It is essential to separate the algorithm from the implementation and
> assembly ties you up in knots with implementation specific details
> especially when you are just learning 'computing'. Once you've learnt
> an algorithm it makes a lot of sense to practice that algorithm by
> writing implementations, even in assembly, but you should understand the
> algorithm to a reasonable extent *before* attempting that.

On the contrary, I think it is essential to focus on the
implementation in as much detail as possible.  You can pack as
much algorithm theory into a student as you want, but they don't
really "get it" until they type in a program, and experience it
not working, figuring out why, fixing it, and repeating the
process until they see the expected output.

I think you take how to "think like a programmer" for granted so
much that you don't understand that it doesn't come naturally
to the average person.  The procedural nature of the computer is
something to be learned, just like algorithms.

-- Tim Behrendsen (tim@airshields.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: Goto considered really harmful
  1996-08-30  0:00                       ` Goto considered really harmful Patrick Horgan
@ 1996-09-04  0:00                         ` Dennison
  0 siblings, 0 replies; 688+ messages in thread
From: Dennison @ 1996-09-04  0:00 UTC (permalink / raw)



Patrick Horgan wrote:
> generator, so that upon detecting a goto statement it would generate a
> full 132 columns of '-' characters, followed by a carriage return w/o
> linefeed, and print this line about 50 times, on the same line of the
> output. This had the effect of chopping right through the lineprinter
> output at the goto statement, cutting the paper in half, which jammed the
> lineprinter, requiring operator intervention. This *really* pissed the
> operators off, and generally resulted in a high-decibel stream of abuse
> directed at the poor slob of an undergrad who submitted the job.

Most such institutional printers used to be of the "chain" variety. The REALLY
nasty trick is to find the order that the letters appear on the printer's 
chain, and send that particular string to the printer.

...at least that's what I heard.
-- 
T.E.D.
email    - mailto:dennison@iag.net
homepage - http://www.iag.net/~dennison




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-04  0:00                                                       ` Patrick Horgan
@ 1996-09-05  0:00                                                         ` Richard A. O'Keefe
  1996-09-05  0:00                                                           ` deafen
  0 siblings, 1 reply; 688+ messages in thread
From: Richard A. O'Keefe @ 1996-09-05  0:00 UTC (permalink / raw)



patrick@broadvision.com (Patrick Horgan) writes:
[Quoting a fairly old posting of mine about the startling
 result in "The IMP Language and Compiler"]
 
>You don't state any conclusion from this, but I might imply that you think
>that somehow this is an implication that people that have an assembler
>language background are bad programmers in high level languages.

You might *imply* it, but you sure as heck can't *infer* it,
because *I* didn't imply it.

>I hope you quote the paper in somewhat more detail,

I already did.

>because I'd like to know how much experience they had as programmers.

It doesn't say.  The EMAS project started in 1973, back when the ICL 4-75
(was this the same as the RCA Spectra?) -- a British System/360 look-alike --
was a new machine.  The project started two years before the 4-75 was
actually delivered.  The first IMP compiler was an adaptation of the
Atlas Autocode compiler.  (Atlas Autocode was an Algol-like language.)
They actually simulated 32-bit arithmetic and byte addressing (which is
what the 4-75 had) on the 48-bit word-addressed KDF9!
"For System 4, the compiler was bootstrapped from the KDF9 compiler
via assembly language (due to difficulties with the manufacturer's
early operating systems).  This compiler was also bootstrapped onto
the IBM 360/50."  

If there is anyone associated with the project reading this, perhaps
they could supply some more information.  I think that the EMAS project
is an interesting part of computing history which deserves to be written
up in rather more detail.

>Generalizing from one small pool
>of people with assembly language skills to any conclusions about the 
>relative worth of knowing or not knowing assembler, or whether people with
>those skills do or do not write good code is suspect logically.

You have completely mistaken the structure of my argument.
The whole point of it is that I am *rebutting* a generalisation.
If I wish to rebut "all swans are white", it is sufficient to
exhibit ONE black swan; I am not obliged to show that all swans are
black.  (The swans one sees in Australia and New Zealand _are_ black,
as it happens.)

I was concerned to rebut the generalisation "exposure to assembly
language helps people become better programmers, writing more
efficient code, and it is difficult to become a good programmer
without this experience", which I took to be Tim Behrendsen's claim.

I exhibited a single real compiler-and-operating system project
which delivered bloody good working product, where it was found
that the people used to high level languages (Fortran? Algol?
BCPL? They were *certainly* aware of PL/I and intended to build
a translator from IMP to PL/I but found that they had more IMP
compilers than there were PL/I compilers to translate for)
delivered smaller faster better structured code than the people
"with a background of assembly languages" (1401? 1130? /360?).

This *establishes* no generalisation, but it is good logic for
*rebutting* one.

-- 
Australian citizen since 14 August 1996.  *Now* I can vote the xxxs out!
Richard A. O'Keefe; http://www.cs.rmit.edu.au/%7Eok; RMIT Comp.Sci.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-05  0:00                                                         ` Richard A. O'Keefe
@ 1996-09-05  0:00                                                           ` deafen
  0 siblings, 0 replies; 688+ messages in thread
From: deafen @ 1996-09-05  0:00 UTC (permalink / raw)



I appreciate the perspectives that all of you have offered.  However, I 
feel it necessary at this point to make a clarification of my original 
post (about being a CIS student, etc.)

I may have chosen the wrong word to describe a certain concept.  The word 
I used, "fundamentals", is being interpreted as what I would describe as 
"design concepts."  FWIW, I use the word "foundations" for that.

I was commenting on the idea that all programmers need to understand just 
exactly what's going on at the assembly level to be good programmers.  I 
disagree.

As for the comments about what employers want, absolutely, I shot my 
mouth off, foot buried deep inside.  Of course I don't know what's 
expected in the business world as far as programmers go; like the .sig 
says, I'm primarily a tech, not a programmer.  On top of that, I'm a 
student, and therefore in most ways isolated from the "real world."

The curriculum in which I am enrolled stresses design and analysis rather 
than coding.  (Again, please note that I'm in CIS, not CS.  At my school, 
CIS is a business degree, and CS is an engineering degree.)  I probably 
would be having more fun if the curriculum involved more coding and less 
design, but that's neither here nor there.

Just offering a clarification.  And I guess someday I'll learn to keep my 
mouth shut.  (Or maybe not...:)

--
Hal Haygood :: PC/Network/Unix technician, Suntek Integrated Technologies
hal@suncap.com  ::  deafen@asu.edu  ::  hagar@mail.hvs.com
My opinions and those of my employer rarely, if ever, coincide.





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-04  0:00                                             ` Lawrence Kirby
  1996-09-04  0:00                                               ` Tim Behrendsen
@ 1996-09-05  0:00                                               ` Mark Wooding
  1996-09-06  0:00                                                 ` Bob Cousins
  1996-09-13  0:00                                                 ` Bengt Richter
  1 sibling, 2 replies; 688+ messages in thread
From: Mark Wooding @ 1996-09-05  0:00 UTC (permalink / raw)



Lawrence Kirby <fred@genesis.demon.co.uk> wrote:
> In article <01bb8f19$9a89d820$32ee6fce@timhome2>
>            tim@airshields.com "Tim Behrendsen" writes:
>
> >There is no other view than the procedural view.
> 
> Some functional language programmers might take issue with that
> statement. Prologgers may have a thought or two also.

I've not come across a computer yet which doesn't work by fetching an
instruction (or maybe a few at a time), doing it, and then going off
and fetching some more.  I guess you can pretend that this isn't the
case, and maybe come up with some nice ways of presenting algorithms
which don't depend on this, but that's not the way things work
underneath.  The One True View is that sequence of instructions; all
else is an illusion.  Maybe it's a helpful illusion, but illusion it is
nonetheless.
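
(The view in question, reduced to a toy sketch -- the three "opcodes"
here are invented, not any real machine's:)

    enum op { OP_HALT, OP_LOAD, OP_ADD };

    struct insn { enum op op; int operand; };

    /* fetch an instruction, do it, go and fetch some more */
    int run(const struct insn *prog)
    {
        int acc = 0, pc = 0;

        for (;;) {
            struct insn i = prog[pc++];       /* fetch */
            switch (i.op) {                   /* decode and execute */
            case OP_LOAD: acc  = i.operand; break;
            case OP_ADD:  acc += i.operand; break;
            case OP_HALT: return acc;
            }
        }
    }
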
-- 
[mdw]

`When our backs are against the wall, we shall turn and fight.'
		-- John Major





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-06  0:00                                                 ` Bob Gilbert
@ 1996-09-06  0:00                                                   ` Tim Behrendsen
  1996-09-09  0:00                                                     ` Bob Gilbert
  1996-09-10  0:00                                                   ` Richard A. O'Keefe
                                                                     ` (3 subsequent siblings)
  4 siblings, 1 reply; 688+ messages in thread
From: Tim Behrendsen @ 1996-09-06  0:00 UTC (permalink / raw)



Bob Gilbert <rgilbert@unconfigured.xvnews.domain> wrote in article
<50p68s$cpi@zeus.orl.mmc.com>...
> In article <01bb9a1e$24c669e0$32ee6fcf@timhome2>, "Tim Behrendsen" <tim@airshields.com>
writes:
> > Lawrence Kirby <fred@genesis.demon.co.uk> wrote in article
> > <841797763snz@genesis.demon.co.uk>...
> > > In article <01bb8f19$9a89d820$32ee6fce@timhome2>
> > >            tim@airshields.com "Tim Behrendsen" writes:
> > > >Bob Gilbert <rgilbert@unconfigured.xvnews.domain> wrote in article
> > > ><4vcac4$gm6@zeus.orl.mmc.com>...
> > > >> A very procedural point of view.  Many of the proponents of object
> > > >> oriented design might have a problem with this view, and demonstrates
> > > >> my point about allowing the details of implementation to obscure the
> > > >> higher level problem solving process.
> > > >
> > > >There is no other view than the procedural view.
> > > 
> > > Some functional language programmers might take issue with that
> > > statement. Prologgers may have a thought or two also.
> > 
> > What I mean by that is, for work to get done, the computer
> > must perform transformations of data over time.  You can call
> > that an implementation detail if you want, but there simply is
> > no such thing as "instantaneous" algorithms.  You can look
> > at a mathematical proof as existing without procedures, in the
> > sense that it simply "exists", as a statement of truth, but
> > algorithms are different.  They are, by their nature, a
> > process, and processes require a time axis.
> 
> I suspect our definitions of procedural vs non-procedural (e.g. object
> oriented) views are not the same.  Non-procedural views do not imply 
> "instantaneous" algorithms, the non-existence of a time axis, or whatever.
> The difference is whether I view the problem solution as being built by
> putting together a bunch of operations (procedures) that I can invoke to 
> manipulate whatever data I happen to have, or I can view the problem 
> solution as being built by putting together the appropriate data elements 
> (objects) for which certain operations are defined.  Are the basic 
> building blocks procedures or objects?  BTW, I'm not necessarily 
> advocating one view over another, just trying to point out that there 
> are views other than a procedural view.

Object-oriented is perhaps not where you want to go to make
your point; the only difference between OOP and non-OOP is
how you bind the procedures to the objects.  In non-OOP, you pass
an object to a procedure; in OOP, you execute the procedure
abstractly bound to the object.
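
(In C terms the difference in binding looks roughly like this -- a
sketch with invented names:)

    struct account { long balance; };

    /* non-OOP: the object is handed to a free procedure */
    void deposit(struct account *a, long amount)
    {
        a->balance += amount;
    }

    /* OOP-ish: the procedure is reached through the object itself */
    struct account_obj {
        long balance;
        void (*deposit)(struct account_obj *self, long amount);
    };
    /* ... obj.deposit(&obj, 100);  -- same work, different binding */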

A better example is the "pure" non-procedural languages, where
you give an abstract logic statement of what the algorithm *is*,
rather than how to execute it.  In other words, the ordering of the
individual clauses is not important; the cumulative meaning
of the clauses is what expresses the algorithm.  Structured
Query Language is a good example of this.

But to get back to the original point, which is teaching students
the procedural nature of the computer, it is essential that they
understand this.  One of the most dangerous people in the world
is someone who writes SQL statements without fundamentally
understanding what happens inside the mechanism.  In theory, I
should be able to happily write SQL statements any way I want to;
in practice, I *would* like my report to come out within this
millennium.

Yes, teach them non-procedural languages in the Masters or PhD
programs.  First teach them that everything boils down to
procedural mechanisms.

-- Tim Behrendsen (tim@a-sis.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-19  0:00                                       ` Tim Behrendsen
  1996-08-19  0:00                                         ` Tim Hollebeek
  1996-08-20  0:00                                         ` Bob Gilbert
@ 1996-09-06  0:00                                         ` Robert I. Eachus
  1996-09-06  0:00                                           ` Tim Behrendsen
  1996-09-11  0:00                                         ` Jon S Anthony
                                                           ` (4 subsequent siblings)
  7 siblings, 1 reply; 688+ messages in thread
From: Robert I. Eachus @ 1996-09-06  0:00 UTC (permalink / raw)



In article <slrn52tn5u.8d.mdw@excessus.demon.co.uk> mdw@excessus.demon.co.uk (Mark Wooding) writes:

  > The One True View is that sequence of instructions; all else is an
  > illusion.  Maybe it's a helpful illusion, but illusion it is
  > nonetheless.

    Sorry, this is completely false.  On most modern processors which
typically have multiple independent pipelines, your One True View is
at best a helpful illusion.  Go get a SPARC, PA-RISC, or PowerPC
reference manual and read it. (You have to sign a non-disclosure
agreement to get the same level of detail about a Pentium.)  In all
cases, the illusion that there is such a thing as a processor state is
only true in some cases of traps or signals--and even then it is not
all that well defined.  Assembler on a Z80 chip might correspond to
some One True View, but for all modern chips the architecture manual's
sections on assembly should only be read by compiler vendors.  (And
they need to read it VERY carefully, but that is a different
discussion.)

--

					Robert I. Eachus

with Standard_Disclaimer;
use  Standard_Disclaimer;
function Message (Text: in Clever_Ideas) return Better_Ideas is...




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-05  0:00                                               ` Mark Wooding
@ 1996-09-06  0:00                                                 ` Bob Cousins
  1996-09-06  0:00                                                   ` Tim Behrendsen
  1996-09-13  0:00                                                 ` Bengt Richter
  1 sibling, 1 reply; 688+ messages in thread
From: Bob Cousins @ 1996-09-06  0:00 UTC (permalink / raw)



Mark Wooding wrote:

>Lawrence Kirby <fred@genesis.demon.co.uk> wrote:
>> In article <01bb8f19$9a89d820$32ee6fce@timhome2>
>>            tim@airshields.com "Tim Behrendsen" writes:
>>
>> >There is no other view than the procedural view.
>> 
>> Some functional language programmers might take issue with that
>> statement. Prologgers may have a thought or two also.
>
>I've not come across a computer yet which doesn't work by fetching an
>instruction (or maybe a few at a time), doing them, and then going off
>and fetching some more.  I guess you can pretend that this isn't the
>case, and maybe come up with some nice ways of presenting algorithms
>which don't depend on this, but that's not the way things work
>underneath.  The One True View is that sequence of instructions; all
>else is an illusion.  Maybe it's a helpful illusion, but illusion it is
>nonetheless.

I am afraid your view is severely limited, on two counts.

1. It could be argued that executing a sequence of instructions is the
"illusion". It appears to you that that is what happens, but all the
CPU does is respond to a set of input signals on its pins, combine
them with the states in its internal registers, and produce a new set of
output signals and register states. 

The CPU is just a complex state machine. It is just an abstraction
that it executes instructions. Of course we have designed it to behave
according to our abstraction. If the CPU generated non-sensical
outputs, it would still be the same complex state machine, but we
would not regard it as executing instructions.

In fact, executing instructions is such a useful abstraction, that in
nearly all complex processors, there is a smaller processor executing
microcode, and in even more complex ones, the embedded processor is
executing nanocode, but at the lowest level we get to a hardware state
machine. 

The abstraction can be carried upwards, so that we can regard a
functional language as evaluating functions. It does not matter that
underneath instructions are being executed, because at the very lowest
level it's just electrons whizzing around silicon. We could even
implement the functional language on a non-digital computer.

2. You either assume or mean that computer refers to a conventional
von-Neumann style digital computer. Analog computers do not execute
instructions as such, but they can be programmed and solve problems by
producing a set of outputs given a set of inputs. There was a rather
neat economic simulation which was implemented as a set of water tanks
and pipes, and it produced quite good results. Of course, it was only
implementing an abstract functional model of an economy.

Finally, you may or may not consider that the human brain is a biological
computer, and that does not appear to work by fetching instructions
and executing them. So the very concept of instructions executing in
sequence is an illusion created within our own non-digital,
non-sequential, massively parallel active memory analog computer.

Regards,



-- 
Bob Cousins, Software Engineer.
Home page at http://www.demon.co.uk/sirius-cybernetics/
Note: Commercial email to bob@lintilla.demon.co.uk will be subject to a $500 handling fee.
Sending of such email constitutes acceptance of these terms.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-06  0:00                                         ` Robert I. Eachus
@ 1996-09-06  0:00                                           ` Tim Behrendsen
  0 siblings, 0 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-09-06  0:00 UTC (permalink / raw)



Robert I. Eachus <eachus@spectre.mitre.org> wrote in article
<EACHUS.96Sep5200334@spectre.mitre.org>...
> In article <slrn52tn5u.8d.mdw@excessus.demon.co.uk>
mdw@excessus.demon.co.uk (Mark Wooding) writes:
> 
>   > The One True View is that sequence of instructions; all else is an
>   > illusion.  Maybe it's a helpful illusion, but illusion it is
>   > nonetheless.
> 
>     Sorry, this is completely false.  On most modern processors which
> typically have multiple independent pipelines, your One True View is
> at best a helpful illusion.  Go get a SPARC, PA-RISC, or PowerPC
> reference manual and read it. (You have to sign a non-disclosure
> agreement to get the same level of detail about a Pentium.)  In all
> cases, the illusion that there is such a thing as a processor state is
> only true in some cases of traps or signals--and even then it is not
> all that well defined.  Assembler on a Z80 chip might correspond to
> some One True View, but for all modern chips architecture manuals
> sections on assembly should only be read by compiler vendors.  (And
> they need to read it VERY carefully, but that is a different
> discussion.)

The point is not that there's a single thread of execution,
but that there *is* a thread of execution.  His point (and mine)
is that algorithms require instruction execution over time.  The
number of simultaneous instructions is irrelevant, and so-called
"non-procedural" languages are only non-procedural in the sense
that the programmer does not specify the procedure -- the
compiler does.

-- Tim Behrendsen (tim@a-sis.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-04  0:00                                               ` Tim Behrendsen
@ 1996-09-06  0:00                                                 ` Bob Gilbert
  1996-09-06  0:00                                                   ` Tim Behrendsen
                                                                     ` (4 more replies)
  0 siblings, 5 replies; 688+ messages in thread
From: Bob Gilbert @ 1996-09-06  0:00 UTC (permalink / raw)



In article <01bb9a1e$24c669e0$32ee6fcf@timhome2>, "Tim Behrendsen" <tim@airshields.com> writes:
> Lawrence Kirby <fred@genesis.demon.co.uk> wrote in article
> <841797763snz@genesis.demon.co.uk>...
> > In article <01bb8f19$9a89d820$32ee6fce@timhome2>
> >            tim@airshields.com "Tim Behrendsen" writes:
> > >Bob Gilbert <rgilbert@unconfigured.xvnews.domain> wrote in article
> > ><4vcac4$gm6@zeus.orl.mmc.com>...
> > >> A very procedural point of view.  Many of the proponents of object
> > >> oriented design might have a problem with this view, and demonstrates
> > >> my point about allowing the details of implementation to obscure the
> > >> higher level problem solving process.
> > >
> > >There is no other view than the procedural view.
> > 
> > Some functional language programmers might take issue with that
> > statement. Prologgers may have a thought or two also.
> 
> What I mean by that is, for work to get done, the computer
> must perform transformations of data over time.  You can call
> that an implementation detail if you want, but there simply is
> no such thing as "instantaneous" algorithms.  You can look
> at a mathematical proof as existing without procedures, in the
> sense that it simply "exists", as a statement of truth, but
> algorithms are different.  They are, by their nature, a
> process, and processes require a time axis.

I suspect our definitions of procedural vs non-procedural (e.g. object
oriented) views are not the same.  Non-procedural views do not imply 
"instantaneous" algorithms, the non-existence of a time axis, or whatever.
The difference is whether I view the problem solution as being built by
putting together a bunch of operations (procedures) that I can invoke to 
manipulate whatever data I happen to have, or I can view the problem 
solution as being built by putting together the appropriate data elements 
(objects) for which certain operations are defined.  Are the basic 
building blocks procedures or objects?  BTW, I'm not necessarily 
advocating one view over another, just trying to point out that there 
are views other than a procedural view.

-Bob















^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-06  0:00                                                 ` Bob Cousins
@ 1996-09-06  0:00                                                   ` Tim Behrendsen
  1996-09-07  0:00                                                     ` Craig Franck
  0 siblings, 1 reply; 688+ messages in thread
From: Tim Behrendsen @ 1996-09-06  0:00 UTC (permalink / raw)



Bob Cousins <bob@lintilla.demon.co.uk> wrote in article
<322f864d.42836625@news.demon.co.uk>...
> Mark Wooding wrote:
> 
> >Lawrence Kirby <fred@genesis.demon.co.uk> wrote:
> >> In article <01bb8f19$9a89d820$32ee6fce@timhome2>
> >>            tim@airshields.com "Tim Behrendsen" writes:
> >>
> >> >There is no other view than the procedural view.
> >> 
> >> Some functional language programmers might take issue with that
> >> statement. Prologgers may have a thought or two also.
> >
> >I've not come across a computer yet which doesn't work by fetching an
> >instruction (or maybe a few at a time), doing them, and then going off
> >and fetching some more.  I guess you can pretend that this isn't the
> >case, and maybe come up with some nice ways of presenting algorithms
> >which don't depend on this, but that's not the way things work
> >underneath.  The One True View is that sequence of instructions; all
> >else is an illusion.  Maybe it's a helpful illusion, but illusion it is
> >nonetheless.
> 
> I am afraid your view is severely limited, on two counts.
>
> 1. It could be argued that executing a sequence of instructions is the
> "illusion". It appears to you that that is what happens, but all the
> CPU does is respond to a set of input signals on its pins, combines
> them with states in internal registers, and produces a new set of
> output pins and register states. 
>
> The CPU is just a complex state machine. It is just an abstraction
> that it executes instructions. Of course we have designed it to behave
> according to our abstraction. If the CPU generated non-sensical
> outputs, it would still be the same complex state machine, but we
> would not regard it as executing instructions.

Sounds like sequential operations to me.
input signals --> [state machine] --> output pins
>>>------>>> Note the time axis here >>>------>>>
 
> In fact, executing instructions is such a useful abstraction, that in
> nearly all complex processors, there is a smaller processor executing
> microcode, and in even more complex ones, the embedded processor is
> executing nanocode, but at the lowest level we get to a hardware state
> machine. 

So what?  At the lowest level, we have electrons flowing around,
changing states of electrical subsystems such as flip/flops.  This
is data manipulation over time.  No instantaneous algorithms
occurring, here.
 
> The abstraction can be carried upwards, so that we can regard a
> functional language as evaluating functions. It does not matter that
> underneath instructions are being executed, because at the very lowest
> level its just electrons whizzing around silicon. We could even
> implement the functional language on a non-digital computer.

Yes, and even if you have gears turning, somewhere you have
data transformations occurring over time.  It can't be otherwise.
"Speed of light" also refers to signal propagation.
 
> 2. You either assume or mean that computer refers to a conventional
> von-Neumann style digital computer. Analog computers do not execute
> instructions as such, but they can be programmed and solve problems by
> producing a set of outputs given a set of inputs. There was a rather
> neat economic simulation which was implemented as a set of water tanks
> and pipes, and it produced quite good results. Of course, it was only
> implementing an abstract functional model of an economy.

Again, water flows through pipes, and the measurements change
over time.  The pure integrator analog computer uses time as
one of the inputs.

> Finally, you may or may not consider that the human brain is a biological
> computer, and that does not appear to work by fetching instructions
> and executing them. So the very concept of instructions executing in
> sequence is an illusion created within our own non-digital,
> non-sequential, massively parallel active memory analog computer.

The human brain works (as far as anyone knows) by electrical patterns
flowing through neurons.  You have input from your senses which
stimulates pathways in your brain, and produces changes in the
memory structures, and possibly produces output through muscular
manipulations.

*There is no view other than the procedural view*.

Everything comes down to data transformations over time.  You
have yet to come up with an example where this is not true, and
you won't, simply because you can't eliminate the time axis.  An
algorithm has to have a beginning and an end, an input and an
output, and in the middle is a black box known as the "procedure".

-- Tim Behrendsen (tim@a-sis.com)





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-08-30  0:00                               ` Alan Peake
  1996-08-31  0:00                                 ` Robert Dewar
@ 1996-09-07  0:00                                 ` .
  1 sibling, 0 replies; 688+ messages in thread
From: . @ 1996-09-07  0:00 UTC (permalink / raw)



Yes Fortran IV was my first language too. None of this structured 
programming clap-trap in those days. 
Hello to all programming teachers. I'm on the lookout for ideas on 
algorithmic-based instruction with language syntax coming a distant second 
place.  How do you test for such skills in your students?

Thanks
Bill Mackay





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to learn? [was Re: Should I learn C or Pascal?]
  1996-09-03  0:00                                   ` Alan Peake
@ 1996-09-07  0:00                                     ` Robert Dewar
  0 siblings, 0 replies; 688+ messages in thread
From: Robert Dewar @ 1996-09-07  0:00 UTC (permalink / raw)



George II was one of the ICL operating systems, I used George 4 extensively
on an ICL 1906A at the University of Leeds in the early 70's, and was
amused by the front page of the manual, which said that "George 4 uses
the well established techniques of virtual memory and demand paging to
manage memory", or somesuch (after all Atlas did precede it by many years).
That's an interesting contrast to an IBM spokesperson describing the
first virtual memory implementations of OS/3x0 much later, who, when asked
about the Atlas, said he had never heard of it, and that IBM was too busy
developing advanced technology to waste time looking at ancient history (I
don't know if that is a verbatim quote, I was not there, but that is the
way a friend described the Florida Share meeting).

George 4 had two other nice features, which still seem advanced today.
First, it had a completely fluid file system that spilled from disk to
tape in a transparent manner. I once overheard one of the system programmers
saying to another "we are getting a little pressed for disk space, we have
300 megabytes of disks, and nearly 900 megs of online files" :-) Leeds
kept four tape drives dedicated to constantly writing updated information
(a kind of constant backup) and reading old tapes. When I asked someone
there how to delete an old version of a file (George had version numbers
like VMS), he replied "why would you ever delete a file, you never know
if you might need it", and indeed, if an unused file wanders to a dusty
tape, that makes sense. When I returned each summer to the university
of Leeds, I would log in, and it would take 5 minutes to list my top
level directory, because it was on some old tape that the operator had
to find and mount.

The other thing that was interesting was that George 4 really used a virtual
machine setup. You were presented with a virtual machine whose hardware I/O
devices you could configure, and which you could attach to the real
devices or to virtual devices on disk, which in the simplest cases were
simply files. The ICL range at the time covered a huge span from tiny
minicomputers to big virtual memory mainframes. You really got upwards
and downwards compatibility across the range, because at the low end
the machine became a real machine with virtually no operating system,
and you ran your program directly above the hardware; that same
program would run under George 4 on a virtual machine configured to be
identical to the hardware of the low-end machine assumed by the program.
So every program was in fact terribly device-specific and hardware
configuration-specific, but it didn't matter, since you simply configured
your virtual machine to match the requirements of the program.

Sorry for the off-topic reminiscence, but the keyword "George" brings
back some fond memories. By the way, it was during these summers at
Leeds that (a) I learned Algol-68 and got involved in that scene, which
indirectly led to my involvement in Ada, and (b) I wrote Macro-SPITBOL.





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-06  0:00                                                   ` Tim Behrendsen
@ 1996-09-07  0:00                                                     ` Craig Franck
  1996-09-08  0:00                                                       ` Tim Behrendsen
  0 siblings, 1 reply; 688+ messages in thread
From: Craig Franck @ 1996-09-07  0:00 UTC (permalink / raw)



"Tim Behrendsen" <tim@airshields.com> wrote:
>Bob Cousins <bob@lintilla.demon.co.uk> wrote in article
><322f864d.42836625@news.demon.co.uk>...


>> Finally, you or may not consider that the human brain is a biological
>> computer, and that does not appear to work by fetching instructions
>> and executing them. So the very concept of instructions executing in
>> sequence is an illusion created within our own non-digital,
>> non-sequential, massively parallel active memory analog computer.
>
>The human brain works (as far as anyone knows) by electrical patterns
>flowing through neurons.  You have input from your senses which
>stimulates pathways in your brain, and produces changes in the
>memory structures, and possibly produces output through muscular
>manipulations.

Yes, but the *order* of the instructions might be an illusion.


                     *
             ()               *
            earth   star A  star B

Say star A is 50 light years from Earth and star B is 100 light
years away. If A went supernova 50 years ago and B went supernova
100 years ago, they appear to explode at the same time. Without
advanced astronomical knowledge you would have no idea of the
real order in which they exploded.

If I run a program to sum a matrix of 100 numbers on a computer
with multiple execution units and several concurrent tasks, I
may have no clue in what order the matrix was summed. Dependencies
will be checked, and one or more execution units may or may not be
available. The idea of a discrete number of steps occurring in a fixed
order would be an illusion, for some steps may occur simultaneously.
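
A rough sketch of the kind of program I have in mind, in C with POSIX
threads (the sizes, names and thread count are invented for illustration;
link with -lpthread). Each worker sums its own slice, and nothing in the
program fixes which slice finishes first:

#include <stdio.h>
#include <pthread.h>

#define N        100
#define NTHREADS 4

static int  data[N];
static long partial[NTHREADS];

/* Each worker sums one slice of the matrix.  Nothing here fixes
   which slice finishes first; on a machine with several execution
   units the slices may genuinely run at the same time. */
static void *sum_slice(void *arg)
{
    int  t = *(int *) arg;
    long s = 0;
    int  i;

    for (i = t * (N / NTHREADS); i < (t + 1) * (N / NTHREADS); i++)
        s += data[i];
    partial[t] = s;
    return NULL;
}

int main(void)
{
    pthread_t tid[NTHREADS];
    int       id[NTHREADS];
    long      total = 0;
    int       i, t;

    for (i = 0; i < N; i++)
        data[i] = i + 1;

    for (t = 0; t < NTHREADS; t++) {
        id[t] = t;
        pthread_create(&tid[t], NULL, sum_slice, &id[t]);
    }
    for (t = 0; t < NTHREADS; t++)
        pthread_join(tid[t], NULL);

    /* The total is the same whatever order the slices ran in. */
    for (t = 0; t < NTHREADS; t++)
        total += partial[t];
    printf("total = %ld\n", total);
    return 0;
}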

>*There is no view other than the procedural view*.
>
>Everything comes down to data transformations over time.  You
>have yet to come up with an example where this is not true, and
>you won't, simply because you can't eliminate the time axis.  An
>algorithm has to have a beginning and an end, an input and an
>output, and in the middle is a black box known as the "procedure".

In a sufficiently complex vector unit, everything could literally
happen at once. So what actually happened in the "black box" might
never be actually known, at least as far as the order in which things
occurred. You can of course enforce in-order execution.

There can be "a set of ordered steps for solving a problem", but the 
order might be an illusion. That a computer is a "deterministic finite
state machine" is true but you could simulate one with say, groups of human 
beings pretending to be registers and doing the logic funcions in there 
heads, even though people are not "finite state" or *perhaps* 
deterministic. In this case just when and how certain things occurred
could be imposible to find out. I could lie and say that I
did some of the logic when it was someone else. How could you even
begin to model this as a state machine? Some of the behavour would
truely be random. The time line would still be there but with no
way to lay out the events the model loses it's straight "procedural"
look and feel. What if I just randomly guessed at the value of a 
logic function? Is that procedural? How come one number popped into
my head and not another?

Also, since my debugger can run a program backwards, your time line
may have to be able to go in reverse. :-)



-- 
Craig  
clfranck@worldnet.att.net    
Manchester, NH
A great many people think they are thinking when they are
merely rearranging their prejudices.  -- William James






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-07  0:00                                                     ` Craig Franck
@ 1996-09-08  0:00                                                       ` Tim Behrendsen
  1996-09-08  0:00                                                         ` Craig Franck
  1996-09-11  0:00                                                         ` John Burdick
  0 siblings, 2 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-09-08  0:00 UTC (permalink / raw)



Craig Franck <clfranck@worldnet.att.net> wrote in article
<50sj6q$aci@mtinsc01-mgt.ops.worldnet.att.net>...
> "Tim Behrendsen" <tim@airshields.com> wrote:
> >Bob Cousins <bob@lintilla.demon.co.uk> wrote in article
> ><322f864d.42836625@news.demon.co.uk>...
> 
> 
> >> Finally, you or may not consider that the human brain is a biological
> >> computer, and that does not appear to work by fetching instructions
> >> and executing them. So the very concept of instructions executing in
> >> sequence is an illusion created within our own non-digital,
> >> non-sequential, massively parallel active memory analog computer.
> >
> >The human brain works (as far as anyone knows) by electrical patterns
> >flowing through neurons.  You have input from your senses which
> >stimulates pathways in your brain, and produces changes in the
> >memory structures, and possibly produces output through muscular
> >manipulations.
> 
> Yes, but the *order* of the instructions might be an illusion.
> 
> 
>                      *
>              ()               *
>             earth   star A  star B
> 
> Say star A is 50 light years from Earth and star B is 100 light
> years away. If A went super nova 50 years ago and B went super nova
> 100 years ago they appear to explode at the same time. Without 
> advanced astronomical knowledge you would have no idea of the
> real order in which they exploded.
> 
> If I run a program to sum a matrix of 100 numbers on a computer
> with multiple execution units and several concurrent tasks, I
> may have no clue in what order matrix was summed. Dependancies
> will be checked and one or more execution units may be available
> or not. The idea of a descrete number of steps occuring in a fixed
> order would be an illusion for some steps may occur simultaneously.

This is true, but irrelevant.  The point is that a procedure
exists, not that it's a linear procedure.
 
> >*There is no view other than the procedural view*.
> >
> >Everything comes down to data transformations over time.  You
> >have yet to come up with an example where this is not true, and
> >you won't, simply because you can't eliminate the time axis.  An
> >algorithm has to have a beginning and an end, an input and an
> >output, and in the middle is a black box known as the "procedure".
> 
> In a sufficently complex vector unit, everything could literally 
> happen at once. So what actually happened in the "black box" might
> never be acutally known, at least as far as the order in which things
> occured. You can of coarse enforce inorder execution.

What does in-order execution have to do with anything?  If I have
2 billion operations happening at once, it doesn't change the fact
that operations occurred in a unit of time.

> There can be "a set of ordered steps for solving a problem", but the 
> order might be an illusion. That a computer is a "deterministic finite
> state machine" is true but you could simulate one with say, groups of
human 
> beings pretending to be registers and doing the logic funcions in there 
> heads, even though people are not "finite state" or *perhaps* 
> deterministic. In this case just when and how certain things occurred
> could be imposible to find out. I could lie and say that I
> did some of the logic when it was someone else. How could you even
> begin to model this as a state machine? Some of the behavour would
> truely be random. The time line would still be there but with no
> way to lay out the events the model loses it's straight "procedural"
> look and feel.

Let me play devil's advocate for non-determinism.  Let's say I had
a Geiger counter (the typical "true" random number generator) and
hooked it up to my computer to give me truly random numbers.  I
use the random values to do mathematical analysis using the
Monte Carlo method.  I run the simulation, and get the same
result, but the path of getting to the result is different every
time, because of the random nature of the algorithm.

So now the question is, so what?  There is no question that I'm
using a non-deterministic statistical algorithm, but there's
still a procedure (the Monte Carlo algorithm), and it still
requires time to execute.  One of the inputs just happens to
be random numbers.
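
For concreteness, a toy Monte Carlo estimate of pi in C; the library
rand() stands in for the Geiger counter, so this is only a sketch of
the idea, not the real experiment:

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

/* Toy Monte Carlo: estimate pi by throwing random points at the unit
   square and counting how many land inside the quarter circle.  The
   *path* (the particular points drawn) differs on every run, but the
   procedure -- draw, test, count, divide -- is fixed, and it still
   takes time to execute. */
int main(void)
{
    long   i, inside = 0, trials = 1000000;
    double x, y;

    srand((unsigned) time(NULL));   /* stand-in for a true random source */

    for (i = 0; i < trials; i++) {
        x = (double) rand() / RAND_MAX;
        y = (double) rand() / RAND_MAX;
        if (x * x + y * y <= 1.0)
            inside++;
    }
    printf("pi is roughly %f\n", 4.0 * inside / trials);
    return 0;
}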

> What if I just randomly guessed at the value of a 
> logic function? Is that procedural? How come one number popped into
> my head and not another?

To answer this question, we need to know what the algorithm
is.  Just coming up with a number in your head is an output,
but what problem was solved by your coming up with the number?

You're approaching it backwards; you're trying to guess the contents
of the black box based on the input and the output.  That's
a different question from implementation of a known algorithm.

> Also, since my debugger can run a program backwards your time line
> may have to be able to go in reverse. :-)

If a computer computes in a forest, and no one is around to
see the result, how was it plugged in? :-)

-- Tim Behrendsen (tim@a-sis.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-08  0:00                                                       ` Tim Behrendsen
@ 1996-09-08  0:00                                                         ` Craig Franck
  1996-09-09  0:00                                                           ` Tim Behrendsen
  1996-09-11  0:00                                                         ` John Burdick
  1 sibling, 1 reply; 688+ messages in thread
From: Craig Franck @ 1996-09-08  0:00 UTC (permalink / raw)



"Tim Behrendsen" <tim@airshields.com> wrote:
>Craig Franck <clfranck@worldnet.att.net> wrote in article
><50sj6q$aci@mtinsc01-mgt.ops.worldnet.att.net>...
>> "Tim Behrendsen" <tim@airshields.com> wrote:
>> >Bob Cousins <bob@lintilla.demon.co.uk> wrote in article
>> ><322f864d.42836625@news.demon.co.uk>...
>> 
>> 
>> >> Finally, you or may not consider that the human brain is a biological
>> >> computer, and that does not appear to work by fetching instructions
>> >> and executing them. So the very concept of instructions executing in
>> >> sequence is an illusion created within our own non-digital,
>> >> non-sequential, massively parallel active memory analog computer.
>> >
>> >The human brain works (as far as anyone knows) by electrical patterns
>> >flowing through neurons.  You have input from your senses which
>> >stimulates pathways in your brain, and produces changes in the
>> >memory structures, and possibly produces output through muscular
>> >manipulations.
>> 
>> Yes, but the *order* of the instructions might be an illusion.
>> 
>> 
>>                      *
>>              ()               *
>>             earth   star A  star B
>> 
>> Say star A is 50 light years from Earth and star B is 100 light
>> years away. If A went super nova 50 years ago and B went super nova
>> 100 years ago they appear to explode at the same time. Without 
>> advanced astronomical knowledge you would have no idea of the
>> real order in which they exploded.
>> 
>> If I run a program to sum a matrix of 100 numbers on a computer
>> with multiple execution units and several concurrent tasks, I
>> may have no clue in what order matrix was summed. Dependancies
>> will be checked and one or more execution units may be available
>> or not. The idea of a descrete number of steps occuring in a fixed
>> order would be an illusion for some steps may occur simultaneously.
>
>This is true, but irrelevent.  The point is that a procedure
>exists, not that it's a linear procedure.
> 
>> >*There is no view other than the procedural view*.
>> >
>> >Everything comes down to data transformations over time.  You
>> >have yet to come up with an example where this is not true, and
>> >you won't, simply because you can't eliminate the time axis.  An
>> >algorithm has to have a beginning and an end, an input and an
>> >output, and in the middle is a black box known as the "procedure".
>> 
>> In a sufficently complex vector unit, everything could literally 
>> happen at once. So what actually happened in the "black box" might
>> never be acutally known, at least as far as the order in which things
>> occured. You can of coarse enforce inorder execution.
>
>What does inorder execution have to do with anything?  If I have
>2 billion operations happening at once, it doesn't change the fact
>that operations occurred in a unit of time.

My point was that a "time axis" is a vector; it has a direction to
it (forward). If a line is a set of points and, by analogy, the points
are events, it would be difficult to construct one when the order of the
points could change. Your challenge above, as you stated it, hinged on
"eliminating the time axis". You are correct: if the order of events
need not be fixed, then the above is irrelevant. If all that is
required is "transformations over time", I would probably have to delve
into quantum mechanics to find a suitable example.

I always associated procedural with "imperative, algorithmic". A query
language (say SQL) was non-procedural because you would describe a query
or a table, and the SQL system would determine the best way to construct it.
You told it what you wanted to see; it figured out the best way to do it.

A rule-based system would be non-procedural as well. If I have a scheduling
program that I create a rule for, say everybody gets every other weekend
off, and it creates a schedule in which this is true (if possible), that
would be non-procedural as well. I suspect we are using the term in two
different ways!


>> What if I just randomly guessed at the value of a 
>> logic function? Is that procedural? How come one number popped into
>> my head and not another?
>
>To answer this question, we need to know what the algorithm
>is.  Just coming up with a number in your head is an output,
>but what problem was solved by your coming up with the number?

The point was that any system that had a person coming up with numbers
by letting them just "pop into their heads", as opposed to, say, doing
math and reporting the results, could not be fully described, because
we don't know how the numbers are being generated. If this process
turned out to be non-procedural, then the system as a whole could not be
described as procedural. Computing pi to 1000 digits is procedural;
so is flipping a coin. Saying "pick a number between 1 and 100" is not
the same thing.

Think of a jury system. You can pick jurors fairly and instruct them
on how to deliberate. Nothing forces them to come up with a just verdict.
So the system can be described as just (procedural), but in the end it was
not, because they were not just (procedural).

-- 
Craig  
clfranck@worldnet.att.net    
Manchester, NH
A great many people think they are thinking when they are
merely rearranging their prejudices.  -- William James






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-06  0:00                                                   ` Tim Behrendsen
@ 1996-09-09  0:00                                                     ` Bob Gilbert
  1996-09-11  0:00                                                       ` Tim Behrendsen
  0 siblings, 1 reply; 688+ messages in thread
From: Bob Gilbert @ 1996-09-09  0:00 UTC (permalink / raw)



In article <01bb9c05$ce5b9aa0$87ee6fce@timpent.airshields.com>, "Tim Behrendsen" <tim@airshields.com> writes:
> > In article <01bb9a1e$24c669e0$32ee6fcf@timhome2>, "Tim Behrendsen" <tim@airshields.com>
> writes:
> > > > >
> > > > >There is no other view than the procedural view.

I might not have as much of a problem with the above statement if
you had said that all *implementations* are eventually procedural,
but the word "view" implies a methodology, approach, or way of thinking
to solve a problem, and there do exist non-procedural views.

> Bob Gilbert <rgilbert@unconfigured.xvnews.domain> wrote in article
> <50p68s$cpi@zeus.orl.mmc.com>...
> > I suspect our definitions of procedural vs non-procedural (e.g. object
> > oriented) views are not the same.  Non-procedural views do not imply 
> > "instantaneous" algorithms, the non-existence of a time axis, or whatever.
> > The difference is whether I veiw the problem solution as being built by
> > putting together a bunch of operations (procedures) that I can invoke to 
> > manipulate whatever data I happen to have, or I can view the problem 
> > solution as being built by putting together the appropiate data elements 
> > (objects) for which certain operations are defined.  Are the basic 
> > building blocks procedures or objects?  BTW, I'm not necessarily 
> > advocating one view over another, just trying to point out that there 
> > are views other than a procedural view.
> 
> Object-oriented is perhaps not where you want to go to make
> your point; the only difference between OOP and non-OOP is
> how you bind the procedures to the objects.  non-OOP you pass
> an object to a procedure; in OOP, you execute the procedure
> abstractly bound to the object.

I was talking about object-oriented design (OOD), not necessarily
object-oriented programming (OOP), which is simply the implementation
of an OOD.  View vs. implementation.

> But to get back to the original point, which is teaching students
> the procedural nature of the computer, it is essential that they
> understand this.  One of the most dangerous people in the world
> is someone who writes SQL statements who doesn't fundamentally
> understand what happens inside the mechanism.  In theory, I
> should be able to happily write SQL statements anyway I want to;
> in practice, I *would* like my report to come out within this
> millenium.
> 
> Yes, teach them non-procedural languages in the Masters or PhD
> programs.  First teach them that everything boils down to
> procedural mechanisms.

I agree that a computer science student should have a thorough understanding
of computer architecture(s), assembly language, and the various low-level
implementation details and concepts.  I disagree that assembly should be
the first programming language that the student is exposed to.  I would also
disagree that assembly should be the last language taught (perhaps a close
second).  I disagree that implementing algorithms in assembly is better for
teaching the algorithm and its development than using a HOL such as Ada,
Fortran, or PL/I.  Assembly should be used in conjunction with the study of
computer architecture, but I contend that the knowledge of the many
architectural details necessary to program effectively in assembly hinders
the understanding of the algorithm itself (e.g. sorts, random number
generators, databases, etc.).  That said, implementing various algorithms
in assembly is a good exercise for the student and certainly provides a
better overall understanding (as well as the knowledge to design algorithms
around the constraints of the machine).  I wouldn't wait for Masters or PhD
programs to teach many of the HOL programming issues (OOD, etc.).

-Bob






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-08  0:00                                                         ` Craig Franck
@ 1996-09-09  0:00                                                           ` Tim Behrendsen
  1996-09-10  0:00                                                             ` Richard A. O'Keefe
  0 siblings, 1 reply; 688+ messages in thread
From: Tim Behrendsen @ 1996-09-09  0:00 UTC (permalink / raw)



Craig Franck <clfranck@worldnet.att.net> wrote in article
<50v6k3$soo@mtinsc01-mgt.ops.worldnet.att.net>...
> "Tim Behrendsen" <tim@airshields.com> wrote:
> >Craig Franck <clfranck@worldnet.att.net> wrote in article
> 
> >> What if I just randomly guessed at the value of a 
> >> logic function? Is that procedural? How come one number popped into
> >> my head and not another?
> >
> >To answer this question, we need to know what the algorithm
> >is.  Just coming up with a number in your head is an output,
> >but what problem was solved by your coming up with the number?
> 
> The point was that any system that had a person coming up with numbers
> by letting them just "pop into thier heads" as opposed to say, doing 
> math and reporting the results, could not be fully described because
> we don't know how the numbers are being generated. If this process 
> turned out to non-procedural then the system as a whole could not be
> described as procedural.
> [snip]

This is the crux of the question.

"If this process turned out to be non-procedural..."  <--- this is
impossible according to known laws of physics.  Just because we
don't know the procedure doesn't mean there isn't one.  Now,
given our lack of knowledge on how the brain works, it may
turn out that it is not deterministic, if the brain has truly
random elements as some speculate (I doubt it, personally), but
it would still be procedural and algorithmic.

Could you mean "deterministic" when you say "procedural"?

Going back to the SQL example, SQL is an expression of the algorithm,
but it is not possible to "directly execute" SQL; it has to be
translated into a procedural algorithm, and this is the same
with all "non-procedural expression" languages.

-- Tim Behrendsen (tim@a-sis.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-06  0:00                                                 ` Bob Gilbert
  1996-09-06  0:00                                                   ` Tim Behrendsen
  1996-09-10  0:00                                                   ` Richard A. O'Keefe
@ 1996-09-10  0:00                                                   ` Jon S Anthony
  1996-09-11  0:00                                                     ` Richard A. O'Keefe
  1996-09-11  0:00                                                   ` Jon S Anthony
  1996-09-11  0:00                                                   ` Jon S Anthony
  4 siblings, 1 reply; 688+ messages in thread
From: Jon S Anthony @ 1996-09-10  0:00 UTC (permalink / raw)



In article <51368e$6ir@goanna.cs.rmit.edu.au> ok@goanna.cs.rmit.edu.au (Richard A. O'Keefe) writes:

> rgilbert@unconfigured.xvnews.domain (Bob Gilbert) writes:
> 
> >I suspect our definitions of procedural vs non-procedural (e.g. object
> >oriented) views are not the same.  Non-procedural views do not imply 
> >"instantaneous" algorithms, the non-existence of a time axis, or whatever.
> 
> It strikes me as a very strange use of language to call object-oriented
> programming (of the kind exemplified by Simula 67, Smalltalk, C++, Ada 95,
> CLOS, Eiffel, Self, Cecil, NewtonScript, &c) "non-procedural".  They are
> about as thoroughly procedural as you can get.

Exactly.  A point I have tried to make from time to time in c.o.,
all to no avail.  In fact, I've actually been flamed for this
"indiscretion" as being "anti-OOP".  Go figure.  What was that line
from Schiller?  "Against stupidity, the gods themselves contend in
vain".


> The opposite of "procedural" is not "OOP" but "declarative".
> Declarative languages include Haskell, Clean, Mercury, and perhaps Goedel,
> which I don't know all that well.  Maybe Sisal at all, but I don't know it.
> In those languages the programmer has no idea and no reason to care in
> what order operations are performed; any order the compiler chooses will

Hmmm, I am somewhat surprised that you of all people would not include
Prolog in that list.  What's the scoop?

/Jon

-- 
Jon Anthony
Organon Motives, Inc.
1 Williston Road, Suite 4
Belmont, MA 02178

617.484.3383
jsa@organon.com





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-10  0:00                                                   ` Richard A. O'Keefe
@ 1996-09-10  0:00                                                     ` Kaz Kylheku
  1996-09-11  0:00                                                     ` Bob Gilbert
  1 sibling, 0 replies; 688+ messages in thread
From: Kaz Kylheku @ 1996-09-10  0:00 UTC (permalink / raw)



In article <51368e$6ir@goanna.cs.rmit.edu.au>,
Richard A. O'Keefe <ok@goanna.cs.rmit.edu.au> wrote:
>rgilbert@unconfigured.xvnews.domain (Bob Gilbert) writes:
>
>>I suspect our definitions of procedural vs non-procedural (e.g. object
>>oriented) views are not the same.  Non-procedural views do not imply 
>>"instantaneous" algorithms, the non-existence of a time axis, or whatever.
>
>It strikes me as a very strange use of language to call object-oriented
>programming (of the kind exemplified by Simula 67, Smalltalk, C++, Ada 95,
>CLOS, Eiffel, Self, Cecil, NewtonScript, &c) "non-procedural".  They are
>about as thoroughly procedural as you can get.
>
>The opposite of "procedural" is not "OOP" but "declarative".
>Declarative languages include Haskell, Clean, Mercury, and perhaps Goedel,
>which I don't know all that well.  Maybe Sisal at all, but I don't know it.

Lex and awk would also be familiar examples of such languages, to an extent.
The pattern matching rules of lex are not thought of as executing in any
particular order, unless there is a conflict between two equal-length matches,
in which case the pattern appearing first in the lex source is preferred.  Awk
pattern matching rules are also not arranged in a procedural sequence. Of
course, when matching occurs, ``actions'' are invoked, and the program
temporarily becomes procedural.

State machines, like the pattern matcher of a Lex-generated program, are one
good example of non-procedural programming. The actual procedure which drives
the state machine is typically trivial, and reveals little about the intent
of the machine. It is driven by the state configuration and by input. (Of
course, the state transitions of the machine can be arranged so that the
functioning can be understood procedurally, but that's a separate issue: after
all, a state machine can interpret instructions which form a procedure.)
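
A toy sketch in C of what I mean (everything invented for illustration):
a table-driven recognizer for optionally signed decimal integers.  The
driver loop inside accepts() is the entire "procedure", and it stays
the same no matter what the transition table encodes:

#include <stdio.h>
#include <ctype.h>

/* States for a toy recognizer of optionally signed decimal integers. */
enum state { START, SIGN, DIGITS, REJECT, NSTATES };
enum cls   { C_SIGN, C_DIGIT, C_OTHER, NCLASSES };

/* The behaviour lives here, in the transition table, not in the driver. */
static const enum state next[NSTATES][NCLASSES] = {
    /* START  */ { SIGN,   DIGITS, REJECT },
    /* SIGN   */ { REJECT, DIGITS, REJECT },
    /* DIGITS */ { REJECT, DIGITS, REJECT },
    /* REJECT */ { REJECT, REJECT, REJECT },
};

static enum cls classify(int c)
{
    if (c == '+' || c == '-') return C_SIGN;
    if (isdigit(c))           return C_DIGIT;
    return C_OTHER;
}

/* The driver: trivial, and identical whatever the table encodes. */
static int accepts(const char *s)
{
    enum state st = START;
    for (; *s; s++)
        st = next[st][classify((unsigned char) *s)];
    return st == DIGITS;
}

int main(void)
{
    printf("%d %d %d\n", accepts("-42"), accepts("17"), accepts("x1"));
    return 0;
}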

>In those languages the programmer has no idea and no reason to care in
>what order operations are performed; any order the compiler chooses will
>make sense.  (Yes, it is still possible to write programs that interact
>with the world, and at least in Clean, very simply and elegantly.)

But this is true of OO programming as well. You have an object, and you can
interact with it in ways that don't necessarily obey a particular order; the
object acts like a state machine. This is why OO folks like to talk about
messages rather than function calls.

OO is used a lot in GUI development, and modern GUIs typify a non-procedural
approach. Most of the time, the user can interact with the interface in an
ad hoc fashion, changing the state of various internal objects. The structure
of such a program reflects that.

I suppose that object orientation is a completely orthogonal issue to
procedural versus non-procedural. I also suppose that there are many
ways to be non-procedural, so that ``declarative'' is either a very broad
umbrella, or is just one example of the many ways.





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-06  0:00                                                 ` Bob Gilbert
  1996-09-06  0:00                                                   ` Tim Behrendsen
@ 1996-09-10  0:00                                                   ` Richard A. O'Keefe
  1996-09-10  0:00                                                     ` Kaz Kylheku
  1996-09-11  0:00                                                     ` Bob Gilbert
  1996-09-10  0:00                                                   ` Jon S Anthony
                                                                     ` (2 subsequent siblings)
  4 siblings, 2 replies; 688+ messages in thread
From: Richard A. O'Keefe @ 1996-09-10  0:00 UTC (permalink / raw)



rgilbert@unconfigured.xvnews.domain (Bob Gilbert) writes:

>I suspect our definitions of procedural vs non-procedural (e.g. object
>oriented) views are not the same.  Non-procedural views do not imply 
>"instantaneous" algorithms, the non-existence of a time axis, or whatever.

It strikes me as a very strange use of language to call object-oriented
programming (of the kind exemplified by Simula 67, Smalltalk, C++, Ada 95,
CLOS, Eiffel, Self, Cecil, NewtonScript, &c) "non-procedural".  They are
about as thoroughly procedural as you can get.

The opposite of "procedural" is not "OOP" but "declarative".
Declarative languages include Haskell, Clean, Mercury, and perhaps Goedel,
which I don't know all that well.  Maybe Sisal as well, but I don't know it.
In those languages the programmer has no idea and no reason to care in
what order operations are performed; any order the compiler chooses will
make sense.  (Yes, it is still possible to write programs that interact
with the world, and at least in Clean, very simply and elegantly.)

-- 
Australian citizen since 14 August 1996.  *Now* I can vote the xxxs out!
Richard A. O'Keefe; http://www.cs.rmit.edu.au/%7Eok; RMIT Comp.Sci.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-09  0:00                                                           ` Tim Behrendsen
@ 1996-09-10  0:00                                                             ` Richard A. O'Keefe
  1996-09-10  0:00                                                               ` Tim Behrendsen
  0 siblings, 1 reply; 688+ messages in thread
From: Richard A. O'Keefe @ 1996-09-10  0:00 UTC (permalink / raw)



"Tim Behrendsen" <tim@airshields.com> writes:

>"If this process turned out to be non-procedural..."  <--- this is
>impossible according to known laws of physics.

I honestly don't see this.  I don't personally know all the known laws
of physics; my MSc was in underwater acoustics, which isn't exactly
TOE-of-the-month, even if the refraction equation _is_ formally identical
to the one-dimensional Schroedinger equation.  I have read a number of
papers on quantum computing, and think I understand the idea.  Now
quantum computers cannot compute anything that a "procedural" computer
cannot compute, but they _can_ compute things asymptotically faster than
any possible 'procedural' computer (precisely because they are not discrete
one-step-at-a-time finite state beasts).

So if Tim Behrendsen knows which presently known laws of physics make
quantum computers impossible, he should publish in Nature.

>Going back to the SQL example, SQL is an expression of the algorithm,
>but it is not possible to "directly execute" SQL; it has to be
>translated into a procedural algorithm, and this is the same
>with all "non-procedural expression" languages.

Tell me, do you regard optical computing as procedural?
If you don't (and since it is non-discrete, with no "time axis" that
is useful in understanding how it works), then SQL _can_ be translated
into a non-procedural but executable form.
If you _do_ regard optical computing as procedural, then you have
stretched the term to the point where you are no longer saying anything.

-- 
Australian citizen since 14 August 1996.  *Now* I can vote the xxxs out!
Richard A. O'Keefe; http://www.cs.rmit.edu.au/%7Eok; RMIT Comp.Sci.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-10  0:00                                                             ` Richard A. O'Keefe
@ 1996-09-10  0:00                                                               ` Tim Behrendsen
  0 siblings, 0 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-09-10  0:00 UTC (permalink / raw)



Richard A. O'Keefe <ok@goanna.cs.rmit.edu.au> wrote in article
<5136on$7qj@goanna.cs.rmit.edu.au>...
> "Tim Behrendsen" <tim@airshields.com> writes:
> 
> >"If this process turned out to be non-procedural..."  <--- this is
> >impossible according to known laws of physics.
> 
> I honestly don't see this.  I don't personally know all the known laws
> of physics; my Msc was in underswater acoustics which isn't exactly
> TOE-of-the-month even if the refraction equation _is_ formally identical
> to the one-dimensional Schroedinger equation.  I have read a number of
> papers on quantum computing, and think I understand the idea.  Now
> quantum computers cannot compute anything that a "procedural" computer
> cannot compute, but they _can_ compute things asymptotically faster than
> any possible 'procedural' computer (precisely because they are not discrete
> one-step-at-a-time finite state beasts).
> 
> So if Tim Behrendsen knows which presently known laws of physics make
> quantum computers impossible, he should publish in Nature.

I didn't say anything about "quantum computers"; I said "non-procedural".
See below.

> >Going back to the SQL example, SQL is an expression of the algorithm,
> >but it is not possible to "directly execute" SQL; it has to be
> >translated into a procedural algorithm, and this is the same
> >with all "non-procedural expression" languages.
> 
> Tell me, do you regard optical computing as procedural?
> If you don't (and since it is non-discrete, with no "time axis" that
> is useful in understanding how it works), then SQL _can_ be translated
> into a non-procedural but executable form.

Unless I'm mistaken, you seem to regard "procedural" as meaning a
discrete sequence of steps.  My definition is a bit more broad; I
see procedural as anything with a procedure; that is, an analog
integrator is a procedural mechanism, even though there is no
sequence of individual steps.

It is similar to the difference between summation and integration;
one consists of individual sums, the other of an infinite number
of sums.  However, both are fundamentally adding.

When I see "Optical Computing", I normally think of gates that use
photons rather than electrons, but I'm guessing you are referring to
what I know as a "data flow" computer, where the data is encoded in
a stream and "flows" through various data transformation mechanisms.

However, this still requires time.  You make a claim above that
"...and since it is non-discrete, with no 'time axis'...".  Perhaps
you could explain why non-discrete means it does not require a time
axis, and give an example of *any* algorithm that does not require
a time axis, i.e., does not require time.

> If you _do_ regard optical computing as procedural, then you have
> stretched the term to the point where you are no longer saying anything.

Procedural means "has a procedure."  Nothing in the real world
is not procedural, but we can *express* algorithms non-procedurally.
This is where the term is useful.  The term is meaningless when it
comes to implementation.

-- Tim Behrendsen (tim@a-sis.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-06  0:00                                                 ` Bob Gilbert
                                                                     ` (3 preceding siblings ...)
  1996-09-11  0:00                                                   ` Jon S Anthony
@ 1996-09-11  0:00                                                   ` Jon S Anthony
  4 siblings, 0 replies; 688+ messages in thread
From: Jon S Anthony @ 1996-09-11  0:00 UTC (permalink / raw)



In article <5149bt$kli@bcrkh13.bnr.ca> kaz@vision.crest.nt.com (Kaz Kylheku) writes:

> I suppose that object orientation is a completely orthogonal issue to
> procedural versus non-procedural. I also suppose that there are many

You suppose correctly.


> ways to be non procedural, so that ``declarative'' is either a very broad
> umbrella, or is just one example of the many ways.

Good point.  A true functional language would also be non-procedural.

/Jon
-- 
Jon Anthony
Organon Motives, Inc.
1 Williston Road, Suite 4
Belmont, MA 02178

617.484.3383
jsa@organon.com





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-06  0:00                                                 ` Bob Gilbert
                                                                     ` (2 preceding siblings ...)
  1996-09-10  0:00                                                   ` Jon S Anthony
@ 1996-09-11  0:00                                                   ` Jon S Anthony
  1996-09-11  0:00                                                   ` Jon S Anthony
  4 siblings, 0 replies; 688+ messages in thread
From: Jon S Anthony @ 1996-09-11  0:00 UTC (permalink / raw)



In article <515mq2$bri@goanna.cs.rmit.edu.au> ok@goanna.cs.rmit.edu.au (Richard A. O'Keefe) writes:

> jsa@alexandria (Jon S Anthony) writes:
> >Hmmm, I am somewhat surprised that you of all people would not include
> >Prolog in that list.  What's the scoop?
> 
> Prolog is a hybrid of a weak declarative language (weak because the
> traditional Prolog implementation doesn't quite match the semantics of
> Horn clause programs, and the difference sometimes matters a lot)

Agreed.


> with a rather limited imperative language.  It is an _extremely_ practical
> tool, but almost all Prolog programs make use of the non-declarative parts
> of the language.  In the same way, Lisp is an _exceptionally_ practical tool
> for almost every kind of programming, including writing web servers and
> operating systems, but it is not a _pure_ declarative language, so didn't
> make it into the list of declarative languages.

Hmmm, I have not typically put Lisp in such a category, even if it were
more "purely applicative" - making a distinction between "declarative"
and non-procedural, with Lisp falling into "functional", which is also
non-procedural - so I was not "surprised" by its omission.  But,
shrug, I can certainly see how you could just say functional things
are simply another kind of declarative thing.


>  Mercury _is_ on the list because the _implemented_ semantics of any
> Mercury program coincides with its theoretical semantics as a Horn
> clause program.

I have begun looking into Mercury and I must say that it seems to have
a lot going for it.

 
> For what it's worth, I do not accept _any_ of the OOP languages I listed
> in my previous message as in any way declarative.  This is not to say that
> you couldn't have a declarative OOP language; it is arguable that Haskell
> is one.

Completely agreed on both counts.


/Jon
-- 
Jon Anthony
Organon Motives, Inc.
1 Williston Road, Suite 4
Belmont, MA 02178

617.484.3383
jsa@organon.com





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-19  0:00                                       ` Tim Behrendsen
                                                           ` (3 preceding siblings ...)
  1996-09-11  0:00                                         ` Jon S Anthony
@ 1996-09-11  0:00                                         ` Jon S Anthony
  1996-09-11  0:00                                         ` Richard A. O'Keefe
                                                           ` (2 subsequent siblings)
  7 siblings, 0 replies; 688+ messages in thread
From: Jon S Anthony @ 1996-09-11  0:00 UTC (permalink / raw)



In article <515o3b$d7h@goanna.cs.rmit.edu.au> ok@goanna.cs.rmit.edu.au (Richard A. O'Keefe) writes:

> In short, the discussion with you has been a waste of time, because
> you were never making any claim with empirical content, only playing
> language games.

Never discount a "language game".  They can surprise you sometimes and
offer a good deal more than what you were originally looking for.

But - not in this case :-)

/Jon
-- 
Jon Anthony
Organon Motives, Inc.
1 Williston Road, Suite 4
Belmont, MA 02178

617.484.3383
jsa@organon.com





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-08  0:00                                                       ` Tim Behrendsen
  1996-09-08  0:00                                                         ` Craig Franck
@ 1996-09-11  0:00                                                         ` John Burdick
  1 sibling, 0 replies; 688+ messages in thread
From: John Burdick @ 1996-09-11  0:00 UTC (permalink / raw)



This argument about the procedural view being the one true view
strikes me as silly. We know that Turing's machine and Church's lambda
calculus are _both_ able to express any computation the other can.
Trying to find some practical computation Turing can't do is
pointless, but that doesn't make Church subsidiary to Turing.
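
As a toy illustration only, here is the same function written two ways
in C: a step-by-step loop and an equation-like recursion.  Both express
the same computation, and neither phrasing is subsidiary to the other:

#include <stdio.h>

/* Toy illustration only: the same function written two ways.  The
   loop is the machine-like, step-by-step phrasing; the recursive
   version reads more like the defining equations
   n! = n * (n-1)!, 0! = 1.  Both compute the same thing. */

static long fact_loop(int n)
{
    long r = 1;
    while (n > 1)
        r *= n--;
    return r;
}

static long fact_rec(int n)
{
    return n <= 1 ? 1 : n * fact_rec(n - 1);
}

int main(void)
{
    printf("%ld %ld\n", fact_loop(10), fact_rec(10));
    return 0;
}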

John






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-19  0:00                                       ` Tim Behrendsen
                                                           ` (2 preceding siblings ...)
  1996-09-06  0:00                                         ` Robert I. Eachus
@ 1996-09-11  0:00                                         ` Jon S Anthony
  1996-09-11  0:00                                           ` Craig Franck
  1996-09-17  0:00                                           ` George
  1996-09-11  0:00                                         ` Jon S Anthony
                                                           ` (3 subsequent siblings)
  7 siblings, 2 replies; 688+ messages in thread
From: Jon S Anthony @ 1996-09-11  0:00 UTC (permalink / raw)



In article <01bb9f26$36c870e0$87ee6fce@timpent.a-sis.com> "Tim Behrendsen" <tim@airshields.com> writes:

> It is similiar to the difference between summation and integration;
> one consists of individual sums, the other of an infinite number
> of sums.  However, both are fundamentally adding.

Well, that is one option.  But as "everyone" knows, the FTC allows you
to compute definite integrals without taking the limits of sums or
using summations at all.  Incidentally, none of the standard
definitions (Riemann Sum or something) uses "an infinite number of
sums".  Can't - infinity is not part of the real numbers...


> Procedural means "has a procedure."  Nothing in the real world
> is not procedural, but we can *express* algorithms non-procedurally.
> This is where the term is useful.  The term is meaningless when it
> comes to implementation.

Well, at this point, I'd have to say that your _definition_ has become
meaningless, or at the very least, empty of content.

/Jon
-- 
Jon Anthony
Organon Motives, Inc.
1 Williston Road, Suite 4
Belmont, MA 02178

617.484.3383
jsa@organon.com





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-11  0:00                                         ` Jon S Anthony
@ 1996-09-11  0:00                                           ` Craig Franck
  1996-09-11  0:00                                             ` Tim Behrendsen
  1996-09-17  0:00                                           ` George
  1 sibling, 1 reply; 688+ messages in thread
From: Craig Franck @ 1996-09-11  0:00 UTC (permalink / raw)



jsa@alexandria (Jon S Anthony) wrote:
>In article <01bb9f26$36c870e0$87ee6fce@timpent.a-sis.com> "Tim Behrendsen" <tim@airshields.com> writes:


>> Procedural means "has a procedure."  Nothing in the real world
>> is not procedural, but we can *express* algorithms non-procedurally.
>> This is where the term is useful.  The term is meaningless when it
>> comes to implementation.
>
>Well, at this point, I'd have to say that your _definition_ has become
>meaningless, or at the very least, empty of content.

It would be meaningful if Tim could come up with an example of
"non-procedural" behavior. "Procedural means it has a procedure" adds
nothing, unless an example of "not having a procedure" is given. For
something to be a theory, you must be able to come up with a testable
case in which it could be proven false. For a word to have meaning it must
be shown to mark a credible distinction. "All is Up" impresses no one.
If, say, a random change from one state into the next, with the current
state having nothing to do with the past one, is considered "non-procedural",
then the definition has meaning.

Remember, everything in the real world is real, and some things are
inexpressible, even though they have meaning. All algorithms may be
"procedural" when they are implemented, because that is the way an
"implementation" of an algorithm is defined by some.

-- 
Craig  
clfranck@worldnet.att.net    
Manchester, NH
A great many people think they are thinking when they are
merely rearranging their prejudices.  -- William James






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-09  0:00                                                     ` Bob Gilbert
@ 1996-09-11  0:00                                                       ` Tim Behrendsen
  0 siblings, 0 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-09-11  0:00 UTC (permalink / raw)



Bob Gilbert <rgilbert@unconfigured.xvnews.domain> wrote in article
<511smf$b4h@zeus.orl.mmc.com>...
> In article <01bb9c05$ce5b9aa0$87ee6fce@timpent.airshields.com>, "Tim
Behrendsen" <tim@airshields.com> writes:
> > > In article <01bb9a1e$24c669e0$32ee6fcf@timhome2>, "Tim Behrendsen"
<tim@airshields.com>
> > writes:
> > > > > >
> > > > > >There is no other view than the procedural view.
> 
> I might not have as much of a problem with the above statement if
> you had said that all *implementations* are eventually procedural,
> but the word "view" implies a methodology, approach, or way of thinking
> to solve a problem, and there does exist non-procedural views.

That is a reasonable statement.  There are certainly ways
of looking at algorithms other than procedurally (SQL, for example).
I suppose there's a certain bias on my part because I automatically
break everything down into procedures.  I admit others
may more naturally think in terms of mathematical statements.

I'm still not sure if I want those people producing my SQL reports,
however.  :)

-- Tim Behrendsen (tim@a-sis.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-11  0:00                                           ` Craig Franck
@ 1996-09-11  0:00                                             ` Tim Behrendsen
  0 siblings, 0 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-09-11  0:00 UTC (permalink / raw)



Craig Franck <clfranck@worldnet.att.net> wrote in article
<515al5$lm8@mtinsc01-mgt.ops.worldnet.att.net>...
> jsa@alexandria (Jon S Anthony) wrote:
> >In article <01bb9f26$36c870e0$87ee6fce@timpent.a-sis.com> "Tim
Behrendsen" <tim@airshields.com> writes:
> 
> 
> >> Procedural means "has a procedure."  Nothing in the real world
> >> is not procedural, but we can *express* algorithms non-procedurally.
> >> This is where the term is useful.  The term is meaningless when it
> >> comes to implementation.
> >
> >Well, at this point, I'd have to say that your _definition_ has become
> >meaningless, or at the very least, empty of content.
> 
> It would be meaningfull if Tim could come up with an example of 
> "non-procedural" behavour. Procedural means it "has a procedure" adds
> nothing, unless an example of "not having a procedure" is given. For 
> something to be a theory, you must be able to come up with a testable
> example in which it is proven false. For a word to have meaning it must
> be shown to have a "creditable distinction". "All is Up" impresses no
one. 
> If, say, a random change from one state into the next, with the current
> state having nothing to do with the past one, is considered
"non-procedural",
> then the definition has meaning.
> 
> Remember, everything in the real world is real, and some things are 
> inexpressable, even though they have meaning. All algorithms may be
> "procedural" when they are implemented, because that is the way an
> "implementation" of an algorithm is defined by some.

I agree.  Procedural vs Non-procedural only has meaning in the
expression of an algorithm, not in the implementation.

Where this all started was my making a statement that the most
crucial thing a student can learn is the procedural nature of the
computer, to which someone responded (paraphrase) "well, what
about non-procedural algorithms?", to which I said something to
the effect of "everything is procedural".  I should probably have
said "all implementations are procedural", and that would have
saved all these posts. :)  [but perhaps not]

I still stand by my statement, however, that the most critical
thing a student can learn is the fundamentally procedural nature
of the computer.  Before syntax, before subroutines, before style,
they must learn that algorithms are a forward-moving,
cause-and-effect process.

And this *is* something that must be learned, that most
programmers take for granted as obvious.

-- Tim Behrendsen (tim@a-sis.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-10  0:00                                                   ` Jon S Anthony
@ 1996-09-11  0:00                                                     ` Richard A. O'Keefe
  0 siblings, 0 replies; 688+ messages in thread
From: Richard A. O'Keefe @ 1996-09-11  0:00 UTC (permalink / raw)



jsa@alexandria (Jon S Anthony) writes:
>Hmmm, I am somewhat surprised that you of all people would not include
>Prolog in that list.  What's the scoop?

Prolog is a hybrid of a weak declarative language (weak because the
traditional Prolog implementation doesn't quite match the semantics of
Horn clause programs, and the difference sometimes matters a lot)
with a rather limited imperative language.  It is an _extremely_ practical
tool, but almost all Prolog programs make use of the non-declarative parts
of the language.  In the same way, Lisp is an _exceptionally_ practical tool
for almost every kind of programming, including writing web servers and
operating systems, but it is not a _pure_ declarative language, so didn't
make it into the list of declarative languages.  Mercury _is_ on the list
because the _implemented_ semantics of any Mercury program coincides with
its theoretical semantics as a Horn clause program.

For what it's worth, I do not accept _any_ of the OOP languages I listed
in my previous message as in any way declarative.  This is not to say that
you couldn't have a declarative OOP language; it is arguable that Haskell
is one.
-- 
Australian citizen since 14 August 1996.  *Now* I can vote the xxxs out!
Richard A. O'Keefe; http://www.cs.rmit.edu.au/%7Eok; RMIT Comp.Sci.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-19  0:00                                       ` Tim Behrendsen
                                                           ` (4 preceding siblings ...)
  1996-09-11  0:00                                         ` Jon S Anthony
@ 1996-09-11  0:00                                         ` Richard A. O'Keefe
  1996-09-11  0:00                                           ` Tim Behrendsen
  1996-09-18  0:00                                         ` Jon S Anthony
  1996-09-26  0:00                                         ` Jon S Anthony
  7 siblings, 1 reply; 688+ messages in thread
From: Richard A. O'Keefe @ 1996-09-11  0:00 UTC (permalink / raw)



"Tim Behrendsen" <tim@airshields.com> writes:
>Unless I'm mistaken, you seem to regard "procedural" as meaning a
>discrete sequence of steps.  My definition is a bit more broad; I
>see procedural as anything with a procedure; that is, an analog
>integrator is a procedural mechanism, even though there is no
>sequence of individual steps.

Ah.  So you have broadened the term "procedural" to the point of
uselessness.  I worked with analogue computers at one time, and
there's a wiring diagram, but there is *nothing* going on in them
that I would recognise as a "procedure".

>It is similiar to the difference between summation and integration;
>one consists of individual sums, the other of an infinite number
>of sums.  However, both are fundamentally adding.

Ever seen a planimeter?  One of those neat little gadgets where you
determine areas by rolling a little wheel around?  You can repeat
"it's fundamentally adding" all you like, but that isn't how the
gadgets actually _work_.

>When I see "Optical Computing", I normally think of gates that use
>photons rather than electrons, but I'm guessing you are referring to
>what I know as a "data flow" computer, where the data is encoded in
>a stream and "flows" through various data transformation mechanisms.

I am *not* talking about gates.  That might be optronics, but it isn't
optical computing.  Optical computing is where you have a spatially
coded signal and it is processed (FFT and so on) by lenses, mirrors,
holograms, and so on.

>However, this still requires time.  You make a claim above that
>"...and since it is non-discrete, with no 'time axis'...".  Perhaps
>you could explain why non-discrete means it does not require a time
>axis, and give an example of *any* algorithm that does not require
>a time axis, i.e., does not require time.

You have misquoted me and distorted what I was saying.  Yes, we are
talking about physical devices, so things happen in time.  But what
I wrote was "with no "time axis" THAT IS USEFUL IN UNDERSTANDING HOW IT
WORKS", and that is a very different claim.  The optical bench does not
vary through time.


>> If you _do_ regard optical computing as procedural, then you have
>> stretched the term to the point where you are no longer saying anything.

>Procedural means "has a procedure."

Yes, but fundamental to what everyone in computing except you means by
"procedure" is "a discrete sequence of steps".

>Nothing in the real world is not procedural,

In short, you have stretched the words 'procedure' and 'procedural'
until _everything_ is procedural, _everything_ "has a procedure".
This is a good way to win an argument, by ensuring that you are saying
nothing.  If I re-define 'is blue' to mean 'is a physical object',
then I can be sure that no-one can prove me wrong when I say "every
physical object is blue", but then I haven't _said_ anything.

> but we can *express* algorithms non-procedurally.
If you are right that _everything_ is procedural, then this cannot be true.
If I write down an equation like
	Del  R      = 0
	   [a bc]de
(the Bianchi identity, equation 4.10.1 in Penrose & Rindler),
then since "nothing in the real world is not procedural" (according to
you), and since the picture of that equation before your eyes is "in the
real world" (I say the _picture_ of the equation, not the equation), then
that picture must be procedural.  (I do not know what it means for a
picture to be procedural, but we have your word that it is so.)  So that
picture is procedural, and the attempt to express something "non-procedurally"
has failed.

In short, the discussion with you has been a waste of time, because
you were never making any claim with empirical content, only playing
language games.


-- 
Australian citizen since 14 August 1996.  *Now* I can vote the xxxs out!
Richard A. O'Keefe; http://www.cs.rmit.edu.au/%7Eok; RMIT Comp.Sci.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-10  0:00                                                   ` Richard A. O'Keefe
  1996-09-10  0:00                                                     ` Kaz Kylheku
@ 1996-09-11  0:00                                                     ` Bob Gilbert
  1 sibling, 0 replies; 688+ messages in thread
From: Bob Gilbert @ 1996-09-11  0:00 UTC (permalink / raw)



In article <51368e$6ir@goanna.cs.rmit.edu.au>, ok@goanna.cs.rmit.edu.au (Richard A. O'Keefe) writes:
> rgilbert@unconfigured.xvnews.domain (Bob Gilbert) writes:
> 
> >I suspect our definitions of procedural vs non-procedural (e.g. object
> >oriented) views are not the same.  Non-procedural views do not imply 
> >"instantaneous" algorithms, the non-existence of a time axis, or whatever.
> 
> It strikes me as a very strange use of language to call object-oriented
> programming (of the kind exemplified by Simula 67, Smalltalk, C++, Ada 95,
> CLOS, Eiffel, Self, Cecil, NewtonScript, &c) "non-procedural".  They are
> about as thoroughly procedural as you can get.
> 
> The opposite of "procedural" is not "OOP" but "declarative".
> Declarative languages include Haskell, Clean, Mercury, and perhaps Goedel,
> which I don't know all that well.  Maybe Sisal at all, but I don't know it.
> In those languages the programmer has no idea and no reason to care in
> what order operations are performed; any order the compiler chooses will
> make sense.  (Yes, it is still possible to write programs that interact
> with the world, and at least in Clean, very simply and elegantly.)

I think I have been taken out of context here.  I wasn't discussing 
specific languages, but rather a "view" of the problem domain.  Originally
Tim Behrendsen made a statement to the effect that students must be taught
to think in terms of breaking problems into procedures with inputs and
outputs.  I simply claimed that this *view* was very procedural and many
proponents of things like object oriented design might take exception to
that view-point, or at least the implication that it is the only way to 
view a solution.  I was only offering OOD as an example of an alternative
way of viewing the problem and designing a solution where one shouldn't
think simply in terms of inputs, processing, and outputs.

-Bob








^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-11  0:00                                         ` Richard A. O'Keefe
@ 1996-09-11  0:00                                           ` Tim Behrendsen
  1996-09-12  0:00                                             ` Peter Seebach
                                                               ` (2 more replies)
  0 siblings, 3 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-09-11  0:00 UTC (permalink / raw)



Richard A. O'Keefe <ok@goanna.cs.rmit.edu.au> wrote in article
<515o3b$d7h@goanna.cs.rmit.edu.au>...
> "Tim Behrendsen" <tim@airshields.com> writes:
> 
> >However, this still requires time.  You make a claim above that
> >"...and since it is non-discrete, with no 'time axis'...".  Perhaps
> >you could explain why non-discrete means it does not require a time
> >axis, and give an example of *any* algorithm that does not require
> >a time axis, i.e., does not require time.
> 
> You have misquoted me and distorted what I was saying.  Yes, we are
> talking about physical devices, so things happen in time.  But what
> I wrote was "with no "time axis" THAT IS USEFUL IN UNDERSTANDING HOW IT
> WORKS", and that is a very different claim.  The optical bench does not
> vary through time.

The original progenitor of this thread is my assertion that
the most important thing a student can learn is the fundamental
procedural nature of the computer.

Time is extremely useful in understanding how something works,
although you don't normally invoke its name.  Even in your
optical computer, you have light-encoded information flowing
through mirrors, etc.  *This is time*.  If you are explaining
to a student how it works, your finger will trace a path
through the machine.

My entire point before we went down this road is that I think
many teachers take it for granted that cause-and-effect and
procedure are so obvious they don't need to be learned, and
that's not true.

> >> If you _do_ regard optical computing as procedural, then you have
> >> stretched the term to the point where you are no longer saying anything.
> 
> >Procedural means "has a procedure."
> 
> Yes, but fundamental to what everyone in computing except you means by
> "procedure" is "a discrete sequence of steps".

No, fundamental to *you*, perhaps, but I have never heard this
definition before.  Of course, your optical computer fits this
definition, since it has a discrete sequence of steps (mirrors,
etc.), and you claim that *it* doesn't.

> >Nothing in the real world is not procedural,
> 
> In short, you have stretched the words 'procedure' and 'procedural'
> until _everything_ is procedural, _everything_ "has a procedure".
> This is a good way to win an argument, by ensuring that you are saying
> nothing.  If I re-define 'is blue' to mean 'is a physical object',
> then I can be sure that no-one can prove me wrong when I say "every
> physical object is blue", but then I haven't _said_ anything.

No, the definition is very clear.  The point is that non-procedural
*expressions* are an abstract concept.  I have no problem with
saying that algorithms can be abstractly expressed non-procedurally,
but they cannot be implemented non-procedurally.

> > but we can *express* algorithms non-procedurally.
> If you are right that _everything_ is procedural, then this cannot be true.
> If I write down an equation like
> 	Del  R      = 0
> 	   [a bc]de
> (the Bianchi identity, equation 4.10.1 in Penrose & Rindler),
> then since "nothing in the real world is not procedural" (according to
> you), and since the picture of that equation before your eyes is "in the
> real world" (I say the _picture_ of the equation, not the equation), then
> that picture must be procedural.  (I do not know what it means for a
> picture to be procedural, but we have your word that it is so.)  So that
> picture is procedural, and the attempt to express something "non-procedurally"
> has failed.

Now you're arguing worthless semantics.  Obviously that's not what
I'm saying.  How about "No algorithm implementation is not
procedural", which is what you know that I meant.

A mathematical expression is non-procedural, because it is a
statement of truth, and not an algorithmic sequence.

An SQL expression is not procedural, because it is not an algorithmic
sequence, it is a mathematical transformation.

In other words, procedural mechanisms have a start, middle, and
end.  Non-procedural expressions do not.

> In short, the discussion with you has been a waste of time, because
> you were never making any claim with empirical content, only playing
> language games.

Well, I'm sorry you feel that way, but you are wrong.

You prove my point that programmers take the procedural nature
of the computer as so obvious as to be beneath discussion, but
it's not.  I cannot stress this enough: THIS MUST BE LEARNED BY
STUDENTS.  This is the primary, fundamental axiom of computers.

How many questions do we get in this newsgroup where a student
simply didn't follow the flow of the program to see what happens?
This is so obvious to you and me that we don't think about it,
but *they didn't*!  Because they have only a vague feeling of
flow, and are still looking at the program as a kind of
weird combination of a solid object and something with a flow
of time.

Take recursion.  How can you not understand recursion if you
understand in your soul that computers execute a flow of
instructions?  You can't, and that's the point.  Understanding
the time axis is the key.

-- Tim Behrendsen (tim@a-sis.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-11  0:00                                           ` Tim Behrendsen
  1996-09-12  0:00                                             ` Peter Seebach
@ 1996-09-12  0:00                                             ` Richard A. O'Keefe
  1996-09-13  0:00                                               ` Tim Behrendsen
  1996-09-17  0:00                                             ` George
  2 siblings, 1 reply; 688+ messages in thread
From: Richard A. O'Keefe @ 1996-09-12  0:00 UTC (permalink / raw)



"Tim Behrendsen" <tim@airshields.com> writes:

>Time is extremely useful in understanding how something works,
>although you don't normally invoke its name.  Even in your
>optical computer, you have light-encoded information flowing
>through mirrors, etc.  *This is time*.  If you are explaining
>to a student how it works, your finger will trace a path
>through the machine.

This is simply false.  When I was taught about these beasts, this
simply didn't happen.  To the extent that there was a progression
of _ideas_, the progression went from the _output_ back to the input.
No doubt other teachers than mine have other methods of presentation,
but that's not how it was for me.

What's more, I was introduced to Feynman's treatment of electromagnetic
radiation involving a wave propagating _backwards_ in time.

>My entire point before we went down this road is that I think
>many teachers take it for granted that cause-and-effect and
>procedure are so obvious they don't need to be learned, and
>that's not true.

There are some people who believe that backwards causation is
self-contradictory, while there are others like me, who think
that Goedel's rotating universe model has shown backwards
causation to be compatible with the presently accepted laws of
physics ("Tipler machines" are basically inside-out Goedel universes).
Then there are the old distinctions between cause and ground, between
formal, material, and final causes.  Millennia of philosophy have shown
the concept to be very tricky indeed.  But what has that to do with
computing?  If I write a constraint logic program, you cannot point
to any part of the program and say "this causes that", but it is
computer programming for all that.

And I certainly agree that "procedure" is not an obvious concept.
I have just conducted a thorough search of my office.
Three English dictionaries agree that "procedure" means something like
"rules of behaviour", in the bureaucratic sense.  I have quite a few
books about algorithms, algorithmics, foundations of computing, &c.
All of *them* basically boiled down to "procedure, see subroutine".

>> Yes, but fundamental to what everyone in computing except you means by
>> "procedure" is "a discrete sequence of steps".

>No, fundamental to *you*, perhaps, but I have never heard this
>definition before.

As I said, the English dictionaries are talking about bureaucratic
procedures, which are often written down in rule books.  Even when they
are not, to follow a (bureaucratic) procedure means to engage in a sequence
of behaviours regulated by rules, and those behaviours are conceptualised
as discrete.  In computing, as I said above, I have been unable to find
any definition of "procedure" that distinguishes it from a Fortran
subroutine or Pascal procedure.  In fact, that is why the "procedural"
languages are so-called, because a procedural program is organised as
a possibly structured collection of procedures in the Fortran/Algol/Pascal
sense.  *This* is the meaning that leaps to mind to everyone who has
posted in this thread except you.

I have been able to locate a definition of algorithm.  I wonder if you
will recognise it:

	The notion of an _algorithm_ is basic to all of
	computer programming, so we should begin with a
	careful analysis of this concept.
	...
	So this is an algorithm.  The modern meaning for algorithm
	is quite similar to that of _recipe_, _process_, _method_,
	_technique_, _procedure_, _routine_, except that the word
	"algorithm" connotes something just a little different.
	Besides merely being a finite set of rules which gives a
	sequence of operations for solving a specific type of
	problem [%], an algorithm has five important features:

	1) Finiteness.  An algorithm must always terminate after a
	finite number of steps.  ... (A procedure which has all of
	the characteristics of an algorithm except that it possibly
	lacks finiteness may be called a "computational method." ...[%])

	2) Definiteness.  Each step of an algorithm must be
	precisely defined; the actions to be carried out must be
	rigorously and unambiguously specified for each case.

	3) Input.  An algorithm has zero or more inputs ...

	4) Output.  An algorithm has one or more outputs ...

	5) Effectiveness.  An algorithm is also generally
	expected to be _effective_.  This means that all of the
	operations to be performed in the algorithm must be
	sufficiently basic that they can in principle be done
	exactly and in a finite length of time by a man using
	pencil and paper. ..

[%] These statements make it clear that the distinguished computer
scientist who wrote this important textbook, whose emeritus chair
is in fact named after the book, considers "procedures" and
"algorithms" to be pretty much the same kind of thing, except that
"procedure" is a _slightly_ looser term.  This is actually the same
book containing "procedure, see subroutine" in the index, but every
other book in this office agrees with that.

>Of course, your optical computer fits this
>definition, since it has a discrete sequence of steps (mirrors,
>etc.), and you claim that *it* doesn't.

No, the mirrors are not steps.  It is in principle possible to build
an optical computer as a single translucent block with varying refractive
index. You don't need a crisp metal/air interface to get reflection; a
continuous change in refractive index (I did say my MSc was in underwater
acoustics) will do the job nicely.  The holographic transformation of an
optical signal _certainly_ takes place in a continuously varying medium;
that's why I chose that example.

Let's compute a 2-dimensional optical signal propagating through a
medium with a predetermined static spatial modulation of its refractive
index.  

    1. Finiteness:  Not necessary for a procedure.
    2. Definiteness:  the transformation is precisely defined by
	partial differential equations.
    3. Input:  the incoming stream of photons.
    4. Output:  the outgoing stream of photons.
    5. Effectiveness:  fails miserably.  The relevant number here
	is the power of the continuum.  That was the point of selecting
	a model where _continuously_ varying media are involved.

>No, the definition is very clear.

You may _say_ that the definition is very clear, but you still haven't
told us what your definition of a procedure _is_, and why such an
extremely general redefinition of "procedure" is _relevant_ to CS
education.

>The point is that non-procedural
>*expressions* are an abstract concept.

So are *any* expressions.  So are "so", "are", "any", and "expressions".
What is the _point_?  Causality is (or are) an abstract concept.  Effect
is an abstract concept.  Number is a highly abstract concept.

>I have no problem with
>saying that algorithms can be abstractly expressed non-procedurally,
>but they cannot be implemented non-procedurally.

I have said this before:  according to your definition, EVERYTHING is
procedural.  Clouds, cockroaches, ink drops swirling in milk, alpha
particles tunnelling out of nuclei, they are all in the real world and
according to you _everything_ in the real world is procedural.  If you
are right, a non-procedural expression is simply an impossibility,
because strive as hard as we may, we end up with something in the real
world, and you say "no, *everything* in the real world is procedural".

>Now you're arguing worthless semantics.  Obviously that's not what
>I'm saying.

No, it is *not* obvious.

>How about "No algorithm implementation is not
>procedural", which is what you know that I meant.

No, I did *not* know that you meant that.  But *any* method of expressing
an algorithm (and as soon as you say "algorithm" you are conceding me
finiteness, definiteness, input, output, and effectiveness) is an
implementation.  This is my claim:  any intelligible expression of an
algorithm _is_ an implementation.  Proof:  if a human being can determine
that it _is_ the expression of an algorithm, then the human being has
determined that s/he could at least in principle execute the algorithm
with pencil and paper (the definition of effectiveness).  Anything for which
no human being can figure out a method of calculating with pencil
and paper (in principle) cannot be the expression of an algorithm.

Since any human-intelligible expression of an algorithm _is_ by its
very nature an implementation of that algorithm, and since you claim
that no algorithm implementation is not procedural, then every
human-intelligible expression of an algorithm MUST, if you are correct,
count as procedural.

>A mathematical expression is non-procedural, because it is a
>statement of truth, and not an algorithmic sequence.

_I_ have no difficulty with this, because to me most mathematical
expressions are not algorithms.  But _you_ say that _everything_ is
procedural.  If mathematical expressions are part of the real world,
and if everything in the real world is procedural, how come you now
surprise us with a claim that a mathematical expression is not
procedural?  And if a mathematical expression is not procedural,
how is a constraint logic program, in which everything is non-directional
statements of truth, procedural?

>An SQL expression is not procedural, because it is not an algorithmic
>sequence, it is a mathematical transformation.

Here I disagree.  I think an SQL expression _is_ procedural.
Is an SQL expression an algorithm?  Let's see
    Finiteness:  yes.
    Definiteness:  yes.
    Input:  yes (the extensional data base).
    Output:  yes (the computed relation).
    Effectiveness:  yes.  There is an obvious naive pencil-and-paper
	method for computing the result of an SQL formula.  In fact,
	when students are taught SQL, they are sometimes required to
	evaluate SQL expressions in their exercises.
So an SQL expression passes all the tests for being an algorithm.
Why do you say it isn't?  Sure, a computer program may compute the
value of an SQL expression some other way, but that is equally true
for every programming language.  The student machine here has
asynchronous loads:  execute a load instruction, and the result might
not arrive until 30 cycles later, during which time more than 100
other instructions may have completed.  (It's an UltraSPARC.)  The
sequence _as written_ is not the sequence _as executed_, so we can't
let _that_ be a criterion for whether something is procedural or
mathematical.
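
To make that naive pencil-and-paper method concrete, here is a rough
sketch in C rather than SQL -- the two relations and the query are
invented purely for illustration, not taken from anywhere:

#include <stdio.h>
#include <string.h>

/* Two made-up relations standing in for the extensional database. */
struct emp  { const char *name; int dept; };
struct dept { int id; const char *name; };

static const struct emp  emps[]  = { {"smith", 1}, {"jones", 2}, {"brown", 1} };
static const struct dept depts[] = { {1, "SALES"}, {2, "RESEARCH"} };

/* Naive evaluation of
       SELECT emps.name FROM emps, depts
       WHERE emps.dept = depts.id AND depts.name = 'SALES'
   by enumerating every pair of rows and keeping those that satisfy
   the WHERE clause: finite, definite, and effective.               */
int main(void)
{
    size_t i, j;

    for (i = 0; i < sizeof emps / sizeof emps[0]; i++)
        for (j = 0; j < sizeof depts / sizeof depts[0]; j++)
            if (emps[i].dept == depts[j].id
                && strcmp(depts[j].name, "SALES") == 0)
                printf("%s\n", emps[i].name);
    return 0;
}

Nobody would evaluate SQL this way in production, but pencil and paper
would plainly suffice, which is all that effectiveness asks for.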

>In other words, procedural mechanisms have a start, middle, and
>end.  Non-procedural expressions do not.

Hang on, you _denied_ that "steps" were an important part of a procedure,
and now here you are listing three steps that must be part of every
procedure.

>> In short, the discussion with you has been a waste of time, because
>> you were never making any claim with empirical content, only playing
>> language games.

>Well, I'm sorry you feel that way, but you are wrong.

1.  You use a covert redefinition of procedure (you _still_ have not
    supplied a clear explanation of what you mean by this term)
    which conflicts with a book that has been one of _the_ most 
    influential books in CS since 1968.

2.  You use a redefinition in which _everything_ is procedural, so
    there is simply no contrast.  If everything has the Buddha
    nature, then to say that something has the Buddha nature tells
    you _nothing_ about it.

3.  You use a redefinition which makes no empirical claim.
    If a digital computer with discrete components and discrete
    states is no more and no less "procedural" than an optical
    computer with continuous components and continuous states,
    then we are left with no way to predict which things you
    will call procedural and which things you will not.

4.  You miss an *extremely* important point about computational
    mechanisms.  The definition of "algorithm" given above, and
    the definition of "procedure" that goes with it, has an
    incredibly important and rather shocking consequence:

	Anything which cannot be expressed in terms of properties
	of polynomials with integer variables and integer
	coefficients cannot be computed.

    This is even true of quantum computers.  Surely a definition
    of "procedure" or "procedural" which is to be _relevant_ to
    computing must be bounded by "procedures" (whatever they are)
    that can be followed by _computing_ mechanisms.

    This says to _me_ that the majority of processes occurring in
    the real world are not computable, so any definition of
    "procedure" that encompasses them is NOT RELEVANT TO COMPUTING.

>You prove my point that programmers take the procedural nature
>of the computer as so obvious as to be beneath discussion, but
>it's not.  I cannot stress this enough: THIS MUST BE LEARNED BY
>STUDENTS.  This is the primary, fundamental axiom of computers.

No, I do not prove your point.  No, we do _not_ take the nature
of the computer as being obvious.  We *DO* teach students about
computability; we *DO* teach students that every known computing
mechanism has no more and no less power than a Turing machine;
we *DO* teach students about computer architecture, at say the
level in Hennessy & Patterson and perhaps a little lower.  But
we sure as heck don't tell them that _everything_ in the real world
is procedural.

>How many questions do we get in this newsgroup where a student
>simply didn't follow the flow of the program to see what happens?
>This is so obvious to you and me that we don't think about it,
>but *they didn't*!  Because they have only a vague feeling of
>flow, and are still looking at the program as a kind of
>weird combination of a solid object and something with a flow
>of time.

There is a fatal flaw in your argument.
Students don't think of looking in the index of their textbook.
Students are capable of staring at a GNAT error message:
	"foobar.adb must be recompiled"
and wondering what to do about it.
Students are capable of handing in an Ada program starting
like this:  "procedure 8puzzle is".  (That happened yesterday.)
Amongst the students I have to deal with, some of the _native_
speakers of English have only the shakiest grasp of English
grammar.  It used to be that if I saw "Chinglish" comments I
knew I was dealing with an Asian student.  Not any more.

There is a sort of "learned helplessness" around.  It's not that
they don't think of following the flow; it's that a lot of them
don't think of _anything_; they wait for someone to _tell_ them.

There is another flaw in the argument:  the most powerful way of
understanding a program that I know, and I am not a bad programmer,
is to see it as a web of relations and constraints.  When _I_ have
a problem with an Ada program, I don't start playing computer, I
look for relations that I have not made explicit, for constraints
that I can check.

>Take recursion.  How can you not understand recursion if you
>understand in your soul that computers execute a flow of
>instructions?  You can't, and that's the point.  Understanding
>the time axis is the key.

There is spatial recursion as well as temporal recursion.
I am completely comfortable with recursion in programming and have been
since I met it.  But I _understand_ recursion the same way I understand
mathematical induction, and the train of thought when I write, analyse,
or debug recursive code is at best indirectly related to the "flow" in
the program.  The core metaphors I use to grasp recursion are primarily
spatial.

-- 
Australian citizen since 14 August 1996.  *Now* I can vote the xxxs out!
Richard A. O'Keefe; http://www.cs.rmit.edu.au/%7Eok; RMIT Comp.Sci.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Delete - Don't Bother to Read This
  1996-09-01  0:00               ` Patrick Horgan
@ 1996-09-12  0:00                 ` Charles H. Sampson
  0 siblings, 0 replies; 688+ messages in thread
From: Charles H. Sampson @ 1996-09-12  0:00 UTC (permalink / raw)



     My apologies if this gets out.  My newsreader has had difficulty
posting followups to another newgroup and I'm just checking to see if
the problem is specific to that group.  Silly, I know, but this is Unix
software.

                                    Charlie




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-11  0:00                                           ` Tim Behrendsen
@ 1996-09-12  0:00                                             ` Peter Seebach
  1996-09-18  0:00                                               ` Tim Behrendsen
  1996-09-12  0:00                                             ` Richard A. O'Keefe
  1996-09-17  0:00                                             ` George
  2 siblings, 1 reply; 688+ messages in thread
From: Peter Seebach @ 1996-09-12  0:00 UTC (permalink / raw)



In article <01bb9fe6$7299d800$87ee6fce@timpent.a-sis.com>,
Tim Behrendsen <tim@airshields.com> wrote:
>The original progenitor of this thread is my assertion that
>the most important thing a student can learn is the fundamental
>procedural nature of the computer.

I would probably disagree.  I don't consider the implementation to be
nearly as important as the design.  Students should be learning the
principles of proof and design.

>Time is extremely useful in understanding how something works,
>although you don't normally invoke its name.  Even in your
>optical computer, you have light-encoded information flowing
>through mirrors, etc.  *This is time*.  If you are explaining
>to a student how it works, your finger will trace a path
>through the machine.

But that explanation is an explanation of *another way to think
about it*, not of how it works.

We frequently use the procedural view because it's the most natural for
languages such as English to describe; we don't have a "simultaneous" tense.

This is frequently a downright *bad* way to model a problem, or to think about
the model.

>My entire point before we went down this road is that I think
>many teachers take it for granted that cause-and-effect and
>procedure are so obvious they don't need to be learned, and
>that's not true.

They are important, but not necessarily the most important.

>> Yes, but fundamental to what everyone in computing except you means by
>> "procedure" is "a discrete sequence of steps".

>No, fundamental to *you*, perhaps, but I have never heard this
>definition before.  Of course, your optical computer fits this
>definition, since it has a discrete sequence of steps (mirrors,
>etc.), and you claim that *it* doesn't.

That's what I'd mean by a procedure; if they aren't separate, they're only one
step.  If there's only one step in the whole process, it's not much of a
procedure.

How to build a house:

Step 1.  Assemble the materials in the form of a house.

This is *not* a procedure.  :)

I would not think of an optical computer in terms of steps happening over
time, unless it was a very *large* optical computer.  In practice, it will act
as though everything happens all at once and continuously.  This is harder to
think about, but much more useful to understanding it, and less prone to
making the wrong kinds of design decisions.

>No, the definition is very clear.  The point is that non-procedural
>*expressions* are an abstract concept.  I have no problem with
>saying that algorithms can be abstractly expressed non-procedurally,
>but they cannot be implemented non-procedurally.

So we should avoid procedural things and implementations until students are
comfortable with the abstractions, since abstraction is much more important to
an understanding of *anything* than any procedure or implementation.

>Now you're arguing worthless semantics.  Obviously that's not what
>I'm saying.  How about "No algorithm implementation is not
>procedural", which is what you know that I meant.

I'd debate that.

>In other words, procedural mechanisms have a start, middle, and
>end.  Non-procedural expressions do not.

In which case, much programming is not procedural.  In which case,
it is very bad to try to convince students that the computer
is "fundamentally" procedural, because this will not help them
understand the domain they're working in.

Proceduralness is like 5-volt-ness.  It's basic to the majority of
implementations, but not relevant to a lot of the things people
will need to know to use a computer.

>You prove my point that programmers take the procedural nature
>of the computer as so obvious as to be beneath discussion, but
>it's not.  I cannot stress this enough: THIS MUST BE LEARNED BY
>STUDENTS.  This is the primary, fundamental axiom of computers.

Nonsense.  It's completely irrelevant to a good understanding of many
aspects of computers, and downright misleading for whole families of
programming languages.

It is very important that they learn to use this view of computers;
it is neither desirable nor necessary that they believe it is the only view,
or that it is the most correct view, or any such.

>How many questions do we get in this newsgroup where a student
>simply didn't follow the flow of the program to see what happens?

Quite a lot.  We also get a fair number where the flow of the program is not
relevant to an understanding of the problem.  We even get problems where
an attempt to think of the "flow" of the program will bite the student,
because they're compile-time problems, not run-time problems.

>This is so obvious to you and me that we don't think about it,
>but *they didn't*!  Because they have only a vague feeling of
>flow, and are still looking at the program as a kind of
>weird combination of a solid object and something with a flow
>of time.

It is.

>Take recursion.  How can you not understand recursion if you
>understand in your soul that computers execute a flow of
>instructions?  You can't, and that's the point.  Understanding
>the time axis is the key.

I disagree.  I think I understand recursion, and I don't think of it
in time at all, I think of it in space.  Recursion doesn't move forward
in time, it moves down in space.  I tend to look at the naive recursive
quicksort as a tree of sorts, not as a sequence of sorts.  This makes it
much more obvious how it *works*.

Not how it's implemented, mind you; how it *works*.

The flow in a recursive algorithm is up-down, not forward-back.

-s
-- 
Peter Seebach - seebs@solon.com - Copyright 1996 - http://www.solon.com/~seebs
Unix/C Wizard - send mail for help, or send money for consulting!
The *other* C FAQ, the hacker FAQ, et al.  See web page above.
Unsolicited email (junk mail and ads) is unwelcome, and will be billed for.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-12  0:00                                             ` Richard A. O'Keefe
@ 1996-09-13  0:00                                               ` Tim Behrendsen
  1996-09-13  0:00                                                 ` Richard A. O'Keefe
  0 siblings, 1 reply; 688+ messages in thread
From: Tim Behrendsen @ 1996-09-13  0:00 UTC (permalink / raw)



Richard A. O'Keefe <ok@goanna.cs.rmit.edu.au> wrote in article
<5182u7$17b@goanna.cs.rmit.edu.au>...
> "Tim Behrendsen" <tim@airshields.com> writes:
>
>[snip] 
> As I said, the English dictionaries are talking about bureaucratic
> procedures, which are often written down in rule books.  Even when they
> are not, to follow a (bureaucratic) procedure means to engage in a sequence
> of behaviours regulated by rules, and those behaviours are conceptualised
> as discrete.  In computing, as I said above, I have been unable to find
> any definition of "procedure" that distinguishes it from a Fortran
> subroutine or Pascal procedure.  In fact, that is why the "procedural"
> languages are so-called, because a procedural program is organised as
> a possibly structured collection of procedures in the Fortran/Algol/Pascal
> sense.  *This* is the meaning that leaps to mind to everyone who has
> posted in this thread except you.
> 
> I have been able to locate a definition of algorithm.  I wonder if you
> will recognise it:
> 
> 	The notion of an _algorithm_ is basic to all of
> 	computer programming, so we should begin with a
> 	careful analysis of this concept.
> 	...
> 	So this is an algorithm.  The modern meaning for algorithm
> 	is quite similar to that of _recipe_, _process_, _method_,
> 	_technique_, _procedure_, _routine_, except that the word
> 	"algorithm" connotes something just a little different.
> 	Besides merely being a finite set of rules which gives a
> 	sequence of operations for solving a specific type of
> 	problem [%], an algorithm has five important features:
> 
> 	1) Finiteness.  An algorithm must always terminate after a
> 	finite number of steps.  ... (A procedure which has all of
> 	the characteristics of an algorithm except that it possibly
> 	lacks finiteness may be called a "computational method." ...[%])
> 
> 	2) Definiteness.  Each step of an algorithm must be
> 	precisely defined; the actions to be carried out must be
> 	rigorously and unambiguously specified for each case.
> 
> 	3) Input.  An algorithm has zero or more inputs ...
> 
> 	4) Output.  An algorithm has one or more outputs ...
> 
> 	5) Effectiveness.  An algorithm is also generally
> 	expected to be _effective_.  This means that all of the
> 	operations to be performed in the algorithm must be
> 	sufficiently basic that they can in principle be done
> 	exactly and in a finite length of time by a man using
> 	pencil and paper. ..
> 
> [%] These statements make it clear that the distinguished computer
> scientist who wrote this important textbook, whose emeritus chair
> is in fact named after the book, considers "procedures" and
> "algorithms" to be pretty much the same kind of thing, except that
> "procedure" is a _slightly_ looser term.  This is actually the same
> book containing "procedure, see subroutine" in the index, but every
> other book in this office agrees with that.

This sounds reasonable to me.  I interpret (1) to mean that
it should not have an asymptotic solution; i.e., it should be
able to be solved in finite time.

> >Of course, your optical computer fits this
> >definition, since it has a discrete sequence of steps (mirrors,
> >etc.), and you claim that *it* doesn't.
> 
> No, the mirrors are not steps.  It is in principle possible to build
> an optical computer as a single translucent block with varying refractive
> index. You don't need a crisp metal/air interface to get reflection; a
> continuous change in refractive index (I did say my MSc was in underwater
> acoustics) will do the job nicely.  The holographic transformation of an
> optical signal _certainly_ takes place in a continuously varying medium;
> that's why I chose that example.
> 
> Let's compute a 2-dimensional optical signal propagating through a
> medium with a predetermined static spatial modulation of its refractive
> index.  
> 
>     1. Finiteness:  Not necessary for a procedure.
>     2. Definiteness:  the transformation is precisely defined by
> 	partial differential equations.
>     3. Input:  the incoming stream of photons.
>     4. Output:  the outgoing stream of photons.

>     5. Effectiveness:  fails miserably.  The relevant number here
> 	is the power of the continuum.  That was the point of selecting
> 	a model where _continuously_ varying media are involved.
      ^^^

Does it fail?  I suggest that your machine is equivalent to a
CPU with a single instruction.  The single instruction calculates
a particular mathematical formula.  I could simulate this using
pencil and paper. I quote the definition:

    "This means that all of the operations to be performed in
     the algorithm must be sufficiently basic that they can in
     principle be done exactly and in a finite length of time by  
     a man using pencil and paper."

Note that it does *not* say that I have to do the operations using
exactly the same method, only the same operations.  Your light
based algorithm is simply a mathematical operation, which I
can do using pencil and paper quite easily.

This is like saying that since most computers are based on the
flow of electrons through gates, I have to simulate the electron
flow in order to prove Quicksort is an algorithm.

> >No, the definition is very clear.
> 
> You may _say_ that the definition is very clear, but you still haven't
> told us what your definition of a procedure _is_, and why such an
> extremely general redefinition of "procedure" is _relevant_ to CS
> education.
> [giant snip of procedural debate]

OK, obviously I'm failing miserably to get my point across.  Let
me try different terms rather than the obviously loaded term
"procedural".

Try this on for size:

1. There are two ways an algorithm can be expressed:
   a. Using a methodology that expresses a path of execution (aka
      procedural languages).
   b. Using a methodology that does *not* specify a path of
      execution (aka non-procedural languages).

2. Algorithms can execute *only* using a time-based mechanism.

Now, as for education, I submit that at the beginning it is more
important to focus on what execution means rather than the
abstract expression of algorithms via languages.  Certainly
you need some form of expression, but it should be kept as
simple as possible to get the student brain into the general
concept of a flow of execution without the distraction of
methodologies.
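
As a toy contrast, sketched in C only for illustration (the example is
mine, not from any curriculum): the same sum of 1..n written once with
an explicit path of execution, and once as a bare statement of the
result.

#include <stdio.h>

/* 1a: a path of execution -- a start, a middle, and an end to trace. */
static long sum_path(long n)
{
    long i, s = 0;

    for (i = 1; i <= n; i++)
        s += i;
    return s;
}

/* 1b, roughly: a statement of the result, n*(n+1)/2, with no
   intermediate steps for the student's finger to follow.       */
static long sum_stated(long n)
{
    return n * (n + 1) / 2;
}

int main(void)
{
    printf("%ld %ld\n", sum_path(100), sum_stated(100));
    return 0;
}

Both versions still execute in time, which is point (2); the difference
is only in what the notation asks the reader to trace.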

In fact, the algorithm definition you posted above is a good
one.  Do you use that in your classes?  I would show it to
the students the first day and have them burn it into their
brains.  It very forcefully shows what algorithms are all
about.

> >How many questions do we get in this newsgroup where a student
> >simply didn't follow the flow of the program to see what happens?
> >This is so obvious to you and me that we don't think about it,
> >but *they didn't*!  Because they have only a vague feeling of
> >flow, and are still looking at the program as a kind of
> >weird combination of a solid object and something with a flow
> >of time.
> 
> There is a fatal flaw in your argument.
> Students don't think of looking in the index of their textbook.
> Students are capable of staring at a GNAT error message:
> 	"foobar.adb must be recompiled"
> and wondering what to do about it.
> Students are capable of handing in an Ada program starting
> like this:  "procedure 8puzzle is".  (That happened yesterday.)
> Amongst the students I have to deal with, some of the _native_
> speakers of English have only the shakiest grasp of English
> grammar.  It used to be that if I saw "Chinglish" comments I
> knew I was dealing with an Asian student.  Not any more.
> 
> There is a sort of "learned helplessness" around.  It's not that
> they don't think of following the flow; it's that a lot of them
> don't think of _anything_; they wait for someone to _tell_ them.

Well, certainly there are a large number of students in that
frame of mind, I cannot disagree.

> There is another flaw in the argument:  the most powerful way of
> understanding a program that I know, and I am not a bad programmer,
> is to see it as a web of relations and constraints.  When _I_ have
> a problem with an Ada program, I don't start playing computer, I
> look for relations that I have not made explicit, for constraints
> that I can check.

Well, I have to admit that when I think about an algorithm,
I visualize data flowing through the algorithm undergoing
transformations until it's what I want.  I start with the
highest level "black box" transformations, and then gradually
break down each black box in greater and greater detail until
I have C code (or whatever).

This is certainly not the only way to look at problems, but it
is the closest to the reality of how algorithms execute.  And
I submit that it is the most straightforward to understand.  I
admit, though, that I have a natural bias toward this point of
view.

> >Take recursion.  How can you not understand recursion if you
> >understand in your soul that computers execute a flow of
> >instructions?  You can't, and that's the point.  Understanding
> >the time axis is the key.
> 
> There is spatial recursion as well as temporal recursion.
> I am completely comfortable with recursion in programming and have been
> since I met it.  But I _understand_ recursion the same way I understand
> mathematical induction, and the train of thought when I write, analyse,
> or debug recursive code is at best indirectly related to the "flow" in
> the program.  The core metaphors I use to grasp recursion are primarily
> spatial.

Hmm; I'm not sure if I view it the same way or not.  If I
visualize Quicksort, for example, I visualize a big block of data
that I divvy into two parts.  Each of those parts gets divided,
and so on.  To map into the computer domain, I think about a
black box that does the divvying process, and then each piece
gets put back through the black box.  I suppose my core metaphor
is also spatial, but I'm not sure if it's the same way.

The point is that I (at least) don't view Quicksort as a
static "hall of mirrors", everything reflecting back and forth
with no sense of execution.  I see a "movie" of the data breaking
down section by section, and I can see the pattern between
each break and generalize it.
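
In C, the picture is roughly this (a deliberately naive sketch, not
anyone's production sort):

#include <stdio.h>

/* Naive recursive quicksort: "divvy" the block around a pivot, then
   feed each part back through the same black box.                   */
static void quicksort(int a[], int lo, int hi)
{
    int i, j, t, pivot;

    if (lo >= hi)
        return;
    i = lo;
    j = hi;
    pivot = a[(lo + hi) / 2];
    while (i <= j) {                     /* the divvying step */
        while (a[i] < pivot) i++;
        while (a[j] > pivot) j--;
        if (i <= j) {
            t = a[i]; a[i] = a[j]; a[j] = t;
            i++; j--;
        }
    }
    quicksort(a, lo, j);                 /* each part goes back ...  */
    quicksort(a, i, hi);                 /* ... through the same box */
}

int main(void)
{
    int a[] = { 5, 3, 8, 1, 9, 2 };
    int k;

    quicksort(a, 0, 5);
    for (k = 0; k < 6; k++)
        printf("%d ", a[k]);
    printf("\n");
    return 0;
}

The "movie" view and the tree-of-sorts view are two readings of the
same dozen lines; the code itself does not care which one you see.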

-- Tim Behrendsen (tim@a-sis.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-13  0:00                                               ` Tim Behrendsen
@ 1996-09-13  0:00                                                 ` Richard A. O'Keefe
  1996-09-18  0:00                                                   ` Tim Behrendsen
  0 siblings, 1 reply; 688+ messages in thread
From: Richard A. O'Keefe @ 1996-09-13  0:00 UTC (permalink / raw)



"Tim Behrendsen" <tim@airshields.com> writes:

>Does it fail?  I suggest that your machine is equivalent to a
>CPU with a single instruction.  The single instruction calculates
>a particular mathematical formula.  I could simulate this using
>pencil and paper.

Does the word *continuous* convey nothing to you?
No, you *couldn't* simulate the machine using pencil and paper.
An optical computer works in the *real* (\br{R}) world.

>Note that it does *not* say that I have to do the operations using
>exactly the same method, only the same operations.  Your light
>based algorithm is simply a mathematical operation, which I
>can do using pencil and paper quite easily.

Here you are making a claim to be able to compute non-computable
functions.  I don't believe you.

>This is like saying that since most computers are based on the
>flow of electrons through gates, I have to simulate the electron
>flow in order to prove Quicksort is an algorithm.

Totally false.

>Try this on for size:

>1. There are two ways an algorithm can be expressed:
>   a. Using a methodology that expresses a path of execution (aka
>      procedural languages).
>   b. Using a methodology that does *not* specify a path of
>      execution (aka non-procedural languages).

This is false.  If I write a Horn clause program, one person will
look at it and say "ah hah, that is a Prolog program, it specifies
a path of execution, it is procedural", while another will look at
it and say "ah hah, that is a logical formula, it expresses a
timeless truth which can be computed in many ways."  They are both
right.  Take for example a Definite Clause Grammar written in pure
Prolog.  ONE AND THE SAME FORM OF EXPRESSION may behave
 - as a top-down recursive descent parser (Prolog execution)
 - as a left-corner parser (I've got parsers running this way; this
   is not a theoretical point)
 - as a chart parser (XSB execution).
Take another example.  There is a programming language called Euclid.
Euclid is in the Pascal family.  It's full of records, arrays, pointers,
assignment statements, all that stuff.  But there was a paper from Xerox
showing how a Euclid program could be *naturally* and *directly* viewed
as a pure functional program.  ONE form of expression, but BOTH a
"procedural" reading AND a "declarative" reading.  Heck, it's more than
30 years since (de?) Bakker showed how to view an Algol 60 "procedural"
program as a lambda-calculus "declarative" formula.  For another example,
you accepted SQL as a non-procedural form of expression, but I countered
that I can "see" an imperative reading.

The point that I am making is that the distinction you are trying to make
is _not_ a distinction that is present in the forms of expression themselves;
the procedural and declarative views are ways that human beings _choose_ to
read texts which are themselves neutral. 

I must stress that this is not a theoretical point.  There are a couple of
companies making what seemed to me at the time "big bucks" taking COBOL
programs, *viewing* them as declarative expressions, and emitting new COBOL
code with the same _declarative_ meaning but different _procedural_ form
and behaviour.  And there are people using functional languages to design
VLSI circuits.

>2. Algorithms can execute *only* using a time-based mechanism.

You are taking "we can only _tell_ that an algorithm has executed
by contrasting a later situation with an earlier one" for a property of
algorithms rather than a property of human perception and embedding in
the world.  You may be right, but we can only notice *anything* by
contrasting earlier and later situations (I am using "situations" in
a Barwisian sense here).  That doesn't mean that it is the most fruitful
way to think about everything.  One very important computing technique,
after all, is converting time to space.  (What else is parallelism?)

You are also not confronting your assumption that the most important
thing to understand about an algorithm is its _execution_.

>Now, as for education, I submit that at the beginning it is more
>important to focus on what execution means 

I _really_ don't get you here.  The phrase is actually ambiguous.
I don't understand either of the readings.

If you mean "it is important for students to understand what it is like
when a machine executes a program", fine, we agree that it is important,
but why is it the _most_ important thing?

Case in point:  I learned matrix algebra before I learned about machines.
I know what the execution of a Gaussian elimination "means" because I
know matrix algebra, not because I know machines (though I do know both).
If I just knew machines, I wouldn't have the faintest notion what a
Gaussian elimination meant.

>Certainly
>you need some form of expression, but it should be kept as
>simple as possible to get the student brain into the general
>concept of a flow of execution without the distraction of
>methodologies.

"Methodologies" is not a word that I use; it is pretentious.

Do we teach children about the mechanics of speech before they
learn to talk?

When I was taught to play the piano as a child, I was taught about
forms of expression (how to read music), not about how to build a
piano or what happens when you press a key.

When my wife bought a knitting machine as a teenager, she was taught
how to read patterns, and how to control the machine, not how the
machine is constructed or operates.

When you are taught to drive a car, you are taught the rules of the
road and a few heuristics for operating the machine.  You are not
taught the theory of combustion; you are not shown the wiring diagram
of the car.  Heck, people these days don't even learn what gears _are_,
let alone how they work.

I got a ham license without having to learn what actually goes on inside
a valve or transistor.

What is so different about computing?

>In fact, the algorithm definition you posted above is a good
>one.  Do you use that in your classes?  

As I don't teach first year, no.  It is, of course, Knuth, The Art
of Computer Programming, Volume 1.

>Well, I have to admit that when I think about an algorithm,
>I visualize data flowing through the algorithm undergoing
>transformations until it's what I want.  I start with the
>highest level "black box" transformations, and then gradually
>break down each black box in greater and greater detail until
>I have C code (or whatever).

>This is certainly not the only way to look at problems, but it
>is the closest to the reality of how algorithms execute.  And
>I submit that it is the most straightforward to understand.  I
>admit, though, that I have a natural bias toward this point of
>view.

Just at the moment I'm playing around with trying to speed up
back-propagation neural nets.  A student here has NN programs
that regularly take 4 hours.  So far I've worked on just one
stage, but already that stage goes 5 times as fast.  (That's
the practical ceiling for that stage.)  What's the trick?  To
step *BACK* from the procedural view; with considerable effort
to figure out that the bulk of the work goes on in what _should_
have been two calls to SGEMV and one call to SGER (level-2 BLAS).
(Well, actually I'm using APL notation, not calls to the BLAS.)
Then you notice that there are a SAXPY and SDOT lying around that
could be fused with the SGER and one of the SGEMVs (making this
decision _solely_ on the timeless mathematics), and then you
see that two loops can be fused making it possible to eliminate
a vector store and reload, and before you know it, almost all of
the branches have gone.  The bit I've looked at is one of the
SGEMV steps, and the speed of that chunk has gone from 8 Mflops
to 40 Mflops on the same machine.  (To start with, one branch
per 50 instructions.)

I am already aware of one bottleneck that makes it unlikely that
the whole NN training phase will speed up that much.  (Anyone know
a really fast way to compute a reasonable approximation to a logistic
function?)

The point here is that people before me had already tried to write
good code, but their minds were straitjacketed by thinking the way
you want them to think.  I had to think at the (timeless) mathematical
level to see how to reorganise this code.
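
The loop-fusion step, reduced to a generic C sketch -- this is not the
neural-net code itself, just the shape of the transformation:

/* Before: two passes over the data; y is stored by the first loop
   and then reloaded from memory by the second.  (Assumes a and y
   do not overlap.)                                                 */
void scale_then_dot(int n, const float *a, float *y, float *dot)
{
    int i;
    float s = 0.0f;

    for (i = 0; i < n; i++)
        y[i] = 2.0f * a[i];
    for (i = 0; i < n; i++)
        s += y[i] * a[i];
    *dot = s;
}

/* After: the two loops fused into one.  Same arithmetic, but half
   the loop branches, and the value just written to y[i] is still
   in a register when the dot product needs it.                     */
void scale_and_dot(int n, const float *a, float *y, float *dot)
{
    int i;
    float s = 0.0f;

    for (i = 0; i < n; i++) {
        y[i] = 2.0f * a[i];
        s += y[i] * a[i];
    }
    *dot = s;
}

Deciding *which* loops may legally be fused is the part done at the
timeless mathematical level; the fused loop is merely what falls out
afterwards.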

>Hmm; I'm not sure if I view it the same way or not.  If I
>visualize Quicksort, for example, I visualize a big block of data
>that I divvy into two parts.  Each of those parts gets divided,
>and so on.  To map into the computer domain, I think about a
>black box that does the divvying process, and then each piece
>gets put back through the black box.  I suppose my core metaphor
>is also spatial, but I'm not sure if it's the same way.

>The point is that I (at least) don't view Quicksort as a
>static "hall of mirrors", everything reflecting back and forth
>with no sense of execution.  I see a "movie" of the data breaking
>down section by section, and I can see the pattern between
>each break and generalize it.

Done much parallel programming yet?  You are describing an approach
to understanding algorithms which makes parallel programming _very_
difficult.
-- 
Australian citizen since 14 August 1996.  *Now* I can vote the xxxs out!
Richard A. O'Keefe; http://www.cs.rmit.edu.au/%7Eok; RMIT Comp.Sci.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-05  0:00                                               ` Mark Wooding
  1996-09-06  0:00                                                 ` Bob Cousins
@ 1996-09-13  0:00                                                 ` Bengt Richter
  1996-09-14  0:00                                                   ` Craig Franck
  1 sibling, 1 reply; 688+ messages in thread
From: Bengt Richter @ 1996-09-13  0:00 UTC (permalink / raw)



mdw@excessus.demon.co.uk (Mark Wooding) wrote:

>Lawrence Kirby <fred@genesis.demon.co.uk> wrote:
>> In article <01bb8f19$9a89d820$32ee6fce@timhome2>
>>            tim@airshields.com "Tim Behrendsen" writes:
>>
>> >There is no other view than the procedural view.
>> 
>> Some functional language programmers might take issue with that
>> statement. Prologgers may have a thought or two also.

>I've not come across a computer yet which doesn't work by fetching an
>instruction (or maybe a few at a time), doing them, and then going off
>and fetching some more.  I guess you can pretend that this isn't the
>case, and maybe come up with some nice ways of presenting algorithms
>which don't depend on this, but that's not the way things work
>underneath.  The One True View is that sequence of instructions; all
>else is an illusion.  Maybe it's a helpful illusion, but illusion it is
>nonetheless.

Au contraire. "The way things work underneath" is not sequential. All
the energized components of a computer exist and work in parallel.
It is only by focusing on certain parts and giving names to their
states and state transitions that you can see the abstract model
involving "that sequence of instructions."

E.g., to speak of "fetching an instruction" as if it were a single event
is to ignore most of the reality "underneath." (Not to mention that
the reality differs among computer architectures). It may be useful
to exclude details that are not relevant to a particular abstract
model, but to think any such model embodies "the One True View" is
severely limiting, IMHO. To constrain a student's thoughts to such
a straitjacket without at some point untying the laces would be
a crime. Some may not be able to escape without help ;-)

Regards,
Bengt Richter






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-13  0:00                                                 ` Bengt Richter
@ 1996-09-14  0:00                                                   ` Craig Franck
  0 siblings, 0 replies; 688+ messages in thread
From: Craig Franck @ 1996-09-14  0:00 UTC (permalink / raw)



bokr@accessone.com (Bengt Richter) wrote:
>mdw@excessus.demon.co.uk (Mark Wooding) wrote:
>
>>Lawrence Kirby <fred@genesis.demon.co.uk> wrote:
>>> In article <01bb8f19$9a89d820$32ee6fce@timhome2>
>>>            tim@airshields.com "Tim Behrendsen" writes:
>>>
>>> >There is no other view than the procedural view.
>>> 
>>> Some functional language programmers might take issue with that
>>> statement. Prologgers may have a thought or two also.
>
>>I've not come across a computer yet which doesn't work by fetching an
>>instruction (or maybe a few at a time), doing them, and then going off
>>and fetching some more.  I guess you can pretend that this isn't the
>>case, and maybe come up with some nice ways of presenting algorithms
>>which don't depend on this, but that's not the way things work
>>underneath.  The One True View is that sequence of instructions; all
>>else is an illusion.  Maybe it's a helpful illusion, but illusion it is
>>nonetheless.
>
>Au contraire. "The way things work underneath" is not sequential. All
>the energized components of a computer exist and work in parallel.
>It is only by focusing on certain parts and giving names to their
>states and state transitions that you can see the abstract model
>involving "that sequence of instructions."
>
>E.g., to speak of "fetching an instruction" as if it were a single event
>is to ignore most of the reality "underneath." (Not to mention that
>the reality differs among computer architectures). It may be useful
>to exclude details that are not relevant to a particular abstract
>model, but to think any such model embodies "the One True View" is
>severely limiting, IMHO. To constrain a student's thoughts to such
>a straitjacket without at some point untying the laces would be
>a crime. Some may not be able to escape without help ;-)

I would direct such a student to Gary Zukav's "The Dancing Wu Li
Masters" or Fritjof Capra's "The Toa of Physics". From reading 
these books it will become obvious that the "One True View" is
that everything is a "dancing pattern of organic energy". In this 
very eastern view, "organic" applies to the quantum states of atoms 
as well as to carbon chemistry. 

Please note that Tim's statement had to do with teaching algorithms,
and I don't think he meant that if GOD exists HE is procedural.

-- 
Craig
clfranck@worldnet.att.net
Manchester, NH
Knowledge is of two kinds; we know a subject ourselves, or we
know where we can find information about it. --  Samuel Johnson






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-11  0:00                                         ` Jon S Anthony
  1996-09-11  0:00                                           ` Craig Franck
@ 1996-09-17  0:00                                           ` George
  1996-09-24  0:00                                             ` Joel VanLaven
  1 sibling, 1 reply; 688+ messages in thread
From: George @ 1996-09-17  0:00 UTC (permalink / raw)



jsa@alexandria (Jon S Anthony) wrote:

> In article <01bb9f26$36c870e0$87ee6fce@timpent.a-sis.com> "Tim Behrendsen" <tim@airshields.com> writes:

> > It is similar to the difference between summation and integration;
> > one consists of individual sums, the other of an infinite number
> > of sums.  However, both are fundamentally adding.

> Well, that is one option.  But as "everyone" knows, the FTC allows you
> to compute definite integrals without taking the limits of sums or
> using summations at all.  Incidentally, none of the standard
> definitions (Riemann Sum or something) uses "an infinite number of
> sums".  Can't - infinity is not part of the real numbers...


Surely the definition of integration contains the phrase "...tends to
infinity", i.e. it's _as if_ there was an infinite number of sums.

G.





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-11  0:00                                           ` Tim Behrendsen
  1996-09-12  0:00                                             ` Peter Seebach
  1996-09-12  0:00                                             ` Richard A. O'Keefe
@ 1996-09-17  0:00                                             ` George
  1996-09-19  0:00                                               ` Tim Behrendsen
                                                                 ` (2 more replies)
  2 siblings, 3 replies; 688+ messages in thread
From: George @ 1996-09-17  0:00 UTC (permalink / raw)



"Tim Behrendsen" <tim@airshields.com> wrote:

> You prove my point that programmers take the procedural nature
> of the computer as so obvious as to be beneath discussion, but
> it's not.  I cannot stress this enough: THIS MUST BE LEARNED BY
> STUDENTS.  This is the primary, fundamental axiom of computers.

> How many questions do we get in this newsgroup where a student
> simply didn't follow the flow of the program to see what happens?
> This is so obvious to you and I that we don't think about it,
> but *they didn't*!  Because they have only a vague feeling of
> flow, and are still looking at the program as a kind of
> weird combination of a solid object and something with a flow
> of time.


Just to get away from the pointless semantic argument: are you really
suggesting that students don't understand something this basic?
If this is the effect of teaching OOA/OOP and similar mumbo jumbo,
then maybe it's time we got back to BASICs; no way could they
fail to understand.

What do they actually think happens inside a computer? *Magic*?


> Take recursion.  How can you not understand recursion if you
> understand in your soul that computers execute a flow of
> instructions?  You can't, and that's the point.  Understanding
> the time axis is the key.

They should never have been allowed to get this far without realizing
that.

> -- Tim Behrendsen (tim@a-sis.com)

G.





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-12  0:00                                             ` Peter Seebach
@ 1996-09-18  0:00                                               ` Tim Behrendsen
  0 siblings, 0 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-09-18  0:00 UTC (permalink / raw)



Peter Seebach <seebs@solutions.solon.com> wrote in article
<519i0o$mgb@solutions.solon.com>...
> In article <01bb9fe6$7299d800$87ee6fce@timpent.a-sis.com>,
> Tim Behrendsen <tim@airshields.com> wrote:
> >The original progenitor of this thread is my assertion that
> >the most important thing a student can learn is the fundamental
> >procedural nature of the computer.
> 
> I would probably disagree.  I don't consider the implementation to be
> nearly as important as the design.  Students should be learning the
> principles of proof and design.

Agreed that proof and design are important, but we are talking
about first principles.  We want to get them thinking in
"normal" computer terms first, so that they are capable of knowing
what they are learning later on.  These are the students
that I'm getting; all memorized knowledge and no understanding.

Perhaps in the future, we will have "non-procedural" languages
that can be useful over a broader range of problems, and
learning these concepts will be less important.  At this time
in computer history, however, it's critical.

> >Time is extremely useful in understanding how something works,
> >although you don't normally invoke its name.  Even in your
> >optical computer, you have light-encoded information flowing
> >through mirrors, etc.  *This is time*.  If you are explaining
> >to a student how it works, your finger will trace a path
> >through the machine.
> 
> But that explanation is an explanation of *another way to think
> about it*, not of how it works.
> 
> We frequently use the procedural view because it's the most natural for
> languages such as English to describe; we don't have a "simultaneous" tense.
> 
> This is frequently a downright *bad* way to model a problem, or to think about
> the model.

There may be some problems that are easier to understand/model
using a non-procedural view, but they are usually very
specialized and trivial.  And the execution is always a
mechanistic model.

> >My entire point before we went down this road is that I think
> >many teachers take it for granted that cause-and-effect and
> >procedure are so obvious they don't need to be learned, and
> >that's not true.
> 
> They are important, but not necessarily the most important.

But the point is, the concepts are frequently neglected as a
subject in and of itself.

Given that this is by far the dominant implementation style,
I don't think it's a stretch to say that is the most important
thing to learn.

> >> Yes, but fundamental to what everyone in computing except you means by
> >> "procedure" is "a discrete sequence of steps".
> 
> >No, fundamental to *you*, perhaps, but I have never heard this
> >definition before.  Of course, your optical computer fits this
> >definition, since it has a discrete sequence of steps (mirrors,
> >etc.), and you claim that *it* doesn't.
> 
> That's what I'd mean by a procedure; if they aren't separate, they're only one
> step.  If there's only one step in the whole process, it's not much of a
> procedure.
> 
> How to build a house:
> 
> Step 1.  Assemble the materials in the form of a house.
> 
> This is *not* a procedure.  :)

Sure it is; it's just a big procedure. :)

Every step is composed of sub-steps; it just depends on how
much you want to split them down.  I execute a printf; this
would commonly be referred to as one step.  However, it invokes
a great deal of assembly language to create the action.  Taking
it further, it causes a lot of gates within the CPU to open
and close.  Taking it still further, it causes various
quantum changes within the transistors within the gates.

You'll note that what you call a "step-by-step" process is
actually composed of a continuous flow of electrons.

> I would not think of an optical computer in terms of steps happening over
> time, unless it was a very *large* optical computer.  In practice, it will act
> as though everything happens all at once and continuously.  This is harder to
> think about, but much more useful to understanding it, and less prone to
> making the wrong kinds of design decisions.

Ridiculous!  Even if things happen very, very fast, it still
requires time for the light to travel through the various gates.
In practice, a modern CPU can add an array of a hundred numbers
"as though it happens all at once and continuously".
 
> >No, the definition is very clear.  The point is that non-procedural
> >*expressions* are an abstract concept.  I have no problem with
> >saying that algorithms can be abstractly expressed non-procedurally,
> >but they cannot be implemented non-procedurally.
> 
> So we should avoid procedural things and implementations until students are
> comfortable with the abstractions, since abstraction is much more important to
> an understanding of *anything* than any procedure or implementation.

I strongly disagree.  Abstraction is extremely difficult to understand
without an understanding of cause-and-effect and procedure.  Look
at the questions we get in this newsgroup.  I would wager that
90% of them are simply that the student doesn't follow the flow
of the program (e.g. uninitialized variables).
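
A minimal sketch of the kind of flow-of-execution bug I mean (hypothetical
code, not taken from any actual posting):

#include <stdio.h>

int main(void)
{
    int sum;                /* never initialized */
    int i;

    for (i = 1; i <= 10; i++)
        sum += i;           /* adds to whatever garbage happened to be in sum */

    printf("%d\n", sum);    /* the student expects 55; the behavior is undefined */
    return 0;
}

Tracing the flow line by line makes the problem obvious; staring at the
program as a static object does not.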

> >You prove my point that programmers take the procedural nature
> >of the computer as so obvious as to be beneath discussion, but
> >it's not.  I cannot stress this enough: THIS MUST BE LEARNED BY
> >STUDENTS.  This is the primary, fundamental axiom of computers.
> 
> Nonsense.  It's completely irrelevant to a good understanding of many
> aspects of computers, and downright misleading for whole families of
> programming languages.

I should have said the execution of programs must be
procedural; my phrasing was not very good.

> It is very important that they learn to use this view of computers;
> it is neither desirable nor necessary that they believe it is the only view,
> or that it is the most correct view, or any such.
> 
> >How many questions do we get in this newsgroup where a student
> >simply didn't follow the flow of the program to see what happens?
> 
> Quite a lot.  We also get a fair number where the flow of the program is not
> relevant to an understanding of the problem.  We even get problems where
> an attempt to think of the "flow" of the program will bite the student,
> because they're compile-time problems, not run-time problems.
> 
> >This is so obvious to you and I that we don't think about it,
> >but *they didn't*!  Because they have only a vague feeling of
> >flow, and are still looking at the program as a kind of
> >weird combination of a solid object and something with a flow
> >of time.
> 
> It is.

It is, in the sense that each line of code is a solid object,
but not in the sense of the overall flow.

Yes, there are certain languages that do not have a
programmer-specified flow, but we are talking about first
principles.

> >Take recursion.  How can you not understand recursion if you
> >understand in your soul that computers execute a flow of
> >instructions?  You can't, and that's the point.  Understanding
> >the time axis is the key.
> 
> I disagree.  I think I understand recursion, and I don't think of it
> in time at all, I think of it in space.  Recursion doesn't move forward
> in time, it moves down in space.  I tend to look at the naive recursive
> quicksort as a tree of sorts, not as a sequence of sorts.  This makes it
> much more obvious how it *works*.

But moving in space *is* time.  You can't get away from it.  Time
is only one element; data movement is certainly important as well.

The point is, execution requires time, which is why teaching
this fundamental point is so important.

> Not how it's implemented, mind you; how it *works*.
> 
> The flow in a recursive algorithm is up-down, not forward-back.

Well, all algorithms are up-down if they have any looping in
them. :)  To tell you the truth, I'm not sure what "forward-back"
means vs. up-down.
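
For concreteness, here is a bare-bones recursive quicksort sketch
(hypothetical helper, index-based, no error handling).  The same few lines
can be read either as a sequence of calls unfolding in time or as a tree
of subarrays laid out in space:

/* Quicksort sketch: partition a[lo..hi] around a pivot, then recurse on
   the two halves.  Procedurally, the calls happen one after another in
   time; spatially, they form a tree of smaller and smaller subarrays. */
void quicksort(int a[], int lo, int hi)
{
    int pivot, i, j, tmp;

    if (lo >= hi)
        return;

    pivot = a[hi];                     /* last element as the pivot */
    i = lo;
    for (j = lo; j < hi; j++) {
        if (a[j] < pivot) {
            tmp = a[i]; a[i] = a[j]; a[j] = tmp;
            i++;
        }
    }
    tmp = a[i]; a[i] = a[hi]; a[hi] = tmp;

    quicksort(a, lo, i - 1);           /* "earlier" in time, "left" in space */
    quicksort(a, i + 1, hi);           /* "later" in time, "right" in space */
}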

-- Tim Behrendsen (tim@a-sis.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-13  0:00                                                 ` Richard A. O'Keefe
@ 1996-09-18  0:00                                                   ` Tim Behrendsen
  1996-09-19  0:00                                                     ` Richard A. O'Keefe
  0 siblings, 1 reply; 688+ messages in thread
From: Tim Behrendsen @ 1996-09-18  0:00 UTC (permalink / raw)



Richard A. O'Keefe <ok@goanna.cs.rmit.edu.au> wrote in article
<51b8i4$m9m@goanna.cs.rmit.edu.au>...
> "Tim Behrendsen" <tim@airshields.com> writes:
> 
> >Does it fail?  I suggest that your machine is equivalent to a
> >CPU with a single instruction.  The single instruction calculates
> >a particular mathematical formula.  I could simulate this using
> >pencil and paper.
> 
> Does the word *continuous* convey nothing to you?

I simply don't understand the resistance to the idea that a
procedure could consist of a continuous medium.  In fact, the
computer you're typing on uses a continuous flow of electrons
to process algorithms.

> No, you *couldn't* simulate the machine using pencil and paper.
>> An optical computer works in the *real* (\br{R}) world.

> >Note that it does *not* say that I have to do the operations using
> >exactly the same method, only the same operations.  Your light
> >based algorithm is simply a mathematical operation, which I
> >can do using pencil and paper quite easily.
> 
> Here you are making a claim to be able to compute non-computable
> functions.  I don't believe you. 

As I understood what you wrote, your optical computer was
modulating a signal based on refraction index of a medium.  This
sounds calculable to me, assuming we know how the refraction
function of your crystal is set up.  We are getting a bit
beyond my knowledge;  I don't know if you mean that the
mathematics is unknown, or unknowable, but let me take both
cases.

Unknown: Just because a natural law is not understood, doesn't
mean there is not an algorithm behind it.

Unknowable: There are certainly cases in physics where the
underlying algorithm is not predictable; for example, the
output of a geiger counter could not be simulated using pencil
and paper (that we know of, anyway).  Now, this is a much more
interesting question.  Is there an algorithm behind atomic
decay?  There must be a *method*, even if it is not predictable
(or known).  Your author's definition obviously wasn't designed
to handle questions of this type; I think it comes down to
opinion whether this fits a definition of algorithm.

> >Try this on for size:
> 
> >1. There are two ways an algorithm can be expressed:
> >   a. Using a methodology that expresses a path of execution (aka
> >      procedural languages).
> >   b. Using a methodology that does *not* specify a path of
> >      execution (aka non-procedural languages).
> 
> This is false.  If I write a Horn clause program, one person will
> look at it and say "ah hah, that is a Prolog program, it specifies
> a path of execution, it is procedural", while another will look at
> it and say "ah hah, that is a logical formulal, it expresses a
> timeless truth which can be computed in many ways."  They are both
> right.  Take for example a Definite Clause Grammar written in pure
> Prolog.  ONE AND THE SAME FORM OF EXPRESSION may behave
>  - as a top-down recursive descent parser (Prolog execution)
>  - as a left-corner parser (I've got parsers running this way; this
>    is not a theoretical point)
>  - as a chart parser (XSB execution).
> Take another example.  There is a programming language called Euclid.
> Euclid is in the Pascal family.  It's full of records, arrays, pointers,
> assignment statements, all that stuff.  But there was a paper from Xerox
> showing how a Euclid program could be *naturally* and *directly* viewed
> as a pure functional program.  ONE form of expression, but BOTH a
> "procedural" reading AND a "declarative" reading.  Heck, it's more than
> 30 years since (de?) Bakker showed how to view an Algol 60 "procedural"
> program as a lambda-calculus "declarative" formula.  For another example,
> you accepted SQL as a non-procedural form of expression, but I countered
> that I can "see" an imperative reading.
> 
> The point that I am making is that the distinction you are trying to make
> is _not_ a distinction that is present in the forms of expression themselves;
> the procedural and declarative views are ways that human beings _choose_ to
> read texts which are themselves neutral. 
> 
> I must stress that this is not a theoretical point.  There are a couple of
> companies making what seemed to me at the time "big bucks" taking COBOL
> programs, *viewing* them as declarative expressions, and emitting new COBOL
> code with the same _declarative_ meaning but different _procedural_ form
> and behaviour.  And there are people using functional languages to design
> VLSI circuits.

I probably should have said "some proportion of ..."

Even a C program has "non-procedural" elements, if you take look
at a function call such as printf as an atomic operation where
you do not specify how to do it, only what you want to be done.
In fact, "function call" pretty much says it all, doesn't it?

I was mostly referring to the "pure" non-procedural languages
that have *zero* procedural elements to them, and it's completely
up to the compiler/interpreter to figure it out.  Past that,
you have to give up some control of the procedure at some
level.

As far as SQL goes, it is a pure mathematical statement, with
no procedural element at all (assuming we are not talking about
cursors, etc.).  It simply describes a function on one or more
matrices.  In fact, joins are usually described in terms of
cartesian products, but we all know the implementation tries
to avoid the cartesian product at all costs.  There are entire
industries devoted to getting an SQL engine to do what *you* want
it to do, not what *it* wants to do. :)
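
As a rough sketch of that gap (made-up table layouts, plain C rather than
anything a real engine does): written out literally, the relational
definition of a join is a cartesian product plus a filter, which a nested
loop expresses directly, while an actual optimizer would use an index, a
sort-merge, or a hash table to avoid touching most of those pairs.

#include <stdio.h>

struct customer { int id;  const char *name; };   /* hypothetical tables */
struct order    { int cid; int amount; };

/* The join as literally defined: every (customer, order) pair is visited
   and filtered on the join condition.  Real SQL engines almost never
   execute a join this way. */
void join_naive(const struct customer *c, int nc,
                const struct order *o, int no)
{
    int i, j;
    for (i = 0; i < nc; i++)
        for (j = 0; j < no; j++)
            if (c[i].id == o[j].cid)
                printf("%s %d\n", c[i].name, o[j].amount);
}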

> >2. Algorithms can execute *only* using a time-based mechanism.
> 
> You are taking "we can only _tell_ that an algorithm has executed
> by contrasting a later situation with an earlier one" for a property of
> algorithms rather than a property of human perception and embedding in
> the world.  You may be right, but we can only notice *anything* by
> contrasting earlier and later situations (I am using "situations" in
> a Barwisian sense here).  That doesn't mean that it is the most fruitful
> way to think about everything.  One very important computing technique,
> after all, is converting time to space.  (What else is parallelism?)

I don't think I'm looking at in a perceptual sense; relativity tells
us that signals cannot propagate faster than the speed of light.
To execute an algorithm requires signals to move through space
and change energy states in some manner.  Since propagation is
required, time is required.

I look at parallelism as simply simultaneous signals flowing
through an algorithmic system.

> You are also not confronting your assumption that the most important
> thing to understand about an algorithm is its _execution_.
>
> >Now, as for education, I submit that at the beginning it is more
> >important to focus on what execution means 
> 
> I _really_ don't get you here.  The phrase is actually ambiguous.
> I don't understand either of the readings.
> 
> If you mean "it is important for students to understand what it is like
> when a machine executes a program", fine, we agree that it is important,
> but why is it the _most_ important thing?
> 
> Case in point:  I learned matrix algebra before I learned about machines.
> I know what the execution of a Gaussian elimination "means" because I
> know matrix algebra, not because I know machines (though I do know both).
> If I just knew machines, I wouldn't have the faintest notion what a
> Gaussian elimination meant. 

There may come a day when non-procedural languages can be used
for more than very specialized uses, but let's face it: the
"normal" procedural languages such as C are dominant at this
point in our computer history.  As such, I think it's very
important to give the student a grounding in this type of thinking.
This doesn't preclude them from learning more exotic types of
languages, but we should give them a foundation based on where
the state of the art is.

And there's a reason for the dominance of procedural languages
such as C.  They are simply the most efficient way (in an
execution sense) to express algorithms at this point.  There
may come a time when they are less important, but we are
far from that point, and we do a disservice to students when
we try and give them exotic thinking modes as the primary
outlook.  Teaching those modes secondarily is OK, to give a balanced
viewpoint, but the primary viewpoint should be procedural.

> >Certainly
> >you need some form of expression, but it should be kept as
> >simple as possible to get the student brain into the general
> >concept of a flow of execution without the distraction of
> >methodologies.
> 
> "Methodologies" is not a word that I use; it is pretentious.

I suppose it *is* right up there with "paradigm" (one of my
personal favorites).

> Do we teach children about the mechanics of speech before they
> learn to talk?

I think I am taking exactly the opposite viewpoint.  I want to
focus on the basics of execution. It sounds like you want to
teach them about the abstraction of phonemes and language
construction before they learn to talk.

> When I was taught to play the piano as a child, I was taught about
> forms of expression (how to read music), not about how to build a
> piano or what happens when you press a key.

Yes, but first you were taught how to move your fingers in
sequence and rhythm.  You were not taught the abstraction of
composition first.  Perhaps later, but not at first.

> When my wife bought a knitting machine as a teenager, she was taught
> how to read patterns, and how to control the machine, not how the
> machine is constructed or operates.

Exactly.  She focused on the practical aspects of taking the
pattern and executing the steps.  Same principle.

> When you are taught to drive a car, you are taught the rules of the
> road and a few heuristics for operating the machine.  You are not
> taught the theory of combustion; you are not shown the wiring diagram
> of the car.  Heck, people these days don't even learn what gears _are_,
> let alone how they work.
>
> I got a ham license without having to learn what actually goes on inside
> a valve or transistor.

Yes, but drivers don't "program" the car.  These both are more the
"user" model than the "programmer" model.

> What is so different about computing?

You make my point for me.  You argue above that teaching at
a high level of abstraction is good, kind of like having your
wife learn the theory of clothing design before she starts
to sew.  I say that starting with basics of execution is a
much better foundation to build on, because you don't overwhelm
them with the entire world of possibilities.

Limiting the view of the entire picture when you're first
learning something is not a bad thing; it's necessary in order
to give a "foothold" on the concepts, and to give the student
something to compare everything else to.

Just because you teach procedural concepts doesn't preclude
the student from learning non-procedural concepts.

> >In fact, the algorithm definition you posted above is a good
> >one.  Do you use that in your classes?  
> 
> As I don't teach first year, no.  It is, of course, Knuth, The Art
> of Computer Programming, Volume 1.

Erg; pretty heavy duty.  I always wondered if anyone used them
as textbooks.  A better or more complete textbook there couldn't
be, but they are, shall we say, difficult to approach. :)

> >Well, I have to admit that when I think about an algorithm,
> >I visualize data flowing through the algorithm undergoing
> >transformations until it's what I want.  I start with the
> >highest level "black box" transformations, and then gradually
> >break down each black box in greater and greater detail until
> >I have C code (or whatever).
> 
> >This is certainly not the only way to look at problems, but it
> >is the closest to the reality of how algorithms execute.  And
> >I submit that it is the most straightforward to understand.  I
> >admit, though, that I have a natural bias toward this point of
> >view.
> 
> Just at the moment I'm playing around with trying to speed up
> back-propagation neural nets.  A student here has NN programs
> that regularly take 4 hours.  So far I've worked on just one
> stage, but already that stage goes 5 times as fast.  (That's
> the practical ceiling for that stage.)  What's the trick?  To
> step *BACK* from the procedural view; with considerable effort
> to figure out that the bulk of the work goes on in what _should_
> have been two calls to SGEMV and one call to SGER (level-2 BLAS).
> (Well, actually I'm using APL notation, not calls to the BLAS.)
> Then you notice that there are a SAXPY and SDOT lying around that
> could be fused with the SGER and one of the SGEMVs (making this
> decision _solely_ on the timeless mathematics), and then you
> see that two loops can be fused making it possible to eliminate
> a vector store and reload, and before you know it, almost all of
> the branches have gone.  The bit I've looked at is one of the
> SGEMV steps, and the speed of that chunk has gone from 8 Mflops
> to 40 Mflops on the same machine.  (To start with, one branch
> per 50 instructions.)
>
> I am already aware of one bottleneck that makes it unlikely that
> the whole NN training phase will speed up that much.  (Anyone know
> a really fast way to compute a reasonable approximation to a logistic
> function?)
> 
> The point here is that people before me had already tried to write
> good code, but their minds were straitjacketed by thinking the way
> you want them to think.  I had to think at the (timeless) mathematical
> level to see how to reorganise this code. 

Well, I'm not familiar with the concepts in that much
detail, but come on; this is sort of a specialized application.
There are certainly times that a mathematical view can provide
insight into a particular problem.

But even your neural net is composed of stages, and in fact,
the vast majority of problems break down the way I outlined.

> >Hmm; I'm not sure if I view it the same way or not.  If I
> >visualize Quicksort, for example, I visualize a big block of data
> >that I divvy into two parts.  Each of those parts get divided,
> >and so on.  To map into the computer domain, I think about a
> >black box that does the divvying process, and then each piece
> >gets put back through the black box.  I suppose my core metaphor
> >is also spatial, but I'm not sure if it's the same way.
> 
> >The point is that I (at least) don't view Quicksort as a
> >static "hall of mirrors", everything reflecting back and forth
> >with no sense of execution.  I see a "movie" of the data breaking
> >down section by section, and I can see the pattern between
> >each break and generalize it.
> 
> Done much parallel programming yet?  You are describing an approach
> to understanding algorithms which makes parallel programming _very_
> difficult.

Not at all.  I can visualize parallel processing very easily.  I
simply have multiple transformations of data happening
simultaneously.  In fact, I usually visualize Quicksort
happening in parallel (although the implementation is linear,
of course. :) ).

Let's take the brain.  I submit my visualization of the brain
as a huge collection of "black box" neurons with signals
flowing between them is more accurate (and practical) than
a view of the brain in terms of mathematical formulas.  The
latter seems to me a complete failure if someone ever wants
to really understand what's going on.

-- Tim Behrendsen (tim@a-sis.com)





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-19  0:00                                       ` Tim Behrendsen
                                                           ` (5 preceding siblings ...)
  1996-09-11  0:00                                         ` Richard A. O'Keefe
@ 1996-09-18  0:00                                         ` Jon S Anthony
  1996-09-26  0:00                                         ` Jon S Anthony
  7 siblings, 0 replies; 688+ messages in thread
From: Jon S Anthony @ 1996-09-18  0:00 UTC (permalink / raw)



In article <51knhg$j61@dub-news-svc-8.compuserve.com> grs@liyorkrd.li.co.uk (George) writes:

> jsa@alexandria (Jon S Anthony) wrote:
> 
> > In article <01bb9f26$36c870e0$87ee6fce@timpent.a-sis.com> "Tim Behrendsen" <tim@airshields.com> writes:
> 
> > > It is similar to the difference between summation and integration;
> > > one consists of individual sums, the other of an infinite number
> > > of sums.  However, both are fundamentally adding.
> 
> > Well, that is one option.  But as "everyone" knows, the FTC allows you
> > to compute definite integrals without taking the limits of sums or
> > using summations at all.  Incidentally, none of the standard
> > definitions (Riemann Sum or something) uses "an infinite number of
> > sums".  Can't - infinity is not part of the real numbers...
> 
> 
> Surely the definition of integration contains the phrase "...tends to
> infinity", i.e. it's _as if_ there was an infinite number of sums.

Nothing so sloppy as that.  You can't have "real definitions" that
hand wave stuff like "as if", "tends to infinity", etc.  Also,
"integration" is an activity - the definitions in question involve the
notions of "definite integral" and "indefinite integral" of a function
(over some interval).  Integration then involves various techniques
for finding such things.  In _standard_ analysis the definitions for
definite integral (the sort of "summing stuff" first mentioned) all
involve the _formal_ notion of a limit (you remember that - all that
epsilon and delta stuff with universal and existential quantification).
Infinity (as in a completed concept, i.e., transfinite number) does
not enter into it.  If you are really interested, I can give you the
details in email - or you can just pull out that first year calculus
book...
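For reference, the usual limit-of-Riemann-sums statement of the definite
integral is, roughly,

    \int_a^b f(x)\,dx \;=\; \lim_{\|P\| \to 0} \sum_{i=1}^{n} f(x_i^*)\,\Delta x_i

where P ranges over partitions of [a,b], \|P\| is the width of the largest
subinterval, and the limit is the formal epsilon-delta one; no completed
infinity appears anywhere in it.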

/Jon
-- 
Jon Anthony
Organon Motives, Inc.
1 Williston Road, Suite 4
Belmont, MA 02178

617.484.3383
jsa@organon.com





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-18  0:00                                                   ` Tim Behrendsen
@ 1996-09-19  0:00                                                     ` Richard A. O'Keefe
  0 siblings, 0 replies; 688+ messages in thread
From: Richard A. O'Keefe @ 1996-09-19  0:00 UTC (permalink / raw)



"Tim Behrendsen" <tim@a-sis.com> writes:
>> Here you are making a claim to be able to compute non-computable
>> functions.  I don't believe you. 

>As I understood what you wrote, your optical computer was
>modulating a signal based on refraction index of a medium.  This
>sounds calculable to me, assuming we know how the refraction
>function of your crystal is set up.  We are getting a bit
>beyond my knowledge;  I don't know if you mean that the
>mathematics is unknown, or unknowable, but let me take both
>cases.

Neither.  "Computable" has a standard precise technical sense, which you
appear to be unaware of.

>Unknown: Just because a natural law is not understood, doesn't
>mean there is not an algorithm behind it.

For Ghu's sake, there are functions from the naturals to the naturals
which are not computable, let alone functions from nondenumerable spaces!

>Unknowable: There are certainly cases in physics where the
>underlying algorithm is not predictable; for example, the
>output of a geiger counter could not be simulated using pencil
>and paper (that we know of, anyway).  Now, this is a much more
>interesting question.  Is there an algorithm behind atomic
>decay?  There must be a *method*, even if it is not predictable
>(or known).  Your author's definition obviously wasn't designed
>to handle questions of this type; I think it comes down to
>opinion whether this fits a definition of algorithm.

You are confusing predictability with determinism.
Popper showed several decades ago that a Newtonian universe is not
*predictable* by any computing mechanism even though it is *deterministic*.
(To the best of my knowledge, Popper published this well before "chaos"
became a fashionable concept.)

The Universe *does* things which cannot be accurately modelled by
anything that has ever been proposed as a realistic physical model
of computation.  Weather is the most obvious case.

When you talk about about there being an "algorithm" or "method"
behind atomic decay, you enter the realms of science fiction.
(Wasn't it A.E.van Vogt who had 'electron psychology' in one of his
stories?)  Either I have completely misunderstood (which is likely,
because quantum mechanics was by far my worst subject) or the current
belief is that there are *no* hidden variables and there is *no*
"algorithm" or "method", quantum events just happen as often as their
amplitudes say they should.  (Mermin savaged Popper for failing to
distinguish between probabilities and amplitudes.  Rather a relief
to me, as I had never quite understood Popper's "propensities".)

If you are _right_ that everything in the real world including atomic
decay is procedural, then everything in modern physics is as wrong as
it can possibly be, not just incomplete but wrong-headed through and
through.

>Even a C program has "non-procedural" elements, if you take look
>at a function call such as printf as an atomic operation where
>you do not specify how to do it, only what you want to be done.
>In fact, "function call" pretty much says it all, doesn't it?

>I was mostly referring to the "pure" non-procedural languages
>that have *zero* procedural elements to it, and it's completely
>up to the compiler/interpreter to figure it out.

How often do I have to say it before you get the point?

    A.	THERE IS NO SUCH THING AS A FORM OF EXPRESSION FOR
	COMPUTABLE FUNCTIONS WHICH DOES NOT ADMIT A
	PROCEDURAL READING.

    B.	THERE IS SUCH THING AS A FORM OF EXPRESSION FOR
	COMPUTABLE FUNCTIONS WHICH DOES NOT ADMIT A
	NON-PROCEDURAL READING.

"Function" here is to be interpreted in a wide sense including mappings
from concurrent input streams to concurrent output streams ("functions
over histories").  The class of languages you are trying to talk about
simply does not exist.

>As far as SQL goes, it is a pure mathematical statement, with
>no procedural element at all (assuming we are not talking about
>cursors, etc.).

No, no, and no.  You are insisting that your reading of SQL is the
only possible correct one.  But it isn't.  I am allowed to read the
specification of a computable function any way I want as long as I
get the same answers.

The classical example of this is of course the lambda calculus.
There is a large body of work on alternative readings of the lambda
calculus.

>It simply describes a function on one or more
>matrices.

In exactly the same way, I can say of *any* program that it "simply
describes a function on" certain denotations.

>In fact, joins are usually described in terms of
>cartesian products, but we all know the implementation tries
>to avoid the cartesian product at all costs.

The readings a particular computer program applies to a formalism
place no upper bound on the set of readings the formalism admits,
not even on the set of *useful* readings.

>There may come a day when non-procedural languages can be used
>for more than very specialized uses, but let's face it: the
>"normal" procedural languages such as C are dominant at this
>point in our computer history.  As such, I think it's very
>important to give the student a grounding in this type of thinking.
>This doesn't preclude them from learning more exotic types of
>languages, but we should give them a foundation based on where
>the state of the art is.


There are so many differences between us packed into this one little
paragraph that I think it is time for me to pull out of this thread
after listing a few of them.  Recall that the context is teaching;
nobody disputs that computing professionals should _eventually_ know
a whole lot of things.

1.  "The imperative paradigm is dominant."

    I wonder whether it is _true_ that the imperative paradigm is
    dominant.  Does anybody know what proportion of work is done
    using spreadsheets and data base languages compared with the
    proportion of work done using imperative languages?  I am
    supervising a Masters student doing a project on data quality;
    this student has a lot of experience with software engineering
    and commands high consulting pay from *big* companies, but hasn't
    written an imperative program in *years*.

2.  "Whatever is, is right."
    I'm Dr O'Keefe, not Dr Pangloss.  I don't believe this.

3.  "The initial approach to _teaching_ a subject should be confined
    to what is currently fashionable in some blinkered view of 'the
    real world'."

    This sounds to me like a good recipe for ensuring
    that your students will be obsolete when they graduate.

4.  "The initial approach to _teaching_ a subject should be based on
    what yesterday's practitioners identified as the foundations of
    the subject."

    This proved a disaster when applied to the teaching
    of mathematics in schools.  Should we expect computing to be
    different?  (I am not saying that foundations can or should be
    ignored, just that one starts a five-year-old with "one, two,
    three", not with metamathematics or category theory.)

5.  "Non-procedural languages cannot be used for anything but
    specialised uses."

    Talk about blinkered.  Take a look at what you can do with Clean.
    It's perhaps the easiest language I've found for writing GUIs, and
    its performance is darned good.  Take a look at FoxNet 2.0 (TCP/IP
    in ML).  Take a look at Sisal.  Take a look at Erlang.  Let's see,
    that's GUI, TCP/IP, scientific computing, telecoms, distributed,
    all covered.

    I have to concede that if I were doing any 8051 programming I would
    use C or assembler.  Hmm.  Maybe.  It might be better engineering
    to design a declarative language tailored to the application and to
    write a compiler for it.  In an application like that you really
    *really* cannot afford the kinds of errors that imperative programming
    allows.

6.  "Declarative forms of expression are exotic."

    Alas, there is some truth in this.  When I and my peers entered
    University and started to learn about computing, we already had a
    good grounding in differential and integral calculus, elementary
    probability and statistics, vector and matrix algebra, and had
    met differential equations.  We had studied Euclidean geometry
    (the textbook was basically Euclid's Elements) and the notion of
    proof, including proof by cases and proof by induction.  To us,
    _discrete_ mathematics was exotic; the idea of step-by-step
    calculations in the computer was exotic.

    Alas, we in this department are not allowed to turn down prospective
    students simply because they didn't do mathematics at high school.
    (Or so I was told when I helped in the selection process.)
    And the ones who _did_ do it don't seem to recognise much of it when
    they see it again.  One of the scariest things I've heard in the
    last decade was a student complaining to me because a lecture had
    been too 'mathematical':  "I took CS because I'm no good at maths"
    and a large group of his friends agreeing with him.

    So it is unfortunately true that declarative forms of expression
    are exotic to a majority of our students.

    It is also unfortunately true that the high schools offer computing
    subjects, and that students sometimes take these instead of mathematics.
    Now, I know a high school teacher.  She is qualified as a librarian.
    A couple of months ago she was complaining about having to learn
    Italian because the school was going to have to teach it, and she'd
    been selected as one of the teachers.  Friends of the Italian language,
    are you quaking in your boots?  Are you weeping?  If they do that for
    Italian, in a state with a lot of fluent speakers of Italian, what do
    think they do for computing?

    So it is unfortunately true that procedural forms of expression
    are _familiar_ to a lot of our students.  "Unfortunately" because
    they think they understand it, which makes it harder for us to teach
    them.  Which brings me to the next point:

7.  "It is always best to teach using familiar concepts."

    One of the reasons why a number of Universities are using Miranda,
    or Haskell, or Orwell, or something of the sort, and certainly one
    of the reasons why we carefully selected a procedural language that
    was not currently being taught in the local schools (Ada) was that
    we'd like to get the students' attention.  And we simply don't get
    it as long as we are trying to teach them something they _think_
    they already know.  We did seriously consider something less like
    Pascal, but we got phone calls from high school careers people saying
    "if you go 'academic' we'll tell our students to stay away from you".
    No good doing the best thing if you go out of business, so we're
    doing a _good_ thing (Ada) and staying in business.

    The fact remains that one good educational reason for selecting a
    declarative language like Haskell is "listen up, we've got things
    to tell you that you don't already know".

8.  "How last decade's computers used to execute programs is the most
    important thing to understand, so it should be taught first."

    Well, there are a _lot_ of things we have to teach.  How to use the
    index in a textbook.  How to look things up in a library.  How to
    use a spelling checker, a word processor, electronic mail.  How to
    write comments (we need to put more effort into that).  How to write
    documentation for a computer program (they're doing better than they
    used to before we started teaching this explicitly, but not well
    enough yet) or project.  (We really need to fit in a remedial English
    course.  The Center for English Learning already offers such courses
    for overseas students, and the department pays for students who need
    them badly to attend them, but very very few of those who need them
    actually go.  In another year or two we'll have to make it compulsory
    for all students.)  How to name things.  Elementary discrete
    mathematics.  Elementary probability and statistics.  The idea of
    testing.  I could go on and on and on.

    The machine our students use is four-way superscalar, with 6
    functional units, longish pipes, and loads may be up to 30 cycles
    "in flight" before a datum arrives.  I don't know if it does out-
    of-order execution or not; if it doesn't, then a paper I was reading
    yesterday from NYU would call it an "early" RISC!  This is why I call
    the 'classical' procedural reading of an imperative program "how
    last decade's computers used to execute programs."  The vendor's C
    compiler automatically does function inlining, loop unrolling, and
    apparently will do modulo scheduling (though the option for that is
    not documented).  The order specified in a C program isn't even
    close to the order in which things actually happen in the hardware.

    So why is an abstraction which is NOT genuinely the way the students'
    computer works the FIRST thing they should learn?  Why is it more
    important than, oh, first order predicate calculus?  Or elementary
    probability?  Or how to write a specification?  Or, come to that, how
    to _read_ a specification?  (Such as an assignment.  Some of our
    students, even articulate ones, have real trouble with that.  If they
    do what my high-school teachers called "comprehension" exercises, it
    doesn't seem to help enough.)

    I put it to the few remaining readers of this thread that it is MUCH
    more urgent to learn how to communicate with other human beings
    about what is to be computed than it is to learn how last decade's
    machines used to work.


>And there's a reason for the dominance of procedural languages
>such as C.  They are simply the most efficient way (in an
>execution sense) to express algorithms at this point.

This hasn't been true for some time.  Sisal is perhaps the oldest
counter-example.

The reason for the continued dominance of imperative languages is
economics, nothing else.  It doesn't matter if declarative languages
have now reached the point where their compilers can routinely run
rings around C compilers (but not Fortran compilers); if the bulk
of programmers are trained in old-fashioned techniques and the bulk
of development tools support old-fashioned techniques, then the
conversion cost may be prohibitive.

I gave an example in my last posting where abandoning C, going back to
a declarative notation, and then writing a specialised code generator
gave me a speed-up of about 5 or 6 over the best that C could do.
It turns out that using the next generation of the hardware gives a
factor of 7 speedup.  (The specialised code speeds up too, but it doesn't
speed up as much.  Improving the scheduler should restore the relative
positions of the "derived-from-declarative" and C code.  I hope to have
that--still fairly crude--scheduler finished today.)

The point here is that even a factor of 5 or 6 speedup for abandoning
the procedural paradigm isn't a big enough reward for people and
organisations already mired in the old procedural mindset.  Wait a
couple of years, buy new hardware, and you get about the same speed,
*without* retraining or retooling.  "Incumbent advantage" can survive
major challenges.

If the most efficient execution were the main, or even a very important,
reason for programming language choice, who would use Java?  or Visual
Basic?  Or C?  We would all be using High Performance Fortran.

The things I like in Ada (generics, packages, static type checking without
too tight a straitjacket, &c) have very little to do with its procedural
side.  We didn't pick it as our first year language because it was faster
than C!  Although for a number of benchmarks that I've tried, it _is_.  So
if execution speed were the most important thing, we'd see everyone jumping
from C to Ada 95.  We don't see that.  Incumbent advantage.

>There
>may come a time when they are less important, but we are
>far from that point, and we do a disservice to students when
>we try and give them exotic thinking modes as the primary
>outlook.  Teaching those modes secondarily is OK, to give a balanced
>viewpoint, but the primary viewpoint should be procedural.

Now I begin to understand how Sun were able to release Java on the
world with a clear conscience.  A specification of what exactly the
security model is (not available), a specification of exactly what it
is that the byte code verifier has to do (not available), and a proof
that the latter ensures the former (not available) would have involved
"exotic thinking modes".  Better to ship a "secure" language implementation
with security holes and rely on net users to finish the debugging than to
"do disservice" to anyone by taking "exotic thinking modes" as primary.
All that counts is how it works, not what it is supposed to do, right?


			WELL I AM SICK OF IT!

You can rail at it as much as you like; you can call it exotic; you can
say it's a disservice; you can say that the real world is algorithmic
right down to the nuclei.  But the fact remains that the viewpoint you
advocate as primary is the one which has flooded us with buggy software.
(Java isn't even _close_ to being the worst offender.  I was unfair to
Sun.  Windows 95 is probably doing far more real damage.)

When our students have trouble writing a working program, it is not because
they don't understand how the computer works.  It's because they never sat
down and said clearly, even to themselves, what the program was *supposed*
to do.  We preach at them and preach at them "Write the comments FIRST",
and they keep on handing in uncommented stuff saying "I didn't have the
time to write the comments."  The students *HAVE* your primary view as
their primary view, and that's why they are bad programmers.  They *HAVE*
your primary view as their primary view, and that's why they are so
hopeless at debugging.  They will spend hours stepping through a program
rather than 10 minutes thinking about it.

Mind you, I'm talking about 1st and 2nd year students.  We put a *lot* of
effort into trying to teach software engineering, not just programming,
and by the time the students graduate they have improved somewhat.

>> Do we teach children about the mechanics of speech before they
>> learn to talk?

>I think I am taking exactly the opposite viewpoint.  I want to
>focus on the basics of execution. It sounds like you want to
>teach them about the abstraction of phonemes and language
>construction before they learn to talk.

This is like Hitler calling the Chief Rabbi a fascist.  Sigh.

SEQUENTIAL EXECUTION IS AN ABSTRACTION.

This rather remote abstraction (which is not true to the
way modern distributed systems with multi-processor superscalar out-of-order
executing pipelined machines actually work) is not basic.

"WHAT DO YOU *WANT* TO COMPUTE?"

is basic.  I don't know any theory of child language learning that
doesn't start from the child having something it wants to say.

>> When I was taught to play the piano as a child, I was taught about
>> forms of expression (how to read music), not about how to build a
>> piano or what happens when you press a key.

>Yes, but first you were taught how to move your fingers in
>sequence and rhythm.

Not so.  Nobody ever taught me how to move my fingers.

>You were not taught the abstraction of composition first.

Perhaps not, but this is a point to me, not a point to you.
Because composition is precisely the analogue of the sequential
execution abstraction.

>> When my wife bought a knitting machine as a teenager, she was taught
>> how to read patterns, and how to control the machine, not how the
>> machine is constructed or operates.

>Exactly.  She focused on the practical aspects of taking the
>pattern and executing the steps.  Same principle.

No, it's the *opposite* of your principle.  When you use a knitting
machine, you have little or no concern with the steps that the MACHINE
follows.  You lay the pattern out SPATIALLY.  The machine does it
temporally, but the human being sets it out SPATIALLY.

There are admittedly data-flow constraints on the order of setups,
but they too are understood in structural/geometric rather than
algorithmic terms.

>> I got a ham license without having to learn what actually goes on inside
>> a valve or transistor.

>Yes, but drivers don't "program" the car.  These both are more the
>"user" model than the "programmer" model.

I built my own ham equipment.  The radio amateur license exam tests your
knowledge of circuit and antenna theory.  Building your own equipment is
a hell of a lot closer to the "programmer" model than the "user" model.
My point is that, just as I didn't have to understand the device physics
underlying valves and transistors, only the mathematical "transducer"
model, in order to read, understand, and build an electronic circuit,
neither do programmers need to understand the low level mechanics of
computers in order to build programs.

>> What is so different about computing?

>You make my point for me.  You argue above that teaching at
>a high level of abstraction is good, 

This is exactly the opposite of what I said and meant.

The point that I am making is that you start from

	WHAT DO YOU WANT TO DO?

in all these areas, not

	HOW DOES THE MACHINE DO IT?

You have consistently argued that the fundamental which must be mastered
first is the sequential execution abstraction, which is concerned with
the secondary question "how does the machine do it".

Children learn to talk by having something they want to say.
They are explicitly taught the sequential execution abstraction
(grammar) much later.

People learn to drive by having somewhere they want to go
and wanting to learn how to make the car go there.
They are explicitly taught the execution abstract "the steering wheel
is connected to the rack and pinion which is connected to the steering
rods which are connected to the front wheels" (or whatever it is, I
don't really know, don't really care, and don't have any particular
reason why I _should_ know or care) afterwards, if at all.

People learn to play a musical instrument by having a tune that
they want to produce.
They learn how a piano actually _works_ much later, if at all.
(How many drummers need to understand the physics of vibrating
membranes?  Apart from Richard Feynman, that is.)

*How things work* is secondary to *what they are for*.

>Just because you teach procedural concepts doesn't preclude
>the student from learning non-procedural concepts.

Given that we and the students have only limited time,  the more time
we spend on abstractions like sequential execution the less time we
have to spend on fundamentals like requirements and specifications.

Just because we teach Ada doesn't mean the students are precluded from
learning Sisal or Haskell or RAISE or ...   The fact remains that they
*don't* learn these things.  (We do teach some Z.  But they learn it
*only* as a specification language.  Alas.)  Heck, just because we
teach computing doesn't preclude them learning quantum chromodynamics
either.  But that doesn't happen.

>> As I don't teach first year, no.  It is, of course, Knuth, The Art
>> of Computer Programming, Volume 1.

>Erg; pretty heavy duty.  I always wondered if anyone used them
>as textbooks.

Pretty heavy duty?  Well, the mathematics is a lot of fun.  It's closely
related to recreational mathematics after all.  I take it you are
referring to all that MIX assembly code as the "heavy duty" stuff.

>But even your neural net is composed of stages, and in fact,
>the vast majority of problems break down the way I outlined.

The neural net is composed of *layers*, not *stages*.
The defining concept is *data* flow, not *control* flow.
One way to wring another nice big constant factor out of the
computation, which I am not going to bother with, is to throw yet
another sequential imposition down the toilet, and run several training
instances through the net in parallel.  This can be done in two ways,
both of which should be followed up:  folded into the code so that we're
doing level-3 BLAS rather than level-2 BLAS, which would make better use
of the cache, and true real-world parallelism with multiple CPUs crunching
at the same time.
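
A rough sketch of what "running several training instances through the net
in parallel" means (made-up layer sizes and names, plain C rather than BLAS
calls): the per-instance forward pass of a layer is a matrix-vector
product, and stacking the instances as columns of a matrix turns the whole
batch into a single matrix-matrix product, which is the level-2 to level-3
move described above.

/* Forward pass of one layer for a whole batch at once: Y = W * X, where
   each column of X is one training instance.  Doing the batch as a single
   matrix-matrix product (level-3 BLAS style) instead of one matrix-vector
   product per instance (level-2 style) lets each row of W be reused across
   the whole batch while it is still in cache. */
void forward_batch(const float *W, const float *X, float *Y,
                   int out, int in, int batch)
{
    int i, b, k;
    for (i = 0; i < out; i++)
        for (b = 0; b < batch; b++) {
            float s = 0.0f;
            for (k = 0; k < in; k++)
                s += W[i * in + k] * X[k * batch + b];
            Y[i * batch + b] = s;
        }
}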

The really important thing here is not the particular problem or the
details, but that the kind of thinking I need for this problem is the
kind of thinking you need to write parallel code.  Allowing myself to
be hypnotised by one-step-at-a-time thinking, or how C does things,
would have been *fatal*.
	
>> Done much parallel programming yet?  You are describing an approach
>> to understanding algorithms which makes parallel programming _very_
>> difficult.

>Not at all.  I can visual parallel processing very easily.

I didn't ask whether you can _visualise_ parallel processing,
but whether you have *done* much parallel programming.
Since the only "parallel" operation you mention is quicksort,
I suspect that the answer is "no".

>Let's take the brain.  I submit my visualization of the brain
>as a huge collection of "black box" neurons with signals
>flowing between them is more accurate (and practical) than
>a view of the brain in terms of mathematical formulas.

A fuzzy mental "visualisation" with no specification as to _how_
the signals are transformed is more accurate (and practical)?
Surely you are living on the other side of the looking glass.

>The latter seems to me a complete failure if someone ever wants
>to really understand what's going on.

For your information, people *do* model neurons by mathematical formulas.
In fact, it wasn't until the formulas fitted actual neural behaviour
reasonably well that people believed they understood what was going on
inside neurons.

And of course "neural nets" *are* mathematical formulas.


Enough.  It has taken me about an hour and a half to write this,
and while it has been good to let off steam (I am still outraged by
the idea that teaching people to say what their programs are supposed
to do, which is a declarative notion, is an "exotic mode of thinking"
and a "disservice") it probably won't accomplish much else.

The thread is going in my kill-file.
-- 
Australian citizen since 14 August 1996.  *Now* I can vote the xxxs out!
Richard A. O'Keefe; http://www.cs.rmit.edu.au/%7Eok; RMIT Comp.Sci.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-17  0:00                                             ` George
@ 1996-09-19  0:00                                               ` Tim Behrendsen
  1996-09-24  0:00                                                 ` Matthew M. Lih
  1996-09-26  0:00                                               ` Jon S Anthony
  1996-09-28  0:00                                               ` Jon S Anthony
  2 siblings, 1 reply; 688+ messages in thread
From: Tim Behrendsen @ 1996-09-19  0:00 UTC (permalink / raw)



George <grs@liyorkrd.li.co.uk> wrote in article
<51knkn$j61@dub-news-svc-8.compuserve.com>...
> "Tim Behrendsen" <tim@airshields.com> wrote:
> 
> > You prove my point that programmers take the procedural nature
> > of the computer as so obvious as to be beneath discussion, but
> > it's not.  I cannot stress this enough: THIS MUST BE LEARNED BY
> > STUDENTS.  This is the primary, fundamental axiom of computers.
> 
> > How many questions do we get in this newsgroup where a student
> > simply didn't follow the flow of the program to see what happens?
> > This is so obvious to you and I that we don't think about it,
> > but *they didn't*!  Because they have only a vague feeling of
> > flow, and are still looking at the program as a kind of
> > weird combination of a solid object and something with a flow
> > of time.
> 
> 
> Just to get away from the pointless semantic argument; are you really
> suggesting that students don't understand something this basic?
> If this is the effect of teaching OOA/OOP and similar mumbo jumbo,
> then maybe it's time we got back to BASICs; no way could they
> fail to understand. 
> 
> What do they actually think happens inside a computer *magic*????

They don't think *anything*.  Think about the fresh-faced newbie on
his/her first day in CS 101.  Their only experience with computers,
if they have any at all, is interacting with them at the user
level.  They press a button, something happens.  There's obviously
a mechanism behind it, but they don't have any concept of how it
works.

Way back in the thread where this all started, one of the first
points was my complaint about the huge number of graduates that
come to me and are completely unable to take a problem that they
have never seen before and generate a solution.  I chalked this
up to students being unable to "think like a programmer", and to
their only memorizing lists of algorithms that they can cough up
on a test without really understanding them.

> > Take recursion.  How can you not understand recursion if you
> > understand in your soul that computers execute a flow of
> > instructions?  You can't, and that's the point.  Understanding
> > the time axis is the key.
> 
> They should never have been allowed to get this far without realizing
> that.

Well, I agree.  Unfortunately, computer science today is not
focused on thinking, it's focused on packing as many abstractions
as possible into a student's head to protect them from the
evils of "what's really going on."

-- Tim Behrendsen (tim@a-sis.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-17  0:00                                           ` George
@ 1996-09-24  0:00                                             ` Joel VanLaven
  1996-09-27  0:00                                               ` Dann Corbit
  1996-09-27  0:00                                               ` Tom Payne
  0 siblings, 2 replies; 688+ messages in thread
From: Joel VanLaven @ 1996-09-24  0:00 UTC (permalink / raw)



George (grs@liyorkrd.li.co.uk) wrote:
: jsa@alexandria (Jon S Anthony) wrote:

: > In article <01bb9f26$36c870e0$87ee6fce@timpent.a-sis.com> "Tim Behrendsen" <tim@airshields.com> writes:

: > > It is similiar to the difference between summation and integration;
: > > one consists of individual sums, the other of an infinite number
: > > of sums.  However, both are fundamentally adding.

: > Well, that is one option.  But as "everyone" knows, the FTC allows you
: > to compute definite integrals without taking the limits of sums or
: > using summations at all.  Incidentally, none of the standard
: > definitions (Riemann Sum or something) uses "an infinite number of
: > sums".  Can't - infinity is not part of the real numbers...


: Surely the definition of integration contains the phrase "...tends to
: infinity", i.e. it's _as if_ there was an infinite number of sums.

: G.

  Actually, not all functions are integrable.  The most complete and
irrefutable definition of the integral of a function to my knowledge is :

Given a partition P of [a,b] 
          (P is a finite subset of [a,b] including a and b)
P={x0,x1,x2,...xn} such that a=x0, b=xn, and x(j+1)>xj
call Mj the lub(f([x(j-1),xj])) (least upper bound)
call mj the glb(f([x(j-1),xj])) (greatest lower bound)

The number Uf(P)=SUM(Mj(xj-x(j-1))) 1<=j<=n 
is called the P upper sum for f

The number Lf(P)=SUM(mj(xj-x(j-1))) 1<=j<=n
is called the P lower sum for f

The unique number I that satisfies the inequality
Lf(P)<=I<=Uf(P) for all possible P of [a,b]
is called the definite integral of f from a to b.

So, you could talk about an infinite (aleph 2)! number of sums, but it
is ridiculous.  We are talking about more uncountable than uncountable.
No one integrates this way.  It is a mathematical abstraction.  This is
the definition that we then prove is equivalent to some simpler method
for certain "nice" functions.  Riemann sums only work for uniformly
continuous functions.  Even then we don't actually USE the Riemann sums!
We use them only as ways to let us use even better methods.  In
mathematics there are REAL infinities, but a good mathematician NEVER
actually calculates an infinite anything, as that is impossible!  They
PROVE that their answer is correct through techniques that remove
infiniteness, like induction or certain properties of the real numbers
like completeness.  While a hand-wavy infinite-sums explanation usually
satisfies the non-mathematicians (like, say, engineers), the truth is at
once more complicated, simple, and beautiful.
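
To make the abstraction concrete, here is a throwaway C sketch
(illustrative only, not part of the definition) that computes the P
upper and lower sums on a uniform partition for a monotone increasing
f; for f(x) = x*x on [0,1] both close in on 1/3 as n grows:

#include <stdio.h>

static double f(double x) { return x * x; }  /* increasing on [0,1] */

int main(void)
{
    double a = 0.0, b = 1.0;
    int n = 1000;                /* number of subintervals in P */
    double h = (b - a) / n;
    double upper = 0.0, lower = 0.0;

    /* For an increasing f, the lub on [x(j-1),xj] is f(xj) and the
       glb is f(x(j-1)), so the two sums fall straight out. */
    for (int j = 1; j <= n; j++) {
        double x0 = a + (j - 1) * h;
        double x1 = a + j * h;
        upper += f(x1) * h;      /* Mj * (xj - x(j-1)) */
        lower += f(x0) * h;      /* mj * (xj - x(j-1)) */
    }
    printf("lower = %f  upper = %f\n", lower, upper);  /* both near 1/3 */
    return 0;
}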

-- A Mathematician

-- 
-- Joel VanLaven




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-19  0:00                                               ` Tim Behrendsen
@ 1996-09-24  0:00                                                 ` Matthew M. Lih
  1996-09-25  0:00                                                   ` Richard A. O'Keefe
  1996-09-25  0:00                                                   ` Bjarne Stroustrup
  0 siblings, 2 replies; 688+ messages in thread
From: Matthew M. Lih @ 1996-09-24  0:00 UTC (permalink / raw)



Hope you don't mind if I interject an experience.

Tim Behrendsen wrote:

> > What do they actually think happens inside a computer *magic*????

In some cases, yes!

> They don't think *anything*.  Think about the fresh-faced newbie on
> his/her first day in CS 101.  Their only experience with computers,
> if they have any at all, is interacting with them at the user
> level.  They press a button, something happens.  There's obviously
> a mechanism behind it, but they don't have any concept of how it
> works.

Even some veterans don't have any idea. I'm taking
a C++ class, and the instructor didn't have any idea
that the "++" operator was developed because it
corresponded to a very quick machine language
instruction in the old PDP machines. When I pointed
this out, his comment was along the lines of "Oh,
you hardware types."

I mention this not to rag on my instructor (who
*is* knowledgeable and good), but to point out that
we have successfully insulated the users from the
machine, which was the intent. Unfortunately, a lot
of times we *do* need to know what's happening behind
the scenes. I was rather disillusioned when I figured
out that in order to be a successful database programmer
I really had to know how the DBMS worked in order to
develop useful software.


Matthew M. Lih
Software Lead
TRW Enterprise Solutions




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-24  0:00                                                 ` Matthew M. Lih
@ 1996-09-25  0:00                                                   ` Richard A. O'Keefe
  1996-09-26  0:00                                                     ` Mark Wooding
  1996-09-25  0:00                                                   ` Bjarne Stroustrup
  1 sibling, 1 reply; 688+ messages in thread
From: Richard A. O'Keefe @ 1996-09-25  0:00 UTC (permalink / raw)



"Matthew M. Lih" <matt.lih@trw.com> writes:

>Even some veterans don't have any idea. I'm taking
>a C++ class, and the instructor didn't have any idea
>that the "++" operator was developed because it
>corresponded to a very quick machine language
>instruction in the old PDP machines.

This isn't true.
C inherited ++ from B, which was an *interpreted* language.
The updated operators += -= and so on were copied from Algol 68
(where they were spelled +:= -:= and so on), and the designers
of Algol 68 paid very little attention to hardware issues.
On at least some models of PDP-11, the INC instruction was *not*
quicker than ADD, just shorter.

>When I pointed
>this out, his comment was along the lines of "Oh,
>you hardware types."

He *should* have said "oh, you superstitious types!
Dennis Ritchie has been trying to kill this lie for _years_."

>I mention this not to rag on my instructor (who
>*is* knowledgable and good), but to point out that
>we have successfully insulated the users from the
>machine, which was intended.

Your instructor even has the right gaps in his knowledge:
he didn't know something that wasn't true.

C has *not* insulated the programmers from the machine.
One look at the type system will tell you *that*.
If you want to insulate the programmers from the machine,
try Lisp, or Ada.

>I was rather disillusioned when I figured
>out that in order to be a successful database programmer
>I really had to know how the DBMS worked in order to
>develop useful software.

Can anyone please supply the source and correct form of this
quotation:
	"When anyone tells me 'I want to have a programming language
	in which I have merely to say what I want done' I give him a
	lollipop."

-- 
Australian citizen since 14 August 1996.  *Now* I can vote the xxxs out!
Richard A. O'Keefe; http://www.cs.rmit.edu.au/%7Eok; RMIT Comp.Sci.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-24  0:00                                                 ` Matthew M. Lih
  1996-09-25  0:00                                                   ` Richard A. O'Keefe
@ 1996-09-25  0:00                                                   ` Bjarne Stroustrup
  1996-09-26  0:00                                                     ` Bengt Richter
  1996-09-28  0:00                                                     ` Dan Pop
  1 sibling, 2 replies; 688+ messages in thread
From: Bjarne Stroustrup @ 1996-09-25  0:00 UTC (permalink / raw)




"Matthew M. Lih" <matt.lih@trw.com> writes:

 > Hope you don't mind if I interject an experience.
 > 
 > Tim Behrendsen wrote:
 > 
 > > > What do they actually think happens inside a computer *magic*????
 > 
 > In some cases, yes!
 > 
 > > They don't think *anything*.  Think about the fresh-faced newbie on
 > > his/her first day in CS 101.  Their only experience with computers,
 > > if they have any at all, is interacting with them at the user
 > > level.  They press a button, something happens.  There's obviously
 > > a mechanism behind it, but they don't have any concept of how it
 > > works.
 > 
 > Even some veterans don't have any idea. I'm taking
 > a C++ class, and the instructor didn't have any idea
 > that the "++" operator was developed because it
 > corresponded to a very quick machine language
 > instruction in the old PDP machines. When I pointed
 > this out, his comment was along the lines of "Oh,
 > you hardware types."

Actually, the story that ++ comes from the PDP11 instruction
set is a myth. Dennis Ritchie has denied it quite often, but
that doesn't seem to impress people. ++ is in C and C++ because
Dennis (being a mathematician) considered it a fundamental
(and useful) operation. It was in the PDP11 instruction set
because the designers at DEC independently had figured out
that it was an important operation to optimize.

	- Bjarne







^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-25  0:00                                                   ` Bjarne Stroustrup
@ 1996-09-26  0:00                                                     ` Bengt Richter
  1996-09-28  0:00                                                     ` Dan Pop
  1 sibling, 0 replies; 688+ messages in thread
From: Bengt Richter @ 1996-09-26  0:00 UTC (permalink / raw)



bs@research.att.com (Bjarne Stroustrup) wrote:


>"Matthew M. Lih" <matt.lih@trw.com> writes:

> > Hope you don't mind if I interject an experience.
> > 
> > Tim Behrendsen wrote:
> > 
> > > > What do they actually think happens inside a computer *magic*????
> > 
> > In some cases, yes!
> > 
> > > They don't think *anything*.  Think about the fresh-faced newbie on
> > > his/her first day in CS 101.  Their only experience with computers,
> > > if they have any at all, is interacting with them at the user
> > > level.  They press a button, something happens.  There's obviously
> > > a mechanism behind it, but they don't have any concept of how it
> > > works.
> > 
> > Even some veterans don't have any idea. I'm taking
> > a C++ class, and the instructor didn't have any idea
> > that the "++" operator was developed because it
> > corresponded to a very quick machine language
> > instruction in the old PDP machines. When I pointed
> > this out, his comment was along the lines of "Oh,
> > you hardware types."

>Actually, the story that ++ comes from the PDP11 instruction
>set is a myth. Dennis Ritchie has denied it quite often, but
>that doesn't seem to impress people. ++ is in C and C++ because
>Dennis (being a mathematician) considered it a fundamental
>(and useful) operation. It was in the PDP11 instruction set
>because the designers at DEC independently had figured out
>that it was an important operation to optimize.

>	- Bjarne

You are certainly the one to correct me if I'm wrong, but
to me the most important aspect of ++ (and --) is not
that i++ increments and --k decrements, but that
i++ evaluates to i and has the side effect i+=1,
whereas --k evaluates to k-1 and has the side effect k-=1,
allowing C statements such as a[i++] = b[--k]; to have
special useful meaning.

The really strikingly similar thing about the PDP-11
architecture was not instructions per se, but
particular addressing modes for memory-referencing
instructions, namely post-incremented register indexing
and pre-decremented register indexing, providing
a practically direct correspondence to a[i++] or
b[--k].

If memory serves, PDP-11 machine language syntax
used a single '+' suffix to indicate a post-incremented
register index addressing mode, applicable to many
memory-referencing instructions. A single prefixed '-'
signified the corresponding pre-decrement indexing mode.
E.g., something like  mov r0,(r1)+  or  mov r0,-(r1).

Instructions implementing a C statement such as i++; or
--k; don't reflect the pre/post semantics that are
important in  a[i++] = b[--k]; and the corresponding
addressing modes of the PDP-11.
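
A trivial C illustration of that pairing (a made-up toy example, not
anyone's actual code): copying one array into another in reverse order.

#include <stdio.h>

int main(void)
{
    int b[] = { 1, 2, 3, 4, 5 };
    int a[5];
    int i = 0;
    int k = 5;                     /* one past the last element of b */

    /* a[i++] uses i and then bumps it; b[--k] drops k first and then
       uses it: the post-increment / pre-decrement pairing above. */
    while (i < 5)
        a[i++] = b[--k];

    for (i = 0; i < 5; i++)
        printf("%d ", a[i]);       /* prints 5 4 3 2 1 */
    printf("\n");
    return 0;
}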

My 2 cents ;-)
Regards,
Bengt Richter
(We had the 2nd PDP-11/45 on the West Coast, I believe :-)
(With 4k added bipolar 300ns memory to keep your feet warm in winter).





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-25  0:00                                                   ` Richard A. O'Keefe
@ 1996-09-26  0:00                                                     ` Mark Wooding
  0 siblings, 0 replies; 688+ messages in thread
From: Mark Wooding @ 1996-09-26  0:00 UTC (permalink / raw)



Richard A. O'Keefe <ok@goanna.cs.rmit.edu.au> wrote:

> Can anyone please supply the source and correct form of this
> quotation:

`When someone says, ``I want a programming language in which I need only
say what I wish done,'' give him a lollipop.'
		-- Alan Perlis, Epigrams on Programming (1982)

(Good ol' Don Knuth -- always gives his sources.  That one is on
page 365 of the TeXbook.)

[Well off topic now -- followups to poster.]
-- 
[mdw]

`When our backs are against the wall, we shall turn and fight.'
		-- John Major





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-08-19  0:00                                       ` Tim Behrendsen
                                                           ` (6 preceding siblings ...)
  1996-09-18  0:00                                         ` Jon S Anthony
@ 1996-09-26  0:00                                         ` Jon S Anthony
  1996-10-01  0:00                                           ` Andrew Gierth
  7 siblings, 1 reply; 688+ messages in thread
From: Jon S Anthony @ 1996-09-26  0:00 UTC (permalink / raw)



In article <1996Sep24.133312.9745@ocsystems.com> jvl@ocsystems.com (Joel VanLaven) writes:

> George (grs@liyorkrd.li.co.uk) wrote:
> : Surely the definition of integration contains the phrase "...tends to
> : infinity", i.e. it's _as if_ there was an infinite number of sums.
> 
> : G.

>   Actually, not all functions are integrable.  The most complete and
> irrefutable definition of the integral of a function to my knowledge is :
> 
> Given a partition P of [a,b] 
>           (P is a finite subset of [a,b] including a and b)
> P={x0,x1,x2,...xn} such that a=x0, b=xn, and x(j+1)>xj
> call Mj the lub(f([x(j-1),xj]) (least upper bound)
> call mj the glb(f([x(j-1),xj]) (greatest lower bound)
> 
> The number Uf(P)=SUM(Mj(xj-x(j-1))) 1<=j<=n 
> is called the P upper sum for f
> 
> The number Lf(P)=SUM(mj(xj-x(j-1))) 1<=j<=n
> is called the P lower sum for f
> 
> The unique number I that satisfies the inequality
> Lf(P)<=I<=Uf(P) for all possible P of [a,b]
> is called the definite integral of f from a to b.

*IF* it exists.  If f is not "nice" it won't.  When it does, it is
also the LUB of the set of all lower sums and the GLB of the set of all
the upper sums.  Actually, I've always been partial to the
upper-and-lower sums definition for the definite integral.  However,
it is _not_ any more "complete and irrefutable" than Riemann Sums.


> So, you could talk about an infinite (aleph 2)! number of sums, but it

Actually, when people use the side-ways "8" notation, the typical
intent is simply that of indicating "arbitrarily large" or (somewhat
less so) denumerably infinite (size of naturals - aka Aleph0).  Your
"Aleph2" is a denotation for the size of sets the size of the power
set of R (the reals) where Aleph1 denotes the size of sets
equinumerous with R.  In general you have this whole "backbone" of
transfinite numbers constructed (typically) via the power set
operation.


> is ridiculus.  We are talking about more uncountable that uncountable.
> No one integrates this way.  It is a mathematical abstraction. This is

Well, Archimedes _did_! :-) (Just one of the reasons he is often
regarded among the top 3 mathematicians of all time).  But you only
need countably infinite sums.


> the definition that we then prove is equivalent to some simpler method
> for certain "nice" functions.  Riemann sums only work for uniformly
> continuous functions.

The RS definition is _equivalent_ to the ULS definition.  This is
reasonably straightforward to prove, but it is definitely NON-trivial.


> mathematics there are REAL infinities but a good mathematician NEVER
> actually calculates an infinite anything, as that is impossible! they

Depends on what you mean by "calculate".  If you mean "construct"
(ala' Poincare or Intuitionism) then what you say is accurate.  OTOH,
letting P be the power set operation, then P(R) is a perfectly nice
entity (IMO) and the cardinality (size) of P(R) is a perfectly nice
transfinite number.


> While a hand-wavy infinite sums explanation usually
> satisfies the non-mathematicians (like say engineers), the truth is at
> once more complicated, simple, and beautiful.

Hey, you really are a mathematician! :-)


/Jon
-- 
Jon Anthony
Organon Motives, Inc.
1 Williston Road, Suite 4
Belmont, MA 02178

617.484.3383
jsa@organon.com





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-17  0:00                                             ` George
  1996-09-19  0:00                                               ` Tim Behrendsen
@ 1996-09-26  0:00                                               ` Jon S Anthony
  1996-09-26  0:00                                                 ` Dann Corbit
                                                                   ` (2 more replies)
  1996-09-28  0:00                                               ` Jon S Anthony
  2 siblings, 3 replies; 688+ messages in thread
From: Jon S Anthony @ 1996-09-26  0:00 UTC (permalink / raw)



In article <Dy9KEL.6nD@research.att.com> bs@research.att.com (Bjarne Stroustrup) writes:

> Actually, the story that ++ comes from the PDP11 instruction
> set is a myth. Dennis Ritchie has denied it quite often, but
> that doesn't seem to impress people. ++ is in C and C++ because
> Dennis (being a mathematician) considered it a fundamental
          ^^^^^^^^^^^^^^^^^^^^^
> (and useful) operation. It was in the PDP11 instruction set

I don't see how that has anything to do with it.  I'm a mathematician
and I don't see it as a particularly interesting or useful operation
to be singled out for special status (I'm not talking about the machine
level here...)

/Jon

-- 
Jon Anthony
Organon Motives, Inc.
1 Williston Road, Suite 4
Belmont, MA 02178

617.484.3383
jsa@organon.com





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-26  0:00                                               ` Jon S Anthony
@ 1996-09-26  0:00                                                 ` Dann Corbit
  1996-09-27  0:00                                                 ` Craig Franck
  1996-09-27  0:00                                                 ` Jay Martin
  2 siblings, 0 replies; 688+ messages in thread
From: Dann Corbit @ 1996-09-26  0:00 UTC (permalink / raw)



It's the successor to the object.
Dennis probably loved to play the peano.
(I'm not really a mathematician,
my degree is in Numerical Analysis)
;-)
-- 
"I speak for myself and all of the lawyers of the world"
If I say something dumb, then they will have to sue themselves.


Jon S Anthony <jsa@alexandria> wrote in article
<JSA.96Sep26151720@alexandria>...
> In article <Dy9KEL.6nD@research.att.com> bs@research.att.com (Bjarne
Stroustrup) writes:
> 
> > Actually, the story that ++ comes from the PDP11 instruction
> > set is a myth. Dennis Ritchie has denied it quite often, but
> > that doesn't seem to impress people. ++ is in C and C++ because
> > Dennis (being a mathematician) considered it a fundamental
>           ^^^^^^^^^^^^^^^^^^^^^
> > (and useful) operation. It was in the PDP11 instruction set
> 
> I don't see how that has anything to do with it.  I'm a mathematician
> and I don't see it as a particularly interesting or useful operation
> to be singled out for special status (I'm not talking about the machine
> level here...)
> 
> /Jon
> 
> -- 
> Jon Anthony
> Organon Motives, Inc.
> 1 Williston Road, Suite 4
> Belmont, MA 02178
> 
> 617.484.3383
> jsa@organon.com
> 
> 




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-26  0:00                                               ` Jon S Anthony
  1996-09-26  0:00                                                 ` Dann Corbit
@ 1996-09-27  0:00                                                 ` Craig Franck
  1996-09-27  0:00                                                   ` Bob Cousins
  1996-09-27  0:00                                                 ` Jay Martin
  2 siblings, 1 reply; 688+ messages in thread
From: Craig Franck @ 1996-09-27  0:00 UTC (permalink / raw)



jsa@alexandria (Jon S Anthony) wrote:
>In article <Dy9KEL.6nD@research.att.com> bs@research.att.com (Bjarne Stroustrup) writes:
>
>> Actually, the story that ++ comes from the PDP11 instruction
>> set is a myth. Dennis Ritchie has denied it quite often, but
>> that doesn't seem to impress people. ++ is in C and C++ because
>> Dennis (being a mathematician) considered it a fundamental
>          ^^^^^^^^^^^^^^^^^^^^^
>> (and useful) operation. It was in the PDP11 instruction set
>
>I don't see how that has anything to do with it.  I'm a mathematician
>and I don't see it as a particularly interesting or useful operation
>to be singled out for special status (I'm not talking about the machine
>level here...)

Well, I'm not a mathematician, but I have noticed that a large 
number of them have a fascination with number theory. Maybe the
addition of integers seems "fundamental". I hope I do not need to
mention that c++; blows the doors off c = c + 1; nor shall I begin
to discuss the obvious inferiority of c := c + 1; :-)


-- 
Craig
clfranck@worldnet.att.net
Manchester, NH
"What Java revolution?"  -- Bjarne Stroustrup






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-27  0:00                                                 ` Craig Franck
@ 1996-09-27  0:00                                                   ` Bob Cousins
  0 siblings, 0 replies; 688+ messages in thread
From: Bob Cousins @ 1996-09-27  0:00 UTC (permalink / raw)



In comp.lang.c, Craig Franck wrote:

>jsa@alexandria (Jon S Anthony) wrote:
>>In article <Dy9KEL.6nD@research.att.com> bs@research.att.com (Bjarne Stroustrup) writes:
>>
>>> Actually, the story that ++ comes from the PDP11 instruction
>>> set is a myth. Dennis Ritchie has denied it quite often, but
>>> that doesn't seem to impress people. ++ is in C and C++ because
>>> Dennis (being a mathematician) considered it a fundamental
>>          ^^^^^^^^^^^^^^^^^^^^^
>>> (and useful) operation. It was in the PDP11 instruction set
>>
>>I don't see how that has anything to do with it.  I'm a mathematician
>>and I don't see it as a particularly interesting or useful operation
>>to be singled out for special status (I'm not talking about the machine
>>level here...)
>
>Well, I'm not a mathematician, but I have noticed that a large 
>number of them have a fascination with numbers theory. Maybe the
>addition of integers seems "fundamental". I hope I do not need to
>mention that c++; blows the doors off c = c + 1; nor shall I begin
>to discuss the obvious inferiority of c := c + 1; :-)

I don't think a mathematician would write c = c + 1 either.

-- 
Bob Cousins, Software Engineer.
Home page at http://www.demon.co.uk/sirius-cybernetics/

Please do NOT use my email address on mailing lists without my prior permission.




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-26  0:00                                               ` Jon S Anthony
  1996-09-26  0:00                                                 ` Dann Corbit
  1996-09-27  0:00                                                 ` Craig Franck
@ 1996-09-27  0:00                                                 ` Jay Martin
  1996-09-27  0:00                                                   ` Kent Budge
  1996-09-27  0:00                                                   ` Tim Behrendsen
  2 siblings, 2 replies; 688+ messages in thread
From: Jay Martin @ 1996-09-27  0:00 UTC (permalink / raw)



jsa@alexandria (Jon S Anthony) writes:

>In article <Dy9KEL.6nD@research.att.com> bs@research.att.com (Bjarne Stroustrup) writes:

>> Actually, the story that ++ comes from the PDP11 instruction
>> set is a myth. Dennis Ritchie has denied it quite often, but
>> that doesn't seem to impress people. ++ is in C and C++ because
>> Dennis (being a mathematician) considered it a fundamental
>          ^^^^^^^^^^^^^^^^^^^^^
>> (and useful) operation. It was in the PDP11 instruction set

>I don't see how that has anything to do with it.  I'm a mathematician
>and I don't see it as a particularly interesting or useful operation
>to be singled out for special status (I'm not talking about the machine
>level here...)

I don't see any mathematical justification for it either; maybe Mr
Ritchie should publish a paper on the fundamental nature of "++" to the
foundations of mathematical thought. It seems incredible to me that Mr
Ritchie had never seen an "increment" assembly instruction or that the
inclusion of 10+ REDUNDANT and side-effect producing operators was not
motivated by some low-level performance concern/"too lazy to write an
optimizing compiler"  or an anti-software engineering desire to
minimize keystrokes on some primitive input device.   If he did think
he was doing mathematics, then I would say that he is an even poorer
mathematician than he is a language designer.






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-27  0:00                                                 ` Jay Martin
  1996-09-27  0:00                                                   ` Kent Budge
@ 1996-09-27  0:00                                                   ` Tim Behrendsen
  1996-09-30  0:00                                                     ` Art Schwarz
  1 sibling, 1 reply; 688+ messages in thread
From: Tim Behrendsen @ 1996-09-27  0:00 UTC (permalink / raw)



Jay Martin <jmartin@cs.ucla.edu> wrote in article <52g7f6$1fv0@uni.library.ucla.edu>...
> jsa@alexandria (Jon S Anthony) writes:
> 
> >In article <Dy9KEL.6nD@research.att.com> bs@research.att.com (Bjarne Stroustrup)
writes:
> 
> >> Actually, the story that ++ comes from the PDP11 instruction
> >> set is a myth. Dennis Ritchie has denied it quite often, but
> >> that doesn't seem to impress people. ++ is in C and C++ because
> >> Dennis (being a mathematician) considered it a fundamental
> >          ^^^^^^^^^^^^^^^^^^^^^
> >> (and useful) operation. It was in the PDP11 instruction set
> 
> >I don't see how that has anything to do with it.  I'm a mathematician
> >and I don't see it as a particularly interesting or useful operation
> >to be singled out for special status (I'm not talking about the machine
> >level here...)
> 
> I don't see any mathematical justification for it either, maybe Mr
> Richie should publish a paper on the fundamental nature of "++" to the
> foundations of mathematical thought. It seems incredible to me that Mr
> Ritchie had never seen an "increment" assembly instruction or that the
> inclusion of 10+ REDUNDANT and side-effect producing operators was not
> motivated by some low-level performance concern/"too lazy to write an
> optimizing compiler"  or an anti-software engineering desire to
> minimize keystrokes on some primitive input device.   If he did think
> he was doing mathematics, then I would say that he is an even poorer
> mathematician than he is a language designer.

I don't know about mathematical thought, but if you find it
difficult to imagine why increment and decrement are useful,
perhaps you need to practice a bit more programming before making
criticisms.

-- Tim Behrendsen (tim@a-sis.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-27  0:00                                                 ` Jay Martin
@ 1996-09-27  0:00                                                   ` Kent Budge
  1996-09-27  0:00                                                     ` George Haddad
                                                                       ` (4 more replies)
  1996-09-27  0:00                                                   ` Tim Behrendsen
  1 sibling, 5 replies; 688+ messages in thread
From: Kent Budge @ 1996-09-27  0:00 UTC (permalink / raw)



Jay Martin wrote:
...
> I don't see any mathematical justification for it either, maybe Mr
> Richie should publish a paper on the fundamental nature of "++" to the
> foundations of mathematical thought. It seems incredible to me that Mr
> Ritchie had never seen an "increment" assembly instruction or that the
> inclusion of 10+ REDUNDANT and side-effect producing operators was not
> motivated by some low-level performance concern/"too lazy to write an
> optimizing compiler"  or an anti-software engineering desire to
> minimize keystrokes on some primitive input device.   If he did think
> he was doing mathematics, then I would say that he is an even poorer
> mathematician than he is a language designer.

Sorry to see this degenerating into a flame war.  Hope I'm not putting
more gasoline on the fire ...

The mathematical concept of "successor to i", which corresponds quite
closely to the C notation "i++", has been a fundamental part of 
mathematics (in the guise of typographical number theory) since at 
least the time of Hilbert.

I agree that the correspondence between the ++ operator and the 
autoincrement address mode on the old PDP is sufficiently close to
make it surprising that Ritchie didn't get his inspiration in this way.
But if he says he didn't (why should he lie?) and if his background
is as a mathematician, then I accept that the inspiration was the
concept of "successor to i" in typographical number theory.

Incidentally (to go off on a completely different tangent) I really
miss the ol' PDP-11 ... the only assembly language I've used that was 
really a pleasure to work in.  Sure, it was 16 bit and the floating
point and memory management were a kludgy add-on ... but the basic
integer instruction set and addressing modes were beautifully simple
and simply beautiful.   

kgbudge@sandia.gov
(usual disclaimer)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-24  0:00                                             ` Joel VanLaven
  1996-09-27  0:00                                               ` Dann Corbit
@ 1996-09-27  0:00                                               ` Tom Payne
  1996-09-28  0:00                                                 ` Tim Behrendsen
  1 sibling, 1 reply; 688+ messages in thread
From: Tom Payne @ 1996-09-27  0:00 UTC (permalink / raw)



In comp.lang.c++ Joel VanLaven <jvl@ocsystems.com> wrote:
[...]
:   Actually, not all functions are integrable.  The most complete and
: irrefutable definition of the integral of a function to my knowledge is :

: Given a partition P of [a,b] 
:           (P is a finite subset of [a,b] including a and b)
: P={x0,x1,x2,...xn} such that a=x0, b=xn, and x(j+1)>xj
: call Mj the lub(f([x(j-1),xj]) (least upper bound)
: call mj the glb(f([x(j-1),xj]) (greatest lower bound)

: The number Uf(P)=SUM(Mj(xj-x(j-1))) 1<=j<=n 
: is called the P upper sum for f

: The number Lf(P)=SUM(mj(xj-x(j-1))) 1<=j<=n
: is called the P lower sum for f

: The unique number I that satisfies the inequality
  ^^^^^^^^^^^^^^^^^^^
   if it exists

: Lf(P)<=I<=Uf(P) for all possible P of [a,b]
: is called the definite integral of f from a to b.

Actually, there is a significant generalization, called Lebesgue
integration, that can be found in any graduate text on Real Analysis
(e.g., those by Royden and by Rudin).  Nevertheless, if you believe in
the axiom of choice, there are still functions that are not integrable
(but they get pretty weird).

Tom Payne




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-27  0:00                                                     ` George Haddad
@ 1996-09-27  0:00                                                       ` George Haddad
  0 siblings, 0 replies; 688+ messages in thread
From: George Haddad @ 1996-09-27  0:00 UTC (permalink / raw)



Disclaimer:  It isn't much of a mail program.

   Sorry for repeating my post excessive numbers of times.  (I suppose 
there are some of you out there who considered the first one 
"excessive".  :-))
-- 
I found these opinions on my doorstep, would you please give them a good 
home?




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-24  0:00                                             ` Joel VanLaven
@ 1996-09-27  0:00                                               ` Dann Corbit
  1996-09-27  0:00                                               ` Tom Payne
  1 sibling, 0 replies; 688+ messages in thread
From: Dann Corbit @ 1996-09-27  0:00 UTC (permalink / raw)




Joel VanLaven <jvl@ocsystems.com> wrote in article
<1996Sep24.133312.9745@ocsystems.com>...
>   Actually, not all functions are integrable.  The most complete and
> irrefutable definition of the integral of a function to my knowledge is :
> 
> Given a partition P of [a,b] 
>           (P is a finite subset of [a,b] including a and b)
> P={x0,x1,x2,...xn} such that a=x0, b=xn, and x(j+1)>xj
> call Mj the lub(f([x(j-1),xj]) (least upper bound)
> call mj the glb(f([x(j-1),xj]) (greatest lower bound)
> 
> The number Uf(P)=SUM(Mj(xj-x(j-1))) 1<=j<=n 
> is called the P upper sum for f
> 
> The number Lf(P)=SUM(mj(xj-x(j-1))) 1<=j<=n
> is called the P lower sum for f
> 
> The unique number I that satisfies the inequality
> Lf(P)<=I<=Uf(P) for all possible P of [a,b]
> is called the definite integral of f from a to b.
There are far simpler examples of functions that
are not integrable.
Consider f(x) = 1/x over [-1,1]
(it is not continuous, and the integral diverges).

Or consider the integral of e^x over [0, infinity]:
there is no finite value to the integral of this smooth,
continuous function.  There is, however, a finite value
for e^(-x*x) over the same interval, namely sqrt(pi)/2.

> So, you could talk about an infinite (aleph 2)! number of sums, but it
> is ridiculus.  We are talking about more uncountable that uncountable.
> No one integrates this way.  It is a mathematical abstraction. This is
> the definition that we then prove is equivalent to some simpler method
> for certain "nice" functions.  Riemann sums only work for uniformly
> continuous functions.  Even then we don't actually USE the Riemann sums!
> We use them only as ways to let us use even better methods.  In the
> mathematics there are REAL infinities but a good mathematician NEVER
> actually calculates an infinite anything, as that is impossible! they 
> PROVE that thier answer is correct through techniques that remove
> infiniteness like induction or certain properties of the real numbers
> like completeness.  While a hand-wavy infinite sums explanation usually
> satisfies the non-mathematicians (like say engineers), the truth is at
> once more complicated, simple, and beautiful.

Often, infinite integrals can be converted to simple
finite integrals with a change of variables. 
"Numerical Recipes in C" discusses several solutions
to "improper" integrals.
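
A quick C sketch of one such change of variables (an illustration only,
not one of the book's routines): map [0, infinity) onto (0,1) with
x = t/(1-t), so dx = dt/(1-t)^2, and apply a crude midpoint rule to
e^(-x*x); the result lands near sqrt(pi)/2 = 0.886...

#include <stdio.h>
#include <math.h>

static double f(double x) { return exp(-x * x); }

int main(void)
{
    int n = 100000;
    double sum = 0.0;

    /* With x = t/(1-t), dx = dt/(1-t)^2 and t runs over (0,1),
       so the infinite integral becomes a finite one we can chop up. */
    for (int j = 0; j < n; j++) {
        double t = (j + 0.5) / n;          /* midpoint of slice j */
        double x = t / (1.0 - t);
        sum += f(x) / ((1.0 - t) * (1.0 - t)) / n;
    }
    printf("integral ~ %f  (sqrt(pi)/2 = %f)\n", sum, sqrt(acos(-1.0)) / 2.0);
    return 0;
}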

> -- A Mathematician
> is correct
> 
> If the the most REAL :) definition of integration (as I was taught in
> advanced calculus) is:
> -- 
> -- Joel VanLaven
> 
-- 
"I speak for myself and all of the lawyers of the world"
If I say something dumb, then they will have to sue themselves.






^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-27  0:00                                                   ` Kent Budge
  1996-09-27  0:00                                                     ` George Haddad
@ 1996-09-27  0:00                                                     ` George Haddad
  1996-09-28  0:00                                                       ` Matthew Heaney
  1996-09-27  0:00                                                     ` George Haddad
                                                                       ` (2 subsequent siblings)
  4 siblings, 1 reply; 688+ messages in thread
From: George Haddad @ 1996-09-27  0:00 UTC (permalink / raw)



Kent Budge wrote:
> The mathematical concept of "successor to i", which corresponds quite
> closely to the C notation "i++", has been a fundamental part of
> mathematics (in the guise of typographical number theory) since at
> least the time of Hilbert.

   Disclaimer:  I am not any kind of mathematician.

   Observation: Ada models the notion of the "successor of i" much more 
closely (IMHO) with T'SUCC(i).  AFAIK, "the successor of i" does not 
imply that i _changes_ as a result.  Of course, if you are referring to 
i as used in sigma and pi notation (i.e., summations and product 
sequences) then one might think of i as "taking on the value of its 
successor".
-- 
I found these opinions on my doorstep, would you please give them a good 
home?




^ permalink raw reply	[flat|nested] 688+ messages in thread


* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-27  0:00                                                   ` Kent Budge
@ 1996-09-27  0:00                                                     ` George Haddad
  1996-09-27  0:00                                                       ` George Haddad
  1996-09-27  0:00                                                     ` George Haddad
                                                                       ` (3 subsequent siblings)
  4 siblings, 1 reply; 688+ messages in thread
From: George Haddad @ 1996-09-27  0:00 UTC (permalink / raw)



Kent Budge wrote:
> The mathematical concept of "successor to i", which corresponds quite
> closely to the C notation "i++", has been a fundamental part of
> mathematics (in the guise of typographical number theory) since at
> least the time of Hilbert.

   Disclaimer:  I am not any kind of mathematician.

   Observation: Ada models the notion of the "successor of i" much more 
closely (IMHO) with T'SUCC(i).  AFAIK, "the successor of i" does not 
imply that i _changes_ as a result.  Of course, if you are referring to 
i as used in sigma and pi notation (i.e., summations and product 
sequences) then one might think of i as "taking on the value of its 
successor".
-- 
I found these opinions on my doorstep, would you please give them a good 
home?




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-17  0:00                                             ` George
  1996-09-19  0:00                                               ` Tim Behrendsen
  1996-09-26  0:00                                               ` Jon S Anthony
@ 1996-09-28  0:00                                               ` Jon S Anthony
  2 siblings, 0 replies; 688+ messages in thread
From: Jon S Anthony @ 1996-09-28  0:00 UTC (permalink / raw)



In article <324b5c87.3413989@news.demon.co.uk> bob@lintilla.demon.co.uk (Bob Cousins) writes:

> 
> I don't think a mathematician would write c = c + 1 either.

Depends.  If C is an infinite cardinal (say Aleph0) they would.  If
not, then it is an obviously _false_ statement, and one would hope you
are correct.

/Jon
-- 
Jon Anthony
Organon Motives, Inc.
1 Williston Road, Suite 4
Belmont, MA 02178

617.484.3383
jsa@organon.com





^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-27  0:00                                                   ` Kent Budge
                                                                       ` (3 preceding siblings ...)
  1996-09-27  0:00                                                     ` George Haddad
@ 1996-09-28  0:00                                                     ` Steve Heller
  1996-10-01  0:00                                                       ` DJ Kindberg
  4 siblings, 1 reply; 688+ messages in thread
From: Steve Heller @ 1996-09-28  0:00 UTC (permalink / raw)



Kent Budge <kgbudge@sandia.gov> wrote:

>Incidentally (to go off on a completely different tangent) I really
>miss the ol' PDP-11 ... the only assembly language I've used that was 
>really a pleasure to work in.  Sure, it was 16 bit and the floating
>point and memory management were a kludgy add-on ... but the basic
>integer instruction set and addressing modes were beautifully simple
>and simply beautiful.   
  I liked the PDP-11 instruction set too, as well as the PDP-10 set.
By the way, have you ever seen the 6809? I think it may have been even
prettier than either of the PDP sets mentioned above, as it made
virtually perfect use of a small set of registers. Of course, the
68000 and its successors look a lot like a 32-bit PDP-11 with separate
address and data registers, and are also quite pleasant to program.









^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-27  0:00                                               ` Tom Payne
@ 1996-09-28  0:00                                                 ` Tim Behrendsen
  0 siblings, 0 replies; 688+ messages in thread
From: Tim Behrendsen @ 1996-09-28  0:00 UTC (permalink / raw)



Tom Payne <thp@cs.ucr.edu> wrote in article <52gvu3$jhb@rumors.ucr.edu>...
> In comp.lang.c++ Joel VanLaven <jvl@ocsystems.com> wrote:
> [...]
> :   Actually, not all functions are integrable.  The most complete and
> : irrefutable definition of the integral of a function to my knowledge is
:
> 
> : Given a partition P of [a,b] 
> :           (P is a finite subset of [a,b] including a and b)
> : P={x0,x1,x2,...xn} such that a=x0, b=xn, and x(j+1)>xj
> : call Mj the lub(f([x(j-1),xj]) (least upper bound)
> : call mj the glb(f([x(j-1),xj]) (greatest lower bound)
> 
> : The number Uf(P)=SUM(Mj(xj-x(j-1))) 1<=j<=n 
> : is called the P upper sum for f
> 
> : The number Lf(P)=SUM(mj(xj-x(j-1))) 1<=j<=n
> : is called the P lower sum for f
> 
> : The unique number I that satisfies the inequality
>   ^^^^^^^^^^^^^^^^^^^
>    if it exists
> 
> : Lf(P)<=I<=Uf(P) for all possible P of [a,b]
> : is called the definite integral of f from a to b.
> 
> Actually, there is a significant generalization, called Lebesgue
> integration, that can be found in any graduate text on Real Analysis
> (e.g., those by Royden and by Rudin).  Nevertheless, if you believe in
> the axiom of choice, there are still functions that are not integrable
> (but they get pretty weird).

Somewhat off the subject (but way off the subject of comp.lang.*), there
is a site, http://www.integrals.com (sponsored by Mathematica?), that allows
you to type in an integral and it will try to evaluate it.  If it
can't do it, either it's not integrable or their algorithm doesn't
support it (they ask you to e-mail them if you think it can be done).
I managed to break it right off by

   integral X^X dx

Which I *think* is integrable (at least I remember figuring it out
in college, but that was a long time ago and I may have done it
wrong. :-) )

-- Tim Behrendsen (tim@a-sis.com)




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-25  0:00                                                   ` Bjarne Stroustrup
  1996-09-26  0:00                                                     ` Bengt Richter
@ 1996-09-28  0:00                                                     ` Dan Pop
  1 sibling, 0 replies; 688+ messages in thread
From: Dan Pop @ 1996-09-28  0:00 UTC (permalink / raw)



In <Dy9KEL.6nD@research.att.com> bs@research.att.com (Bjarne Stroustrup) writes:

>"Matthew M. Lih" <matt.lih@trw.com> writes:
> > 
> > Even some veterans don't have any idea. I'm taking
> > a C++ class, and the instructor didn't have any idea
> > that the "++" operator was developed because it
> > corresponded to a very quick machine language
> > instruction in the old PDP machines. When I pointed
> > this out, his comment was along the lines of "Oh,
> > you hardware types."
>
>Actually, the story that ++ comes from the PDP11 instruction
>set is a myth. Dennis Ritchie has denied it quite often, but
>that doesn't seem to impress people. ++ is in C and C++ because
>Dennis (being a mathematician) considered it a fundamental
>(and useful) operation. It was in the PDP11 instruction set
>because the designers at DEC independently had figured out
>that it was an important operation to optimize.
>
>	- Bjarne

Actually, according to Dennis, ++ is in C because it was inherited from
B :-) and the credit for it has to be given to Ken Thompson.  Quoting
from Ritchie himself:

# C was developed on the PDP-11; most of it aside from the type
# structure and associated syntax came from B, which was developed on
# the PDP-7.  B already had the ++ and -- operators (and the associated
# idioms like *p++).  The first B compiler produced interpreted
# (threaded) code.  Doubtless the invention of ++ and -- (due to
# Thompson) was suggested by the auto-increment cells of the PDP-7, or
# perhaps the one such cell on the PDP-8, or perhaps some of the more
# recondite indirection modes on the GE 635/645, or the
# count-increment-refill mechanism of the Stretch.  In any event, neither
# B nor the first C compiler generated -7 or -11 instructions that used
# autoincrement.
# 
# C was less influenced by the PDP-11 than most people think.  Certainly
# the addition of types was motivated by the desire to take advantage
# of byte addressing and the (future) existence of floating point
# (indeed, C compiled 11/45 floating point instructions before
# the delivery of any 11/45s; it was an annoyance when DEC changed
# its mind about what opcodes to use.)
# 
# Discounting general things like that, the only strong PDP-11isms I can
# think of in C are the signed characters and about 50% of the
# justification for the widening rules for floats.  (The other 50% is
# simplification of the language rules and libraries).
# 
#       Dennis Ritchie
#       research!dmr
#       dmr@research.att.com


--
Dan Pop
CERN, CN Division
Email: Dan.Pop@cern.ch 
Mail:  CERN - PPE, Bat. 31 R-004, CH-1211 Geneve 23, Switzerland




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-27  0:00                                                     ` George Haddad
@ 1996-09-28  0:00                                                       ` Matthew Heaney
  0 siblings, 0 replies; 688+ messages in thread
From: Matthew Heaney @ 1996-09-28  0:00 UTC (permalink / raw)



In article <324C2C77.1B72@lmco.com>, george.haddad@lmco.com wrote:

>   Observation: Ada models the notion of the "successor of i" much more 
>closely (IMHO) with T'SUCC(i).  AFAIK, "the successor of i" does not 
>imply that i _changes_ as a result.  Of course, if you are referring to 
>i as used in sigma and pi notation (i.e., summations and product 
>sequences) then one might think of i as "taking on the value of its 
>successor".

You've identified the fact that there are really 2 issues we're debating:
the concept of a value and the concept of assignment of a value to an
object.

The concept of "successor to i" is modeled very elegantly in Ada by T'Succ
(i), but people seem to be having a problem with how one assigns that
value.

The abstract world of mathematics is all very theoretically interesting,
but in the end, we still have to write real programs on very concrete
computers.  So the pure, theoretical model that mathematics gives us is of
course the goal, but in this world, on this computer, I might not reach
that goal exactly.

So while the statement

 i := t'succ(i);

might make the mathematician happy, it doesn't seem to sit well with many
real-world programmers, who would seem to prefer

increment (i);

This is an area where one may justifiably criticize Ada (and many have), in
that it's *too* abstract.
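
As a rough sketch of what such an "increment" could look like in Ada (the
generic and every name in it are illustrative assumptions on my part, not
anything from a standard library):

   --  Spec: works for any discrete type.
   generic
      type T is (<>);
   procedure Increment (Item : in out T);

   --  Body: the assignment the caller no longer has to spell out.
   procedure Increment (Item : in out T) is
   begin
      Item := T'Succ (Item);
   end Increment;

   --  At a call site:
   --     procedure Inc is new Increment (Integer);
   --     ...
   --     Inc (I);  --  reads as "increment I"; raises Constraint_Error at T'Last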

For example, I've often seen something on paper that uses pure functions,
but when I have to code it, I would use procedures, for no other reason
than efficiency.  Consider a set:

   S = {1, 2, 3}

Now to add an element to S, on paper, I would say

   S <- S union 4

If I did this exactly in Ada:

   S : Set;
...
   S := Union_Of (S, 4);

this would be horribly inefficient, because I'd have to make a copy of S,
then add 4 to that copy, and assign the whole thing back to S.   But the
procedure

   Add (4, To => S);

conveys exactly what I want to do, which is to add an element to an
already-existing set.  So the paper-based solution isn't always directly
applicable to what must be done on this computer; it's more a model for
what must be done.
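
To make the copy-versus-update point concrete, here is a toy bounded set
(the package, its names, and its representation are made-up illustrations,
not any real library); Union_Of copies the whole set, Add touches it in
place:

   package Toy_Sets is
      subtype Element is Natural range 0 .. 100;
      type Set is private;
      Empty : constant Set;
      function  Union_Of (S : Set; E : Element) return Set;  --  "on paper" style
      procedure Add (E : Element; To : in out Set);           --  in-place style
   private
      type Set is array (Element) of Boolean;
      Empty : constant Set := (others => False);
   end Toy_Sets;

   package body Toy_Sets is
      function Union_Of (S : Set; E : Element) return Set is
         Result : Set := S;           --  copies the whole representation
      begin
         Result (E) := True;
         return Result;
      end Union_Of;

      procedure Add (E : Element; To : in out Set) is
      begin
         To (E) := True;              --  updates one component, no copy
      end Add;
   end Toy_Sets;

With something along those lines, both S := Union_Of (S, 4); and
Add (4, To => S); compile, and only the second avoids copying the set.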

So when the C programmer says

i++;

he is saying "increment the object i," which I feel is better than

i = i + 1;

because it more clearly captures his intent, in spite of the fact that the
2nd method is more pure.

matt

--------------------------------------------------------------------
Matthew Heaney
Software Development Consultant
mheaney@ni.net
(818) 985-1271




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-27  0:00                                                   ` Tim Behrendsen
@ 1996-09-30  0:00                                                     ` Art Schwarz
  0 siblings, 0 replies; 688+ messages in thread
From: Art Schwarz @ 1996-09-30  0:00 UTC (permalink / raw)



(I hate to enter the fray.)

During the time of initial construction of the C compilers, not much was
available with respect to optimization. As near as I remember, there was
peephole optimization (McKeeman, Wortman, et alia), various articles by
(for example) Cocke, a (rather good) book by Schwartz, and little else.
The seminal article by Knuth (Software: Practice and Experience, Vol. 1,
Issue 1 - I believe) on his observation of typical Fortran expression usage
and the first article on source-level optimization (Software: Practice and
Experience, Vol ?) had yet to be written.

I can't speak for any of the developers; however, given the state of the
art in compiling at that time, it does not appear far-fetched to consider
the various 'condensed' operators a cheap way of doing optimization: an
aid to the compiler, given the compiling art of the day.

On the other hand, there are probably any number of people who were initially
on the C project who could answer the whys and wherefores. Maybe one of them
could be asked to contribute.

art schwarz

(My opinions are someone else's.)

In article <01bbac7f$23e42940$87ee6fce@timpent.a-sis.com>, "Tim Behrendsen" <tim@a-sis.com> writes:
>Jay Martin <jmartin@cs.ucla.edu> wrote in article <52g7f6$1fv0@uni.library.ucla.edu>...
>> jsa@alexandria (Jon S Anthony) writes:
>> 
>> >In article <Dy9KEL.6nD@research.att.com> bs@research.att.com (Bjarne Stroustrup)
>writes:
>> 
>> >> Actually, the story that ++ comes from the PDP11 instruction
>> >> set is a myth. Dennis Ritchie has denied it quite often, but
>> >> that doesn't seem to impress people. ++ is in C and C++ because
>> >> Dennis (being a mathematician) considered it a fundamental
>> >          ^^^^^^^^^^^^^^^^^^^^^
>> >> (and useful) operation. It was in the PDP11 instruction set
>> 
>> >I don't see how that has anything to do with it.  I'm a mathematician
>> >and I don't see it as a particularly interesting or useful operation
>> >to be singled out for special status (I'm not talking about the machine
>> >level here...)
>> 
>> I don't see any mathematical justification for it either, maybe Mr
>> Ritchie should publish a paper on the fundamental nature of "++" to the
>> foundations of mathematical thought. It seems incredible to me that Mr
>> Ritchie had never seen an "increment" assembly instruction or that the
>> inclusion of 10+ REDUNDANT and side-effect producing operators was not
>> motivated by some low-level performance concern/"too lazy to write an
>> optimizing compiler"  or an anti-software engineering desire to
>> minimize keystrokes on some primitive input device.   If he did think
>> he was doing mathematics, then I would say that he is an even poorer
>> mathematician than he is a language designer.
>
>I don't know about mathematical thought, but if you find it
>difficult to imagine why increment and decrement are useful,
>perhaps you need to practice a bit more programming before making
>criticisms.
>
>-- Tim Behrendsen (tim@a-sis.com)







^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-26  0:00                                         ` Jon S Anthony
@ 1996-10-01  0:00                                           ` Andrew Gierth
  0 siblings, 0 replies; 688+ messages in thread
From: Andrew Gierth @ 1996-10-01  0:00 UTC (permalink / raw)



>>>>> "Craig" == Craig Franck <clfranck@worldnet.att.net> writes:

 > "Spencer M. Simpson, Jr." <mapwiz@erols.com> wrote:
 >> Adam Beneschan wrote:
 >>> 
 >>> jsa@alexandria (Jon S Anthony) writes:
 >>> >
 >>> >Actually, when people use the side-ways "8" notation,...
 >>> 
 >>> As long as we're getting pedantic:...
 >> As long as we're getting pedantic, shouldn't this thread have a 
 >> different name?

 Craig> The problem is that you never know what a thread will evolve into!
 Craig> Certain threads exist as a kind of dumping ground for idle musings.
 Craig> I remember pondering "If god exists, is he procedural?". Also, notice
 Craig> that it has already undergone a name change.

It's also been going on for over 2 months now, and it's *still* off-topic
in comp.unix.programmer. The References: headers got so big that several
people's newsreaders have trashed them, and various additional groups
have migrated in and out of the Newsgroups: line. At least the religious
wars that it used to contain have died down a little (or possibly migrated
elsewhere). Ah well, such is Usenet.

-- 
Andrew Gierth (andrewg@microlise.co.uk)

"Ceterum censeo Microsoftam delendam esse" - Alain Knaff in nanam




^ permalink raw reply	[flat|nested] 688+ messages in thread

* Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
  1996-09-28  0:00                                                     ` Steve Heller
@ 1996-10-01  0:00                                                       ` DJ Kindberg
  0 siblings, 0 replies; 688+ messages in thread
From: DJ Kindberg @ 1996-10-01  0:00 UTC (permalink / raw)



Off on a tangent, yes. I miss the ol' PDP-11, too.

I've written many programs using 6809 assembler. But alas, the past is
gone.

Darren

:   I liked the PDP-11 instruction set too, as well as the PDP-10 set.
: By the way, have you ever seen the 6809? I think it may have been even
: prettier than either of the PDP sets mentioned above, as it made
: virtually perfect use of a small set of registers. Of course, the
: 68000 and its successors look a lot like a 32-bit PDP-11 with separate
: address and data registers, and is also quite pleasant to program.




^ permalink raw reply	[flat|nested] 688+ messages in thread

end of thread, other threads:[~1996-10-01  0:00 UTC | newest]

Thread overview: 688+ messages (download: mbox.gz / follow: Atom feed)
-- links below jump to the message on this page --
     [not found] <sperlman-0507961717550001@p121.ezo.net>
     [not found] ` <4rs76l$aqd@ccshst05.uoguelph.ca>
1996-07-15  0:00   ` Should I learn C or Pascal? Ralph Silverman
1996-07-15  0:00     ` Steve Sobol
1996-07-16  0:00     ` Lee Crites
1996-07-17  0:00       ` David Verschoore
1996-07-17  0:00         ` Anthony Kanner
1996-07-17  0:00         ` Mark McKinney
1996-07-19  0:00           ` Philip Brashear
1996-07-23  0:00             ` John A Hughes
1996-07-26  0:00               ` Randy Kaelber
1996-07-29  0:00                 ` Ralph Silverman
1996-08-06  0:00                 ` StHeller
1996-07-20  0:00         ` TRAN PHAN ANH
1996-07-20  0:00           ` Mark Eissler
1996-07-25  0:00             ` Erik Seaberg
1996-07-26  0:00             ` What's the best language to start with? [was: Re: Should I learn C or Pascal?] Tim Behrendsen
1996-07-27  0:00               ` Rick Elbers
1996-07-28  0:00                 ` Mark Eissler
1996-07-28  0:00                 ` J. Christian Blanchette
1996-07-28  0:00                   ` Robert Dewar
1996-07-29  0:00                   ` Tim Behrendsen
1996-07-30  0:00                     ` Arra Avakian
1996-07-31  0:00                       ` James Youngman
1996-07-31  0:00                       ` Stephen M O'Shaughnessy
1996-08-02  0:00                       ` Tim Behrendsen
1996-08-05  0:00                         ` Henrik Wetterstrom
1996-08-05  0:00                         ` Fergus Henderson
1996-08-06  0:00                           ` Tim Behrendsen
1996-08-06  0:00                             ` Fergus Henderson
1996-08-07  0:00                               ` Tim Behrendsen
1996-08-08  0:00                                 ` Thomas Hood
1996-08-09  0:00                                   ` Tim Behrendsen
1996-08-17  0:00                                 ` Lawrence Kirby
1996-08-17  0:00                                   ` Tim Behrendsen
1996-08-19  0:00                                     ` Bob Gilbert
1996-08-19  0:00                                       ` Tim Behrendsen
1996-08-19  0:00                                         ` Tim Hollebeek
1996-08-20  0:00                                           ` Tim Behrendsen
1996-08-20  0:00                                         ` Bob Gilbert
1996-08-21  0:00                                           ` Tim Behrendsen
1996-08-22  0:00                                             ` Bob Gilbert
1996-08-22  0:00                                               ` Tim Behrendsen
1996-09-04  0:00                                             ` Lawrence Kirby
1996-09-04  0:00                                               ` Tim Behrendsen
1996-09-06  0:00                                                 ` Bob Gilbert
1996-09-06  0:00                                                   ` Tim Behrendsen
1996-09-09  0:00                                                     ` Bob Gilbert
1996-09-11  0:00                                                       ` Tim Behrendsen
1996-09-10  0:00                                                   ` Richard A. O'Keefe
1996-09-10  0:00                                                     ` Kaz Kylheku
1996-09-11  0:00                                                     ` Bob Gilbert
1996-09-10  0:00                                                   ` Jon S Anthony
1996-09-11  0:00                                                     ` Richard A. O'Keefe
1996-09-11  0:00                                                   ` Jon S Anthony
1996-09-11  0:00                                                   ` Jon S Anthony
1996-09-05  0:00                                               ` Mark Wooding
1996-09-06  0:00                                                 ` Bob Cousins
1996-09-06  0:00                                                   ` Tim Behrendsen
1996-09-07  0:00                                                     ` Craig Franck
1996-09-08  0:00                                                       ` Tim Behrendsen
1996-09-08  0:00                                                         ` Craig Franck
1996-09-09  0:00                                                           ` Tim Behrendsen
1996-09-10  0:00                                                             ` Richard A. O'Keefe
1996-09-10  0:00                                                               ` Tim Behrendsen
1996-09-11  0:00                                                         ` John Burdick
1996-09-13  0:00                                                 ` Bengt Richter
1996-09-14  0:00                                                   ` Craig Franck
1996-09-06  0:00                                         ` Robert I. Eachus
1996-09-06  0:00                                           ` Tim Behrendsen
1996-09-11  0:00                                         ` Jon S Anthony
1996-09-11  0:00                                           ` Craig Franck
1996-09-11  0:00                                             ` Tim Behrendsen
1996-09-17  0:00                                           ` George
1996-09-24  0:00                                             ` Joel VanLaven
1996-09-27  0:00                                               ` Dann Corbit
1996-09-27  0:00                                               ` Tom Payne
1996-09-28  0:00                                                 ` Tim Behrendsen
1996-09-11  0:00                                         ` Jon S Anthony
1996-09-11  0:00                                         ` Richard A. O'Keefe
1996-09-11  0:00                                           ` Tim Behrendsen
1996-09-12  0:00                                             ` Peter Seebach
1996-09-18  0:00                                               ` Tim Behrendsen
1996-09-12  0:00                                             ` Richard A. O'Keefe
1996-09-13  0:00                                               ` Tim Behrendsen
1996-09-13  0:00                                                 ` Richard A. O'Keefe
1996-09-18  0:00                                                   ` Tim Behrendsen
1996-09-19  0:00                                                     ` Richard A. O'Keefe
1996-09-17  0:00                                             ` George
1996-09-19  0:00                                               ` Tim Behrendsen
1996-09-24  0:00                                                 ` Matthew M. Lih
1996-09-25  0:00                                                   ` Richard A. O'Keefe
1996-09-26  0:00                                                     ` Mark Wooding
1996-09-25  0:00                                                   ` Bjarne Stroustrup
1996-09-26  0:00                                                     ` Bengt Richter
1996-09-28  0:00                                                     ` Dan Pop
1996-09-26  0:00                                               ` Jon S Anthony
1996-09-26  0:00                                                 ` Dann Corbit
1996-09-27  0:00                                                 ` Craig Franck
1996-09-27  0:00                                                   ` Bob Cousins
1996-09-27  0:00                                                 ` Jay Martin
1996-09-27  0:00                                                   ` Kent Budge
1996-09-27  0:00                                                     ` George Haddad
1996-09-27  0:00                                                       ` George Haddad
1996-09-27  0:00                                                     ` George Haddad
1996-09-28  0:00                                                       ` Matthew Heaney
1996-09-27  0:00                                                     ` George Haddad
1996-09-27  0:00                                                     ` George Haddad
1996-09-28  0:00                                                     ` Steve Heller
1996-10-01  0:00                                                       ` DJ Kindberg
1996-09-27  0:00                                                   ` Tim Behrendsen
1996-09-30  0:00                                                     ` Art Schwarz
1996-09-28  0:00                                               ` Jon S Anthony
1996-09-18  0:00                                         ` Jon S Anthony
1996-09-26  0:00                                         ` Jon S Anthony
1996-10-01  0:00                                           ` Andrew Gierth
1996-08-22  0:00                                     ` Bengt Richter
1996-08-22  0:00                                       ` Frank Manning
1996-08-31  0:00                                         ` Bengt Richter
1996-08-31  0:00                                           ` Frank Manning
1996-08-31  0:00                                             ` Frank Manning
1996-09-02  0:00                                             ` deafen
1996-09-03  0:00                                               ` Frank Manning
1996-09-03  0:00                                               ` Steve Howard
1996-09-03  0:00                                               ` Bob Kitzberger
1996-09-03  0:00                                               ` Tim Behrendsen
1996-09-03  0:00                                               ` Phil Barnett
1996-08-22  0:00                                       ` Tim Behrendsen
1996-08-23  0:00                                         ` Larry J. Elmore
1996-08-08  0:00                               ` Stephen M O'Shaughnessy
1996-08-09  0:00                               ` Stephen M O'Shaughnessy
1996-08-09  0:00                                 ` Tim Behrendsen
     [not found]                               ` <01bb846d$ <Dvtnon.I49@most.fw.hac.com>
1996-08-09  0:00                                 ` Tim Behrendsen
1996-08-06  0:00                             ` Szu-Wen Huang
1996-08-06  0:00                               ` Tim Behrendsen
1996-08-06  0:00                                 ` Peter Seebach
1996-08-07  0:00                                   ` Tim Behrendsen
1996-08-07  0:00                                     ` Dan Pop
1996-08-08  0:00                                       ` Tim Behrendsen
1996-08-08  0:00                                         ` Peter Seebach
1996-08-08  0:00                                           ` Randy Kaelber
1996-08-09  0:00                                           ` J. Blustein
1996-08-11  0:00                                             ` Peter Seebach
1996-08-09  0:00                                           ` Chris Sonnack
1996-08-10  0:00                                             ` Tim Behrendsen
1996-08-11  0:00                                               ` Dan Pop
1996-08-12  0:00                                                 ` Tim Behrendsen
1996-08-12  0:00                                                 ` Chris Sonnack
1996-08-15  0:00                                                   ` Bob Hoffmann
1996-08-11  0:00                                               ` Chris Sonnack
1996-08-09  0:00                                         ` Dan Pop
1996-08-11  0:00                                           ` Tim Behrendsen
1996-08-11  0:00                                             ` Dan Pop
1996-08-13  0:00                                               ` Tim Behrendsen
1996-08-13  0:00                                                 ` Giuliano Carlini
1996-08-14  0:00                                                 ` Dan Pop
1996-08-14  0:00                                                   ` Tim Behrendsen
1996-08-16  0:00                                                   ` Dik T. Winter
1996-08-12  0:00                                             ` Peter Seebach
1996-08-13  0:00                                               ` Tim Behrendsen
1996-08-11  0:00                                           ` Mark Wooding
1996-08-19  0:00                                             ` James Youngman
1996-08-18  0:00                                         ` Sam B. Siegel
1996-08-19  0:00                                           ` Dan Pop
1996-08-08  0:00                                       ` Christopher R Volpe
1996-08-07  0:00                                     ` Peter Seebach
1996-08-08  0:00                                       ` Tim Behrendsen
1996-08-08  0:00                                         ` Peter Seebach
1996-08-08  0:00                                           ` Tim Behrendsen
1996-08-08  0:00                                             ` Peter Seebach
1996-08-14  0:00                                             ` Richard A. O'Keefe
1996-08-16  0:00                                               ` Tim Behrendsen
1996-08-20  0:00                                                 ` Richard A. O'Keefe
1996-08-20  0:00                                                   ` Alan Bowler
1996-08-21  0:00                                                   ` Tim Behrendsen
1996-08-22  0:00                                                     ` Bengt Richter
1996-08-22  0:00                                                       ` Tim Behrendsen
1996-08-31  0:00                                                         ` Bengt Richter
1996-09-01  0:00                                                           ` Maurice M. Carey IV
1996-08-26  0:00                                                     ` Richard A. O'Keefe
1996-08-26  0:00                                                       ` Mark Wooding
1996-08-30  0:00                                                         ` Kaz Kylheku
1996-08-30  0:00                                                         ` Richard A. O'Keefe
1996-08-30  0:00                                                           ` Peter Seebach
1996-09-03  0:00                                                             ` Lawrence Kirby
1996-09-01  0:00                                                           ` Joe Keane
1996-09-04  0:00                                                             ` Richard A. O'Keefe
1996-09-03  0:00                                                           ` Arkady Belousov
1996-08-26  0:00                                                       ` Tim Behrendsen
1996-08-29  0:00                                                         ` Richard A. O'Keefe
1996-08-29  0:00                                                           ` Craig Franck
1996-08-30  0:00                                                           ` system
1996-08-31  0:00                                                             ` Kenneth Mays
     [not found]                                                           ` <01bb95ba$9dfed580$496700cf@ljelmore.montana>
1996-08-30  0:00                                                             ` Steve Heller
1996-08-31  0:00                                                             ` Clayton Weaver
1996-09-01  0:00                                                           ` Tim Behrendsen
1996-08-26  0:00                                                       ` madscientist
1996-08-29  0:00                                                         ` Richard A. O'Keefe
1996-08-31  0:00                                                       ` Tanmoy Bhattacharya
1996-09-04  0:00                                                         ` Tom Payne
1996-09-04  0:00                                                       ` Patrick Horgan
1996-09-05  0:00                                                         ` Richard A. O'Keefe
1996-09-05  0:00                                                           ` deafen
1996-08-09  0:00                                           ` Chris Sonnack
1996-08-08  0:00                                         ` telnet user
1996-08-09  0:00                                           ` Tim Behrendsen
1996-08-09  0:00                                           ` Ed Hook
1996-08-09  0:00                                         ` Mike Rubenstein
1996-08-09  0:00                                           ` Tim Behrendsen
1996-08-10  0:00                                             ` Mike Rubenstein
1996-08-12  0:00                                               ` Tim Behrendsen
1996-08-12  0:00                                                 ` Mike Rubenstein
1996-08-12  0:00                                                   ` Mark Wooding
1996-08-13  0:00                                                     ` Mike Rubenstein
1996-08-15  0:00                                                     ` Richard A. O'Keefe
1996-08-12  0:00                                                   ` Tim Behrendsen
1996-08-13  0:00                                                     ` Mike Rubenstein
1996-08-13  0:00                                                       ` Tim Behrendsen
1996-08-13  0:00                                                         ` Giuliano Carlini
1996-08-14  0:00                                                           ` Tim Behrendsen
1996-08-15  0:00                                                         ` Mike Rubenstein
     [not found]                                                     ` <32 <01bb8923$e1d34280$87ee6fce@timpent.airshields.com>
1996-08-14  0:00                                                       ` Peter Seebach
1996-08-14  0:00                                                         ` Tim Behrendsen
1996-08-14  0:00                                                           ` Peter Seebach
1996-08-12  0:00                                                 ` Bob Kitzberger
1996-08-22  0:00                                                   ` Patrick Horgan
1996-08-23  0:00                                                     ` Steve Heller
1996-08-08  0:00                                     ` Teaching sorts [was Re: What's the best language to start with?] Robert I. Eachus
1996-08-09  0:00                                       ` Robert Dewar
1996-08-10  0:00                                       ` Lawrence Kirby
1996-08-10  0:00                                       ` Al Aab
1996-08-12  0:00                                       ` Steve Heller
1996-08-12  0:00                                         ` Robert Dewar
1996-08-16  0:00                                           ` Steve Heller
1996-08-16  0:00                                             ` Adam Beneschan
1996-08-18  0:00                                               ` Steve Heller
1996-08-18  0:00                                                 ` Jeff Dege
1996-08-18  0:00                                                   ` Robert Dewar
1996-08-16  0:00                                             ` Szu-Wen Huang
1996-08-17  0:00                                               ` Robert Dewar
1996-08-20  0:00                                                 ` Szu-Wen Huang
1996-08-20  0:00                                                   ` Dann Corbit
1996-08-21  0:00                                                     ` Tim Behrendsen
1996-08-21  0:00                                                       ` Dann Corbit
1996-08-22  0:00                                                       ` Richard A. O'Keefe
1996-08-22  0:00                                                         ` Szu-Wen Huang
1996-08-23  0:00                                                           ` Richard A. O'Keefe
1996-08-25  0:00                                                         ` Robert Dewar
1996-08-21  0:00                                                   ` Dik T. Winter
1996-08-21  0:00                                                     ` Tim Behrendsen
1996-08-21  0:00                                                       ` Pete Becker
1996-08-22  0:00                                                         ` Szu-Wen Huang
1996-08-22  0:00                                                           ` Pete Becker
1996-08-22  0:00                                                           ` Robert Dewar
1996-08-21  0:00                                                       ` Matt Austern
1996-08-21  0:00                                                         ` Tim Behrendsen
1996-08-21  0:00                                                       ` Tanmoy Bhattacharya
1996-08-22  0:00                                                         ` Mike Rubenstein
1996-08-22  0:00                                                         ` Dann Corbit
1996-08-22  0:00                                                       ` Robert Dewar
1996-08-24  0:00                                                         ` Joe Keane
1996-08-22  0:00                                                     ` Tanmoy Bhattacharya
1996-08-21  0:00                                                 ` Tanmoy Bhattacharya
1996-08-21  0:00                                                   ` Adam Beneschan
1996-08-22  0:00                                                     ` Andrew Koenig
1996-08-24  0:00                                                       ` Robert Dewar
1996-08-22  0:00                                                     ` Christian Bau
1996-08-22  0:00                                                       ` Larry Kilgallen
1996-08-23  0:00                                                         ` Tim Hollebeek
1996-08-24  0:00                                                           ` Robert Dewar
1996-08-24  0:00                                                         ` Robert Dewar
1996-08-22  0:00                                                       ` (topic change on) Teaching sorts Marcus H. Mendenhall
1996-08-27  0:00                                                         ` Ralph Silverman
1996-08-23  0:00                                                       ` Teaching sorts [was Re: What's the best language to start with?] Andrew Koenig
1996-08-21  0:00                                                   ` Tim Behrendsen
1996-08-22  0:00                                                     ` Mike Rubenstein
1996-08-22  0:00                                                     ` Robert Dewar
1996-08-17  0:00                                               ` Robert Dewar
1996-08-18  0:00                                               ` Steve Heller
1996-08-21  0:00                                               ` Matt Austern
1996-08-23  0:00                                               ` Tanmoy Bhattacharya
1996-08-23  0:00                                                 ` Adam Beneschan
1996-08-16  0:00                                             ` Robert Dewar
1996-08-18  0:00                                               ` Steve Heller
1996-08-18  0:00                                                 ` Robert Dewar
1996-08-18  0:00                                                   ` Steve Heller
1996-08-18  0:00                                                     ` Robert Dewar
1996-08-20  0:00                                                       ` Steve Heller
1996-08-14  0:00                                       ` Stephen Baynes
1996-08-14  0:00                                         ` Robert Dewar
1996-08-16  0:00                                           ` Dik T. Winter
1996-08-16  0:00                                             ` Joe Foster
1996-08-18  0:00                                           ` Glenn Rhoads
1996-08-19  0:00                                           ` Stephen Baynes
1996-08-19  0:00                                             ` Robert Dewar
1996-08-19  0:00                                             ` Robert Dewar
1996-08-19  0:00                                           ` Richard A. O'Keefe
     [not found]                                             ` <dewar.840491732@schonberg>
1996-08-19  0:00                                               ` Robert Dewar
1996-08-22  0:00                                                 ` Stephen Baynes
1996-08-27  0:00                                                 ` Richard A. O'Keefe
1996-08-14  0:00                                         ` Robert Dewar
1996-08-13  0:00                                     ` Robert I. Eachus
1996-08-14  0:00                                       ` Robert Dewar
1996-08-15  0:00                                       ` Tom Payne
1996-08-13  0:00                                     ` Robert I. Eachus
1996-08-13  0:00                                       ` Lawrence Kirby
1996-08-14  0:00                                       ` Robert Dewar
1996-08-14  0:00                                     ` Robert I. Eachus
1996-08-15  0:00                                       ` Robert Dewar
1996-08-15  0:00                                     ` Blair Phillips
1996-08-27  0:00                                     ` What's the best language to start with? [was: Re: Should I learn C or Pascal?] Tanmoy Bhattacharya
1996-08-29  0:00                                     ` Robert I. Eachus
1996-08-30  0:00                                       ` Steve Heller
1996-08-30  0:00                                     ` Tanmoy Bhattacharya
1996-08-07  0:00                                 ` What's the best language to start with Ian Ward
1996-08-08  0:00                                   ` Tim Behrendsen
1996-08-09  0:00                                     ` Robert Dewar
1996-08-11  0:00                                 ` What's the best language to start with? [was: Re: Should I learn C or Pascal?] Jerone A. Bowers
1996-08-06  0:00                             ` Dan Pop
1996-08-06  0:00                               ` Tim Behrendsen
1996-08-06  0:00                                 ` Peter Seebach
1996-08-07  0:00                                   ` Tim Behrendsen
1996-08-07  0:00                                     ` Peter Seebach
1996-08-08  0:00                                       ` Tim Behrendsen
1996-08-08  0:00                                         ` Peter Seebach
1996-08-07  0:00                                     ` James A. Squire
1996-08-08  0:00                                     ` David Weller
1996-08-09  0:00                                     ` Bob Gilbert
1996-08-10  0:00                                       ` Tim Behrendsen
1996-08-11  0:00                                         ` Craig Franck
1996-08-11  0:00                                           ` Tim Behrendsen
1996-08-11  0:00                                         ` Peter Seebach
1996-08-11  0:00                                           ` Tim Behrendsen
1996-08-12  0:00                                             ` Alf P. Steinbach
1996-08-12  0:00                                               ` Tim Behrendsen
1996-08-13  0:00                                             ` Szu-Wen Huang
1996-08-07  0:00                                 ` Mark Eissler
     [not found]                               ` <01bb83cc$fb <tequila-0708960947140001@tequila.interlog.com>
1996-08-07  0:00                                 ` Peter Seebach
1996-08-12  0:00                             ` Robert I. Eachus
1996-08-05  0:00                         ` Chris Sonnack
1996-08-06  0:00                           ` Stephen M O'Shaughnessy
1996-08-13  0:00                       ` Chris Sonnack
1996-08-16  0:00                         ` Steve Heller
1996-08-16  0:00                           ` John Hobson
1996-07-31  0:00                   ` Patrick Horgan
1996-07-31  0:00                   ` AJ Musgrove
1996-08-01  0:00                     ` Sam Harris
1996-08-02  0:00                       ` Eric W. Nikitin
1996-08-01  0:00                     ` Tim Hollebeek
1996-08-01  0:00                     ` Ken Pizzini
1996-08-03  0:00                     ` Raffael Cavallaro
1996-08-05  0:00                       ` Chris Sonnack
1996-08-08  0:00                   ` William Clodius
1996-08-11  0:00                     ` Dik T. Winter
1996-08-11  0:00                     ` Fergus Henderson
1996-08-08  0:00                   ` William Clodius
1996-08-13  0:00                   ` Ole-Hjalmar Kristensen FOU.TD/DELAB
1996-08-14  0:00                   ` Richard A. O'Keefe
1996-08-15  0:00                   ` Teaching sorts [was Re: What's the best language to start with?] Norman H. Cohen
1996-08-16  0:00                     ` Steve Heller
1996-08-19  0:00                   ` Ted Dennison
1996-08-23  0:00                     ` Richard A. O'Keefe
1996-08-23  0:00                       ` Ted Dennison
1996-08-24  0:00                       ` Robert Dewar
1996-08-27  0:00                         ` Richard A. O'Keefe
1996-09-02  0:00                           ` Lawrence Kirby
1996-07-28  0:00               ` What's the best language to start with? [was: Re: Should I learn C or Pascal?] Robert Dewar
1996-07-29  0:00                 ` Tim Behrendsen
1996-07-30  0:00                   ` Paul Campbell
1996-07-30  0:00                     ` Robert Dewar
1996-08-02  0:00                       ` Tim Behrendsen
1996-08-03  0:00                         ` Peter Seebach
1996-08-04  0:00                           ` Alf P. Steinbach
1996-08-04  0:00                             ` Peter Seebach
1996-08-04  0:00                               ` Jerry van Dijk
1996-08-05  0:00                               ` Tim Behrendsen
1996-08-04  0:00                                 ` Peter Seebach
1996-08-05  0:00                                   ` Chris Sonnack
1996-08-05  0:00                                     ` Peter Seebach
1996-08-07  0:00                                       ` Tom Watson
1996-08-05  0:00                                     ` Tim Hollebeek
1996-08-10  0:00                                       ` Mike Rubenstein
1996-08-06  0:00                                   ` Tim Behrendsen
1996-08-03  0:00                     ` Patrick Horgan
1996-08-04  0:00                       ` Kurt E. Huhner
1996-07-30  0:00                   ` What's the best language to start with? [was: Re: Should I learn TRAN PHAN ANH
1996-07-31  0:00                   ` What's the best language to start with? [was: Re: Should I learn C or Pascal?] Arne W. Flones
1996-08-02  0:00                   ` David Wheeler
1996-08-02  0:00                     ` Peter Seebach
1996-08-02  0:00                       ` Gary M. Greenberg
1996-08-03  0:00                       ` Alf P. Steinbach
1996-08-02  0:00                         ` Peter Seebach
1996-08-05  0:00                       ` Chris Sonnack
1996-08-05  0:00                         ` Peter Seebach
1996-08-06  0:00                     ` What's the best language to start with? [was: Re: Should I learn C or Pasca StHeller
1996-08-06  0:00                       ` Robert Dewar
1996-08-06  0:00                 ` What's the best language to start with? [was: Re: Should I learn C or Pascal?] Alf P. Steinbach
1996-08-06  0:00                 ` Robert I. Eachus
1996-08-06  0:00                 ` Conrad Herrmann
1996-07-29  0:00               ` Byron B. Kauffman
1996-07-30  0:00               ` Alan Peake
     [not found]               ` <dewar. <peake.206.002D549F@dstos3.dsto.gov.au>
1996-07-31  0:00                 ` P. Cnudde VH14 (8218)
1996-07-31  0:00                   ` Nicolas Devillard
1996-08-02  0:00                   ` Matt Austern
1996-08-15  0:00                     ` Lawrence Kirby
1996-07-31  0:00                 ` Tim Behrendsen
1996-07-31  0:00                 ` Stephen M O'Shaughnessy
1996-08-02  0:00                   ` Tim Behrendsen
1996-08-05  0:00                     ` Mark McKinney
1996-08-05  0:00                     ` Mark McKinney
1996-08-05  0:00                     ` Mark McKinney
1996-07-20  0:00           ` Should I learn C or Pascal? Robert Dewar
1996-07-22  0:00             ` TRAN PHAN ANH
1996-07-23  0:00             ` Ken Garlington
1996-07-20  0:00           ` Andy Askey
1996-07-20  0:00             ` steidl
1996-07-21  0:00               ` Andy Askey
1996-07-22  0:00           ` Stephen M O'Shaughnessy
1996-07-23  0:00             ` TRAN PHAN ANH
1996-07-18  0:00       ` Carlos DeAngulo
1996-07-18  0:00         ` Robert Dewar
     [not found]           ` <01bb7588$236982e0$7b91f780@deangulo>
1996-07-19  0:00             ` Robert Dewar
1996-07-20  0:00             ` steidl
1996-07-19  0:00           ` Jon Bell
1996-07-22  0:00             ` Tim Oxler
1996-07-22  0:00               ` Stig Norland
1996-07-22  0:00               ` Janus
1996-07-22  0:00               ` Robert Dewar
1996-07-30  0:00                 ` Tim Behrendsen
1996-07-31  0:00                 ` Patrick Horgan
     [not found]         ` <01bb7591$83087d60$87ee6fce@timpent.airshields.com>
1996-07-19  0:00           ` johnf
1996-07-19  0:00             ` Jeremy Nelson
1996-07-19  0:00             ` Jason Alan Turnage
1996-07-19  0:00               ` Robert Dewar
1996-07-20  0:00                 ` TRAN PHAN ANH
1996-07-22  0:00                   ` Ralph Silverman
1996-07-20  0:00                 ` Jon Bell
1996-07-20  0:00                   ` Robert Dewar
1996-07-21  0:00                     ` Alexander Vrenios
1996-07-21  0:00                   ` Steve Tate
1996-07-21  0:00                     ` Robert Dewar
1996-07-21  0:00                     ` Phil Howard
1996-07-21  0:00                       ` Robert Dewar
1996-07-22  0:00                         ` Steve Tate
1996-07-22  0:00                   ` Stephen M O'Shaughnessy
1996-07-25  0:00                   ` ++           robin
1996-07-20  0:00                 ` Crash
1996-07-20  0:00                   ` Robert Dewar
1996-07-23  0:00                 ` Ralph Silverman
1996-07-22  0:00               ` Stephen M O'Shaughnessy
1996-07-22  0:00                 ` Jeremy Nelson
1996-07-22  0:00                   ` Stephen M O'Shaughnessy
1996-07-20  0:00             ` Tim Behrendsen
1996-07-22  0:00             ` Ralph Silverman
1996-07-23  0:00               ` Joe Gwinn
1996-07-24  0:00                 ` John A Hughes
1996-07-24  0:00                 ` Theodore E. Dennison
1996-07-23  0:00             ` John A Hughes
1996-07-19  0:00           ` Craig Franck
1996-07-19  0:00         ` Dirk Dickmanns
1996-07-18  0:00       ` Patrick Horgan
1996-07-18  0:00         ` Robert Dewar
1996-07-19  0:00           ` Billy Chambless
1996-07-18  0:00         ` Jason Alan Turnage
1996-07-19  0:00           ` Vic Metcalfe
1996-07-19  0:00           ` Robert Dewar
1996-07-20  0:00             ` steved
1996-07-19  0:00               ` Peter Seebach
1996-07-20  0:00                 ` Jon Bell
1996-07-20  0:00                   ` Andy Askey
1996-07-20  0:00                 ` Robert Dewar
1996-07-22  0:00                   ` steidl
1996-07-22  0:00                     ` Stephen M O'Shaughnessy
1996-07-23  0:00                       ` Richard A. O'Keefe
1996-07-23  0:00                         ` Michael Ickes
1996-07-25  0:00                           ` Andy Askey
1996-07-24  0:00                         ` system
1996-07-23  0:00             ` Ralph Silverman
1996-07-19  0:00         ` Andrew Gierth
1996-07-19  0:00         ` Scott McMahan - Softbase Systems
1996-07-20  0:00           ` steidl
1996-07-20  0:00           ` Tim Behrendsen
1996-07-21  0:00             ` Rich Maggio
1996-07-21  0:00               ` Robert Dewar
1996-07-22  0:00             ` Ralph Silverman
1996-07-23  0:00               ` Tim Behrendsen
1996-07-19  0:00         ` Reto Koradi
1996-07-23  0:00           ` TRAN PHAN ANH
1996-07-18  0:00       ` Walter B. Hollman Sr.
1996-07-23  0:00     ` Richard A. O'Keefe
1996-07-16  0:00 ` Darin Johnson
1996-07-24  0:00   ` Ralph Silverman
1996-07-17  0:00 ` Aron Felix Gurski
1996-07-19  0:00 ` Andrew Gierth
1996-07-19  0:00 ` Andrew Gierth
1996-07-19  0:00 ` Andrew Gierth
1996-07-21  0:00 ` Laurent Guerby
1996-07-22  0:00   ` Stephen M O'Shaughnessy
1996-07-21  0:00 ` Wayne
1996-07-22  0:00 ` Darin Johnson
1996-07-22  0:00 ` Darin Johnson
1996-07-23  0:00 ` Darin Johnson
1996-07-24  0:00   ` Michael Feldman
1996-07-24  0:00   ` Ralph Silverman
1996-07-24  0:00     ` TRAN PHAN ANH
1996-07-24  0:00   ` Andrew J Steinbach
1996-07-24  0:00     ` John A Hughes
1996-07-24  0:00     ` Jon Bell
1996-07-24  0:00     ` system
1996-07-24  0:00 ` Darin Johnson
1996-07-25  0:00   ` Andy Askey
1996-07-26  0:00     ` Mark Eissler
1996-08-02  0:00   ` Patrick Horgan
1996-08-04  0:00     ` Gary M. Greenberg
     [not found]     ` <4u76ej$7s9@newsbf02.news.aol.com>
1996-08-06  0:00       ` Ralph Silverman
1996-08-12  0:00         ` Patrick Horgan
1996-08-13  0:00           ` Darin Johnson
1996-08-13  0:00             ` What's the best language to learn? [was Re: Should I learn C or Pascal?] Tim Behrendsen
1996-08-14  0:00               ` Gabor Egressy
1996-08-15  0:00                 ` Robert Dewar
1996-08-17  0:00                   ` Lawrence Kirby
1996-08-17  0:00                     ` Robert Dewar
1996-08-20  0:00                       ` Lawrence Kirby
1996-08-16  0:00                 ` Mark Wooding
1996-08-17  0:00                   ` Dan Pop
1996-08-17  0:00                     ` Tim Behrendsen
1996-08-17  0:00                       ` Robert Dewar
1996-08-17  0:00                       ` Dan Pop
1996-08-18  0:00                         ` Mark Wooding
1996-08-20  0:00                           ` Peter Seebach
1996-08-21  0:00                           ` Szu-Wen Huang
1996-08-21  0:00                             ` Adam Beneschan
1996-08-21  0:00                             ` Tim Behrendsen
1996-08-17  0:00                       ` Peter Seebach
1996-08-18  0:00                         ` Tim Behrendsen
1996-08-21  0:00                     ` Tanmoy Bhattacharya
1996-08-30  0:00                       ` Goto considered really harmful Patrick Horgan
1996-09-04  0:00                         ` Dennison
1996-08-14  0:00               ` What's the best language to learn? [was Re: Should I learn C or Pascal?] Peter Seebach
1996-08-14  0:00                 ` Tim Behrendsen
1996-08-14  0:00                   ` Peter Seebach
1996-08-14  0:00                     ` Tim Behrendsen
1996-08-14  0:00                       ` Peter Seebach
1996-08-15  0:00                       ` Robert Dewar
1996-08-16  0:00                         ` Joe Foster
1996-08-18  0:00                         ` Tim Behrendsen
1996-08-20  0:00                           ` James Youngman
1996-08-21  0:00                           ` Szu-Wen Huang
1996-08-15  0:00                       ` Bob Gilbert
1996-08-15  0:00                     ` Bob Gilbert
1996-08-18  0:00                       ` Tim Behrendsen
1996-08-15  0:00                     ` DAVID A MOLNAR
1996-08-14  0:00                   ` Robert Dewar
1996-08-14  0:00                     ` Tim Behrendsen
1996-08-14  0:00                     ` Dan Pop
1996-08-14  0:00                       ` Robert Dewar
1996-08-15  0:00                     ` Joe Foster
1996-08-16  0:00                   ` Dr. Richard Botting
1996-08-18  0:00                     ` Tim Behrendsen
1996-08-21  0:00                       ` Szu-Wen Huang
1996-08-21  0:00                         ` Tim Behrendsen
1996-08-22  0:00                         ` Mark Wooding
1996-08-23  0:00                           ` Bengt Richter
1996-08-23  0:00                         ` Clayton Weaver
1996-08-16  0:00                   ` Bob Gilbert
1996-08-17  0:00                     ` Tim Behrendsen
1996-08-18  0:00                       ` Robert Dewar
1996-08-18  0:00                         ` Tim Behrendsen
1996-08-26  0:00                         ` Patrick Horgan
1996-08-27  0:00                           ` Alan Peake
1996-08-27  0:00                             ` Steve Heller
1996-08-28  0:00                             ` Robert Dewar
1996-08-28  0:00                             ` Tom Watson
1996-08-28  0:00                               ` Robert Dewar
1996-08-30  0:00                               ` Alan Peake
1996-08-31  0:00                                 ` Robert Dewar
1996-09-03  0:00                                   ` Alan Peake
1996-09-07  0:00                                     ` Robert Dewar
1996-09-07  0:00                                 ` .
1996-08-29  0:00                           ` Darin Johnson
1996-08-19  0:00                       ` John Hobson
1996-08-19  0:00                         ` Tim Behrendsen
1996-08-19  0:00                           ` John Hobson
1996-08-20  0:00                             ` Szu-Wen Huang
1996-08-27  0:00                               ` Richard A. O'Keefe
1996-08-23  0:00                           ` Alan Bowler
1996-08-21  0:00               ` What's the best language to learn? [any language except Ada] Bill Mackay
1996-08-22  0:00                 ` Stephen M O'Shaughnessy
1996-08-22  0:00                 ` Robert Dewar
1996-08-23  0:00                   ` Larry J. Elmore
1996-08-24  0:00                 ` Alan Brain
1996-08-15  0:00             ` Should I learn C or Pascal? Richard A. O'Keefe
1996-08-17  0:00               ` Lawrence Kirby
1996-08-18  0:00                 ` Ken Pizzini
1996-08-19  0:00                 ` Richard A. O'Keefe
1996-08-23  0:00                   ` Joe Keane
1996-08-17  0:00               ` Mike Rubenstein
1996-08-17  0:00               ` Alexander J Russell
1996-08-16  0:00             ` Dr E. Buxbaum
1996-08-16  0:00               ` Mike Rubenstein
1996-08-16  0:00               ` Lawrence Kirby
1996-08-17  0:00                 ` Paul Hsieh
1996-08-17  0:00                   ` Mike Rubenstein
1996-08-19  0:00                     ` Richard A. O'Keefe
1996-08-20  0:00                       ` Mike Rubenstein
1996-08-22  0:00                         ` Richard A. O'Keefe
1996-08-22  0:00                           ` Mike Rubenstein
1996-08-20  0:00               ` Paul Schlyter
1996-08-20  0:00                 ` Mike Rubenstein
1996-08-21  0:00                 ` James Youngman
1996-08-22  0:00                   ` TRAN PHAN ANH
1996-08-22  0:00                     ` Dr E. Buxbaum
1996-08-27  0:00             ` Jeffrey C. Dege
1996-08-27  0:00               ` Bob Cousins
1996-08-27  0:00               ` Steve Heller
1996-08-27  0:00               ` Craig Franck
1996-08-27  0:00                 ` Ted Dennison
1996-08-27  0:00                   ` John Hobson
1996-08-27  0:00               ` Ted Dennison
1996-08-28  0:00               ` Robert Dewar
1996-09-01  0:00               ` Patrick Horgan
1996-09-12  0:00                 ` Delete - Don't Bother to Read This Charles H. Sampson
1996-08-13  0:00           ` Should I learn C or Pascal? Ralph Silverman
1996-08-16  0:00           ` What's the best language to learn? [was Re: Should I learn C or Pascal?] Darin Johnson
1996-08-16  0:00             ` Robert Dewar
1996-08-16  0:00             ` system
1996-08-16  0:00           ` Should I learn C or Pascal? Darin Johnson
1996-08-20  0:00           ` Darin Johnson
1996-08-21  0:00           ` What's the best language to learn? [was Re: Should I learn C or Pascal?] Darin Johnson
1996-08-22  0:00           ` What's the best language to learn? [any language except Ada] Jon S Anthony
1996-08-23  0:00           ` Darin Johnson
1996-08-25  0:00             ` Robert Dewar
1996-08-24  0:00           ` Jon S Anthony
1996-08-05  0:00   ` Should I learn C or Pascal? Sherwin Anthony Sequeira
1996-07-24  0:00 ` Jon S Anthony
1996-07-25  0:00 ` ++           robin
1996-07-25  0:00 ` ++           robin
1996-07-25  0:00 ` ++           robin
1996-07-30  0:00   ` Robert Barnes
1996-07-30  0:00     ` Rob(t.) Brannan
1996-08-01  0:00       ` Tony Konashenok
1996-08-04  0:00         ` Lawrence Kirby
1996-08-09  0:00         ` Verne Arase
1996-08-01  0:00       ` ++           robin
1996-08-01  0:00         ` Ralph Silverman
1996-08-06  0:00           ` ++           robin
1996-07-31  0:00 ` What's the best language to start with? [was: Re: Should I learn C or Pascal?] Darin Johnson
1996-08-01  0:00   ` Tim Behrendsen
1996-08-01  0:00     ` Stephen M O'Shaughnessy
1996-08-03  0:00       ` Tim Behrendsen
1996-08-06  0:00         ` Stephen M O'Shaughnessy
1996-08-05  0:00     ` Patrick Horgan
1996-08-06  0:00       ` Szu-Wen Huang
1996-08-06  0:00       ` Dan Pop
1996-08-08  0:00         ` steidl
1996-07-31  0:00 ` Darin Johnson
1996-08-02  0:00   ` Alan Peake
1996-08-01  0:00 ` Stefan 'Stetson' Skoglund
1996-08-05  0:00   ` Stephen M O'Shaughnessy
1996-08-06  0:00     ` Bob Gilbert
1996-08-07  0:00       ` Stephen M O'Shaughnessy
1996-08-09  0:00         ` Bob Gilbert
1996-08-06  0:00   ` Patrick Horgan
1996-08-01  0:00 ` Andy Hardy
1996-08-07  0:00 ` Fergus Henderson
1996-08-07  0:00   ` Tim Behrendsen
1996-08-08  0:00     ` Szu-Wen Huang
1996-08-08  0:00       ` Tim Behrendsen
1996-08-08  0:00         ` Peter Seebach
1996-08-08  0:00           ` Tim Behrendsen
1996-08-08  0:00             ` Peter Seebach
1996-08-09  0:00               ` Tim Behrendsen
1996-08-09  0:00                 ` Peter Seebach
1996-08-15  0:00                   ` James_Rogers
1996-08-17  0:00                     ` Tim Behrendsen
1996-08-10  0:00           ` Mike Rubenstein
1996-08-10  0:00             ` Peter Seebach
1996-08-11  0:00             ` Craig Franck
1996-08-08  0:00         ` Szu-Wen Huang
1996-08-08  0:00           ` Tim Behrendsen
1996-08-09  0:00             ` Szu-Wen Huang
1996-08-09  0:00               ` Tim Behrendsen
1996-08-10  0:00                 ` Szu-Wen Huang
1996-08-11  0:00                   ` Tim Behrendsen
1996-08-09  0:00           ` some days weren't there at all
1996-08-10  0:00           ` Mike Rubenstein
1996-08-11  0:00             ` Szu-Wen Huang
1996-08-17  0:00             ` Richard Chiu
1996-09-04  0:00               ` Lawrence Kirby
1996-08-08  0:00         ` Christopher R Volpe

This is a public inbox, see mirroring instructions
for how to clone and mirror all data and code used for this inbox