From: "Tim Behrendsen"
Subject: Re: Teaching sorts [was Re: What's the best language to start with?]
Date: 1996/08/21
Message-ID: <01bb8f1b$ce59c820$32ee6fce@timhome2>
References: <4v2qkd$40f@news1.mnsinc.com> <4vd05f$rt5@news1.mnsinc.com>
Organization: A-SIS
Newsgroups: comp.lang.c,comp.lang.c++,comp.unix.programmer,comp.lang.ada

Dik T. Winter wrote in article ...
> In article <4vd05f$rt5@news1.mnsinc.com> huang@mnsinc.com
> (Szu-Wen Huang) writes:
> > : For example, suppose the behavior of an algorithm is
> >
> > : for n up to 1000, time is 1 second
> >
> > "time is *n* seconds", I presume.  Otherwise this would be O(1).
>
> Nope.  Robert Dewar wrote 1 second and intended 1 second; you cannot
> conclude big-oh behaviour from a finite number of samples.
>
> > : for n greater than 1000, time is 1000*n seconds
> >
> > : that's clearly O(N), but the time for 2000 items will be
> > : 2_000_000 seconds.
> [snip]
> > I disagree.
>
> You may disagree, but that is what the definition is!  Big-oh
> notation is about asymptotic behaviour, i.e. what is expected to
> happen for n very large.
>
> > I expect a linear algorithm to display linear behavior
> > unless otherwise stated.
>
> It will, for large enough n.
> And "enough" is explicitly not specified.
>
> > What you cite is a case where it needs to be explicitly stated,
> > because calling that algorithm "O(n)" is next to useless in
> > predicting its behavior without knowing this peculiar behavior
> > at n=1,000.
>
> But big-oh notation is about prediction for large n.  You cannot use
> the notation to really predict the running time for a particular
> value, only to estimate it; and your estimate may be way off.  If an
> algorithm runs in N^2 + 10^100 N seconds, it is still O(N^2),
> although you will never experience the quadratic behaviour of the
> algorithm.  (Actually, of course, you will never see the algorithm
> come to completion.)

I have to admit I side with Szu-Wen on this one.  I had never really
thought about this case for the O() notation, but from a purely
mathematical standpoint it seems that O(f(n)) means f(n) is *the*
bounding function.  If f(n) happens to be a discontinuous function,
then so be it.  If the running time is

    kn       for n < 1000
    k(n^2)   for n >= 1000

then

    f(n) = O(n)     for n < 1000
           O(n^2)   for n >= 1000

and the big-Oh function should be (pardon my syntax)

    O( (n < 1000) ? n : n^2 )

The latter is not only more useful, but absolutely accurate.  AFAIK,
there is no limit on the complexity of the big-Oh function, and no
requirement that it be continuous.

-- 
Tim Behrendsen (tim@airshields.com)