From: tmoran@acm.org
Newsgroups: comp.lang.ada
Subject: Re: Generation of permutations
Date: Thu, 09 May 2002 19:04:28 GMT
Message-ID: <0LzC8.634$pr6.73577781@newssvr21.news.prodigy.com>

> > calculate how long T some standard sort algorithm would take, maximum,
> > then abort if your randomly generated program is still running at T+1 sec.
>
> But that's just looking for the occurrence of a specific algorithm, isn't
> it? Say I calculate the time needed for the Slow Sort algorithm
> (permutations) to get through some data and use this as the worst case. I'd
> reject every valid random algorithm that was T > Slow Sort. I'm sure I can
> come up with valid code that sorts at a time greater than that of the Slow

You could of course change the time requirement from T+1 second to T plus
one year, and that would probably uncover most sorts of interest. But you
could never be sure, that way, that a candidate you aborted wouldn't have
come up with a different, better algorithm if you had just let it run a
little longer.

It's interesting that random generation isn't needed here; sequential
Big_Number values would work just fine. What randomness buys you is that,
as trials accumulate, the probability that you have tried a point in the
search space arbitrarily close to any given point keeps growing, while with
sequential test points it's guaranteed to take a very long time before you
get anywhere near Big_Number'Last. But here closeness doesn't count: you
could be one bit away from a wonderful sort algorithm and, since it doesn't
work, that's no better than having many bits wrong. If, however, instead of
a totally random new program each time you try tweaks to the best one
you've seen so far, then being close does count, and randomness does help.
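A minimal sketch of the two ideas above, in Ada since this is comp.lang.ada:
abort a candidate that outlives the time budget of a known sort (via an
asynchronous select), and mutate the best candidate seen so far instead of
drawing a completely fresh random program each round. Everything here is
made up for illustration: the bit-string Candidate type, the placeholder
Score function, and the one-second Budget are assumptions, not anything
from the thread.

with Ada.Text_IO;
with Ada.Numerics.Discrete_Random;

procedure Search_For_Sort is

   --  Stand-in "program": a fixed-length bit string, so that "one bit
   --  away" has a literal meaning.  (Illustrative only.)
   type Bit_Index is range 1 .. 64;
   type Candidate is array (Bit_Index) of Boolean;

   package Random_Bit is new Ada.Numerics.Discrete_Random (Bit_Index);
   Gen : Random_Bit.Generator;

   --  Hypothetical budget: time a known sort once, then allow each
   --  candidate that long plus one second.
   Budget : constant Duration := 1.0;

   function Score (C : Candidate) return Natural is
      --  Placeholder fitness; a real harness would run the candidate
      --  on test data and count how many cases come out sorted.
      N : Natural := 0;
   begin
      for B of C loop
         if B then
            N := N + 1;
         end if;
      end loop;
      return N;
   end Score;

   procedure Evaluate (C : Candidate; Result : out Natural) is
   begin
      Result := 0;
      select
         --  Still running at T + 1 second?  Abandon this candidate.
         delay Budget;
      then abort
         Result := Score (C);
      end select;
   end Evaluate;

   Best, Trial             : Candidate := (others => False);
   Best_Score, Trial_Score : Natural   := 0;
   Flip                    : Bit_Index;

begin
   Random_Bit.Reset (Gen);
   Evaluate (Best, Best_Score);

   for Step in 1 .. 10_000 loop
      --  Tweak the best program seen so far instead of drawing a
      --  completely fresh random one: now being close counts.
      Trial := Best;
      Flip  := Random_Bit.Random (Gen);
      Trial (Flip) := not Trial (Flip);

      Evaluate (Trial, Trial_Score);
      if Trial_Score > Best_Score then
         Best       := Trial;
         Best_Score := Trial_Score;
      end if;
   end loop;

   Ada.Text_IO.Put_Line ("Best score:" & Natural'Image (Best_Score));
end Search_For_Sort;

With a strict pass/fail Score the loop degenerates into the "closeness
doesn't count" situation described above; it only becomes a useful hill
climb once the fitness is graded, which is exactly the point of tweaking
the best candidate rather than restarting from scratch.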