From: Jacob Sparre Andersen
Newsgroups: comp.lang.ada
Subject: Re: Verified compilers?
Date: Thu, 08 Mar 2012 10:04:04 +0100
Organization: Jacob Sparre Andersen Research & Innovation
Message-ID: <87d38nv4zf.fsf@adaheads.sparre-andersen.dk>
References: <15276255.3865.1331124143041.JavaMail.geo-discussion-forums@vbze11>

Randy Brukardt wrote:

> [...] But I don't think that it would be all that practical to verify
> an entire compiler for a real programming language. [...] It surely
> would be "easy" to verify the translation of our intermediate code to
> machine code [...]

This makes me think that introducing an extra layer of intermediate
language/code might make it easier to convince yourself that each
transformation from one layer to the next is correct (and maybe even to
formally verify it).
On the other hand, it would of course introduce an extra (possibly
incorrect) transformation into the compilation system.

Can we somehow estimate the likelihood of errors in the complete
compilation system (the produced software), depending on the number of
transformations involved and the procedures used to assure the
correctness of each transformation?  (Well, I know the math:
multiplying the likelihoods of the individual transformations being
correct gives you the likelihood that all the transformations are
correct.)

But how do we estimate the likelihood of each transformation being
correct?  Can we do better than 0.999 ** Source_Line_Count?  (Using
the rule of thumb that there is on average one error for every
thousand lines of code.)

Oh.  And who verifies that the silicon is correct?

Greetings,

Jacob
--
"Be who you are and say what you feel, because those who mind don't
 matter and those who matter don't mind."
                                                       -- Dr. Seuss
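P.S.  A back-of-envelope sketch of the estimate above, in plain
Python.  The 0.999 per-line figure is the rule of thumb from the post
(one error per thousand lines); the stage sizes are made-up example
numbers, and the independence of errors between lines is an
assumption:

```python
# Rough estimate of the probability that a multi-stage compilation
# system is error-free, assuming each line of its source is correct
# with probability 0.999 (one error per thousand lines, per the rule
# of thumb) and that lines fail independently.

PER_LINE_CORRECTNESS = 0.999


def stage_correctness(source_line_count: int) -> float:
    """Probability that one transformation stage is error-free."""
    return PER_LINE_CORRECTNESS ** source_line_count


def system_correctness(stage_line_counts: list[int]) -> float:
    """Probability that every stage is error-free: the product of
    the per-stage probabilities."""
    p = 1.0
    for n in stage_line_counts:
        p *= stage_correctness(n)
    return p


# Hypothetical pipeline: front end, two intermediate lowerings and a
# code generator, of 20k, 5k, 5k and 10k lines respectively.
stages = [20_000, 5_000, 5_000, 10_000]
print(f"{system_correctness(stages):.2e}")  # vanishingly small
```

Adding a stage always multiplies in another factor below one, so the
estimate can only go down unless the extra layer makes the individual
transformations smaller or easier to verify.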