From: knutby@torle.com (P Torle)
Newsgroups: comp.lang.ada
Subject: Float precision - gnat vs objectada
Date: 1 Dec 2004 07:46:38 -0800

I'm working on an Ada95/C project that has to run on both Linux and
Windows, and I am having trouble getting the same float precision on
both platforms. Example:

-----------------------------
subtype Real is Long_Float;
pi : Real;

pi := 3.14159_26535_89793_23846_26433_83279_50288_41971_69399_37511;
Real_Io.Put(pi, Aft => 25);
-----------------------------

Results:
Linux: 3.1415926535897931200000000E+00
Win32: 3.1415926535897931159979634E+00

The thing is, the following float-precision attributes are equal on
both Linux and Windows:

Real'Machine_Mantissa :    53
Real'Machine_Emin     : -1021
Real'Machine_Emax     :  1024
Real'Digits           :    15

So why the difference?

The Windows code is built with ObjectAda (adacomp) and the Microsoft
tools cl.exe/link.exe. The Linux code is built with GNAT/gcc 3.4.2.
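
For reference, here is a minimal, self-contained version of the test
program; the procedure name Float_Test and the Real_Io instantiation of
Ada.Text_IO.Float_IO are just my naming for this example, not anything
from the real project:

-----------------------------
with Ada.Text_IO;

procedure Float_Test is
   subtype Real is Long_Float;

   --  Instantiate the standard fixed/float text output for Real.
   package Real_Io is new Ada.Text_IO.Float_IO (Real);

   pi : constant Real :=
     3.14159_26535_89793_23846_26433_83279_50288_41971_69399_37511;
begin
   --  Print the stored value with 25 digits after the decimal point.
   Real_Io.Put (pi, Aft => 25);
   Ada.Text_IO.New_Line;

   --  Machine attributes of the subtype, for comparing the two compilers.
   Ada.Text_IO.Put_Line
     ("Machine_Mantissa :" & Integer'Image (Real'Machine_Mantissa));
   Ada.Text_IO.Put_Line
     ("Machine_Emin     :" & Integer'Image (Real'Machine_Emin));
   Ada.Text_IO.Put_Line
     ("Machine_Emax     :" & Integer'Image (Real'Machine_Emax));
   Ada.Text_IO.Put_Line
     ("Digits           :" & Integer'Image (Real'Digits));
end Float_Test;
-----------------------------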