From: Alan Brain
Subject: Re: Ada 83 in embedded systems
Date: 1996/02/26
Message-ID: <4gs8gu$67f@fred.netinfo.com.au>
Organization: Netinfo Pty Ltd - Canberra Australia
Newsgroups: comp.lang.ada

One example regarding the use of Ada 83 in embedded systems:

A firm I used to work for had two sets of processors on a VME bus. The KAV-30/40 (DEC) was programmed almost exclusively in Ada, with a few lines of assembler here and there (< 0.001%, from memory). The other was a card containing two Intel i860s, which was ORIGINALLY programmed in C with an Ada superstructure, but whose C portions shrank with time. One reason for this was the greater productivity, in terms of working code, shown by novice Ada programmers as opposed to expert C programmers.

Case in point: the I/O between the two was via DMA. To give an idea of how novice the programmers were, there was great consternation in the Ada team, and joyous delight in the C team, when it was found that the internal representation of a 6-level variant record* containing 22K bytes of data differed significantly between the DDC-I (i860) compiler and the DEC compiler. So, against my strong recommendations, the decision was made to 'go for C'. Three months later, no progress: the C compiler just didn't specify the representation well enough for portability.
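(For anyone who hasn't met one, a nested variant record looks roughly like this. A minimal Ada 83 sketch, two levels deep rather than six, with type and field names invented for illustration - nothing here is from the actual system:)

```ada
type Sensor_Kind  is (Sonar, Radar);
type Contact_Kind is (Surface, Submerged);

-- An ordinary variant record: which fields exist depends on
-- the discriminant Kind.
type Contact (Kind : Contact_Kind := Surface) is record
   Bearing : Integer;
   case Kind is
      when Surface =>
         Speed : Integer;
      when Submerged =>
         Depth : Integer;
   end case;
end record;

-- A variant record one of whose variant fields is itself a
-- variant record; repeat the pattern to get six levels.
type Sensor_Report (Kind : Sensor_Kind := Sonar) is record
   case Kind is
      when Sonar =>
         Track : Contact;
      when Radar =>
         Range_M : Integer;
   end case;
end record;
```

With six levels of this, the compiler has enormous freedom in how it lays the thing out in memory - which is exactly why the two compilers' default representations disagreed.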
The i860 is an excellent number-cruncher, but particularly awful at moving memory around. A decision was therefore made that the interface should be as per the DDC-I native (unspecified) representation. It took me a fair while, and quite a bit of experimentation, to find out what the DDC-I compiler did, especially regarding word boundaries; the documentation differed significantly from actuality. Nonetheless, I was soon able to have a base type, plus a derived type on the KAV processor with a representation different from the default - one which just happened to be the same as the i860's. This derived type was used on the i860 throughout (as it was exactly the one the compiler produced when no representation instructions were given), and as an input buffer type on the KAV, converted to the base type by assignment statements (which, incidentally, performed checks on each field individually, so that error correction was eased).

The only traps for young players were: conversion from big-endian to little-endian (performed by hardware - jumpers on the cards); the fact that one compiler started its enumerated types at 0 while the other started at 1, which made assignment statements necessary; and that one card used DEC floating-point format while the other used IEEE. This last was cured by a horrible kludge: before signalling data ready, a routine was executed on the i860 that recognised the data type and transformed the relevant floats to DEC format, before flushing the cache and signalling data ready. An inverse process took care of received floats. This was done on the i860 side because the i860 is great at bit manipulation too, and efficiency was a very high concern.

The IEEE-to-DEC floating-point conversion was originally in C, from a library. As an exercise, I re-wrote it in Ada - and found a bug in the C! To add insult to injury, the object code from the DDC-I Ada 83 compiler ran 15% faster than the buggy C, and 5% faster than the C as re-written by a guru.
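(The mechanics, in outline - again an Ada 83 sketch with invented names, offsets and values, far smaller than the real 22K-byte record: an enumeration representation clause pins down the literal numbering, and a record representation clause on a derived type pins every field to the layout the i860 compiler produced by default:)

```ada
type Status is (Ok, Degraded, Failed);
-- One compiler numbered enumeration literals from 0, the other
-- from 1; a representation clause removes the ambiguity.
for Status use (Ok => 1, Degraded => 2, Failed => 3);

type Message is record        -- base type, default (KAV) layout
   Id     : Integer;
   State  : Status;
   Sample : Integer;
end record;

-- Derived type whose layout is forced to match what the i860
-- compiler did by default (offsets purely illustrative).
type I860_Message is new Message;
for I860_Message use record
   Id     at 0 range 0 .. 31;
   State  at 4 range 0 .. 7;
   Sample at 8 range 0 .. 31;
end record;

Incoming : I860_Message;      -- buffer the DMA transfer lands in
Native   : Message;

-- ... after the DMA completes:
-- An ordinary type conversion performs the change of
-- representation, checking each field's constraints as it goes.
Native := Message (Incoming);
```

Note that a subtype cannot carry its own representation - a derived type can, and converting between parent and derived type is precisely how Ada expresses a change of representation; the per-field constraint checks come along for free.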
A trap even for experienced players was that the particular stepping of the i860 we used in the first cards had a design fault. Without Ada's excellent error-detection capabilities, it is very doubtful we would have found the problem. (When caching was enabled, pipelining was enabled, and an integer in the program segment was loaded into the floating-point part of the chip within 16 cycles of an interrupt being serviced, the interrupt return register would be corrupted - usually to 0, i.e. machine reset, but sometimes to a random spot in address space - hairy!)

Bottom line: the measured data transfer rate from the first attempt, without optimisation, was 192 Mbits/sec, well over the 176 Mbits/sec required.

Lessons from the above:

That this is the type of compromise you've got to live with in the 'real world'. Sometimes efficiency is crucial, and causes you to do some very iffy things, even in Ada (Ada 83, anyway).

That Ada 83, even in this case, performed radically better than C - even though C was on its home ground.

That the Ada was maintainable. As time went by and specs/requirements changed, a greater and greater proportion of the code changed from C to Ada 83 by a process of 'survival of the fittest', as the unmodifiable C had to be thrown away. (By 'unmodifiable' I mean 'not modifiable for X dollars, where X dollars = the cost of writing it from scratch in Ada 83'.)

That the type checking, exception handling and other much-maligned-by-C-hackers safety features of Ada 83, when used properly, enabled a very obscure chip fault to be found (and worked around by massaging the compiler's interrupt-processing code generator), and made the whole system relatively well protected against code bugs, soft failures, and hardware faults in a very large and safety-critical system.

* 6 levels of variant record: a variant record, one of whose fields is a variant record, one of whose fields is a variant record... to 6 levels.
Yes, some hardware people deserve defenestration...

So: the original title of the thread - 'Ada is nearly useless for embedded systems' - would seem to be bovine scatology. Or at least, if true, I'd have to say that all other languages are not just nearly, but completely useless!