From: Adam Jensen
Newsgroups: comp.lang.ada
Subject: Re: Getting started with bare-board development
Date: Sat, 12 Nov 2016 16:37:11 -0500

Hi! Thanks for weighing in. A faint, fuzzy vision of the situation is
beginning to coalesce.

On 11/12/2016 02:15 PM, artium@nihamkin.com wrote:
> Using a hardware emulator is not cost efficient.

By "hardware emulator", in this context, is it safe to assume that you
mean something like a hosted hypervisor that performs hardware
virtualization (e.g., QEMU[1])?

[1]: http://wiki.qemu.org/

One scenario using that approach might be:

A) GNAT can generate ARM-ELF code for the Cortex-A9 processor series, and
run-time support is available for the Xilinx Zynq-7000[2].

[2]: http://docs.adacore.com/gnat_ugx-docs/html/gnat_ugx/gnat_ugx/arm-elf_topics_and_tutorial.html

B) QEMU can emulate the Xilinx Cortex-A9-based Zynq SoC, including models
for many of its peripherals[3].
[3]: https://en.wikipedia.org/wiki/Qemu#ARM

On the surface, that seems like it could be a viable approach. More
practically, that is a lot of software to validate and a lot of models to
verify; there seem to be many opportunities for things to go wrong.
Mostly, the software (and hardware) seems a bit hokey (and tawdry). Is
that what you mean by "not cost efficient"?

> If you are developing for a microcontroller (e.g., BSP, HAL, drivers,
> etc.), you usually begin with an evaluation board that best resembles
> the final design, and move to the custom hardware when it is ready
> (which will be derived from the evaluation board). This is a "board
> for each developer" approach[1].

Forgive my naivety, but is it generally expected/required by software
developers that there will be some kind of "On-Chip Debug" or
"In-Circuit Emulation" capability in the hardware? If the hardware has
integrated instrumentation such that the behavior of the software can be
analyzed without any probe effect, I could see how that might be simpler
than the simulation/virtualization approach. On the other hand, the
software is then being validated on hardware that is running in a
special, atypical (debug) mode of operation. Is this the typical
method/approach for high-assurance systems design?

> If you are developing high-level code for expensive hardware, you
> usually encapsulate the applicative part and compile it for your PC
> architecture[2]. The environment will be simulated using models of the
> real hardware pieces.

Does that hold true for real-time software?

> For example, if you are developing a mission computer for an aircraft,
> using this approach you will need to write a simulation of the
> Inertial Navigation System, but you will not need to write a
> simulation of the Ethernet chip that allows communication with said
> systems. You simulate communication using sockets or shared memory,
> simulate flash with file operations, etc.
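The quoted suggestion — simulating flash with file operations so the
applicative code can be exercised on a PC — can be sketched in a few lines.
This is just a hypothetical illustration, not anyone's actual test harness;
the class name, sector size, and backing-file name are all made up, and the
only flash behavior modeled is that erase sets a sector to 0xFF while a
write can only clear bits:

```python
import os

class SimulatedFlash:
    """Toy stand-in for a flash driver, backed by an ordinary file.

    Models two flash quirks: erasing sets a whole sector to 0xFF, and a
    write can only clear bits (it ANDs with the existing contents).
    Sector size and capacity are arbitrary illustration values.
    """

    SECTOR = 4096

    def __init__(self, path, sectors=16):
        self.path = path
        if not os.path.exists(path):
            with open(path, "wb") as f:
                f.write(b"\xff" * (sectors * self.SECTOR))

    def erase_sector(self, index):
        with open(self.path, "r+b") as f:
            f.seek(index * self.SECTOR)
            f.write(b"\xff" * self.SECTOR)

    def write(self, offset, data):
        with open(self.path, "r+b") as f:
            f.seek(offset)
            current = f.read(len(data))
            f.seek(offset)
            # Real flash can only clear bits; emulate that here.
            f.write(bytes(a & b for a, b in zip(current, data)))

    def read(self, offset, length):
        with open(self.path, "rb") as f:
            f.seek(offset)
            return f.read(length)

flash = SimulatedFlash("flash.img")
flash.erase_sector(0)
flash.write(0, b"boot record")
print(flash.read(0, 11))  # b'boot record'
```

The Inertial Navigation System model from the quote would presumably sit
behind a similar seam — a socket or shared-memory channel fed with recorded
or synthesized data — while the applicative code stays identical to the
on-target build.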
>
> [1] For example, Texas Instruments stopped supporting simulators in
> their flagship IDE
> (http://processors.wiki.ti.com/index.php/CCSv6_Changes#Simulation)
> [2] That is where using Ada helps a lot. It allows moving between
> hardware targets with relative ease.

Having a deeply ingrained hardware designer's mindset, I've put together
a tool-chain (and an implied methodology) for a fairly narrow class of
real-time embedded system. Maybe some readers of the newsgroup can
comment on the costs and/or benefits of this approach.

* Freely available chip design software: Synopsys for synthesis,
  ModelSim for simulation.
  http://www.microsemi.com/products/fpga-soc/design-resources/design-software/libero-soc

* A flash-based FPGA (available in radiation-tolerant versions) and a
  development kit ($600!) with processor accessories/peripherals.
  (Realistically, I would probably shop around for a board that better
  fits this application, or just invest the time to design one.)
  www.microsemi.com/products/fpga-soc/fpga/proasic3l
  http://www.microsemi.com/products/fpga-soc/radtolerant-fpgas/rt-proasic3
  http://www.microsemi.com/products/fpga-soc/design-resources/dev-kits/proasic3/cortex-m1-enabled-proasic3l-development-kit#overview

* The LEON3 processor - a synthesisable VHDL model of a 32-bit processor
  compliant with the SPARC V8 architecture. (Also available in a
  fault-tolerant version.)
  http://www.gaisler.com/index.php/products/processors/leon3
  http://gaisler.com/index.php/products/processors/leon3ft

* The GRLIB IP Library - an integrated set of reusable IP cores,
  designed for system-on-chip (SOC) development.
  http://www.gaisler.com/index.php/products/ipcores/soclibrary

* GRSIM - a simulation framework for LEON3/GRLIB SOC devices.
  http://www.gaisler.com/index.php/products/simulators/grsim

* GNAT Pro for LEON3
  http://www.adacore.com/gnatpro/embedded/erc32

This last one is a bit confusing. I guess the Libre edition of GNAT
doesn't include support for the LEON3.
I'm not sure whether it is the LEON3 BSP, RTS, or ELF target that is
missing/excluded from the FOSS version of AdaCore GNAT. (If anyone has
any information or insight into this, I would really like to hear about
it.)

On the other hand, it might be super cool to write LEON3/GRLIB drivers
and run-time support as open-source SPARK.

So is this Lovecraftian tool-chain over-the-top, just completely
demented, or fairly representative of high-assurance, real-time,
embedded software engineering?