comp.lang.ada
From: Bojan Bozovic <bozovic.bojan@gmail.com>
Subject: Re: requiring Ada(2020?) in self-driving autonomous automobiles
Date: Thu, 29 Mar 2018 00:21:00 -0700 (PDT)
Message-ID: <28252fa0-d6a3-48b3-908e-0538201d2dd8@googlegroups.com>
In-Reply-To: <e6306b3f-ad87-45a4-b59f-c3675b0df813@googlegroups.com>

On Thursday, March 29, 2018 at 5:56:22 AM UTC+2, Dan'l Miller wrote:
> On Wednesday, March 28, 2018 at 4:38:42 PM UTC-5, Bojan Bozovic wrote:
> > Programming (or software development, if you like those words better) methodology needs to be changed,
> > and verifying software against specification
> 
>   So is the specification for self-driving cars in good shape, defining exactly what the problem-space expectations are for emulating human-quality driving?  No, it is not, because in general these systems fall into the category of:
> 0) Use the reference implementations of realtime software provided by the sensor hardware manufacturers.
> 1) Have a neural network.
> 2) Train the neural network on dry roads on a sunny day without road construction.
> 3) Once the neural network drives no worse than a nervous teenager, add one hazard at a time to further train the neural network.
> 4) Then a miracle occurs.
> 5) The neural network knows how to have the reflex of the spinal cord & brain stem.  The neural network knows how to have the reptilian brain for tracking object identity and movement.  The neural network knows how to have the mammalian brain for social order.  The neural network knows how to have a human-level of inductive & deductive logical & spatial reasoning about why spontaneous events are happening and how the other guy will likely respond.  And it knows how to coordinate all those layers of brain processing in all the varieties of road hazards concurrently:  pedestrian walks out onto the roadway during a snow storm at night when the road is glazed with ice with a tractor-trailer semi truck tailgating behind with a disabled vehicle on the shoulder up ahead.
> 
> What we would recognize as software in these self-driving autonomous vehicles is primarily in the drive-by-wire category of operating the realtime control of the LIDAR and other sensors.  The failure of self-driving autonomous vehicle software is far more likely in the vicinity of steps 2 through 5 of the requirements specification above.  Fidelity of transliteration of the requirements specification into neural nets on one hand and realtime software control of sensor hardware on the other hand probably doesn't even make the top 10 root causes of wrecks of self-driving automobiles.  The neural network and its undesired ability to learn the wrong lesson during training is far more likely the root cause of wrecks & mishaps & maiming & deaths.
> 
> > together with testing must be viewed as indispensable,
> 
> Remember: what you think of as testing (either the verification of the fidelity of transliteration of the requirements or the validation of whether the requirements specification was wise in the first place), they think of as training the neural net.  What you think of as a sacred, methodical process, the neural network regards as merely more opportunities for trial-and-error training.
> 
> > not some arcane procedure reserved for those that code aircraft, spacecraft and missile software.
> 
> Yeah right.  Vehicles with hardware sensors in the air have an entirely different set of •realtime•-software characteristics than vehicles with hardware sensors on the ground.  Both real and time differ when in the air and when on the ground.  Fly-by-wire is entirely different physics & control-theory processing than drive-by-wire.  Yeah right.
> 
> > Anything that makes writing correct programs easier
> 
> Define easier with respect to emulating a human being's conscious brain atop a mammalian brain atop a reptilian brain atop a brain stem atop a spinal cord in front of a steering wheel and gas pedal and brake pedal (and clutch pedal and gear shift in the case of self-driving autonomous tractor-trailer semis/lorries).  
> 
> Even independent of this difficult emulation, I think Python is generally perceived to be easier for programmers than, say, Ada.  That is, easier in every metric other than assuring correctness (and perhaps readability).  What kind of ease do you wish to optimize?  (Probably the wrong kind.)
> 
> > is welcome and needed, self-driving cars are just one aspect of the problem.
> 
> Randy/ARG has identified self-driving autonomous automobiles as a likely killer* app of Ada2020, so that is why it is a very important topic to discuss, despite your attempt to deflect away from it.
> 
> * so that self-driving autonomous automobiles are no longer killers.  Oh the irony.

Now, if self-driving cars are built on a neural network doing trial-and-error learning, a sound logical proof is needed that such a methodology will not endanger other participants in traffic. As no such proof exists (at least to my knowledge), self-driving cars are a hazard that can't be allowed on the streets. I didn't say that real-time systems in aircraft have safety needs that, for example, robots used in car manufacturing or medical devices don't; many people, even those who should know better, don't realize that such systems run something like VxWorks and Ada rather than some form of Linux. "There is always one more bug" is a mantra repeated over and over again, and it needn't be so. Also, if we were to compare Python and Ada, even for a beginner Ada wins on ease of programming. Documentation is free and readily available, as are tutorials and books on the subject. Python has nothing like the Ada LRM/ALRM online, and Ada 83 code still compiles, 35 years later, with an Ada 2012 compiler, while Python isn't even compatible between its 2.x and 3.x versions. I wasn't trying to deflect the topic away from self-driving automobiles, so you accuse me wrongly, but I don't see how the problem really differs from an autopilot in an aircraft certified to DO-178B/C. At least nobody has tried to put a primitive neural network into those and justified it by saying it performs better than a drunk pilot.
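To make the compatibility point above concrete, here is a minimal sketch (the unit name Hello_83 is my own invention, purely for illustration): a program written in plain Ada 83 style, using only the original Text_IO, which a current Ada 2012 compiler such as GNAT still accepts unchanged, via the compatibility renaming of Text_IO onto Ada.Text_IO.

    --  Ada 83 style: no Ada.* hierarchy, no aspects, just Text_IO.
    with Text_IO;

    procedure Hello_83 is
    begin
       Text_IO.Put_Line ("Written for Ada 83, still accepted in Ada 2012.");
    end Hello_83;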
And no, constructing software to be correct rather than debugging it into correctness isn't my idea, but one from C.A.R. "Tony" Hoare and Edsger W. Dijkstra; I'm just repeating it. It predates Ada.
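That correct-by-construction idea is, by the way, exactly what Ada 2012 contracts let you state in the language itself. A minimal, hypothetical sketch (the names Speed_Limiter and Clamp are mine, not from any real vehicle system): part of the specification is written as a postcondition on the subprogram, and the body can be checked against it, statically with SPARK-style proof or at run time with assertion checks enabled.

    --  Hypothetical example: the specification is stated as a contract
    --  on the subprogram, not buried in a separate document.
    package Speed_Limiter is
       Max_Speed : constant := 120;   --  assumed limit, km/h
       subtype Speed is Integer range 0 .. Max_Speed;

       function Clamp (Requested : Integer) return Speed
         with Post => Clamp'Result >= 0 and then Clamp'Result <= Max_Speed;
    end Speed_Limiter;

    package body Speed_Limiter is
       function Clamp (Requested : Integer) return Speed is
       begin
          if Requested < 0 then
             return 0;
          elsif Requested > Max_Speed then
             return Max_Speed;
          else
             return Requested;
          end if;
       end Clamp;
    end Speed_Limiter;

Run-time checking of the Post aspect is turned on with pragma Assertion_Policy (Check), or with GNAT's -gnata switch.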
Mr. Dan'l Miller, you started out too defensively, as if I had somehow insulted you or Mr. Randy Brukardt. If it sounded so, that wasn't my intention. If neural networks are indeed the future, the onus is on those who implement such systems to prove they pose no risk. One way or another, it will come to pass.


Thread overview: 10+ messages
2018-03-28 13:26 requiring Ada(2020?) in self-driving autonomous automobiles Dan'l Miller
2018-03-28 14:24 ` Dan'l Miller
2018-03-28 14:26   ` Dan'l Miller
2018-03-28 21:38     ` Bojan Bozovic
2018-03-29  3:56       ` Dan'l Miller
2018-03-29  7:21         ` Bojan Bozovic [this message]
2018-04-02 21:06         ` Robert I. Eachus
2018-04-03  8:58           ` Dmitry A. Kazakov
2018-11-26 11:44             ` Marius Amado-Alves
2018-11-26 17:31               ` Dmitry A. Kazakov
