comp.lang.ada
* Re: Ethics & Isaac Asimov
  1999-01-20  0:00 ` Tom Moran
@ 1999-01-20  0:00   ` Marin David Condic
  1999-01-20  0:00     ` Larry Elmore
  1999-01-21  0:00     ` robert_dewar
  0 siblings, 2 replies; 7+ messages in thread
From: Marin David Condic @ 1999-01-20  0:00 UTC (permalink / raw)


Talk about drifting off topic! ;-)

Asimov's laws are, of course, based on the assumption that machines
"think", have "identity" and can make moral decisions. This is, at best,
wishful thinking. It's a machine. You can't bargain with it or reason
with it. It doesn't feel pity or remorse... (place those lines if you
can! ;-) You can lead a computer to data, but you can't make it think.

And besides, who elected Asimov to be chief legislator of Robotic Law?
And who's he going to get to enforce that Robotic Law if I decide to
build robots willing to ignore the law and wage war against his pacifist
robots? Game over, man! (Tongue firmly planted in cheek...)

MDC

Tom Moran wrote:
> 
> If a thrown stone is allowed as a (particularly dumb) robot, then it
> surely may violate the first law, and the laws of physics may cause it
> to violate the human's *intent* in the second law (though we in
> computers have lots of experience with the difference between
> following a human's orders literally vs doing what he wanted).  The
> stone generally tries, to the best of its ability, not to violate the
> second law.  The same applies of course to other things (General
> Patton's car caused his death).
>   As to software, I suppose a copy of Pagemaker used to make a
> recruiting poster might eventually cause harm to a human, so, being
> very generous about interpreting words like "cause", software also can
> violate the first law.  Same as above re the second law, and software
> usually tries even less hard to obey the third law.
>   Perhaps Asimov's laws need a little work. ;)

-- 
Marin David Condic
Real Time & Embedded Systems, Propulsion Systems Analysis
United Technologies, Pratt & Whitney, Large Military Engines
M/S 731-95, P.O.B. 109600, West Palm Beach, FL, 33410-9600
Ph: 561.796.8997         Fx: 561.796.4669
***To reply, remove "bogon" from the domain name.***

    "Airplanes are interesting toys but of no military value."

        --  Marechal Ferdinand Foch, Professor of Strategy,
            Ecole Superieure de Guerre.




^ permalink raw reply	[flat|nested] 7+ messages in thread

* Re: Ethics & Isaac Asimov
  1999-01-20  0:00 Ethics & Isaac Asimov G.M. Wallace
@ 1999-01-20  0:00 ` Carl Bauman
  1999-01-20  0:00 ` Tom Moran
  1 sibling, 0 replies; 7+ messages in thread
From: Carl Bauman @ 1999-01-20  0:00 UTC (permalink / raw)



G.M. Wallace wrote in message <36A56985.1B891566@interact.net.au>...
|
|Does military software/hardware violate one or more of these laws of
|robotics ?


... snip ...

1. Military (soft/hard)ware does not come anywhere close to being a robot,
so the "three laws of robotics", should they ever be implemented, don't
apply.
2. Since the reason for this (soft/hard)ware's existence is to cause harm
and destruction, it is counter-productive to try to apply them.

BTW, including the "zeroth" law of robotics, there are 4 laws.  :->

-CB









* Re: Ethics & Isaac Asimov
  1999-01-20  0:00   ` Marin David Condic
@ 1999-01-20  0:00     ` Larry Elmore
  1999-01-20  0:00       ` Marin David Condic
  1999-01-21  0:00     ` robert_dewar
  1 sibling, 1 reply; 7+ messages in thread
From: Larry Elmore @ 1999-01-20  0:00 UTC (permalink / raw)


Marin David Condic wrote in message <36A605FA.818C9328@pwfl.com>...
>Talk about drifting off topic! ;-)
>
>Asimov's laws are, of course, based on the assumption that machines
>"think", have "identity" and can make moral decisions. This is, at best,
>wishful thinking. It's a machine. You can't bargain with it or reason
>with it. It doesn't feel pity or remorse... (place those lines if you
>can! ;-) You can lead a computer to data, but you can't make it think.
>
>And besides, who elected Asimov to be chief legislator of Robotic Law?
>And who's he going to get to enforce that Robotic Law if I decide to
>build robots willing to ignore the law and wage war against his pacifist
>robots? Game over, man! (Tongue firmly planted in cheek...)

But there's nothing in Asimov's Laws saying that robots are pacifistic! Only
that they can't harm people. They can certainly defend themselves against
non-human threats (see 3rd "Law") unless ordered not to do so. They can even
defend humans from other humans, if it's possible in a non-harmful manner.

>    "Airplanes are interesting toys but of no military value."
>
>        --  Marechal Ferdinand Foch, Professor of Strategy, Ecole
>Superieure
>            de Guerre.

IIRC, he was Colonel Foch when he made this statement. He only became a
Marshal of France well after the start of WWI, didn't he?

Larry







* Re: Ethics & Isaac Asimov
  1999-01-20  0:00     ` Larry Elmore
@ 1999-01-20  0:00       ` Marin David Condic
  0 siblings, 0 replies; 7+ messages in thread
From: Marin David Condic @ 1999-01-20  0:00 UTC (permalink / raw)


Larry Elmore wrote:
> >    "Airplanes are interesting toys but of no military value."
> >
> >        --  Marechal Ferdinand Foch, Professor of Strategy, Ecole
> >Superieure
> >            de Guerre.
> 
> IIRC, he was Colonel Foch when he made this statement. He only became a
> Marshal of France well after the start of WWI, didn't he?
> 
You've got me there. The quote was given to me to add to my long
collection of quotes and I simply took it for granted that it was
accurately attributed. I have little knowledge of French military
history.

MDC
-- 
Marin David Condic
Real Time & Embedded Systems, Propulsion Systems Analysis
United Technologies, Pratt & Whitney, Large Military Engines
M/S 731-95, P.O.B. 109600, West Palm Beach, FL, 33410-9600
Ph: 561.796.8997         Fx: 561.796.4669
***To reply, remove "bogon" from the domain name.***

    "Airplanes are interesting toys but of no military value."

        --  Marechal Ferdinand Foch, Professor of Strategy,
            Ecole Superieure de Guerre.





* Ethics & Isaac Asimov
@ 1999-01-20  0:00 G.M. Wallace
  1999-01-20  0:00 ` Carl Bauman
  1999-01-20  0:00 ` Tom Moran
  0 siblings, 2 replies; 7+ messages in thread
From: G.M. Wallace @ 1999-01-20  0:00 UTC (permalink / raw)



Does military software/hardware violate one or more of these laws of
robotics ?

################################################################
1. A robot may not injure a human being or, through inaction, allow a
human being to come to harm.
2. A robot must obey the orders given it by human beings, except where
such orders would conflict with the First Law.
3. A robot must protect its own existence, as long as such protection
does not conflict with either the First or the Second Law.
################################################################
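[Editor's aside: read strictly, the three laws form a priority ordering —
the Second yields to the First, and the Third to both. A toy sketch of that
precedence; the action names and the three predicates are hypothetical,
purely for illustration, not anything from Asimov or this thread:]

```python
# Hypothetical sketch: Asimov's three laws as a strict priority ordering.
# Each candidate action is scored by the laws it would violate; a lower-
# priority law only matters when all higher-priority laws are tied.

def violations(action):
    """Tuple of law violations, First Law first.

    Python compares tuples lexicographically, so sorting by this tuple
    ranks First Law violations as worst, then Second, then Third.
    """
    return (action["harms_human"], action["disobeys_order"],
            action["endangers_self"])

def choose(actions):
    # min() over the tuples implements the precedence: the First Law
    # dominates the Second, which dominates the Third.
    return min(actions, key=violations)

# Example: obeying an order that destroys the robot beats disobeying it,
# but any action harming a human ranks last regardless.
candidates = [
    {"name": "obey",     "harms_human": False, "disobeys_order": False, "endangers_self": True},
    {"name": "refuse",   "harms_human": False, "disobeys_order": True,  "endangers_self": False},
    {"name": "lash_out", "harms_human": True,  "disobeys_order": False, "endangers_self": False},
]
print(choose(candidates)["name"])  # -> obey
```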

-GMW.






* Re: Ethics & Isaac Asimov
  1999-01-20  0:00 Ethics & Isaac Asimov G.M. Wallace
  1999-01-20  0:00 ` Carl Bauman
@ 1999-01-20  0:00 ` Tom Moran
  1999-01-20  0:00   ` Marin David Condic
  1 sibling, 1 reply; 7+ messages in thread
From: Tom Moran @ 1999-01-20  0:00 UTC (permalink / raw)


If a thrown stone is allowed as a (particularly dumb) robot, then it
surely may violate the first law, and the laws of physics may cause it
to violate the human's *intent* in the second law (though we in
computers have lots of experience with the difference between
following a human's orders literally vs doing what he wanted).  The
stone generally tries, to the best of its ability, not to violate the
second law.  The same applies of course to other things (General
Patton's car caused his death).  
  As to software, I suppose a copy of Pagemaker used to make a
recruiting poster might eventually cause harm to a human, so, being
very generous about interpreting words like "cause", software also can
violate the first law.  Same as above re the second law, and software
usually tries even less hard to obey the third law.
  Perhaps Asimov's laws need a little work. ;)





* Re: Ethics & Isaac Asimov
  1999-01-20  0:00   ` Marin David Condic
  1999-01-20  0:00     ` Larry Elmore
@ 1999-01-21  0:00     ` robert_dewar
  1 sibling, 0 replies; 7+ messages in thread
From: robert_dewar @ 1999-01-21  0:00 UTC (permalink / raw)


In article <36A605FA.818C9328@pwfl.com>,
  diespammer@pwfl.com wrote:
> Talk about drifting off topic! ;-)

Oh dear, it looks like this will end up being another
successful troll on comp.lang.ada. Please move this to
some more appropriate group. Yes, yes, I know, messages
like this never work, oh well ...

-----------== Posted via Deja News, The Discussion Network ==----------
http://www.dejanews.com/       Search, Read, Discuss, or Start Your Own    





end of thread, other threads:[~1999-01-21  0:00 UTC | newest]

Thread overview: 7+ messages
1999-01-20  0:00 Ethics & Isaac Asimov G.M. Wallace
1999-01-20  0:00 ` Carl Bauman
1999-01-20  0:00 ` Tom Moran
1999-01-20  0:00   ` Marin David Condic
1999-01-20  0:00     ` Larry Elmore
1999-01-20  0:00       ` Marin David Condic
1999-01-21  0:00     ` robert_dewar
