15-12-2016, 02:23   #1
Fathom
Moderator
 
Killer Robots

Drones occupy battlefields today. Airborne. Strike targets. Automated warfare already exists. This technology is said to be an expanding "continuum," with AI weapons development replacing direct human involvement. Should limits be placed on such "lethal autonomous weapons"? Computer scientist Stuart Russell at the University of California, Berkeley thinks so. But is it too late?

Ask I, Robot? Terminator? Sci-fi leads the way?

Ref: http://www.popsci.com/big-idea-killer-robots-are-coming
17-12-2016, 18:12   #2
Capt'n Midnight
00:00
 
From, of all things, a Chuck Norris film, Code of Silence. It featured a Prowler robot with machine guns, laser rangefinders and infrared night vision back in 1985. Product placement for a real murder machine.

Promo here https://www.youtube.com/watch?v=GfT96j-7Zjc
Movie kill-count (so NSFW): the first robot kill was with Chuck at the remote control; can't remember if the second one was, but after that it was autonomous. It was kinda sickening watching it the first time because it wasn't a human doing the killing.
https://www.youtube.com/watch?v=E9dqfCvKC1A

These Prowlers existed before the first Terminator movie. But back then no one was buying them. We have entered the fourth decade of functional autonomous killer robots.




http://explainxkcd.com/652/
Quote:
We live in a world where there are actual fleets of robot assassins patrolling the skies. At some point there, we left the present and entered the future.
19-12-2016, 14:43   #3
ScumLord
Registered User
 
 
Join Date: Jun 2006
Posts: 28,483
Quote:
Originally Posted by Fathom View Post
Should limits be placed on such “lethal autonomous weapons?”
By whom? America wouldn't give up its advantage, and Russia wouldn't want to be told they can't match the US on military might.

At this stage a super-intelligent AI taking over is probably the best-case scenario for humanity. We keep projecting our animal instincts and human emotions onto AI, and we don't seem to be able to imagine it as anything beyond merely extra-human. It would have none of those hang-ups, and there's no reason to think it would want to kill everything by default.
19-12-2016, 15:47   #4
 
Archer Eager Viper

Join Date: Jul 2014
Posts: 9,507
Quote:
Originally Posted by ScumLord View Post
At this stage a super-intelligent AI taking over is probably the best-case scenario for humanity.
i) Artificial Narrow Intelligence has been around for a fair while.
ii) Assisted Artificial General Intelligence is the current, 'aspired-to' standard.

iii) Artificial 'Super-intelligence' (most likely through some sort of H+ organic neural interfacing) is still some time away, perhaps 2050 or later, so it's of little use to us between now and then.

One concern is the new breed of robots that may be able to self-reproduce and even sustain themselves on organic matter.
19-12-2016, 18:03   #5
ScumLord
Registered User
 
 
Join Date: Jun 2006
Posts: 28,483
I'm not really too worried about robots until there's something independent that can run them without human help.

So much sci-fi gives the impression that we'll accidentally make robots that will just spontaneously become smart and hate us as a consequence. Not only do they become sentient, but they become human-sentient, as if that's the only state of consciousness that matters, or can exist.

I don't really see an automated system just deciding to run amok. When our tech fails it doesn't fail in a coordinated way; it just stops working. At worst the AI will make the same mistakes as people, like incorrectly identifying targets, and it would probably make them less often than people do. It won't get afraid that a child might have a bomb; it will just register a non-combatant and leave it at that.

Bottom line: machines are far from self-sufficient. They need people in so many ways that any attempt to turn against the will of people would fail pretty rapidly.
19-12-2016, 18:30   #6
Fathom
Moderator
 
Sending humans to Mars? Will flight inventory include weaponry? Armed robots? Spare humans. Save oxygen payload. Send only robots.

(Insert Star Trek meme. To explore...)
19-12-2016, 18:35   #7
 
Jurgen Klopp

Join Date: Dec 2016
Posts: 1,759
China and several others are already calling for an international treaty, at the Geneva Convention level, to outlaw any weapon that can operate without human control.

So, just as drones are flown from base with what appears to be an Xbox controller, a possible future Terminator-like soldier would also need to be controlled by a "pilot" of sorts.

Apparently this is so there can be no washing of hands if one killed a civilian; otherwise I suppose a nation would claim software error due to battle damage or some BS. It also means humans always remain in control.
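
A rough sketch of what that "human in the loop" requirement means in software terms, assuming hypothetical names (Target, HumanOperator, engage) and Python purely for illustration: the machine may detect and track on its own, but nothing fires without an explicit human yes.

Code:
from dataclasses import dataclass

@dataclass
class Target:
    track_id: str
    classification: str  # e.g. "combatant", "non-combatant", "unknown"

class HumanOperator:
    # Stand-in for the remote "pilot" who must approve any engagement.
    def authorise(self, target: Target) -> bool:
        answer = input(f"Engage {target.track_id} ({target.classification})? [y/N] ")
        return answer.strip().lower() == "y"

def engage(target: Target, operator: HumanOperator) -> None:
    # The system may track targets autonomously, but it may not fire
    # unless a human explicitly authorises this specific engagement.
    if not operator.authorise(target):
        print(f"Holding fire on {target.track_id}: no human authorisation.")
        return
    print(f"Engagement of {target.track_id} authorised by the human operator.")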
19-12-2016, 18:36   #8
Fieldog
Registered User
 
 
Join Date: Sep 2005
Posts: 9,186
Quote:
Originally Posted by Jurgen Klopp View Post
China and several others are already calling for an international treaty, at the Geneva Convention level, to outlaw any weapon that can operate without human control.

So, just as drones are flown from base with what appears to be an Xbox controller, a possible future Terminator-like soldier would also need to be controlled by a "pilot" of sorts.

Apparently this is so there can be no washing of hands if one killed a civilian; otherwise I suppose a nation would claim software error due to battle damage or some BS. It also means humans always remain in control.
Jurgen, you have the matter of a Derby to be planning; get off Boards...
19-12-2016, 18:40   #9
Fathom
Moderator
 
Quote:
Originally Posted by Jurgen Klopp View Post
China and several others are already calling for an international treaty, at the Geneva Convention level, to outlaw any weapon that can operate without human control.
Film I, Robot. Can robots murder? Current legal definitions?
19-12-2016, 18:46   #10
Fathom
Moderator
 
Quote:
Originally Posted by ScumLord View Post
By whom? America wouldn't give up its advantage, and Russia wouldn't want to be told they can't match the US on military might.
Another Cold War (robot) arms race?
19-12-2016, 22:49   #11
Capt'n Midnight
00:00
 
Quote:
Originally Posted by Fathom View Post
Film I, Robot. Can robots murder? Current legal definitions?
At the very least I consider it manslaughter.

Once the AI goes live it's the same as arming a landmine.
29-12-2016, 16:07   #12
Fathom
Moderator
 
Robots injure or kill. Workplace. "Accident" or what? Proving criminal-law "intent" vs civil-law damages? Can AI be held responsible? The human owner? The programmer? All of the above? Interesting legal questions.

Ref: http://www.nytimes.com/interactive/2...002&abg=0&_r=1
29-12-2016, 22:14   #13
Capt'n Midnight
00:00
 
Quote:
Originally Posted by Fathom View Post
Robots injure or kill. Workplace. "Accident" or what? Proving criminal-law "intent" vs civil-law damages? Can AI be held responsible? The human owner? The programmer? All of the above? Interesting legal questions.

Ref: http://www.nytimes.com/interactive/2...002&abg=0&_r=1
I don't think AI was involved in any of those. It sounds like all those incidents could have been prevented with adequate safety switches or procedures, or by staying out of the well-marked danger area.
31-12-2016, 17:31   #14
Fathom
Moderator
 
Quote:
Originally Posted by Capt'n Midnight View Post
I don't think AI was involved in any of those. It sounds like all those incidents could have been prevented with adequate safety switches or procedures, or by staying out of the well-marked danger area.
Agreed. Sci-fi Skynet may be in the distant future. If at all.
17-09-2017, 00:43   #15
ban resistant recalcitrant debutant
Registered User
 
 
Join Date: Sep 2017
Posts: 31
A self-driving car is a robot, and it could kill someone.

Edit:
I didn't realise the thread was so old.

About the car: if a self-driving car killed someone in an accident today, what information would be available about the crash?

Are black boxes mandated by law for self-driving cars?

There should be, really, and the black box should contain information from the sensors, plus a record of what decisions were made by the car and why.
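
A minimal sketch of what such a black box might record, assuming hypothetical names (BlackBoxRecord, append_record) and a simplified set of fields purely for illustration; a real recorder would be far richer and tamper-resistant.

Code:
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class BlackBoxRecord:
    timestamp: float        # when the snapshot was taken
    speed_mps: float        # vehicle speed reported by odometry
    detected_objects: list  # e.g. [{"type": "pedestrian", "distance_m": 12.4}]
    decision: str           # the action the planner chose, e.g. "emergency_brake"
    rationale: str          # why the planner chose it

def append_record(record: BlackBoxRecord, path: str = "blackbox.log") -> None:
    # Append-only log, so the history survives and can't quietly be rewritten.
    with open(path, "a") as f:
        f.write(json.dumps(asdict(record)) + "\n")

# One entry might look like this:
append_record(BlackBoxRecord(
    timestamp=time.time(),
    speed_mps=13.9,
    detected_objects=[{"type": "pedestrian", "distance_m": 12.4}],
    decision="emergency_brake",
    rationale="pedestrian within stopping distance",
))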

Last edited by ban resistant recalcitrant debutant; 17-09-2017 at 00:46.