Old 02-10-2018, 04:25 PM   #17542
Number 3
Hail to the King baby!
 
 
Drives: '19 XT4 2.0T & '22 VW Atlas 2.0T
Join Date: Dec 2008
Location: Illinois
Posts: 12,170
Quote:
Originally Posted by DGthe3 View Post
I don't like getting a speeding ticket any more than the next guy. That's why, after my second one, I decided to never try pushing the limit. Haven't been pulled over (let alone received a ticket) in nearly 10 years. Probably saved a bit on gas money in the process too.



I don't know where the law sits on this, but I've always felt that if something is happening on PUBLIC property, you don't have much of a right to PRIVACY anymore.


You don't count the Unmanned Combat Air Vehicles like the Predator or Reaper as 'drone war machines'? They fly over battlefields, blow things up, and there isn't a squishy meat bag inside piloting it.

Or were you thinking of something that can authorize itself to kill (ie, not having someone in a 20' shipping container with a satellite feed pushing the button)? Like a gun with a sophisticated sensor system, capable of tracking & discerning objects (not too unlike what is being used in some cars today), and programmed rules of engagement? Because those have existed for decades. One of the first was the Navy's Phalanx CIWS missile defence system. It has its own radar & shoots any missiles (or aircraft) that get too close to naval vessels. And South Korea allegedly has a system that can work the same way against ground targets (vehicles, soldiers) along the DMZ with North Korea (though they say that a human still provides authorization before it fires).

The hurdle isn't the technology, it's ethics. I doubt that will be much of a hassle though. The debate over 'is it unethical to let a robot decide to kill a person' kinda glosses over the assumption that it is ethical for a person to decide to kill a person.

It's been accepted that it's fine to condition people to kill other people, so long as they're soldiers. Mainly, this is achieved by dehumanizing the enemy. Some of this is done through propaganda ('save your loved ones from those savage beasts'), some through training (by calling that humanoid figure out on the firing range a 'target' instead of a potential father of 2). But the point is, as a society, we're fine with making the enemy seem less human because then there is no empathizing with them & thus they're easier to kill. And yet constructing robots utterly incapable of feeling empathy, the logical conclusion to that process, that's somehow crossing a line? I doubt it.
I do count them. They have missile capability. But the elimination of pilots in fighter jets, or of tank crews, was what I meant.

But going from human-controlled drones to AI? According to a recent article, more people are now afraid of AI than of aliens. Aliens? What? I didn't know we were supposed to be afraid of aliens.
__________________
"Speed, it seems to me, provides the one genuinely modern pleasure." - Aldous Huxley