Seriously. Not in the physical sense. Current robots are no match for me, but the Terminator era is coming. Is it legal to beat your automated pal? #AskingForAFriend
I'm not really asking for a friend. I'm asking on behalf of humanity. Recently a drunk man in Japan kicked a robot. The robot no longer walks as well, and its internal computer may have been damaged. You could say he crippled it and gave it brain damage.
This raises so many philosophical questions for me, I can't begin to list them all. But I'll try anyway.
Is it more wrong for me to hit a humanoid robot?
Assuming a robot (not a big fan of that term) is just a machine, how is it different from a toaster? Thanks to BSG for that reference. Hitting your toaster means mild anger issues. Hitting Rosie the Robot means you have much deeper, more violent emotional problems.
Not all robots are humanoid though...
What if it is the Doctor's K9? Or a felinoid companion? Or R2-D2? Think about it, without giggling please. I am assuming you are familiar with Star Wars. If not, invest a few hours, catch up, and continue reading here. Imagine you had a fully functional R2-D2 at work. All beeps, bobbles, and quirks included. A client or customer comes in and angrily beats R2 for poor service. Would you keep working with them? File charges? Hit them back?
Should there be a middle ground between hitting a human and damaging property?
The above-linked article mentions such a middle ground. On its face, it sounds like a great solution. Getting angry and beating your neighbor's dog because it crapped in your yard is a lesser offense than beating your neighbor, but a greater offense than flinging the crap at their house.
I fear we would be creating an environment where Turing Test capable (dare I say sentient!) machines of the future are second-class citizens, treated similarly to pets. Get me a little tipsy and I will go on my tirade about how Star Wars treats droids. "We don't allow their kind in here." "I don't recall ever owning a droid." These are owned sentient beings that can have their memory wiped without a second thought. Will we be remembered for a Three-Fifths Compromise of sorts, because we lacked the forethought to imagine that someday machine intelligence may be (at least) indistinguishable from our own?
What about robots designed for violent purposes?
I train in karate. I'm not half bad, and I'm an awfully large man on top of it. When I hit you, you know it. Even sparring partners who willingly train with me (while I am only trying to tag them) tend to regret it afterwards. It stands to reason someone will build a sparring droid. Even though it has no choice in the matter (unlike my fellow students), will it be exempt so that I may kick it in the face?
Other robots are built for much more violent purposes. War. I can treat war differently, right? If my soldier shoots the semi-intelligent drone trying to ferret out his location, that is just survival. But what if I wipe them all out with an EMP blast? Am I defending myself or committing a new kind of genocide? Is it a war crime?
Did I give unfair consideration to robots?
Earlier in this post I said the assaulted robot, Pepper, was crippled and had brain damage. I was absolutely, intentionally anthropomorphizing a machine to give emotional weight to my argument. Someday, though, that may not be an exaggeration. At this stage of their evolution, it is probably unfair to say a machine can be injured. That will not always be the case. The classic TNG episode "The Measure of a Man" addresses this future concern. I've even argued the episode's best speech is a call to reconsider how you treat all beings. Gay, Black, Straight, Vulcan, Cylon. Stop considering yourself better than others.
What about a compromise based on intelligence?
Is it okay to hit a humanoid robot if it has the intelligence of a flea? That will be the argument: if it isn't as intelligent as Commander Data, it's just property. But who draws that line? I am more intelligent than many people. Can I hit them? Ignoring that absurdity and bringing it to a more realistic level: it is not okay to beat or kill a person with an IQ of 50, no matter what your IQ is.
I remember reading that an average dog is basically as smart as an average two-year-old. (Knowledgeable is probably a better term.) And it is not okay to beat a dog or a two-year-old. Last year, the dog I had for nearly 15 years of my life died. Sad as it is, that's about all the time we get with them. At some point I think we may have machines that are as much family to us. Somewhere far above your car, but below your child. Just keep in mind, liking my family more than your family is a human survival instinct. And I think robots may someday be part of our families.
What about nonviolent crimes involving robots?
Sex robots exist. What might they be like in a decade? If I steal one and use it, is that theft? Kidnapping? Rape? In fact, can they ever give consent? Can I program consent?
Conversely, can I program an intelligent robot to commit a crime? Can I brainwash a human to commit a crime? Can I create a human the old-fashioned way and, either intentionally or unintentionally, raise a criminal?
There are more questions here than I have asked. I sincerely apologize for not having the answers. I'm only human.
You would benefit by reading several novels by P. K. Dick and, of course, Isaac Asimov. As you know, both of these authors, and others not mentioned, have visited non-human intelligence and left us with as many questions as truths.
The problem with the Three Laws is how they address concerns over robotic behavior and not our own.