
European Parliament Panel Backs Human Rights for Robots

The European Parliament Committee on Legal Affairs has voted in favor of a report that supports granting robots legal rights.

The report proposes recognizing robots as legal persons. It suggests, “the most sophisticated autonomous robots could be established as having the status of electronic persons with specific rights and obligations, including that of making good any damage they may cause.”

The vote wasn’t even close: 17-2, with two abstentions. The report now goes up for a vote in the full Parliament in February.

The report includes several caveats about robot rights, robot behavior and the limits of what designers may build robots to do. These include a variation of Isaac Asimov’s Three Laws of Robotics: a robot must obey human orders and may not cause harm to a human. If those conditions are met, then a robot would have the right to defend itself from harm.

Also included is a proposal to make kill switches standard on robots, in case R2-D2 gets uppity. There is also a curious suggestion that robots should not appear to be emotionally needy: you should never think a robot loves you or is sad. This is probably a bone for the perverts who inevitably will want to have kinky, guilt-free robot sex, but it may delay plans for allowing robot marriages, because if a robot can’t even fake love, how can it give consent to marriage?

The inmates obviously are in charge of Arkham.

At the least, they’ve seen a few too many Star Wars movies. While Asimov’s Three Laws sound great on paper, they have practical limits, some of which the movie “I, Robot” made clear. In that movie, the master artificial intelligence takes the law about protecting humans to its logical end and decides the best way to protect human life is to enslave humanity and take over the planet.

There are other problems, such as a robot’s lack of real awareness. Humans become aware of dangers, of their own shortcomings and of other people’s limits through an organic process of growing up and learning by sometimes painful trial and error. Robots would have to be programmed with that knowledge.

Anyone who has ever programmed a computer can probably recall times when perfectly reasonable subroutines buckled when confronted with an unanticipated combination of inputs. Mimicking truly human behavior is a more complex problem than any computer program has ever been designed to handle. Lacking real empathy, a robot could never predict all possible outcomes, and so could never be sure whether its actions will harm a human until it happens.
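To make that concrete, here is a minimal hypothetical sketch in Python (the scenario and every name in it are invented for illustration; nothing here comes from the report): a safety check that behaves sensibly on every input its designer anticipated, then fails outright on a combination nobody thought to test.

```python
# Hypothetical example: a "harm check" a robot might run before moving.
# It works for every case the designer anticipated, then the safety
# check itself crashes on an input combination nobody tested.

def time_to_impact(distance_m: float, closing_speed_mps: float) -> float:
    """Seconds until the robot reaches the obstacle."""
    return distance_m / closing_speed_mps  # fine, until the speed is zero

def action_is_safe(distance_m: float, closing_speed_mps: float) -> bool:
    """Proceed only if there are at least 2 seconds to react."""
    return time_to_impact(distance_m, closing_speed_mps) > 2.0

print(action_is_safe(10.0, 2.5))  # True: 4 seconds to react
print(action_is_safe(1.0, 2.5))   # False: 0.4 seconds, stop

try:
    print(action_is_safe(5.0, 0.0))  # a stationary obstacle, never anticipated
except ZeroDivisionError as err:
    print("The safety check itself failed:", err)
```

The designer anticipated fast approaches and slow ones, but never a stationary obstacle, so the very routine meant to prevent harm is the thing that breaks.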

If a robot believes its actions will not harm a human, and that it is in compliance with orders, but then its action has an unexpected effect and somebody gets hurt, is the robot really competent to be held responsible?

Then there are the more philosophical concerns.

Why stop at robots? Why not your car, or your toaster oven?

I think there’s an obvious opportunity here that we shouldn’t squander: giving gun rights to guns themselves. That way, when somebody is shot, the gun can be held responsible and, if found guilty, melted down into dinner trays.

This would respect the rights of the person who formerly would have been called the shooter, and it would have the added bonus of making liberal efforts to control guns look sane for the first time in history, if only by comparison to a law that actually gives guns rights.

Of course, we may have to determine whether firing a bullet represents protected speech on the part of the gun. We do still enjoy in this country the right to shoot off our mouths. …
