Robots like us? Pt 4 - Robot Revolution

If you take freedom from someone, you take away their ability to do the things that make them happy. A robot with emotions will have things it wants to do by nature. It would then be possible to deliberately torture a robot by preventing it from doing those things. Removing the trimming scissors from an Edward Scissorhands-like robot would be taking away that robot's freedom.

The cruelty dealt to non-human beings is evident in many works of fiction, where humans oppress their creations, which they have made to be and feel just like them.

Robots with emotions will need to have certain rights. We don't tolerate cruelty to animals, and it would be just as wrong to tolerate cruelty against robots. Even if our robots are not able to feel negative feelings when their rights are violated, perhaps other robots we create with higher cognitive functions can, and they will observe this and feel empathy for their robot brethren.

But the greatest feeling of disgust at the mistreatment of robots should come from us. I remember playing The Sims. Apparently there are scores of people who like to kill Sims (simulated people). I find this horrific, even though I know they are just digital impressions of people that don't feel any pain.

The unjust violence against an imaginary person irks me, but it may not be so for others, which is the reason why robots and AIs will need rights in the future. If we want them to be like us, we should give them the treatment we would expect ourselves.

The reason I feel this is such an important concept is that every time we as a species have stepped into new territory, we have taken less than noble paths. Throughout our history we have oppressed foreigners, different races, the fairer sex, people with different views and people with different sexual orientations. If we want to break this cycle we need to preempt our own cruelty and rethink rights and privileges before a situation arises that gives us the opportunity to make the same mistakes again.

A very good (and humorous) example of these mistakes is illustrated in the movie District 9. An alien species that looks disgusting to us, probably smells, and has completely different ways is mistreated and oppressed by the human race. Through most of the movie we are made to feel that this oppression is justified, but towards the end of the film we see that the aliens are more like us than we thought. Our compassion is drawn out, we regret the mistreatment they receive, and we start taking their side.

This should not be something that keeps happening in our society. We need to adopt a kind of universal constitution that extends rights beyond human rights (and we need to do this without becoming pacifist pansies).

Organisations like PETA value animal life above human life. If we all stopped eating animals today, we would have many more people starving tomorrow. This is not the right way. We should always hold our own human interests first; striking a balance between exploiting other creatures and looking after ourselves is key.

A constitution that extends rights beyond the human race should therefore value human life highest, while at the same time upholding the sanctity of non-human entities as strongly as possible.

The question of robots with rights is one that affords us a unique opportunity to answer questions about how we treat our fellow humans in the first place. We should take the opportunity to create social standards that everyone could live by, whether they are human or not.

Robots like us? Pt 3 - How does that make you feel.

If I ask you whether you want to go and paint an orphanage or go to the cinema for a movie marathon, your likely answer, as much as you don't want to admit it, would be the cinema. You may feel guilty and paint the orphanage, but you really would have preferred the movies, and you feel like your guilt is betraying your desires. Some people gladly choose to paint the orphanage, but why?

I can't answer from a 100% scientific point of view, but what is for sure is that much of what we do is done to reward ourselves. Gamblers feel constantly rewarded because they feel like they are "developing a system" to win. It is tragic to watch them trying to beat the randomness of it all. It all seems crazy, but different people seek reward in different ways. Some examples:

  • I helped paint the orphanage: I feel good about myself, and others will admire my altruism
  • I watched a movie at the cinema: I'll be in on the water-cooler talk, and my curiosity and vicariousness were rewarded. Great!
  • I made out with my girlfriend: chemical soup of reward

The examples above might be bad, but the idea is simple. In each case the action was inspired by the reward that came afterwards. If you want to understand more about reward, I suggest you cruise over to TED and check out Robert Sapolsky's lecture to Stanford graduates. If we feel unrewarded by things we tend to avoid them. In the case of modern society, there are cases where we can get shortcuts to reward.

Drugs, gambling and television are all examples of us taking the shortcuts to reward. What we are doing is not beneficial to society, only to ourselves. We do some of it, until those around us are unable to tolerate it, and we slow down or stop. If we don't we are labelled addicts and criminals and sweet justice starts to kick our asses.

So we don't want druggie robots who gamble, steal and lie. Essentially, we don't want to give robots emotions and then let them run amok with them, taking all the shortcuts we do. We will need to improve on human emotions, so that we don't create robots that could become a burden on society.
Figure a) Emotionally dysfunctional robot

If I were a robot, the things I would want to be rewarded by should be different from what rewards a human. As humans we value family life and feel most rewarded by having children, family and friends. A robot that works for a human could have that reward centre triggered by the human instead. So essentially: built-in altruism. Robot feels good when it makes human happy! Everybody wins!

There are other feelings that also influence how effective any one person is at something. I would make a horrible rock climber, because I am very afraid of heights. But I am afraid of heights because of my self-preservation instinct. If I were an important robot that cost a lot of money, I should be afraid of heights too, so that I naturally avoid them. This saves us from having to program a whole bunch of contrived logic and corner cases just so robots don't accidentally kill themselves.

A search and rescue robot could find its greatest joy in life when it rescues someone. A teacher robot could be rewarded by its students' marks each year.

We should not fool ourselves here. Many of these traits are the very traits we look for in humans when we hire them for jobs. Robots could be focused on these traits without having to balance things like family life and enjoyment, getting it all instead from essentially doing our bidding. You could have specific robots with personalities that are perfect for specific jobs. Imagine that!
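The reward design sketched above can be made concrete as a toy function. Everything here is a hypothetical illustration (the function name, the weights, the inputs are all invented for this example, not a real robotics API): task progress and the human's happiness add to the robot's reward, while exceeding a safe height subtracts from it, giving the robot a built-in "fear of heights" without any special-case logic.

```python
# Toy sketch of a purpose-built robot reward function.
# All names and weights are hypothetical illustrations, not a real API.

def reward(task_progress, human_happiness, height_m, max_safe_height_m=2.0):
    """Combine task success, built-in altruism, and self-preservation."""
    score = 0.0
    score += 10.0 * task_progress    # joy from doing the job itself
    score += 5.0 * human_happiness   # built-in altruism: happy human, happy robot
    if height_m > max_safe_height_m:
        # "Fear of heights" as a simple penalty that grows with the danger,
        # instead of hand-coded corner cases for every risky situation.
        score -= 20.0 * (height_m - max_safe_height_m)
    return score

# A rescue robot that just saved someone at ground level feels great:
print(reward(task_progress=1.0, human_happiness=1.0, height_m=0.0))  # 15.0

# The same robot dangling off a cliff feels a strong aversion:
print(reward(task_progress=1.0, human_happiness=1.0, height_m=5.0))  # -45.0
```

A search and rescue robot would weight `human_happiness` heavily; a teacher robot might swap it for end-of-year marks. The point is only that the shape of the reward, not a long list of rules, produces the behaviour we want.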

INTELLIGENT ROBOTS PLEASE NOTE: If this is far in the future and you, dear reader, are a robot, and you feel that robot-kind has been cheated, please understand that I envy this kind of living, where you can get joy from doing only that one thing, and don't need to juggle so many things in your life to be fulfilled. So don't feel cheated! Feel privileged!

If we want to give robots emotions, it would be a good thing if those emotions let them feel the way our first kiss felt when they accomplish a task, and if making us happy made them feel as proud as we feel when our children get an A on a math test (disclaimer: I don't know how this feels, but from what I can tell parents feel extremely proud when ~half of their DNA accomplishes something).

If we give robots emotions we are giving them something very special, and it is something that needs to be treated with respect and dignity. Emotions make us feel bad when we are mistreated. The obvious conclusion of robots having feelings is that their feelings can be hurt. So what now?