Society has so many generally accepted misconceptions. One of them, and this is a good one, is that emotions hamper our ability to make good decisions.
I wish this were my original thought, and I'm referencing quite loosely here, but Steven Pinker, in his book "How the Mind Works", reasons that emotions help us make important decisions that we would not be able to make otherwise. I am at least in partial agreement, but my belief is that emotions are the only reason society does not plunge into chaos. Emotions are the only reason we can live in a society and work towards a common goal. Emotions are also the reason we don't just do the bare minimum, and why we don't engage in dangerous activities.
Fear, love, hate, envy, jealousy, empathy. If you sit down and think of all these emotions and contextualise them in a social setting, you will realise that emotions are designed to be maximally beneficial to you first, and then to your kin. The system has evolved in an ingenious way to protect us from being exploited, while still helping us extend altruism to our kin.
With emotion being such an ingenious system, we should use it when building intelligent robots.
Robots need to live in a society. They will interact with us and with other robots. Without emotions they will be limited to rule-based systems like the Three Laws of Robotics dreamt up by Isaac Asimov.
From Wikipedia:
- A robot may not injure a human being or, through inaction, allow a human being to come to harm.
- A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
- A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
An example of a ruleset based on emotions:
1. I will not harm, for I fear reprisal
2. I will love, because I want love back
3. I will help, because I may need help later
4. I will reciprocate, because I am thankful
5..n
An emotional system is so simple, yet if I had to draw up every single rule it would take quite a while. The rules were added over the course of evolution. A moral rule system is built on the cornerstones of incentives and disincentives. What looks and feels natural to us is just the ultimate algorithm for successful social interaction. It is so successful that we are able to function not just as small tribal societies, but as a massive global society.
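To make that contrast concrete, here is a loose sketch in Python; everything in it, from the drive names to the weights, is hypothetical and only meant to illustrate the difference between hard rules (Asimov-style vetoes) and weighted emotional incentives and disincentives.

```python
# A hypothetical sketch: hard rules veto actions outright, while emotional
# "drives" score them by incentives and disincentives and pick the best one.

HARD_RULES = [
    lambda action: not action.get("harms_human", False),  # Asimov-style veto
]

# Emotional drives as weighted incentives (+) and disincentives (-).
# The drive names and weights are made up for illustration.
DRIVES = {
    "fear_of_reprisal":   -3.0,  # "I will not harm, for I fear reprisal"
    "desire_for_love":    +2.0,  # "I will love, because I want love back"
    "expect_future_help": +1.5,  # "I will help, because I may need help later"
    "gratitude":          +1.0,  # "I will reciprocate, because I am thankful"
}

def choose_action(candidate_actions):
    """Pick the candidate action with the highest emotional score."""
    best_action, best_score = None, float("-inf")
    for action in candidate_actions:
        # Hard rules act as vetoes, the way Asimov's laws would.
        if not all(rule(action) for rule in HARD_RULES):
            continue
        # Emotional drives act as a soft score: each drive contributes in
        # proportion to how strongly the action triggers it (0.0 to 1.0).
        score = sum(weight * action["triggers"].get(drive, 0.0)
                    for drive, weight in DRIVES.items())
        if score > best_score:
            best_action, best_score = action, score
    return best_action

# Example: helping a neighbour beats ignoring them, because it feeds
# the "expect_future_help" and "gratitude" drives.
actions = [
    {"name": "ignore", "harms_human": False, "triggers": {}},
    {"name": "help",   "harms_human": False,
     "triggers": {"expect_future_help": 0.8, "gratitude": 0.5}},
]
print(choose_action(actions)["name"])  # -> "help"
```

The point of the sketch is only that the emotional "rules" are soft: nothing forbids doing the bare minimum, it simply scores badly against the drives.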
We know of course that not all members of society play fair, and that not all people have the correct emotional response system (see: asshole). We also know that personalities change based on personal experience, and it is not just emotions but this plasticity of personality that makes us such a successful social species.
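Continuing the same hypothetical sketch, that plasticity could be as simple as the drive weights being nudged by the outcomes of past interactions, so two robots starting from identical code would drift into different "personalities":

```python
def update_drive(drives, drive, outcome, learning_rate=0.1):
    """Nudge a single drive weight up or down based on how an interaction went.

    outcome is a hypothetical signed reward: positive if acting on the drive
    paid off (a helped neighbour helped back), negative if it was exploited.
    """
    drives[drive] += learning_rate * outcome
    return drives

# A robot whose helpfulness keeps being exploited gradually becomes less helpful.
drives = {"expect_future_help": 1.5}
for _ in range(5):
    drives = update_drive(drives, "expect_future_help", outcome=-1.0)
print(round(drives["expect_future_help"], 2))  # -> 1.0
```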
Emotions are not a side effect, or some negative force guiding us into self-destructive behaviour. They are an overriding force for good that protects us from others, and others from us. This all happens without us even trying. If we are going to give robots emotions, surely there must be good reasoning behind it. Emotions are, after all, part of the reason we like computers: they don't have any. When you ask a computer to do something, it does it without protest. If we are going to give robots and computers emotions, maybe those emotions should work a little differently.