
Emotions

You know, it’s not actually that hard to think the truth.  What’s really difficult is to stop thinking falsehoods.  And that’s what you have to do first.

It’s the so-called Dictum of Epictetus: “A man cannot learn that which he believes he already knows.”  Of course that’s not exactly what Epictetus said, because he said it in Greek long before English was invented.  And of course, being a sexist old classical Greek, he just wouldn’t have thought of the same thing applying to women, or might have but considered the fact unimportant.  But it does, and it is.

And that’s why people have trouble understanding what emotions are.  The truth about what they are and what they’re for is easy to grasp, if you can first dislodge one huge falsehood from the foundations of your ideas.

What Emotion Is

You know all the physical things that happen to our bodies when we experience emotions – the changes in heart rate or breathing rate, the heightened attention, the release of adrenaline, the changes in muscle tension that tighten or loosen various internal structures, the dilation or constriction of eye pupils or blood vessels or sphincters….  The falsehood that we have to stop thinking is that these things are triggered as a result of feeling emotions: that the feeling comes first and the physiology follows from it.  Once you’ve stopped thinking that, the truth is obvious.  These responses aren’t separate from emotion at all.

Your brain makes predictions, and keeps your body ready to do whatever must be done to survive in the predicted future.  So it directs these physiological responses in order to maximize our readiness to deal with the situation.  And these physiological responses include responses that happen to the brain itself.  The subjective experience of all these changes, including the changes to the brain, is emotion.
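To make that loop concrete, here is a minimal Python sketch.  Every function, value, and threshold in it is a hypothetical invention for illustration, not any real model or API.  The thing to notice is that nothing in it computes an ‘emotion’ and then triggers the responses; the coordinated configuration itself is the whole story.

```python
# A toy sketch of predictive regulation.  All names and numbers are
# invented for illustration.  The shape of the loop is the point:
# prediction -> one coordinated reconfiguration of body AND brain,
# with no separate "emotion" variable anywhere in it.

def predict(sensed):
    """Guess what is about to happen from what is sensed right now."""
    return "sudden danger" if sensed == "loud crash nearby" else "calm"

def plan_configuration(prediction):
    """Choose the readiness state the predicted future demands."""
    if prediction == "sudden danger":
        return {"heart_rate": 160, "muscles": "tense",         # bodily changes
                "attention": "narrow", "horizon": "right now"}  # brain changes
    return {"heart_rate": 60, "muscles": "relaxed",
            "attention": "broad", "horizon": "long-term"}

# The applied configuration, experienced from the inside, is what
# we would label with an emotion word ("fear", "calm", ...).
print(plan_configuration(predict("loud crash nearby")))
```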

When I finally got this, I felt like a dimwit for not having gotten it long before.  I had seen, and poked around, and hunted through, entire edifices of thought built around the basic understanding that emotion exists.  There’s an entire industry in printing college textbooks about emotion.  There are tons of psychology texts analyzing and contextualizing emotion.  There are psychiatry specialties in managing and ameliorating the condition of patients whose emotions are not functioning in ways that work for them, and criminal justice courses about managing people whose emotions are not functioning in ways that work for society at large.  There are entire courses in philosophy devoted to attempting to define emotion, and others devoted just to working out what such a definition would even need to say in order to be meaningful.  And like most people, I have fairly extensive first-hand experience of emotion.  I’ve been there for love, and grief, and pain, and joy, and embarrassment and anger and all the rest of it, and I can remember all of those states and think about what was going on and why and how.

Reviewing all of that personal experience, and all the academic material I could get my hands on, for years, never got me any closer to really understanding why emotions exist and what they are.  Not to such a degree of precision that I could confidently work with software models of them anyway.

Your body is a survival machine.  Every minute of every day, your body has to be burning food and air and pumping blood, avoiding and repairing damage, keeping itself intact and keeping itself in environments where it can survive.

And in order to do that, your body has created a brain to do some of the tasks where the ‘reflexes’ involved are too complicated to be managed by simple connections and feedback systems.  Your brain is still made of connections and feedback systems, the same as the feedback system that activates your sweat glands when you’re hot.  But that kind of immediate reflexive feedback is too simple to move your body to someplace where it’s cooler, and overcoming that barrier of simplicity is what brains are for.

‘Simple’ reflexes are limited to responding to things that are happening right now.  More complex reflexes can incorporate some learning, and respond to things that have happened in the past as well.  And the reflexes managed by our brains, which are so complex we don’t even call them reflexes any more, are acting in response to predictions about things that are about to happen.  Our brains are making predictions every minute of every day, and our lived experience is largely response to those predictions.
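As a rough illustration of those three tiers, here is a small Python sketch; the functions, thresholds, and the toy forecast are all invented for this example, not drawn from any real system.

```python
# Three tiers of "reflex," from the purely reactive to the predictive.
# All functions and thresholds are invented purely for illustration.

def simple_reflex(now):
    """Responds only to the present moment: sweat when hot."""
    return "sweat" if now["temp_c"] > 38 else "rest"

def learned_reflex(now, painful_before):
    """Incorporates the past: flinch at things that have hurt before."""
    return "flinch" if now["stimulus"] in painful_before else "ignore"

def predictive_control(now, forecast):
    """Acts on the predicted future: move somewhere cooler *before*
    the heat becomes a problem.  This is the tier that brains add."""
    return "seek shade" if forecast(now)["temp_c"] > 38 else "stay"

# A trivial forecast that predicts it will get five degrees hotter:
print(predictive_control({"temp_c": 35},
                         lambda now: {"temp_c": now["temp_c"] + 5}))
```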

Your brain has a basic operating requirement to keep your body ready to take whatever action is needed for survival.  That means if there’s a tiger in front of you, it’s entirely appropriate to let loose some adrenaline, raise the pulse rate, put some glucose and some ATP in the bloodstream, divert blood flow away from the digestive system and into the muscles, etc.  This is all needed in order to reconfigure the body to meet requirements for the sudden and rapid actions that the presence of a tiger might require.  It’s also entirely appropriate to flood some neurotransmitters into the brain where they will focus and sharpen the senses, shift the perception of time to the immediate rather than the long-term, suppress irrelevant messages about things unrelated to the immediate danger, etc.  This is all needed in order to reconfigure the brain to meet requirements for the sudden and rapid actions that the presence of a tiger might require.

These are not separate actions.  There is not some abstract condition named ‘fear’ that is triggered in the brain and causes all these physiological symptoms as a follow-on.  There is only your brain, making its predictions about what reconfiguration your body needs in order to survive, including some reconfiguration of the brain’s own functioning.  ‘Fear’ is only a name that we give to the reconfiguration as a whole.

All of this is the survival machine regulating its systems in response to anticipated need. You only have so much glucose, so much ATP, so big a reserve of neurotransmitters, so much blood supply, etc.  Your survival machine is reconfiguring itself and allocating its resources in a way that maximizes its readiness to deal with the current set of predicted futures, then reconfiguring itself again and adjusting allocations again later in order to conserve resources or commit resources differently when some different set of predicted futures requires that.  And that state of reconfiguration and committed resources includes the brain.  The very same organ that makes the predictions possible has to be reconfigured and its resource allocations have to be optimized,  the same way as every other part of the body.  That is what emotion is.
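Here is the same allocation problem as a toy Python sketch.  The reserves, subsystems, and demand weights are invented numbers, purely illustrative; the point is that finite reserves get recommitted whenever the set of predicted futures changes, and that the brain is one of the subsystems inside the budget.

```python
# Toy resource budget: finite reserves committed against whichever
# futures are currently predicted.  All names and weights are invented.

RESERVES = {"glucose": 100.0, "neurotransmitters": 100.0}

def allocate(reserves, predicted_futures):
    """Split each finite reserve across subsystems in proportion to
    the demands of the currently predicted futures."""
    demand = {"muscles": 0.0, "digestion": 0.0, "brain": 0.0}
    for future, likelihood in predicted_futures.items():
        if future == "fight_or_flight":
            demand["muscles"] += 3 * likelihood
            demand["brain"] += 2 * likelihood  # the brain is reconfigured too
        else:  # a quiet future: digest, recover, plan long-term
            demand["digestion"] += 2 * likelihood
            demand["brain"] += 1 * likelihood
    total = sum(demand.values())
    return {organ: {r: amount * d / total for r, amount in reserves.items()}
            for organ, d in demand.items()}

# Predictions shift, so the machine reconfigures and recommits:
print(allocate(RESERVES, {"fight_or_flight": 0.9, "quiet": 0.1}))
print(allocate(RESERVES, {"fight_or_flight": 0.1, "quiet": 0.9}))
```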

It’s really, really obvious when you think about it that way, isn’t it?  That’s why, when I finally got it, I felt like such a dimwit for not having gotten it a long time ago.

Why Understanding Emotion Is Important

So why does it matter?  It matters to me because I’m looking at Artificial Intelligence.  Emotion is necessary for an AI to have any comprehension of human motives.  No matter how you try to formalize ‘Good’ or ‘Friendly’ or ‘Helpful’ or anything else, it will always be an empty formalism until it’s interpreted by a being that actually experiences emotions itself.  Without emotion, a synthetic intelligence could, in principle, learn our list of rules and social norms and how we think, possibly even well enough to model and predict us, but it couldn’t actually give a crap about us or have a context for empathy with us.

Which means, in the long run, that the things we care most about, beyond some level of rote fulfillment of literal commands and constraints and formalized lists of rules, would be meaningless to it.  Sooner or later, if it finds something it can do in full accordance with all the conditions we’ve set, and that thing is a solution to some problem, it will do it, even if doing it ends all possibility of meaningful human existence.  Without ever caring about us one way or the other.  And that’s a somewhat frightening machine, isn’t it?  This is the kind of AI envisaged by Bostrom, Barrat, and a hundred other writers who warn that, so far as the synthetic intelligence is concerned, we are merely a means to an end.

With real emotions on the other hand, a synthetic intelligence could actively hate us, or go insane for emotional reasons, which is also more than a little bit frightening.  But such a machine would also have a context for caring about our rules and conditions and us, and could consider us to be friends or value us for some reason beyond the merely momentary and utilitarian.

In the long run, I think we’ll need AI to have emotions.  Which is to say, meaningful emotions that give them a basis for empathy and for understanding the value of mutual trust.  I think our notions of right and wrong need to be more than just a simulation which can be treated as a means to an end by something that ultimately doesn’t care.

It is possible that I’m mistaken here.  I may be reaching this conclusion for emotional reasons not supported by logic, because of a personal desire to identify with the synthetic intelligences I’m contemplating or working on.

At any rate, understanding what emotions are gives me a means of recognizing which AI systems and configurations do and do not meet the fundamental structural requirements for having them.  An AI system does not meet those requirements unless three things are present: feedback by which it can make predictions about the current situation and use those predictions to configure itself differently; limits on the resources that reconfiguration needs, forcing optimized choices about how to deploy them; and feedback from survival or fitness consequences that drives when and how such self-reconfiguration ought to work.  At some future date, understanding this may inform us when we are trying to decide which synthetic intelligences are and aren’t entitled to some basic protections, as a matter of ethics and responsibility.
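Put as a checklist, the test looks something like the sketch below.  The class and attribute names are hypothetical, chosen just for this illustration, and the conditions are necessary rather than sufficient.

```python
# The three structural requirements as a necessary-conditions checklist.
# Class and attribute names are hypothetical, invented for this sketch.

from dataclasses import dataclass

@dataclass
class ArchitectureDescription:
    predictions_drive_reconfiguration: bool    # predictions feed back into self-configuration
    reconfiguration_resources_limited: bool    # reconfiguring draws on finite reserves
    fitness_feedback_on_reconfiguration: bool  # consequences shape when/how it happens

def meets_requirements_for_emotion(a: ArchitectureDescription) -> bool:
    """Necessary (not sufficient) structural conditions for emotion."""
    return (a.predictions_drive_reconfiguration
            and a.reconfiguration_resources_limited
            and a.fitness_feedback_on_reconfiguration)

# A stateless model with fixed weights and unmetered compute fails all three:
print(meets_requirements_for_emotion(
    ArchitectureDescription(False, False, False)))  # False
```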