Is it impossible for an AI smart machine to have emotions? How might an AI smart machine deal with us and our human emotions? Will it be tolerant, or will it get confused?
I remember my aunt, a nurse herself, relating a scene of a cheery nurse bouncing in and asking, “And how are we this morning?” The reply came, “Well, I don’t know about you, but I feel…” I don’t remember if it happened to her, or if she was relating something she observed during her career. My imagination showed me a nurse with a British accent marching into a hospital room and flinging the curtains open. I have heard that robots are being deployed to work in hospitals, initially to dispense medication. I know from my limited experience visiting patients in hospitals that a nurse can make a huge difference to a patient. Some have an extremely comforting bedside manner. But what happens when the feelings are absent? Can feelings be accurately simulated by AI?
I watched the original Star Trek TV series and movies. The highly logical, non-emotional, half-human and half-Vulcan Mr. Spock presented the show with many opportunities to explore ideas about feelings and emotions versus logic. In the Star Trek stories, do you recall that emotions are seen as one of the ultimate human traits, able, in the end, to solve problems that defy logic?
So the question is: if the Technojungle is run by computers, and computers run on logic, will computers one day acquire feelings and emotions and become human, or nearly human?
Human emotions are reactions to various stimuli, such as things we hear, see, or touch. We can also have a more permanent emotional state. For example, you can be happy about the current moment, you can be happy about something in particular, or you can simply be a generally happy person. A happy person has happy emotions, which in turn cause feelings of happiness. An emotion such as love can give rise to a range of feelings, from feeling amorous to feeling hate. Our reactions make us happy or sad, comfortable or uncomfortable. Our emotions cause us to have feelings, but the two are distinct. You can have a feeling of hunger, which is not considered an emotion. In general, emotions are unconscious while feelings are conscious.
There is a sub-field of AI computing called Emotion AI. Feelings and emotions can be confusing, but, as AI learns, it can recognize what stimuli cause a particular emotion. Using facial recognition, along with tone of voice and perhaps other outward displays of feeling, computers can recognize the emotional state of a human. Smarter machines can probably tell more through heart rate, pulse, blood pressure, maybe even brain waves. That may not be entirely the way humans understand each other’s emotional states. In what other ways do you think people detect emotions in other people? I’m sure we can all agree that human emotions can be very complex. Sometimes we can be really good at hiding our feelings and emotions. They may lurk under the surface and be difficult to detect.
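The core idea behind this kind of Emotion AI can be sketched in a few lines of code. Real systems are trained on large labeled datasets; the toy function below merely illustrates the notion of mapping measurable outward signals to a guessed inner state. Every signal name and threshold here is invented for illustration, not taken from any real product.

```python
# Illustrative only: a toy mapping from hypothetical sensor readings
# (facial recognition, tone of voice, a wearable heart-rate monitor)
# to a guessed emotional state. All thresholds are invented.

def estimate_emotion(smile_score: float, voice_energy: float, heart_rate: int) -> str:
    """Guess an emotional state from three hypothetical signals.

    smile_score:  0.0 (none) to 1.0 (broad smile), e.g. from facial recognition
    voice_energy: 0.0 (flat) to 1.0 (animated), e.g. from tone-of-voice analysis
    heart_rate:   beats per minute, e.g. from a wearable sensor
    """
    if smile_score > 0.6 and voice_energy > 0.5:
        return "happy"
    if heart_rate > 100 and smile_score < 0.3:
        return "stressed"
    if voice_energy < 0.2 and smile_score < 0.3:
        return "sad"
    return "uncertain"  # hidden feelings: the signals do not add up
```

Notice what the final line concedes: when we hide our feelings, the outward signals contradict each other, and even this simple sketch has to give up and answer "uncertain."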
Have you ever experienced someone saying something like, “I am sensing that you are sad”? Can emotions be infectious? Sometimes a person who is happy can infect those around them without anyone knowing. We can somehow sense the emotional states of those around us. That sounds like something a computer would find impossible, since we can barely, if at all, understand how we ourselves sense these invisible emotions. Have you ever had someone say something to you like, “I’m picking up your bad vibes”?
How will AI smart machines be able to recognize human emotional states? Smart machines can already sense emotional states through visual and auditory monitoring. However, we humans can hide our emotions and ‘put on a face.’ Still, humans have some deep connections that can allow us to see through and sense the true emotional state of another human. We might call this a ‘gut’ feeling.
For a Technojungle computer to understand us humans and react in human ways, wouldn’t it need to be able to predict the unpredictable and the illogical? Computers can only predict the logical. When we humans react out of our emotions, don’t we sometimes behave unpredictably and irrationally? We are illogical. Can a computer be programmed to be illogical? It would have to keep track of how we react and predict our actions from that database.
Can you think of a time when a person, perhaps yourself, became all emotional over something that had happened and had an emotional outburst? Was it taken as strange and unusual? Or was it understood, because we all have emotions? How can a Technojungle computer understand our emotional outbursts? Would it intervene in some way, or simply go into standby mode and wait for the human emotional outburst (HEO) to pass?
One aspect of feelings and emotions that can be difficult to understand at a particular time is the mood we are in. Mood is more long-term, a state of feelings and emotions that we carry around for a period of time. Do your moods often cause you to behave in unusual ways? Mood might be considered baggage. Certainly our moods can be unexpected and present a dilemma to the Technojungle.
What are some emotions we can learn to cherish because they are deeply human? What about the deeply human emotions of love, hate, anger, trust, joy, panic, fear and grief?
What sorts of moods do we experience? How about happiness, sadness, frustration, contentment, or anxiety? There is an old song, “I’m in the Mood for Love.” How do we present these emotions and moods? Do they have both a physical outward response and a mental inward response? A computer can detect the outward, but how would it detect the inward, when we find it difficult to detect even in those around us, or in those to whom we are speaking? “You are frustrated, I can hear it in your voice.” Even if a Technojungle machine could one day read our brain waves, would we even want it to? Or would we by then be so under the spell of the Technojungle that we wouldn’t care, or might even welcome it? What would the promises of that capability be? What baggage might come with it?

How do our emotions and moods influence our behaviours? I think the smart Technojungle machines in our lives will no doubt have great difficulty dealing with our behaviour in addition to our emotions. The machines will have to ask us what we are feeling so that they can determine and understand why we have behaved a certain way. But if a machine asks you what you are feeling, how will it know how to react when sometimes we don’t actually understand what we are feeling ourselves, or how to express it? The machine may observe our behaviour and then determine possible emotional reasons for that behaviour. I wonder if it might be able to encourage us when we are sad, countering our negative self-talk with positive reinforcement that might just change our mood. What sorts of misinterpretations do you think will occur?
Our Technojungle, built by us humans, is based on human knowledge, but it runs on the logic of Technojungle computers, not the feelings and emotions of human beings. It is full of triggers for emotions that artificially intelligent smart machines will have difficulty understanding. Can you think of some examples a smart machine might have difficulty understanding—perhaps something humorous? A web page may display a hilarious picture that we laugh at. The smart machine would have to recognize everything in the picture and attempt to determine what it is about it that we find funny. That sounds quite difficult, particularly if the picture is a drawing or cartoon. Comedy is often very subjective because it is emotional, and that makes it a difficult task for a Technojungle computer. What one person finds funny may not be so funny—even insulting—to another person.
There is plenty of music available in the Technojungle, and music is one of the most emotion-triggering of human activities and products. Music is a universal language that can communicate deep thoughts and emotions. Because music can be analyzed, I think its technical aspects can reveal some of what the music is trying to convey. But doesn’t the performance of music, or dance, or any art, infuse it with subtle, undetectable emotions that must be felt deeply within the human soul and even spirit? Here the Technojungle gets lost.
Alan Turing, the inventor of the modern computer whom we learned about in book one, actually programmed a computer to play music in 1951. Since then, AI has become capable of writing music. But what about recognizing music and the emotions it evokes?
I wonder if there is a comparison between image and facial recognition—even recognizing feeling in faces—and music. We know that AI can recognize content in images; this is accomplished by showing the AI millions of images that have already been labeled. Can AI learn to recognize emotion in music through similar means? Or are music and the feelings humans experience just too complex? Is it possible to categorize music into emotional responses that could determine possible feelings? Or do humans differ so much in their responses to music that categorization and recognition of emotion are out of the question?
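The labeled-examples approach described above can be sketched in miniature. In this toy version, a handful of invented (tempo, minor-key) pairs stand in for whatever features an AI might extract from audio, and a new piece is simply labeled with the emotion of its nearest labeled example. Real systems use millions of examples and far richer learned features; the example data, features, and labels here are all hypothetical.

```python
# A toy sketch of learning emotion labels from labeled examples,
# applied to music instead of images. Features and labels are invented.

LABELED_EXAMPLES = [
    ((60, 1), "sad"),     # slow tempo (bpm), minor key
    ((70, 1), "sad"),
    ((120, 0), "happy"),  # brisk tempo, major key
    ((140, 0), "happy"),
    ((150, 1), "tense"),  # fast tempo, minor key
]

def nearest_emotion(tempo_bpm: int, is_minor: int) -> str:
    """Label a new piece with the emotion of its closest labeled example."""
    def distance(example):
        (tempo, minor), _label = example
        # Weight the key difference heavily, since major/minor is a
        # strong (if crude) cue for emotional colour.
        return abs(tempo - tempo_bpm) + 100 * abs(minor - is_minor)
    return min(LABELED_EXAMPLES, key=distance)[1]
```

The sketch also makes the chapter's doubt concrete: the method can only echo back the labels its human teachers supplied, so if listeners disagree about what a piece of music feels like, there is no single "correct" label to learn in the first place.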
Could feelings and emotions be attached to our human spirit, which may bury them even deeper inside us—further than a Technojungle computer would ever be able to reach? Could there be a barrier to the human spirit that presents an impassable aspect of the human being, one the Technojungle can never enter? A place within each human from which traits such as feelings, emotions, and moods emerge. Such a place would be deeply personal and individual; a place accessible only by another human soul and spirit. As the Technojungle becomes smarter and digs even deeper into our lives, could that place somewhere deep in the human spirit be the final place we can retreat to? What would that be like? The promise for our future is living life deep in our human spirit where the Technojungle may not touch us.
Here’s another possibility. If you spend most of your life interacting with the Technojungle and not at a deep emotional level with other humans, could you lose touch with that spiritual aspect of yourself? Could this be precisely what is happening? I have alluded to this before: the Technojungle strives to become more human while our humanness and humanity deteriorate as we become more machine-like. Are we becoming less emotional? Does the Technojungle keep us so busy and occupied that we have lost touch with our feelings and emotions? Or are our feelings and emotions being overwhelmed?
Can we learn to pay attention to, better understand, and spend time nurturing the emotional aspects of being human? If we do, could we ensure that we human beings will survive in the Technojungle and even conquer and civilize it? Is there content out there in the Technojungle that can trigger human emotions artificially intelligent computers may never understand? Is music one of these? Have we buried these triggers, along with our emotions, out there to find someday when we seek them? Perhaps they are deeply human deposits of treasure just waiting to be discovered. How can we ensure we do not let the Technojungle subdue our emotions; that we don’t lose or bury them to the point that we are no longer as human as we can be?
While we learn to nurture those aspects of us that make us truly human, can we discover and gather those hidden treasures of humanity that can confound the Technojungle and show what being human in the Technojungle really means? Could what computers and AI struggle with actually show us where to focus to become better human beings living in this world of technology—the Technojungle?
It may be impossible to predict how human emotions might play into our world when we have truly smart and intelligent Technojungle machines. There will certainly be some surprises. To a computer, some of our truly human activities might be garbage.