Affective Computing - Straight from the Heart

Computing with heart

By Peter Fingar  |  February 15, 2018

To open our discussion, let’s turn to M.I.T.’s Affective Computing group, which aims to bridge the gap between human emotions and computational technology: “Affective Computing is computing that relates to, arises from, or deliberately influences emotion or other affective phenomena. Emotion is fundamental to human experience, influencing cognition, perception, and everyday tasks such as learning, communication, and even rational decision-making.”

However, technologists have largely ignored emotion and created an often frustrating experience for people, in part because affect has been misunderstood and hard to measure. The group’s research sets out to develop new technologies and theories that advance basic understanding of affect and its role in human experience, aiming to restore a proper balance between emotion and cognition in the design of technologies for addressing human needs.

“Affective Computing research combines engineering and computer science with psychology, cognitive science, neuroscience, sociology, education, psychophysiology, value-centered design, ethics, and more. We bring together individuals with a diversity of technical, artistic, and human abilities in a collaborative spirit to push the boundaries of what can be achieved to improve human affective experience with technology.”

Affectiva, a spin-out of MIT’s Media Lab led by Rana el Kaliouby, states that its technology is used by one-third of the Fortune Global 100 and 1,400+ brands, including Mars, Kellogg’s, and CBS, to measure consumers’ emotional responses to digital content.

Computer scientist Rana el Kaliouby presents, at a TED conference, an app that knows how you feel by the look on your face. Dr. el Kaliouby is CEO of MIT spin-out Affectiva.

Tel Aviv-based Beyond Verbal Communication, Ltd. commercializes technology that extracts a person’s full set of emotions and character traits from their raw voice, in real time, as they speak. This ability to extract, decode, and measure human moods, attitudes, and decision-making profiles introduces a whole new dimension of emotional understanding, which the firm calls Emotions Analytics™, transforming the way we interact with machines and with each other.

The company developed software that can detect 400 different variations of human “moods,” and is now integrating this software into call centers, where it can help a sales assistant understand and react to a customer’s emotions in real time. The software can also pinpoint how consumers make decisions and help influence those decisions. For example, if the caller is an innovator, you want to offer the latest and greatest product; if the customer is conservative, you offer them something tried and true. Talk about targeted advertising! Think this is for tomorrow? It’s already embedded in the PULS smartband, and it is being sold to large call centers to assist in customer service.
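To make that call-center scenario concrete, here is a minimal sketch in Python of how a detected mood and decision-making profile might be turned into real-time guidance for a sales assistant. The class and function names are hypothetical illustrations, not Beyond Verbal’s actual API:

    from dataclasses import dataclass

    @dataclass
    class EmotionReading:
        """A single real-time reading from a (hypothetical) voice-analytics engine."""
        mood: str              # e.g. "enthusiastic", "hesitant", "irritated"
        decision_profile: str  # e.g. "innovator" or "conservative"

    def suggest_pitch(reading: EmotionReading) -> str:
        """Turn the detected mood and profile into guidance for the sales assistant."""
        if reading.mood == "irritated":
            return "De-escalate first: acknowledge the frustration before selling anything."
        if reading.decision_profile == "innovator":
            return "Lead with the newest product in the line-up."
        if reading.decision_profile == "conservative":
            return "Lead with the established, proven product."
        return "No strong signal: continue the standard script."

    # Example: a hesitant but early-adopter caller
    print(suggest_pitch(EmotionReading(mood="hesitant", decision_profile="innovator")))

In a real deployment the readings would come from the vendor’s voice-analytics engine rather than being constructed by hand; the point is only that the decision logic on top of the emotion signal can be very simple.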

Meet Pepper. In June 2014, SoftBank CEO Masayoshi Son announced an amazing new robot called Pepper. The most amazing feature isn’t that it will cost only $1,900; it’s that Pepper is designed to understand and respond to human emotion. Update: IBM has joined forces with SoftBank to have Watson cover Pepper’s back!

Pepper is designed with a single goal in mind: to become a household companion for its owners. The robot is capable of judging situations and adapting rationally, as well as recognizing human tones and expressions to see how someone feels. Pepper’s software was developed with the purpose of making it “able to recognize people’s emotions by analyzing their speech, facial expressions, and body language, and then deliver appropriate responses.” Pepper is the robot with “a heart.” Pepper still has some kinks and does not “behave perfectly in all situations,” but it learns on its own. Observation of human responses, such as laughing at a joke, is central to Pepper’s ability to learn.
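What that description amounts to is a fuse-then-respond loop with feedback: combine emotion cues from speech, face, and body language, pick a response, and reinforce whatever earns a positive reaction. Here is a minimal Python sketch of that loop, with invented function names and weights rather than Pepper’s real software:

    from collections import defaultdict

    def fuse(speech: dict, face: dict, body: dict) -> str:
        """Combine per-modality emotion scores and return the most likely emotion."""
        totals = defaultdict(float)
        for modality in (speech, face, body):
            for emotion, score in modality.items():
                totals[emotion] += score
        return max(totals, key=totals.get)

    # Preference weights over candidate responses, nudged by observed feedback
    response_weights = {"tell_joke": 1.0, "offer_help": 1.0, "stay_quiet": 1.0}

    def respond(emotion: str) -> str:
        """Pick the currently highest-weighted response for the detected emotion."""
        candidates = {"sad": ["tell_joke", "offer_help"],
                      "happy": ["tell_joke"],
                      "angry": ["stay_quiet", "offer_help"]}.get(emotion, ["offer_help"])
        return max(candidates, key=lambda r: response_weights[r])

    def observe_reaction(response: str, person_laughed: bool) -> None:
        """Reinforce responses that got a positive reaction, e.g. laughter at a joke."""
        response_weights[response] += 0.1 if person_laughed else -0.1

    emotion = fuse({"sad": 0.6, "happy": 0.1}, {"sad": 0.5}, {"sad": 0.3, "angry": 0.2})
    choice = respond(emotion)                  # e.g. "tell_joke" for a sad-looking person
    observe_reaction(choice, person_laughed=True)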

As reported in the Washington Post, “Cognitive psychologist Mary Czerwinski and her boyfriend were having a vigorous argument as they drove to Vancouver, B.C., from Seattle, where she works at Microsoft Research. She can’t remember the subject, but she does recall that suddenly, his phone went off, and he read out the text message: ‘Your friend Mary isn’t feeling well. You might want to give her a call.’

Clothes that expose emotions

“At the time, Czerwinski was wearing on her wrist a wireless device intended to monitor her emotional ups and downs. Similar to the technology used in lie detector tests, it interprets signals such as heart rate and electrical changes in the skin. The argument may have been trivial, but Czerwinski’s internal response was not. That prompted the device to send a distress message to her cellphone, which broadcast it to a network of her friends. Including the one with whom she was arguing, right beside her. Ain’t technology grand?”
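At its core, the wearable in this story is a threshold detector over physiological signals (the same ones a polygraph reads) wired to a notification fan-out. A minimal Python sketch of that logic, with made-up thresholds and function names rather than the actual research prototype:

    def is_distressed(heart_rate_bpm: float, skin_conductance_uS: float,
                      resting_hr: float = 70.0, resting_sc: float = 2.0) -> bool:
        """Flag distress when both signals are well above the wearer's baseline."""
        return heart_rate_bpm > resting_hr * 1.3 and skin_conductance_uS > resting_sc * 1.5

    def notify_friends(friends: list[str], wearer: str) -> None:
        for friend in friends:
            # Stand-in for an SMS/push notification to the friend's phone
            print(f"To {friend}: Your friend {wearer} isn't feeling well. "
                  f"You might want to give her a call.")

    # A spike in both signals during the argument triggers the broadcast
    if is_distressed(heart_rate_bpm=98, skin_conductance_uS=3.6):
        notify_friends(["Alex", "Sam"], wearer="Mary")

The awkward part of the anecdote follows directly from the fan-out: the broadcast list has no idea which friend happens to be sitting in the driver’s seat.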

Now, let’s look at an application for military personnel. “The computer will see you now” is the title of an article in The Economist. “Ellie is a psychologist, and a damned good one at that. Smile in a certain way, and she knows precisely what your smile means. Develop a nervous tic or tension in an eye, and she instantly picks up on it. She listens to what you say, processes every word, works out the meaning of your pitch, your tone, your posture, everything. She is at the top of her game but, according to a new study, her greatest asset is that she is not human.”

In The Economist’s article, “The computer will see you now,” Ellie is a virtual shrink that “may sometimes be better than the real thing.”

“When faced with tough or potentially embarrassing questions, people often do not tell doctors what they need to hear. Yet the researchers behind Ellie, led by Jonathan Gratch at the University of Southern California Institute for Creative Technologies in Los Angeles, suspected from their years of monitoring human interactions with computers that people might be more willing to talk if presented with an avatar. This quality of encouraging openness and honesty, Dr. Gratch believes, will be of particular value in assessing the psychological problems of soldiers—a view shared by America’s Defense Advanced Research Projects Agency, which is helping to pay for the project.

“Soldiers place a premium on being tough, and many avoid seeing psychologists at all costs. That means conditions such as post-traumatic stress disorder (PTSD), to which military men and women are particularly prone, often get dangerous before they are caught. Ellie could change things for the better by confidentially informing soldiers with PTSD that she feels they could be a risk to themselves and others, and advising them on how to seek treatment.”

On the heels of a fine Valentine's Day yesterday, I'm feeling big with heart, so I'll let you in on another source I follow to keep up with affective computing: IEEE Transactions on Affective Computing (TAC).

Lisa Wood adds:
Sophia, the first robot to obtain citizenship (in Saudi Arabia).