Part one: What is emotional AI and how is it embracing computing?


How affective computing and emotional AI are changing society


“…there are lots of facets to how we signal emotion, body language captures this … multi-level signals can be monitored in real-time” — from a talk on Emotional AI by Amir Tamrakar, Senior Technical Manager at SRI International

Our emotions are intrinsically associated with humanity. They inform cognition and are a vital aspect of human communication. So how can a machine possibly recognize human feelings and itself seem ‘emotional’?

Many artistic presentations of the “emotional computer” have had dystopian undertones. The 1973 book Demon Seed, by Dean R. Koontz, presents an emotionally controlling and even angry intelligent computer. The 2013 film Her is a love story between a man and a digital assistant. However, Emotional AI (Artificial Intelligence) has a more utopian back story influenced by biology.


The next step on the evolutionary roadmap of the computer is emotion recognition. Artificial Intelligence is the digital equivalent of a computer dipping a toe into the ocean; add in emotion, and computers can dive in and swim away from the shore.

AI is driving the new discipline of ‘Affective Computing’, where machines recognize and respond to human feelings to bring a new era in technology to life. Affective Computing is a truly multi-disciplinary approach to computing, combining skills and knowledge from areas as diverse as engineering, neuroscience, and behavioral psychology.

What is affective computing?

We already have the nascent murmurings of human-machine connectivity in the form of digital assistants. Amazon Echo has elicited ‘emotional responses’ the world over, some good, some not so good. The ability of a machine to interpret biometric data and adjust its behavior is evidenced in the Toyota Concept-i car. This smart car is based on SRI International’s Multimodal Driver Monitoring System (DMS), which uses biometric sensors to monitor the driver’s condition and adjust the car’s operation based on those inputs.


Affective Computing takes smart to new levels. It is all about emotions: “affect” is another word for emotion. As already mentioned, the discipline draws on many areas that deal with computing and human behavior, pulling them together to create highly innovative, game-changing tools.

The concept was first proposed in a beautifully composed seminal paper by Rosalind Picard, published in 1995, entitled “Affective Computing”. The paper is written with an emphasis on the (then early) development of ‘wearables’. One of the conclusions of the paper is:

“emotions play a necessary role not only in human creativity and intelligence, but also in rational human thinking and decision-making. Computers that will interact naturally and intelligently with humans need the ability to at least recognize and express affect.”

The building blocks of Emotional AI

The emotional computer is in many ways like its human counterpart. Just as the human brain relies on the limbic system, with multiple connected parts working together, to process emotion, so too Affective Computing is built from a set of fundamental building blocks.

In 2018, IEEE published an article that outlines the three building blocks of emotional AI:

Emotion recognition

This fundamental area of emotional AI plays a significant part in music, sound, images, video, and text. It works primarily by analyzing acoustic speech, written content, facial expressions, posture and movement, and even brain activity. Early emotion recognition engines include openSMILE, used for audio analysis, and OpenCV, which is used with video content. Current emotional AI solutions, such as the End2You toolkit, focus heavily on end-to-end learning.
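As a minimal sketch of the facial-expression branch of such a pipeline, the snippet below uses OpenCV (mentioned above) to detect faces in an image; the emotion classifier itself is a hypothetical placeholder standing in for a trained model, since training one is beyond the scope of this article.

```python
# Minimal sketch: face detection as the front end of a facial-expression
# emotion recognizer. OpenCV supplies the detector; classify_emotion is a
# hypothetical stand-in for a trained model applied to each face crop.
import cv2

# Haar cascade face detector bundled with OpenCV.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def classify_emotion(face_pixels):
    """Hypothetical placeholder: a real system would run a trained model here."""
    return "neutral"

def emotions_in_frame(frame):
    """Detect faces in a BGR image and label each with an emotion."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    results = []
    for (x, y, w, h) in faces:
        crop = gray[y:y + h, x:x + w]
        results.append(((x, y, w, h), classify_emotion(crop)))
    return results

if __name__ == "__main__":
    frame = cv2.imread("example.jpg")  # any image containing faces
    if frame is not None:
        print(emotions_in_frame(frame))
```

A production system would replace the placeholder with a model trained on labeled expression data, and would typically fuse this visual signal with audio and text cues.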

Emotion generation

These technologies have been around for more than three decades and primarily use pre-defined rules rather than data-trained models. Text-to-speech systems such as the MARY Text-to-Speech (MaryTTS) engine are the most prevalent examples of these solutions.
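To illustrate the rule-based character of these systems, here is a minimal sketch in which a fixed table maps an emotion label to prosody settings that a text-to-speech engine could apply. The parameter names (rate, pitch shift, volume) are illustrative assumptions, not the interface of MaryTTS or any particular engine.

```python
# Minimal sketch of rule-based emotion generation: a fixed rule table maps an
# emotion label to prosody adjustments for speech output. Parameter names are
# illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class Prosody:
    rate: float         # speaking-rate multiplier relative to neutral
    pitch_shift: float  # semitones relative to the neutral voice
    volume: float       # loudness multiplier relative to neutral

# Pre-defined rules, in the spirit of classic rule-based systems.
EMOTION_RULES = {
    "neutral": Prosody(rate=1.00, pitch_shift=0.0, volume=1.0),
    "happy":   Prosody(rate=1.15, pitch_shift=+2.0, volume=1.1),
    "sad":     Prosody(rate=0.85, pitch_shift=-2.0, volume=0.9),
    "angry":   Prosody(rate=1.10, pitch_shift=+1.0, volume=1.3),
}

def prosody_for(emotion: str) -> Prosody:
    """Fall back to the neutral rule when an emotion has no entry."""
    return EMOTION_RULES.get(emotion, EMOTION_RULES["neutral"])

if __name__ == "__main__":
    print("sad ->", prosody_for("sad"))
```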

Emotion augmentation

These solutions center on taking human-facing AI engines and adding emotional capabilities to them. The SEMAINE project and ARIA-VALUSPA are examples of AI engines that enable developers to create virtual characters that can sustain interactions with humans for an extended period and react appropriately to the user’s non-verbal behavior.
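The sketch below illustrates the general idea under simplified assumptions: an existing, emotion-unaware response generator is wrapped in a layer that adapts its output to the user’s detected emotion. It is an illustration of emotion augmentation only, not how SEMAINE or ARIA-VALUSPA are actually implemented.

```python
# Minimal sketch of emotion augmentation: wrap an existing, human-facing
# response generator with an affect-aware layer that adapts its style to the
# user's detected emotion. All names here are hypothetical.

def base_response(user_text: str) -> str:
    """Stand-in for any existing (emotion-unaware) AI engine."""
    return f"Here is some information about: {user_text}"

# Simple style rules keyed by the emotion detected from the user.
STYLE_RULES = {
    "frustrated": "Sorry this is proving difficult. ",
    "happy": "Glad that's working for you! ",
    "neutral": "",
}

def augmented_response(user_text: str, detected_emotion: str) -> str:
    """Prepend an affect-appropriate framing to the base engine's reply."""
    prefix = STYLE_RULES.get(detected_emotion, STYLE_RULES["neutral"])
    return prefix + base_response(user_text)

if __name__ == "__main__":
    print(augmented_response("resetting my password", "frustrated"))
```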

Emotional AI and affective computing: where next?

The building blocks of Emotional AI are established, and enabling developments within each of these areas provide the bedrock for further progress. As we continue to build more accurate systems based on Emotional AI, we will likely see further innovation in areas such as wearables, smart cars, and healthcare.

In Part Two, we will explore further the areas Emotional AI is pushing into, whilst keeping a watchful eye on the privacy aspects of the technology.

