Will AI Be Able to Grasp Human Emotion?

Drawing by Charlie Elizabeth ©

Artificial intelligence is the technology that will define the future of humanity. The scientific community is making strides toward machines that can perceive and respond to human feeling, but can a computer ever truly understand emotion? Some companies claim to have already cracked the algorithm for reading human emotion, but researchers at MIT are not so sure. 

The pioneering research teams in emotion AI most likely still have a long way to go before computers can fully grasp the mysterious and unpredictable range of emotions humans express. Emotion AI also attracts enormous financial backing for research, which means progress could move quickly, and we can expect major strides in these algorithmic technologies soon. The field is predicted to be worth $25 billion within the next few years. 

Can AI Understand the Nuances of Emotions? 

Human emotion is so complex that many of its means of expression remain mysterious even to leading experts in the field. As with most artificial intelligence research, one goal of emotion AI is to use technology to exceed human capabilities and solve problems that have not yet been solved. 

An intuitive person can see beyond the obvious expressions of emotion, such as facial expressions and body language. The question is: can an AI be programmed to have the same level of intuition when it comes to detecting human emotion? 

Things to Consider 

Here are a few of the things AI programmers will need to consider when it comes to emotion AI. One purpose of emotion AI is to create a virtual personality that people can interact with. This type of AI detects the user's emotion and mirrors it back, which creates a feeling of connection. Just because an AI can mimic human emotions does not mean it fully understands what it means to feel them, but it is a step closer to that goal. 

Facial Expression 

The most obvious indicators of emotion are facial expressions. Many of us wear our thoughts and feelings on our faces. This includes the way the mouth moves: smiling, laughing, frowning, and crying. The eyes and eyebrows are also important indicators of emotion. Are the eyes cast downward in sadness or opened wide in surprise? Even a raised eyebrow can convey a lot.

Analyzing facial expressions is one of the most straightforward ways that AI can detect changes in emotion, and the technology already exists in the form of facial mapping. The problem is that we humans all know how to fake a smile, which means facial expression alone is not enough for AI to fully grasp emotion.
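To make the idea concrete, here is a minimal sketch of how mapped facial features might be turned into an emotion label. The feature names, thresholds, and labels are all invented for illustration; a real system would derive these from a face-mapping model and trained data, not hand-written rules.

```python
# Illustrative only: in a real system these features would come from a
# face-mapping library. The names and thresholds here are hypothetical.

def classify_expression(mouth_curve: float, eye_openness: float) -> str:
    """Map two toy facial features to a coarse emotion label.

    mouth_curve:  > 0 means mouth corners raised (smile), < 0 lowered (frown)
    eye_openness: 0.0 (closed) to 1.0 (wide open)
    """
    if eye_openness > 0.8:
        return "surprise"          # eyes spread open wide
    if mouth_curve > 0.3:
        return "happiness"         # clear smile
    if mouth_curve < -0.3 and eye_openness < 0.4:
        return "sadness"           # frown with a lowered gaze
    return "neutral"

print(classify_expression(0.5, 0.5))   # raised mouth corners -> happiness
print(classify_expression(-0.4, 0.3))  # frown, lowered gaze  -> sadness
```

The fake-smile problem shows up immediately in a sketch like this: a forced smile scores exactly like a genuine one, which is why facial features alone are not enough.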

Tone of Voice      

Tone of voice is a huge tell when it comes to emotion. High- and low-pitched tones can indicate surprise, dejection, or excitement. When people are angry or excited they tend to talk faster, and when a speaker is sad or unsure their speech slows down. 

By analyzing the frequencies and speed of a person's voice as they express different emotions, an AI algorithm can build a map for interpreting tone of voice as it connects to emotion. 
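The pitch-and-speed mapping described above can be sketched as a toy rule set. In a real pipeline both values would be extracted from audio by signal-processing code; here they are passed in directly, and every threshold is an illustrative guess rather than a research-backed figure.

```python
# Toy sketch of the voice-to-emotion map described above.
# Thresholds are hypothetical, chosen only to make the example readable.

def classify_tone(mean_pitch_hz: float, words_per_minute: float) -> str:
    """Guess a coarse emotion from average pitch and speaking rate."""
    fast = words_per_minute > 160   # angry or excited speakers talk faster
    high = mean_pitch_hz > 200      # higher pitch: surprise or excitement
    if fast and high:
        return "excited"
    if fast:
        return "angry"
    if words_per_minute < 110:      # sad or unsure speech slows down
        return "sad or unsure"
    return "calm"

print(classify_tone(230, 180))  # high pitch, fast speech -> excited
print(classify_tone(140, 95))   # low pitch, slow speech  -> sad or unsure
```

Even this crude map shows why voice is a useful second signal: it can disagree with a faked facial expression.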

Body Language

Body language is just as important as voice and facial expression when it comes to detecting and mimicking emotion. In the future, AI may be used in job interviews, analyzing body language for signs of dishonesty or a lack of confidence. 

Personality 

One of the challenges for programmers of emotion AI is that everyone expresses emotion differently. Current emotion AI products go through a period of trial and error when interacting with an individual, but as the AI gathers data on the user, it becomes more effective at responding to the nuances of their personality. 

Cultural Differences 

Another important factor in AI grasping human emotion is cultural difference in how emotion is expressed. An AI programmed for Americans may not respond adequately to the emotional expressions of other cultures. 

This means that a range of data needs to be collected from individuals across a variety of cultures so that the AI can accurately interpret facial expressions and body language, which can vary greatly depending on the country and community of the individual. 

What Does the Future Hold For Emotion AI?

Every day, AI analyzes more and more data from individuals all over the world. As the datasets grow and the algorithms become more advanced and precise, we can expect AI to reach a point where it can almost fully grasp emotion, enabling machines to interact with individuals in a very human way. 

Being able to detect and respond to a person's emotions allows a machine to create a bond with that person, so we can expect a rise in AI companions. With the world in the midst of a global pandemic, demand for emotion AI is rising as people seek companionship within their homes. 

Most emotion AI today mimics human emotion by analyzing things like tone of voice, facial expression, and body language. Future advancements will aim for an even more human experience. For example, an AI that merely mimics the user's emotions will have a "sad" reaction when the user is sad; an AI that responds more intuitively could meet sadness with comfort, helping the user not just feel less alone but actually feel better. 

Current emotion AI can detect inflections in the voice and interpret facial expressions, and the technology has the financial backing to reach higher levels in the coming years. Emotion AI is being programmed for more intuitive responses that are not just emotional but emotionally intelligent as well. 

Charlie Beth

Multimedia artist from North Carolina.

https://www.instagram.com/charliebeth.art