

Question

The Next Frontier Of Artificial Intelligence: Building Machines That Read Your Emotions: Affective Computing and Affectiva

How rapidly is AI advancing in emotional intelligence?

Recently, emotion-focused AI developer Affectiva became one of the few small businesses asked to join the Partnership on AI to Benefit People and Society. The interest of the "grand masters" of AI that make up the partnership – Google, Microsoft, Facebook, etc. – in Affectiva's growing business is a sure sign that this overlooked aspect of AI is starting to get the attention it deserves.

Anyone who has been frustrated asking questions of Siri or Alexa—and then annoyed at the digital assistant's tone-deaf responses—knows how dumb these supposedly intelligent assistants are, at least when it comes to emotional intelligence. "Even your dog knows when you're getting frustrated with it," says Rosalind Picard, director of Affective Computing Research at the Massachusetts Institute of Technology (MIT) Media Lab. "Siri doesn't yet have the intelligence of a dog," she says.

Is it wishful thinking or a pipe-dream to endow AI digital assistants with emotional intelligence?

So far, there are not that many research projects or university courses devoted to solving this highly complex puzzle.

Creating Emotional Artificial Intelligence
André Mainville, Ph.D.

Explanation / Answer

Imagine a world in which machines interpret the emotional state of humans and adapt their behavior to give appropriate responses to those emotions.

Well, artificial emotional intelligence, which is also known as emotion AI or affective computing, is already being used to develop systems and products that can recognize, interpret, process, and simulate human affects (with an “a,” not an “e”). In psychology, an “affect” is a term used to describe the experience of feeling or emotion.

If you’ve seen “Solo: A Star Wars Story”, then you’ve seen the poster child for artificial emotional intelligence: L3-37.

Lando Calrissian’s droid companion and navigator (voiced by Phoebe Waller-Bridge) instigates a slave revolt to escape from Kessel, but is severely damaged during the diversion. Lando (played by Donald Glover) is also injured during the getaway.

The “woke robot” demonstrates the ability to simulate empathy by interpreting the emotional state of a human, adapting its behavior to him, and giving an appropriate response to those emotions.

Now, this example might lead some video marketers and advertisers to think that emotion AI is science fiction. But it is very real.

A number of companies are already working to give computers the capacity to read our feelings and react, in ways that have come to seem startlingly human. This includes Affectiva, an emotion measurement technology company that spun out of MIT’s Media Lab in 2009, and Realeyes, an emotion tech company that spun out of Oxford University in 2007.

So, how do their technologies help brands, agencies, and media companies improve their advertising and marketing messages? Let’s tackle this question by examining how affective computing works.

Brands know emotions influence consumer behavior and decision making. So, they’re willing to spend money on market research to understand consumer emotional engagement with their brand content.

Affectiva uses a webcam to track a user's smirks, smiles, frowns, and furrows, which indicate the user's levels of surprise, amusement, or confusion.

It also uses a webcam to measure a person's heart rate without any wearable sensor, by tracking subtle color changes in the person's face that pulse each time the heart beats.
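Affectiva's actual algorithm is proprietary, but the general technique behind camera-based pulse measurement is known as remote photoplethysmography (rPPG): average the green-channel intensity of the face region in each video frame, then find the dominant frequency of that signal within the plausible human heart-rate band. Below is a minimal, illustrative sketch of that idea using a synthetic signal in place of real video frames; the function name and parameters are assumptions for this example, not Affectiva's API.

```python
import numpy as np

def estimate_bpm(green_signal, fps):
    """Estimate heart rate (BPM) from a per-frame mean green-channel trace.

    green_signal: 1-D array of average green intensity per video frame
    fps: camera frame rate in frames per second
    """
    sig = green_signal - np.mean(green_signal)   # remove the DC offset
    freqs = np.fft.rfftfreq(len(sig), d=1.0 / fps)
    power = np.abs(np.fft.rfft(sig)) ** 2
    # Restrict the search to a plausible heart-rate band: 0.7-4 Hz (42-240 BPM)
    band = (freqs >= 0.7) & (freqs <= 4.0)
    peak_freq = freqs[band][np.argmax(power[band])]
    return peak_freq * 60.0

# Synthetic demo: 30 fps video, 10 s long, with a 1.2 Hz pulse (72 BPM)
# riding on a constant skin tone plus camera noise.
np.random.seed(0)
fps, seconds = 30, 10
t = np.arange(fps * seconds) / fps
signal = 100 + 0.5 * np.sin(2 * np.pi * 1.2 * t) + 0.1 * np.random.randn(len(t))
print(estimate_bpm(signal, fps))  # → 72.0
```

In a real system the green trace would come from averaging pixels inside a detected face region per frame, and the signal would be bandpass-filtered and smoothed over a sliding window rather than analyzed in one FFT.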
