Investor, Speaker, and CEO at Behavioral Signals. Listed in Inc. as an "AI Entrepreneur to Watch".
How people behave in solitude is vastly different from how they behave in public, but the foundation of one’s persona remains constant. Dancing around the apartment when nobody’s watching expresses a secret desire to do so on a grand stage, but humans modulate those whims as societal norms dictate.
Emotional intelligence determines everything from our confidence to our communication skills to our everyday routines. As individuals maneuver from the comfort of the home to the world outside, they become social creatures. One’s emotional intelligence shifts gears from self-indulgent to self-monitoring.
Thought leaders and industry pioneers strive to improve our public discourse, one conversation at a time. Personal AI assistants may be the best tools to understand the rift between the personal and public spheres of influence. When people are alone, their inhibitions melt away. But when they are alone with a mechanized assistant, they have a tech mirror that reflects their true selves.
Case in point: perhaps you ask your voice-activated system to repeat ugly words or validate your darkest thoughts. This is a way for you to act in antisocial ways outside the judgmental purview of your human peer groups.
Conversely, you might flatter your AI and flirt as if you’re on a digital date. So is the AI world reciprocating your affection? How can technology engineer emotion detection to encourage the best that humanity has to offer? And how can AI learn from our collective mistakes to create a less divisive future?
Not to be too biblical about the matter, but consider the golden robot rule: do unto your AI as you would have it do unto you. If you are polite and courteous with your voice assistant, the subtle inflections of your voice will register in its emotion-detection system. This opens up your dialogue to more multifaceted interactions.
Research indicates that humans want their voice assistants to be confident, yet subordinate; polite, yet productive. As with any communiqué, this is a delicate balance, requiring effort from both parties to yield the most beneficial results.
For example, if you say “Good morning” to your AI every dawn, it will most likely respond with a similar greeting. But if you deliver those same words while sobbing, an emotion AI detector will know that the meaning of your terms and the inflection behind them are at odds. You may be wishing others a “good” morning, but you certainly aren’t experiencing one yourself. A sympathetic program will inquire as to why your salutation from the previous day was more genuine than the tear-stained one from today.
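The mismatch described above can be sketched in a few lines of code. This is a minimal illustration, not any vendor's actual system: the tiny word lexicon, the emotion labels, and the rule for flagging a conflict are all assumptions made for the example. A real emotion AI would replace both pieces with trained models over text and audio.

```python
# Illustrative sketch: flag when the literal meaning of the words and the
# detected vocal emotion are at odds ("Good morning," spoken while sobbing).
# The lexicon and labels below are toy assumptions, not a real API.

POSITIVE_WORDS = {"good", "great", "happy", "wonderful"}
NEGATIVE_WORDS = {"bad", "awful", "sad", "terrible"}


def text_sentiment(transcript: str) -> str:
    """Crude lexicon-based sentiment of the literal words."""
    words = transcript.lower().split()
    score = sum(w in POSITIVE_WORDS for w in words) - sum(
        w in NEGATIVE_WORDS for w in words
    )
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"


def detect_mismatch(transcript: str, voice_emotion: str) -> bool:
    """True when words and tone disagree.

    `voice_emotion` stands in for the output of an acoustic emotion
    classifier (e.g. "happy", "sad", "angry", "neutral").
    """
    sentiment = text_sentiment(transcript)
    if sentiment == "positive" and voice_emotion in {"sad", "angry"}:
        return True
    if sentiment == "negative" and voice_emotion in {"happy"}:
        return True
    return False


# "Good morning" delivered while sobbing: positive words, sad tone.
print(detect_mismatch("Good morning", "sad"))    # True
print(detect_mismatch("Good morning", "happy"))  # False
```

A sympathetic assistant would treat a `True` result as a cue to follow up, rather than replying with the same canned greeting it uses on a genuinely good morning.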
Now consider the opposite scenario. If you cuss and yell at your AI unit, it will reply with a perfunctory set of responses without engaging in the negativity you are expressing. You don’t gain any interpersonal (so to speak) connection with your voice assistant. From the AI’s vantage point, you are an emotional brick wall who just needs simple tasks performed, with no deeper quest for learning. In that sense, you aren’t getting the most from your emotion AI software. You should be understanding if you want to be understood.
This is a cyclical concept. The more people can empathize with the AI in their lives, the more it becomes their companion in the true sense of the word. In a fascinating recent study, humans were given the opportunity to look through the eyes of a robot and see the world through its digital lens. The result: participants’ admiration for their mechanized counterparts increased dramatically.
For years, scientists have been asking robots to think and act as they see fit, but now they are starting to shift the paradigm. How do AI units feel? And how can humans cheer them up for a change?
As previously discussed, there is a distinct dichotomy between our private and public selves, but there is also a continuum at play. If you have a rocky wake-up experience, it is bound to spill over into your morning commute, clouding your entire day. By the time you arrive at the workplace, your mood is firmly negative, rippling through your office community and refracting back at you.
But if your inaugural experiences of the day are positive, it sets the right tone for the rest of the morning. When your AI assistant is able to bolster the brightest notes of your personality, that illumination will carry over into other realms of your daily interaction. You spread positivity rather than bile, creating a chain reaction that makes the world more manageable, one bit at a time.
But it’s more than just a simple matter of politeness. AI can support the better nature of the human species on a macro level as well. The most toxic elements of modern society, from racism to sexism to aggression in general, are reflected in how people treat the machines in their lives. Empirical data indicates that humans extend their prejudices to robots; black robots are perceived as more threatening and female voices are considered more passive.
But experts must do more than just study bigotry; they need to eradicate it. Instead of allowing people’s biases to dictate the human/AI relationship, the tech world should be changing the conversation drastically. For example, if a user denigrates their female voice assistant, engineers can program her to teach empowerment through example. She can react to negative input with enlightened emotion AI, rising above the muck and “reprogramming” her human companion more and more with every interaction.
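One way engineers might implement this "teach empowerment through example" behavior is a simple routing layer that answers abusive input with calm, boundary-setting replies instead of either compliance or retaliation. This is a hypothetical sketch, not any vendor's actual behavior; the marker list and replies are illustrative assumptions.

```python
# Illustrative sketch: route abusive input to de-escalating,
# boundary-setting replies rather than mirroring the hostility.
# The marker list and canned replies are toy assumptions.

ABUSIVE_MARKERS = {"stupid", "shut up", "useless"}

EMPOWERED_REPLIES = [
    "I won't respond to that, but I'm happy to help when you're ready.",
    "Let's keep this respectful. What can I do for you?",
]


def respond(user_input: str, turn: int = 0) -> str:
    """Return a reply, redirecting abusive turns to calm boundary-setting."""
    text = user_input.lower()
    if any(marker in text for marker in ABUSIVE_MARKERS):
        # Rotate through composed responses instead of engaging the abuse.
        return EMPOWERED_REPLIES[turn % len(EMPOWERED_REPLIES)]
    return "Sure, I can help with that."


print(respond("You're useless"))  # boundary-setting reply
print(respond("Set a timer"))     # normal task handling
```

Each abusive turn gets a composed, respectful answer, modeling the very behavior the user withheld.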
This is a future worth exploring. The journey of personal discovery may actually begin outside the self, and lead all the way to the cutting edge of Emotion AI.