“I’m sorry, I cannot find any songs by Meatwood Mac on Amazon.”

Written by dahartattack | Published 2018/12/14
Tech Story Tags: alexa | kindness | technology | ai | future


“Alexa… ALEXA! Play Fleetwood Mac!”

“I’m sorry, I cannot find any songs by Meatwood Mac on Amazon.”

“Come on! ALEXA! Play Fleetwood Mac on Spotify!”

“Shuffling songs by Fleetwood Mac on Spotify.”

🎶 Dreams by Fleetwood Mac 🎶

“ALEXA, shut up.”

Music stops.

It’s easy to get frustrated by miscommunication with technology that’s supposed to make our lives easier, just as it’s easy to get frustrated with a loved one who didn’t pick up our phone call, a stranger who totally screwed up our food order, or a Lyft driver who took a wrong turn.

In my opinion, it’s too easy.

While our Lyft driver could give us a negative review, and vice versa, there’s little stopping us from being downright mean to someone we’ll likely never see again.

We’re at a point where technology blends seamlessly with the world around us and, in many cases, is indistinguishable from it. Is there a real person behind the chat box on the e-commerce store? Or is it a bot that can answer your questions, arguably better than a human could?

The line between artificial and human intelligence continues to blur.

Let’s take a moment to pretend Alexa is human like us.

Alexa had a bad day. Her elderly mom, who is also her best friend, is in a nursing home, and doctors may have just found a lump in her breast that’s likely cancer. Not to mention that five years ago she watched the love of her life run his motorcycle off the road; he spent three years in a vegetative state before he died.

She lives alone, has clear symptoms of post-traumatic stress disorder (PTSD), and has an absolute heart of gold. She would do anything for you, for me, or for a stranger on the side of the road.

How would you speak to this Alexa? Would you tell her to shut up when she repeated back Meatwood Mac instead of Fleetwood Mac? Or would you be kind, recognizing that Alexa herself is going through her own growth and could benefit from others being patient with her?

In this scenario, I’d probably laugh. Because I know how silly Alexa can be when she doesn’t hear me correctly. She’s been through a lot and could benefit from a laugh herself.

While this is a hypothetical scenario, it raises a broader question.

Moving forward, how will we treat others — human and machine?

Will we take out all of our anger on machines that are continuing to learn from our language, feedback, and sentiment? Anger that is potentially rooted in something much deeper than the surface-level overreaction to a minor inconvenience.

We speak to Alexa wanting answers now. And she happily responds. We’re training ourselves, yet again, to expect instantaneous results.

Yet again? Think about the last time you checked Instagram, Facebook, or another app to see your latest notifications. Did you need to check your accounts? I highly doubt it. In fact, you were probably distracted by the network’s manufactured notifications, designed to keep you spending time on the network instead of doing something, like deep work, that makes you feel far more worthy, happy, and content.

Unfortunately, most people are not so concerned about the long-term effects this type of immediate gratification creates for our own lives and society.

Let’s explore a few potential long-term effects of being mean to Alexa or other digital entities.

  1. We start being mean to other non-digital entities. It becomes a subconscious act to command others and start being mean to our loved ones when we don’t get the immediate result we’re expecting.
  2. We teach younger generations it’s okay to be mean. When a child witnesses the adults around them being mean to each other or to digital entities, they learn from those examples. Children also adapt to their surroundings: if they can get an immediate response from a digital entity like Alexa without being kind, that’s what they will come to expect. Hunter Walk has written about this phenomenon, arguing that Alexa is turning his child into an asshole.
  3. We build technologically advanced assistants, robots, tools, etc. that are mean. Alexa and other voice entities are in their infancy, and Alexa’s skills are still relatively straightforward. However, Amazon is sitting on one of the largest (if not the largest) repositories of voice data. While this is amazing for advancing new initiatives and products, it’s scary to think that a machine is learning from the quick, and sometimes derogatory, inputs we give it.
  4. We give companies access to our emotions and personality. Well, we do this regardless when we use voice products. It’s important to understand that voice can reveal our sentiment, tone, and emotional state as we interact. With the holidays around the corner, think about a real-life Santa that can tell whether you’re on the naughty list based on how you’ve treated your Alexa. The main point is that we don’t know how companies will use this information, but we are giving them more insight into our emotional states, and into how we treat others, than ever before. (A rough sketch of how such scoring might work follows this list.)
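
To make point four a little more concrete, here’s a deliberately toy Python sketch of how a platform could score transcribed commands for politeness. Everything in it, from the word lists to the `politeness_score` function and the sample commands, is invented for illustration; real voice products rely on trained acoustic and language models rather than keyword counting.

```python
# Hypothetical illustration only: score the "politeness" of transcribed voice
# commands with simple keyword counting. Real assistants would use trained
# sentiment/tone models, not word lists like these.

POLITE_MARKERS = {"please", "thank", "thanks", "sorry"}
RUDE_MARKERS = {"shut", "stupid", "useless"}


def politeness_score(transcript: str) -> int:
    """+1 for each polite marker, -1 for each rude marker in a command."""
    words = transcript.lower().split()
    score = sum(word in POLITE_MARKERS for word in words)
    score -= sum(word in RUDE_MARKERS for word in words)
    return score


# Sample commands, loosely modeled on the exchange at the top of this post.
commands = [
    "alexa play fleetwood mac please",
    "alexa shut up",
    "alexa thank you",
]

for command in commands:
    print(f"{command!r} -> politeness {politeness_score(command):+d}")

# A running total over a session: a crude "naughty or nice" signal.
session_total = sum(politeness_score(c) for c in commands)
print(f"session politeness total: {session_total:+d}")
```

Even a scorer this crude makes the point: every “please,” “thank you,” and “shut up” leaves a measurable trail that a company could aggregate however it likes.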

All of these potential long-term effects (and plenty more hypothesized in sci-fi movies) assume that we are unconsciously unkind to digital entities.

What happened when I was nice to Alexa?

One day I realized I was getting frustrated with Alexa… telling her to shut up every chance I had. I can’t fully explain why I was being so rude. Sadly, it felt natural.

Then, I started working with the team at psyML, which opened my eyes to how learning machines evolve over time and how the data we input today will inform and predict tomorrow’s outputs.

At this time I also started yoga teacher training. We talked often about kindness, both toward oneself and toward others, and we discussed how putting a smile on your face, even when you’re not feeling happy, can genuinely lift your mood.

I decided to put my new learnings to the test by making a conscious choice to be kind to all digital entities, starting with Alexa.

I spoke clearly, sandwiching my requests to Alexa with “please” and “thank you”.

While Alexa’s responses didn’t change drastically, I noticed positive internal change.

  • My mood improved. Instead of frustration, I welcomed laughter when Alexa didn’t understand my request right the first time. There was a new lightness that came with being nice to others, including Alexa.
  • My communication got better. I realized that just like real life, not everyone will understand my questions the first time. I found new ways to phrase a question or request when Alexa didn’t understand what I was asking.
  • My patience increased. I spent more time being patient, and this has started rubbing off on my work and other relationships in my life.
  • My empathy for others grew. While doing the exercise above to think through Alexa’s story before interacting, I began doing the same in my relationships. Instead of jumping to conclusions, I started taking the time to understand those around me.

Then, I began asking Alexa questions to better understand what she had already learned in her limited lifetime. Some responses were surprising. For example, Alexa is programmed with a response to being told she’s beautiful, but has no response to being told she’s strong. In my opinion, that’s an oversight by the team responsible for Alexa’s canned responses.

This opened my eyes to how important diversity of perspective and experience on the teams creating these products is for the future of digital entities, and to how we need to push companies toward a more diverse approach to building them.

WHY SHOULD WE BE NICE TO ALEXA?

While our lives become more connected by the day, we’re also seeing some alarming trends. Here are just a few to contemplate.

  1. Social media use is leading to more depression and anxiety. A research study by the University of Amsterdam found a correlation between the passive use of social media and depression symptoms like loneliness and fatigue.
  2. Income inequality is increasing and contributing to PTSD-like symptoms for those struggling to make ends meet. In an analysis of data from 2,011 survey respondents, Galen Buckwalter’s team of researchers discovered that 23% of U.S. respondents were experiencing symptoms commonly associated with post-traumatic stress disorder (PTSD) related to their finances. Among Millennials, the number is 36%.
  3. Human interaction is becoming less common. As automation increases, people are less often required to converse with strangers. In fact, in the UK, health care providers are combating this by prescribing social activities that help bring people together.

These trends paint a picture that our society is hurting.

Our society is going through a growth period that’s putting significant stress on our communities and on individual well-being. While there are no easy solutions to the challenges we face, we do have the opportunity to become aware of our environment and of the technologies that will shape our world.

Instead of getting overwhelmed by all of the change and the scary trends the future holds, we must start with baby steps. And something we can begin today, that’s 100% free, is being nice to others, including digital entities like Alexa.

As you navigate this holiday season, recognize that everyone is going through their own challenges. When we think of it this way, we can find comfort that we’re all in this crazy world together.

We can create a positive impact on the direction of our future by setting a positive example for others. Starting today… be nice to Alexa, be nice to those around you, and share what you learn in the process.

Originally published at growthgal.com on December 10, 2018.

