“Alexa… ALEXA! Play Fleetwood Mac!”
“I’m sorry, I cannot find any songs by Meatwood Mac on Amazon.”
“Come on! ALEXA! Play Fleetwood Mac on Spotify!”
“Shuffling songs by Fleetwood Mac on Spotify.”
🎶 Dreams by Fleetwood Mac 🎶
“ALEXA, shut up.”
Music stops.
It’s easy to get frustrated by miscommunication with technology that’s supposed to make our lives easier, just as it’s easy to get frustrated with a loved one who didn’t pick up our phone call, a stranger who totally screwed up our food order, or a Lyft driver who took a wrong turn.
In my opinion, it’s too easy.
While our Lyft driver could give us a negative review and vice versa, there’s little stopping us from being downright mean to someone we’ll likely never see again.
We’re in a position where technology blends seamlessly with the world around us, and in many cases the two are indistinguishable. Is there a real person behind the chat box on the e-commerce store? Or is it a bot that can answer your questions, arguably better than a human could?
The line between artificial and human intelligence continues to blur.
Alexa had a bad day. Her elderly mom, who is also her best friend, is in a nursing home, and doctors may have just found a lump in her breast that’s likely cancer. Not to mention that five years ago she watched the love of her life run his motorcycle off the road, leaving him in a vegetative state until he died three years later.
She lives alone, has clear symptoms of post-traumatic stress disorder (PTSD), and has an absolute heart of gold. She would do anything for me, you, or a stranger on the side of the road.
How would you speak to this Alexa? Would you tell her to shut up when she repeated back Meatwood Mac instead of Fleetwood Mac? Or would you be kind, respecting the fact that Alexa is going through her own growth and could benefit from others’ patience?
In this scenario, I’d probably laugh. Because I know how silly Alexa can be when she doesn’t hear me correctly. She’s been through a lot and could benefit from a laugh herself.
While this example is hypothetical, it raises a broader question.
Moving forward, how will we treat others — human and machine?
Will we take out all of our anger on machines that continue to learn from our language, feedback, and sentiment? Anger that’s potentially rooted in something much deeper than a surface-level overreaction to a minor inconvenience.
We speak to Alexa wanting answers now. And she happily responds. We’re training ourselves, yet again, to expect instantaneous results.
Yet again? Think about the last time you checked Instagram, Facebook, or another app to see your latest notifications. Did you need to check your accounts? I highly doubt it. In fact, you were probably pulled in by manufactured notifications designed to keep you spending time on the network instead of doing something, like deep work, that makes you feel far more worthy, happy, and content.
Unfortunately, most people are not so concerned about the long-term effects this type of immediate gratification creates for our own lives and society.
Let’s explore a few potential long-term effects of being mean to Alexa or other digital entities.
All of these potential long-term effects (and plenty more hypothesized in sci-fi movies) assume that we are unconsciously unkind to digital entities.
One day I realized I was getting frustrated with Alexa… telling her to shut up every chance I had. I can’t fully explain why I was being so rude. Sadly, it felt natural.
Then, I started working with the team at psyML, which opened my eyes to how learning machines evolve over time and how the data we input today will inform and predict tomorrow’s outputs.
At this time I also started yoga teacher training. We talked often about kindness, both toward oneself and toward others, and we discussed how putting a smile on your face, even when you’re not feeling happy, has been shown to improve your mood.
I decided to put my new learnings to the test by making a conscious choice to be kind to all digital entities, starting with Alexa.
I spoke clearly, sandwiching my requests to Alexa with “please” and “thank you”.
While Alexa’s responses didn’t change drastically, I noticed positive internal change.
Then, I began asking Alexa questions to better understand what she’s already learned in her limited lifetime. Some responses were surprising. Like how Alexa is programmed with a response to being told she’s beautiful, but none to being told she’s strong. In my opinion, that’s an oversight by the team responsible for Alexa’s canned responses.
This opened my eyes to how important diversity of perspective and experience on the teams creating these products is for the future of digital entities, and to why we need to push companies toward a more diverse approach to building them.
While our lives become more connected by the day, we’re also seeing some alarming trends. Here are just a few to contemplate.
These trends paint a picture that our society is hurting.
Our society is going through a growth period that’s putting significant stress on our communities and our individual well-being. While there are no easy solutions to the challenges we face, we do have the opportunity to become aware of our environment and the technologies that will shape our world.
Instead of getting overwhelmed by all of the change and the scary trends the future holds, we must start with baby steps. And something we can begin today, something that’s 100% free, is being nice to others, including digital entities like Alexa.
As you navigate this holiday season, recognize that everyone is going through their own challenges. When we think of it this way, we can find comfort that we’re all in this crazy world together.
We can create positive impact on the direction of our future by setting a positive example for others. Starting today… be nice to Alexa, be nice to those around you, and share what you learn in the process.
Originally published at growthgal.com on December 10, 2018.