
Case Study: Amazon Echo

Kim Smiley

Private conversation sent without the user's knowledge

The Amazon Echo recently made headlines after reports that a family’s private conversation was recorded and sent to someone on their contact list without their knowledge. Beyond the creepy factor of being recorded without your knowledge, it is easy to see how a situation like this could quickly get awkward. The specific conversation involved in this incident was about hardwood floors and was pretty harmless, but imagine for a moment that you were talking about one of your contacts (and maybe not in the most flattering terms) and your Echo decided you wanted a recording of that conversation sent to that contact.

Analyze the issue by building a Cause Map

We created a root cause analysis case study and built a Cause Map to show the causes that contributed to the Echo’s inadvertent conversation sharing. A Cause Map is a visual, intuitive method for performing root cause analysis; it is built by starting with an organizational goal that was impacted by the incident and asking “why” questions. In this example, the starting point is the customer service goal, which was impacted because a customer had a private conversation recorded and shared without their knowledge or consent.
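For readers who think in code, here is a minimal sketch of the idea behind a Cause Map: an impacted goal linked backward through a chain of causes by repeatedly asking “why.” The class, method names and example causes below are our own illustration, not part of the Cause Mapping method or the Excel template, and a real Cause Map can branch into multiple causes at each step.

class Cause:
    def __init__(self, description):
        self.description = description
        self.caused_by = []  # answers to "why did this happen?"

    def why(self, description):
        # Attach a cause and return it so the chain can keep being extended.
        cause = Cause(description)
        self.caused_by.append(cause)
        return cause

    def print_chain(self, depth=0):
        prefix = "  " * depth + ("Why? -> " if depth else "")
        print(prefix + self.description)
        for cause in self.caused_by:
            cause.print_chain(depth + 1)

# Start at the impacted organizational goal and keep asking "why".
goal = Cause("Customer service goal impacted: private conversation shared")
goal.why("Echo recorded the conversation and sent it to a contact") \
    .why("Background speech interpreted as a 'send message' request") \
    .why("Echo woke up after hearing a word that sounded like 'Alexa'")

goal.print_chain()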

So why did it happen?

Amazon’s assistant devices, the Echo, Echo Plus and Echo Dot, are all equipped with seven microphones and noise-canceling technology. Once they hear their wake word (the default is “Alexa” unless you have changed it), they start recording what they hear and automatically send it to Amazon’s computers. Amazon uses the recordings to personalize your Alexa experience and to create an acoustic model of your voice.

One of the many things that Alexa can do is record and send a voice message to one of your contacts, but Amazon obviously never intended for an Alexa to go rogue and send out a recorded conversation without being intentionally directed to do so. In a statement released after the incident, Amazon said that they believe they know what happened. The first step in the chain of events that led to the recording being unintentionally shared was that the Echo woke up after detecting a word that sounded like “Alexa”.

After the Echo woke up, words in the background conversation were interpreted as a ‘send message’ request. According to Amazon, the Echo should then have asked “To whom?” out loud. The family has stated that they did not hear Alexa ask any questions and that they were sitting near the device with the volume set to 7 (out of 10) at the time, but it is impossible to verify whether or not the Echo asked questions before sending the recording. Alexa then interpreted part of the conversation as a name in the family’s contact list and should have asked “[Contact name], right?” to confirm. At that point, Alexa again detected a sound that it interpreted as “right” and then sent the recording.
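To make that sequence easier to follow, here is a simplified, hypothetical sketch of that kind of confirm-before-send flow. This is not Amazon’s code; every function name, prompt and data structure below is an assumption used only to show how many interpretation steps had to go wrong before the recording was sent.

# What the device reportedly "heard" in the background conversation,
# in order. The contact name is a placeholder, not the family's contact.
MISHEARD = ["send message", "example contact", "right"]
CONTACTS = {"example contact"}

def interpret(audio):
    # Stand-in for the speech recognizer; it returns whatever the incident
    # report says the Echo interpreted from the background conversation.
    return MISHEARD.pop(0)

def speak(prompt):
    print(f"Echo: {prompt}")

def send_voice_message():
    # Step 1 (already happened): device woke on a word that sounded like "Alexa".
    # Step 2: background speech interpreted as a 'send message' request.
    if interpret("background audio") != "send message":
        return
    speak("To whom?")                      # the family reports not hearing this
    name = interpret("background audio")   # Step 3: speech heard as a contact name
    if name not in CONTACTS:
        return
    speak(f"{name}, right?")               # confirmation prompt
    if interpret("background audio") == "right":  # Step 4: speech heard as "right"
        print(f"Echo: recording sent to {name}")

send_voice_message()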

Multiple errors occurred

In order for this event to have occurred, Alexa had to incorrectly interpret the background conversation at least four different times. To understand how this happened, investigators would need to dig into the details of Alexa’s programming and understand the intricacies of how the voice recognition software works. Amazon has stated that “as unlikely as this string of events is, we are evaluating options to make this case even less likely,” so it can be assumed that they are digging into the technical details of the problem to understand exactly how so many errors in interpreting the conversation occurred.
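To get a rough feel for why this chain is so unlikely, here is some back-of-the-envelope arithmetic. The per-step error rate below is a made-up number chosen only for illustration; Amazon has not published how often Alexa misinterprets speech.

# Purely illustrative arithmetic: the per-step error rate is an assumption,
# not a measured figure for Alexa's speech recognition.
per_step_error = 0.01   # assumed chance of one misinterpretation at a given moment
steps = 4               # wake word, send-message intent, contact name, "right"
chain_probability = per_step_error ** steps
print(f"Chance of all {steps} misinterpretations in a row: {chain_probability:.0e}")  # 1e-08

Even with a generous assumed error rate, the whole chain works out to something on the order of one in a hundred million, which is consistent with Amazon calling it an unlikely string of events.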

It is also worth noting that something similar to this “unlikely string” of events has been reported at least once before: another Echo user said his private conversation was sent to a contact without his knowledge in 2017. Read that story here.

Adding detail to the Cause Map

You can view a high-level Cause Map for this incident here. While this high-level Cause Map is helpful in understanding the basics of the incident, a more detailed Cause Map would be needed if it were being used as a tool to aid in the investigation. This Cause Map was built using our free Excel template, and it can quickly be expanded if more information becomes available.

DOWNLOAD: Amazon Echo Cause Map


Reducing the risk of a similar issue

If you have an Echo and intend to keep it, but are a little concerned about what it is recording and might potentially share without your knowledge, you can take the following cautionary steps:

  1. You can view what conversations have been recorded and delete them on an individual basis, or go to Amazon’s Manage Your Content and Devices page to wipe them all at once.
  2. You can disable the microphone when you are having sensitive conversations by pushing a physical button on your Echo. The button will turn red to indicate that the microphone is disabled.
  3. You can also limit where you place home assistant devices in your home.

And if you really want to creep yourself out, there is an article about how Alexa and Siri can hear commands that are undetectable to humans. Read about that here.

But on the bright side, at least Alexa isn’t randomly laughing at us anymore. Here’s more on that: https://www.usatoday.com/story/tech/2018/03/07/alexas-weird-random-laughter-freaking-people-out/404476002/

Additional resources:

DOWNLOAD: Cause Mapping Root Cause Analysis Template in Excel

SHARPEN your root cause analysis skills and attend a Cause Mapping Root Cause Analysis Workshop
