Healthcare Employees Sue Amazon Alleging Alexa Devices Violated HIPAA

Four healthcare employees have filed a lawsuit against Amazon alleging that their Amazon Alexa devices captured conversations without their knowledge or permission and may have recorded health data protected by HIPAA.

Amazon Alexa devices listen for words and phrases that wake them and activate them to begin recording. Specifically, the devices listen for the word “Alexa” and then attempt to respond to whatever question follows. The plaintiffs assert, however, that other terms can wake the devices and trigger recording even when users do not intend to activate them.

The lawsuit cites a Northeastern University study which found that the devices wake and begin recording upon hearing phrases such as “I got something,” “I care about,” and “I messed up.” The study also found that the devices wake and record in response to the terms “head coach,” “I’m sorry,” and “pickle.”

The plaintiffs claim that Amazon’s conduct in surreptitiously recording users violates federal and state wiretapping, privacy, and consumer protection laws. Despite Alexa’s built-in listening and recording capabilities, they allege, Amazon did not disclose that it creates, stores, analyzes, and uses recordings of these communications when the plaintiffs and putative class members purchased their Alexa devices.

The four plaintiffs said they stopped using their devices or purchased newer models with a mute feature out of concern that the devices might be storing sensitive data.

In 2019, Amazon announced that all transcripts would be deleted from Alexa servers as soon as users delete their voice recordings. The following year, Amazon said users could opt out of human annotation of transcribed recordings, could set their devices to automatically delete voice recordings after 3 or 18 months, or could choose not to have their recordings stored at all.

The plaintiffs assert that during that time, Amazon reviewers may have listened to audio recordings containing protected health information (PHI). They also said that had Amazon disclosed that the company stored recordings or that its employees listened to them, they likely would not have purchased the devices.

According to Amazon, only a small percentage (1%) of voice recordings are reviewed by its employees, and the annotation process does not link voice recordings to customers’ identifiable information.


Posted by Mark Wilson

Mark Wilson is a news reporter specializing in information technology and cybersecurity. Mark has contributed to leading publications and spoken at international forums, focusing on cybersecurity threats and the importance of data privacy. Mark is a computer science graduate.