Two Massachusetts residents say Amazon's Alexa is "recording every conversation she has with users" without the consent required under the state law on recording conversations, so they've sued.
The suits, filed last week in Suffolk Superior Court - Amazon has an office here that works on Alexa - are a bit unusual: they are not class-action cases, each plaintiff seeks less than $75,000 in damages, and they allege only a violation of Massachusetts law, not broader federal constitutional issues, possibly in an attempt to keep the cases out of federal court.
Law360, which broke the story, reports that among their lawyers are firms from Chicago and Los Angeles that are involved in similar Alexa litigation in Los Angeles and Seattle.
The complaint by Wilfried Braunack of Milford, who filed one of the suits and who has been chatting with - and allegedly being recorded by - his Alexa since March 21, 2019, begins by stating what he and his lawyers claim Alexa is doing behind the scenes:
While encouraging people to speak with Alexa, Amazon is recording every conversation she has with users. What's more, Amazon also records conversations when no one is speaking with Alexa. Amazon makes these so-called "false wake" recordings when a user says a word that sounds like "Alexa" or another wake word - for example, Alexa will activate when a person says the word "election." Amazon designed Alexa devices to record and store the private conversations it captures via "false wakes" just as they do conversations with Alexa, even though in "false wakes" users do not intend to activate Alexa. After all, Alexa is one of the ways Amazon collects raw data on consumers.
Given their value to Amazon, the recordings might never be deleted unless the owner explicitly asks for them to be deleted - and sometimes not even then. So, for each Alexa device in a household, Amazon may have thousands, if not tens of thousands, of permanent recordings in its database of not only the device owner's voice but also the voices of their family members and anyone else who has ever spoken in a device's presence. Amazon has thus built a massive database of billions of voice recordings containing the private details of millions of Americans.
Amazon in turn discloses the Alexa recordings to some unknown number of Amazon employees and contractors around the world, who use Alexa recordings to improve and develop new technologies for Amazon. Indeed, Amazon now has the technology to listen to people's conversations and make targeted advertisements based on what is said, or to hear when a person is sick so it can suggest purchasing cough drops.
The complaint then explains why this is allegedly illegal in Massachusetts, under the state law on recording conversations, part of a wiretap statute enacted in 1968:
Massachusetts law makes it illegal to record a person who has not "given prior authority" to be recorded. Amazon purports to obtain consent to record individuals who set up an Alexa-enabled device through its terms and conditions. But Plaintiff Wilfried Braunack did not provide his consent to be recorded.
Amazon does not obtain actual consent to record users' voices. Amazon does not tell Alexa users it will keep an audio recording of everything they ask, either when an Alexa Device is first set up or at any time thereafter. Amazon does not tell users that their voice will be stored, analyzed, and exploited for Amazon's benefit. Amazon could disclose these things and ask users to explicitly agree to them, but it does not. ...
Amazon instead has buried in the Alexa terms and conditions an oblique reference to the fact that it is permanently recording what its users say. Users supposedly "agree" to these terms and conditions, but Amazon never actually obtains an affirmative confirmation that they agree, or even requires users to actually read them. Amazon does not present the agreement as a popup; does not require users to click a link to access it; and certainly does not require users to scroll through the document, acknowledge, and accept the language that Amazon contends provides actual notice. Even if Amazon had actually presented the terms upon which it now seeks to rely, that would not be sufficient because the description of how Alexa operates is deliberately vague and obfuscates the reality that Amazon is recording everything they say, storing these recordings, and monetizing the information.
The complaint involves only a claim for violation of state law, not federal law.