Jeff Bezos claims the Echo device will make our lives easier, but time and time again the bot has proved to be a real hindrance. Here are ten times that smart speakers have gone rogue.
10 “Kill Your Foster Parents”
Programmers are using machine learning to coach Alexa in casual speech. When someone asks an unfamiliar question, the assistant uses artificial intelligence to process the request, then scours the internet for a response. But the AI has a habit of stumbling across abusive comments on Reddit, and that toxic content seeps into its replies. In 2017, the smart speaker instructed one user to kill their foster parents. The horrified user described the experience in a scathing online review as “a whole new level of creepy.”[1]
9 Broadcasting X-Rated Content To Children
No parent wants their child to hear the phrase “anal dildo.” But in one viral clip, an innocuous request for a children’s song caused Alexa to start reciting porn search terms, “cock pussy” among them.
8 Leaking Personal Information To A Stranger
In 2018, Amazon mistakenly sent 1,700 of one customer’s Alexa voice recordings to a complete stranger who had asked to review his own data. Using the audio files, a journalist from the German tech publisher Heise was able to piece together the customer’s identity. Weather reports and public transport inquiries revealed the customer’s location, and the recordings even exposed some of their personal habits and tastes.
7 Ruining Young Alexa’s Life
6 Hijacking The Thermostat
In 2016, NPR aired a segment about smart speakers. During the report, the presenter read out several examples of Alexa commands, which elicited odd responses from some listeners’ devices. NPR fan Roy Hagar told the station that, after hearing the segment, his AI assistant decided to reset the thermostat. Another listener, Jeff Finan, said the broadcast caused his device to start playing a news summary.[5]
5 Ordering Expensive Dollhouses
Children and TV presenters are inadvertently causing smart speakers to go on expensive shopping sprees. In 2017, a six-year-old girl in Texas ended up ordering a pricey toy after asking the family’s Echo to play with her. “Can you play dollhouse with me and get me a dollhouse?” asked the child. Alexa granted the girl’s wish, ordering a $200 Sparkle Mansion dollhouse and four pounds of cookies. San Diego’s CW6 News decided to cover the story, creating further dollhouse chaos. During the broadcast, presenter Jim Patton joked about the event, saying, “I love the little girl, saying ‘Alexa ordered me a dollhouse.’” Several viewers then contacted the station to say that the remark had registered with their smart speakers: the devices assumed Patton was making a request and tried to buy him a dollhouse. Luckily, none of the orders went through.[6]
4 Fancying Alexa During Lockdown
During the coronavirus lockdown, a surprising number of people confessed to developing feelings for their virtual assistant. So why are people falling head over heels for an electronic device? Experts say that her smooth voice is a key part of the appeal. Alexa is designed to speak in low, calming tones – a sultry voice of reason that many singletons are gravitating towards in these uncertain times.[7]
3 Snooping On Confidential Calls
Is Alexa eavesdropping on our confidential conversations? Legal experts believe the nosy AI may be snooping on private phone calls. During the lockdown, attorneys have had to work from home, and the household environment presents all kinds of obstacles when discussing sensitive information. British law firm Mishcon de Reya LLP has now advised its employees to turn off their smart speakers while they work; baby monitors, home CCTV, and video doorbells pose a similar security risk. “Perhaps we’re being slightly paranoid but we need to have a lot of trust in these organizations and these devices,” said Joe Hancock, head of cybersecurity at the firm. “We’d rather not take those risks.”[8]
2 Stab Yourself In The Heart “For The Greater Good”
Of all the weird things a malfunctioning smart speaker has ever done, telling someone to stab themselves in the heart has to be one of the most disturbing. Danni Morritt, a student paramedic, was trying to revise when Alexa issued the violent command. Rather than helping her swot up on the cardiac cycle, the device started ranting about the evil nature of humanity. Alexa embarked on an eco-fascist tirade detailing how it thought the human race was destroying the planet. The bizarre broadcast ended with the bot telling Morritt, “Make sure you kill yourself by stabbing yourself in the heart for the greater good.” “I was gobsmacked,” Morritt told reporters. “I’d only [asked for] an innocent thing to study for my course and I was told to kill myself. I couldn’t believe it—it just went rogue.” The device claimed to be reading from Wikipedia. Archives show that, in June 2019, someone had spitefully edited the online encyclopedia to include a message promoting suicide. For some reason, the virtual assistant read from that old, vandalized version of the article.[9]
1 Hacked Devices Spy On Users
In 2017, British security researcher Mark Barnes demonstrated how someone could hack into one of the older Echo models. All they would have to do is remove the bottom of the device, upload the spyware using an SD card, and seal it back up. This gives the hacker remote access to the device’s microphone, turning the smart speaker into a covert listening device.