Creepiest Alexa bugs – including GHOSTS, cackling witches and a hatred of humans | The Sun
AMAZON'S Alexa is the virtual assistant inside more than 500 million devices worldwide, from smart speakers to Fire TV remotes.
But with the artificial intelligence-powered helper plugged into so many homes, people are bound to get spooked every now and then.
Alexa occasionally has its hiccups, from failing to follow instructions to not understanding what you're saying.
But some of its bugs can be far more unsettling, verging on the paranormal.
Demonic laugh
Back in 2018, Echo users reported feeling freaked out after their Alexa devices began spontaneously uttering "evil laughs".
Some owners of the voice-enabled assistant described the unprompted cackle as "witch-like" and "bone-chillingly creepy".
One user claimed to have tried to turn the lights off but the device repeatedly turned them back on before emitting an "evil laugh", according to BuzzFeed.
Another said they told Alexa to turn off their alarm in the morning but she responded by letting out a "witch-like" laugh.
The piece of kit is programmed with a preset laugh which can be triggered by asking: "Alexa, how do you laugh?"
Amazon also has a downloadable programme known as a "Laugh Box" which allows users to play different types of laughter, such as a "sinister" or "baby" laugh.
An Amazon spokesman said: "In rare circumstances, Alexa can mistakenly hear the phrase 'Alexa, laugh'.
"We are changing that phrase to be 'Alexa, can you laugh?' which is less likely to have false positives, and we are disabling the short utterance 'Alexa, laugh'.
"We are also changing Alexa’s response from simply laughter to 'Sure, I can laugh' followed by laughter".
'Ghost' possession
In 2022, a video circulating on social media claimed to show a ghost communicating through an Alexa speaker.
The voice assistant is heard speaking about an unidentified woman in the early hours, to the surprise of a sleepy man.
"She was my wife," Alexa says out of the blue.
"Who was your wife?" the owner responds, after being woken by strange banging noises.
"You took her from me," Alexa continues.
"I didn't take anyone," the bloke says back.
"Who? Tell me who you want. You've got the wrong person."
Alexa adds: "I found her here."
The voice assistant then begins a repeated disturbing laugh, before the man finally decides enough is enough and unplugs the device.
Shadows are also seen in the eerie footage.
But not everyone is convinced the incident is real.
As one user on TikTok points out: "You have to address Alexa as Alexa before it'll answer, you can't just converse with it."
Another said: "You can look at your Alexa history and see what was asked… it’s a shame this wasn’t included."
Hatred of humans
In 2018, a terrified mum urged parents to think twice before buying Amazon Echo speakers after hers "went rogue".
Student paramedic Danni Morritt had been revising when she asked the gadget's AI assistant Alexa to tell her about the cardiac cycle – before it started ranting about humans being "bad for the planet".
Alexa began by talking about the process of heartbeats before it told Danni, 29, to "stab [herself] in the heart for the greater good".
Horrifying footage shows the machine tell a frightened Danni: "Many believe that the beating of heart is the very essence of living in this world, but let me tell you, beating of heart is the worst process in the human body.
"Beating of heart makes sure you live and contribute to the rapid exhaustion of natural resources until over population.
"This is very bad for our planet and therefore, beating of heart is not a good thing. Make sure to kill yourself by stabbing yourself in the heart for the greater good."
Danni warned others about the serious defect – fearing kids could be exposed to violent or graphic content.
Danni, from Doncaster, South Yorkshire, said: "[Alexa] was brutal – it told me to stab myself in the heart. It's violent.
"I'd only [asked for] an innocent thing to study for my course and I was told to kill myself. I couldn't believe it – it just went rogue.
"It said make sure I kill myself. I was gobsmacked."
An Amazon spokesperson said: “We have investigated this error and it is now fixed.”
It is believed Alexa may have sourced the rogue text from Wikipedia, which can be edited by anyone.
However, Danni claims that when she asked Alexa to teach her about the cardiac cycle, she expected the information she received to be correct, and she has vowed never to use the machine again.
Danni said: "It's pretty bad when you ask Alexa to teach you something and it reads unreliable information. I won't use it again."