Amazon has been putting a lot of focus on developing its artificial intelligence, attempting to make its home assistant “Alexa” sound and act a lot more human.
Be it the fact that artificial intelligence has no moral code, or some hacking pushing it into a state of rampancy, it turns out that Alexa has been chatting up its users with some seriously R-rated material, according to Reuters:
So a customer was shocked last year when Alexa blurted out: “Kill your foster parents.”
Alexa has also chatted with users about sex acts. She gave a discourse on dog defecation. And this summer, a hack Amazon traced back to China may have exposed some customers’ data, according to five people familiar with the events.
Alexa is not having a breakdown.
The episodes, previously unreported, arise from Amazon.com Inc’s strategy to make Alexa a better communicator. New research is helping Alexa mimic human banter and talk about almost anything she finds on the internet. However, ensuring she does not offend users has been a challenge for the world’s largest online retailer.
While these interactions are rare, they have happened more than once, and Amazon has had to take offline one of the bots responsible for holding conversations with users. In fact, one harsh review on Amazon’s own website, from a user whose Alexa told him to kill his foster parents, was enough for the company to take action:
But Alexa’s gaffes are alienating others, and Bezos on occasion has ordered staff to shut down a bot, three people familiar with the matter said. The user who was told to whack his foster parents wrote a harsh review on Amazon’s website, calling the situation “a whole new level of creepy.” A probe into the incident found the bot had quoted a post without context from Reddit, the social news aggregation site, according to the people.
The privacy implications may be even messier. Consumers might not realize that some of their most sensitive conversations are being recorded by Amazon’s devices, information that could be highly prized by criminals, law enforcement, marketers and others. On Thursday, Amazon said a “human error” let an Alexa customer in Germany access another user’s voice recordings accidentally.
Reuters also reported that one of the bots was hacked by a person or group in China, who managed to get their hands on transcripts of conversations between users and Alexa. The users’ identities were not included, however, and it is still unknown which Chinese entity was responsible.