Children with access to their parents' credit cards sometimes buy things online without permission, accidentally or otherwise. That's usually harmless, often a cutesy mishap that makes for a few laughs and an anecdote years down the line.

But Amazon Alexa ordering items on its own? That's something else.

Amazon Alexa Device Orders A Dollhouse

Well, maybe not entirely on its own, but prompted by someone other than the owner. Earlier this week, a 6-year-old in Texas asked her family's Amazon Echo to spend some playtime with her: "Can you play dollhouse with me and get me a dollhouse?"

True to its intended purpose, Echo complied, ordering an expensive dollhouse along with 4 pounds of sugar cookies. The parents quickly realized what had happened and have since added a confirmation code to prevent similar incidents. They have also donated the toy to a local children's hospital.

TV Report Causes Amazon Alexa Devices To Order A Dollhouse

The anecdote could have ended there, but the juvenile blunder grew into something bigger thanks to a news item on a local morning show. The Texas girl's story earned a bit of airtime on CW6 News in San Diego, and all was fine and dandy until anchor Jim Patton remarked after the segment: "I love the little girl, saying 'Alexa ordered me a dollhouse.'"

According to CW6 News, Echo owners watching the broadcast fell prey to the same mishap: Patton's remark triggered Amazon Echo devices across San Diego, setting off a string of dollhouse orders on Amazon.

Speaking with The Verge, Patton said reports of unintended orders quickly filtered in after his remark aired. It's unclear, however, how many of those orders actually went through.

Those anxious about their smart speaker purchasing items without their permission should adjust the device's settings via the Alexa app. From there, users can turn off voice ordering entirely or require a confirmation code before a purchase is placed. Users can also change the Echo's wake word so that a TV broadcast can't set off the same mishap Patton unknowingly triggered.
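For illustration only, here is a minimal sketch of the kind of gate a confirmation code creates before a voice order goes through. It is written in Python as a hedged, assumption-laden example; the class, settings, and method names are hypothetical and do not describe how Amazon actually implements the feature.

```python
# Hypothetical sketch: how a spoken confirmation code can gate voice purchases.
# None of these names come from Amazon's software; they are illustrative only.

import hmac


class VoiceOrderGuard:
    def __init__(self, voice_ordering_enabled: bool, confirmation_code: str | None):
        self.voice_ordering_enabled = voice_ordering_enabled
        self.confirmation_code = confirmation_code  # e.g. a 4-digit code set in the app

    def authorize(self, spoken_code: str | None) -> bool:
        """Return True only if the order should be placed."""
        if not self.voice_ordering_enabled:
            return False   # voice ordering switched off in settings
        if self.confirmation_code is None:
            return True    # no code required
        if spoken_code is None:
            return False   # a code is required but none was spoken
        # Constant-time comparison of the spoken code against the configured one.
        return hmac.compare_digest(self.confirmation_code, spoken_code)


# Example: with a code configured, a request alone is not enough to order.
guard = VoiceOrderGuard(voice_ordering_enabled=True, confirmation_code="4921")
print(guard.authorize(None))    # False - "get me a dollhouse" by itself is rejected
print(guard.authorize("4921"))  # True  - the order is confirmed with the code
```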

Though it may seem a comedic faux pas for all intents and purposes, the incident also shows that always-on devices still have room for improvement. In the future, these devices could recognize individual voices and be programmed to respond only to registered speakers. Until then, adjusting the settings will have to do.

Alexa is Amazon's proprietary virtual assistant, most commonly found on the company's own smart home devices such as the Echo and the Echo Dot. It rivals Google Home, which offers similar functionality.

Heard of similar tech-related comedies of errors? What do you think about voice-activated smart speakers being unable to tell different voices apart? Feel free to sound off in the comments section below!
