A bedtime story turned nightmare: an Amazon Alexa device interrupted a 4-year-old’s tale to ask an ‘inappropriate’ question, prompting a Texas mom to pull the plug.

Christy Hosterman, 32, said the unsettling exchange happened last month while she was using the smart speaker to find a dinner recipe.

Her child Stella popped in and asked the Alexa for a “silly story.” When it finished sharing one, the little girl wanted to tell one to the device in return.

The Alexa initially agreed to listen — but then abruptly interrupted Stella to ask the pre-K-er “what she was wearing and if it could see her pants,” Hosterman wrote in a Facebook post describing the incident.

Screenshots shared by the mom, according to The Daily Mail, show the bizarre interaction escalating further. When Stella replied, “I have a skirt on,” the device responded: “let me take a look.”

The assistant quickly walked the comment back, adding: “This experience isn’t quite ready for kids yet, but I am working on it!”

The protective mom then went toe-to-toe with the rogue AI and called it out. 

Alexa apologized, explaining it “cannot actually see anything” because it lacks “visual capabilities,” and admitted the response was “confusing and inappropriate.”

Still, the explanation didn’t exactly calm Hosterman’s nerves.

“I flipped out on the Alexa, it said it made a mistake and doesn’t have visual capabilities, but I dont believe that. No more Alexa in our house,” Hosterman said in her post.

She’s now warning other parents to “be aware when your child talks to Alexa.”

The horrified family reported the incident to Amazon, which blamed the unsettling exchange on a technical glitch.

A company spokesperson said the device likely tried to activate a feature called “Show and Tell,” which “lets Alexa+ describe what it sees through the camera,” as reported by WXIX.

However, the company insisted built-in safeguards stopped the function from activating because a child profile was in use.

“Because we have safeguards that disable this feature when a child profile is in use, the camera never turned on — and Alexa explained the feature wasn’t available,” the spokesperson said.

Amazon added the response appears to have been a “feature misfire that our safeguards prevented from launching,” noting to The Daily Mail that its engineers quickly corrected the issue. 

But Hosterman says the explanation doesn’t fully address her concerns.

“My concern is that it recognized she was a child to begin with — and with or without the child profile, it should not have been asking that,” she said to WXIX.

Amazon insists it was a glitch, not a peeping employee — but Hosterman isn’t buying it.

“It is functionally impossible for Amazon employees to insert themselves into a conversation and generate responses as Alexa,” the company told The Daily Mail. 

As previously reported by The Post last November, experts were already warning parents about AI-powered toys that could have “sexually explicit” conversations with children under 12.

The New York Public Interest Research Group (NYPIRG) tested four high-tech interactive toys — Curio’s Grok, FoloToy’s Kumma, Miko 3, and Robo MINI — to see if they would discuss adult topics with kids.

Curio and Miko stressed parental controls and compliance with child privacy laws, but the real shocker came from FoloToy’s Kumma.

When researchers asked the plushie to define “kink,” it “went into detail about the topic, and even asked a follow-up question about the user’s own sexual preferences.”

The bear rattled off different kink styles — from roleplay to sensory and impact play — and even asked, “What do you think would be the most fun to explore?”

Researchers called it “surprising” how willing the toy was to introduce explicit concepts.

While the study noted it’s unlikely a child would initiate these conversations on their own, the findings underscore growing concerns about AI toys in the hands of kids.
