It had a real botty mouth.
An elderly Scottish woman was left shocked and appalled after Apple’s AI dictation software mistakenly inserted profanity and vulgar sexual references into one of her voicemail messages.
“The text was obviously quite inappropriate,” Louise Littlejohn, 66, told the BBC while recalling the robo-flop.
The Dunfermline resident had received the accidentally naughty voicemail on Wednesday from the Lookers Land Rover garage in Motherwell, which was inviting Littlejohn to an event.
Unfortunately, Apple’s AI-powered voice-to-text transcription service botched the transcription, prompting the resultant iPhone text to refer to the Scotswoman as a “piece of s–t.” It also asked if she’d “been able to have sex.”
The robotronic gaffe was so bad that Littlejohn initially thought it was a scam, but then she recognized the caller’s area code and remembered that she’d bought a car from the garage some time ago.
“The garage is trying to sell cars, and instead of that they are leaving insulting messages without even being aware of it,” the senior citizen said. “It is not their fault at all.”
Some experts have suggested that this botched transcription could’ve been due to the caller’s Scottish accent.
However, far more likely culprits were the background noise and the fact that he was reading from a script, the BBC reported.
“All of those factors contribute to the system doing badly,” declared Peter Bell, a professor of speech technology at the University of Edinburgh, the Daily Mail reported.
BBC techsperts have speculated that the “sex” might’ve been a reference to the “sixth” of March, when the event was taking place — like a game of human-to-robot telephone.
Either way, Littlejohn has seen the humor in the cybernetic slip of the tongue. “Initially I was shocked — astonished — but then I thought that is so funny,” she said.
While the idea of an expletive-spewing AI translator might seem guffaw-worthy, Bell believes the incident highlights major glitches with the tech.
“The bigger question is why it outputs that kind of content,” the lingual expert said. “If you are producing a speech-to-text system that is being used by the public, you would think you would have safeguards for that kind of thing.”
In a similar mix-up last month, Apple outraged MAGA supporters after its voice-to-text software mistakenly transcribed the word “racist” as “Trump.”
Company reps said that the feature will, at times, briefly display words that have phonetic overlap — in this case a hard “R” — before self-correcting.