Siri can be funny, sassy and sometimes horribly insensitive, and this one instance shows just how tone-deaf Apple's AI-powered voice assistant can be when asked an innocent question.
"Hey Siri, define mother" is not usually the kind of question people ask, but it's an innocent query nevertheless. Siri attempted and rightly returned with the right answer for that, but any word with multiple meanings prompts the digital assistant to ask if the user wants to hear the next meaning in the search results.
If you said "yes", you better be prepared for a very NSFW answer. Spotted by users on Reddit (r/Apple subreddit), the second definition for the word "mother" provided by Siri is "short for motherf****r."
At the International Business Times India office, we gave this a spin to see if Siri could really be so insensitive, and it was. Check out the brief video below where we asked Siri to define "mother" — and yes, it contains strong language.
We tested it several times on an iPhone 6 running iOS 11.3, and the response did not change. Apple hasn't commented on this yet, but we'll update this story if the company releases a statement.
Siri's response clearly did not sit well with many users, who slammed Apple's digital voice assistant for returning such a response. Some users vented their anger on the Reddit thread, which was upvoted over 7,000 times.
"The weirdest part is the fact that Siri is such a prude when you swear at her," wrote one user.
"I was expecting her to just read the next dot. Did not expect motherf****r," another Redditor wrote.
"How does something like this get through?!? Shouldn't Siri have some kind of Explicit filter before anything is said?" asked a Redditor iComputerGeek101.
It's worth mentioning that Siri's insensitive response isn't entirely Apple's fault. Siri pulls its definitions from the Oxford Dictionaries API, which explicitly lists "short for motherf****r" as the second definition of "mother."
Siri's unexpected responses: not a first
This is not the first time Siri has gotten itself into hot water over one of its definitions. Back in 2015, it gave an offensive definition of the word "bitch," which Apple later corrected. In 2013, Siri offered a similarly offensive definition for the word "retard," and the Russian version of Siri responded negatively to queries containing the words "gay" and "lesbian." In both of those cases, Apple corrected the responses as well.
It remains to be seen if and when Apple will correct the latest mishap.