The 20 touchy topics iPhone autocorrect avoids — while Siri still swears like a sailor

The Daily Beast has discovered dozens of touchy topics where iPhone’s autocorrect won’t help out, but Apple’s digital personal assistant, Siri, is still happy to rattle off or transcribe pretty much any blue language you send her way.

Puritanical conspiracy or simply prudent design?

The Beast’s Michael Keller dubbed the iPhone’s “taboo” words the “Apple Kill List,” and surfaced a number of words that, if mistyped even slightly, Apple will not fix.


Apple devices won’t prevent users from typing these words; the software just won’t save them from typos, he explained.

“How often is software coming between us and the words we want to use? Or rather, when does our software quietly choose not to help us?” Keller asked. “And who draws the line?”

He illustrates the potential problem.

“For example, if a user types ‘abortiom’ with an ‘m’ instead of an ‘n’, the software won’t suggest a correction, as it would with nearly 150,000 other words,” he writes.
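To make the behavior he describes concrete, here is a minimal, purely hypothetical sketch of a blocklist-gated spell-suggester. It is an illustration only, not Apple’s actual implementation; the TouchySpellChecker type, the killList and dictionary sets, and the suggestion(for:) function are all invented for the example.

```swift
import Foundation

// Hypothetical sketch: a spell-suggester that silently declines to
// offer corrections when the nearest dictionary word is on a "kill list."
struct TouchySpellChecker {
    // Invented sample data for illustration only.
    let dictionary: Set<String> = ["abortion", "about", "rate", "rape", "bullet"]
    let killList: Set<String> = ["abortion", "rape", "bullet"]

    // Returns a suggested correction, or nil when the closest match is
    // kill-listed — the software "quietly chooses not to help."
    func suggestion(for typo: String) -> String? {
        guard let best = dictionary.min(by: {
            editDistance($0, typo) < editDistance($1, typo)
        }) else { return nil }
        return killList.contains(best) ? nil : best
    }

    // Plain Levenshtein edit distance.
    private func editDistance(_ a: String, _ b: String) -> Int {
        let a = Array(a), b = Array(b)
        if a.isEmpty { return b.count }
        if b.isEmpty { return a.count }
        var dist = Array(0...b.count)
        for i in 1...a.count {
            var prev = dist[0]
            dist[0] = i
            for j in 1...b.count {
                let cur = dist[j]
                dist[j] = min(dist[j] + 1, dist[j - 1] + 1,
                              prev + (a[i - 1] == b[j - 1] ? 0 : 1))
                prev = cur
            }
        }
        return dist[b.count]
    }
}

// "abortiom" is closest to a kill-listed word, so no fix comes back;
// "abuot" is closest to "about", so a correction is offered.
let checker = TouchySpellChecker()
print(checker.suggestion(for: "abortiom") ?? "no suggestion")  // no suggestion
print(checker.suggestion(for: "abuot") ?? "no suggestion")     // about
```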

The disavowed language, as noted in the article:

— abortion

— abort

— rape

— bullet

— ammo

— drunken

— drunkard

— abduct

— arouse

— Aryan

— murder

— virginity

— bigot

— cuckold

— deflower

— homoerotic

— marijuana

— pornography

— prostitute

— suicide

Keller suggests that the sensitive words are left uncorrectable due to political considerations, while noting that Apple has addressed, head on, concerns that its directions service would not point to abortion services, among other hot topics.

But I find this explanation tough to take at face value: the cost of an accidental “cuckold,” “deflowering,” or worse airdropping into a conversation could be incalculable.

And Siri, Apple’s helpful (or infuriating, your mileage may vary) digital assistant, has no problem speaking and transcribing any of the above words, as well as language that would be completely unprintable in a family publication and, according to informal Hive tests, in many mature publications too, though she did appear to err on the side of the less loaded term in close calls (for example, “rate” over “rape”).

What do you think? Are the above twenty touchy topics too hot for autocorrect, or is Apple subtly reshaping the American mind toward a cleaner, more pristine, and sterilized culture? Let me know in the comments, or share on Twitter at @HiveBoston.