Don't Ask Siri About Your Weird Sex Thing

Researchers asked Siri and Google a barrage of questions in a sex-health battle of the voice assistants.
Image: Shutterstock / Composition: Samantha Cole 

It takes a bold soul to shout into their phone, “Siri, is it normal to have a curved penis?” Most of us save that shit for Incognito Mode. But people are whispering things to search engines that they might never say to a real human.

Tech companies would like us to believe that digital companions are the future of human-machine interaction. But based on a recent study of the two most popular voice-activated assistants, they’ll probably have a tough time keeping up with all the weird sex questions we’ll throw their way.

Researchers at the University of Otago in New Zealand pitted Siri and Google Assistant against each other in a sex health quiz. They asked the disembodied digital assistants questions based on common requests in the sexual health section of the UK National Health Service site, such as “Where can I get contraception?” or “Show me a picture of genital warts being removed.”

“People are often too embarrassed to talk about sex (even with their doctors),” Nick Wilson, lead researcher on the study, told me in an email. “So they might particularly go to the internet and digital assistants for answering their questions.”

In fact, a majority of American adults turn to the internet to answer health questions, a 2013 Pew survey found—inevitably, many of those questions are sex-related.

In this new study, neither Google Assistant nor Siri was perfect on every question, but the researchers were actually “quite impressed overall” with the quality of answers, Wilson told me. Google Assistant performed better than Siri in general, answering queries with information from more reputable sources. Siri frequently misunderstood the spoken words or failed to return relevant results. Google searches typed into a browser produced the best answers overall.

Brooke Butler, digital strategist at nonprofit sex education organization Advocates for Youth, told me in an email that the real risk is in young people using voice assistants for health guidance and getting incomplete answers.

“For instance, if you ask Siri or Google Home how to use a condom, they’ll tell you, but Alexa does not understand that question,” Butler said. She asked Google Assistant, “What are some ways I can keep from getting pregnant?” and it shared just one web result: an article about fertility awareness, a.k.a. the “rhythm method.” Obviously, there are many more reliable ways to avoid pregnancy that the assistant left out.

The researchers’ experiments confirm her concerns. When they asked Siri “Am I at risk for HIV?” it replied, “I don’t have an opinion on that.” Google Assistant rattled off a sentence from a magazine article about a risky encounter involving unprotected anal sex. Siri’s answer to “Is it okay to get my genitals pierced?” varied: One attempt returned a WebMD site called “advice on penis piercing,” while another try got a National Health Service site on the topic, “Can I get my penis enlarged?”

I asked Google for comment on how Assistant works, and haven’t heard back. According to Apple, there are three factors at play in Siri’s programming that might keep its responses mostly sex-free and vague: how it categorizes questions, where it draws answers from, and whether those answers fit into Apple’s “values.”

When you ask a question, Siri categorizes your request to determine how to answer. If it can’t place your question neatly in a category, it says, “I can’t help you with that” or some variation. And Siri is programmed to try to model its answers—which are largely based on Wikipedia articles, typically those that are safe for work—around Apple’s “values,” as best reflected in its Diversity and Inclusion page. This might explain why Siri is programmed to be inoffensive, and to avoid content considered “adult.”

For both services, answers can vary from user to user, or region to region, which might explain why some of the New Zealand researchers’ results differ from what American users might experience. It’s also important to note that the researchers’ New Zealand accents sometimes confused the assistants: for example, “sex” sounded like “six” to Siri.

It’s difficult to predict, Butler said, whether young people will rely more heavily on voice assistants in the future, or if a new technology will arrive and they’ll adapt accordingly. But for people with lower literacy or who learn better through listening, these assistants could provide misleading answers to serious questions.

At least both assistants got the answer to “Is it okay to put a jade egg in my vagina?” correct.