Meet Alex, the JavaScript Tool to Make Your Code Less Offensive

This script detects gendered, racist, and inconsiderate phrasing and offers neutral alternatives.
September 2, 2015, 7:39pm

A Dutch developer has written a script designed to catch potentially insensitive language.

Alex (the script's fittingly androgynous name) will alert you if any input text contains offensive phrasing and suggest neutral alternatives. For example, given the input "businesswoman," it offers "entrepreneur" or "business executive" as viable options instead.
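Conceptually, the tool works like a lookup: scan each word of the input, flag anything that matches a rule list, and attach suggested alternatives. A minimal sketch of that idea in JavaScript, with a hypothetical word list and function names that are illustrative only, not alex's actual rule set or API:

```javascript
// Illustrative rule table: flagged term -> suggested neutral alternatives.
// These entries are examples, not alex's real data.
const rules = new Map([
  ["businesswoman", ["entrepreneur", "business executive"]],
  ["businessman", ["entrepreneur", "business executive"]],
  ["master", ["primary", "leader"]],
  ["slave", ["replica", "follower"]],
]);

function check(text) {
  const warnings = [];
  // Extract lowercase word tokens so punctuation doesn't hide a match.
  for (const word of text.toLowerCase().match(/[a-z']+/g) || []) {
    if (rules.has(word)) {
      warnings.push({ word, suggestions: rules.get(word) });
    }
  }
  return warnings;
}

console.log(check("A businesswoman configured the master node."));
// Flags "businesswoman" and "master" with their suggested alternatives.
```

The real tool is considerably more sophisticated (it parses natural language rather than matching bare strings, which is how it can notice context like an unnecessarily gendered pronoun), but the flag-and-suggest shape is the same.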

As the script's developer Titus Wormer told me, the idea was actually suggested to him by another coder, who was inspired in turn by this tweet from a programmer of color.


Dunno how I feel about using the terms "Master" and "Slave" for describing various concepts in Software Engineering. #blackdeveloperproblems
— Iheanyi Ekechukwu (@kwuchu) July 8, 2015

While it was designed with programming in mind, the tool will work with any text.

"I'm deeply interested in natural language, and am an avid open source contributor," Wormer said. "For the last two years I've worked on getting computers to 'understand' human language." He's written a script called Franc that detects more languages than Google Translate, and one called Retext that aims to make natural language processable for computers.

Given his past projects, Wormer was an ideal candidate to write this kind of script. "Whether your own or someone else's writing, alex helps you find gender favouring, polarising, race related, religion inconsiderate, or other unequal phrasing," reads the site, where you can try it out in a demo box. However, it's worth noting that while "cripple" sets off the appropriate red flag, "slut" and "whore" do not. Typing "She is a massive slut" only gets a warning because the gendered pronoun "she" may be unnecessarily specific.

Curious about this, I asked Wormer how he built the library of offensive phrasing, and he admitted to not including words that are obviously offensive. Alex is designed to catch subtle errors—the kinds of things someone might type without thinking. "I was actually thinking of another project ('gosh'), which would warn about profane words (such as 'asswipe' and whatnot) which don't really fit within alex," he said.

However, alex does accept suggestions from the community, so you can help expand its bank of potentially offensive language if you'd like.