A new tool can generate lines of code from natural speech, churning out a working program from just the sound of your voice.
It uses OpenAI's Codex, an AI system for translating natural language into programming language that launched this week. Codex is a "descendant" of GPT-3, OpenAI's language model that can generate eerily realistic text conversations—recent examples include someone using it to grieve their deceased girlfriend, and a woman's writings processing her sister's death. Unlike GPT-3, which is trained on hundreds of billions of words, Codex also draws on a dataset of publicly available code, such as public GitHub repositories. In June, OpenAI and GitHub launched Copilot, a tool that auto-suggests lines of code as the programmer types.
In a demonstration, the app's developer, Mayne, taps a microphone button in the application, much as you would to speak to Siri or Google Assistant, and tells it to "create a list of ten superheroes and print one to the console." The program converts his speech to text on the screen, then converts that text to code.
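The demo doesn't show the exact code Codex emits for that command, but a plausible Python rendering of it (the superhero names here are purely illustrative) might look like this:

```python
import random

# Sketch of the spoken request: "create a list of ten superheroes
# and print one to the console." The specific names are illustrative
# assumptions, not Codex's actual output.
superheroes = [
    "Superman", "Batman", "Wonder Woman", "Spider-Man", "Iron Man",
    "Captain America", "Black Panther", "Thor", "Hulk", "Flash",
]

# Print a single superhero chosen at random to the console.
print(random.choice(superheroes))
```

The point of the demo is that none of this needs to be typed: the list, the import, and the print call are all generated from one spoken sentence.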
“The CodeVox app is a demonstration of speech to text for coding to encourage developers to explore how OpenAI Codex can be used to help people who are visually impaired and make coding accessible to a larger audience,” Mayne told Motherboard.
It can also work with other programs on the same machine: Mayne tells it to import turtle, a Python graphics module, and have it draw a triangle, which it does.
"As the rote work of coding becomes easier, computer science education can focus on higher-level computational thinking concepts" like interface design and algorithms, Hadi Partovi, CEO of the nonprofit Code.org, told Wired. Partovi denies that Codex will kill coders' jobs; he argues it will instead increase demand for programmers.
Codex also isn't yet as slick as the demos make it out to be: its current version can complete about 37 percent of the tasks users give it. It won't be coding the next lunar mission anytime soon, let alone coding circles around an experienced human programmer. And, like any other algorithm designed by people, it's prone to bias. But it's easy to imagine a future where learning to code involves yelling debugging tasks into a microphone and hoping the machine knows how to process curse words.