
Remembering Ada Lovelace, the World's First Programmer

Janus Rose
New York, US
October 7, 2011, 4:07pm

This post originally appeared on Motherboard in March of this year.

Like much of the history of computer science, the story in The Social Network is one dominated by males. Shot after shot, a young Mark Zuckerberg and his Harvard mates spit lightning-fast flashes of dialogic wit at other male characters. All the females are either plot devices or groupies, spectators in a desperate power struggle for control over what is perhaps the most significant technological development of our generation. But unlike Zuck's modern-day miracle, computer programming's alpha-nerd was no dude.

This year, on October 7th, we remember Ada Lovelace, the daughter of Lord Byron, who in the mid-19th century wrote the world's first machine algorithm for Charles Babbage's Analytical Engine. That basically means that yes, she was, in fact, the world's first computer programmer.

Programming back then wasn't as simple as knowing a computer language, however. Babbage's machine was a gigantic mechanical computer, programmed entirely with physical punched cards, an idea borrowed from the Jacquard loom. Punched cards would remain a de facto method of feeding programs into computers well into the second half of the 20th century, and engineering even the simplest of mathematical operations on the Analytical Engine was a challenging and mentally taxing task.

Fortunately, Lovelace was a brilliant mathematician. Between 1842 and 1843, Lovelace devised a method of calculating Bernoulli numbers using Babbage's machine. It was an achievement so impressive that Babbage himself called her "The Enchantress of Numbers."
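For the curious: the Bernoulli numbers Lovelace computed can be generated with a classic recurrence, where each B_m is determined from the earlier ones via binomial coefficients. The sketch below is a minimal modern take in Python, not a reproduction of her actual diagram for the Analytical Engine, and it uses today's common sign convention (B_1 = -1/2).

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return the Bernoulli numbers B_0 .. B_n as exact fractions.

    Uses the classic recurrence: for m >= 1,
        sum_{k=0}^{m} C(m+1, k) * B_k = 0,
    which rearranges to solve for B_m from B_0 .. B_{m-1}.
    """
    B = [Fraction(1)]  # B_0 = 1
    for m in range(1, n + 1):
        s = sum(comb(m + 1, k) * B[k] for k in range(m))
        B.append(-s / (m + 1))
    return B

# The first few values: 1, -1/2, 1/6, 0, -1/30, ...
print(bernoulli(8))
```

Exact fractions matter here: Bernoulli numbers grow in complicated rational values, and floating-point arithmetic would quickly drift, which is why the sketch leans on Python's `fractions` module.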

Moral of the story? Never send a Zuckerberg to do a woman’s job.

Listen to BBC’s In Our Time discuss Lovelace
The ‘Women In Science’ Debate That Should Not Be