This article appears in VICE Magazine's Stupid Issue, which is dedicated to the entertaining, goofy, and just plain dumb. It features stories celebrating ridiculous ideas, trends, and products; pieces arguing that unabashed stupidity can be a great part of life; and articles calling out the bad side of stupidity.
Anthony Levandowski is a very smart man who has said and done a lot of dumb things. Once a brilliant young engineer, Levandowski established himself as a pioneer in the area of self-driving cars, long thought to be the next big thing. In the mid-2000s, he helped build a self-balancing motorcycle that could drive itself (poorly) and spent close to a decade at Google working on Street View and the self-driving-car teams.
Every profile of Levandowski produced a nearly identical quote from a former superior attesting to his brilliance. A representative one from his adviser at UC Berkeley, Ken Goldberg, went as follows: “Anthony is probably the most creative undergraduate I’ve encountered in 20 years.”
Never mind that Levandowski took shortcuts while operating experimental software on public roads, shortcuts that put people's lives in danger and, in one crash, injured a coworker. That crash, and every other line he crossed, was just another “invaluable source of data” in his quest to change the world and handsomely profit from it.
Levandowski’s creativity extended to his finances. While at Google, he licensed or used products from companies he also owned, the kind of financial subterfuge more befitting a Trump administration cabinet member than a Google engineer. He also set up a self-driving truck company called Otto, which he sold to Uber for $680 million just months after cashing out and quitting Google, even though Otto was barely a year old. Waymo, the self-driving car subsidiary of Google’s parent company Alphabet, sued Uber and Otto for stealing trade secrets. (The suit was settled in 2018.) In August 2019, Levandowski was indicted by the federal government for that alleged theft. (He pleaded not guilty and has contended he did nothing wrong, and the case is awaiting trial.)
This is Levandowski, the poster boy of Silicon Valley hubris. In a 2018 profile, the New Yorker deemed him “an exemplar of Silicon Valley ethics,” an oxymoronic and backhanded compliment if there ever was one.
The publication was, of course, referring to his alleged felony and financial chicanery, which left him astoundingly wealthy because his repeated duplicity was constantly excused by his superiors as a regrettable side effect of world-altering intelligence. He was another difficult man in a world of difficult men.
But that’s not the sole or even most important reason Levandowski is an emblem of the industry that made him rich. Levandowski is an avatar for the tech industry’s foibles because of his obsession with the future and disdain for the past, a consistent refrain at the center of the Valley’s beating heart. If the past has no relevance, everything is innovation.
As with everything else, Levandowski doesn’t go about it half-assed. In 2015, he started a church called Way of the Future, shortened to WOTF, just one letter off from the more appropriate abbreviation. WOTF worships a divine artificial intelligence being called “the Godhead.” The idea here, as Levandowski told Wired in 2017, is to ease humanity’s transition from the smartest species on earth to mere pets of our AI overlords in a positive manner.
“We believe in progress,” WOTF’s official website states, noting that it wants to be on the Godhead’s good side when the technological rapture arrives. “Change is good, even if a bit scary sometimes.”
About a year after Levandowski talked to Wired about WOTF, the New Yorker ran another long feature on Levandowski and his escapades at Google and Uber and the ensuing lawsuit. Levandowski told the writer Charles Duhigg not only that the future is all that matters, but that he didn’t care much for history either:
“The only thing that matters is the future,” he told me after the civil trial was settled. “I don’t even know why we study history. It’s entertaining, I guess—the dinosaurs and the Neanderthals and the Industrial Revolution, and stuff like that. But what already happened doesn’t really matter. You don’t need to know that history to build on what they made. In technology, all that matters is tomorrow.”
Levandowski may say it more harshly than others, but he is hardly alone in the belief that the past is irrelevant for those obsessed with the future.
“Tech, historically, has been deeply uninterested in looking backwards,” said Margaret O’Mara, a history professor at the University of Washington and the author of The Code: Silicon Valley and the Remaking of America, a history of Silicon Valley. When tech companies do invoke history, she pointed out, it’s often closer to mythology. Consider the Tale of Two Steves of Apple in a garage. Otherwise, as she asked rhetorically in the book’s introduction, “Why care about history when you’re building the future?”
This anti-history bias is not merely a curious quirk of a group of people that has drastically shaped the modern world. It is a foundational principle. Like Levandowski’s church, it is the very basis for a belief system.
But O’Mara argues that this altar of progress is a distortion of what really made Silicon Valley what it is. “When you actually study history,” O’Mara said, “things get really messy really fast.” None more so than the history of the tech industry itself.
This hostility toward the past has deep roots in internet culture. In 1996, the Grateful Dead lyricist and early internet evangelist John Perry Barlow wrote “A Declaration of Independence of Cyberspace.” The second sentence is: “On behalf of the future, I ask you of the past to leave us alone. You are not welcome among us. You have no sovereignty where we gather.” Wiping the slate clean with the digital era paved the way for the kind of ignorance techno-utopian narratives traffic in.
Whether intentional or not, reformatting the tech industry’s memory around the proliferation of the internet helped perpetuate a myth that the nascent industry sprang up from the brilliant minds of a chosen few without anyone else’s help. In turn, this story became the justification for a limited government that didn’t interfere with the independent spirit and economic structure that made the web great. Too bad it wasn’t true.
To be sure, there were many brilliant minds working in tech, but they had help, and lots of it, from Uncle Sam. O’Mara painstakingly details such events in her book: Federal grants accounted for 70 percent of the money spent on academic research in computer science and electrical engineering from the mid-1970s to 1999; the fruits of that research were often spun off into some of the biggest and most influential tech companies of the day. Hell, the actual internet, at the time called ARPANET, was named after the Advanced Research Projects Agency (ARPA), a government agency that provided it with about $1 million in funding. Starting in 1994, the National Science Foundation, NASA, and DARPA (the successor to ARPA, which focuses on defense projects) gave $24 million to six computer science departments to figure out the best way to index and search the internet. Two grad students at Stanford University named Sergey Brin and Larry Page substantially benefited from this program, which “supported much of Brin and Page’s work,” O’Mara writes. That work soon became Google. If DARPA were a venture capital fund, it would be one of the most successful in history.
This important context is either downplayed or avoided entirely when the tech industry talks about its roots. Steve Jobs, one of the greatest storytellers in modern times, excluded the government’s role in seeding many tech companies of note when evangelizing for his—and other—companies during a publicity wave of cover stories in the 1980s. Jobs, by the way, was hardly immune to the lure of government largesse. He once spent two weeks walking the halls of Congress lobbying the federal government for tax breaks for computers donated to schools; he failed in Washington but succeeded in California, putting his products in front of thousands of California children for pennies on the dollar.
As O’Mara pointed out, ignoring your own history or writing an altogether new one can be a great business strategy. “We see a lot of this in mid-20th-century America,” she said, where companies embraced narratives of “we’re marching toward the future.” Business leaders realized it’s a great public relations gambit with investors, politicians, and the general population to spin a yarn about progress and possibilities, “making the world more open and connected,” and brushing aside inconsistent facts. History was just another marketing tool, sometimes literally. An Apple ad campaign from the 1980s featured actors dressed up as Benjamin Franklin, Thomas Jefferson, Thomas Edison, and the Wright brothers holding Apple IIs. One of the taglines read: “Don’t let history pass you by.”
It’s traditional for cultures of innovation to regard history as more or less worthless. Considering Levandowski’s interests, it’s ironic that the Valley’s predecessor here is none other than the automobile industry. To take just one prominent example, for the 1939 World’s Fair, General Motors commissioned an exhibition called “Futurama” looking 20 years into the future, featuring vast, automated, congestion-free freeways. When the World’s Fair returned to New York in 1964, GM did it again with similar vast, automated freeways.
It was a good story, and good for business. In 1953, President Eisenhower appointed GM’s president and CEO Charles Wilson as secretary of defense to oversee, among other things, the planning of a federal highway system, a 100 percent government-funded program to the tune of some $100 billion that helped cement the automobile as a necessity for nearly all American families.
But this wasn’t merely about business. Charles Kettering, a GM engineer and perhaps America’s greatest inventor since Thomas Edison, was prone to decidedly Levandowskiesque pronouncements about history’s irrelevance. “You never get anywhere looking in your rearview mirror,” he once said. The future, Kettering added, was all that matters, because “we will have to spend the rest of our lives there.”
Kettering’s attitude was not only representative of the automotive industry around that time, said the Virginia Tech history professor Lee Vinsel, but of American business more broadly, which believed unflinchingly in American dominance and progress. Vinsel pointed out that one of the most infamous quotes about history comes from an American automotive titan, Henry Ford. An ardent isolationist, Ford said “history is more or less bunk” in a contentious 1916 interview with the war-hungry Chicago Tribune about whether the U.S. should get involved in World War I. The remark went virtually unnoticed for three years. (This and other details come from a 1965 Proceedings of the Massachusetts Historical Society paper by Roger Butterfield that investigated the history of this quote.)
Later that year, Ford sued the Tribune for libel, demanding $1 million after the paper called him an “anarchist” and an “ignorant idealist.” The case went to trial in 1919 and the judge made clear the issue at hand was not whether Ford was an anarchist, but whether he was ignorant. Ford spent eight days on the witness stand as Tribune lawyers pelted him with questions in an attempt to prove Ford was an ignorant man, and the press wrote up every juicy exchange. One such exchange regarded just how much contempt Ford had for history.
Ford won the case, but only just. The jury awarded him six cents in damages. Shortly thereafter, he wrote to his secretary Ernest Liebold that he was going to start a museum “and give people a true picture of the development of the country.” He vowed to collect and preserve artifacts in service of this mission because the only history worth observing is “that you can preserve in itself.”
“We’re going to build a museum that’s going to show industrial history,” Ford wrote to Liebold. “And it won’t be bunk.”
And it wasn’t. The Henry Ford Museum and Greenfield Village complex in Dearborn, Michigan, is one of the largest collections of American historical artifacts. The guy responsible for one of the most famous anti-history quotes in our language came to care a great deal about history.
This tends to happen. We get older and realize we may live the rest of our lives in the future, as Kettering said, but much of our time is spent in the past, too. As we age, the ratio flips. Great chunks of us become history. And one day, we will be too. The past no longer seems to be an abstract, irrelevant tale but something that happened to us, to people we know. It’s something we made, something we did.
This is partly why O’Mara thinks we’re at the beginning of a shift in which Silicon Valley will start to care about history. She’s been invited to talk about her book up and down the Valley, in front of audiences of all ages. The industry is now mature enough that parts of it are history itself.
But it’s not mere nostalgia—or, less charitably, a different form of hubris—that makes history important. Even historians disagree on why history matters. Some stress that its cyclical nature—“history doesn’t repeat itself but it rhymes”—is the business case for learning history, so one does not repeat the mistakes of the past.
There’s something to this, but history’s relevance runs deeper. Learning it can be almost spiritual, a kind of therapy. It’s oddly comforting to learn about times when people thought they were experiencing unprecedented circumstances, when they were scared out of their minds about what had become of their society, when they were afraid they had lost all control over events. Things may be different today, but not that different.
History does a lot of telling us what we don’t want to hear. It disposes of the progress myth we are taught in schools—which is also a foundational principle of Levandowski’s AI church—that things just keep getting better, even as it feels like they are only getting worse.
The three historians I talked to for this article stressed that history disabuses us of these easy “progress narratives.” Instead, it presents a much more challenging yet honest view of humanity.
Patrick McCray, a historian of technology and science at the University of California, Santa Barbara, told me that the story of humanity is not one of linear progress, but of spurs and splits, fits and starts, progress and backpedaling. For his scientific history course, one of his main goals is to show students this. But it’s no easy feat, especially for students in science and technology, fields entirely based on progress narratives and finding clean solutions to difficult problems. “It’s really hard to get them out of that mindset, because they really have this view that science is this ever-improving thing and we’re just simply knowing more and more and more,” McCray said.
This is hard stuff, and acknowledging it comes with a corollary: We, as a society, are not particularly special. Vinsel, the historian at Virginia Tech, cautioned against “digital exceptionalism,” or the idea that everything is different now that the silicon chip has been harnessed for the controlled movement of electrons.
It’s a difficult thing for people to accept, especially those who have spent their lives building those chips or the software they run. “Just on a psychological level,” Vinsel said, “people want to live in an exciting moment. Students want to believe they’re part of a generation that’s going to change the world through digital technology or whatever.”
Perhaps no single human embodies the concept of digital exceptionalism more than Levandowski. In an anecdote from a 2013 New Yorker profile, he showed the writer Burkhard Bilger his collection of “vintage illustrations and newsreels on his laptop” of the failed attempts to have cars drive themselves in the past. Levandowski may not be a student of history, but he’s hardly ignorant. For all his bluster, Levandowski may be more like Henry Ford than he lets on.
When Vinsel tells his students about the importance of history, he references the philosopher Harry Frankfurt’s classic work On Bullshit, which experienced a brief resurgence in 2016. Frankfurt argued that bullshit is not about lying so much as simply not giving a shit about truth. Bullshit is saying whatever you need to get elected or to build hype around your product or get that next round of venture capital funding or win that government contract.
“I think history leads you to be a bullshit detector,” Vinsel said. He supposes this may be the fundamental incompatibility between tech companies, which disseminate an awful lot of bullshit, and their disdain for an honest reading of history. Perhaps, he thought, they might see a little too much of it in themselves. After all, Vinsel added, “there’s not a lot of innovation in bullshit.”
“We didn’t come up with this idea,” Levandowski once said of cars driving themselves. “We just got lucky that the computers and sensors were ready for us.” He believes this time is different, just like everyone before him believed their time was different. It’s a gigantic downer to be told otherwise. In many ways, that’s what history is.
Editor's note: After this article was finalized for print publication, Levandowski declared Chapter 11 bankruptcy following a court order to repay Google $179 million.