Why 2002 Was Possibly the Most Important Year in Video Gaming History

Imagery from ‘Grand Theft Auto: Vice City’

Few realized it at the time, but 2002, even by the end of its first month, had already become one of the most significant years in gaming history.

Until then, technical limitations had forced game-makers to create relatively small-scale war sims—since the PlayStation’s memory made daylight difficult to render, 1999’s original Medal of Honor focused not on soldiers and the front lines but on spies, secret agents, and members of the Resistance, the kind of people who worked only at night. 2002’s Allied Assault changed all that. Developed for the more powerful PC, it opened with a spectacular appropriation of Steven Spielberg’s D-Day. Critics loved it. Players loved it. As a result, Electronic Arts insisted the next Medal of Honor also open on a spectacular battle.


And when the same year’s Frontline became the best-selling game of the entire MoH franchise, a transformation, monumental in hindsight but perhaps imperceptible at the time, had been completed: war games, going forward, would either have to go big or go home. It’s of course worth noting that the lead designers on Allied Assault, Vince Zampella and Jason West, were the future inventors of Call of Duty.

I was 11 years old in 2002, and I’d been playing games for a number of years. But, largely because of age, I hadn’t yet “arrived” in gaming culture. I didn’t buy magazines, I didn’t follow new releases, I didn’t even know what I liked. Grand Theft Auto III had changed that slightly—it was the first time, for me, that a video game felt like a truly significant cultural event—but my personal taste was still gestating.

So when The Operative: No One Lives Forever came to the PS2 in April 2002, I paid it no attention. It was, and remains, one of the few video games to genuinely respect its female lead, and had I known it existed (and had I been old enough to even spell “feminism”), I would have wanted it. But such was the naiveté of youth—I was busy playing State of Emergency.

It’d be nice to imagine that doing side missions, smashing up scenery, and killing enemies en masse inside a shopping mall is why games like those in the Saints Row and Dead Rising series don’t excite me today—thanks to State of Emergency, I’d like to think I had my fill of that stuff more than a decade ago. Similarly, I presume Capcom, in 2002, was finally bored. Its beloved Resident Evil: Remake, also launched in April, and Resident Evil Zero marked two final hurrahs for what had by then become the traditions of survival horror.

The flagship zombie series, from here on, would ditch its static camera, limited game saves, and preference for dangerous rather than plentiful enemies—“action horror,” heralded by Resident Evil 4, would soon become the vogue, fostering various dumb, unfrightening video games before finally falling from grace thanks to dismal sequels like Resident Evil 6 and Dead Space 3. That year, 2002, was the curtain call for a certain kind of horror game, one prematurely abandoned—like the slower, subtler war shooter—in favor of bigger spills. Big genres started to surge in new, questionable directions.

These kinds of divergences, from one video game trend to the next—if you look closely—occur every year. But it was 2002, appropriately the year when I became genuinely interested in games, that the foundations were laid for what I today consider my greatest loves and most enduring bugbears.

Hitman 2: Silent Assassin, from October, was intelligent in a way video games had rarely been before. Thanks to its visual design work, a few sparse cutscenes, and the ways it encouraged players to regard levels coolly and methodically, that game could lay claim to something exceptional: a leading character both morally questionable and consistently defined. I long for games to get their hands dirty, to put real villains and real bastards on the screen without levity or irony. Hitman 2 shared that ambition—it was years ahead of its contemporaries.

Imagery from ‘Metal Gear Solid 2: Sons of Liberty’

As was Metal Gear Solid 2: Sons of Liberty, which launched in Europe on March 8. Critical discussions about players’ agency versus game-makers’ intentions wouldn’t calcify, at least not in the popular press, until 2007 and the release of BioShock—there was a game that nakedly addressed the disparity between how players play and what writers write. But MGS 2 considered a similar dilemma: If new protagonist Raiden officiously follows orders from an increasingly preposterous central command, to what extent is he, and by extension the player, really in control? The final few hours of MGS 2, in which formerly trustworthy characters, intelligible plot threads, and plausible physical locations all descend into absurdity, represented an indictment of gaming’s paradoxical, top-down relationship with its supposedly impassive audience—with things turned so utterly on their heads, would we continue to buy everything MGS 2 was selling?

SOCOM: U.S. Navy SEALs, which came out in the US in the summer of 2002 before reaching Europe the next spring, was another forward push and remains, to my mind, the most competent online shooter. It was one of the PlayStation 2’s earliest network-enabled games, a notable technical achievement. It was also tense in a way that console shooters today, generally, are not. Two teams of eight squared off in objective-based games—there were no deathmatches—and once you were killed, that was it: you had to wait for the next round. It was a simple conceit, lifted from Counter-Strike and present today in Rainbow Six Siege, but it made the gunfights, the kills, and the deaths in SOCOM matter, which is more than I can say for the most popular game modes in Call of Duty and Battlefield. Kill, die, respawn, repeat—multiplayer shooters have become so blithe.

Multiplayer traded teamwork for selfishness and isolationism. In that sense, 2002 somehow got away from the gaming industry—when Grand Theft Auto: Vice City launched in October, it brushed aside smarter titles like The Getaway and Mafia, and cemented a model for big, technically impressive, but ultimately shallow sandbox games.

It was primitive, but The Getaway attempted to painstakingly recreate a real-world location in a game. Mafia, though occasionally poorly scripted, was brave enough to sideline conventional “fun” for better drama. Shootouts were sporadic, missions were tough, and if you drove over 40 miles an hour, you’d get a speeding ticket. Mafia was a crueler open world. When player-character Tommy was shot dead at the end, in a scene presumably overlooked by anyone who’d describe Red Dead Redemption as “brave,” it felt right. Vice City was stylish, broad, and very enjoyable, and to its teenage players—myself included—its rehashing of Brian De Palma’s Scarface might have seemed like an exciting story. But The Getaway was experimental, and Mafia was smarter.

Some of the lessons of 2002 have been remembered. Others have been forgotten entirely. That year, war, horror, and open-world games found new life, and each continues today in its respective vein. But their success is qualified: representations of violence and character, and discussions of form, class, and authorial intent, remain largely stagnant. The year 2002 hardened my personal interest in video gaming. But it also fostered aspects of gaming culture that I find disagreeable and ugly. It was a defining year, as much for my relationship to video games as for the industry.

Follow Ed Smith on Twitter.