Postscript is Cameron Kunzelman's weekly column about endings, apocalypses, deaths, bosses, and all sorts of other finalities.
The games industry and its analysts are always looking for a silver bullet to explain the ebb and flow of its economics. It’s undeniable that costs are going up within traditional game development, and it’s harder than ever to ensure that you have a hit on your hands. Outside of the massive blockbuster franchises, your Assassin’s Creeds and your Call of Duty games, there seems to be a general anxiety that there are no longer as many safe bets in games. Indeed, as Mike Rose has suggested, it’s almost a sure bet that any given game isn’t a success.
Raph Koster, the former lead designer for Ultima Online and a key creative force behind Star Wars Galaxies and EverQuest II among a whole host of other things, wrote up a talk he gave a couple of weeks ago that interrogates these particular problems within the game industry even more explicitly.
In the piece, titled “The Cost of Games,” Koster looks at a data set of 250 games going back to the early 1980s in order to make some arguments about how the economics of the game industry work. While he expected to find correlations in game cost around hardware generations or increased efficiency in development procedure, the key data point that he draws out of his analysis is that megabytes of data cost roughly the same amount to make across different domains, hardware, and genres.
It’s a bizarre finding, so let me make this as clear as possible: Koster is arguing that if I’m making a mobile game and you’re making the next Destiny, we will spend roughly the same dollar amount on individual megabytes of data. There is a flat cost to the production of data itself, agnostic of the specifics of what we’re actually making. And it’s strange.
Koster draws a few implications from this dollars and data equivalency. The first is that the focus on Unity and Unreal as core game engines for the game industry might have locked in a cost-per-megabyte baseline standard, and that they might be preventing developers from creating game data more cheaply. There’s no searching for dollars-to-data efficiency or more elegant solutions because these engines have created a certain baseline for production.
The second is that while cost-per-megabyte has changed over the years, players are getting those megabytes much more cheaply than they ever have before. We know that crunch is the method through which the video game industry subsidizes our games (by socially pressuring and managerially expecting people to work longer compensated and uncompensated hours at development), and Koster gives us a way of figuring that as a data-to-hours equivalency. The cost of data might be locked due to technical constraints, but whatever that data costs the developer is not what it costs the consumer.
In some ways, Koster is just giving us a new way of talking about old problems. Costs of game development are going up, players are not paying any more for those costs, and they are being covered via subsidization in a dozen different realms, be that the extraction of more labor power from devs (via overtime and crunch) or lootboxes that give each game’s monetization period a much longer, fuller tail.
Koster has some doomsaying projections at the end (“We’re talking one terabyte games that cost $250m to develop, by the early 2020s.”), but his fundamental claim is that the only games that escape this equivalency between data and dollars are games that home in on community, user-created content, and player skill. These are games like DOTA, PlayerUnknown’s Battlegrounds, or Minecraft, the kind of experiences that you can sink a huge amount of time and money into, because they will fundamentally generate a longer tail of money for the companies that create them. It’s a bonus if you can create items in a marketplace that they can take some percentage of.
For my own take, this reads a lot like the claim that single player games are dying, which is the broadest kind of analytical point, one that can be pulled apart in a dozen ways, and I don’t think that the widening world of game production, from its indie point-and-click games to its first-person explorers to its AAA shooter experiences, is wholly depicted within the data that Koster is presenting in his charts.
More fundamentally, I think that if Koster is correct then this commercial industry trend signals that it might be worth spending time playing games that purposefully don’t interact with those large, bombastic, spendy parts of the industry. Koster’s data suggests that the games industry might end up looking a lot like the world of fiction writing or music creation in the near future, which means that there are lots of people making interesting things at an economic tier below full employment. This is a mode of creation where production costs can be spread out over a longer amount of time and at a more predictable level of dollars-to-megabytes.
Spending money on a game from a part-time creator, or a full-time independent game dev who is getting by by the skin of their teeth, has a much greater effect on those devs than it does at the massive, scaled-up game development companies that Koster is mostly focused on. Within this paradigm, purchasing choices cross over with ethical ones. What do you want your dollars to support, and how much does your $60 (or $40, or $20) do for a smaller developer versus Activision Blizzard?
What’s undeniable is that the shape of the games industry is changing, and Koster’s data analysis gives us an angle for thinking about how consumer purchasing power operates there. My suggestion is to spend more time on itch.io.
You can follow Cameron on Twitter.