If you talk to any passionate PC gamer—particularly one who identifies with the so-called “PC Master Race”—they’ll tell you that consoles and mobile gaming are the twin evils killing the gaming industry.
It might sound like a snotty and elitist thing to say, but that doesn’t mean it’s not, at least, partially true. The gaming industry is littered with the corpses (and soon-to-be-corpses) of traditional, PC-first game developers who, after making the switch to console- or mobile-first development, completely lost sight of their own culture and, in turn, surrendered their relevance to “core” gamers—that is, the gamers who made them successful in the first place.
The men and women who oversee these companies will tell you that the shift was made because the “pay $60 for a disc-based game” business model is all but dead. The funny thing is that not too many of the gamers I know feel this way; they just want better value for their money.
But let’s get more specific: the real problem here is that we’re expected to pay the same price—upwards of $60—for both Skyrim (a game that you can spend dozens and dozens of hours playing) and the latest annual installment of Call of Duty, the campaigns of which usually top out at around 6 hours or so.
You’re probably telling yourself right now that the longevity of a predominantly multiplayer-focused franchise like CoD justifies its high price tag, but I’d point out that this mode of thinking effectively cuts off your game from a substantial portion of its prospective market: namely, those gamers who prefer to boot up a game and have a solitary adventure. In fact, so common is this tendency to lean on multiplayer to justify the price tag of a AAA shooter that examples of the genre that do go above and beyond to provide lasting value absent multiplayer features (your BioShocks, your Half-Lifes, and your Deus Exes) are exceptionally few and far between.
The solution nobody really asked for, naturally, has been either free-to-play or “freemium” (I hate that word with a passion) games. It’s been a transition largely isolated to the handheld market so far, but that may soon change; developers everywhere are clamoring to bring free-to-play games to our living room consoles, as well.
But, then, that’s a bit disingenuous; consoles have had free-to-play games for a while now, including hits like World of Tanks and H1Z1 on Xbox One and PlayStation 4, respectively.
It’s been an interesting ride; in the early days of smartphones, most mobile-first games borrowed from the console model: pay a not-inconsiderable price just once, and then play for as long as you like. Micro-transactions were optional, rather than mandatory; you didn’t need to pay again to get the “full experience.”
But then something changed, and I’m not sure what was behind it: flagging attention spans or just an increasing hesitation to open our wallets for a game we might not like, or might not play on a regular basis. We started getting games that were free to download and play, but had important features or even entire levels blocked off until you paid a modest fee to unlock them.
If you ask me, I don’t think the freemium business model has any business in the console or PC world. Let’s take a look at a couple examples to figure out why.
BioWare’s Star Wars: The Old Republic began as a strange beast. Players had to buy the game itself, and then also had to contend with the $15/month price tag just for the privilege of signing in. I tried the game for a while, just as I tried Lord of the Rings Online and Star Trek Online, both of which also leaned on passionate subscribers to pay for the development of content expansions and other additions to the game. For a full year of play time, you’d pay upwards of $180, which meant you had to play as fast and as much as you could to make sure you got your money’s worth.
But through all of this, there was Guild Wars, which has never had a subscription fee and asks gamers only for that all-important one-time cost of $60. And it still managed to become one of the most popular and enduring online games of recent years.
But other MMOs, like the still-struggling Elder Scrolls Online, want to have their cake and eat it too; players have to plunk down the one-time $60 price for the game itself, and then are also gently prodded to spend additional real-world currency for in-game content.
I’d say that Guild Wars definitely has the right of it. Gamers everywhere want quality games, and they only want to pay once. Developer ArenaNet understands this, and has tailored their business model accordingly.
Unfortunately, the demographics most heavily targeted by these developers are growing accustomed to getting their kicks for free. People are buying precious little music these days, for example, and even dropping a buck or two on a smartphone app seems like a big ask. We have nobody but ourselves to blame for the rise in “freemium” content.
To some extent, Apple is to blame as well. While they might command the most robust app store on the planet right now, there’s really no great solution for letting customers take apps (and games, of course) for a trial run before plunking down their hard-earned cash. Sure, some developers also release gimped versions of their apps to provide a proof-of-concept for interested parties, but that’s not a perfect solution. I want to be able to sample all that an app has to offer before making my (informed) decision about whether to buy it or not.
For all of these reasons and more, I suppose it was only a matter of time before pay-gates and subscription fees started making their way into console games en masse. But I do think it’s a shame; free-to-play lowers the bar of quality on just about every single gaming platform. It means that different players have different experiences right out of the box, depending on their ability to keep up the payments, and that’s never a good thing.