Despite posting what it called a “record year” in 2018, gaming giant Activision Blizzard, the combined company behind some of the best-selling and most recognizable games on the market, laid off around 800 employees last February, to the shock and chagrin of a community already displeased with many of its recent ventures. Though the company claims its business is healthier than ever, trimming nearly 10% of its total staff wasn’t exactly a good look, no matter how it was spun.
Yet, if publishers like these are so well-off, why the sudden need to downsize? Industry pundits have argued these layoffs and corporate restructurings are done in the interest of conserving CEO salaries and ensuring that those at the top of the food chain stay in the black. There may be some truth to that; Activision head Bobby Kotick and Electronic Arts leader Andrew Wilson were recently ranked among the most overpaid CEOs in America. Yet, fewer employees inevitably means less output, and it seems strange that companies as money-hungry as these would ever want to settle for less.
However, amid a flurry of tweets railing against the tenets of late capitalism and a slew of hyperbolically critical YouTube vlogs, it’s hard to know exactly what to make of this. Signs that the structure of triple-A video game development could be weakening can be read as either good or bad news, depending on where you stand. While many seem to be eagerly anticipating the fall of our corporate overlords, we should be careful what we wish for.
As previously mentioned, while mass layoffs may not necessarily indicate market ills, they almost certainly mean that, in the future, we’ll be getting high-quality games far less often, as major companies rush out more barren, incomplete annual releases. If Anthem felt lackluster at launch, we should shudder to think what it might have been like had its support and quality-assurance teams been halved during development.
The root of the problem seems to be the “ship-it-now-and-patch-it-later” mentality that larger publishers have adopted. Before a 50-gigabyte day-one patch could fix a title’s major issues before most players even discovered them, studios and publishers had to comb meticulously through their products before they shipped to ensure they met at least some minimum standard of playability. Today, utterly broken, quasi-early-access triple-A titles are almost the norm, and those without solid, consistent access to the Internet are left with an inferior experience.
Rather than invest time and resources into creating a quality title, publishers now search for the least amount of work they can possibly do before pushing out a so-called “finished” game, an approach often described as shipping a minimum viable product, all the while hoping to cash in on ancillary income from microtransactions.
It’s no coincidence that the most functional aspect of a game on release tends to be the microtransaction storefront. Far be it from any publisher to expect gamers to feel any sense of gratification for their $60 expenditure. Why stop there when you could charge through the nose for more digital currency or in-game extras? Furthermore, why keep 800-odd employees on hand when the same amount of money can be made without their services?
The gaming community often argues over the concept of gaming as an art form: should Mike Bithell’s Thomas Was Alone stand alongside Van Gogh’s Bedroom in Arles? Should Toby Fox’s Undertale be compared to Mondrian’s Composition II in Red, Blue, and Yellow? Perhaps, but the argument is increasingly sullied by soulless games created for no reason other than to turn a profit. We would love to believe that video game development is always a passion-fueled endeavor, but designers have to put bread on the table at the end of the day. The fact remains, however, that games are by and large offering less while charging more than ever.
We’ve seen this before; it may be a tired analogy now, but the video game industry was in a very similar position nearly forty years ago. In 1983, a glut of mediocre, rushed titles left consumers feeling spurned, and, as a result, the market spent two years in major decline before renewed quality assurance and the dawn of the 8-bit era brought it back. Those who are ignorant of the past are doomed to repeat it, of course, and the modern-day deluge of inferior, underwhelming games made strictly to please investors and bolster revenue may soon cause the whole industry to once again go belly-up.
Yet, unlike those days, we now have a rich and vibrant indie and double-A underground that can keep the flame of digital interactive media burning bright for the foreseeable future. Titles like Team Cherry’s Hollow Knight or Edmund McMillen’s The Binding of Isaac: Rebirth have proven that games can be financially successful without nickel-and-diming players, and that gamers are more than willing to pay for skillfully made titles priced fairly.
As it stands, companies like EA could rehash the same FIFA title for years with minimal effort, do away with as much as a quarter of their staff, and rake in more money than ever through brand recognition and borderline-criminal levels of in-game monetization. In 2017, the publisher reported that approximately 39% of its income came from microtransactions, which is telling. This is why gamers should be very concerned for the future. These corporate titans aren’t weakening. They simply don’t need as many employees to pad their bottom line as they once did.
When it comes down to it, the only way to fight back against further major layoffs may be, paradoxically, to stop supporting these releases through in-game purchases. If companies can no longer rely on minimum viable products as vehicles for millions upon millions of microtransaction-earned dollars, they may have to resort to the unthinkable: once again making quality, consumer-respectful content.