August 3, 2011

Profit Mongering (The New Business as Usual)

Being 25 at the time of this writing, I can safely say I am an old fart when it comes to gaming. As such, I feel entitled to a bit of "back in my day" ranting. Allow me to wax nostalgic for a bit.

Though I grew up with an Atari 2600, handed down through the family, and an original Gameboy, I was still exposed to the finer examples of gaming thanks to friends and neighbors. Some very fond memories involve sitting next to someone in my bedroom or their living room in front of a television, mashing away on a controller. When the Nintendo 64 came out, these good times culminated in pure multiplayer bliss (and if we were lucky, co-op!).

When the Playstation 2 and Gamecube came out, living room gaming looked as if it would only get better. The XBox was released, and love it or hate it, Halo sparked a new fervor in splitscreen gaming that rivaled GoldenEye's. It was the only thing worth playing on the original XBox, but it was pretty great. Around this time, I went to college and lived on campus. Since my roommate and I collectively had every console worth having, our room was the place to be.

Back then, as now, I wasn't as much of a console gamer as a PC gamer. I played the Doom games pretty regularly until my uncle introduced me to Half-Life and Team Fortress Classic. Playing online against other people was a blast... when my dial-up service could maintain a stable connection, which wasn't often. Because of that, I did most of my multiplayer gaming, on PC and console alike, against computer opponents wherever possible. Usually that was good enough, and it even had some benefits. I was able to learn the maps quickly, and had the freedom to experiment with different weapons and tactics. There was no pressure, and no insults flying my way over text chat. If I wanted a quick game of Day of Defeat, I could create a local server, blast away, and log out whenever I wanted.

Apologies for the exposition, but here's my point. Things are very different now, and the change happened within a single console generation. Splitscreen options in a video game are few and far between, and bots in multiplayer are practically non-existent. Any of these features that you do find are mere afterthoughts, never as robust as their online counterparts. The reason for this can be boiled down to a single thing:

Money.

Specifically, the amount of money console manufacturers make by removing these options. Here's an illustration: Billy and his three friends enjoy splitscreen gaming. So Billy goes and buys an XBox 360 and the latest shooter game. They all meet up at Billy's place and start up the game. But to their dismay, they discover that there are no splitscreen options available, only online play and system link. Oh dear, it looks like Billy's friends will have to go out and each buy their own XBox 360 and copy of that shooter game. Microsoft (and Sony and Nintendo) laughs their way to the bank, and many publishers have learned to cash in on this opportunity in the worst possible way.

So what, you may think. That extra profit will lead to better talent being hired, more time to spend polishing a game, and an overall better gaming experience in the final product, right? I wish this were true in every case (or even in most cases), but there are some very disturbing trends that are emerging from this new business strategy. I will try to provide examples for each of these.

Marketing

Marketing is arguably the biggest money sinkhole. Millions of dollars are being spent to promote video games, whether or not the game is worth it. There seems to be a fallacy being cultivated here, feeding off the idea that if a game is sponsored by SpikeTV, and has a huge launch party in New York City the night of release, it must be something special. This marketing is typically geared towards the sequels of established franchises, which makes it seem even more unnecessary, as the demand for a sequel to a great game is usually pretty high already.

Of course, once release day hits, the truth is out. Say what you will about video game review publications (and I have a lot of critical things to say), even they have to admit when a game has problems, is too linear, or fails to meet the hype or overthrow the competition. Or even when a game is broken or just plain sucks.

I should note that most of those examples are shooter games. I don't know what it is about shooters, but a lot of them go through this sort of thing.

Talent

Times are tough, sure. Video games may be one of the few luxury items still pumping billions into the economy during America's recession, but even the game industry has seen its fair share of mass layoffs. Several publishers have taken it upon themselves to completely dissolve or lay off entire development studios, even successful ones. Ensemble Studios, known for the Age of Empires series, was shut down after completing Halo Wars for Microsoft. Its employees were lucky; many were absorbed into a new studio or found jobs elsewhere in the company. But one has to wonder how many of them were included in the mass layoff of over 5,000 people that also took out ACES Studio, the developers of the Flight Sim franchise.

Conglomeration

The major publishers have often been reviled by gamers, as their corporate image seems at odds with the creativity the game industry is known for. Rare just hasn't been the same since Microsoft bought them. (If I get my hands on the people who thought Perfect Dark Zero was a good idea...) While EA has arguably improved a great deal from their "buy and forget" studio acquisition behavior of the late 1990s and early 2000s, it seems other publishers are working to pick up the slack.

Everyone's current favorite evil empire is Activision, with Bobby Kotick playing the role of Emperor Palpatine. It's typical for publishers to release entries in a game franchise ad nauseam, but Activision has made it an art. Since acquiring Neversoft, they've run the Tony Hawk series into the ground with 16 releases, and their 6 Guitar Hero games in quick succession led Activision to shut down the series due to franchise fatigue. RedOctane originally published the Guitar Hero franchise, but it too was acquired by Activision, and its studio was shut down along with the Guitar Hero IP. And, of course, there's Infinity Ward and the Call of Duty franchise, whose success has completely altered the modern first person shooter; I'll get into the negative aspects of that later.

I'd like to pose what I think is an interesting question here. Why is it that gamers in the United States get so bent out of shape when American or European developers release what is essentially the same game over and over, but get even more pissed when Japanese developers try to change one or two things in their franchises? Left 4 Dead 2 was practically the same game as Left 4 Dead with some new weapons, and when that was released, the nerd rage reached epic proportions. But when Nintendo debuted The Legend of Zelda: Wind Waker with its new cel-shaded look, I considered building a bomb shelter just to avoid the fallout.

Release Day Disasters

Release day can be one of the most stressful times for a developer. Everything has to be perfect when the game launches. Yet more and more frequently, games are launching with bugs, glitches, and other problems, often so severe that the game is crippled.

These days, chances are very good (and getting better all the time) that you will buy a broken video game right off the shelf. This inexcusable travesty is partly the result of laziness inspired by the "we'll just patch it later" mentality that internet-connected consoles have succumbed to. Shipping broken and patching later used to be common in PC titles a couple of generations ago, before consoles could connect to the internet, but it was never as prevalent as it is today on consoles.

Shovelware

So now that there's a console in every home, how can a publisher maximize their own profits in a competitive marketplace? Easy: remove the competition by becoming the competition! The Playstation 3 and the XBox 360 are essentially computers, with the Playstation 3 running a variation of Unix (specifically, FreeBSD) and the XBox 360 using a custom OS capable of running Microsoft's DirectX graphics libraries. This has made development a lot easier, but what really opened the floodgates to cross-platform compatibility is middleware like the Unreal 3 Engine.

Unfortunately, development has become so easy that publishers and their developers can pump out games in a surprisingly short amount of time. I can't complain too much, since this also allows modders and indie developers like myself to do the same. But it has flooded the shelves with an overwhelming number of games so terrible that they've come to be collectively known as "shovelware." This has noticeably affected the PC market: games are now often developed for consoles first and then ported back to the PC, even though they're built on PCs in the first place, and even when earlier entries in the franchise were PC exclusives.

Shovelware games are the sitcoms of the game industry. They dumb down their experiences and focus on mass-market appeal, guaranteeing a purchase by any consumer not in the know about what to look for and what to avoid when hunting for a good game. It's sickening.

So far I haven't mentioned much about the Wii. To its credit, it has been helping keep splitscreen gaming alive. But that has more to do with it being underpowered in comparison with the other consoles and its networking architecture also being inferior. Nintendo should have named their console the "Mii 2" (see what I did there?), as they dropped their Official Nintendo Seal of Quality standards assurance and are now host to the biggest steaming pile of shovelware of any gaming platform ever made. Most of it is TV- and movie-based games, which have always sucked, but even games like Call of Duty 4: Modern Warfare and Dead Space have seen neutered downgrades and offshoots jammed into the little white box. It's so bad that Team Meat, the indie developers of the very successful and highly original game Super Meat Boy, weren't able to find a publisher for a Wii version of their game. Every publisher turned them down because Super Meat Boy would be lost amidst all of the shovelware, the majority of which was released by those same publishers!

Gameplay Experience

The culmination of all of these things hits hard where it matters most, in the actual gaming experience. Video games these days look beautiful. I've been playing Crysis 2 on my PS3, and it looks so good that I get horribly distracted by the environments when I should be paying attention to the enemy soldier I just bumped into. If you want eye candy, even a newly-released mediocre game on your two-year-old computer should give you ocular diabeetus. But if my old Atari taught me one thing, it's that graphics don't mean squat if the gameplay sucks.

Unfortunately, since the current generation of consoles can pump up the graphics, the focus lies there. Those discs can store a lot of information, but most of it tends to be in high resolution textures and models. With this increasing demand for what is pretty and shiny, what is engaging and interesting falls by the wayside. This is especially true in the first person shooter genre. (On a side note, Breach's developers recognized and sought to change the problems causing the shooter genre to stagnate, but ultimately missed the mark.) Publishers put the pressure on their developers to deliver, and the difference in quality is substantial. So how do they compensate?

Call of Duty: Modern Warfare 2, Call of Duty: Black Ops, Medal of Honor, Homefront, Killzone 3, and many others all share the same complaint: linearity. With no time, room, or budget left to develop the game mechanics, it seems developers opt to abuse scripted sequences to make sure that the player sees all of the pretty graphics they spent the majority of their labor on. However, that can only last for so long; movies, in comparison, are only a couple of hours in length. Similarly, the average "blockbuster" shooter campaign comes in between five and eight hours (compare that to older games like Half-Life, Quake, and Red Faction, or modern games featuring more open-world style gameplay like Crysis and Stalker).

That might not be quite so terrible if cooperative modes were more available. A lack of multiplayer features can sometimes be forgiven when you can play through a story's campaign with one or more friends. Working together is a rewarding experience in a "you vs. the world" kind of game. Co-op has made it into some games with great effect, most notably Halo 3's story and multiplayer, Call of Duty: Black Ops' Zombies mode and multiplayer, and Call of Duty: Modern Warfare 2's challenge mode. Outside of rare instances, however, co-op is all but dead.

But that's okay, because you're supposed to be playing multiplayer. Besides graphics, competitive multiplayer is the other big thing now. It's even given rise to video game e-sports, where gamers battle it out in televised tournaments for large sums of money and prizes. When the paltry offerings of single-player finally give up the ghost, all that's left to enjoy in the game is its multiplayer offerings. So that's where most gamers are, especially the prepubescent middle-schoolers that haven't yet developed social skills. Anyone who has played an online game in the past five years knows what I'm talking about. These basement dwellers do nothing but play these games, dominating the virtual landscape until only their kind remains, pushing out the gamers that may be hardcore, but have obligations and responsibilities that keep them from playing more than a quick game here or there.

Which kind of leads me back to my original rant. When playing online became a severe test of one's patience, there used to be offline options available. You could play against the computer, or against your friends (or anyone you could tolerate being around). No need to server hop until you find a group of players who at least have no microphone hooked up. But now the casual multiplayer gamer is stuck in a never-ending match of frustration. And what excuse do the developers give for the lack of cooperative modes and splitscreen options? "The hardware can't handle splitscreen. It takes too much processing away from the graphics card." Which is, of course,

B U L L S H I T .

This can be refuted by one simple fact: the existence of games from the current console generation that have these options. I've already mentioned Call of Duty: Black Ops, but what I didn't mention is how it accomplishes two-player splitscreen. I noticed it when I first started playing the game with a friend: Treyarch tones down the graphics quality for splitscreen multiplayer. And before you can say "I bet it's uglier than Left 4 Dead on the XBox 360," let me tell you, the game still looks great. Ambient lighting, detailed shadows and water, bullet casings and bodies everywhere. Halo 3, from my experience, doesn't even need to tone anything down; it remains beautiful with large, open maps. Finally, as I mentioned previously, a lot of games run on the Unreal 3 Engine, which does support splitscreen (though, oddly, the Playstation 3 version of the engine is missing it), as my friend managed to get it working by modding the engine on his computer. An appeal to personal experience, sure, but it's easy enough for anyone else to replicate with their own copy of the Unreal 3 SDK.

Now, because I can, let me list some games from the previous console generation that had these options without any problems.

Game                              Splitscreen Multiplayer   Co-op   Bots
TimeSplitters                                X                 X      X
TimeSplitters 2                              X                 X      X
TimeSplitters: Future Perfect                X                 X      X
Red Faction                                  X
Red Faction 2                                X                        X
Tribes: Aerial Assault                       X                 X      X
GoldenEye: Rogue Agent                       X
Grand Theft Auto: San Andreas                                  X
Half-Life (PS2)                                                X
Killzone                                     X                        X
Quake III: Revolution                        X                        X
Unreal Tournament                            X                        X
Warhammer 40,000: Fire Warrior               X
XIII                                         X                        X

Mind you, that's just from my personal collection, and only my Playstation 2 games. Those are all shooters, with the exception of Grand Theft Auto: San Andreas, which I included on purpose; despite being an open-world game with a huge overworld, it still had a limited form of co-op. Including at least one of these features was standard practice not too long ago. Yeah, the games didn't look as great as they do today, but they still looked pretty damned good (especially Killzone). In the end, who cared? My friends and I were just happy to be able to blast each other around the warm glow of a single television.
