I’m going to start this week’s issue of Critical Thinking off on a rather contentious note – gaming as a whole involves far less interaction than it used to. It’s turning distressingly solitary.
It’s a trend I’ve noticed in passing for some time now, although it never occurred to me to organize my thoughts and address it until just recently. See, I was a child of the nineties. Although arcades are something of a distant memory to me, I still fondly recall consoles such as the Nintendo 64. I still hold dear to my heart memories of wasting away the night playing Mario Kart, Super Smash Brothers, Mario Tennis, and Perfect Dark. Although I hate to gush, it was something of a golden age for party games: the most fun you could have on a console usually involved four friends sitting next to you on the couch.
More than anything, that was the real joy of it – kicking back and letting loose with a group of your closest friends.
Fast-forward a few decades, and with a few rare exceptions, video games – and gamers – are becoming more and more solitary. Name ten games released in the last two years with four-person local multiplayer. As an added challenge, only two of those games can be on one of Nintendo’s consoles. See what I mean? Tough, isn’t it?
Don’t get me wrong, gaming is still social enough. Everybody’s willing to buzz on Facebook about their latest achievements; people coordinate with one another through VOIP clients in League of Legends; developers are making an active effort with next-gen consoles to promote ease of sharing and a sense of online community. At the end of the day, though? All that stuff simply isn’t the same as actual human contact, and it would be foolish to try pretending that it is.
In short, we’ve shifted our focus away from human contact towards what is honestly best termed a sad imitation.
In the case of consoles, it can be chalked up to three factors. The first is technical concerns. Like it or not, modern video game consoles simply might not have the power to support four-person local play with certain titles. That sounds absurd, I know (after all, the Nintendo 64, with its 93.75 MHz processor, was capable of doing so), but it’s still a possibility that’s worth considering. Secondly (and this could be considered something of a vicious cycle), online multiplayer is prevalent enough that most devs may not be willing to expend the time or effort to set up local play. Last, but certainly not least, the almighty dollar may well play a part. After all, if you’ve got four people playing on one console, that’s three people who haven’t bought the game.
That last one is a bit cynical, sure…but with how prevalent microtransactions have become, you’ll have to forgive me a spot of cynicism.
The end result of all this is that many of us run the risk of becoming ever more isolated, even while under the illusion that we aren’t. Thankfully, the solution isn’t all that difficult: we just need to be more consciously sociable. And who knows? There may come a day when social media – and online gaming – evolve to the point that they’re a suitable substitute for in-person play.
For the time being, though? “Social” definitely doesn’t mean what it used to.