The Question of Video Game Realism

I remember when video games were little more than pixelated characters bouncing about the screen. That was a simpler time: no one really talked about the power of a particular physics engine, or whether or not a game was "realistic" enough, because it didn't really occur to most of us to ask. It wasn't something we considered a vital element of our gaming experience. Games were fun distractions, simple escapes from reality. They weren't, in most cases, all that realistic, though some were quite complex and in-depth.

And that was perfectly alright.

We've witnessed something of a metamorphosis in the past decade. Realism's the word, and everybody's buzzing about it. A graphics engine that lets you see the sweat beading on your character's forehead; a physics engine that can accurately predict how kernels of popcorn will hit the floor; artificial intelligence that actually reacts in a realistic, human fashion: all these are symptoms of both the march of technology and the acceptance of video games into mainstream culture.

It's not difficult to understand why things have been progressing in such a fashion. After all, it's a far simpler thing to immerse oneself in a title if the environment within feels like a living, breathing world. People want realistic games, don't they? They want an RPG where they can go and build a homestead near where they just killed that dragon; a shooter that makes them feel like they're actually on the battlefield.

The quest for realism (realistic graphics, realistic physics, realistic AI) has become an active effort on the part of developers. But is that a good thing?

Yes and no.

On the one hand, "realism" in the case of an open-world game means that the freedoms granted to the player could be positively staggering. Better AI means more enjoyable, more challenging gameplay. Better graphics means, well…

Eye candy.

Basically, it means that games have the potential to get better and better.

On the other hand, we're not going to experience clear sailing, not by a long shot. For one, the obsession with making games more "lifelike" could result in developers and publishers losing sight of what's important: entertainment and narrative. In the worst-case scenario, the games industry could develop something known as the Hollywood Syndrome: pumping out a bunch of soulless, hyper-realistic games that give us little aside from impressive graphics.

There's also the fact that the push towards graphical realism in gaming will, like it or not, be fraught with obstacles. I'm sure most of you have heard of the uncanny valley. That's going to get worse before it gets better, folks. When it comes to pretty little pixels or blocky 3D models, it's fairly obvious they aren't human. Sure, they aren't always that nice to look at, but they also don't inspire spine-chilling terror.

The problem here is that, as game characters become more and more human, we're forced to focus more and more on the tiny elements that make them inhuman. Something as simple as a verbal tic or the lack of proper eye movement could have you staring at a soulless, dead-faced terror instead of your character's love interest. Erik Kain of Forbes put it best: better graphics don't always mean more realistic graphics.

As for AI, well… we've still got a long way to go, either way. Obscenely long development times don't exactly help matters, but that's a topic for another day.

The push to make games more realistic is an admirable one, but only in moderation. We shouldn't ever forget that, as fun as it can be to completely suspend our disbelief and pretend we're a badass military man or a medieval knight, realism on its own isn't what makes a game fun. At the end of the day, fun is what really matters, not whether or not that bandit's decapitated head flies off in a convincing fashion.

What do you folks think?
