Back in my day (I’m currently 27, but that’s like 256 in video game years) playing a video game with a story meant a lot of reading. Like, a novel’s worth.
When I was little, I would read books, but I would also play Chrono Trigger, and those were very similar actions, as far as I was concerned. Chrono Trigger required a little more active thought, but they were of roughly equal literary value (I still feel that way, though the writing in some of those 90’s-era RPGs was a bit hokey).
But in the last decade, playing a game with a storyline has become a lot more like watching a movie. There are cutscenes that James Cameron would be proud of, skillfully written plot arcs, and voice work by actual real people. At one point, voice acting meant having something to listen to while you read the dialog box during major events. Game development companies lacked the means or the desire to fully voice a game, so you’d get a few lines of highly compressed audio wherever the developers felt like it.
Today, voice acting in video games is so prevalent that what was once known as the “text box” is now known as “subtitles.” And their default setting when you start the game is “off.”
This is a big reason why video games have flooded the mainstream media in recent years. In the same way that movies are more prevalent in the mainstream than books (though most movies are based on books), new games are more mainstream than old games (though most new games are based on old games).
I don’t particularly wish to make a value judgement on whether that’s a good or bad thing in terms of the literary value of modern video games, but it does present a new challenge for those who create narratives in the digital age:
Voice acting is a thing. In the same way bad acting can throw off the impact of a good movie, bad voice acting can ruin a well-written story. The delivery is every bit as important as the composition itself.