What Is The Accepted Level Of Instability For New Games?
I imagine this varies from gamer to gamer, but it's a more relevant question than ever:
Considering the sheer number of new games that release today with obvious problems, is there a new "accepted level" of instability that most consumers are willing to tolerate? Obviously, that level must be a lot higher than it used to be for console gamers.
One of the biggest reasons console gamers preferred their platforms over PC - back in the day - was simply because of how user-friendly and pain-free the consoles were in comparison. We understood PC was a technically superior platform but many of us simply didn't care, and we tended to prefer the games we had on consoles (JRPGs, sports, action/adventure, platformers, etc.). I mean, for the most part, if you didn't like RTS, FPS or WRPGs, there wasn't much point to having a high-end gaming rig.
But above all else, it was the "push a button and play" simplicity that we appreciated. It was the fact that the games would pretty much always work, and continue to work. In the age of patches and updates, it now seems that no big game releases as clean as it should be. I was worried about this the instant I understood that consoles were basically going the PC route, and I feel those fears have been realized. Now, we really do have to alter our expectations. We just have to accept that games will invariably release with problems.
So, taking that into account, what is the new accepted level of instability? At what point do we just throw up our hands and say, "this is unplayable; I'm waiting until you fix it"? As much as I loved The Witcher 3: Wild Hunt, I put many, many hours into that game and here we sit, over two and a half months after release, and it's only now approaching what I would consider to be an acceptable stability level. That's an extreme case, of course, but most games have issues. That much we know.
And it's annoying.
8/9/2015 9:51:12 PM Ben Dutka