Now that the next generation of gaming is officially underway, the inevitable discussion about graphics has come to the forefront. It seems like no matter where you go, you'll encounter videos or articles discussing the power discrepancies between the PS4 and Xbox One. The focus is always on the resolution of certain games, but you rarely hear anybody discuss the most important improvement new hardware offers: better performance.
This is something we haven't really seen since the days of the PS2 and the original Xbox, when new hardware was embraced to raise performance standards. Prettier graphics and higher resolutions are always nice, and when given the choice between a better-looking game and a worse-looking one, the choice is immediately obvious, but they're merely cosmetic. For all but non-action or purely story-driven games, the most important aspect is the gameplay, and if there is one thing I think all console games should strive for, it's a steady 60 frames per second.
This really needs clarifying because, amongst gamers, there are a lot of misconceptions and misunderstandings about frame rates in video games and how important the frame rate is to the gaming experience. I encounter a lot of people who don't seem to understand why I like more powerful hardware when I don't care that much about graphics, but it all comes down to the frame rate and the impact it has on gameplay.
For people who aren't really aware of what a frame rate is: essentially, the frame rate is how many images are displayed within a second. A frame can be viewed as a still image, because every viewing experience, whether on TV or in a video game, is just a series of images displayed in quick succession to simulate motion. It is commonly accepted that at least 12 frames per second are needed to simulate motion, although motion at that rate still appears choppy; the more frames displayed per second, the smoother the motion. Most people own monitors or TVs that can display up to 60 frames per second. With film, the standard frame rate is 24 frames per second, which is fine for film, but the difference between film and gaming is that gaming is interactive.
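To put some concrete numbers on this, here is a quick sketch of the arithmetic: a frame rate is just how long each still image stays on screen, so dividing one second by the FPS gives the time between frames. (The function name here is my own, purely for illustration.)

```python
def frame_interval_ms(fps: float) -> float:
    """Milliseconds between successive frames at a given frame rate."""
    return 1000.0 / fps

# The frame rates mentioned above and their per-frame display times:
for fps in (12, 24, 30, 60):
    print(f"{fps:>2} FPS -> a new frame every {frame_interval_ms(fps):.1f} ms")
# 12 FPS -> a new frame every 83.3 ms
# 24 FPS -> a new frame every 41.7 ms
# 30 FPS -> a new frame every 33.3 ms
# 60 FPS -> a new frame every 16.7 ms
```

So the jump from 30 to 60 FPS halves the time each image lingers on screen, which is exactly why the motion looks so much smoother.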
Just like an action movie, a good coat of paint is always nice, but if there is no substance, the game is lacking. We've seen this with lots of indie games, which embrace great game design over ultra-realistic visuals. This I understand fully: regardless of which platform has the more powerful hardware, the games come first, and that will always trump graphical fidelity. I would go back and play one of my all-time favourite games - Crash Bandicoot 3 - over a lot of stuff that's been released in recent years, despite the game's dated visuals. However, this doesn't mean that I dislike good hardware. Good hardware is always a plus because it enables developers to flex their muscles, not only when it comes to graphics, but also AI, world design, content and more. Even more significantly, games on better hardware have a higher tendency to achieve higher frame rates, the goldilocks zone being 60 frames per second: a frame rate that is ideal for gaming, yet still economically viable with affordable hardware.
This really came to the forefront for me when Square Enix announced that Tomb Raider: Definitive Edition would run at close to 60 FPS on the PlayStation 4, but closer to 30 FPS on the Xbox One. This baffled me, because they could easily have lowered the resolution on the Xbox One version to hit a solid 60 frames without constant frame rate drops.
A lot of people have misunderstood my stance on the Xbox One, especially when I point out that its hardware seems to be lacking compared to the PS4's and that, as a result, some games have had lower frame rates than their PS4 counterparts. Unlike resolution and graphics, this actually matters, because the frame rate is integral to the gaming experience. The thing is, the issue could be easily resolved by simply lowering the resolution, which in my eyes would make both versions identical when it comes to gameplay.
It is undeniable that a consistently high frame rate is important in games, for a number of reasons. Not only does the action appear smoother, but a higher frame rate also aids faster reaction times, because the information the player needs to make quick decisions is displayed earlier. This is vital in particular genres such as character action games and multiplayer games. In fighting games, 60 frames per second is the norm, because developers understand how important a high frame rate is in a genre where fast reactions are a necessity. It's also a large part of the reason Call of Duty has remained such a popular shooter: graphics aside, the game runs buttery smooth and keeps the action fast at a solid 60 FPS. Regardless of how you lean, a higher frame rate is always better, so long as it is consistent and doesn't constantly drop, which has been an issue with several next gen titles such as Killzone: Shadow Fall, whose uncapped frame rate goes all over the place. This is partly why I became a PC gamer: I always adjust my settings to reach a solid 60 frames per second rather than chasing ultra-realistic graphical fidelity.
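The reaction-time argument can be sketched numerically too. In a deliberately simplified model (ignoring input lag, display lag and everything else that varies by setup), the worst case is an in-game event happening just after a frame was drawn, so it waits a full frame interval before appearing on screen:

```python
def worst_case_display_delay_ms(fps: float) -> float:
    """Simplified worst-case wait before an in-game event is shown:
    an event occurring just after a frame is drawn must wait one
    full frame interval for the next frame."""
    return 1000.0 / fps

# How much sooner new information can reach the player at 60 FPS vs 30 FPS:
extra = worst_case_display_delay_ms(30) - worst_case_display_delay_ms(60)
print(f"60 FPS can show new information up to {extra:.1f} ms sooner than 30 FPS")
# 60 FPS can show new information up to 16.7 ms sooner than 30 FPS
```

Roughly 17 milliseconds may sound small, but in a fighting game or a twitch shooter, where whole moves resolve in a handful of frames, that head start is exactly the responsiveness players feel.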
My advice to developers is to make 60 frames per second the goal for most games. Put resolution aside and focus on ensuring that your game runs buttery smooth, so that reaction time is not hindered, because graphics age faster than gameplay. Games like Devil May Cry and even Ratchet and Clank strove for a solid 60 frames on reasonably dated hardware, and to be honest, the gameplay in those titles feels a lot smoother and more enjoyable than in some recently released titles. Anything that runs consistently above 30 is acceptable, but once the frame rate dips below 30, it becomes unacceptable.
I can completely understand some of the concerns about the power of the next gen hardware already being capped, because it makes people wonder what the point is of upgrading to a supposed "next gen" console. It's ironic, then, that the supposedly anemic Wii U has more 60 frames per second games than the PS4 or Xbox One. Despite this, I think everybody should still focus on the games when deciding which next gen console to buy, because regardless of graphics, both versions of a multiplatform game would feel and play the same, and people play games, they don't just watch them. In my eyes the versions would then be identical in terms of the gameplay experience, which is the most important thing at the end of the day.