Until a year or so ago, all the graphics card reviewers were going by the same conclusion: more FPS ⇒ smoother gameplay. There isn't anything wrong with that conclusion if you keep the rest of the parameters the same. But that actually is not the case.
People who have bought AMD's HD7900 series high-end cards have been really pissed to find out that their gameplay is not actually as smooth as those reviews suggest. 60 FPS should mean pretty smooth gameplay; you might not see any difference between 60 FPS and anything higher. The funny thing is that people are actually getting the FPS the reviewers claim – there is no mistake there. But for some reason, gameplay isn't smooth: you see pauses every few seconds. If you run a frame rate counter such as FRAPS or MSI Afterburner, it doesn't register any glitches. Yet the gameplay does NOT feel like what the FPS counter is showing.
Guess what? That's because of "micro-stuttering". Micro-stuttering happens when a certain frame takes abnormally longer to render than the frames around it, and this keeps repeating the entire time. (That last part is important to create the illusion of stutter.) If you plot the frame times on a graph, you would see spikes appearing at roughly regular intervals. The funny thing is, if you average out the FPS, you would not see a drop. That's why you cannot go by FPS alone.
Look at the following example.
Say you are getting 50 FPS in your game. It could be that all of your frames took 20 ms each to render (1000 ms per second ÷ 20 ms per frame = 50 FPS), or it could be that the first 49 frames took 19 ms each (931 ms in total) and the last one took 69 ms. Either way it's 50 frames in 1000 ms ⇒ 50 FPS, but you would be seeing terrible micro-stuttering in the latter case.
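If you want to sanity-check that arithmetic, here is a minimal Python sketch using the exact numbers from the example above; both traces report the same average FPS, but only the frame times reveal the spike:

```python
# Both scenarios sum to exactly 1000 ms of render time for 50 frames.
even  = [20] * 50          # every frame takes 20 ms
spiky = [19] * 49 + [69]   # 49 frames at 19 ms (931 ms) plus one 69 ms spike

for name, frames in (("even", even), ("spiky", spiky)):
    total_ms = sum(frames)
    fps = len(frames) / (total_ms / 1000.0)   # average FPS over the second
    print(f"{name}: {total_ms} ms, {fps:.0f} FPS, worst frame {max(frames)} ms")
```

Both print 50 FPS, but the worst frame is 20 ms in one case and 69 ms in the other – and that 69 ms hitch is exactly what you feel as a stutter.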
So how do you measure this frame latency? You can actually take a look at the previous post HERE.
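For a rough idea of what such a measurement looks like, here is a minimal Python sketch of a frame-time logger; `render_frame` is a made-up placeholder for whatever draws a frame (real tools such as FRAPS hook the graphics API's present call instead):

```python
import time

def render_frame():
    # Hypothetical stand-in for the real rendering work.
    time.sleep(0.02)

frame_times_ms = []
prev = time.perf_counter()            # high-resolution timer
for _ in range(100):
    render_frame()
    now = time.perf_counter()
    frame_times_ms.append((now - prev) * 1000.0)   # latency of this frame
    prev = now

avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
print(f"average FPS: {avg_fps:.1f}")
print(f"worst frame: {max(frame_times_ms):.1f} ms")  # this is what FPS hides
```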
Micro-stuttering is usually non-existent or manageable in a single-GPU setup, but it is much more noticeable in a multi-GPU setup – especially a dual-GPU one. Oddly, 3 or 4 GPUs actually make things a bit better, for reasons that aren't clear.
Computer Base, a German website, just published an article comparing frame latencies between the AMD HD7970 and the NVIDIA GTX680. The results are not good for the red team (i.e. AMD). The AMD card shows terrible micro-stuttering in many popular games such as Far Cry 3, Hitman: Absolution, The Witcher 2 and Alan Wake. The GTX680 shows slight frame time fluctuation in Battlefield 3, but nothing you would really call micro-stuttering.
What is causing this? It looks like the drivers are to blame. AMD has pledged to fix these issues in a future driver version, so hopefully there isn't anything wrong with the GCN architecture itself – that would hurt AMD really badly. Micro-stuttering doesn't happen in every game either, so the game engine has a role to play too. Then again, perhaps there isn't much the developers can do; it could come down to how the actual rendering works. Some kind of caching issue? Some threading issue in the drivers? Who knows. I'm no graphics driver developer. :)
But AMD should up their game. More and more reviewers are doing these frame latency checks, and their verdict will be to stay away from AMD's high-end cards until they fix the drivers. I'm not a fanboy of either company, nor am I using an AMD card. But I don't want what happened in the AMD vs. Intel CPU wars to happen in the graphics card wars. We need competition if we want innovation.