As we have discussed, micro stutter can affect gaming performance. It does not affect everyone equally, though, because we all have varying levels of perception.
Our perception of frames per second varies from one person to the next. Personally, I would struggle to visually notice a difference at anything above 50 FPS; your threshold may be higher or lower. Each of our perceptions is purely personal and cannot be reduced to a 'one size fits all' measurement. Yet ask on any enthusiast forum and people will claim they can tell which card is faster simply by playing the game it is being tested on, even when frames are being produced faster than the human eye can keep up with. A GPU capable of rendering frames at twice your perceived threshold may therefore seem like overkill, but it is not: the more frames that are rendered consistently, the more fluid the game will feel. The key word is consistently. So a high frame rate still holds its worth, and we are not for one minute saying you should simply enable vsync and be done with it, not only because vsync has its own problems (an article for another day) but because, as we will see, vsync is not the solution.
So how do we measure micro stutter? There are numerous ways; the first is simply the 'feel' of the game. Clearly, though, 'feel' cannot be conveyed to you, the reader of a review. What we need is some sort of graphical representation.
This leaves us with two methods. The most accurate would be a dedicated video capture card in a totally separate unit, backed by multiple high-capacity SSDs, capturing the gaming sequence with minimal interruption from either software or hardware so that every frame produced can be examined. The captured data can then be analysed and presented as a set of graphs. Alternatively, we could use FRAPS in combination with FRAFS, a simple software tool that takes the frame times recorded by FRAPS and turns them into a simple line graph.
For the purposes of simplicity we will be using the FRAPS method. We readily concede that this is not the definitive way to analyse frame times; the capture card method is preferred. It is, however, a method anybody can use to replicate our results, with no dedicated hardware required. With FRAPS, frame times are recorded at the software level rather than the hardware level, so some issues may be amplified and others missed entirely. Even so, we believe FRAPS is accurate enough to reveal any major problems, certainly the ones that will be noticed by the end user of the GPU, which is what should concern you, the potential buyer, the most.
Frame Time Analysis
Using FRAPS to measure frame times is easy enough. Simply set FRAPS to record 'Frametimes' (tick box) and drag and drop the resulting CSV file into FRAFS Benchmark Viewer. In Blue Peter fashion, here are some we made earlier:
The above graph was taken during a short stint playing Crysis 3 at 1920x1080 with 4xMSAA on 'high' settings. In an ideal world, what we want to see is a nice slim 'fuzzy' line stretching from one end of the chart to the other. What we do not want to see are 'spikes'. These spikes are delays in frame rendering, and stuttering will occur where they rise above a given threshold. This brings us back to what you can and cannot perceive. I doubt anyone could detect anything below 30ms; go above this and the gaming experience may be affected, depending on your own perception. Personally, I cannot notice anything below 50ms, so when interpreting results I can only use this figure as my yardstick for what is and is not noticeable; bear in mind that my own perceptions may not match those of others. Only you can decide what is and is not perceptible.
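To make the threshold idea concrete, here is a minimal sketch of how you might flag slow frames in a frame-time trace. The function name and the sample data are our own illustration; the 30ms and 50ms thresholds are the perception figures discussed above.

```python
# Sketch: flag frame-time spikes above a perception threshold.
# find_spikes and the sample trace are hypothetical; the thresholds
# (30 ms and 50 ms) are the figures discussed in the text.

def find_spikes(frame_times_ms, threshold_ms=30.0):
    """Return the indices of frames whose render time exceeds the threshold."""
    return [i for i, t in enumerate(frame_times_ms) if t > threshold_ms]

# A made-up trace: mostly ~16.7 ms (about 60 FPS) with two spikes.
trace = [16.7, 16.5, 17.1, 45.0, 16.8, 16.6, 62.3, 16.9]

print(find_spikes(trace))        # frames over 30 ms -> [3, 6]
print(find_spikes(trace, 50.0))  # frames over 50 ms -> [6]
```

With a 30ms threshold both spikes are flagged; raise the threshold to 50ms and only the worst one remains, which is exactly why your personal threshold changes how a graph should be read.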
Random, thin spikes are not too much of an issue. However, a block of spikes in close proximity to one another, especially ones that jump quite high, will likely be noticeable and interpreted as 'stutter'.
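The distinction between a lone spike and a block of spikes can also be sketched in code. This is our own illustrative helper, not part of FRAPS or FRAFS: it reports only runs of consecutive slow frames, ignoring isolated outliers.

```python
# Sketch: find blocks of consecutive slow frames, the pattern most
# likely to be felt as stutter. Function and data are hypothetical.

def spike_clusters(frame_times_ms, threshold_ms=30.0, min_run=3):
    """Return (start, end) index pairs for runs of at least min_run
    consecutive frames over the threshold; lone spikes are ignored."""
    clusters, run_start = [], None
    for i, t in enumerate(frame_times_ms + [0.0]):  # sentinel closes a trailing run
        if t > threshold_ms:
            if run_start is None:
                run_start = i
        else:
            if run_start is not None and i - run_start >= min_run:
                clusters.append((run_start, i - 1))
            run_start = None
    return clusters

# One isolated spike (index 1) and one block of four slow frames (3-6).
trace = [16.0, 40.0, 16.0, 35.0, 38.0, 41.0, 36.0, 16.0]
print(spike_clusters(trace))  # -> [(3, 6)]
```

The isolated spike at index 1 is dropped; only the sustained block is reported, mirroring how a reader should weigh the graphs.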
The 99th Percentile
To best explain what the 99th percentile is, imagine you have taken a maths test. You score 80 out of 100 (80 percent). You are, however, in the 99th percentile, because only 1% of the other people who took the test scored above your mark. The percentile does not look at the result itself but compares your result with those of everyone else in the group who took the same test.
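The test-score analogy can be worked through numerically. This small sketch (our own illustration, with made-up scores) computes a percentile rank: the share of the group scoring at or below your mark, which is independent of the mark itself.

```python
# Sketch of the test-score analogy: your percentile rank is the share
# of the group at or below your score. Data is made up for illustration.

def percentile_rank(scores, your_score):
    at_or_below = sum(1 for s in scores if s <= your_score)
    return 100.0 * at_or_below / len(scores)

# A class of 100: 99 people score 60, one person scores 95.
scores = [60] * 99 + [95]
print(percentile_rank(scores, 80))  # -> 99.0; only one person beat 80
```

An 80% score lands in the 99th percentile here because only one member of the group scored higher, exactly as in the example above.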
From the graph above we can see that the 99th percentile is 31ms. This is the frame time below which 99% of frames are rendered. The measurement omits the runts and oddities, giving us a clearer picture of what is going on without the random spikes in frame latency. So in the case above, 99% of the time you can expect the graphics card to render frames faster than 31ms.
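Applied to frame times, the same idea looks like this. The sketch below is our own nearest-rank implementation on a made-up trace, not the exact method FRAFS uses, but it shows why the 99th percentile shrugs off a single outlier while the raw maximum does not.

```python
# Sketch: nearest-rank percentile over a frame-time trace (ms).
# Function and data are hypothetical illustrations.

def percentile(values_ms, pct):
    """Return the value below which roughly pct% of frames fall."""
    ordered = sorted(values_ms)
    rank = max(0, int(round(pct / 100.0 * len(ordered))) - 1)
    return ordered[rank]

# Made-up trace: 99 fast frames and one slow outlier.
trace = [16.0] * 99 + [120.0]
print(percentile(trace, 99))  # -> 16.0; the lone outlier is excluded
print(max(trace))             # -> 120.0; the raw worst case
```

One 120ms hitch drags the worst-case figure way up, yet the 99th percentile still reports 16ms, which is why it is the more honest summary of what the card delivers almost all of the time.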