To Sync or to Shear, That is the Question.
When I said Sync, I of course meant VSync, but that would have sounded rubbish. Ever seen a live interview where the host has finished a question and the guest nods their head for embarrassing seconds while the audio bounces off a satellite and into the space between their ears? No doubt you’ve witnessed the wonders of cutting edge communication. Those hyper-intelligent handsets that allow us to hold conversations without ever completing a sentence?
VSync in its traditional form instructed the GPU never to dispatch images at a frequency that exceeded the monitor’s refresh rate and, moreover, to forward them in strict accordance with the latter’s “vertical blank” signal. Applying our analogy, this was the display’s way of saying to its willing supplier, “I’ve finished my sentence, over to you”. As the frame rate was now wholly dictated by the monitor’s requests, it was no longer possible for our player to suffer the visual penalties imposed by partially drawn images.
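For the programmers among you, that v-blank handshake usually surfaces as nothing more than a “swap interval”. The snippet below is a minimal sketch using GLFW and OpenGL (my choice of library for illustration, not anything mandated by the hardware); passing 1 to glfwSwapInterval asks the driver to hold each buffer swap until the next vertical blank, which is traditional VSync in a nutshell.

```c
/* Minimal sketch of traditional VSync via GLFW's swap interval.
 * Build with something like: cc vsync.c -lglfw -lGL */
#include <GLFW/glfw3.h>

int main(void)
{
    if (!glfwInit())
        return -1;

    GLFWwindow *window = glfwCreateWindow(1280, 720, "VSync demo", NULL, NULL);
    if (!window) {
        glfwTerminate();
        return -1;
    }

    glfwMakeContextCurrent(window);

    /* 1 = wait for one vertical blank per swap (VSync on).
     * 0 = swap immediately; frames may tear (VSync off). */
    glfwSwapInterval(1);

    while (!glfwWindowShouldClose(window)) {
        glClear(GL_COLOR_BUFFER_BIT);   /* render the frame here */
        glfwSwapBuffers(window);        /* with interval 1, blocks until the v-blank */
        glfwPollEvents();
    }

    glfwTerminate();
    return 0;
}
```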
In short, no tears no teers (sic!), correct? Not quite; the advantage came at a price. Firstly, if the game was relatively light on resources, performance could be heavily impeded by the display. A monitor with a refresh rate of 60Hz was restricted to scanning 60 frames per second, irrespective of the GPU’s abilities. In scenarios where the video card would normally be processing frames at a far higher rate, this meant that its frame buffers were filled to capacity in a very short period and the GPU was forced to wait for pre-rendered data to be cleared before its duties could continue. To those with screens equipped to refresh at 120Hz and 144Hz this was less of a hindrance, since these bestowed an elevated workload on the GPU.
However, this in turn gave rise to a second and more troubling anomaly. If the game was exceptionally onerous and the GPU was unable to keep pace with the monitor’s demands, it was forced to regulate its frame rate in fixed steps derived from factors of the refresh rate. Hence, 60fps could not be subtly reduced to 50 then boosted to 56; it had to be rounded down in whole factors, to 30, 20, 15 and so on, then back up via the same sequence. The net effect was two aggravating side effects, stuttering and input lag, often severe enough to outweigh the advantages of shear and tear free motion.
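If you prefer numbers to prose, the little sketch below models the penalty, assuming plain double buffering on a 60Hz panel (the render times are invented purely for illustration). A frame that misses one vertical blank simply waits for the next, so a GPU capable of, say, 56fps collapses to 30 on screen.

```c
/* Back-of-the-envelope illustration of VSync's "whole factor" stepping.
 * Assumes double buffering on a 60Hz display; render times are made up. */
#include <math.h>
#include <stdio.h>

int main(void)
{
    const double refresh_hz        = 60.0;
    const double blank_interval_ms = 1000.0 / refresh_hz;   /* ~16.67ms */

    /* Hypothetical per-frame render times: what the GPU could do if left alone. */
    const double render_ms[] = { 15.0, 18.0, 20.0, 35.0, 60.0 };

    for (int i = 0; i < 5; i++) {
        /* Number of whole refresh cycles each frame ends up occupying. */
        int    cycles       = (int)ceil(render_ms[i] / blank_interval_ms);
        double uncapped_fps = 1000.0 / render_ms[i];
        double vsync_fps    = refresh_hz / cycles;

        printf("render %5.1fms -> uncapped %5.1ffps, with VSync %5.1ffps\n",
               render_ms[i], uncapped_fps, vsync_fps);
    }
    return 0;
}
```

Running it shows the staircase in action: anything slower than 16.67ms per frame drops straight to 30fps, anything slower than 33.3ms drops to 20, and so on down the sequence.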
Adapt and Survive?
Adaptive VSync was Nvidia’s first official workaround and emerged within drivers that accompanied the launch of its GTX 680 in 2012. As the name suggests, it worked with a modicum of intelligence, automatically activating VSync whenever the frame rate matched or exceeded the monitor’s refresh rate and switching it off at all other times. This removed a sizeable percentage of tearing, but at less of a cost to performance due to the GPU being allowed to manage its frame rate in a “step-less” fashion. It was a fair compromise but far from a definitive one and, given the consequences already covered, the reasons are self-evident.
For one, even though the GPU was ordered never to breach the monitor’s limits, it had not been told to use the v-blank signal as the only basis for data requests. Thus there remained the chance, albeit a slimmer one, that a frame would be sent to the screen whilst a refresh cycle was in progress. Moreover, in circumstances where the host computer harboured exceptional pixel power, such as when multiple GPUs were installed, input lag could still be an issue as soon as the card’s memory reserves were expended.
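To make the switching logic concrete, here is a rough sketch of the principle; this is emphatically not Nvidia’s driver code, merely an application-side approximation assuming the GLFW/OpenGL setup from earlier and a 60Hz panel, with update_scene and draw_scene standing in for whatever the game actually does.

```c
/* Rough, illustrative approximation of "adaptive" VSync toggling.
 * Assumes a current GLFW/OpenGL context on a 60Hz display. */
#include <GLFW/glfw3.h>

#define REFRESH_HZ     60.0
#define BLANK_INTERVAL (1.0 / REFRESH_HZ)   /* seconds per refresh cycle */
#define SMOOTHING      0.9                  /* weight given to frame-time history */

static void update_scene(void) { /* placeholder game logic */ }
static void draw_scene(void)   { glClear(GL_COLOR_BUFFER_BIT); /* placeholder rendering */ }

void run_adaptive_loop(GLFWwindow *window)
{
    double avg_frame_time = BLANK_INTERVAL;   /* start as if running at exactly 60fps */
    int    vsync_on       = 1;

    glfwSwapInterval(1);

    while (!glfwWindowShouldClose(window)) {
        double start = glfwGetTime();

        update_scene();
        draw_scene();
        glfwSwapBuffers(window);
        glfwPollEvents();

        /* Exponential moving average of the measured frame time. */
        double elapsed = glfwGetTime() - start;
        avg_frame_time = SMOOTHING * avg_frame_time + (1.0 - SMOOTHING) * elapsed;

        /* Keeping pace with the display: sync to the v-blank and stop the tearing.
         * Falling behind: free-run so the rate degrades smoothly instead of
         * snapping down to 30, 20, 15...  The 5% margin is crude hysteresis to
         * stop timing noise from toggling the setting every other frame. */
        int want_vsync = (avg_frame_time <= BLANK_INTERVAL * 1.05);
        if (want_vsync != vsync_on) {
            glfwSwapInterval(want_vsync ? 1 : 0);
            vsync_on = want_vsync;
        }
    }
}
```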
So Now What?
For many seasoned veterans, the choice was obvious and one which they shall steadfastly favour until an unconditional remedy materialises. I’d hoped to coin a phrase for it. Grindhouse Gaming. The term “Grindhouse” is most often associated with 1970s cinema and is American slang for a theatre that was used to screen content of an explicit and exploitative nature, produced on comparatively meagre budgets and, due to poor quality prints, regularly exhibiting the visual equivalent of pops and clicks on a vinyl record.
The genre was paid tribute by Quentin Tarantino in his 2007 film “Death Proof” and, from a technical perspective, every one of its defining details was clinically observed. Colour bleed, dust and scratches on the film, cigarette burns as the reels were changed. How is this relevant? Simple, just as there were audiophiles who were tied to their turntables and film buffs who would settle for nothing less than a human projectionist, there are gamers who would gleefully embrace a warts and all experience in order to extract their video card’s full potential. As we know, with VSync out of the picture, the GPU was not compelled to align its flips with the display’s vertical blanks and could liberally unleash every image it had rendered, thereby eliminating buffer overflow, maximising speed and minimising latency.