Three-Way SLI and Quadfire Benchmarks (2010)


Welcome to a detailed comparison of two enthusiast-grade video card configurations.  The first features three Nvidia GTX 580s running in tri-SLI, while the second comprises a pair of ATI Radeon HD 5970s set up to operate in CrossFire.  This extensive analysis is intended to show how both platforms perform across a series of individual tests made up of synthetic and game-specific benchmarks.

At the time of testing, the most popular resolution for high-quality gaming was 1920×1200, so that was the sole resolution applied throughout this comparison.  It is worth noting, however, that a significant portion, perhaps even the majority, of those with high-end graphics subsystems such as these will now either be using the highest 16:10 resolution commonly available on a single screen (2560×1600) or will have invested in additional monitors in order to take advantage of the multi-screen gaming technology presently offered by Nvidia’s “3D Vision Surround” or ATI’s “Eyefinity”.  It is therefore also important to briefly raise the much-debated issue of “bottlenecking”, which occurs when the performance of a faster component (in this case the video cards) is directly or indirectly held back by a slower one (in this case the CPU).
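To make the idea concrete, the short sketch below shows one rough way of flagging a CPU-bound result: if two very different GPU configurations land within a few percent of each other, the graphics cards probably were not the limiting factor.  The helper name and the 5% threshold are illustrative assumptions only, not part of the methodology used for these tests.

```python
# Hypothetical helper for spotting likely CPU-bound results.
# If two very different GPU configurations produce nearly identical
# frame rates, the GPUs probably weren't the limiting factor.

def likely_cpu_bound(fps_a: float, fps_b: float, tolerance: float = 0.05) -> bool:
    """Return True when the two frame rates differ by less than `tolerance`
    (relative to the faster result), suggesting a CPU bottleneck."""
    faster = max(fps_a, fps_b)
    return abs(fps_a - fps_b) / faster < tolerance

# Illustrative values only -- not measured results from this article.
print(likely_cpu_bound(142.0, 140.5))  # True: near-identical, likely CPU-bound
print(likely_cpu_bound(96.0, 71.0))    # False: clear GPU scaling
```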

Even at the relatively high resolution used during these tests, symptoms of bottlenecking were vividly evident: several results from both setups showed little or no frame-rate discrepancy, an obvious and well-documented indication that the CPU could not keep up.  This was especially noticeable when the default settings were engaged in either the application or the video card’s control panel.  To lessen the effect and force the graphics cards to do as much work as possible at the chosen resolution, all of the tests were rerun using alternate settings for visual quality, or “eye candy” as it’s known in the trade!  When seeking to increase visual quality, two options are called upon above all others: anti-aliasing, to blend jagged edges, and anisotropic filtering, to enhance the appearance of textures.  Where permitted by the application, the following profiles were used:

Low Quality – 1920×1200 – 0xaa and 0xaf – The default profile, with both anti-aliasing (aa) and anisotropic filtering (af) disabled and thus the lowest quality.
Medium Quality – 1920×1200 – 4xaa and 16xaf – Anti-aliasing set to 4x (4 samples) and anisotropic filtering set to 16x, the maximum currently possible.
High Quality – 1920×1200 – 8xaa and 16xaf – Anti-aliasing raised to 8x while anisotropic filtering remains at 16x.

If one or more of the above settings could not be enforced in a particular benchmark, first within the software’s configuration options and, failing that, via the video card’s control panel, the test was skipped and/or a “best match” alternative was run in its place.  Two such examples were 3DMark Vantage, where only the “Extreme” test was run (which enables 4xaa and 16xaf by default), and Aliens vs. Predator, where it was not possible to disable anisotropic filtering, so the low-quality profile was amended to 0xaa and 16xaf.
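For readers who want to script a similar sweep, here is a minimal sketch of how the three quality profiles might be encoded and iterated over.  The structure, names, and run_benchmark hook are assumptions made purely for illustration and do not reflect the tooling actually used for this comparison; in practice, AA and AF are set in each application (or in the video card’s control panel) by hand.

```python
# Minimal sketch: encoding the three quality profiles used in this comparison.
# The run_benchmark() function is a hypothetical placeholder.

PROFILES = {
    "Low Quality":    {"resolution": (1920, 1200), "aa": 0, "af": 0},
    "Medium Quality": {"resolution": (1920, 1200), "aa": 4, "af": 16},
    "High Quality":   {"resolution": (1920, 1200), "aa": 8, "af": 16},
}

def run_benchmark(name: str, resolution: tuple[int, int], aa: int, af: int) -> None:
    """Placeholder: launch a benchmark with the given settings and record the result."""
    width, height = resolution
    print(f"{name}: {width}x{height}, {aa}x AA, {af}x AF")

if __name__ == "__main__":
    for profile_name, settings in PROFILES.items():
        run_benchmark(profile_name, **settings)
```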
