CrossFire Gaming Performance

The RD580 chipset brings ATI Dual X16 CrossFire to the marketplace, so both of the major players in the video market now have flagship dual x16 solutions. SLI and CrossFire are about gaming, so CrossFire tests were confined to gaming benchmarks, and the test suite is heavily slanted toward recent and popular titles where SLI and CrossFire make the biggest difference.

However, the practical reality today is that NVIDIA SLI works only on NVIDIA boards, while ATI CrossFire works on ATI and Intel boards. This limits our ATI Dual X16 testing to CrossFire. All CrossFire testing was done at 1600x1200 with 4X AA and 8X AF, and tests were also run with a single X1900XT at the same settings. The single video results are included for reference in each game's results: single video results on the DFI CFX3200 are shown in orange and the DFI CrossFire results in red.

[Charts: Gaming Performance - CrossFire results for Call of Duty 2, Serious Sam 2, Half-Life 2: Lost Coast, Splinter Cell: Chaos Theory, F.E.A.R., and Far Cry]

With 4X AA and 8X AF turned on at 1600x1200 resolution, CoD2, SS2, HL2LC and SCCT all show huge performance gains with CrossFire. The gains in F.E.A.R. are smaller, but still significant. Look closely, however, and you will see that Far Cry shows no improvement at all with a CrossFire platform compared to a single X1900XT, which indicates that this older title is CPU limited with the latest graphics cards. The DFI CFX3200 was generally a very good performer in CrossFire mode, topping the comparisons in most benchmarks. After all the early issues with the RDX200 and CrossFire, this should be welcome news for potential buyers.
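
To put the scaling comparison in concrete terms, here is a minimal sketch of the arithmetic; the frame rates in it are hypothetical placeholders, not our measured results. A title whose CrossFire average barely moves past its single-card average - as Far Cry does here - is effectively CPU limited.

    # Illustrative only: the frame rates below are hypothetical placeholders,
    # not the results measured in this review.
    def crossfire_scaling(single_fps, crossfire_fps, cpu_limited_threshold=5.0):
        """Return the percentage gain of CrossFire over a single card and a
        flag indicating whether the title looks CPU limited (negligible gain)."""
        gain_pct = (crossfire_fps - single_fps) / single_fps * 100.0
        return gain_pct, gain_pct < cpu_limited_threshold

    if __name__ == "__main__":
        # Hypothetical single-card / CrossFire averages, purely to show the math.
        results = {"Title that scales": (60.0, 105.0), "CPU-limited title": (120.0, 122.0)}
        for title, (single, cf) in results.items():
            gain, cpu_limited = crossfire_scaling(single, cf)
            note = "likely CPU limited" if cpu_limited else "GPU scaling"
            print(f"{title}: {gain:+.1f}% with CrossFire ({note})")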

We complained about ATI's CrossFire interface in our last CrossFire review, and it remains clumsy and anything but intuitive. To use CrossFire you MUST install Catalyst Control Center; if CCC is not installed, CrossFire will not work. Since many users don't install CCC (due to its large memory footprint and an interface that quite a few people dislike), this is not an ideal requirement for CrossFire. Once the CCC drivers are installed, you must also get the latest DirectX update from Microsoft - at least the February 2006 update is required, or the CrossFire tab will not appear in CCC. Then you must go into CCC, select the CrossFire tab, and enable the feature.
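
For readers who want to sanity-check those prerequisites before launching a game, the rough sketch below only tests for the presence of two files; the paths are assumptions on our part (a default Catalyst Control Center install location and the D3DX DLL distributed with the February 2006 DirectX update), and it cannot enable CrossFire for you - that still has to be done by hand in the CCC CrossFire tab.

    import os

    # Assumed locations - adjust for your own system.
    CCC_PATH = r"C:\Program Files\ATI Technologies\ATI.ACE\CLI.exe"  # default CCC install path (assumption)
    DX_FEB2006_DLL = r"C:\Windows\system32\d3dx9_29.dll"             # D3DX DLL from the Feb 2006 update (assumption)

    def check_crossfire_prereqs():
        """Report whether the two prerequisites named above appear to be installed."""
        checks = {
            "Catalyst Control Center installed": os.path.isfile(CCC_PATH),
            "February 2006 DirectX update present": os.path.isfile(DX_FEB2006_DLL),
        }
        for name, ok in checks.items():
            print(f"[{'OK' if ok else 'MISSING'}] {name}")
        return all(checks.values())

    if __name__ == "__main__":
        if check_crossfire_prereqs():
            print("Prerequisites found - remember to enable CrossFire manually in CCC.")
        else:
            print("Install the missing pieces or the CrossFire tab will not appear in CCC.")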

This is a MANUAL procedure - there is no warning at all that you have a CrossFire-capable board or that you have to go into CCC to enable CrossFire. NVIDIA notifies you that an SLI-capable system is installed and prompts you to enable SLI; there is no such alert with ATI. The only clue that CrossFire is NOT turned on is poor performance results - THEN you start looking for what's wrong. ATI really needs to fix this: CrossFire performance is very good, but it would be much nicer if ATI made it easier to turn on CrossFire and provided a clear way to confirm that CrossFire is functioning properly (a la NVIDIA's "Show GPU load balancing" option).

25 Comments

  • rqle - Monday, May 8, 2006 - link

    "...breathlessly waiting for DFI's AM2 and Conroe motherboards."
    Great board, but not sure where this new mainboard will fit in since AM2 is coming; many can opt for the nForce Expert if they need a board before AM2.

    hoping an AM2 version is in the works and will be released soon as well.
  • electronox - Monday, May 8, 2006 - link

    *sigh*

    as far as gaming benchmarks go, what we really need to learn to do is focus on the lowest framerates rather than the highest framerates (or even the average framerate). fink, anand, and co., you guys offer progressive tech journalism and no doubt have thought about what FPS performance really means.

    in its most important application, FPS performance means the ability to convey a smooth, fluid visual experience without noticeable dips or jerks in motion. sadly, with the way things are marketed now, the overall fluidity of gaming is sacrificed to reach those peak framerates we all obsess about in our benchmarking suites.

    as a long time gamer and enthusiast-sector consumer, i wish such high profile websites as yours would pay more attention to the worst parts of FPS gaming - the parts of the game where the intensity of in-game content is notched up, but often our video settings must be turned down in order to prevent epileptic seizures. such media attention might, in turn, lead industry developers to optimize their drivers for this exceedingly common problem which, in my opinion, is just as easily quantifiable and every bit as important as average FPS performance.

    my thoughts, electronox.
  • Dfere - Monday, May 8, 2006 - link

    I have to agree. I make good money, but I no longer have the time to play with bleeding-edge components and do modding. I know this is an enthusiast site, but at least for me, and I think a large number of readers, an analysis of the max you might get out of a bleeding-edge system is not all the value your site brings. A lot of posts by the readers show they have mid-range systems. Thus I can only agree that an analysis of the FPS "issues" described above on a mid-range system would help readers identify what would best go with their current system, not just a top-of-the-line upgrade. I know your testing tries to determine, for example, CPU limits or GPU limits... but it really only does so on bleeding-edge systems... and these comments were already mirrored in the latest AGP vid card releases... (why compare a new AGP card with a new processor when most AGP owners have Socket 754 systems, etc.?)
  • JarredWalton - Monday, May 8, 2006 - link

    I think it all depends on what game you're talking about, and how the impact is felt in the fluidity of the FPS score. These days, the vast majority of first-person shooters have a pretty consistent FPS, at least in normal gaming. In benchmarks, you're often stressing the games in a somewhat unrealistic sense -- playing back a demo at three or four times the speed at which it was recorded. Why does that matter? Well, depending on the game engine, loading of data can occur in the background without actually slowing performance down much, if at all. In a time demo, you don't generally get that capability, since everything moves much faster.

    There are several other difficulties with providing minimum frame rates. Many games don't report instantaneous frames per second and only provide you with the average score. (Doom 3, Quake 4, Call of Duty 2, Half-Life 2, and Day of Defeat: Source all generate scores automatically, but don't provide minimum and maximum frame rates.) If we notice inconsistent frame rates, we do generally comment on the fact. About the only game where I still notice inconsistent frame rates is Battlefield 2 with only 1GB of RAM -- at least on a system of this performance level. (I suppose I should throw in Oblivion as well.)

    Sure, we could use tools like FRAPS to gather more detailed information (a short sketch of that kind of frame-time summary appears at the end of this comment), but given that there's a limited amount of time to get reviews done, would you rather have fewer games with more detailed stats, or more games with average frame rates? Realistically, we can't do both on every single article. Our motherboard reviews try to stay consistent within motherboard articles, our processor reviews do the same within CPU articles, and the same goes for graphics cards and other areas. If we have an article where we look at results from one specific game, we will often use that to establish a baseline metric for performance, and readers who are interested in knowing more about the benchmark can refer back to that game article.

    Average frame rates are not the be-all, end-all of performance. However, neither are they useless or meaningless. We run into similar problems if we report minimum frame rates -- did the minimum frame rate occur once, twice, frequently? As long as people understand that average frame rates are an abstraction representing several layers of performance, then they can glean meaning from the results. You almost never get higher average frame rates with lower minimum frame rates, or conversely lower average frame rates with higher minimum frame rates -- not in a single game. In the vast majority of benchmarks, an increase in average frame rate of 10 FPS usually means that minimum frame rates have gone up as well -- maybe not 10 FPS, but probably 7 or 8 FPS at least.

    In the end, without turning every article into a treatise on statistics, not to mention drastically increasing the complexity of our graphs, it's generally better to stick with average frame rates. Individual articles may look at minimum and maximum frame rates as well, but doing that for every single article that uses a benchmark rapidly consumes all of our time. Are we being lazy, or merely efficient? I'd like to think it's the latter. :-)

    Regards,
    Jarred Walton
    Hardware Editor
    AnandTech.com
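
    A minimal sketch of that kind of frame-time summary, assuming nothing more than a list of per-frame render times in milliseconds (roughly what FRAPS-style frame time logging yields): it reports the average FPS, the worst instantaneous FPS, and how often the frame rate dips below a fluidity threshold - the "did the minimum occur once or frequently?" question.

        def summarize_frametimes(frame_times_ms, dip_threshold_fps=30.0):
            """Reduce per-frame render times (ms) to average FPS, minimum FPS,
            and a count of frames slower than the dip threshold."""
            total_s = sum(frame_times_ms) / 1000.0
            avg_fps = len(frame_times_ms) / total_s
            instantaneous_fps = [1000.0 / t for t in frame_times_ms]
            min_fps = min(instantaneous_fps)
            dips = sum(1 for fps in instantaneous_fps if fps < dip_threshold_fps)
            return avg_fps, min_fps, dips

        if __name__ == "__main__":
            # Hypothetical frame times (ms), purely to show the arithmetic.
            times = [16.7] * 200 + [45.0] * 3 + [16.7] * 200
            avg_fps, min_fps, dips = summarize_frametimes(times)
            print(f"avg {avg_fps:.1f} fps, min {min_fps:.1f} fps, {dips} frames under 30 fps")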
  • OvErHeAtInG - Monday, May 8, 2006 - link

    Good answer :) Also I think that minimum framerates (while very important in gameplay) are much more impacted by the videocard used. With a motherboard review, we're much more concerned with overall performance, which is exactly what you gave us with the avg. framerate numbers...
