Graphics Performance Comparison

With the background and context of the benchmark covered, we now dig into the data to see what we have to look forward to with DirectX 12 game performance. This benchmark ships with preconfigured batch files that launch the utility at either 3840x2160 (4K) with settings at ultra, 1920x1080 (1080p) also at ultra, or 1280x720 (720p) with low settings better suited to integrated graphics environments.
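The three preset modes can be sketched as a small lookup; note that the executable name and command-line flags below are invented for illustration, since the benchmark's actual batch files are not reproduced here:

```python
# Hypothetical sketch of the three preconfigured launch modes. The real
# benchmark ships batch files; the executable name and flags here are
# assumptions for illustration only.
PRESETS = {
    "4k-ultra":    (3840, 2160, "ultra"),
    "1080p-ultra": (1920, 1080, "ultra"),
    "720p-low":    (1280, 720,  "low"),
}

def launch_command(preset: str) -> str:
    """Build the (hypothetical) command line for a given preset."""
    width, height, quality = PRESETS[preset]
    return f"FableBenchmark.exe -w {width} -h {height} -quality {quality}"

print(launch_command("720p-low"))
```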

Fable Legends Beta: 3840x2160 Ultra, Core i7

Fable Legends Beta: 3840x2160 Ultra, Core i5

Fable Legends Beta: 3840x2160 Ultra, Core i3

When dealing with the 3840x2160 resolution, the GTX 980 Ti has a single-digit percentage lead over the AMD Fury X, but both stay above the bare minimum of 30 FPS regardless of the CPU.

Fable Legends Beta: 1920x1080 Ultra, Core i7

Fable Legends Beta: 1920x1080 Ultra, Core i5

Fable Legends Beta: 1920x1080 Ultra, Core i3

With the i5 and i7 at 1920x1080 ultra settings, the GTX 980 Ti still holds that single-digit percentage lead, but at Core i3 levels of CPU power the difference is next to zero, suggesting we are CPU limited, even though the frame rate drop from i5 to i3 is minimal. Looking at the range of cards under the Core i7 at this point, the interesting result is that the GTX 970 just about hits the 60 FPS mark, while some of the older generation cards (7970/GTX 680) would require compromises in the settings to get over the 60 FPS barrier at this resolution. The GTX 750 Ti doesn't come anywhere close, suggesting that this game (under these settings) is targeting upper-mainstream to lower high-end cards. It would be interesting to see if there is an overriding game setting that ends up crippling this level of GPU.
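The arithmetic behind these observations can be sketched as follows; the FPS figures in the example are placeholders for illustration, not the measured results:

```python
def percent_lead(fps_a: float, fps_b: float) -> float:
    """Percentage lead of card A over card B."""
    return (fps_a / fps_b - 1.0) * 100.0

def looks_cpu_limited(fps_by_gpu: dict, threshold_pct: float = 2.0) -> bool:
    """Heuristic: if very different GPUs land within a small percentage of
    each other, the bottleneck is likely elsewhere (e.g. the CPU)."""
    fastest = max(fps_by_gpu.values())
    slowest = min(fps_by_gpu.values())
    return percent_lead(fastest, slowest) < threshold_pct

# Placeholder numbers, for illustration only:
print(round(percent_lead(66.0, 62.0), 1))                       # a single-digit lead
print(looks_cpu_limited({"GTX 980 Ti": 57.0, "Fury X": 56.5}))  # True
```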

Fable Legends Beta: 1280x720 Low, Core i7

Fable Legends Beta: 1280x720 Low, Core i5

Fable Legends Beta: 1280x720 Low, Core i3

At the 720p low settings, the Core i7 pushes everything above 60 FPS, but you need at least an AMD 7970/GTX 960 to start approaching 120 FPS for high refresh rate panels. We are likely being held back by CPU performance, as illustrated by the GTX 970 and GTX 980 Ti being practically tied and the R9 290X stepping ahead of the pack. This makes it interesting when we consider integrated graphics, which we might test for a later article. It is worth noting that at this low resolution, the R9 290X and Fury X pull out a minor lead over the NVIDIA cards. The Fury X expands this lead with the i5 and i3 configurations, just tipping into double-digit percentage gains.
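For reference, the frame-time budgets behind the 30/60/120 FPS targets discussed above work out as follows (a quick arithmetic sketch, not benchmark output):

```python
def frame_budget_ms(target_fps: float) -> float:
    """Milliseconds available to render each frame at a given target rate."""
    return 1000.0 / target_fps

# 30 FPS -> ~33.33 ms, 60 FPS -> ~16.67 ms, 120 FPS -> ~8.33 ms per frame
for fps in (30, 60, 120):
    print(f"{fps:>3} FPS -> {frame_budget_ms(fps):.2f} ms per frame")
```

The jump from 60 to 120 FPS halves the per-frame budget to roughly 8.3 ms, which is why only the stronger cards in this lineup get near high-refresh territory even at 720p low.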

Fable Legends Early Preview: DirectX 12 Benchmark Analysis

Comments

  • Gotpaidmuch - Thursday, September 24, 2015 - link

    Sad day for all of us when even the small wins that AMD gets are omitted from the benchmarks.
  • Oxford Guy - Thursday, September 24, 2015 - link

    "we are waiting for a better time to test the Ashes of the Singularity benchmark"
  • ZipSpeed - Thursday, September 24, 2015 - link

    The 7970 sure has legs. Turn the quality down one notch from ultra to high, and the card is still viable for 1080p gaming.
  • looncraz - Thursday, September 24, 2015 - link

    As a long-time multi-CPU/threaded software developer, I can say AMD's results show one thing quite clearly: they have some unwanted lock contention in their current driver.

    As soon as that is resolved, we should see a decent improvement for AMD.

    On another note, am I the only one that noticed how much the 290X jumped compared to the rest of the lineup?!

    Does that put the 390X on par with the 980 for Direct X 12? That would be an interesting development.
  • mr_tawan - Thursday, September 24, 2015 - link

    Well, even if UE4 uses DX12, it would probably be just a straight port from DX11 (rather than from XBONE or other consoles). The approach it uses may not favour AMD as much as NVIDIA, who knows?

    Also, I think the NVIDIA people would have been involved with the engine development more than AMD (due to the size of their developer relations team, I guess). Oxide Games also mentioned that they got this kind of involvement as well (even though their game is an AMD title).
  • tipoo - Thursday, September 24, 2015 - link

    Nice article. Looks like i3s are only going to get *more* feasible for gaming rigs under DX12. There's still the odd title that suffers without a quad core, but most console ports at least should do fine.
  • ThomasS31 - Thursday, September 24, 2015 - link

    Still not a game performance test... nor a CPU one.

    There is no AI... and I guess a lot more is missing that would make a difference in CPU as well.

    Though yeah... kinda funny that an i3 is "faster" than an i5/i7 here. :)
  • Traciatim - Thursday, September 24, 2015 - link

    This is what I was thinking too. I thought that DX12 might shake up the old rule of thumb saying 'i5 for gaming and i7 for working', but it seems this still holds true. In some cases it might even make more sense budget-wise to go for a high end i3 and sink as much as possible into your video card rather than go for an i5, depending on your budget and expected configuration.

    More CPU benchmarking and DX12 benchmarks are needed of course, but it still looks like the design of machines isn't going to change all that much.
  • Margalus - Friday, September 25, 2015 - link

    this test shows absolutely nothing about "gaming". It is strictly rendering. When it comes to "gaming" I believe your i3 is going to drop like a rock once it has to start dealing with AI and other "gaming" features. Try playing something like StarCraft or Civilization on your i3. I don't think it's going to cut the mustard in the real world.
  • joex4444 - Thursday, September 24, 2015 - link

    As far as using X79 as the test platform here goes, I'm mildly curious what sort of effect the quad channel RAM had. Particularly with Core i3, most people pair that with 2x4GB of cheap DDR3 and won't be getting even half the memory bandwidth your test platform had available.

    Also fun would be to switch to X99 and test the Core i7-5960X, or to drop an E5-2687W into the X79 platform (hey, it *is* supported after all).
