AMD's Richland vs. Intel's Haswell GPU on the Desktop: Radeon HD 8670D vs. Intel HD 4600
by Anand Lal Shimpi on June 6, 2013 12:00 PM EST

Metro: Last Light
Metro: Last Light is the latest entry in the Metro series of post-apocalyptic shooters by developer 4A Games. Like its predecessor, Last Light is a game that sets a high bar for visual quality, and at its highest settings an equally high bar for system requirements thanks to its advanced lighting system. This doesn’t preclude it from running on iGPUs thanks to the fact that it scales down rather well, but it does mean that we have to run at fairly low resolutions to get a playable framerate.
Looking at desktop parts alone, Intel really suffers from not having a socketed GT3 SKU. Although HD 4600 is appreciably faster than HD 4000 (+30%), both Trinity and Richland are around 17% faster than it. As you'll see, Metro ends up being one of the smaller gaps between the two in our suite.
As memory bandwidth becomes the ultimate bounding condition, the gap between Richland and Haswell shrinks considerably. Note that on the HD 4600 side, the difference between DDR3-1333 and DDR3-2400 is only 10% here. Given the limited performance of the 20 EU Haswell GPU configuration, it doesn't seem like Intel is all that bandwidth limited here.
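For the curious, a quick back-of-the-envelope calculation puts that in perspective. This is only a rough sketch, assuming dual-channel DDR3 on the test platform; the function name is purely illustrative:

```python
# Back-of-the-envelope check on the bandwidth argument above (illustrative only).
# Theoretical peak for DDR3, assuming a dual-channel (2 x 64-bit) configuration:
#   transfer rate (MT/s) x 8 bytes per transfer x 2 channels
def ddr3_dual_channel_gbs(transfer_rate_mts):
    return transfer_rate_mts * 8 * 2 / 1000  # GB/s

bw_1333 = ddr3_dual_channel_gbs(1333)   # ~21.3 GB/s
bw_2400 = ddr3_dual_channel_gbs(2400)   # ~38.4 GB/s
print(f"DDR3-1333: {bw_1333:.1f} GB/s, DDR3-2400: {bw_2400:.1f} GB/s")
print(f"Raw bandwidth increase: {(bw_2400 / bw_1333 - 1) * 100:.0f}%")  # ~80%
# An ~80% jump in theoretical bandwidth buying only ~10% more performance is
# consistent with the 20 EU Haswell GPU not being bandwidth limited here.
```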
BioShock: Infinite
Bioshock Infinite is Irrational Games’ latest entry in the Bioshock franchise. Though it’s based on Unreal Engine 3 – making it our obligatory UE3 game – Irrational has added a number of effects that make the game rather GPU-intensive on its highest settings. As an added bonus it includes a built-in benchmark composed of several scenes, a rarity for UE3 engine games, so we can easily get a good representation of what Bioshock’s performance is like.
If Metro was an example of the worst case scenario for Richland, BioShock: Infinite is the best case scenario. Here the Radeon HD 8670D holds a 50% performance advantage over Intel's HD 4600 graphics.
The gap narrows a bit at higher resolution/quality settings, but it's still 39%.
Sleeping Dogs
A Square Enix game, Sleeping Dogs is one of the few open world games to be released with any kind of benchmark, giving us a rare opportunity to quantify performance in this genre. Like most console ports, Sleeping Dogs’ base assets are not extremely demanding, but it makes up for it with its interesting anti-aliasing implementation, a mix of FXAA and SSAA that at its highest settings does an impeccable job of removing jaggies. However by effectively rendering the game world multiple times over, it can also require a very powerful video card to drive these high AA modes.
Richland is approaching 60 fps in our Sleeping Dogs benchmark at medium quality, definitely not bad at all. The advantage over Intel's HD 4600 is 34%.
The performance advantage grows a bit at the higher quality/resolution settings; however, we drop below the line of playability. With most of these games, though, you can trade off image quality for resolution.
Tomb Raider (2013)
The simply titled Tomb Raider is the latest entry in the Tomb Raider franchise, making a clean break from past titles in plot, gameplay, and technology. Tomb Raider games have traditionally been technical marvels and the 2013 iteration is no different. iGPUs aren’t going to have quite enough power to use its marquee feature – DirectCompute accelerated hair physics (TressFX) – however even without it the game still looks quite good at its lower settings, while providing a challenge for our iGPUs.
Tomb Raider is another title that doesn't put Richland in the best light, but it still ends up around 23% faster than Haswell GT2.
Battlefield 3
Our multiplayer action game benchmark of choice is Battlefield 3, DICE’s 2011 multiplayer military shooter. Its ability to pose a significant challenge to GPUs has been dulled some by time and drivers at the high-end, but it’s still a challenge for more entry-level GPUs such as the iGPUs found on Intel and AMD's latest parts. Our goal here is to crack 60fps in our benchmark, as our rule of thumb based on experience is that multiplayer framerates in intense firefights will bottom out at roughly half our benchmark average, so hitting medium-high framerates here is not necessarily high enough.
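To make that rule of thumb concrete, here's a minimal sketch of the arithmetic (the sample averages are purely illustrative):

```python
# Rule of thumb from the text: multiplayer framerates in intense firefights
# bottom out at roughly half the benchmark average.
def estimated_firefight_minimum(benchmark_avg_fps):
    return benchmark_avg_fps / 2

for avg in (40, 50, 60):
    print(f"{avg} fps average -> ~{estimated_firefight_minimum(avg):.0f} fps minimum in firefights")
# Only a ~60 fps benchmark average keeps the estimated minimum near a playable
# 30 fps, which is why medium-high numbers here aren't necessarily good enough.
```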
Richland's performance in Battlefield 3 climbs around 30% over the HD 4600 regardless of quality/resolution.
Crysis 3
With Crysis 3, Crytek has gone back to trying to kill computers, taking back the “most punishing game” title in our benchmark suite. Only in a handful of setups can we even run Crysis 3 at its highest (Very High) settings, and the situation isn't too much better for entry-level GPUs at its lowest quality setting. In any case Crysis 1 was an excellent template for the kind of performance required to drive games for the next few years, and Crysis 3 looks to be much the same for 2013.
Crysis 3 is another benchmark where Richland's advantage over the HD 4600 falls in the low 30% range.
Comments
Will Robinson - Sunday, June 9, 2013
LOL... NeelyCam must be crying his eyes out over those results. Good work, AMD!
Wurmer - Sunday, June 9, 2013
I still find this article interesting even if IGPs are certainly not the main focus of gamers. I don't consider myself a hardcore gamer, but I don't game on an IGP. I am currently using a 560 GTX, which provides me with decent performance in pretty much any situation. On the other hand, it gives an idea of the progress made by IGPs. I certainly would enjoy more performance from the one I am using at work, which is a GMA 4500 paired with an E8400. There are markets for good IGPs, but gaming is not one of them. As I see it, IGPs are more suited to being paired with low to mid-range CPUs, which would make very decent all-around machines.

lordmetroid - Monday, June 10, 2013
Using high-end games that will never be played on the internal graphics processor is totally pointless; why not use something like ETQW?

skgiven - Monday, June 10, 2013
Looks like you used a 65W GT640, released just over a year ago. You could have used the slightly newer and faster 49W or 50W models, or a 65W GTX640 (37% faster than the 65W GT640).
Better still, a GeForce GT 630 Rev. 2 (25W) with the same performance as a 65W GT640!
(I'm sure you don't have every GPU that's ever been released lying around, so I'm just saying what's out there.)
An i7-4770K, or one of its many siblings, costs ~$350.
For most light gaming and GPU apps, the Celeron G1610T (35W) along with a 49W GT640 would outperform the i7-4770K.
The combined wattage is exactly the same - 84W - but the relative price is $140!
Obviously the 25W GK208 GeForce GT 630 Rev. 2 would save you another $20 and give you a combined TDP of 60W, which is 40% better than the i7-4770K.
It’s likely that there will be a few more GT600 Rev. 2 models, and the GK700 range has to fill out. Existing mid-range GPUs offer >5 times the performance of the i7-4770K.
The reasons for buying an i7 still have little or nothing to do with its GPU!
skgiven - Monday, June 10, 2013
- Meant GTX645 (not GTX640).

NoKidding - Monday, June 24, 2013
I shudder to think what an A10 Kaveri can bring to the table considering it'll be equipped with AMD's GCN architecture and additional IPC improvements. Low price + 4 cores + (possibly) hybrid XFire with a 7xxx series Radeon? A great starting point for a decent gaming rig. Not to mention that the minimum baseline for PC gaming will rise from decent to respectable.

Silma - Friday, June 28, 2013
Sometimes I really don't understand your comparisons, and even less the conclusions. Why compare a Richland to a Haswell when obviously they will get used for totally different purposes? Who will purchase a desktop Haswell without a graphics card for gaming? Why use super expensive 2133 memory with a super bad processor?
There are really 3 conclusions to be had:
- CPU-wise, Richland sucks aplenty.
- GPU-wise, there is next to no progress compared to Trinity, the difference being fully explained by a small frequency increase.
- If you want cheap desktop gaming, you will be much better served by a Pentium G2020 + Radeon HD 6670 or HD 7750 for the same price as a crappy A6800 or A6700.
XmenMR - Monday, September 2, 2013
You make me laugh. I normally do not post comments on these things, based on the fact that I read them just to get a laugh, but I do have to point out how wrong you are. I have a G1620, G2020, i3-3240, A8, A10 and more, and have run benchmarks with a 6450, 6570, 6670, 7730, 7750 and 7770 for budget gaming builds for customers. Your build of a G2020 with a 6670 was, in my tests, beaten hands down by the A10-6800K HXF with a 7750 (yes, I said it, hybrid crossfire with a 7750; it can be done, although it's not officially supported by AMD). A G2020 with a 6670 will run you about $130, and an A10 with a 7750 is about $230. To match the A10 HXF 7750 ($230 value) performance with Intel I did have to use a 7750/7770 or higher with the Pentiums, and the i3 + 7750 ($210 value) did quite well but was still beaten in quite a few graphics-related areas.
Point being, a discrete GPU changes the whole aspect of the concept. An i3 + 7750 is very close to an A10 + HXF 7750 in more ways than just performance, but that's not the point of this topic. It was AMD's 8670D vs Intel's HD 4600. I know lots of people that buy an Intel i5 or i7 and live off the iGPU, thinking one day they'll have the money to get a nice GPU and call it good; 60% of the time this does not happen, new tech comes out that's better, and they just change their minds and try to get a whole new system. The APU, on the other hand, would have been cheaper and performed better for what they needed, had they just gone that road, and I am not the only one that came to that conclusion. AMD has done a great job with the APU, and after testing many myself, I have become a believer. A stock i5 computer for $700 got smashed by a stock $400 A10 in CS6 sitting side by side; I could not believe it. I do not have to argue how good the APU is doing because Microsoft and Sony have already done it. So I leave with a question: if the APU were not a fantastic alternative that delivers a higher standard of graphics performance, then why is it going to be used in the Xbox One and PS4?
ezjohny - Tuesday, September 10, 2013
When are we going to get an APU where you could go in-game and adjust the graphics settings to very high without a bottleneck?

nanomech - Sunday, December 8, 2013
This is a slanted review. The i7 with the separate Nvidia card skews the results, perhaps erroneously, toward Intel. How about the A10 with the same separate Nvidia card and/or a comparable separate AMD video card? The performance difference can be quite drastic. IMHO, one should compare apples to apples as much as possible. Doing so yields a much more complete comparison. I realize that these APUs tout their built-in graphics abilities, but Intel is trying to do so as well. It's the only way to give the CPU part of the APU a fair shake. That, or leave the i7-Nvidia results out completely.