Intel’s Integrated Graphics Mini-Review: Is Rocket Lake Core 11th Gen Competitive?
by Dr. Ian Cutress on May 7, 2021 10:20 AM EST

Integrated Graphics
For our main tests, there are several areas to consider in determining whether these processors are, at the very least, usable. This means:
- Competition against Intel’s previous IGPs: Comet Lake (i9-10900K) or Broadwell (i7-5775C)
- Competition against other integrated graphics: AMD Ryzen R4000
- Competition against entry level graphics: GT 1030
When comparing these against entry level graphics, the usual consideration is a combined price comparison – what CPU+GPU combo would be equivalent in price to the integrated graphics solution.
For example, the Core i5-11600K retails for $255. This is very expensive for a processor with integrated graphics – in the past integrated graphics solutions have targeted price points around $100-$160. That being said, AMD’s latest R4000 APUs cost $355, $262 and $170.
Our main data point is a Ryzen 5 2600 paired with a GT 1030 2 GB. In normal times, this is a $200 (SEP) processor paired with an $85 graphics card – it has been as low as $120+$70, but currently the pair sits at $210+$114, which is quite a considerable jump. We also have a Ryzen 9 5950X paired with a GTX 950 2GB (75W) in the results.
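As a back-of-the-envelope check, the comparison reduces to simple addition; below is a minimal sketch using only the prices quoted above (street prices move constantly, so treat the figures as a snapshot, not a recommendation):

```python
# Compare total platform cost: IGP-only CPU vs CPU + entry-level dGPU.
# All prices come from the figures quoted in this article and reflect
# a snapshot in time; they are illustrative only.

platforms = {
    "Core i5-11600K (UHD 750, IGP only)": 255,
    "Ryzen 5 2600 + GT 1030 (SEP)":       200 + 85,
    "Ryzen 5 2600 + GT 1030 (current)":   210 + 114,
}

# Sort from cheapest to most expensive total platform cost.
for name, cost in sorted(platforms.items(), key=lambda kv: kv[1]):
    print(f"{name:38s} ${cost}")
```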
Please note that some of this data includes notebook data, which was obtained during the loan period for that device. As a result, not all systems were tested on all benchmarks, depending on when the benchmarks were added to our testing (or were available at the time).
How Low is Too Low?
As part of the research for this article, we sent out a poll on social media asking how low framerate experiences have changed the way users enjoy the games they play. The expectation was that a number of us have at some point enjoyed gaming on a complete potato of a system, reducing resolution or quality to match the budget hardware at hand. Personally I remember playing Counter Strike on a dual-core AMD netbook processor from 2011, with the low resolution texture pack, at 15 FPS at a LAN party. It was glorious. The results of the poll were as follows:
What's the lowest average FPS you've had where you've still been happy to play? Also, which game? [POLL]
— Dr. Ian Cutress (@IanCutress) May 4, 2021
As was perhaps expected, users gravitate towards the highest frame rates possible. Modern hardware is so good with common eSports titles that a minimum of 30 FPS has become the standard for a lot of people, especially those who have never known the hardship of old processor gaming. However, the comments were quite telling. Here are a selected few:
- Old laptop played Minecraft at 15-20 fps with low resolution. Those were the days
- A decade ago I was happy about anything above 10 FPS. Nowadays, bare minimum of 15, but usually 20-25 feels playable
- Played Witcher 3 on medium 1080p with a 960M, running 27-32 FPS. Manageable
- 15 FPS in Starcraft 2, I was still happy to play on a laptop with integrated graphics
- I can play Genshin Impact or FFXIV at fixed 30 FPS as long as it's low latency
- Assassin's Creed Unity, 25 FPS felt ok
- Tomb Raider 2013 at 20-25 FPS at 768p on a 940M, I really liked those graphics
- 18 FPS in ETS2 was fine for two years, then I got a desktop
- Some games are locked to 25 FPS anyway, like San Andreas on PC
- 20-ish FPS on any Earth Defence Force Game
- Civilization 6 at about 30 FPS
- Ark Survival at 20-23 FPS
- Pretty much any console game over the last 10 years is 25-30 FPS anyway
- I used to play Black Ops 2 at 1024x768 at 14 FPS
- 21 FPS with Left 4 Dead 2, on a laptop 10 years ago at a LAN party!
- Diablo 2 is 25 FPS, so somewhere around that
- Crysis on an 8600GT, 20 FPS and no regrets
There are of course a similar number of comments decrying anything below 60 frames per second, although there does seem to be a contingent happy to play the right game at a less-than-ideal frame rate, and it really depends on the game. This is going to be important for the following graphs. We test best-case frame rates and modern resolution settings.
Gaming Tests: Civilization 6
Originally penned by Sid Meier and his team, the Civilization series of turn-based strategy games are a cult classic, and many an excuse for an all-nighter trying to get Gandhi to declare war on you due to an integer underflow.
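For anyone unfamiliar with the reference: the story goes that Gandhi's aggression rating lived in an unsigned byte, and a modifier pushed it below zero, wrapping it around to the maximum. Whether the original code really worked this way is disputed lore rather than confirmed fact, but the mechanism itself is easy to demonstrate (a minimal sketch; the names and values are illustrative, not from Civilization's source):

```python
# Illustrates the unsigned-underflow mechanism behind the "nuclear
# Gandhi" legend. The specific values (aggression 1, a -2 modifier)
# come from the lore, not from verified Civilization source code.

AGGRESSION_BITS = 8  # the rating is assumed to live in one unsigned byte

def apply_modifier(aggression: int, delta: int) -> int:
    # Emulate unsigned 8-bit arithmetic: results wrap modulo 2**8,
    # so anything pushed below zero reappears near the maximum.
    return (aggression + delta) % (1 << AGGRESSION_BITS)

gandhi = 1                           # famously peaceful
gandhi = apply_modifier(gandhi, -2)  # e.g. a civic change lowering aggression
print(gandhi)                        # 255: maximum aggression
```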
Civilization 6 is one of the few titles where the enhanced core count and frequency of the Comet Lake processors puts it above Tiger Lake (1185G7), despite the Tiger Lake processor having substantially more execution units, higher graphics frequency, and better memory bandwidth. That being said, it sits behind most AMD modern APUs, mobile and desktop.
Gaming Tests: Deus Ex Mankind Divided
Deus Ex: MD combines first-person, stealth, and role-playing elements, with the game set in Prague, dealing with themes of transhumanism, conspiracy theories, and a cyberpunk future. The game allows the player to select their own path (stealth, gun-toting maniac) and offers multiple solutions to its puzzles.
At the lower settings, Rocket Lake sits above Comet Lake and Broadwell by a good margin, and is very much playable with 5th Percentiles above 30 frames per second. At the higher settings though, there aren't many options for playability.
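As a point of reference for how that 5th percentile figure is derived: it comes from the slowest frames in a run, not the average. A minimal sketch of the calculation, using invented frame times purely for illustration:

```python
# Derive average and 5th-percentile frame rates from per-frame render
# times. The frame times below are invented for illustration; real
# benchmark captures contain thousands of samples.
import numpy as np

frame_times_ms = np.array([28.0, 30.5, 29.0, 41.0, 27.5, 33.0, 55.0, 29.5])

# Average FPS is total frames over total time, i.e. the reciprocal of
# the mean frame time.
avg_fps = 1000.0 / frame_times_ms.mean()

# The 5th-percentile FPS corresponds to the 95th-percentile (slowest)
# frame times: 5% of frames render slower than this.
p5_fps = 1000.0 / np.percentile(frame_times_ms, 95)

print(f"Average FPS:        {avg_fps:.1f}")
print(f"5th percentile FPS: {p5_fps:.1f}")
```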
Gaming Tests: Final Fantasy XIV
In 2019, FFXIV launched its Shadowbringers expansion, and an official standalone benchmark was released at the same time for users to understand what level of performance they could expect. Much like the FF15 benchmark we’ve been using for a while, this test is a long 7-minute scene of simulated gameplay within the title. There are a number of interesting graphical features, and it certainly looks more like a 2019 title than a 2010 release, which is when FF14 first came out.
Final Fantasy loves Intel processors here, and the average frame rates at the lower settings are easily playable. However at 1080p Maximum, the AMD APUs pull ahead while Rocket Lake struggles to be playable.
Gaming Tests: Final Fantasy XV
The game uses the internal Luminous Engine, and as with other Final Fantasy games, pushes the imagination of what we can do with the hardware underneath us. To that end, FFXV was one of the first games to promote the use of ‘video game landscape photography’, due in part to the extensive detail even at long range but also with the integration of NVIDIA’s Ansel software, that allowed for super-resolution imagery and post-processing effects to be applied.
All of our setups were below 5 FPS for our high resolution test, so we're staying with 720p here. At around 15-17 FPS, the Rocket Lake graphics are just about playable, although it will feel like an old-time system. Anything from AMD at this point is more playable, and the Tiger Lake option showcases how much better Xe-LP can be with enough units and frequency.
Gaming Tests: World of Tanks
World of Tanks is set in the mid-20th century and allows players to take control of a range of military based armored vehicles. The game offers multiple entry points including a free-to-play element as well as allowing players to pay a fee to open up more features.
For World of Tanks, everything is very playable at these frame rates; however, Rocket Lake is behind Broadwell, Intel's 5th Gen Core processor with eDRAM.
Gaming Tests: Borderlands 3
The fourth title of the franchise, Borderlands 3 expands the universe beyond Pandora and its orbit, with the set of heroes (plus those from previous games) now cruising the galaxy looking for vaults and the treasures within. Popular characters like Tiny Tina, Claptrap, Lilith, Dr. Zed, Zer0, Tannis, and others all make appearances as the game continues its cel-shaded design but with the graphical fidelity turned up.
Borderlands 3 shows our biggest min/max difference when comparing the 11900K to the 10900K. At the lower settings, the Rocket Lake 11900K is actually behind the Comet Lake 10900K by a couple of percent. At the higher resolution and quality settings, Rocket Lake is ahead by almost double, however it is in no way actually playable. Users looking to crank up the quality are going to be looking for discrete graphics for sure.
Gaming Tests: F1 2019
The 2019 edition of the game features all 21 circuits on the calendar for that year, and includes a range of retro models and DLC focusing on the careers of Alain Prost and Ayrton Senna. This edition revamps the Career mode, with features such as in-season driver swaps coming into the mix, and the quality of the graphics this time around is also superb, even at 4K low or 1080p Ultra.
F1 2019 is certainly a step up from generation to generation on the desktop, and at the low settings is able to pip Broadwell into something playable at 60 FPS. The higher resolution testing is less playable, on par with a laptop with a basic MX150 graphics card, but very much behind any of AMD's desktop offerings and mobile Tiger Lake.
Gaming Tests: Far Cry 5
The fifth title in Ubisoft's Far Cry series lands us right into the unwelcoming arms of an armed militant cult in Montana, one of the many middles-of-nowhere in the United States. With a charismatic and enigmatic adversary, gorgeous landscapes of the northwestern American flavor, and lots of violence, it is classic Far Cry fare. Graphically intensive in an open-world environment, the game mixes in action and exploration with a lot of configurability.
At the lower settings, the 11th Gen series shows a good generation-on-generation jump, but still sits behind Intel's 5th Gen Broadwell. As the settings are ramped up, however, Broadwell drops well behind, but the low frame rate from Intel still isn't enough to make it playable.
Gaming Tests: Strange Brigade
Strange Brigade is set in 1903 Egypt, and follows a story very similar to that of the Mummy film franchise. This particular third-person shooter is developed by Rebellion Developments, more widely known for games such as the Sniper Elite and Alien vs Predator series. The game follows the hunt for Seteki the Witch Queen, who has arisen once again, and the only 'troop' who can ultimately stop her.
Gaming Tests: Gears Tactics
Remembering the original Gears of War brings back a number of memories – some good, and some involving online gameplay. The latest iteration of the franchise was launched as I was putting this benchmark suite together, and Gears Tactics is a high-fidelity turn-based strategy game with an extensive single player mode. As with a lot of turn-based games, there is ample opportunity to crank up the visual effects.
While there are good gen-on-gen increases, Rocket Lake sits best as a low resolution option for Gears.
Gaming Tests: Red Dead Redemption 2
Building on the success of the original RDR, the second incarnation came to Steam in December 2019 having been released on consoles first. The PC version takes the open-world cowboy genre into the start of the modern age, with a wide array of impressive graphics and features that are eerily close to reality.
We didn't put the 1080p results here because they were all very bad, but if you're happy to squint at the screen at what might be a cowboy and a horse, Rocket Lake is certainly playable, enough to tweak a few options higher.
Gaming Tests: Grand Theft Auto V
The highly anticipated iteration of the Grand Theft Auto franchise hit the shelves on April 14th 2015, with both AMD and NVIDIA helping to optimize the title. At this point GTA V is super old, but still super useful as a benchmark – it is a complicated test with many features that modern titles today still struggle with. With rumors of a GTA 6 on the horizon, I hope Rockstar make that benchmark as easy to use as this one is.
GTA to the max is a slideshow, but very playable at 720p Low. The 11900K is happy to sit above Tiger Lake by the slimmest of margins for a rare spot of glory.
165 Comments
mode_13h - Friday, May 7, 2021 - link
> I didn't see that the title question was answered in the article

I think they presume that piece of meat behind your eyes is doing more than keeping your head from floating away. Look at the graphs, and see the answer for yourself.
However, the article does in fact sort of answer it, in the title of the final page:
"Conclusions: The Bare Minimum"
mode_13h - Friday, May 7, 2021 - link
> unless Dr. Ian Cutress is asking whether Intel's current IGPs are "competitive"
> with older Intel IGPs...which would seem to be the case.
As is often the case, they're comparing it with previous generations that readers might be familiar with, in order to get a sense of whether/how much better it is.
And it's not as if that's *all* they compared it against!
dwillmore - Friday, May 7, 2021 - link
So your choices are postage stamp or slide show? No thank you.

Oxford Guy - Friday, May 7, 2021 - link
My favorite part of the Intel CPU + Intel GPU history is Atom, where serious hype was created over how fabulously efficient the chip was, while it was sold with a GPU+chipset that used — what was it? — three times the power — negating the ostensible benefit from paying for the pain of an in-order CPU (a time-inefficient design sensibly abandoned after the Pentium 1). The amazing ideological purity of the engineering team's design goal (maximizing the power efficiency of the CPU) was touted heavily. Netbooks were touted heavily. I said they're a mistake, even before I learned (which wasn't so easy) that the chipset+GPU solution Intel chose to pair with Atom (purely to save the company money) made the whole thing seem like a massive bait and switch.

mode_13h - Friday, May 7, 2021 - link
> fabulously efficient the chip was, while it was sold with a GPU+chipset that used
> — what was it? — three times the power
Well, if they want to preserve battery life, maybe users could simply avoid running graphically-intensive apps on it? I think that's a better approach than constraining its graphics even further, which would just extend the pain.
I'm also confused about which Atoms you mean. I'm not sure, but I think they didn't have iGPUs until Silvermont, which was already an out-of-order core. And those SoCs only had 4 EUs, which I doubt consumed 3x the power of the CPU cores & certainly not 3x the power of the rest of the chip.
What I liked best about Intel's use of their iGPUs in their low-power SoCs is that the drivers just work. Even in Linux, these chips were well-supported, pretty much right out of the gate.
TheinsanegamerN - Friday, May 7, 2021 - link
Graphically intensive apps, you mean like Windows Explorer and a web browser? Because that was enough to obliterate battery life.

The original Atom platform was awful. Plain and simple.
29a - Friday, May 7, 2021 - link
This^. Atoms were awful; turning the computer on would be considered graphically intensive.
I still don't follow the logic of the Oxford dude. Would it really have been a good solution to put in even worse graphics, further impinging on the user experience, just to eke out a little more battery life? I'm not defending the overall result, but that strikes me as an odd angle on the issue.

Indeed, if explorer and web browser were as much as their GPU could handle, then it seems the GPU was well-matched to the task.
Oxford Guy - Sunday, May 9, 2021 - link
You should learn about the Atom nonsense before posting opinions about it.

The power consumption of the chipset + GPU completely negated the entire point of the Atom CPU, from its design philosophy to the huge hype placed behind it by Intel, tech media, and companies peddling netbooks.
It is illustrative of large-scale bait and switch in the tech world. It happened purely because Intel wanted to save a few pennies, not because of technological restriction. The chipset + GPU could have been much more power-efficient.
Spunjji - Monday, May 10, 2021 - link
You don't follow because you're trying to assess what he said by your own (apparently incomplete) knowledge, whereas what would make sense here would be to pay more attention to what he said – because, in this case, it's entirely accurate.

Intel paired the first 45nm Atom chips with one of two chipsets – either the recycled 180nm 945 chipset, designed for Pentium 4 and Core 2 processors, or the 130nm Poulsbo chipset. The latter had an Imagination Technologies mobile-class GPU attached, but Intel never got around to sorting out working Windows drivers for it. In either case, it meant that they'd built an extremely efficient CPU on a cutting-edge manufacturing process and then paired it with a hot, thirsty chipset. It was not a good look; this was back when they were absolutely clobbering TSMC on manufacturing, too, so it was a supreme own-goal.