The NVIDIA GeForce GTX 780 Ti Review
by Ryan Smith on November 7, 2013 9:01 AM EST
Meet The GeForce GTX 780 Ti
When it comes to the physical design and functionality of the GTX 780 Ti, to no surprise NVIDIA is sticking with what works. The design of the GTX Titan and its associated cooler have proven themselves twice over now between the GTX Titan and the GTX 780, so with only the slightest of changes this is what NVIDIA is going with for GTX 780 Ti, too. Consequently there’s very little new material to cover here, but we’ll quickly hit the high points before recapping the general design of what has now become the GTX 780 series.
The biggest change here is that GTX 780 Ti is the first NVIDIA launch product to feature the new B1 revision of their GK110 GPU. B1 has already been shipping for a couple of months now, so GTX 780 Ti isn’t the first card to get this new GPU. However while GTX Titan and GTX 780 products currently contain a mix of the old and new revisions as NVIDIA completes the change-over, GTX 780 Ti will be B1 (and only B1) right out the door.
As for what’s new for B1, NVIDIA is telling us that it’s a fairly tame revision of GK110. NVIDIA hasn’t made any significant changes to the GPU; rather they’ve merely gone in and fixed some errata that were in the earlier revision of GK110, and in the meantime tightened up the design to reduce leakage just a bit and nudge power usage down, the latter of which is helpful for countering the greater power draw from lighting up the 15th and final SMX. Otherwise B1 doesn’t have any feature changes nor significant changes in its power characteristics relative to the previous revision, so it should be a fairly small footnote compared to GTX 780.
The other notable change coming with GTX 780 Ti is that NVIDIA has slightly adjusted the default temperature throttle point, increasing it from 80C to 83C. The difference in cooling efficiency itself will be trivial, but since NVIDIA is using the exact same fan curve on the GTX 780 Ti as they did the GTX 780, the higher temperature throttle effectively increases the card’s equilibrium point, and therefore the average fan speed under load. Or put another way, by letting it get a bit warmer the GTX 780 Ti will ramp up its fan a bit more and throttle a bit less, which should help offset the card’s increased power consumption while also keeping thermal throttling minimized.
|GeForce GTX 780 Series Temperature Targets|
|GTX 780 Ti Temp Target|GTX 780 Temp Target|GTX Titan Temp Target|
|83C|80C|80C|
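The interplay between the fan curve and the temperature target can be sketched with a toy thermal model. All of the numbers and curves below are hypothetical illustrations, not NVIDIA's actual fan curve or thermal data; the point is only to show how a higher target lets the same fan curve reach a hotter, faster-spinning equilibrium before clocks have to drop:

```python
# Toy steady-state model: the card settles where heat out matches heat in,
# unless the temperature target is reached first (where it would throttle).
# Fan curve and thermal constants are made-up illustrative values.

def fan_speed(temp_c):
    """Hypothetical linear fan curve (percent): faster as the GPU gets hotter."""
    return min(100.0, max(30.0, 2.5 * (temp_c - 50)))

def equilibrium_temp(power_w, target_c, ambient_c=25.0):
    """Return the settling temperature, capped at the throttle target."""
    for temp in range(int(ambient_c), 120):
        # Cooling capacity grows with fan speed and delta-T (arbitrary scale).
        cooling_w = 0.06 * fan_speed(temp) * (temp - ambient_c)
        if cooling_w >= power_w:
            return min(temp, target_c)
    return target_c  # never catches up: the card sits at the throttle point

# Same fan curve, same load, different targets: at 80C the card is pinned
# against its target (throttling), at 83C it settles just above 80C with
# the fan spinning slightly faster and no throttling.
print(equilibrium_temp(250, target_c=80))  # -> 80
print(equilibrium_temp(250, target_c=83))  # -> 81
```

This mirrors the behavior described above: raising the target from 80C to 83C doesn't change the fan curve, it just moves the ceiling, so the card trades a few degrees of temperature for more fan speed and less clock throttling.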
Moving on, since the design of the GTX 780 Ti is a near carbon copy of GTX 780, we’re essentially looking at GTX 780 with better specs and new trimmings. NVIDIA’s very effective (and still quite unique) metallic GTX Titan cooler is back, this time featuring black lettering and a black tinted window. As such GTX 780 Ti remains a 10.5” long card composed of a cast aluminum housing, a nickel-tipped heatsink, an aluminum baseplate, and a vapor chamber providing heat transfer between the GPU and the heatsink. The end result is that the GTX 780 Ti is a quiet card despite the fact that it’s a 250W blower design, while still maintaining the solid feel and eye-catching design that NVIDIA has opted for with this generation of cards.
Drilling down, the PCB is also a re-use from GTX 780. It’s the same GK110 GPU mounted on the same PCB with the same 6+2 phase power design. This is despite the fact that GTX 780 Ti features faster 7GHz memory, indicating that NVIDIA was able to hit their higher memory speed targets without making any obvious changes to the PCB or memory trace layouts. Meanwhile the reuse of the power delivery subsystem is a reflection of the fact that GTX 780 Ti has the same 250W TDP limit as GTX 780 and GTX Titan, though unlike those two cards GTX 780 Ti will have the least headroom to spare and will come the closest to hitting it, due to the general uptick in power requirements from having 15 active SMXes. Finally, using the same PCB also means that GTX 780 Ti has the same 6pin + 8pin power requirement and the same display I/O configuration of 2x DL-DVI, 1x HDMI, 1x DisplayPort 1.2.
On a final note, NVIDIA won’t be allowing custom cards right off the bat – everything today will be a reference card – but with NVIDIA’s partners having already put together their custom GK110 designs for GTX 780, custom designs for GTX 780 Ti will come very quickly. Consequently, expect most (if not all of them) to be variants of their existing custom GTX 780 designs.
yuko - Monday, November 11, 2013 - link
For me neither of them is a game changer ... G-Sync, Shield ... nice stuff I don't need.
Mantle: another nice approach to creating a semi-closed standard .. it's not as if DirectX or OpenGL don't already exist and work quite well, no, we need another low-level standard where AMD creates the API (and to be honest, they would be quite stupid not to optimize it for their hardware).
I can only hope that Mantle will flop; it does no favors to customers or the industry. It's just good for marketing but has no real-world use.
Kamus - Thursday, November 7, 2013 - link
Nope, it's confirmed for every Frostbite 3 game coming out, that's at least a dozen so far, not to mention it's also officially coming to Star Citizen, which runs on CryEngine 3 I believe.
But yes, even with those titles it's still a huge difference, obviously.
That said, you can expect that any engine optimized for GCN on consoles could wind up with mantle support, since the hard work is already done. And in the case of star citizen... Well, that's a PC exclusive, and it's still getting mantle.
StevoLincolnite - Thursday, November 7, 2013 - link
Mantle is confirmed for all Frostbite-powered games.
That is, Battlefield 4, Dragon Age 3, Mirror's Edge 2, Need for Speed, Mass Effect, Star Wars Battlefront, Plants vs. Zombies: Garden Warfare and probably others that haven't been announced yet by EA.
Star Citizen and Thief will also support Mantle.
So that's EA, Cloud Imperium Games, and Square Enix supporting the API, and it hasn't even been released yet.
ahlan - Thursday, November 7, 2013 - link
And for G-Sync you will need a new monitor with G-Sync support. I won't buy a new monitor only for that.
jnad32 - Thursday, November 7, 2013 - link
http://ir.amd.com/phoenix.zhtml?c=74093&p=irol...
Creig - Friday, November 8, 2013 - link
G-Sync will only work on Kepler and above video cards.
So if you have an older card, not only do you have to buy an expensive G-Sync-capable monitor, you also need a new Kepler-based video card as well. Even if you already own a Kepler video card, you still have to purchase a new G-Sync monitor, which will cost you $100 more than an identical non-G-Sync monitor.
Whereas Mantle is a free performance boost for all GCN video cards.
G-Sync cost - Purchase new computer monitor +$100 for G-Sync module.
Mantle cost - Free performance increase for all GCN-equipped video cards.
Pretty easy to see which one offers the better value.
neils58 - Sunday, November 10, 2013 - link
As you say Mantle is very exciting, but we don't know how much performance we are talking about yet. My thinking on saying that crossfire was AMD's only answer is that in order to avoid the stuttering effect of dropping below the Vsync rate, you have to ensure that the minimum framerate is much higher, which means adding more cards or turning down quality settings. If Mantle turns out to be a huge performance increase things might work out, but we just don't know.
Sure, TN isn't ideal, but people with gaming priorities will already be looking for monitors with low input lag, fast refresh rates and features like backlight strobing for motion blur reduction. G-Sync will basically become a standard feature on a brand's lineup of gaming-oriented monitors. I think it'll come down in price a fair bit too once there are a few competing brands.
It's all made things tricky for me. I'm currently on a 1920x1200 'VA monitor with a 5850 and was considering going up to a 1440p 27" screen (which would have required a new GPU purchase anyway); G-Sync adds enough value to gaming TNs to push me over to them.
jcollett - Monday, November 11, 2013 - link
I've got a large 27" IPS panel so I understand the concern. However, a good high-refresh panel need not cost very much and still look great. Check out the ASUS VG248QE; been hearing good things about the panel and it is relatively cheap at about $270. I assume it would work with G-Sync but I haven't confirmed that myself. I'll be looking for reviews of Battlefield 4 using Mantle this December as that could make up a big part of the decision on my next card coming from Team Green or Red.
misfit410 - Thursday, November 7, 2013 - link
I don't buy that it's a game changer. I have no intention of replacing my three Dell Ultrasharp monitors anytime soon, and even if I did I have no intention of dealing with buggy DisplayPort as my only option to hook up a synced monitor.
Mr Majestyk - Thursday, November 7, 2013 - link
+1
I've got two high-end Dell 27" monitors and it's a joke to think I'd swap them out for garbage TN monitors just to get G-Sync.
I don't see the 780 Ti as being any skin off AMD's nose. It's much dearer for very small gains and we haven't seen the custom AMD boards yet. For now I'd probably get the R9 290, assuming custom boards can greatly improve on cooling and heat.