Nvidia GeForce GTX 1080 review: A big leap, but not quite a 4K slayer

Nvidia’s GeForce GTX 1080 is no longer top dog in its GPU family – that honour now goes to the GTX 1080Ti and, of course, the frankly ridiculous Titan Xp. It has, however, come down quite dramatically in price since I first looked at it, arguably making it a better buy than ever if you’re after a 4K-capable graphics card. The economically monikered MSI Gaming X 8G Twin Frozr VI pictured above, for instance, cost a wallet-breaking £695 a year ago. Now you can pick up one like Gigabyte’s equally succinct GeForce GTX 1080 Turbo OC for as little as £489 from Scan. Or, if you head over to our Black Friday 2017 hub, you can find one for just £439 over on Ebuyer. The 1080Ti, on the other hand, has remained at a steady £700 since launch.

A no-brainer, right? Not quite, as there’s also the GTX 1070Ti to think about, which costs even less at around £420 and promises near-1080 performance. We’ll be taking a look at the 1070Ti shortly and will update this page with our findings to let you know how we got on. For now, though, I’ll turn my attention back to the regular GTX 1080.

What I’m mainly interested in here is whether the GTX 1080 feels like the double-generational jump I’ve mentioned previously. Is it truly that good? Of course, many flavours of Nvidia’s GeForce GTX 1080 wonder chipset are available, so it’s best to begin by clarifying the specifics of the card in question, the MSI Gaming X 8G. There are some hard points defined by Nvidia’s GP104 chip that lies at the heart of the GTX 1080 card / chipset / whatever.

An MSI Gaming X 8G Twin Frozr VI GeForce GTX 1080, yesterday

So, that’s things like 2,560 eye candy-creating shader cores, 160 texture units for, you know, texturing stuff, 64 render output units for spitting out finished pixels, a 256-bit memory bus and a healthy 8GB dollop of memory. All of that applies to any 1080.

Beyond that, retail cards differ from Nvidia’s so-called reference chipset. The standard core clockspeed and boost clock (the latter roughly equivalent to turbo mode in a CPU) are 1,607MHz and 1,733MHz. This MSI board tops out at 1,708MHz and 1,847MHz respectively.
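
To put those numbers in perspective, here’s a rough back-of-the-envelope sketch in Python using the unit counts and clocks quoted above. Note that the 10Gbps GDDR5X speed is Nvidia’s reference memory spec rather than anything MSI quotes, so treat the bandwidth line as the stock figure – this is purely illustrative arithmetic, not a benchmark.

```python
# Rough GTX 1080 arithmetic from the specs quoted above.
REF_BASE, REF_BOOST = 1607, 1733   # MHz, Nvidia reference clocks
MSI_BASE, MSI_BOOST = 1708, 1847   # MHz, MSI Gaming X 8G clocks

ROPS, TMUS = 64, 160               # render output units / texture units
BUS_BITS, MEM_GBPS = 256, 10       # bus width, reference GDDR5X data rate

# How big is MSI's factory overclock, really?
print(f"Base clock uplift:  {MSI_BASE / REF_BASE - 1:.1%}")    # ~6.3%
print(f"Boost clock uplift: {MSI_BOOST / REF_BOOST - 1:.1%}")  # ~6.6%

# Theoretical throughput at the reference boost clock
print(f"Pixel fill rate:  {ROPS * REF_BOOST / 1000:.0f} Gpixel/s")  # ~111
print(f"Texture rate:     {TMUS * REF_BOOST / 1000:.0f} Gtexel/s")  # ~277
print(f"Memory bandwidth: {BUS_BITS / 8 * MEM_GBPS:.0f} GB/s")      # 320
```

The takeaway: a six-to-seven per cent factory overclock on paper.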

The MSI also cranks the memory speeds up, but by such a tiny amount that I won’t bother with the details. Truth be told, the tweaks to the operating frequencies don’t amount to anything you’re ever going to feel in games. What you might notice is the cooling solution.

A standard Nvidia board has an enclosed or ducted impeller-type fan for pumping hot, GPU’ed air straight out of the chassis. Sounds like a good idea? Yup, but as it happens sound is the problem. That kind of cooling is relatively noisy. So MSI, like a lot of non-reference designs, has ditched all that in favour of larger and more conventional fans.

Custom cooling makes for silent running

In fact, MSI has rigged this board to power down the fans entirely under low load, making it totally silent when you’re not gaming hard. Long story short, this kind of cooling setup typically makes for less din.

So that’s how the MSI Gaming X 8G stacks up against the Nvidia reference board. My yardstick here, however, is actually a Sapphire AMD Radeon 290 board. Obviously the 290 is a fairly old card now, so this isn’t about direct comparisons for purchase. Instead I’m interested both in how the 1080 feels in isolation and in whether it really does dramatically improve on the subjective experience of a high-end card from a couple of generations back.

For logistical reasons too abstruse to divulge, my virtual playgrounds in this case extend to Total War: Attila, Witcher 3, that bloody Mordor game and GTA V. As for graphics settings, the general rule is maxed out, but I’ve flipped a few switches to ‘off’ that either can’t be used across both Nvidia and AMD cards (like Nvidia Hairworks) or that I summarily and unilaterally judge to be a waste of GPU cycles. Neither the game title choices nor the settings are scientific. That’s not the point. Complaints on a postcard, which will be filed in the circular receptacle beneath my desk with an autocratic flourish.

Two cards, one review…

Oh, and resolution-wise I sniffed around each game with each card at 1,920×1,080, 2,560×1,440 and ye olde 4K, otherwise known as 3,840×2,160 pixels. Anyway, that’s how I’m rolling and the scene is set. But what did I learn?
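
Before the results, it’s worth spelling out how much extra work each step up in resolution actually asks of the card. A quick bit of illustrative arithmetic, nothing more:

```python
# Pixel counts for the three test resolutions, relative to 1,920x1,080
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.2f} Mpixels ({pixels / base:.2f}x 1080p)")

# 1080p: 2.07 Mpixels (1.00x 1080p)
# 1440p: 3.69 Mpixels (1.78x 1080p)
# 4K:    8.29 Mpixels (4.00x 1080p)
```

So 4K means shading four times as many pixels per frame as 1,920×1,080, which goes a long way towards explaining what follows.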

First, the 1080 is not a universal 4K panacea. It batters Shadow of Mordor and its orcish malevolence into submission at 4K, no question. Ditto GTA V. Both feel super slick and super smooth on the 1080 at 4K. That’s not something you can say about the Radeon 290. Think just about playable but slightly juddery and you’ll have the right idea.

Then there’s Total War: Attila. At first, I thought the GTX 1080 had that nailed, too. Then I zoomed right in among the troops and surprisingly but undeniably noted that the buttery smoothness gave way to the unmistakable staccato that accompanies fewer frames being rendered.

Time to dust off that old Radeon 290…

The same goes for Witcher 3. In fact, the 1080 struggles just a little with Witcher generally. It’s playable, but not truly effortless. I don’t want to get bogged down with talk of frame rates, but if I had to guess, I’d put the GTX 1080 in the low 30s in Witcher with my 4K settings.
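
If you prefer frame times to frame rates, that guess looks like this – the conversion below is just arithmetic, not measured data:

```python
# Why 'low 30s' doesn't feel effortless: the time each frame hangs around
for fps in (30, 60, 120):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")

# 30 fps -> 33.3 ms per frame
# 60 fps -> 16.7 ms per frame
# 120 fps -> 8.3 ms per frame
```

At 30-odd frames per second, each frame lingers twice as long as it would at 60, and that’s the judder you feel.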

The 290, of course, is a total mess in Attila at 4K, thoroughly unpleasant and unplayable. It copes a bit better with Witcher, but 4K is frankly beyond it. So the new card is a big step forward. And yet not actually the 4K killer I’d hoped for.

The step down to 2,560×1,440 is where I was confident, coming in, that Nvidia’s chip would render all comprehensively asunder. And yet the harsh truth is that it doesn’t. Not quite. Again, zoom in among the troops in Attila and a very slight drop-off can be felt. And before you blame that on CPU limitations: it doesn’t happen at 1,920×1,080.

This is where even the mighty new GTX 1080 comes unstuck

As for Witcher 3, at first I thought the GTX 1080 had its measure at 2,560×1,440. But knock things down to 1080p and there’s a tangible uptick in smoothness and response. It’s subtle, but it’s definitely there. Speaking of response, that remains a relevant issue for the GTX 1080. There’s noticeably more input lag running Witcher 3 at 4K than at lower resolutions. Of course, some games, like Shadow of Mordor, simply have laggy interfaces at any resolution. But the new 1080 doesn’t banish input lag to history.

All of which makes the conclusion regarding Nvidia’s GTX 1080 simple enough. It’s a clear and substantial step forward, there’s absolutely no question about that. But it’s not the multi-generational / er-me-gerd leap I was hoping for, nor the final answer to the will-it-play-4K question.

The price certainly makes it more tempting than it was a year ago, but until we’ve taken a look at the 1070Ti, it’s hard to say for sure whether you should take the plunge right now. There’s also AMD’s Vega 64 card to factor in, which, again, we’ll be looking at very soon. My advice would be to hold fire for now and check back in just a few days’ time.

 
