ATI HD 5970 Review

Author: Kevin Spiess
Editor: Howard Ha
Publish Date: Tuesday, November 17th, 2009
Originally Published on Neoseeker (http://www.neoseeker.com)
Article Link: http://www.neoseeker.com/Articles/Hardware/s/hd5970launchreview/
Copyright Neo Era Media, Inc. - please do not redistribute or use for commercial purposes.

Sooner or later, if you are into gaming on your PC, you'll need to upgrade your GPU. It is one of those fundamental laws of existence, from which there is no escape! But the big question when you go for that requisite upgrade is whether you go good enough, or go big. If you are the type that likes to 'go big' then you'll be interested in today's review of the new dual-GPU titan from ATI, the HD 5970. On the other hand, if all you need is something that can handle HD videos and Civilization 4, then click your browser's back button before you see what you might be missing out on.

Most everyone and their pets saw today's review coming. Dual-GPU video cards from ATI are nothing new -- the HD 3870 X2 and the HD 4870 X2 are still fresh in many minds -- and even leaked roadmaps declared a double-headed monster was on the way in the HD 5000 series. What few people did guess, however, was that today's card would be named the HD 5970, and not the HD 5870 X2.

In a chat with ATI, we were told that the shift to the HD 5970 name was meant to clear up some nomenclature for the average Radeon buyer. Last generation saw the HD 4890 competing with the HD 4870 X2 -- and many folks apparently assumed that the HD 4890 was the faster of the two, not the more expensive dual-GPU HD 4870 X2. The new label, then, better clarifies the gaming performance hierarchy of the HD 5000 series. Makes sense to us. Not everyone is as informed as you probably are (as a hardware review reader), so the new name works, even if we were getting somewhat used to that "X2" label.

Regardless of what you want to call today's card, we can assure you, even before you check out our first benchmark, that this $600 USD creature is going to be fast. Fast as in can't-get-any-faster fast. The recently released HD 5870 doesn't have much trouble chewing through any new game, running them like they are from 2004, not 2009; so certainly, a dual-GPU version is going to really zip.

ATI's internal semi-codename for the HD 5970 is the Hemlock. We are not sure what trees have to do with this video card. Trees are much too slow.

Let's take a look-see at the HD 5970, shall we?

Impressions

You can tell at a glance that the HD 5970 is related to the rest of the HD 5000 series. The HD 5770, HD 5850, HD 5870 and HD 5970 all share a similar look, with the same bold black and red colors (and Batmobile-style exhaust vents) -- the only quick way to differentiate between them is their varying lengths. As you might have guessed, this means that the HD 5970 is the longest of the long, coming in at just about a foot long.

The HD 5970 only has about an inch on the HD 5870, so that doesn't leave a lot of extra room free on the PCB. Because of this premium on real estate, and the higher operating temperatures often found on dual-GPU parts, ATI has switched up the cooler from prior HD 5000 parts. The HD 5970 uses a vapor chamber cooler -- one which uses evaporating and condensing water vapor to help move heat off the GPUs. It is similar to what was needed to keep the nova-hot R600 cool.

The vapor chambers are assisted by cooling fins and the same fan found in the other members of the HD 5000 family. In normal operation the fan makes a reasonable, average amount of noise. If you ever do some serious gaming on a hot day, however, and really ratchet up those operating temperatures, the HD 5970 kicks the fan into high gear and it gets far more noticeable. Though it is very unlikely the fan will ever hit 100% power in gaming (at least for the next few years), if it does go to full power, then bust out a pair of ear plugs.

On the bracket end of the HD 5970, you have the standard -- no wait, scratch that. What is this? This is a first here at Neoseeker: we have two DVI ports and one Mini DisplayPort. Many of you probably haven't gotten used to DisplayPort yet, so this new, smaller version of it may come as a surprise. Mini DisplayPort is functionally the same as full-size DisplayPort (HDCP support and a 2560x1600 top resolution), but is much more commonly seen connecting Apple products together. If your display uses a 'regular' DisplayPort connector, there will no doubt be adapters to facilitate the conversion to the Mini DisplayPort standard.

You might have been hoping that an HD 5970 would be able to drive six displays through Eyefinity, but this is not the case. This is certainly one area where having two HD 5870s would be better than one HD 5970, but it will only affect a small percentage of users out there. Eyefinity, if you missed hearing about it earlier, is ATI's new technology for building multi-monitor metadisplays with HD 5000 video cards. Each HD 5000 series card can currently handle up to three displays, and that number can climb as high as 24 with multiple cards in a CrossFireX setup. Wait a second -- you might be scratching your head here -- shouldn't the maximum be 12, not 24? It is 24 because ATI has promised special editions of upcoming HD 5000 cards that will be able to handle six displays each (due in Q1, we are told); the quick sketch below shows the arithmetic.
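For the curious, here is a minimal sketch of that display math, assuming the four-card maximum of a CrossFireX setup (our own arithmetic, not an ATI spec sheet):

# A quick sketch of the Eyefinity display math (our assumptions, not official figures)
CARDS_IN_CROSSFIREX = 4        # maximum cards in a CrossFireX setup
DISPLAYS_STANDARD = 3          # outputs per regular HD 5000 series card
DISPLAYS_SPECIAL_EDITION = 6   # outputs per promised six-display special edition

print(CARDS_IN_CROSSFIREX * DISPLAYS_STANDARD)          # 12 with regular cards
print(CARDS_IN_CROSSFIREX * DISPLAYS_SPECIAL_EDITION)   # 24 with the special editions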

Another important note on the topic of Eyefinity: with the launch of the HD 5970, the new Catalyst drivers will support multi-GPU rendering over Eyefinity. Before, this wasn't the case. Now if you have three displays running on one HD 5970, for example, the two GPUs will share the load.

One big improvement also championed with the HD 5970 release is power efficiency. Though when fully engaged the HD 5970 will no doubt suck power like a zombie does brains, it is promised to have an extremely low idle power drain -- the astounding figure of 42W has been thrown about. We'll see how much truth this claim holds in our power testing on the last page of this review.

Picture 1: Notice any resemblance?  Picture 2: Big guns: GTX 295, HD 5970, HD 4870 X2

Specifications

In the last few years, GPU capabilities and horsepower have continued to go through the stratosphere. With two Cypress XT GPUs, the HD 5970 has a massive 3200 stream processors at its command. In terms of raw number munching, this adds up to nearly 5 TFLOPS of computational excess at stock clocks, and past that mark once overclocked.
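If you want to see where a figure like that comes from, here is our back-of-the-envelope math, assuming the usual two floating-point operations per stream processor per clock (a rough estimate, not an official figure):

# Back-of-the-envelope peak throughput (our estimate; assumes 2 FLOPs per stream processor per clock)
stream_processors = 3200           # two Cypress XT GPUs with 1600 each
flops_per_clock = 2                # one multiply-add per stream processor per cycle

for clock_mhz in (725, 850):       # stock clock, and HD 5870-level clocks
    tflops = stream_processors * flops_per_clock * clock_mhz / 1_000_000
    print(f"{clock_mhz} MHz -> ~{tflops:.2f} TFLOPS")
# prints ~4.64 TFLOPS at stock and ~5.44 TFLOPS at HD 5870 clocks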

As you can see, the GPUs used here aren't different from what you would find solo in an HD 5870 -- however, ATI says they were cherry-picked for chips with better upper-end headroom. Keeping the two GPUs talking is a second-generation PLX bridge chip. With this much power, we are also getting into territory where an x8 PCIe 2.1 slot might start to become a bandwidth bottleneck compared to a full x16 link.
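For a rough sense of the bus numbers involved, PCIe 2.x moves roughly 500 MB/s per lane in each direction, so the gap between x8 and x16 works out as below (a simplified sketch that ignores protocol overhead):

# Rough PCIe 2.x bandwidth numbers (~500 MB/s per lane per direction; overhead ignored)
MB_PER_LANE_PER_DIRECTION = 500

for lanes in (8, 16):
    gb_per_s = lanes * MB_PER_LANE_PER_DIRECTION / 1000
    print(f"x{lanes}: ~{gb_per_s:.0f} GB/s each way")
# x8 gives ~4 GB/s, x16 gives ~8 GB/s -- which is why an x8 slot can start to pinch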

As you can see, the HD 5970 is clocked at HD 5850 speeds from the get-go, for both the core and the memory. This is a change from recent dual-GPU cards and warrants some discussion. ATI has seemingly clocked the HD 5970 lower than it can handle for two reasons, one more marketing-inclined than the other: the first is lower power consumption -- which is good to see, as you don't need anywhere near the full force of this card for your daily desktop activities; the second is that ATI is really pushing the overclocking potential of this card, and by lowering the default clocks, the upper-echelon overclocks look that much more impressive.

Without even seeing the game benchmark results, we can say that, as of right now, running the HD 5970 at default speeds should be fine for just about anyone playing at 1920x1080 or lower. But if you want every last bit of performance, or have multiple big displays, manual overclocking is going to be required -- at least until the presumed factory-overclocked models come out. Look for further discussion of the overclocking situation on the next page.

 

(all clocks in MHz)

Card         Cores   Core Clock   Shader Clock   Memory Clock (eff.)   Memory Interface   Memory           Fab Process
GTX 260      216     576          1350           1998                  448-bit            896MB GDDR3      55nm
GTX 285      240     648          1476           2484                  512-bit            1024MB GDDR3     55nm
GTX 295      480*    576          1242           1998                  896-bit*           1792MB GDDR3*    55nm
HD 4850      800     625          625            1986                  256-bit            512MB GDDR3      55nm
HD 4870      800     750          750            3600                  256-bit            512MB GDDR5      55nm
HD 4870 X2   1600*   750          750            3600                  512-bit*           2048MB GDDR5*    55nm
HD 4890      800     850          850            3900                  256-bit            1024MB GDDR5     55nm
HD 5870      1600    850          850            4800                  256-bit            1024MB GDDR5     40nm
HD 5970      3200*   725          725            4000                  512-bit*           2048MB GDDR5*    40nm

  * denotes cumulative effective totals across two GPUs (e.g. GTX 295: two GPUs with 240 cores each equals 480)

Let's see what we can do on the overclocking side of things.

Overclocking


So here is the situation: as we said, if you want the full performance potential out of the HD 5970, you will have to overclock it. This is truer for this video card than for perhaps any other reference-design card we've come across in a long time. With a card such as the HD 4830, for example, overclocking is a good idea to grab that extra step of performance. But with the HD 5970 you have to overclock it just to have it run at 100%, as the GPUs can certainly handle more than a 725 MHz clock without much trouble.

Also essential for the overclocker is ATI's voltage utility. Made especially for the HD 5970, the over-volt tool was supplied to us by AMD (and, we presume, will be available for all retail models of this video card). Expect to flounder around the 790 MHz mark on the core clock without it.

The over-volt tool could not be simpler. You load it up and you have two options: regular voltage (1.05V for the core, 1.10V for the memory) or over-voltage (1.1625V / 1.15V). One thing to keep in mind: if you are running your HD 5970 overclocked and you ever have an overclocking-related display driver failure, you'll have to re-open the voltage tool, set the voltages back to default, apply, then set them back to over-volt mode. Your clock speeds will survive a crash, but the voltage settings get reset. We figured this out after some big-time head scratching when our once-stable OCs mysteriously started to fail.

Okay, down to the numbers. The very low default speed for the HD 5970 is 725 MHz for the core and 1000 MHz for the memory. After extensive testing, with the fan running at full speed, we found 935 MHz / 1235 MHz to be fully stable (we achieved higher speeds, but nothing lasted more than 30 minutes in our stress tests). Measured from the default clocks, those overclocks look quite exceptional. However, if you figure the HD 5970 can easily handle HD 5870 speeds, then this works out to about a 10% core overclock, which is fairly average, and the memory is barely above where an HD 5870 runs it anyway. So, in conclusion, the overclocking here really isn't that exciting.
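To put those clocks in perspective, here is the simple percentage math behind that 'fairly average' verdict, measuring our 935 MHz result against both the 725 MHz default and the 850 MHz an HD 5870 ships at (our own arithmetic on the numbers quoted above):

# The same overclock measured two ways (using the clocks quoted above)
default_clock = 725   # HD 5970 stock core clock, MHz
hd5870_clock = 850    # what the same Cypress XT silicon ships at in an HD 5870
stable_oc = 935       # our best fully stable core overclock

print(f"vs. stock:   +{(stable_oc / default_clock - 1) * 100:.0f}%")   # about +29%
print(f"vs. HD 5870: +{(stable_oc / hd5870_clock - 1) * 100:.0f}%")    # about +10%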

On the other hand, it is possible we just didn't get a particularly good pair of chips passed our way. Pre-release rumors suggested that core clocks as high as 1000 MHz might even be possible. In our case, though, the HD 5970 performed about as you'd expect given what an HD 5870 can do.

Hardware

Video cards used in the benchmarks include a Sapphire HD 5870 Vapor-X, a Gigabyte GTS 250 OC, a VisionTek HD 4870, a Sapphire HD 5770, a BFG GTX 295, and a LeadTek GTX 260 Extreme+.

Software

For drivers, all of the ATI cards used Catalyst 9.10, and all of the Nvidia cards used ForceWare 190.17.

We have updated our benchmarks. Here are the new ones:

Batman: Arkham Asylum: Gotham's Greatest Detective makes for a good benchmark. We used the in-game benchmark, running at 2560x1600 with the highest quality settings possible. We chose to test without AA only, as there has been some controversy over AA being unnecessarily handicapped in this game on some video cards.

FTL_Blunderbuss: This is a demoscene demo by the group Fairlight, which came in second in competition in October 2009. It makes very heavy use of particles and is a good GPU workout. We used FRAPS to measure the average framerate over a run of the demo at 1680x1050, with 4xAA and 'high' detail.

Resident Evil 5: Capcom's latest zombie smasher has a great 'Fixed' in-game benchmark. We ran it at top quality at 2560x1600 in DX10 mode, with and without AA.

And here are the older benchmarks we are still using:

Bioshock: For this benchmark, all of the Detail settings were set to 'High'. All of the graphic option switches were set to 'On', with the exception of the following three settings: Vsync, Windowed mode, and Force Global Lighting. We used FRAPS to measure frame rate performance. The FRAPS run was 138 seconds, triggered by pulling the switch in the sub at the game's beginning. The sub's dive involves many big models moving around, which should strain the GPUs and give the engine a good workout.

Crysis: Warhead: Games don't get much more demanding than Crysis. We used the 'Gamer' detail preset, which is the middle of the five available settings. We ran the benchmark on the 'avalanche' map, using the FrameBuffer Crysis benchmarking tool, version 0.29, in DX10 mode.

Enemy Territory: Quake Wars: We use this id FPS benchmark to test out higher resolutions. We used the highest possible detail settings. We tested the resolutions at 4x AA as well as at 8x AA. 16x AF was also used.  

Far Cry 2: This open-world FPS is a great-looking game that really puts a strain on a gaming rig. We used the built-in benchmarking tool, with the overall 'Very High' quality setting.

Furmark: This intensive, synthetic benchmark models a ring of fur. We benched at 1680x1050.

Street Fighter IV: You have probably heard of this famous fighting game. It has 3D graphics, but generally does not require much GPU horsepower to run well. We used Capcom's stand-alone PC benchmarking tool for our tests, and ran everything at its highest possible settings, using 4xAA, and the 'Watercolor' setting.

Unreal Tournament 3: We tested the game using a fly-through of the vehicle capture-the-flag map 'Suspense', running for 90 seconds. Details were set to 'High', and an AF setting of 16x was used.

World In Conflict: We used the built-in benchmark of the demo version of this game. We ran the benchmark in DX9 rendering mode, with a 'High' level of quality. For the AA testing, we used a setting of 4x, and a setting of 16x for AF.

If you would like any further information about our benchmark settings, feel free to ask us in the forums.

Futuremark seems to believe that the reign of the GTX 295 has now come to an end. This sounds reasonable to us.

...or is it? Here, in a benchmark where Nvidia cards usually perform better than their ATI counterparts, the GTX 295 puts in good numbers, actually beating the HD 5970 at the lower resolutions. Surprisingly, the HD 5970 only gets about 25% more frames than the HD 5870 in the Bioshock bench. We are going to presume that the Catalyst drivers need a bit more tweaking here.

This is an interesting spread of results. This demoscene demo is heavy on particles, and demanding for modern video cards even at lower resolutions.

Two things: first off, this test doesn't use more than one GPU. If you are new to dual-GPU cards, we hate to break it to you, but this is part of the whole double-GPU card thing: not every program you run is going to have CrossFireX support, and even some big titles might take a while to get working smoothly in CrossFire. The second thing to mention is just how much of a difference the extra clock speed makes: the higher clocks of the Sapphire HD 5870 Vapor-X give it a sizable lead over the HD 5970, which is running at HD 5850 speeds and on only one of its GPUs here.

Again, another test that doesn't utilize both GPUs. In this synthetic rendering of a ring of fur, the HD 5970 scores about where you'd expect compared to the HD 5870. Nvidia cards have never had as much luck with this bench as AMD cards.

Here we go -- this is a bit more like it. Living up to its top-dog billing, the HD 5970 does quite well in Crysis: Warhead, managing to break 40 fps at 2560x1600, which is impressive for this game.

The HD 5970 has the horsepower to blow through Far Cry 2 like it is Unreal Tournament 3. There is no question this card packs a big punch.

The BFG GTX 295 puts in really good numbers here considering it is a generation behind the HD 5970. So for all you GTX 295 owners out there: don't toss your video card out of the bedroom window yet. It still has legs.

The Sapphire HD 5870 Vapor-X also puts in good numbers -- it doesn't come too far behind the two dual-GPU cards.

The BFG GTX 295 manages to speed by the HD 5970 here with no AA on. Again it looks like some driver tweaking may be necessary -- the HD 5970 doesn't score too much higher than the HD 5870.

While the GTX 295 comes out ahead without AA, the HD 5970 comes in tops with AA on. But really, after 150 fps, does it matter all that much?

Of course it does!

At top resolution, the HD 5970 puts in a good show and stays well ahead of the GTX 295, getting roughly double the frames of the LeadTek GTX 260.

We have to say the GTX 295 is holding up a bit better than we expected. It manages to maintain respectable second-place showings behind the HD 5970.

If you want to play World In Conflict at 2560x1600 with AA you are going to need some serious hardware. The HD 5970 is a serious piece of hardware.

Nvidia might be a bit miffed that even after helping Rocksteady Studios with Batman's latest escapade, through The Way It's Meant To Be Played program, the game currently runs better on ATI hardware (PhysX aside). But maybe Nvidia's next generation will change things significantly...

Operating Temperatures

We used the program OCCT to measure 'load' temperatures once 200 seconds had elapsed.

Vapor chamber or no, the HD 5970 puts out a lot of heat when things are really pushed to the edge. Don't expect it to reach such toasty temperatures in normal operation, though -- for the time being, not many games strain the video card that much.

Nonetheless, 'non-reference cooler' comes to mind when looking at these results.

Power Usage

We used a Kill A Watt power meter in conjunction with the program OCCT, recording 'load' power draw once 200 seconds had elapsed.

The load power usage seems reasonable (if high) for the performance of the video card. Even at 40nm, a great deal of power is needed to keep those billions of transistors firing on all cylinders.

In our chats with AMD, much was made of the extremely low idle power state. In our testing, we just didn't see it. Don't get us wrong -- the idle draw is good, but it is not ground-breakingly low; just compare it to the GTX 295 above.

With the HD 5970, you can connect either two 6-pin PCIe power connectors, or one 6-pin and one 8-pin. If you want to overclock, you'll need the 8-pin connected.

Once again, as with all powerful gaming cards, you'll want the right PSU for the job. A well-made 650W PSU or better is recommended here; run it on much less and you might be pushing things.
 

Conclusion

Like the HD 5870, the HD 5970 is in the enviable position of not having much competition. That said, we thought the GTX 295 held up pretty well against it; the HD 5970 is certainly the new king of the block, but the GTX 295 is able to keep up -- arguably thanks to the more mature state of Nvidia's SLI drivers compared to ATI's CrossFire profiles in Catalyst, which typically take longer to mature.

For the lucky ones who may find a monster video card under their Christmas tree this year: if you want top-of-the-line, top-tier performance, then the strongest option is the HD 5970. While the GTX 295 still holds up in the benchmarks, as we recommended in our recent HD 5870 review, we feel it is best to stick with ATI for the moment because of DirectX 11 (which promises to really materialize into something, unlike DX10), and Eyefinity is a nice feature to have as well.

Let's talk prices for a bit. At a typical high-end monster-card introductory price, the HD 5970 sells for $600 USD. This is steep, but fair for the state of the video card market right now. Almost all HD 5000 series cards, and most high-end Nvidia cards (GTX 260 and above), are in short supply, so don't expect prices to fall any time soon. Some previous dual-GPU monster cards entered the market at a time when you could buy the equivalent pair of video cards and do your own multi-GPU setup (e.g. buying a pair of HD 3870s when the HD 3870 X2 came out), but that isn't the case here. We expect HD 5000 prices to remain stable -- possibly even for a month or two after the launch of Nvidia's new generation, which is rumored not to be arriving in big volumes until the end of January. $600 may seem high, but it is a fair price as far as supply and demand dictate.

We felt the HD 5970 shipped a touch on the underclocked side. Unfortunately, our guess is that many users out there (none of us, of course) will never investigate the overclocking ability of this video card, so they'll be left with an HD 5970 running at only 80-85% of its safe potential clock speeds. If you get this card, you must, must overclock it. It is underclocked out of the box (to the benefit of power consumption). There is no need to push it to the limit like we did in our OC testing, but at least add 15% to the clocks to get yourself playing games in top gear.
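If you are wondering where that 15% figure lands you, the arithmetic is straightforward (again, our own rough math on the stock clocks):

# Where a 15% bump over stock clocks lands you (our rough arithmetic)
stock_core_mhz, stock_mem_mhz = 725, 1000

print(f"core:   ~{stock_core_mhz * 1.15:.0f} MHz")   # ~834 MHz, close to an HD 5870's 850 MHz
print(f"memory: ~{stock_mem_mhz * 1.15:.0f} MHz")    # ~1150 MHz, approaching an HD 5870's 1200 MHz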

On the other side of the overclocking coin, the HD 5970 is such a powerful card that there isn't much out there yet, in the way of games, that will really strain it regardless. But rest assured 2010's crop of games will no doubt be taxing this video card well before Blizzard puts out their next game.

If you want top-end, no-joke performance right now, then there is really only one option: the HD 5970.

»Neoseeker.com

Copyright Neo Era Media, Inc., 1999-2014.
All Rights Reserved.

Please do not redistribute or use this article in whole, or in part, for commercial purposes.