Gigabyte Radeon HD 3870 versus the 9600GT

Author: Kevin Spiess
Editor: Howard Ha
Publish Date: Wednesday, February 27th, 2008
Originally Published on Neoseeker (http://www.neoseeker.com)
Article Link: http://www.neoseeker.com/Articles/Hardware/s/gigabyte_hd3870/
Copyright Neo Era Media, Inc. - please do not redistribute or use for commercial purposes.

It is a good time to be in the market for a video card -- especially if you want to spend around $200. Both ATI's HD 3870 and NVIDIA's 8800 GT have been around for a few months now, and prices have reached new lows.

Even just six months ago, getting the level of performance offered by an HD 3870 or an 8800 GT for around $200 would have been hard to believe. And just last week, things got even more interesting with the introduction of NVIDIA's first 9-series card, the 9600 GT. Currently selling for around $180, the 9600 GT proved to be a speedy card, despite having only 64 stream processors.

While the previous generation of mid-range cards left a bit to be desired when it came to performance -- especially in DX10 games -- that isn't the case this time around.

Today we will be looking at Gigabyte's twist on the HD 3870. This HD 3870 sports a Zalman cooler, as well as Gigabyte's 'Ultra Durable 2' technology, which might help out with overclocking. We are going to test this HD 3870 against the primary competition around the $200 price point -- the 8800 GT 256MB/512MB, the 9600 GT, and the HD 3850 -- in order to find out which might be the best route to go.

How will the Gigabyte HD 3870 fare in this competitive market? We intend to find out.

While Neoseeker has reviewed a number of HD 3870 cards over the last few months, the good news is that each one has been unique in its own way. The Gigabyte HD 3870 continues this trend, with the most obvious feature separating it from the HD 3870 pack being its cooling solution. Gigabyte chose Zalman to design its cooler. You are probably familiar with Zalman Tech -- the Korean company has become renowned for its effective cooler designs, and has been around since 1999.

This time around Zalman has come up with a half-circle heat fin array, composed of aluminum and copper. While it may look a little less serious than some coolers we have seen recently (such as the one on VisionTek's HD 3870), this Zalman 2-ball-bearing cooler proved effective in our testing. The fan ran at a high RPM while remaining fairly quiet, even when manually set to run at 100%.

Left to right: VisionTek HD 3870, Gigabyte HD 3870, Asus HD 3870 TOP (close to reference board design)

As effective as the Zalman fan is, two faults with this cooling arrangement are apparent: first, many people prefer a cooler that exhausts hot air out of the case; and second, the fan only blows a bit of air over the Gigabyte HD 3870's memory, and there is no metal heatsink at all over this important region. While this will almost certainly not be a problem for regular operation, not having a heatsink over the memory is sure to hinder overclocking.

The Gigabyte HD 3870 also features 'Ultra Durable 2 Technology', which can be seen as a marketing catchphrase basically describing 'higher quality components.' But this isn't to deride Gigabyte for choosing the name: the higher quality ferrite core chokes and capacitors are something to be proud of, for sure. To what extent they improve performance should be seen in the benchmarks.

Let's take a look at some numbers, shall we? The Gigabyte HD 3870 is not a factory-overclocked model -- it actually has a slower than average memory clock -- and its specifications, as reported by GPU-Z, are as follows:

                                 Gigabyte HD 3870   9600GT (ref.)   8800GT 512MB (ref.)   8800GT 256MB (ref.)   HD 3870 (ref.)   HD 3850 (ref.)
Stream Processors                320                64              112                   112                   320              320
Core Clock (MHz)                 776                650             600                   600                   775              668
Shader Clock (MHz)               776                1625            1500                  1500                  775              668
Memory Clock (MHz)               1890               1800            1800                  1800                  2250             1656
Memory Interface                 256-bit            256-bit         256-bit               256-bit               256-bit          256-bit
Memory Type                      512MB GDDR3        512MB GDDR3     512MB GDDR3           256MB GDDR3           512MB GDDR4      256MB GDDR3
Memory Bandwidth (GB/s)          60.5               57.6            57.6                  57.6                  72.0             52.9
Texture Fillrate (billion/sec)   12.4               20.8            33.6                  33.6                  12.4             10.6
Fabrication Process              55nm               65nm            65nm                  65nm                  55nm             55nm
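Incidentally, the bandwidth row falls straight out of the clocks: effective memory clock times bus width, divided by eight bits per byte. Here is a minimal sketch of that arithmetic in Python -- the figures come from the table above, and the script itself is just for illustration:

    # Memory bandwidth (GB/s) = effective memory clock (MHz) * bus width (bits) / 8 / 1000
    cards = {
        "Gigabyte HD 3870":    (1890, 256),
        "9600GT (reference)":  (1800, 256),
        "8800GT (reference)":  (1800, 256),
        "HD 3870 (reference)": (2250, 256),
        "HD 3850 (reference)": (1656, 256),
    }

    for name, (mem_clock_mhz, bus_width_bits) in cards.items():
        bandwidth_gb_s = mem_clock_mhz * bus_width_bits / 8 / 1000
        print(f"{name}: {bandwidth_gb_s:.1f} GB/s")
    # Gigabyte HD 3870 -> 60.5 GB/s, matching the table above
    # (the HD 3850 works out to 52.99, which the table shows as 52.9).

The texture fillrate row works the same way, only with core clock and texture units: the reference HD 3870's 775 MHz across its 16 texture units gives the 12.4 billion texels per second listed above.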

As you might have noticed, this is also one of the few HD 3870 cards that has GDDR3 memory instead of GDDR4 memory.

As with all HD 3870 cards, this current generation from ATI supports DirectX 10.1, Shader Model 4.1, PCI Express 2.0, and HDMI audio, and includes HDCP support for playing HD DVD and Blu-ray discs. The HD 3870 also gives you a tessellation unit and ATI's Universal Video Decoder.

Power Usage

To measure power usage, we used a Kill A Watt P4400 power meter. Note that the numbers represent the power drain of the entire benchmarking system, not just the video card itself. For the 'idle' readings we measured the power drain at the desktop, with no applications running; for the 'load' readings, we ran a demanding part of 3DMark06.

The 55nm fabrication process of the RV670 GPU helps keep the Gigabyte HD 3870 at a very reasonable level of power usage. A standard 400W-450W power supply would not be very strained by having this card in your case -- unless, of course, you have other high-demand components, such as multiple hard drives, for instance.

Overclocking

To be honest, my expectations for overclocking Gigabyte's HD 3870 were mixed. On the positive side of things, Gigabyte's quality "Ultra Durable 2" construction -- which consists of better-than-average ferrite core chokes, solid organic polymer capacitors, and lower switching resistance MOSFETs -- could do nothing but help overclocking; on the flip side, it isn't often that an enthusiast-class video card lacks any sort of metal heatsink on its memory chips.

My trepidation was well placed: the GPU overclocked quite well, but the memory didn't achieve the level of overclock I was looking for. Nonetheless, a core speed of 889 MHz and a memory clock of 1089 (2178) MHz isn't bad at all. This is a very reasonable overclock considering the default clock speed of an HD 3870 is 775 MHz, and 1000 (2000) MHz for the memory. For comparison, the last HD 3870 we reviewed -- from VisionTek -- reached 884 MHz (core) / 1292 (2596) MHz (memory). However, to be fair to Gigabyte, the VisionTek card required its fan to run at 100% for that overclock to be stable, and it was pretty loud at that setting, whereas the Gigabyte HD 3870's Zalman wasn't loud at all even at 100% fan power.

Some enterprising overclockers out there might want to throw an old heatsink they have kicking around right onto the memory and see what kind of new speeds they can reach. The memory chips are Samsung K4J52324QE-BJ1A's, which are rated for 1000 (2000) MHz. With proper cooling, I'd presume you would have no trouble getting at least 200 MHz above that rated speed.
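To put those numbers in perspective, here is the quick headroom arithmetic as a throwaway Python sketch -- the clocks are the stock figures quoted above, in actual rather than DDR-doubled MHz:

    # Overclock headroom as a percentage above stock clocks.
    def headroom_percent(stock_mhz, overclocked_mhz):
        return (overclocked_mhz - stock_mhz) / stock_mhz * 100

    # Gigabyte HD 3870: core 775 -> 889 MHz, memory 1000 -> 1089 MHz
    print(f"core:   +{headroom_percent(775, 889):.1f}%")    # ~14.7%
    print(f"memory: +{headroom_percent(1000, 1089):.1f}%")  # ~8.9%

In other words, the core gained nearly 15% while the memory managed under 9% -- which is exactly the imbalance a bare set of memory chips would lead you to expect.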

Box & Packaging

On the standard-size Gigabyte box we have a Darth Maul-like, evil mage guy, prominently highlighting the fact that Obsidian Entertainment's Neverwinter Nights 2 comes included with the video card.

On the back of the box, the benefits of CrossFireX (which allows up to four current-generation ATI cards to be linked together for improved performance) and Gigabyte's Ultra Durable 2 technology are described in detail. In the box, the card sits in an anti-static bag, comfortably sandwiched in a shell of styrofoam, which is good for the safety of the card, if bad for the environment.

Bundle

As mentioned, the Gigabyte HD 3870 comes with a full version of Neverwinter Nights 2. This game is a sequel to Bioware's original Neverwinter Nights, which is perhaps the best modern translation of Dungeons and Dragons into a PC game. Needless to say, the game is an RPG, and it was generally well received by critics. For the price point of this card (~$200), it's great to see this game in the mix -- even more so if hardcore RPGs are your thing!

Rounding out the rest of the bundle, in the way of cables we have two DVI-to-VGA adapters, a CrossFire bridge, a Molex-to-PCIe power adapter (handy if you happen to have an old PSU), and a Gigabyte-branded HDTV video adapter. Additionally, we have a driver CD, a full paper manual for Neverwinter Nights 2, a well-illustrated quick-start guide for the card, and a manual for the card itself. Though this probably will not weigh into your purchasing decision, I do want to mention that this Gigabyte manual is excellent -- far above the vast majority of video card manuals -- and is written in two very popular languages: English and Chinese.

This time around, we used the following system to bench our cards on: 

As for video cards, against the Gigabyte HD 3870 we have the following, ranked fastest to slowest: an XFX 8800 GTS 512MB Alpha Dog Edition (XXX), an NVIDIA 8800 GT 512MB, a Palit 9600GT Sonic, an XFX 8800 GT 256MB Alpha Dog Edition (XXX), and a PowerColor HD3850 Xtreme PCS 512MB. Besides the G92-based 8800 GTS, all of these cards are within $50 USD of one another, and should offer reasonably similar levels of performance.

We used a fully-patched installation of Microsoft Windows Vista to test our cards. As for drivers, the NVIDIA cards all used Forceware 169.28, except for the 9600GT, which used Forceware 174.12 Beta drivers; the ATI cards were tested with Catalyst 8.2 drivers.

For the games we decided to bench, here is some information on our chosen settings: 

Bioshock: For this benchmark, all of the detail settings were set to 'High'. All of the graphics option switches were set to 'On', with the exception of the following three settings: Vsync, Windowed mode, and Force Global Lighting. We used FRAPS to measure frame rate performance -- essentially counting frames over a fixed window (see the short sketch after this settings rundown). The FRAPS run was 138 seconds, triggered by pulling the switch in the sub at the game's beginning. The sub's dive involves many big models moving around, which should strain the GPUs and be a good measure of the game's engine.

Call of Juarez: We used the stand-alone Call of Juarez DX10 benchmarking program for these results. For our AA testing, we used a setting of 2x.

Crysis: These benchmarks were performed using the 'fly-by' GPU test found within the single-player pre-release demo version of the game. All graphics settings were on 'High'. For AA, we used a setting of 4x. DX10 mode was used, and the game was fully patched (1.1).

Enemy Territory: Quake Wars: We use this id Software FPS to test higher resolutions (1280x1024, 1600x1200, 1920x1200). We used the highest possible detail settings, and tested each resolution at 4x AA as well as at 8x AA. 16x AF was also used.

Unreal Tournament 3: We tested the game using a fly-through of the vehicle capture-the-flag map 'Suspense' running for 90 seconds. Details were set to 'High', and an AF setting of 16x was used.

World In Conflict: We used the built-in benchmark of the demo version of this game. We ran the benchmark in DX9 rendering mode, with a 'High' level of quality. For the AA testing, we used a setting of 4x, and a setting of 16x for AF.

If you would like any further information about our benchmark settings, feel free to ask us in the forums.
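For the curious, the FRAPS-style averaging mentioned above boils down to counting rendered frames over a fixed wall-clock window. A minimal Python sketch of the idea -- render_frame here is a hypothetical stand-in for the game drawing a frame, not anything FRAPS actually exposes:

    import time

    def render_frame():
        pass  # hypothetical stand-in for the game drawing one frame

    def average_fps(duration_seconds):
        # Count how many frames get rendered in a fixed window, then
        # divide by the window length -- the gist of a timed FRAPS run.
        frames = 0
        start = time.time()
        while time.time() - start < duration_seconds:
            render_frame()
            frames += 1
        return frames / duration_seconds

    print(f"{average_fps(5):.1f} FPS")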

Interesting results: fairly good showing in the Vertex and Fill Rate tests.

Surprisingly, it was the Gigabyte HD 3870 that performed best at 1920x1200, beating out even the XFX 8800 GTS 512MB -- if only by a small margin. At the lower resolutions, most of the cards in the benchmark put up fairly similar numbers.

We recently changed our testing settings for Enemy Territory: Quake Wars.

Because Quake Wars is based on the aging Doom 3 engine, it did not require all that much graphics horsepower to get upwards of 100 FPS in our benchmarks, so we decided to make this our high-resolution, high-AA test. While not that many people game at 1920x1200 with 8x AA, with this $200 Gigabyte HD 3870 you have enough power to do so -- at least in Quake Wars.

While the 8800 GT and the HD 3870 are neck-and-neck in this bench, it is the shader processor optimizations found in the 9600 GT that really shine here.

And by the way, if you think those scores look a bit strange for the XFX 8800 GTS 512 -- well, I did too. But I double-checked them, and they are correct. While the 8800 GTS 512 is a fast card indeed, for whatever reason it does not perform very well in this bench with an 8x AA setting, getting bested by the much cheaper 9600 GT.

A nice showing for the Gigabyte HD 3870, as it comes in at the top of this chart -- although, again, its performance is not all that different from the 8800 GT's.

At the lower resolutions, the HD 3870 trails significantly behind the 8800 GT. At 1600x1200, the performance difference gets all but erased. The 9600GT does well here, and offers great bang-for-buck in Bioshock performance.

The Gigabyte HD 3870 had a bit of a challenging time in the World in Conflict bench. Without AA, the HD 3870 is right in the thick of things and performs pretty well; with 4x AA, though, the NVIDIA cards out-maneuver the ATI cards -- especially the 9600GT, which turns in some solid numbers.

The Gigabyte HD 3870 doesn't do all that well in the demanding Crysis benchmark, getting beaten fairly soundly by the NVIDIA cards in the mix.

With this review, I was hoping for a clear bang-for-buck performance winner to emerge from the benchmarks, but this just wasn't the case. There really is a card for everyone between the 9600 GT, HD 3870, HD 3850, and 8800 GT.

If you were to study the benchmarks in this review closely, you might be inclined to argue that the Palit 9600GT Sonic was the best deal of the bunch -- and you might have a case, as it is definitely a strong contender for that title. Keep in mind that this particular 9600GT is an overclocked version, retailing for about $30 more than a standard 9600GT, and that overclock gives it a bit of an advantage in the benchmarks presented in this review. Even taking this into account, however, with the Palit 9600GT Sonic currently retailing for $209 at NewEgg while the Gigabyte HD 3870 goes for $195, things arguably look better for the 9600GT in this particular matchup: it out-performed the Gigabyte HD 3870 in most of the benchmarks.

How about the Gigabyte HD 3870 compared to the rest of the cards around the $200 mark?

Compared to our NVIDIA 8800 GT, the HD 3870 fared much better. With 8800 GTs ranging from $200 to $260, the HD 3870 doesn't look bad at all at around $195. With the exception of Crysis and World in Conflict (with AA on), the Gigabyte HD 3870 was neck-and-neck with the 8800 GT, and as one of the less expensive HD 3870 cards around, it isn't a bad deal at all.

In comparison with the HD 3850, you may have noticed that the factory-overclocked, 512MB PowerColor HD 3850 Xtreme PCS put in very similar results to the Gigabyte HD 3870. But the two are currently about the same price, so the PowerColor HD 3850 did not really distinguish itself in this line-up.

If we set aside the competition for a moment, overall the Gigabyte HD 3870 is a good card. On the positive side of things, the Zalman cooler is quite nice (it would cost you around $25 if you bought it on its own): not only did it keep temperatures down for a decent GPU overclock, it was also very quiet. It was also nice to see the inclusion of Neverwinter Nights 2 in the Gigabyte HD 3870's bundle; with competition this close, if you are a big RPG fan, I could see this game tilting this card into your good books. One thing that did not impress me very much about this card was the memory: it is one of the very few HD 3870 cards with GDDR3 memory, and it is clocked by default below the reference HD 3870 speeds. While you could overclock it to the standard speeds with no problem if you wished (and this should not decrease the lifespan of your card significantly), not everybody is an overclocker, and many consumers probably would not realize their HD 3870 is a bit slower than the others. And if you are an overclocker, the lack of heatsinks on the memory will keep you from really pushing the memory very far.

In the end, the Gigabyte HD 3870 is a good card, and it is a good deal to be had at around $200, but competition is so tight around this mark -- especially now with the release of the 9600GT -- that this card doesn't really have that certain something that would make it stand out from the rest of the pack.
