Author: Kevin Spiess
Editor: Howard Ha
Publish Date: Tuesday, February 9th, 2010
Originally Published on Neoseeker (http://www.neoseeker.com)
Article Link: http://www.neoseeker.com/Articles/Hardware/s/gigabytegt240_hd5670/
Copyright Neo Era Media, Inc. - please do not redistribute or use for commercial purposes.
The HD 5000 series is almost entirely revealed now, as we are just about to enter the middle of February, 2010. As with pretty much all introductions of new generations of GPUs, ATI started the HD 5000 series off with entries at the high end, then worked steadily down the performance ladder. Launched about a month ago, the HD 5670 is the mid-range piece of ATI's current arsenal. Together with the HD 5750, the HD 5670 is aimed at those looking for a trade-off between price and performance. While you can't expect a mid-range card to max out every game at a high resolution, you can expect to do some high quality gaming with this video card, which sells for around the $100 USD mark.
Gigabyte has given us our second HD 5670 to take a look at. Unfortunately, our last engineering sample HD 5670 from ATI did not have CrossFire teeth on it, unlike Gigabyte's model, so we won't be able to run two of these cards at the same time to give you some CrossFire numbers. Nonetheless, we'll still have an interesting time today seeing what Gigabyte has done with this non-reference design.
Also, keep in mind that the HD 5670 is available in two models: a 512MB version and a 1GB version. Pricing for these two models has been fairly steady, around $100 and $115 USD respectively.
As we discovered in our last review, strong competition around the $100 price-point continues to come from higher-end cards of the last generation -- but wait, maybe we are getting ahead of ourselves. Let's take a closer look at the card before we go into the numbers.
The HD 5670 is the primary representative of ATI's mid-range offerings right now. From our previous testing, we'd put the HD 5750 in the high mid-range, and the HD 5770 in the low high-range, of gaming performance. Below the HD 5670 we have the recently released low-end HD 5450, and a step in between the two is also expected to be released sometime soon.
The HD 5670 comes in 512MB and 1GB versions -- today's Gigabyte HD 5670 is a 1GB card.
Similar to our recent review of the Gigabyte GT 240 (Nvidia's alternative around this $100 price point), Gigabyte has given this HD 5670 a customized treatment. We are left with something functionally much the same as the reference design HD 5670 we first reviewed, but arranged somewhat differently. Both the cooler and the printed circuit board have been tweaked by Gigabyte's engineers.
The PCB is blue, for starters; beyond that cosmetic change, we have a slightly different arrangement of capacitors and components on the board. Gigabyte used some beefier caps (though fewer MOSFETs) and seems to have made good use of the PCB's real estate. Size-wise, the Gigabyte HD 5670 is the same as the reference board design, which we have already reviewed, coming in at 6.5".
The cooler has also been changed -- it uses the same design as that found on the Gigabyte GT 240, but with a slightly different plastic cover. This cooler uses much more aluminum than the reference design cooler, and additionally has a much larger fan. This should most certainly provide superior cooling for the GPU -- though the memory chips do not receive any heatsink coverage like they do on the reference board.
The Gigabyte HD 5670 offers good flexibility with its display output options: you get a VGA, a DVI, and an HDMI port. As with all products in the ATI HD 5000 series, you have Eyefinity capability as well, which allows you to connect up to 3 displays to your video card simultaneously. However, just remember that while running 3 displays for desktop work is fine, if you want to game on 3 displays you'll likely require a video card with more horsepower in order to achieve acceptable framerates. Helping this situation out, though, is CrossFireX -- you can pair this card with a second HD 5000 series card from the same family; so, if you want more performance down the line, and your motherboard supports it, you have the option of buying a second video card for your system.
The Gigabyte HD 5670 beside a reference design HD 5670, and beside a Gigabyte GT 240
That's actually one thing we were remiss about in our last review of the HD 5670 -- only the 1GB version has CrossFireX capability. The 512MB version lacks the teeth for it. This is an understandable move on ATI's part from a marketing perspective -- the extra 512MB of memory, as our benchmarking has shown time and time again, usually has a negligible effect on performance. So, having CrossFire compatibility on the 1GB model makes the $20 price premium seem more reasonable.
With Nvidia's next generation of GPUs delayed for a while now, ATI is currently the only way to go if you need a DirectX 11 capable GPU, such as the one found in today's HD 5670. It remains to be seen how important, or unimportant, DirectX 11 will be for those who play PC games. It is our guess that it'll be 2011, or even 2012, before DirectX 11 makes a significant appearance in titles beyond window-dressing. Either way, it is good to be prepared if the option is available.
In addition to DirectX 11 support, you get the rest of the HD 5000 series feature set, including PCI Express 2.0 and Shader Model 5.0 support, the UVD video engine (which improves HD video content playback), and ATI Stream, which speeds up Flash content.
As you can see in the chart below, the HD 5670 has 400 stream processors, about half the count found in the HD 5770. This card also has a 128-bit memory interface. In the last few generations, a 128-bit memory interface generally meant pretty sad performance -- but it isn't as much of a factor this time around, because of the high-speed GDDR5 memory used.
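To see why GDDR5 rescues a 128-bit bus, here is a rough back-of-envelope sketch. The 1000 MHz memory clock is the card's stock speed mentioned in the overclocking section; the four-transfers-per-clock figure is the standard GDDR5 data rate, not anything specific to Gigabyte's board:

```python
# Rough peak memory bandwidth: bus width (bytes) x clock x transfers per clock.
# GDDR5 moves 4 transfers per clock; GDDR3 moves 2 at the same clock speed.
def bandwidth_gb_s(bus_bits, clock_mhz, transfers_per_clock):
    bytes_per_transfer = bus_bits / 8
    return bytes_per_transfer * clock_mhz * 1e6 * transfers_per_clock / 1e9

gddr5 = bandwidth_gb_s(128, 1000, 4)  # the HD 5670's configuration
gddr3 = bandwidth_gb_s(128, 1000, 2)  # the same bus with older GDDR3
print(f"128-bit GDDR5: {gddr5:.0f} GB/s, 128-bit GDDR3: {gddr3:.0f} GB/s")
```

Same narrow bus, but roughly double the bandwidth of an equivalent GDDR3 card -- which is why the 128-bit interface hurts less this generation.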
Finally, note that Gigabyte's card here today is an overclocked model of the HD 5670 -- but it is pretty much an OC model in name only. The core clock has been boosted by a paltry 10 MHz, while the memory clock has not been increased at all. OC models really don't come less OC'ed than this! 10 MHz on the core should lead to an insignificant performance increase of about 2% at most.
|Card||GTX 285||GTS 250||9800 GT||GT 240||HD 5670||HD 4850||HD 5750||HD 5770||HD 5870|
|Memory Interface||512-bit||256-bit||256-bit||128-bit||128-bit||256-bit||128-bit||128-bit||256-bit|
Box and bundle
The Gigabyte HD 5670 comes in a blue box with a robot on it that vaguely reminds me of the security bot you can recruit in Fallout 3.
As for the bundle, there really isn't much of one to speak of. You just get the basics here: a manual and a driver CD.
As mentioned last page, the factory overclock for this card is very slight. So, how about some real, manual overclocking?
We have prior experience overclocking the HD 5670, so we knew right where to start: at the maximum overclock allowed by the Catalyst Control Center (CCC). The Gigabyte HD 5670 operates at 785 MHz for the core (10 MHz over the reference clock) and 1000 MHz for the memory. The card had no trouble sustaining an 846 MHz core overclock and a 1049 MHz memory overclock. This is a nice little performance boost, though it would be interesting to see how far we could get if the Catalyst Control Center allowed a higher maximum. There should be some more headroom, and it would put Gigabyte's non-reference cooler to better use.
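For the curious, the relative gains from that manual overclock work out as follows (a quick sketch using the clock speeds quoted above):

```python
# Percentage gain of the sustained overclock over Gigabyte's shipping clocks.
def pct_gain(stock_mhz, oc_mhz):
    return (oc_mhz - stock_mhz) / stock_mhz * 100

core = pct_gain(785, 846)    # core: 785 MHz -> 846 MHz
mem = pct_gain(1000, 1049)   # memory: 1000 MHz -> 1049 MHz
print(f"core: +{core:.1f}%, memory: +{mem:.1f}%")
```

Just under 8% on the core and about 5% on the memory -- a real, if modest, bump compared to the token 10 MHz factory OC.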
Video cards used in the benchmarks include a LeadTek GTX 260, a PowerColor HD 5770 PCS+, a VisionTek HD 5750, a Gigabyte GTS 250, a VisionTek HD 4850, a VisionTek HD 4870, and a Gigabyte GT 240.
For the drivers, all the ATI cards used the Catalyst 9.10 drivers except for the HD 5670, which used beta drivers, and all the Nvidia cards used Forceware 190.17 drivers.
We have updated our benchmarks. Here are the new ones:
Batman: Arkham Asylum: Gotham's Greatest Detective makes for a good benchmark. We used the in-game bench, running at 2560x1600, with the highest quality settings possible. We chose to only test without AA as there has been some controversy that AA is unnecessarily handicapped in this game for some video cards.
FTL_Blunderbuss: This is a demoscene demo by the group Fairlight, which placed second in a competition in October 2009. It makes very heavy use of particles, and is a good GPU workout. We used FRAPS to measure the average framerate of a run through the program, running at 1680x1050 with 4xAA and 'high' detail.
Resident Evil V: Capcom's latest zombie smasher has a great 'Fixed' in-game benchmark. We ran it at top quality at 2560x1600 in DX10 mode, with and without AA.
In addition, we test with:
Bioshock: For this benchmark, all of the Detail settings were set to 'High'. All of the graphic option switches were set to 'On', with the exception of the following three settings: Vsync, Windowed mode, and Force Global Lighting. We used FRAPS to measure frame rate performance. The FRAPS run was 138 seconds, triggered from pulling the switch in the sub at the game's beginning. The sub's dive involves many big models moving around, which should strain the GPUs and be a good measure of the game's engine.
Crysis: Warhead: Games don't get much more demanding than Crysis. We used the 'Gamer' pre-set level of details, which is the middle level setting out of 5 options. We ran the benchmark on the 'avalanche' map, using the FrameBuffer Crysis benchmarking tool, version 0.29, in DX10 mode.
Enemy Territory: Quake Wars: We use this id FPS benchmark to test out higher resolutions. We used the highest possible detail settings. We tested the resolutions at 4x AA as well as at 8x AA. 16x AF was also used.
Far Cry 2: This open-world FPS is a great-looking game that really puts a strain on a gaming rig. We used the built-in benchmarking tool, with the overall 'Very High' quality setting.
Furmark: This intensive, synthetic benchmark models a ring of fur. We benched at 1680x1050.
Street Fighter IV: You have probably heard of this famous fighting game. It has 3D graphics, but generally does not require much GPU horsepower to run well. We used Capcom's stand-alone PC benchmarking tool for our tests, and ran everything at its highest possible settings, using 4xAA, and the 'Watercolor' setting.
Unreal Tournament 3: We tested the game using a 90-second fly-through of the vehicle capture-the-flag map 'Suspense'. Details were set to 'High', and an AF setting of 16x was used.
World In Conflict: We used the built-in benchmark of the demo version of this game. We ran the benchmark in DX9 rendering mode, with a 'High' level of quality. For the AA testing, we used a setting of 4x, and a setting of 16x for AF.
If you would like any further information about our benchmark settings, feel free to ask us in the forums.
According to Vantage, the HD 5670 places between the GT 240 and the GTS 250, and a fair way behind the HD 5750. We'll see if this holds true in the game benchmarks.
Note that the Gigabyte GTS 250 benchmarked below is marked 'OC' on the chart, but this overclock was very small (just about 25 MHz), so it behaves closer to a stock GTS 250 than to a super-charged, overclocked one.
In the feature tests, the HD 5670 scores all over the board: sometimes near the bottom, but in some tests much better -- such as the perlin noise test, where the card comes in 4th, which is very good considering the high-end-heavy line-up we are benching the HD 5670 against today.
In this Furmark benchmark, the Gigabyte HD 5670 does well with AA on, but not as well against the competition with AA off.
In this particle-heavy test, the HD 5670 scores fairly close to (but under) the GTS 250, and a notch above the GT 240. Considering the price-tags of all the cards in our benchmark line-up, there aren't really any surprises here: the HD 5670 offers good performance for its price. Well, actually, that's not entirely accurate: the last-generation HD 4850 currently sells for the same price as the HD 5670, around $100, and offers a big performance boost, if not as much of the feature set.
The GTS 250 does quite well here in Bioshock, which leaves the HD 5670 at just about the bottom of the list.
The HD 5670 does not offer a playable experience with the details cranked in Warhead, our most demanding benchmark -- however, the Gigabyte card does perform better at the maximum resolution thanks to its 1GB of GDDR5.
Far Cry 2 is a bit more forgiving than Crysis: Warhead, and here the Gigabyte HD 5670 is capable of delivering a sufficient amount of horsepower to keep the game moving at the highest levels of detail.
We test H.A.W.X at the max resolution of 2560x1600. At this resolution, it takes a lot of horsepower to keep the frames up. 1GB of GDDR5 or not, the HD 5670 can't keep things going really smoothly. We'd suspect the game would be playable, though, at a more reasonable resolution of, say, 1920x1080.
Resident Evil V is a great looking game; that the HD 5670 is capable of hovering around the 30 fps mark at max detail settings and max resolution is pretty good.
Here the HD 5670 stays well behind the GTS 250, but nonetheless, still offers a good playing experience in the street fighting.
Many games are powered by the well-optimized Unreal engine -- and the HD 5670 should be very capable of playing them all.
This game is getting a bit old now, but it is still a very demanding benchmark to run -- and the HD 5670 has some troubles keeping up.
The HD 5670 comes last in the list here, yet still manages to keep things on the happy side of 30 fps.
For these results, we used the program OCCT to reach 'load' temperatures, taking a measurement once 200 seconds elapsed.
The power usage numbers for the HD 5670 are very good. There are very few other cards that offer this level of performance without the need for a PCIE power connector. The box suggests a 400W PSU is enough to keep the Gigabyte HD 5670 happy, and this would seem to be the case (unless, say, you have tons of optical and/or hard drives and are using a cheap generic 400W PSU -- then things might get iffy).
For these results, we used a Kill-A-Watt measuring device in conjunction with the program OCCT, taking a 'load' power reading once 200 seconds had elapsed.
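To put that 400W suggestion in perspective, here is a rough, hypothetical load budget for a typical mid-range system built around this card. Only the HD 5670's roughly 61W board power comes from ATI's spec sheet; every other number is a ballpark assumption on our part, for illustration only:

```python
# Hypothetical full-load power budget for a mid-range system.
# Only the HD 5670's ~61 W board power is from ATI's spec; the rest
# are rough, assumed figures for illustration.
budget_watts = {
    "HD 5670 (board power)": 61,
    "quad-core CPU at load": 95,
    "motherboard + RAM": 50,
    "hard drive + optical drive": 25,
    "fans, USB devices, misc": 20,
}
total = sum(budget_watts.values())
print(f"estimated load: {total} W -- comfortable on a decent 400 W PSU")
```

Even with generous estimates, a system like this sits well under 400W, which lines up with what we saw at the wall.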
Gigabyte's cooler is sufficient to keep the HD 5670 running cool. 70 degrees at peak load is pretty good; you can expect your HD 5670 to last indefinitely with those sorts of operating temperatures.
The Gigabyte HD 5670 1GB is a good card for about $110 USD. We generally prefer this video card to Gigabyte's own GT 240, which we reviewed recently, even though the GT 240 out-performed the HD 5670 in many benchmarks. Why's that? Because of three key features present in the HD 5000 series: CrossFireX, which enables you to eventually pair the HD 5670 with a second compatible ATI card, if your motherboard supports CrossFire and has two PCIE slots; Eyefinity, which allows up to three displays to run off of the one card; and finally, DirectX 11 support -- though we think this won't be much of a factor in coming PC games beyond marketing, it is still nice to have. If you have a specific desire for NVIDIA's CUDA, however, maybe the GT 240 would be a better fit; but for most, we'd recommend the HD 5670 over the GT 240.
If you are looking at spending about a hundred bucks on a new video card this first quarter of 2010, then you have one important question to ask yourself: what do I care more about -- those three features above, or raw game performance? If you prefer game performance over those three key features, then you can't beat the value of a high-end gaming card of the last generation, foremost of which are the HD 4850 and the 9800 GT.
When HD 5670s first hit stores, the price difference between the 512MB and 1GB versions was about $20 or so. That has now fallen to about $10. We don't think that the extra 512MB is a big selling point for most people -- it doesn't make much of a difference unless you are running things at 2560x1600, and if you are running things at 2560x1600 and have the display to do so, we'd recommend going with a card with more horsepower. However, the big selling point for the 1GB model is that it is CrossFire capable (most 512MB HD 5670 cards are not), so if you have a CrossFire board, we recommend going for the 1GB version, leaving you the option of pairing it with a second HD 5670 down the line.
Gigabyte put this HD 5670 together well. While the "OC" is virtually a non-factor, and the bundle is just the basics, neither of these fazes us too much. And while you can get more gaming performance from a last-gen card for about the same price, the performance offered by the HD 5670 isn't all that bad, and the feature set of the HD 5000 series is strong, which helps to offset the card's negatives. All in all, the Gigabyte HD 5670 OC is a good all-around video card choice for about $110 or so, but nothing that knocks our socks off.
Please do not redistribute or use this article in whole, or in part, for commercial purposes.