ATI Radeon HD 2900 XT Review

Author: Michael Nguyen, J. Micah Grunert
Editor: Howard Ha
Publish Date: Monday, May 14th, 2007
Originally Published on Neoseeker (http://www.neoseeker.com)
Article Link: http://www.neoseeker.com/Articles/Hardware/Reviews/ati_2900xt/
Copyright Neo Era Media, Inc. - please do not redistribute or use for commercial purposes.

It has been a long time coming, with countless delays since the first rumours of ATI's next generation GPU, the R600, were leaked onto the net. Here is a brief timeline between those very first rumours and now. Since then ATI has merged with AMD, NVIDIA has released the 8800 GTX, GTS and Ultra, Saddam Hussein has been eliminated, Martin Scorsese has won an Oscar, etc. You get the point. It has been a long and arduous wait for some new ATI cards. Expectations are very high for the R600 and kept getting fueled with every specification media leak. But it's here, we've had it in our hands for the past week, and it is indeed very exciting. Today we're going to take a hands-on look at the R600 along with a brief description of the rest of the "HD 2000" series, which is just being announced now.




The R600, or Radeon HD 2900 XT, will launch on May 14th (today!), while the lower end 2600 and 2400 series will launch at a later date in June. The latter cards are not yet available for sale and no review samples have been made available yet. The HD 2900 XT, on the other hand, is already listed at several major e-tailers. Something really exciting that you should know is that the 2900 XT has been priced very competitively against NVIDIA's GeForce 8800 GTS 640MB while having the allure of a more expensive "flagship" product. The HD 2900 XT will be replacing the longstanding X1950 XTX, which stood as ATI's flagship card for quite some time. The previous generation wasn't an overall disaster for ATI, but compared against NVIDIA's 79xx series it left a lot to be desired and noticeably failed to capture a large marketshare, especially for ATI fans coming off the X800 and 9800 series.

Although this new slew of Radeon releases is being hyped by many media outlets to incite an immediate clash of titans of sorts between ATI and NVIDIA, it should be kept in mind that there can't be a definite winner any time in the near future. DirectX 10 should and could play a major role in this release, and yet there aren't any DX10 games out as of yet, except for copies of an early test version of Call of Juarez which only certain press have. Also, ATI will be launching another card sooner rather than later which will eventually become its flagship, so declaring anything the de facto standard will be unfair until both companies have put all their cards on the table. The R600 vs. G80 debate will be a long process, but at the very least the new HD 2900 XT will finally spark some flames and perhaps prove to be a major competitor.

Below are a few specification sheets which detail the new launch, and they contain some really attractive numbers.


The new ATI R600 architecture was designed in direct contrast to NVIDIA's G80 series, but the two ultimately have a lot in common too. While ATI touts many features that differentiate the two designs, there are many similarities. The HD 2900 XT moved from the previous generation's fixed shader implementation to the unified stream processor build that NVIDIA also uses and introduced to the world with the G80. And while the 2900 XT has two and a half times the stream processors of the GeForce 8800 GTX (320 versus 128), they operate at a lower frequency. This means the 2900 XT has a higher theoretical limit of raw, parallel computation, but there are constraints which game developers can either get around or get stuck in.
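As a back-of-the-envelope illustration of that raw-computation trade-off, here is a minimal sketch. It assumes each stream processor retires one multiply-add (two FLOPs) per clock and uses NVIDIA's published ~1350MHz shader clock for the 8800 GTX; treat the results as theoretical ceilings, not benchmarks:

```python
# Rough theoretical shader throughput: stream processors x clock x FLOPs/clock.
# Assumption: one multiply-add (2 FLOPs) per stream processor per clock.

def peak_gflops(stream_processors, shader_clock_mhz, flops_per_clock=2):
    """Theoretical peak shader throughput in GFLOPS."""
    return stream_processors * shader_clock_mhz * flops_per_clock / 1000.0

# Radeon HD 2900 XT: 320 stream processors at the 743MHz core clock
r600 = peak_gflops(320, 743)
# GeForce 8800 GTX: 128 stream processors at a ~1350MHz shader clock
g80 = peak_gflops(128, 1350)

print(f"HD 2900 XT ~{r600:.0f} GFLOPS, 8800 GTX ~{g80:.0f} GFLOPS")
```

The numbers only set an upper bound; real shader throughput depends on how well a game's workload maps onto each design, which is exactly the constraint mentioned above.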



In relation to gaming, the R600 is actually an offshoot of the "Xenos" GPU used in the Xbox 360. The Xenos GPU also used a unified shader architecture, so if you want to use the performance of the Xbox 360 as a precursor to R600 performance, it wouldn't be such a bad comparison. However, ATI has made a few changes to the original GPU design, first and foremost being the move to a 65nm fabrication process for the 2600 and 2400 series. While there were rumors that the 2900 XT might also see a 65nm GPU in production for consumer purchase while the preliminary review batch would be 80nm, there isn't any confirmation of that. The new GPU also packs in 700 million transistors, which is coming close to what CPUs have. Combine this with the 320 stream processor design and you get the 2900 XT (currently only in 80nm), which has been tuned to run at a 743MHz core clock and 1.65GHz effective (825MHz actual) RAM speed.

The 2900XT has 512MB of RAM which connects to the GPU via what ATI calls the first 512-bit bus interface (really eight 64-bit memory channels working in tandem). The new memory interface increases memory bandwidth significantly over the previous X1950 XTX, and the card has a decent amount of RAM in comparison to an 8800 GTS 640MB. Here is a small diagram of the new bus architecture. Pay close attention to the top left, where there is an additional PCI-e bus stop that wasn't present in the previous Radeon series.
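To put the 512-bit figure in perspective, here is a quick sketch of the theoretical bandwidth math: bus width in bytes times the effective data rate. The clock figures are the published reference rates, and the comparison is illustrative only:

```python
# Theoretical memory bandwidth = (bus width in bytes) x (effective data rate).
# Effective rates below are published reference figures for each card.

def bandwidth_gb_s(bus_width_bits, effective_rate_ghz):
    """Theoretical memory bandwidth in GB/s."""
    return bus_width_bits / 8 * effective_rate_ghz

hd2900xt = bandwidth_gb_s(512, 1.65)  # 512-bit bus, 1.65GHz effective
x1950xtx = bandwidth_gb_s(256, 2.0)   # 256-bit bus, 2.0GHz effective
gts640   = bandwidth_gb_s(320, 1.6)   # 320-bit bus, 1.6GHz effective

print(f"HD 2900 XT: {hd2900xt:.1f} GB/s")
print(f"X1950 XTX:  {x1950xtx:.1f} GB/s")
print(f"8800 GTS:   {gts640:.1f} GB/s")
```

By this arithmetic the 512-bit bus gives the 2900 XT roughly 105 GB/s on paper, comfortably ahead of both its predecessor and the similarly priced GTS.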



Below is a picture of the power connectors on the 2900XT. Dual connectors are nothing new, but the 8-pin is, and it will cause people to consider their power supplies much more before buying the new card. First off, at the very least two 6-pin PCI-e connectors are needed to run this card, which means seating one of the 6-pin plugs in the 8-pin socket. A 6-pin and an 8-pin together are required for CrossFire and for ATI Overdrive - ATI's overclocking feature. With that in mind, the manual recommends a 750W or better PSU for CrossFire and 550W for a single card.



Back to the topic of gaming: all the new cards ATI is releasing will support DirectX 10. Image quality was an area ATI spent a lot of time developing, which is evident in their new anti-aliasing modes. In addition to their previous AA modes (HDR+AA, Adaptive SSAA/MSAA, etc.), ATI has developed a couple of new ones. 8x Multi-sample AA and 24x Custom Filter AA (CFAA) are the new modes that will supposedly work right out of the box, meaning they will work on many current DirectX 9.0c titles. And if getting the settings to work on the existing gaming platform wasn't impressive enough, some more CFAA details will be. The "custom filter" part apparently refers to the ability to deliver new, programmable sample filters through software updates such as driver releases. The detailed picture below will help explain this a little better.



What the picture effectively shows is that the range of the box filter can be enhanced through programmability by developers and engineers. New filters can be applied to each individual game so that the filter can be refined over time to increase efficiency and image quality. Of course this will take valuable man hours to develop, and only time will tell if people will actually stick with it. But if it is done right, some fantastic visuals could be created. Another feature of CFAA is the Adaptive Edge Detection filter. Edge detection focuses filtering on the edges of physical objects on screen and renders them in priority over flatter objects. This helps in some obvious ways, such as when text on flat surfaces becomes blurred or overexposed with high AA, or when fine details become blurry - two common problems that many gamers have experienced first hand. Edge detection will help maintain clean objects while working harder on "jaggy" edges.



Another feature that ATI is heavily promoting its cards to do efficiently is tessellation. Tessellation is one of the newest features ATI has ported over from its Xenos Xbox 360 design and can be described as a programmable unit that provides geometric data compression. This means that character figure mappings and extreme character detail work can be handled on the ATI GPU, which the company claims is "orders of magnitude" faster than if the calculations were handled by the CPU. Tessellation is something new to gaming that developers will have to take into consideration, and it may or may not pan out well. In the case that it does, it will certainly prove to be an interesting point of discussion in the ATI vs. NVIDIA DX10 utilization debate.

Before we get to the high definition content, I'd like to quickly comment on the physical aspects of the card.

The reference design of the ATI Radeon HD 2900 XT is both conservative and not. The card has the same overall Radeon red look going on; in fact the whole design looks like a throwback to the X1950 XTX, with some modifications to the fan. One thing to note about the heatsink is that the fan has changed from the normal windmill style to the spindle or waterwheel style akin to the fans on G80 cards. Size-wise, the card is almost exactly the same length (9 inches) as the X1950 XTX and just a hair longer than the 8800 GTS. On the page after this you can see an array of various cards pictured together to give you an idea of scale.

The 2900XT board also has a dual-everything arrangement going on, with dual DVI ports, dual PCI-e power connectors and dual CrossFire connectors. None of this is new - we've seen dual 12-bit CrossFire connectors before - but these are necessary logistics that should be mentioned.
 


HD 2900 XT on top, X1950 XTX on bottom


The HD 2900 XT and GTS OC cards one beside the other

In keeping with the namesake of the new card (HD replacing the X), ATI has made significant enhancements to the video quality and high definition abilities of the new card. The Unified Video Decoder, or UVD, debuted some time ago on an ATI OEM video card and, in short, acts as a GPU offload that takes some video playback computations off of your CPU. UVD is supposedly most effective when paired with a slower CPU, so that the pair can handle high bitrate decoding together and suffer fewer bottlenecks. This is very similar to NVIDIA's PureVideo HD - nearly the same process, in fact. Now ATI has implemented UVD in the silicon of all its new GPUs except the 2900XT, which is inexplicable to me. The suggestion is that a 2900XT would already be paired with a powerful enough CPU to handle HD-DVD or Blu-ray decoding on its own. This is also the same thinking that led NVIDIA not to include its newest PureVideo HD offloading support in the 8800 GTX.



UVD is part of the new ATI Avivo HD technology, which was developed to utilize the new GPUs for more than just gaming. The diagram above shows the decoding process of the newly released GeForce 8600/8500 series in comparison to the UVD scheme. Note how the 8600/8500 GPUs will not actually do the bitstream processing and entropy decoding with their BSP, while ATI's UVD will participate in that part of the chain. VC-1 is an alternative to H.264 for HD content, and decoding of VC-1 is mandatory for all Blu-ray and HD-DVD devices, so ATI's solution can be said to be more complete than NVIDIA's. ATI also claims UVD reduces the processing burden on both the CPU and GPU pipelines, thus reducing power draw. I personally hope this is true, but I still find myself cautious about any power reduction claims. UVD certainly sounds good, just as PureVideo HD does, but I'll hold my verdict on it until there are high rates of HD-DVD/Blu-ray adoption and until we actually try it out.

A couple of other HD additions are HDCP/HDMI adaptation and the audio controller. While the HD 2900XT only has the standard dual DVI ports on the back of the card, a special HDMI dongle is bundled with it. This HDMI dongle is literally one of the coolest innovations we've seen in a while: not only will it carry the HDMI video signal, it also incorporates the audio signal, so home theater enthusiasts will really get the most out of this card. Also, while HDMI is pretty straightforward, the HDCP and audio controller components of this card are a little harder to explain. Basically, ATI has developed a cheaper way to implement HDCP decryption on their cards without CryptoROMs (ATI has mentioned HDCP will be on all their new card series). With HDMI and HDCP taken care of, ATI took the extra step of including an audio controller right on the card. Just as the diagram below suggests, any of the 2000 series cards will natively support audio through HDMI in a Windows Vista compliant manner.



A couple of closing notes about the new R600 series: ATI will also be releasing a new set of GPUs based on the HD 2600 XT on down. ATI mentions in its press kit that the 65nm fabrication and the new PowerPlay 7.0 power management software will work together to lower power usage and increase idle efficiency. Lastly, the new architecture will also improve Folding@Home GPU performance. There has been a lot of buzz lately about the PS3 and its stellar Cell processor folding performance; it seems like ATI wants the same kind of splash with its new GPUs, and actually the Folding@Home project's initial work on folding with ATI GPUs is probably what inspired the PS3 folding project. Support for Folding@Home is a big draw for enthusiasts and has helped a lot of F@H participants retire folding networks of multiple older boxes in favor of new boxes powered by ATI GPUs.

The bundle that ATI includes with the card is pretty spectacular. You get two pieces in the package that are completely new and intriguing. First is a game voucher for the Valve-developed games Half-Life 2: Episode Two, Portal and Team Fortress 2, the last of which has been rumored to be in production as far back as 2004, when Counter-Strike: Source was released. This deal between ATI and Valve seems to be exclusive for the time being, so anyone who purchases the 2900XT during its launch will be among the first people to play the new games when they are released. This is an absolutely terrific addition to the card.

The second new piece of this package is the HDMI dongle. There are other HDMI converters on the market right now, but ATI claims that the specific one included with the 2900XT is capable of carrying both video and audio feeds. So while this dongle converts one of the DVI ports on the 2900XT, it acts as a fully functional HDMI port. Add that to the fact that the 2900XT also has an audio controller for native audio support (depicted on the previous page), and this card has amazing potential.

The rest of the bundled hardware can be seen below.

 

 The cluster of cards we tested against the 2900XT (running with the ATI beta Catalyst 8.37.4.2 drivers):

The HD 2900XT is now ATI's new flagship card, so we've added four extra benchmark settings to our regular setup:

The benchmark system used consists of:

The benchmarks used for testing were:

3D Mark 06

3DMark 06 is usually a pretty good judge of graphics performance characteristics. And though it may be a synthetic benchmark, it can help to indicate a card's performance focus. Case in point: Splinter Cell: Chaos Theory, a game that is dependent upon shader performance above all else.

To get 3DMark06 working with the HD 2900 XT you will have to use the "nosysteminfo" command line argument as shown below:

Now to see how the ATI Radeon HD 2900 XT did:

Vertex performance is good.

I would have expected a better fill rate.

The Radeon HD 2900 XT does pretty well here, but its little cousin, the X1950 XTX, doesn't seem to know how to process Shader Model 3.0 particles. Good thing the 2900 XT is compatible up to Shader Model 4.0.

That pixel shader score hurts.

But perlin noise rocks.

Far Cry






Splinter Cell: Chaos Theory





In Far Cry, the HD 2900 XT takes a pounding with 4xAA and 16xAF enabled. That is a little disappointing, but the card still manages playable framerates at the highest settings (2048x1536 with AA/AF on). We think this is a driver related issue that will almost certainly be fixed, because in all other cases the 2900 XT's raw horsepower would never allow such a poor showing, and you can see that the 2900 XT performed just a bit above the 8800 GTS at all the upper resolutions when AA/AF were not enabled. In Splinter Cell the 2900 XT comes in at an impressive second place, clearly surpassing the 8800 GTS. As mentioned on the previous page, Splinter Cell is a shader-bound game, and it seems that the 2900 XT excelled here.

Doom 3






Quake 4





In Doom 3, the 2900XT was again behind only the 8800 GTX in all settings and widely outperformed the 8800 GTS at resolutions of 1600x1200 and above. With AA/AF enabled the 2900 XT is significantly faster than the 8800 GTS across the board, with the performance gap increasing as the resolution goes up. At the crushing 2048x1536 resolution the HD 2900 XT is 21% faster than the 8800 GTS - clearly a storybook win for ATI here. Quake 4 was another story, as we determined that our benchmarking platform tops out in the mid-80s FPS. While you can ignore the non-AA/AF scores, the AA/AF-on scores are much more meaningful. Here the 2900XT and 8800 GTS are neck and neck, with the lead slightly in favor of the 8800 GTS. Remember, 8800 GTS performance is around where ATI wants this card.
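For readers curious how figures like that 21% lead are derived, it is just a relative FPS comparison. A minimal sketch (the FPS values below are hypothetical placeholders, not our measured numbers):

```python
# Relative performance lead of card A over card B, expressed in percent.

def percent_lead(fps_a, fps_b):
    """How much faster card A is than card B, as a percentage of B's score."""
    return (fps_a - fps_b) / fps_b * 100

# Hypothetical example: 60.5 FPS vs 50.0 FPS is a 21% lead.
print(f"{percent_lead(60.5, 50.0):.0f}% faster")
```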

F.E.A.R.





Prey





In F.E.A.R., the GTX takes the lead while the remaining cards straddle each other. The 8800 GTS will take F.E.A.R. over the 2900XT, but not by much. Prey has some exciting numbers as well, as the 2900XT somehow matches the GTX with AA/AF off. While it slips back to the more regular slot we have become accustomed to with AA/AF on, the 2900XT is impressive again. We also wonder whether the 2900 XT would do even better in the AA/AF tests with newer drivers.

X3





Company of Heroes





With these newer games, the results are more of a mixed bag than the previous benchmarks. ATI has traditionally performed really well in X3 for whatever reason, and that trend seems to be continuing. The 2900XT manages to be first with AA/AF off but crashes with AA/AF on. Seeing how the X1950 XTX holds strong in there, I've got to believe there is some driver issue with the 2900XT that is holding it back. This is becoming somewhat of a pattern, as the HD 2900 XT's performance with AA/AF enabled in some games appears to be disproportionately penalized relative to its performance with AA/AF disabled.

In Company of Heroes, perhaps the most texturally and graphically intensive game in our line-up, the 2900XT performs well beyond what it should and gives the GTX a very good run for its money. I'll admit I didn't believe what I saw at first, but I don't doubt these scores now. The Radeon HD 2900XT showed that it was a proficient card throughout all testing and is capable of scores through the roof.

Power Consumption




The 2900XT is an absolute beast and draws enough power for us to identify it as a typical ATI card. We have become accustomed to seeing ATI's power consumption skyrocket above NVIDIA's in the past, but the G80 series had us wondering whether ATI could knock them off again. In a way I'd like to congratulate ATI on regaining the crown of least power efficient video card available, though I still hope they can somehow shrink the 2900XT GPU down to 65nm to reduce power draw. The HD 2900 XT clearly draws 52W more power than the X1950 XTX under load. Even at idle it draws 32W more than its predecessor, though it does actually have slightly (~5%) lower energy consumption at idle than both the 8800 GTX and GTS cards.

Final Thoughts


While ATI was met with some tough criticism even before the HD 2900XT launch, its newest flagship card has overcome some of the skepticism and left us with an overall positive impression. Of course, with any ATI product power consumption is always a focal point, and some media outlets had concerns about the 2900XT drawing more power than typical power supplies are able to output. Well, if you look at the top of the page, the power draw isn't as significant as first rumored, although it is still pretty high. CrossFire is more of a concern to me than single card usage, however, and seeing how there aren't any CrossFire certified power supplies for the 2000 series yet, I wonder whether picking up two 2900XTs right away is a good idea. Plus, CrossFire on the previous two generations of flagship Radeon products doesn't have an impressive stability track record as far as early adopters go, so it's likely prudent to wait for those CrossFire Certified announcements before investing in two of these cards.

One area that is yet to be explored is ATI's 512-bit memory bus and superior memory bandwidth, two things that should theoretically give it a strong competitive edge at higher resolutions and with larger textures. Because of this, driver improvements could give a dramatic performance boost to the card - added gravy considering its already not inconsiderable performance. Also of note is the pattern we see emerging of the HD 2900 XT being penalized disproportionately when AA/AF is enabled in certain games. Considering ATI's traditionally superb AA/AF efficiency, we suspect this is a driver issue that, when resolved, will reveal an even more positive picture for the R600.



Probably the most important factor going into purchasing a new video card is value and pricing. From our benchmarking results, we can conclude that the 2900XT will outperform the GeForce 8800 GTS 640MB in some cases and lose in others. To say whether one is better than the other right now would be to neglect the fact that none of our games are DX10 compatible - in fact no games are DX10 compatible currently (although the world's first downloadable DX10 playable demo will be available this week or next, which ups the ante quite a bit). So what we can take into account is value. From a bundling standpoint alone, the 2900XT has several package items that will be of value to most if not all people buying the card. The gaming voucher from Valve will allow you to play all three new releases from the giant game developer, and apparently a fourth game, Day of Defeat: Source, will be part of the voucher as well. All of these games are produced on the Source engine and have historic backgrounds (minus Portal). Secondly, the HDMI converter is a sweet addition which works in tandem with the on-chip audio controller to carry both audio and video feeds. If anything is lacking for the G80 series right now, it is HDMI availability.

Pricing of the 2900XT is rather ambiguous at the moment; while the MSRP of the card is $399, availability and vendor stock have skewed the price a little higher than expected. Of course, when ATI can produce more of the cards the pricing will stabilize. Then again, NVIDIA hasn't been shy in the past about dropping its prices when a competitor's card is released. As it stands now, there is more of a price discrepancy than ATI would like. Comparing the lowest costing 2900XT against the lowest costing 8800 GTS 640MB at several online vendors, there seems to be a $50-100 difference between the two in favor of the 8800 GTS. Just another factor to weigh in. However, with the 2900XT, just as with all other launch hardware, we must take future considerations into account. With the arrival of Windows Vista and DirectX 10, both ATI and NVIDIA have been looking towards the future just as much as they look at today. And with their new architectures, much of the new functionality has more to do with theoretical capability than proven quality. Much of what the 2900XT will be tested for simply cannot happen now; we just have to wait.

Still, the new architecture of the 2000 series has a heightened excitement to it, pricing and bundle items aside. Of course ATI countered NVIDIA's PureVideo HD with its own version of HD decoding using UVD (both of which are present in the midrange products rather than the flagships). New image quality features, including Custom Filter AA and Edge Detection, have sparked interest in developmental patching to enhance texture quality as time goes on. The list goes on with the tessellation geometry implementation, the improved ring bus, the expanded 512-bit memory bus, a 2.5x increase in Folding@Home capability, etc. But everything I just mentioned is heavily dependent on something that isn't right with the new 2000 series at the moment, and that is the drivers. With any launch, products are rushed out the door and some parts of development don't get the attention they deserve due to time constraints (kind of like this review). However, unlike this review, which won't be changed every month after its publication, ATI has a chance to improve performance and overcome some of the HD 2900 XT's current faults. We had some problems with the card's current Catalyst 8.37.4.2 drivers during benchmarking, and while I won't go into specifics, they were cumbersome to the whole testing process. If ATI's engineers can consistently patch the card up, we'll see some better performance.

The new line-up of cards has yet to fully reveal itself, and with the mid-range 2600 coming out in about a month, we'll see what the R600 architecture can really do in the more mainstream market. Right now the HD 2900 XT has left us with a positive, and curious, impression. As a new flagship it isn't going to displace the much more expensive 8800 GTX, but it offers a sweet and appealing alternative to those considering the 8800 GTS 640MB cards. It's also a fair assumption that there will be a new flagship card from ATI some time towards the latter part of this year as it addresses the hardcore enthusiast gap in its lineup. As it stands right now, the 2900XT is worthy of a purchase, and so is the 8800 GTS.

For Canadians who are looking to buy the HD 2900 XT, you might want to check out NCIX's HD 2900 XT page, since they are one of the only retailers in Canada with availability right at launch. US readers can refer to the US lowest price listings and the usual e-tailer culprits.
