Multi-GPU Performance in DirectX 10 Review

Wednesday, September 19th, 2007

Comments

Tweaker Sep 20, 07
Nice article, but you got the ATI driver naming scheme wrong... next month is Catalyst 7.10, not 8.0.

quote kspiess
-- a Catalyst 8.0 release could significantly change these results.
kspiess Sep 20, 07
Thanks! Good point.
Tweaker Sep 20, 07
Sorry, I didn't mean to be rude by pointing it out. I still think it's a really good article, because it shows just how much ATI and NVIDIA really care about their multi-GPU performance.
kspiess Sep 20, 07
Oh, I didn't find your comment rude at all. Actually, I took your advice and changed the line to "a new Catalyst release could ..."

I originally mentioned 8.0 because I meant to communicate that 'somewhere down the road' a forthcoming Catalyst release might fix things up. But yeah, "a new Catalyst release..." sounds better.

Funny thing about W.i.C.: I was surprised to see such poor performance for the HD 2900 XT in my article (for W.i.C.). I originally ran a bench or two of W.i.C. with the 7.8 drivers, and I *think* (just going on memory) that it ran fine. Hmm... oh well. ATI really needs to get those drivers going.

It just doesn't seem right to advertise all the benefits of CrossFire when, in reality, CrossFire barely seems to add any bonus at all in Vista. I feel sorry for someone (maybe someone who doesn't read up on articles such as mine) who bought an extra HD 2900 XT, plugged it into their computer, and then saw their performance DROP.

Tweaker Sep 21, 07
I can usually stand one $400 kick-in-the-balls mistake, but two at the same time is just damning. XD

I think both companies should just drop the dual-GPU concept unless they have a driver team that can work separately on multi-GPU drivers that actually perform better than a single card. It seems like a waste of resources to me if they still can't get the performance right on the third generation of cards that support multi-GPU connectivity, if 3DFX didn't give them the hint already...
axforts2212 Sep 21, 07
Multi-GPU is a waste any way you slice it these days. The lack of good drivers makes it utterly useless. Performance at best is a 30-50% increase, not what you should get for double the money. My philosophy is one card, and later, when it's showing its age, buy the best out there and let that hold you for a while.

Oh, and don't forget, it's more than a $400 kick in the balls.

Plus the money for the CrossFire-"certified" 800+ watt PSU.
Iceguy2003 Sep 21, 07
Most of us Neoseekers have known this for a while, but seeing it laid out like this makes me even sicker. NVIDIA/ATI should quit lying back with their feet on their desks counting money and maybe put that money towards their CrossFire/SLI departments.

It COULD be at least an 80% increase if someone knew what they were doing. This isn't about video cards, but, for example, my older Seagate Barracuda 7200.6s hit a max of 60 MB/s apiece, and when I put them in RAID 0, I hit a max of 114 MB/s. EXPLAIN THAT? People can say what they want about VIA chipsets, but my RAID 0 setup is on the VIA 8237 chipset, and personally, I don't think I lost a bit of performance.

Again, I understand that isn't video cards, BUT it shows that 30% increases from two cards are not acceptable.
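For what it's worth, here is a back-of-the-envelope way to compare the two situations. It only uses the figures quoted in this thread, and the snippet itself is just an illustration, not a measurement:

```python
# Rough "scaling efficiency": how close a multi-device setup gets to ideal
# N-way scaling. The figures below are the ones quoted in this thread, not
# measurements of my own.

def scaling_efficiency(single, combined, count=2):
    """Fraction of perfect N-way scaling actually achieved."""
    return combined / (single * count)

raid0 = scaling_efficiency(60, 114)    # two ~60 MB/s drives hitting ~114 MB/s
gpus  = scaling_efficiency(1.0, 1.3)   # a second GPU adding only ~30% FPS

print(f"RAID 0: {raid0:.0%} of ideal; dual GPU at +30%: {gpus:.0%} of ideal")
# -> RAID 0: 95% of ideal; dual GPU at +30%: 65% of ideal
```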
Scott_ill Sep 21, 07
Unless dual-GPU support is across the board and yields at least a 50% performance gain in all games, there's just no point.

Imagine you get to the stage where one card isn't enough to play a game smoothly and you're relying on the performance boost of your second card, and then it turns out that game doesn't support SLI/CrossFire.

Far too hit and (mainly) miss.

The article was a good read though, thanks.
URLORD Sep 21, 07
Do your numbers represent an average or a peak framerate? I checked out a review at HardOCP and it showed a much higher result for the GTX in BioShock. And overall, benchmarking with demos doesn't represent real-world results (World in Conflict).
Iceguy2003 Sep 21, 07
But it does show the percentage increase that dual-video-card setups give you. And it isn't much, depending on the game.
kspiess Sep 21, 07
quote URLORD
Do your numbers represent an average or a peak framerate? I checked out a review at HardOCP and it showed a much higher result for the GTX in BioShock. And overall, benchmarking with demos doesn't represent real-world results (World in Conflict).
Hi URLord,

In regards to BioShock, they are the average framerates. Personally, I don't agree with the statement that the OCP benchmark had much higher results than our own. There are two important things here: 1) The OCP GTX 16x12 bench spikes to ninety-some frames per second, but the actual FPS average would probably be somewhere around, I don't know (hard to judge from the graph), maybe 70 FPS? 2) They benchmarked a different area of the game, so really, comparing the average frame rates across two different benchmarks is not that useful. I have not seen the part of the game they benchmarked (it is a later part of the game), so it is really hard for me to say anything about the comparison, but I can tell you that in my benchmark there are some really huge models moving around (a big whale comes to mind) and it seems like a fairly graphically intensive part of the game, which would lead to lower average framerates. Additionally, they are running at 2.93 GHz, which will change things a bit too.
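To make the peak-versus-average distinction concrete, here is a tiny sketch with made-up per-second FPS samples (illustrative numbers only, not data from either benchmark):

```python
# Hypothetical per-second FPS samples from one benchmark run (made up).
samples = [62, 68, 71, 90, 74, 66, 59, 73]

peak = max(samples)                    # the spike you see on a graph (90 here)
average = sum(samples) / len(samples)  # the figure an average-FPS chart reports

print(f"peak: {peak} FPS, average: {average:.1f} FPS")
# -> peak: 90 FPS, average: 70.4 FPS
```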

As for the second part of your comment ("And overall, benchmarking with demos doesn't represent real-world results (World in Conflict)"), I do not think that is very fair. Our World in Conflict Demo benchmark is a real-world result. It isn't a World in Conflict benchmark; it is a World in Conflict Demo benchmark.

Please note that the full game had not been released at the time that article was written.

I'd also like to say that the benchmark in the demo is a very good one, and a great test for the video cards. Furthermore, performance in the demo's benchmark will definitely relate to the performance people will see with the full version of the game. I can't confirm this (I don't own the game), but the demo's benchmark and the full game's benchmark are probably the exact same benchmark.

Thanks for your comments. I'm always open to criticism...

This message was edited by kspiess on Sep 21 2007.
kspiess Sep 21, 07
quote Scott_ill
Unless dual-GPU support is across the board and yields at least a 50% performance gain in all games, there's just no point.

Imagine you get to the stage where one card isn't enough to play a game smoothly and you're relying on the performance boost of your second card, and then it turns out that game doesn't support SLI/CrossFire.

Far too hit and (mainly) miss.

The article was a good read though, thanks.
Thanks.
Alexvolcan Oct 31, 07
Could you update this marvellous benchmark with the 8800 GT and/or add games like the Crysis demo?
And with the latest drivers: ATI & NVIDIA have made a big effort on SLI/CrossFire in their latest releases.