
ATI ahead of Nvidia in new gfx so far

Sadly, benchmarks mean very little compared to gaming. Hopefully, though, the same lead will show up in games. It's about time we had some competition in the market.
 
Dual GPU vs Single GPU != Fair Test.

That doesn't matter.
If the cards are priced at the same point, then that's what matters.

If ATI have 100 small GPUs on one card and Nvidia have one fat one, and both are priced the same, is that fair?
Of course.
 
That's good: ATI lets partners decide clocks themselves, so cards can come out overclocked high, with some insane clocks and custom coolers.

Me, I'm waiting for the Sapphire Toxic 4870X2 :eek:
 
Everything about Nvidia's lineup now suggests it will be mostly based on paper whilst they frantically prepare a 55nm shrink.

I wonder if we're about to see a repeat of the 5800 Ultra; it wasn't that long ago, so you'd think Nvidia would remember...
 
Yeah but they have to get something out to compete with the latest ATI cards.

Other than that I think they would have waited until they had ironed out their problems with the die shrink.

The early cards are just going to be short-lived, hot, poor-overclocking cards IMO, and I am going to steer clear of them.
 
Sometimes the R700 will drop to half the performance when Crossfire isn't working. There's also microstutter with dual-GPU cards.

That was fixed a long time ago and was nothing to do with multi-GPU; it was a problem with the drivers downclocking on the 3800 series. You would get stutter on a single GPU, so the problem would still have been there with multi-GPU :rolleyes:
 
I'd still take a single GTX 280 over it, as the R700 is only going to be faster in games that support CrossFireX; in games that don't, the GTX 280 will murder it, as it runs at its full speed all the time, being a proper single-GPU card.
 
Dual GPU vs Single GPU != Fair Test.

Sure it is. Particularly so in R700 vs. GTX 280: two GT200 dies could not feasibly be put on one card, firstly due to their enormous size, and secondly due to the stupendous amount of heat they would put out, whereas two RV770s supposedly can be.

Edit: On that note, don't expect the 55nm shrink of the GTX 280 to let nVidia put two GT200b's on one solution; 65nm to 55nm is nothing compared to the 80nm-to-55nm shrink ATi did on the 2900 XT to make the 3870...
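To put rough numbers on why those two shrinks are so different: in the idealised case, die area scales with the square of the feature size. This is only a back-of-the-envelope sketch; real shrinks don't scale perfectly, and the ratios below ignore layout and pad-limit effects.

```python
# Idealised die-area scaling: area shrinks with the square of the
# feature size. Real-world shrinks do worse than this.

def ideal_area_ratio(old_nm, new_nm):
    """Fraction of the original die area after a perfect optical shrink."""
    return (new_nm / old_nm) ** 2

print(round(ideal_area_ratio(80, 55), 2))  # 2900 XT -> 3870 (80nm -> 55nm): ~0.47
print(round(ideal_area_ratio(65, 55), 2))  # GT200 -> GT200b (65nm -> 55nm): ~0.72
```

So even in the best case, the 65nm-to-55nm move only trims GT200 to roughly three-quarters of its area, versus the near-halving ATi got going from 80nm to 55nm.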
 
Is this the first time ATI or Nvidia have told their partners they can choose speeds, memory types, etc. without their own input? It could be interesting to see lower-clocked chips sold cheaper alongside those 'super overclocked' items.
 
There's more to it than that. Multi-GPU setups suffer from frame-timing issues.

http://www.pcgameshardware.de/?article_id=631668
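The frame-timing problem is easiest to see in the frame-to-frame intervals themselves. A minimal sketch (with made-up numbers, not taken from the linked article) of how microstutter can hide behind a healthy average FPS when two GPUs deliver frames in short/long pairs:

```python
# Hypothetical illustration of microstutter with alternate-frame rendering:
# average FPS looks fine, but frames arrive in alternating short/long gaps.

def frame_times_ms(timestamps_ms):
    """Convert per-frame timestamps into frame-to-frame intervals."""
    return [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]

def microstutter_ratio(intervals):
    """Longest interval over shortest; 1.0 means perfectly even pacing."""
    return max(intervals) / min(intervals)

# Even pacing at 50 FPS: one frame every 20 ms.
even = [0, 20, 40, 60, 80, 100]
# Same average FPS, but frames land in 5 ms / 35 ms pairs.
uneven = [0, 5, 40, 45, 80, 85]

print(microstutter_ratio(frame_times_ms(even)))    # 1.0
print(microstutter_ratio(frame_times_ms(uneven)))  # 7.0
```

Both traces average 50 FPS, but the second one is perceived closer to the long 35 ms gaps, which is the whole complaint about microstutter.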

Yes, but when the 3800 line came out it was severe, and it was down to PowerPlay in the drivers, not microstutter, which people mistook it for at the time. I noticed it from the get-go on single and multi 3800-line GPUs in COD4, like many others; now I don't notice it at all, and I'm on 3x CrossFire with 3800-line cards. I never noticed it on my 1900XTM + 1900XTX with COD4 either, and never saw complaints about it then, or the word 'microstutter'.
As far as I'm concerned, when it's not noticeable and has no ill effects, it's sorted.

And I would say this awareness is down to a combination of the PowerPlay issue on the 3800 line and multi-GPU becoming more popular at the same time: people assumed the problem was all down to multi-GPU rather than PowerPlay, as it was their first multi-GPU setup.

The timing issue has always been there with multi-GPU, but only now has it been given a name; if it were not for the PowerPlay problem in the first place, most people would be none the wiser, or even care.

As with many things, if it gets to the point where you need a benchmark or probing tools to notice the imperfections, and there are no ill effects, then it's doing its intended job.
 