
1GHz 5870 18% faster than GTX480 in Heaven with Tessellation off!

[attached benchmark graphs]


Love the scale of the Unigine results
 
Much better min FPS on Crysis Warhead though.

But neither is playable. I always laugh at silly results in very demanding games or at extreme resolutions: one card gets 5 fps, but hey, this card gives you a 100% gain and you get 10 fps.

Nobody is going to run at those settings on either card.

And those results with 8xaa seem a little off for the 5870 anyway.
 
That graph shows exactly why you shouldn't believe any test until the card is released.

The green bar is double the size of the red bar but represents only around 50 points more; it's laughable. :rolleyes:

Yet in games the ATI card seems to hold a good advantage. Like greebo says, when the ATI card falters, the minimum and average frames mean little, as at those values it's not playable and it's maybe down to lack of memory. Looking at those results overall, though, the 5870 looks to be a good bit faster at normal game settings. If true, this is just another nail in Fermi's coffin, with all these results coming out (apart from Unigine) showing Fermi not to be faster than the 5-series.
 
Having done a little research, that minimum value does appear to be correct(ish), due to the lack of VRAM I would guess. The average values don't seem to match any other figures I can find for either card, though.
 
Having done a little research, that minimum value does appear to be correct(ish), due to the lack of VRAM I would guess. The average values don't seem to match any other figures I can find for either card, though.

You have to admit though, Rroff, in most if not all cases leaked benches close to launch are not too far off the mark, and outside Unigine these GTX cards don't look too promising. If these cards were good we would have seen something good in games by now.
 
I don't understand this really. Have Nvidia spent all this time making a card that is only a bit faster than a GTX 285?? :confused:
 
Looks like the GTX480/5870 are gonna be trading wins...

Personally I'm waiting for Fermi's smaller derivatives to appear so 5770 prices might drop a bit...
 
Much better min FPS on Crysis Warhead though.

Again, people don't seem to understand minimum frame rates.

If the average is, say, 80FPS and the max 120FPS, the minimum could be anything from 79 down to less than 1FPS without the card being a poor performer.

It only takes the FPS to drop to that number once, for a fraction of a second, and there's your minimum FPS.

You could be playing for hours at 80FPS, then go into a new area and the FPS dips to 2FPS for a whole second while new data is loaded into RAM.
 
Looks like the GTX480/5870 are gonna be trading wins...

Personally I'm waiting for Fermi's smaller derivatives to appear so 5770 prices might drop a bit...

That will probably not happen.

I'd suspect you'd get some GT200 based cards to fill in the lower performance areas to be honest.
 
If NV had a stonker of a card they would have leaked samples out to reviewers, like they have done in the past, to big it up before release; alas, less than three weeks left and not a whistle, which says it all really.
 
Again, people don't seem to understand minimum frame rates.

If the average is, say, 80FPS and the max 120FPS, the minimum could be anything from 79 down to less than 1FPS without the card being a poor performer.

It only takes the FPS to drop to that number once, for a fraction of a second, and there's your minimum FPS.

You could be playing for hours at 80FPS, then go into a new area and the FPS dips to 2FPS for a whole second while new data is loaded into RAM.

That's why we've said many times on these forums that min FPS means nothing without percentages in each band, i.e. 80% of the time over 50fps, etc., or a graph of FPS over time.
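The point above can be sketched in a few lines of Python, using simulated (not real benchmark) frame-rate samples: one one-second loading stall in an hour-long run drags the minimum down to 2FPS even though the card holds ~80FPS more than 99.9% of the time.

```python
import random

random.seed(0)

# Simulate a run: roughly 80 FPS throughout, sampled once per second for an hour.
fps_samples = [random.uniform(75, 85) for _ in range(3600)]
fps_samples[1800] = 2.0  # a single one-second dip while new data loads into RAM

min_fps = min(fps_samples)
avg_fps = sum(fps_samples) / len(fps_samples)

# Percentage of time spent above a threshold -- the "band" figure suggested above.
def pct_above(samples, threshold):
    return 100.0 * sum(1 for s in samples if s > threshold) / len(samples)

print(f"min: {min_fps:.1f} FPS")    # dominated entirely by the one stall
print(f"avg: {avg_fps:.1f} FPS")    # barely affected by it
print(f"time above 50 FPS: {pct_above(fps_samples, 50):.2f}%")
```

The minimum reads as an alarming 2FPS while the average and the time-above-50FPS figure show the run was essentially smooth, which is exactly why a single minimum number is meaningless on its own.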
 
I don't understand this really. Have Nvidia spent all this time making a card that is only a bit faster than a GTX 285?? :confused:

According to Charlie (take it or leave it):

'The GF100 GTX480 was not meant to be a GPU, it was a GPGPU chip pulled into service for graphics when the other plans at Nvidia failed.'

This now looks to be about right, I think (if the current results prove accurate once we have plenty of analysis post-26th). Fermi is certainly going to be a total monster for CUDA/OpenCL, but it seems that a good chunk of the 3b+ transistors are not as useful for gaming as we might ideally like.

I'm still interested to see how things look when they are launched; there are still a lot of unanswered questions, possible performance increases with future drivers, etc. But for now it looks like it will just be a good gaming card, not an incredible one.
 
That will probably not happen.

I'd suspect you'd get some GT200 based cards to fill in the lower performance areas to be honest.

They can't really rebadge the GT200 as it would be DX10. Nvidia do have a price point they can aim for: between £125 and £175. If they can get a handle on 40nm and release a card with GTX275/4890 performance for £150, then I feel they would be onto a winner, as ATI have left that segment wide open with the 5830 disaster.
 
According to Charlie:

'The GF100 GTX480 was not meant to be a GPU, it was a GPGPU chip pulled into service for graphics when the other plans at Nvidia failed.'

This now looks to be about right, I think (if the current results prove accurate once we have plenty of analysis post-26th). Fermi is certainly going to be a total monster for CUDA/OpenCL, but it seems that a good chunk of the 3b+ transistors are not as useful for gaming as we might ideally like.

I'm still interested to see how things look when they are launched; there are still a lot of unanswered questions, possible performance increases with future drivers, etc. But for now it looks like it will just be a good gaming card, not an incredible one.

That is somewhat correct: nVidia originally planned to follow up the 200 series with a 40nm DX10.1 core. The GF100 design was never originally intended to go up against Evergreen, though I can't tell you whether it was originally a GPGPU design or intended as the next generation after the GT212.
 