AMD 6990 Slogan.

IMO, as for the 'done right and done wrong' stuff:

GTX 400 range = Fermi done terribly wrong
HD 5000 range = DX11 done brilliantly
GTX 460 / 500 range = Fermi done brilliantly
HD 6000 range = A disappointment, hardly any better than the cards it replaced.

The 500s and 6000s are just refinements of the previous generation, and like you said the 400s were badly done but the 5000s were brilliant, so Nvidia had more room for improvement.

If the 5000 series was complete rubbish, would the 6000 series magically become better in people's opinions?
 
The 400s did DX11 better than the 5000 cards when you look at performance in one of DX11's main features, tessellation. One could say the 500 cards are hardly any better than the cards they replaced; Nvidia had to boost clocks to make the performance difference not so mediocre.

Price, power consumption, and heat output relative to performance were absolutely dire, and they still are on the GTX 470 and 480. Back when those cards were released, most people who had been waiting were disappointed and bought a 5850 instead.

Performance alone doesn't make a good graphics card.

The 500s and 6000s are just refinements of the previous generation, and like you said the 400s were badly done but the 5000s were brilliant, so Nvidia had more room for improvement.

If the 5000 series was complete rubbish, would the 6000 series magically become better in people's opinions?

Yeah, I suppose that's why the GTX 500s seem a lot better and the 6900s don't when comparing them to the previous gen.
 
The 400s did DX11 better than the 5000 cards when you look at performance in one of DX11's main features, tessellation. One could say the 500 cards are hardly any better than the cards they replaced; Nvidia had to boost clocks to make the performance difference not so mediocre.

AND that feature is the one used the least, so what does it matter when, by the time it is used in a more meaningful way, these cards will just be a memory?
So what's the point in making a GPU twice the size to be better at a feature that will hardly be used in its lifetime?

If I was offered replacement 5xxx chips that took out the tessellation but gave me more of the GPU performance that all games generally use, then I would gladly give up tessellation.
 
An overclocked 480 destroyed an overclocked 5870; I would take the extra performance over power consumption any time. It's not like most people game 24/7.
 
An overclocked 480 destroyed an overclocked 5870; I would take the extra performance over power consumption any time. It's not like most people game 24/7.

Good for you. The vast majority of gamers who buy mid-range cards, however, wouldn't choose to do that. By the time a single 5850 is obsolete, so will your GTX 480 be.

I wouldn't ever have put a GTX 470 or a 480 in my PC even if they were given to me for free. I would rather have sold them and kept my GTX 460 - 560 instead.
 
I wouldn't ever have put a GTX 470 or a 480 in my PC even if they were given to me for free. I would rather have sold them and kept my GTX 460 - 560 instead.

I agree, noise/heat is an issue for me as well; I'd never go anywhere near a 470/480. If I wanted to upgrade my 460 I'd probably get a 560 or an MSI 6950 Twin Frozr II, which seems to run a lot cooler (and therefore quieter) than reference 6950 cards.
 
Not meaning to start trouble here, but I thought it was AMD/ATI's game plan not to bother with huge GPUs, not that they aren't capable.


Raven is very aware of that; he's just back on the fanboy bandwagon, flapping his gums at every opportunity.
 
What I don't understand is why people are so concerned about power usage/heat with cards of this calibre. Surely you know these cards aren't going to run eco-friendly before you buy one? Anyone care to shed some light on this, or am I wrong?
 
What I don't understand is why people are so concerned about power usage/heat with cards of this calibre. Surely you know these cards aren't going to run eco-friendly before you buy one? Anyone care to shed some light on this, or am I wrong?

You're right. If I gamed 24/7 then maybe I'd take power usage into consideration, but seeing as I game a few hours a day it makes little difference to me if I use 80 watts more power than the next card; at least it performs, and that's what counts.
 
You're right. If I gamed 24/7 then maybe I'd take power usage into consideration, but seeing as I game a few hours a day it makes little difference to me if I use 80 watts more power than the next card; at least it performs, and that's what counts.

My thoughts exactly. If you've got £500+ to buy a graphics card, you've got enough to pay the electricity bill.
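
For what it's worth, here's a rough back-of-the-envelope sketch of what an extra 80 watts actually costs to run; the hours per day and the price per kWh are illustrative assumptions of mine, not figures from this thread:

```python
# Rough estimate of the yearly cost of an extra 80 W while gaming.
# Hours per day and price per kWh are illustrative assumptions.

extra_watts = 80        # extra draw of the hungrier card under load
hours_per_day = 3       # assumed daily gaming time
price_per_kwh = 0.13    # assumed electricity price in GBP per kWh

extra_kwh_per_year = extra_watts / 1000 * hours_per_day * 365
extra_cost_per_year = extra_kwh_per_year * price_per_kwh

print(f"Extra energy: {extra_kwh_per_year:.1f} kWh/year")   # ~87.6 kWh/year
print(f"Extra cost:   £{extra_cost_per_year:.2f}/year")     # ~£11.39/year
```

On those assumptions it's roughly £11 a year, which rather backs up the "little difference" point.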
 
That annoys me a little. I believe it should say world's fastest DUAL graphics card, because it will be limited by CrossFire drivers for performance, and because that's what it is.

Agreed, it's like welding two Ferraris together front to back and claiming to have the world's fastest limousine; sure, it looks like one, but at the end of the day it's still two separate cars bolted together. :o

I will continue to compare such cards to two separate cards running in CrossFire/SLI rather than to the fastest single-GPU solution; they don't fool me. :p
 
There are plenty of cars out there with two engines that claim to be the most economical. They are still cars, and that's what they are called. The car that holds the fastest 0-60 time has two engines, one for the front wheels and one for the back, yet it's still a car. No one cares where the speed or economy comes from, the same way no one cares where their huge fps comes from, as long as the fps is there.
 
Those two-engine cars are prone to more problems and unreliable performance now and then compared to single-engine cars, same as CrossFire/SLI then. A single fast GPU is the path I will always take.
 
Those two-engine cars are prone to more problems and unreliable performance now and then compared to single-engine cars, same as CrossFire/SLI then. A single fast GPU is the path I will always take.

There are plenty with the same view as you, but there are plenty that love SLI/CF and the extra performance it brings. My mate has not had a single-card setup for around 5 years and tbh I can count the number of problems he has had on one hand. He usually has top-of-the-range cards, so if SLI/CF doesn't work he still has a top-end single card to fall back on anyway.
 
Agreed, it's like welding two Ferraris together front to back and claiming to have the world's fastest limousine; sure, it looks like one, but at the end of the day it's still two separate cars bolted together. :o

I will continue to compare such cards to two separate cards running in CrossFire/SLI rather than to the fastest single-GPU solution; they don't fool me. :p



Pretty nice Ferrari limo, not sure on its top speed though lol.
 
What I don't understand is why people are so concerned about power usage/heat with cards of this calibre. Surely you know these cards aren't going to run eco-friendly before you buy one? Anyone care to shed some light on this, or am I wrong?

No one really cares about power on its own.

If one card uses 150W and another card uses 450W but is three times faster, who cares?

If one card uses 180W and costs £300, and another card uses 300W, costs £450, and only offers 10% more performance, the power/heat/noise/price just become unacceptable FOR THE LEVEL OF PERFORMANCE OFFERED.
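
To put rough numbers on that hypothetical, here's a quick sketch comparing the two cards on performance per watt and per pound; normalising the cheaper card's performance to 1.0 is my own choice, not something from the post:

```python
# Value comparison for the two hypothetical cards above.
# Performance is normalised so the cheaper card scores 1.0.

cards = {
    "180W / £300 card": {"perf": 1.00, "watts": 180, "price_gbp": 300},
    "300W / £450 card": {"perf": 1.10, "watts": 300, "price_gbp": 450},  # 10% faster
}

for name, card in cards.items():
    perf_per_watt = card["perf"] / card["watts"]
    perf_per_pound = card["perf"] / card["price_gbp"]
    print(f"{name}: {perf_per_watt:.4f} perf/W, {perf_per_pound:.4f} perf/£")

# 180W / £300 card: 0.0056 perf/W, 0.0033 perf/£
# 300W / £450 card: 0.0037 perf/W, 0.0024 perf/£
# The dearer card is 10% faster but ~34% worse per watt and ~27% worse per pound.
```

The dearer card wins on raw performance, but on these (assumed) numbers it is clearly worse value, which is the point being made.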

If the £400+ GTX 480 on launch had offered 30-50% more performance, I'd have got one; it would have offered good value, and the power would have been going into something I could see and feel and consider worthwhile.

When it offered no real perceptible difference, power/heat were just the last in a long line of reasons in the "con" column when deciding whether you should buy.

Think of it more directly: if there were a 180W 5870 and a 250W 5870 that offered a 5% overclock but took a huge amount of extra power to do so, that heat and power translates into higher noise and a card that is harder to cool. Is it worth the 5%? No.

Heat and power aren't the "only" reasons Fermi was a poor choice last gen; they were the 5th and 6th reasons, not really important but still a factor.

As for power, it's worth noting that idle power draw and clocks are significantly higher on the GTX 480 if you have two screens. I'm not entirely sure how that stacks up with the GTX 580, and it's not a huge deal, but this whole screen-flickering and raised-clocks-for-dual-screens business is something both Nvidia and AMD got badly wrong, or maybe it's another rule/standard/spec/OS/driver thing they were forced into, I really don't know.

It's absurd that a GPU at 150MHz can deal with one screen while never hitting 1% load, but it can't deal with two screens, also with essentially no load, and has to clock up.

Dear god I hope AMD/Nvidia sort that out for the next gen, as it's one of the single most stupid things they got wrong, more so seeing as they both pushed this gen of cards as the "surround gaming" generation, so they were expecting people to run more than one screen.
 
You're not honestly suggesting that NVIDIA doesn't have the capability to build a dual-GPU card (last gen, or whenever), are you? Because if you are, then LOL.

I didn't say they couldn't, please show me where I did. However, Raven said it was sooooo easy, yet we did not see a dual card from last generation...

As opposed to the "AMD can do no wrong, Nvidia can do no right" camp ;)


TBH I don't think the GF100 was ever a viable option for a dual-GPU card; it needed refining, which it has had with the GF110. I think we will all be pleasantly surprised at how much cooler and quieter the 590 will run compared to a 6990. Nvidia have done their homework on GFX cooling after the stick they took over GF100.

Well, I trashed the 6990, I've had a go at their cooling, said the 69xx's were a little tame and been fairly unimpressed (yet performance landed exactly where I said it would for six months before release ;) ).

The thing is, why wasn't a dual GF100/GF110 viable? 2x250W cards is 2x250W cards; they could have cut them down further, or released a 2xGF110 that would clearly use significantly less power. We saw leaked design after design of PCB shots of dual GF100s and dual GF110s, we saw designs and attempts by AIBs for eight months or so, and no products.

It was always viable, it was always possible; the simple problem was that a dual GF110 would have been slower than a 5970, and a dual GF100 would have had to be heavily cut down to be thermally possible and maybe wouldn't beat a 5970. Frankly, why they couldn't release a triple-slot version with almost silent cooling I don't know.
 