Snip...
No probs, like you said it was beat to death in the other thread.

Just some friendly advice, though: you probably shouldn't have commented on the other thread or DM'd if you wanted to avoid another debate.
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
He must be ill or something.
Thankfully he's stayed away since he was called out in a previous thread.
But, if you really want to, you can construct your own drunkenmaster post (if you have the stamina to write a little essay composed of two-line paragraphs). After all, you know exactly what he would say...
But, on-topic, I'll be interested to see what the Fermi refresh brings to the table. I doubt it will be able to compete with AMD in the efficiency stakes, and frankly, I will be (pleasantly) surprised if we see it this year. My completely subjective feeling is that AMD has things in the bag for the rest of this generation, but will have a mountain to climb to match nvidia in the next, given that nvidia has already taken the first iteration of their "completely new" architecture through with Fermi. A switch to GlobalFoundries could potentially work to AMD's advantage in the next generation, though.
Nvidia admitted defeat.
I've never seen you make a post that makes sense, ever.
As for Biased, I'm not
That is where Nvidia failed, MISERABLY
how many spins did it take before Nvidia admitted defeat
it was late, used a lot of power and had lower yields than they'd have liked, and most of the series was sold at a loss, most of which stemmed from the core being too big.
being essentially a copied design, and having little else to do
Anyway...
Still sounds like a lot of hot air from Nvidia though. Even if the parts are slower than ATI's, they won't be 'that' much slower.
To be fair, there is no hot air coming from Nvidia on this; it's all from crappy, disreputable news sites.
All this is true, but how come so many games run better on Nvidia cards????
All this is true, but how come so many games run better on Nvidia cards????
Like?
How nice for you.
Sure... In just one post:
...quite apart from having no evidence it was sold at a loss
If you can't see the above as examples of a biased and over-simplified viewpoint, then there is no hope for you. Try writing without emotive language; it will add considerable weight to the points you are trying to make.

Biased? Do you think something is biased just because it's not positive? Instead of "biased", maybe you mean "negative", as that would make far more sense. If they've messed up and people talk about it, that's not bias.
...but to dismiss it outright as a "miserable failure" is short-sighted. As I have repeatedly highlighted, GPU architectures are designed to scale over multiple generations. Dramatic reformulations are often unsuccessful in their first iteration (the FX5800 and R600, for example), yet deliver excellent results with minor tweaks on more refined manufacturing processes (NV40/G70, R700/R800). Only time will tell whether the architecture succeeds, and declaring it a failure at the first iteration is premature, particularly when the reasons for its apparent shortcomings are not available to anyone outside nvidia's design team.