
Instead of slating Nvidia...

ATI have had the more power-efficient cards ever since the 3870, I believe - correct me if I am wrong. The 4870 gave the GTX 260 a run for its money.

Actually, when both cards first came out the 4870 was the quicker of the two. A while later Nvidia revised the card to 216 SPs instead of 192 SPs to make it more competitive.
 
I'll explain this once...

I set my nVidia card device ID to "MySuper 10000 GTX" - I get no AA
I set my nVidia card device ID to default - I get AA
I set my ATI card device ID to "MySuper 10000 GTX" - I get no AA
I set my ATI card device ID to default - I get no AA
I set my ATI card device ID to "nVidia GeForce GTX260" - I get AA

Therefore to conclude from this that nVidia specifically locked out AA on ATI cards is a logical fallacy.

Is there a difference between locking out AA for ALL competitors and locking out AA for just ATI? No.

AA is AA, and AA will work on an NV or ATI card - AA is also in the DX spec.
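
To make the experiment quoted above concrete, here is a minimal hypothetical sketch of the kind of check that behaviour implies - AA gated on the reported adapter name string rather than on what the hardware can actually do. The "GeForce" substring match is an assumption for illustration, not anything taken from the game.

```cpp
// Hypothetical sketch only -- not taken from any real game. It shows a check
// that matches the observed behaviour: renaming an ATI card to
// "nVidia GeForce GTX260" passes, renaming an nVidia card to
// "MySuper 10000 GTX" fails, regardless of what the hardware can do.
#include <d3d9.h>    // Windows only; link with d3d9.lib
#include <cstdio>
#include <cstring>

bool ShowInGameAAOption(IDirect3D9* d3d)
{
    D3DADAPTER_IDENTIFIER9 id = {};
    if (FAILED(d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id)))
        return false;
    // Gate purely on the reported adapter name string.
    return strstr(id.Description, "GeForce") != nullptr;
}

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;
    printf("In-game AA option: %s\n", ShowInGameAAOption(d3d) ? "shown" : "hidden");
    d3d->Release();
    return 0;
}
```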
 
I've definitely got to take my hat off to them, and that's coming from an 'nvidia fanboy' lol. ATI have done a great job, and things are hotting up now ATI have dropped prices...
 
Is there a difference between locking out AA for ALL competitors and locking out AA for just ATI? No.

AA is AA, and AA will work on an NV or ATI card - AA is also in the DX spec.

You could argue there is no difference between only enabling it on their cards and locking out all competitors, but that again implies motive - where do you draw the line between enabling vendor-specific features and locking out the competitor?

AA isn't AA - there are all kinds of different implementations, problems and optimizations for anti-aliasing. A variant known as MSAA has become so common that people expect it to be there, but when you throw things like HDR into the picture it becomes trickier. AA in the DX spec is pretty naff at best - almost nothing actually uses it in its intended form.
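
A hedged illustration of the "MSAA + HDR gets trickier" point above: on D3D9-era hardware a game had to ask the runtime whether multisampling was even supported for a given render-target format, and FP16 (HDR) targets often failed that check on cards that handled 4x MSAA fine on plain 8-bit targets. This is a generic capability query, not code from any particular title.

```cpp
// Generic D3D9 capability query (Windows only; link with d3d9.lib).
#include <d3d9.h>
#include <cstdio>

static bool Supports4xMSAA(IDirect3D9* d3d, D3DFORMAT fmt)
{
    DWORD quality = 0;
    return SUCCEEDED(d3d->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, fmt, TRUE /*windowed*/,
        D3DMULTISAMPLE_4_SAMPLES, &quality));
}

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;
    // Plain 8-bit back buffer vs. an FP16 target of the sort HDR pipelines use.
    printf("4x MSAA on X8R8G8B8 (8-bit): %s\n",
           Supports4xMSAA(d3d, D3DFMT_X8R8G8B8) ? "yes" : "no");
    printf("4x MSAA on A16B16G16R16F (FP16 HDR): %s\n",
           Supports4xMSAA(d3d, D3DFMT_A16B16G16R16F) ? "yes" : "no");
    d3d->Release();
    return 0;
}
```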
 
The Anandtech Fermi article has only just started playing up and giving that error message in FF. It was working at 9pm UK time, because I was reading it while close to falling asleep.
 
It was happening earlier too - when I came home from work at around 2pm it was there, but later on, somewhere around 6-8pm, it was working without that message.
 
Instead of slating Nvidia, wouldn't it just be better to congratulate ATI? I mean, if ATI's 5 series cards sucked, everyone would be all over Fermi, because at the end of the day they are very powerful graphics cards.

I don't know how ATI pulled it off with a smaller budget compared to Nvidia, plus their rocky history, less development time, being 6 months early, etc. - the 5 series seem to be some cracking cards.

What do you think - have Nvidia failed, or have ATI outdone themselves this time round?

I agree, the 5000 series is a massive achievement for ATI.
 
You could argue there is no difference between only enabling it on their cards and locking out all competitors, but that again implies motive - where do you draw the line between enabling vendor-specific features and locking out the competitor?

AA isn't AA - there are all kinds of different implementations, problems and optimizations for anti-aliasing. A variant known as MSAA has become so common that people expect it to be there, but when you throw things like HDR into the picture it becomes trickier. AA in the DX spec is pretty naff at best - almost nothing actually uses it in its intended form.

AA is not a vendor-specific feature though; it is mostly an industry standard now and a feature expected of the latest titles. It is supported by both card makers, and their implementations are similar for the most part.
Making a game only work with NV AA is akin to making a game that will only run on Intel's x86 implementation because it is ever-so-slightly different from AMD's version (as Intel's compiler has shown).
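
A rough sketch of that compiler analogy: reading the CPUID vendor string, which is how a dispatcher can tell "GenuineIntel" from "AuthenticAMD" even though both run the same x86 code. GCC/Clang on x86 is assumed here; gating a fast path on this string, rather than on the feature flags CPUID also reports, is the behaviour being alluded to.

```cpp
// Reads the CPUID vendor string and shows a vendor-string gate (illustrative).
#include <cpuid.h>
#include <cstdio>
#include <cstring>

int main()
{
    unsigned int eax = 0, ebx = 0, ecx = 0, edx = 0;
    if (!__get_cpuid(0, &eax, &ebx, &ecx, &edx)) return 1;

    char vendor[13] = {};            // 12 characters plus terminator
    memcpy(vendor + 0, &ebx, 4);     // the vendor string is returned
    memcpy(vendor + 4, &edx, 4);     // in EBX, EDX, ECX order
    memcpy(vendor + 8, &ecx, 4);

    printf("CPU vendor: %s\n", vendor);
    printf("Vendor-string gate passed: %s\n",
           strcmp(vendor, "GenuineIntel") == 0 ? "yes" : "no");
    return 0;
}
```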
 
I'll explain this once...

I set my nVidia card device ID to "MySuper 10000 GTX" - I get no AA
I set my nVidia card device ID to default - I get AA
I set my ATI card device ID to "MySuper 10000 GTX" - I get no AA
I set my ATI card device ID to default - I get no AA
I set my ATI card device ID to "nVidia GeForce GTX260" - I get AA

Therefore to conclude from this that nVidia specifically locked out AA on ATI cards is a logical fallacy.

The implementation doesn't change the desired effect.
 
AA is not a vendor-specific feature though; it is mostly an industry standard now and a feature expected of the latest titles. It is supported by both card makers, and their implementations are similar for the most part.
Making a game only work with NV AA is akin to making a game that will only run on Intel's x86 implementation because it is ever-so-slightly different from AMD's version (as Intel's compiler has shown).

AA is very much a vendor-specific feature - take a look at ATI's variations (adaptive, coverage, etc.), nVidia's Quincunx, SLI AA, etc. - and when you come to deferred shading and anti-aliasing combinations there are all sorts of pitfalls. ME2 is a good example of this: it has no AA implementation for either ATI or nVidia and is left to driver hacks to make it work.

They didn't make a game that only worked with NV AA though. They made a game that was only tested with NV AA.

In fact, as ME2 is basically a branch of the same engine... imagine this scenario...

ME2 has no AA paths, so they invite nVidia and ATI to implement their own (as this is the most efficient way to get AA + HDR/deferred lighting working). ATI says sure and gets their version working ready for launch; nVidia dithers and delays and by launch still hasn't done anything... so the game ships with an option for AA when an ATI card is detected, but doesn't show that option when any other video card is detected... are people right to get up in arms saying ATI disabled AA on nVidia cards?
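
A minimal sketch of that scenario, with all type and function names made up: each vendor supplies its own AA path, and the menu option only appears when a path actually shipped for the detected GPU. The PCI vendor IDs 0x1002 and 0x10DE really are ATI/AMD and nVidia; everything else is hypothetical.

```cpp
// Hypothetical vendor AA-path selection; not from any real game.
#include <cstdint>
#include <cstdio>

enum class AAPath { None, AtiPath, NvPath };

// 0x1002 and 0x10DE are the real PCI vendor IDs for ATI/AMD and nVidia;
// the shipped/not-shipped flags stand in for "whose AA path made it in".
AAPath SelectAAPath(uint32_t pciVendorId, bool atiPathShipped, bool nvPathShipped)
{
    if (pciVendorId == 0x1002 && atiPathShipped) return AAPath::AtiPath;
    if (pciVendorId == 0x10DE && nvPathShipped)  return AAPath::NvPath;
    return AAPath::None;   // no vendor path ready -> no AA option in the menu
}

int main()
{
    // In the hypothetical scenario only the ATI path was ready for launch,
    // so an nVidia card sees no AA option even though nothing "disabled" it.
    bool show = SelectAAPath(0x10DE, /*atiPathShipped=*/true,
                             /*nvPathShipped=*/false) != AAPath::None;
    printf("AA option on an nVidia card: %s\n", show ? "shown" : "hidden");
    return 0;
}
```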
 
We congratulate ATI with all of our lovely money :p Methinks they'll have made a tidy sum these past months, and will continue to do so for a while...

Nvidia... well, we hurt them because we love them (well... maybe not kylew). All most of us want is healthy competition - it drives better products, better prices and happier people all round. Nvidia hasn't lived up to that, so we bash them and buy ATI instead...
 
They didn't make a game that only worked with NV AA though. They made a game that was only tested with NV AA.

So that makes it better? They still disabled a feature that was supported on non-NV setups - it doesn't matter how or why it was done, it was done - in exchange for funding from Nvidia.

Compare this to how other companies behave: they may fund a specific feature in a program, but they will generally allow the developer to include that feature even on their competitors' platforms.
 
Re-read my post one back - I added a scenario to highlight the point.


Just ad hoc enabling the nVidia AA path on other hardware is a recipe for disaster; it would have been a big no-no. As you can see from the ME2 thread a while back, trying to force AA on with unsupported drivers appeared to work OK, but in some areas you'd get massive slowdowns, AA not applied to certain edges, or even the game crashing.
 
So you are saying it's OK that NV paid to "get AA, which is BROADLY standard now, to work only on NV cards", rather than to "get AA added to the game because they felt that Batman was a big title that would drive more GPU sales"?


Other companies have been in front of commissions for monopoly abuse over actions like that, and have lost lawsuits over it - hence why the industry practice nowadays is to allow features to be used on all hardware.
 
I think, more to the point, AMD is the 'better' party here. They've been helping developers implement DX10.1 and DX11 effects that Nvidia cards can run without problems. Nvidia, by contrast, have been implementing effects specifically to run on their own hardware.
 
I think, more to the point, AMD is the 'better' party here. They've been helping developers implement DX10.1 and DX11 effects that Nvidia cards can run without problems. Nvidia, by contrast, have been implementing effects specifically to run on their own hardware.

To be fair, NV were so late to the 10.1 and 11 party that it didn't really matter for AMD
 
How do they have a smaller budget? It's AMD; they recently got that massive payoff from Intel to drop the antitrust lawsuit. If anything it's Nvidia with the smaller budget.

I gather they pulled it off by testing out TSMC's 40nm process with the old 4770 cards rather than chancing it with a brand new series, so they used the 4770 to get the kinks worked out so everything would be smooth for the 5xxx Cypress cores.

AMD have a larger budget, but spending is largely determined by the size of your teams. If you won the lottery tomorrow you'd still be living in the same house, with the same car and the same computer; over a short space of time you'd likely buy a new house, but finding the right house takes time no matter how much money you have.

CPU design cycles run to years, or even half a decade, from early design to retail. GPU cycles are shorter, but the team that started the 5xxx series cards wasn't the "uber-rich, we have so much money it's embarrassing" R&D team that Intel and Nvidia had - not by a LONG, LONG shot.

Even then, just because you have large backers doesn't mean they throw money around - these aren't the Chelsea owners. Abu Dhabi's business success doesn't come from throwing money about; it comes from being very, very good at what they do.

There's no need to spend a billion more just because you can. In other words, they could afford to employ everyone at AMD, Intel, Nvidia, IBM, Samsung and anyone else put together from the change in one of their pockets; the problem is that those people already have jobs, they don't need that many people, and finding the best people takes time. I'm fairly sure investment and R&D spending will increase, quite a lot, but also factor in that very, very few companies massively increase spending when the world economy is crashing badly.

Either way, you have the reason for the success partially right: they learnt a LOT from the 4770, but not quite as much as you think. They learned that vias weren't being reproduced accurately, so building in a lot of redundancy would help MASSIVELY, and that the quoted transistor size simply wasn't remotely accurate.

But the fundamental reason they weren't affected comes down to trial and error. They built the 2900XT on the basis of TSMC's promises; post-2900XT they decided to build cores that would still work even if TSMC screwed up, because when you assume they'll screw up, you won't be screwed yourself.

The way to protect yourself from bad, leaky processes is to make smaller, more efficient cores with lower clock speeds and higher IPC. Sound familiar? Yes - every other chip maker in the world has accepted that chasing GHz is an old idea. Intel's chips are smaller than AMD's and more efficient, so how does AMD compete? On price. Nvidia ignore the trend of every other chip maker on the planet, and have decided not to compete on price either - that is flat-out stupid.

Nvidia have been trying to make 40nm chips for well over a year now; from the failed designs of the GT200b they should have known how to "fix" Fermi, they just wouldn't. The three respins still haven't fixed the fundamental issues.

The problem is, it's almost a certainty that Nvidia will make a pretty huge loss on every single 470/480 GTX sold, and that's just not good business.

But as others have said, we tend to enjoy it when arrogant, rude, cheating companies screw up. We all like a little karma - it makes us feel better about life and the fairness of it all, I guess. It doesn't matter which company or what field; we all get tickled when a "mean" company gets screwed, and we all love the underdog.

AMD's 5xxx series only looks so fantastic because of Nvidia's screw-ups; in reality it's a pretty simple die shrink plus an increase in shaders compared to the 4870 series, which was a great core but also a fairly simple idea. The 2900XT, technologically... well, it had so many fantastic breakthroughs in one card: tessellation, DX10 (10.1? I'm not actually sure - I think it is DX10.1, but DX10.1 simply wasn't out back then), the ring-bus memory controller (total overkill but fantastic tech), and the new shader architecture, which was a monumental change - pretty much unmatched in graphics technology for as long as I can remember being into computers (GF3-ish era).

The 5xxx series is still, at its heart, a 2900XT - cut down for the benefit of the consumer, costs and manufacturing, but still fantastic.

Also, when you consider where the 5xxx series stems from - the 2900XT and the massive changes in the 3870 - you'll realise those came during the "lean" ATI/AMD years.
 
Nvidia have been trying to make 40nm chips for well over a year now; from the failed designs of the GT200b they should have known how to "fix" Fermi, they just wouldn't. The three respins still haven't fixed the fundamental issues.

This really puzzles me... how on earth could nVidia go through all the birthing pains of the 200-series 40nm cards, not learn anything from it, and then start all over again with a design even less suited to the problematic process they had to deal with?
 
Instead of slating Nvidia, wouldn't it just be better to congratulate ATI? I mean, if ATI's 5 series cards sucked, everyone would be all over Fermi, because at the end of the day they are very powerful graphics cards.

I don't know how ATI pulled it off with a smaller budget compared to Nvidia, plus their rocky history, less development time, being 6 months early, etc. - the 5 series seem to be some cracking cards.

What do you think - have Nvidia failed, or have ATI outdone themselves this time round?

ATI weren't 6 months early; Nvidia were 6 months late. Being this late, I personally would have concentrated on making a card capable of beating an ATI refresh, not the current cards on the market. Based on past events it's normally 8-12 months before a graphics card company introduces a refresh, so Nvidia may have made a mistake here unless they have something up their sleeves.

The Fermi cards are good; the main thing putting people off is the price. The temperature and power issues can all be solved by better cooling, whether from the end user or from a company customising the card itself, and most end users buy a good PSU for their system anyway when they are even considering a top-level graphics card.

If the 480 was £310 people would be dragging ATI's name through the mud
 