That's the beauty of politics... they can say what they like publicly... and it's not like their past is spotless...
Few if any have a spotless past, so there's no need to make a point of it.
It's how often & how you plan to proceed that counts.
ATI is just taking the "moral high ground" because they can't go toe to toe with nVidia on developer relationships... if they had more clout they'd be just as quick to try and put the competition at a disadvantage.
That said, I'm not aware of nVidia directly paying developers to make games run crap on ATI.
I personally think all they are doing is damaging the PC games industry and playing into the hands of the console makers.
And nVidia have never blocked AA in any title (that I'm aware of).
To demonstrate this point... we have to keep two games in mind: Batman: Arkham Asylum and Mass Effect 2. Both ship on variations of the same engine (Unreal Engine 3), both have the same issues with MSAA and deferred shading compatibility, and NEITHER has a generic out-of-the-box MSAA implementation.
One game (ME2) ships with no AA at all on any vendor; the other (Batman) ships with an nVidia-provided anti-aliasing implementation that's been tested against nVidia cards and is not enabled on unsupported hardware of any kind - even older nVidia cards that haven't been tested.
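To see why that's not trivial: under D3D9 (the API of that era) a multisampled render target can't be bound as a texture, so anything a deferred pipeline needs to read back has to be resolved to a plain single-sample surface first, throwing away the per-sample data MSAA exists to provide. A minimal sketch of that constraint (my own illustration, not code from either game; the format and sample count are arbitrary):

```cpp
// Sketch: why deferred shading and MSAA clash under D3D9.
// A multisampled surface cannot be sampled as a texture; it must be
// resolved (StretchRect) to a single-sample surface before any later
// pass can read it, which averages away the individual sub-samples.
#include <d3d9.h>

HRESULT RenderWithMsaaResolve(IDirect3DDevice9* dev, UINT w, UINT h)
{
    IDirect3DSurface9* originalRT   = nullptr;
    IDirect3DSurface9* msaaTarget   = nullptr;  // multisampled, write-only
    IDirect3DTexture9* resolvedTex  = nullptr;  // what later passes can sample
    IDirect3DSurface9* resolvedSurf = nullptr;

    dev->GetRenderTarget(0, &originalRT);

    // 4x MSAA colour target (cannot be used as a texture).
    HRESULT hr = dev->CreateRenderTarget(w, h, D3DFMT_A8R8G8B8,
                                         D3DMULTISAMPLE_4_SAMPLES, 0, FALSE,
                                         &msaaTarget, nullptr);
    if (FAILED(hr)) { originalRT->Release(); return hr; }

    // Single-sample texture a later (e.g. lighting) pass would read from.
    hr = dev->CreateTexture(w, h, 1, D3DUSAGE_RENDERTARGET, D3DFMT_A8R8G8B8,
                            D3DPOOL_DEFAULT, &resolvedTex, nullptr);
    if (FAILED(hr)) { msaaTarget->Release(); originalRT->Release(); return hr; }
    resolvedTex->GetSurfaceLevel(0, &resolvedSurf);

    dev->SetRenderTarget(0, msaaTarget);
    // ... draw the geometry pass here ...

    // Resolve: collapses the samples to one value per pixel, so any pass
    // that reads the result never sees individual sub-samples.
    hr = dev->StretchRect(msaaTarget, nullptr, resolvedSurf, nullptr, D3DTEXF_NONE);

    dev->SetRenderTarget(0, originalRT);
    resolvedSurf->Release();
    resolvedTex->Release();
    msaaTarget->Release();
    originalRT->Release();
    return hr;
}
```

Any workaround has to juggle extra passes and surfaces on top of that, which is where the hardware-sensitive, vendor-specific testing burden comes from.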
Now anyone who's experimented with getting AA to work in Mass Effect 2 (before the driver hacks) will know that getting AA to work under these conditions has unpredictable and problematic results - there's plenty of threads documenting this - and what seems to work in one situation can cause extreme performance degradation, artifacts or even crashes in another. So it's not surprising that in the case of Batman nVidia only enabled it on tested hardware... should nVidia be responsible for testing on ATI hardware? It would make ATI look irresponsible to suggest that the nVidia code should just be enabled untested on ATI hardware.
Now with ME2, probably due to the fallout from Batman: AA, both ATI and nVidia forced driver-level hacks after the game had shipped to enable AA in this title, which is far from optimal, but it does the job. Now imagine ATI had approached the developer and said, "hey, we want to help you implement MSAA + deferred shading support in this title, as we know how tricky it can be"... the game then ships with an AA path that's enabled on ATI hardware because it's tested, but disabled when unsupported hardware is found... it's not ATI's job to test on nVidia hardware, so would ATI have blocked nVidia from using AA in this title?
Don't ATI and Nvidia also make the GPUs/mainboards/CPUs for these consoles?
Only a fool ignores the past.
The beta was the testing ground & it worked & there were no complaints. Job done.
Games do not need NV or ATI certification.
When it comes to things like AA + deferred shaders it absolutely needs proper testing; just letting a few random people play through an unfinished version of parts of the game would not be sufficient testing.
The PC is a very credible platform and has the potential to become the major gaming system of choice. Really, the industry needs some type of summit to set down some guidelines and get everyone pushing in the right direction. Sadly I think it would take a company or two going under or changing leadership before that could happen.
Nvidia need to start doing their bit and stop undermining anyone they consider a threat IMO.
Of course the game is tested extensively on both makes of card. You're not seeing the finer technical problems this specific issue presents. You need a strong technical understanding of exactly how the hardware and software handle the rendering pipeline from start to finish under a variety of different circumstances - well beyond my capabilities, and beyond the capabilities of 99 out of 100 average software developers.
The large majority of software developers these days are middleware people; they know how to init an API and how to hand it data to do what they want, but they don't really know how it does it under the bonnet.
Isn't ATI owned by AMD, who I think are the second-largest chipset and CPU maker?
Don't they therefore have more clout and buying power than nVidia?
Now you can stop right there, because it's BS & I don't like or accept it.
There are enough people who understand exactly what NV has done & know full well that it would have no issues on ATI cards.
When it comes to things like AA + deferred shaders it absolutely needs proper testing; just letting a few random people play through an unfinished version of parts of the game would not be sufficient testing. As can be seen from my posts on forcing AA on nVidia cards in ME2 - at first it seemed to work fine, but then I found areas where it caused huge slowdowns or even locked up the GPU. Sure, beta testing might have thrown up some of these issues, but it would have been better to get it right from the start at a technical level than to leave it to random play testing.
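For a flavour of what "getting it right at a technical level" means before play testing even starts: a renderer would normally probe the device for every sample count and surface format its deferred path uses, rather than assuming the first combination that appears to work is safe everywhere. A rough sketch under those assumptions (the format list here is hypothetical, not taken from either game):

```cpp
// Sketch: probe whether a given MSAA sample count is valid for every
// render-target format a hypothetical deferred path allocates (D3D9-era API).
#include <d3d9.h>

bool MsaaSupportedForDeferredPath(IDirect3D9* d3d, UINT adapter,
                                  D3DMULTISAMPLE_TYPE samples)
{
    // Hypothetical G-buffer / depth formats; a real engine would list the
    // exact formats its deferred renderer actually uses.
    const D3DFORMAT colourFormats[] = { D3DFMT_A8R8G8B8, D3DFMT_A16B16G16R16F };
    const D3DFORMAT depthFormat     = D3DFMT_D24S8;

    DWORD quality = 0;
    for (D3DFORMAT fmt : colourFormats)
    {
        if (FAILED(d3d->CheckDeviceMultiSampleType(adapter, D3DDEVTYPE_HAL,
                                                   fmt, FALSE, samples, &quality)))
            return false; // this sample count isn't valid for this surface format
    }
    if (FAILED(d3d->CheckDeviceMultiSampleType(adapter, D3DDEVTYPE_HAL,
                                               depthFormat, FALSE, samples, &quality)))
        return false;

    return true; // formats agree on this sample count; runtime testing still needed
}
```

Even when every one of those checks passes, the slowdowns and lockups described above only show up in specific scenes at runtime, which is exactly why capability checks and casual play testing aren't a substitute for testing on the actual hardware.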
Before you rabble on about nowt, GenL released a mod that enables AA via Batman's in-game menu on ATI cards, and it works fine. He just turned off the "if an ATI card exists, turn off AA" switch that Nvidia had hard-coded into the game.
Yes, it appears to work fine; that's not the point.
"Secondly, Eidos asked AMD to provide "robust sample code". To this date, AMD failed to do so, arguing that nVidia's method is the same as AMD's sample code. Given the fact that you can turn in-game AA by changing the vendor lD to nVidia, there is nothing given by Eidos nor nVidia that would prove otherwise."