
Why Is Nvidia So Nasty To ATI?

That's the beauty of politics... they can say what they like publicly... and it's not like their past is spotless...

Few if any have a spotless past, so there's no need to make a point of it.

It's how often & how you plan to proceed that counts.
 
ATI is just taking the "moral high ground" because they can't compete with nVidia toe to toe on developer relationships... if they had more clout they'd be just as quick to try and put the competition at a disadvantage.

That said, I'm not aware of nVidia directly paying developers to make games run crap on ATI hardware.


Isn't ATI owned by AMD, who I think are the second-largest chipset and CPU maker?

Therefore they'd have more clout and buying power than nVidia?

I assume ATI consider the chipset and desktop "office integrated GPU" market more worthwhile.

I personally think all they are doing is damaging the PC games industry and playing into the hands of the console makers.

Don't ATI and Nvidia also make the GPUs/mainboards/CPUs for these consoles?
 
That's the beauty of politics... they can say what they like publicly... and it's not like their past is spotless...

And nVidia have never blocked AA in any title (that I'm aware of).

To demonstrate this point we have to consider two games: Batman AA and Mass Effect 2. Both ship on variations of the same engine, both have the same issues with MSAA and deferred shader compatibility, and NEITHER has a generic out-of-the-box MSAA implementation.

One game (ME2) ships with no AA at all on any vendor; the other (Batman) ships with an nVidia-provided anti-aliasing implementation that's been tested against nVidia cards and is not enabled on unsupported hardware of any kind - even older nVidia cards that haven't been tested.

Now anyone who's experimented with getting AA to work in Mass Effect 2 (pre driver hacks) will know that getting AA to work under these conditions - and there are plenty of threads to document this - has unpredictable and problematic results: what seems to work in one situation can cause extreme performance degradation, artifacts or even crashes in another. So it's not surprising that in the case of Batman nVidia only enabled it on tested hardware... should nVidia be responsible for testing on ATI hardware? It would make ATI look irresponsible to suggest that the nVidia code should just be enabled untested on ATI hardware.

Now with ME2, probably due to the fallout from Batman AA, both ATI and nVidia forced driver-level hacks after the game had shipped to enable AA in this title, which is far from optimal, but it does the job. Now ATI could have approached the developer and said "hey, we want to help you implement MSAA + deferred shader support in this title as we know how tricky it can be"... and if the game had then shipped with an AA path that's enabled on ATI hardware because it's tested, but disabled when unsupported hardware is found - it's not ATI's job to test on nVidia hardware - would ATI have blocked nVidia from using AA in this title?
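
To make that concrete, here is roughly what a "does this card even support the MSAA configuration we want" check looks like before an engine exposes an AA option. This is only a minimal sketch for illustration - it uses the D3D11 API and arbitrary format choices, not the actual D3D9-era UE3 code from either game (which isn't public):

```cpp
// Minimal sketch: query the device for MSAA support per format before
// deciding whether to offer an AA option at all.  D3D11 is used purely
// for illustration; error handling is trimmed for brevity.
#include <d3d11.h>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

int main()
{
    ID3D11Device* device = nullptr;
    ID3D11DeviceContext* context = nullptr;
    D3D_FEATURE_LEVEL level;

    if (FAILED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                 nullptr, 0, D3D11_SDK_VERSION,
                                 &device, &level, &context)))
        return 1;

    // Formats a deferred renderer typically cares about: the scene colour
    // target and a higher-precision lighting/G-buffer target.
    const DXGI_FORMAT formats[] = { DXGI_FORMAT_R8G8B8A8_UNORM,
                                    DXGI_FORMAT_R16G16B16A16_FLOAT };

    for (DXGI_FORMAT fmt : formats)
    {
        for (UINT samples = 2; samples <= 8; samples *= 2)
        {
            UINT quality = 0;
            device->CheckMultisampleQualityLevels(fmt, samples, &quality);
            // quality == 0 means this sample count isn't supported for this
            // format on this device - the sort of per-hardware variation
            // that makes "just switch AA on everywhere" risky.
            std::printf("format %d, %ux MSAA: %s\n",
                        (int)fmt, samples,
                        quality > 0 ? "supported" : "unsupported");
        }
    }

    context->Release();
    device->Release();
    return 0;
}
```

The point being that MSAA support varies per format and per device, which is part of why an AA path that was only validated against one set of hardware tends to get gated rather than switched on blindly.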

The beta was the testing ground & it worked & there were no complaints. Job done.
Games do not need NV or ATI certification.
 
Don't ATI and Nvidia also make the GPUs/mainboards/CPUs for these consoles?

Yes, a lot of companies take a slice of the cost of the various consoles. The advantage with the console is that every person involved with producing a game is working towards a common goal.
 
The beta was the testing ground & it worked & there were no complaints. Job done.
Games do not need NV or ATI certification.

When it comes to things like AA + deferred shaders it absolutely needs proper testing; just letting a few random people play through an unfinished version of parts of the game would not be sufficient testing. As can be seen from my posts on forcing AA on nVidia cards in ME2 - at first it seemed to work fine, but then I found areas where it would cause huge slowdowns or even cause the GPU to lock up. Sure, beta testing might have thrown up some of these issues, but it would have been better to get it right from the start at a technical level than leave it to random play testing.
 
When it comes to things like AA + deferred shaders it absolutely needs proper testing; just letting a few random people play through an unfinished version of parts of the game would not be sufficient testing.

Put it this way: there is no way in hell that any major game is not tested on both makes of card regardless of the main development, & I'm not one for accepting lame excuses from anyone.
 
Of course the game is tested on both makes of card extensively. You're not seeing the finer technical problems this specific issue presents. You need a strong technical understanding of exactly how the hardware and software handle the rendering pipeline from start to finish under a variety of different circumstances - well beyond my capabilities, and beyond the capabilities of 99 out of 100 of your average software developers.

The large majority of software developers these days are middleware people: they know how to init an API and how to hand data to it to do what they want, but they don't really know how it does it under the bonnet.
 
The PC is a very credible tool and has the potential to become the major gaming system of choice. Really, the industry needs some type of summit to set down some guidelines and get everyone pushing in the right direction. Sadly, I think it would take a company or two to go under/change leadership before that could happen.

Nvidia need to start doing their bit and stop undermining anyone they consider a threat IMO.

I don't see that happening, for a number of reasons; here are a few...

1. Consoles have significant advantages from a developer's point of view. The hardware stays at a fixed level for a very long period of time, so any investment you make in technology development holds its value until the next cycle.

2. The customer base is much larger for consoles. Of course the total number of PCs is vastly larger, but the number of PCs that are capable of playing even console ports is not.

3. Piracy. Yes, I see people argue that it's rubbish and that console games are pirated too, but piracy is significantly smaller on the consoles. You need to have your console physically altered for a start.

4. Most people just want a system where they can throw in a disc, sit back on the sofa and play a game. They typically don't want to worry about drivers, updates, building computers etc..

I'm not arguing that PCs are inferior - I don't own any consoles and haven't since the Dreamcast - but it's really obvious now that PCs as a primary development platform are not viable for most projects.

Like I said before, Id, EPIC, Crytek etc. - these are the biggest technology developers around, and they have all shifted their focus to the consoles. That speaks volumes.
 
Of course the game is tested on both makes of card extensively. You're not seeing the finer technical problems this specific issue presents. You need a strong technical understanding of exactly how the hardware and software handle the rendering pipeline from start to finish under a variety of different circumstances - well beyond my capabilities, and beyond the capabilities of 99 out of 100 of your average software developers.

The large majority of software developers these days are middleware people: they know how to init an API and how to hand data to it to do what they want, but they don't really know how it does it under the bonnet.

Now you can stop right there, because it's BS time & I don't like or accept it.
There are enough people who understand exactly what NV has done & know full well that it would have no issues on ATI cards.
 
P0rks0da's last sentence or two pretty much sums it up for me. What I wonder is what would/will the PC hardware manufacturers do? Like Gigabyte, Asus etc. I appreciate that the PC gaming market is small in comparison with others, but in delicate economic times like these I can't help but feel that more companies will bite the dust as everything moves towards a more centralised theme.
 
Isn't ATI owned by AMD, who I think are the second-largest chipset and CPU maker?

Therefore they'd have more clout and buying power than nVidia?

That doesn't mean that AMD has opened the cheque-book and told ATI to spend, spend, spend, though. AMD has its hands full with Intel, who are on a roll with Core 2 and Core i5/i7.

As Drunkenmaster said, it probably boils down to the fact that until the last couple of years, nVidia has had the biggest market share and the biggest budget, more of which they've dedicated to maintaining developer partnerships.

In fact, it's very business-savvy to get developers and partners on your side and to lock (or encourage) them into your technology by providing more/better support than rivals. Then once they've developed the games using nVidia-suggested implementations, they can slap the logo on the product, which gives the consumer the impression that it will run better on nVidia hardware, encouraging us all to run off and buy some ;)

Whether it does run better or not is often a moot point, but it's all part of the psychological business strategy. When you see nVidia's TWIMTBP logo flash up, you'll probably feel either:

1) Smug, because you have an nVidia card, which reinforces product satisfaction/brand loyalty; or
2) Deflated, because you have an ATi card and might not be getting the best experience.

The more you see nVidia's logo splashed all over games, the more you'll regret buying ATi hardware and will swap over to nVidia, especially since ATi has very few games with their logo on them compared to nVidia. At least that's probably the theory behind it.

I think it ultimately depends on how you receive nVidia's approach to marketing all of this. Personally I find it all a bit in-your-face and abrasive, but it works - while writing this post, at least I knew the name of nVidia's "The Way It's Meant to Be Played" programme, while I can't say the same for ATi; I know they have one but I can't think of the name of it off hand.
 
Now you can stop right there, because it's BS time & I don't like or accept it.
There are enough people who understand exactly what NV has done & know full well that it would have no issues on ATI cards.

nVidia used a variation of a documented generic implementation for MSAA + deferred shaders... all well and good... except that it's a generic implementation and needs some adapting for specific applications... what is also well documented are the problems and issues, and even if it "appears" to work fine there's a very realistic chance of problems.

Why they didn't leave the ability to enable it if the end user understood it was unsupported I don't know - but I can QUITE see the issues that would arise if ATI users enabled it and then it didn't work...

You may not like it, you may not accept it, but you will find I am not far from the truth. I'm not going to pretend I'm an expert, but I have more than 10 years' technical experience in this field, both commercial and hobby, including coding my own DirectX 7-based game engine from scratch and developing on (building products on top of, rather than writing, the engine) the latest DX9/10-based engines.
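
For anyone curious, the broad shape of that "documented generic implementation" is something like the sketch below. I'm using D3D11 calls purely for illustration - the actual titles are D3D9-era UE3 games and the shipped code isn't public - so the function name, format and sample count here are assumptions:

```cpp
// Minimal sketch of the generic MSAA pattern: draw into a multisampled
// target, then resolve it down to a single-sample texture before any pass
// that can't read MSAA surfaces (post-processing, UI).  Assumes a valid
// device/context and that 4x MSAA was reported as supported for the format.
#include <d3d11.h>

void CreateAndResolveMsaaColour(ID3D11Device* device,
                                ID3D11DeviceContext* context,
                                UINT width, UINT height)
{
    D3D11_TEXTURE2D_DESC msaaDesc = {};
    msaaDesc.Width              = width;
    msaaDesc.Height             = height;
    msaaDesc.MipLevels          = 1;
    msaaDesc.ArraySize          = 1;
    msaaDesc.Format             = DXGI_FORMAT_R8G8B8A8_UNORM;
    msaaDesc.SampleDesc.Count   = 4;
    msaaDesc.SampleDesc.Quality = 0;
    msaaDesc.Usage              = D3D11_USAGE_DEFAULT;
    msaaDesc.BindFlags          = D3D11_BIND_RENDER_TARGET;

    // Matching single-sample texture to resolve into.
    D3D11_TEXTURE2D_DESC resolvedDesc = msaaDesc;
    resolvedDesc.SampleDesc.Count = 1;
    resolvedDesc.BindFlags        = D3D11_BIND_SHADER_RESOURCE;

    ID3D11Texture2D* msaaTarget = nullptr;
    ID3D11Texture2D* resolved   = nullptr;
    device->CreateTexture2D(&msaaDesc, nullptr, &msaaTarget);
    device->CreateTexture2D(&resolvedDesc, nullptr, &resolved);
    if (!msaaTarget || !resolved)
    {
        if (msaaTarget) msaaTarget->Release();
        if (resolved)   resolved->Release();
        return;
    }

    // ... render the lit scene into msaaTarget here ...

    // Collapse the 4 samples per pixel into 1 so later passes can treat the
    // result as an ordinary texture.
    context->ResolveSubresource(resolved, 0, msaaTarget, 0,
                                DXGI_FORMAT_R8G8B8A8_UNORM);

    resolved->Release();
    msaaTarget->Release();
}
```

The resolve is trivial for the final colour buffer; the per-application adaptation comes from the G-buffer attributes (normals, depth and so on), which can't simply be averaged the way colour can - and that's exactly where the artifacts, slowdowns and crashes tend to creep in.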
 
When it comes to things like AA + deferred shaders it absolutely needs proper testing; just letting a few random people play through an unfinished version of parts of the game would not be sufficient testing. As can be seen from my posts on forcing AA on nVidia cards in ME2 - at first it seemed to work fine, but then I found areas where it would cause huge slowdowns or even cause the GPU to lock up. Sure, beta testing might have thrown up some of these issues, but it would have been better to get it right from the start at a technical level than leave it to random play testing.

Before you rabble on about nowt, GenL released a mod that enables AA via Batman's in-game menu on ATI cards and it works fine. He just turned off the "if ATI card exists, turn off AA" switch that Nvidia had hard-coded into the game.
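
For reference, the kind of switch being described there is usually nothing more exotic than reading the adapter's PCI vendor ID and gating the menu option on it. The sketch below is an illustration of that pattern using DXGI, not the game's actual code (which isn't public):

```cpp
// Minimal sketch: read the primary adapter's PCI vendor ID via DXGI and
// gate a feature (here, an in-game AA option) on it.
#include <dxgi.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main()
{
    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
        return 1;

    IDXGIAdapter* adapter = nullptr;
    if (FAILED(factory->EnumAdapters(0, &adapter)))
    {
        factory->Release();
        return 1;
    }

    DXGI_ADAPTER_DESC desc = {};
    adapter->GetDesc(&desc);

    // Well-known PCI vendor IDs: 0x1002 = AMD/ATI, 0x10DE = nVidia.
    const UINT VENDOR_ATI = 0x1002;

    // The pattern in question: if an ATI card is detected, grey out the AA
    // option.  A mod that "turns off the switch" just removes or bypasses
    // this test so the option is always offered.
    bool aaOptionEnabled = (desc.VendorId != VENDOR_ATI);
    std::printf("VendorId = 0x%04X, in-game AA option %s\n",
                desc.VendorId, aaOptionEnabled ? "shown" : "greyed out");

    adapter->Release();
    factory->Release();
    return 0;
}
```

Whether enabling the path that way is actually safe on untested hardware is, of course, the whole argument above.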
 
Before you rabble on about nowt, GenL released a mod that enables AA via Batman's in-game menu on ATI cards and it works fine. He just turned off the "if ATI card exists, turn off AA" switch that Nvidia had hard-coded into the game.

Yes, it appears to work fine; that's not the point.
 
Seems to me there's more to be gained by PC component manufacturers/developers all working together rather than trying to damage the platform...
 
Yes, it appears to work fine; that's not the point.

Then the developer in question did one of two things:

1. Did not test the game properly on ATi cards in order to verify it worked fine.

2. Did a dirty deal with Nvidia to ensure ATi users were left in the cold.
 
As to point 1.

Testing MSAA + deferred shading properly, when you didn't write the engine from scratch and don't have advanced technical knowledge of the hardware pipeline, is beyond the capabilities of your average middleware-based studio. Which is why Mass Effect 2 ships with no working AA despite being developed by a fairly major studio with a lot of backing.

Sure, you can just do some playtests running through the game and see if it crashes or not... but that's likely to come back and bite you in the arse.

On point 2, no one really knows what happened behind the scenes. The code was left open, "apparently", for other vendors to implement and test their own resolve... but outside of a few people close to the game no one actually knows if ATI had a fair chance to step up to that or not. The indications from the email exchanges very definitely suggest that ATI couldn't be bothered.

"Secondly, Eidos asked AMD to provide "robust sample code". To this date, AMD failed to do so, arguing that nVidia's method is the same as AMD's sample code. Given the fact that you can turn in-game AA by changing the vendor lD to nVidia, there is nothing given by Eidos nor nVidia that would prove otherwise."

nVidia gave and tested a code sample for AA; ATI wasn't interested, end of story. If nVidia could be bothered to put the effort in - even if it was essentially to reproduce a generic implementation with some optimisation - why couldn't ATI put the same effort in? And if it was such a trivial modification of some publicly available technique, why would it be so difficult for ATI to knock up a sample, even untested?
 