
BATMANGATE - Nvidia bows to public pressure over AA

Good news for ATI users: they can now enjoy AA in Batman using Nvidia's code. Since, you know, ATI would never have provided an AA solution for Batman, lol.
 
It is a very unusual move... I can only imagine someone must have something to hang over Nvidia, as it's not like them to be charitable... or they can see an opportunity to paint the competition in a bad light.

My guess would be the threat of a court case. If they caved, it says to me they didn't have a leg to stand on and knew it; and while they are tightening the belt and not looking for a multimillion-dollar, year-long lawyer fest, AMD aren't.

For all the facts you've stated in this thread, is there any proof, beyond conjecture, that Nvidia created this AA code and didn't just stipulate, as part of the investment in the game, that AA would be disabled on ATi cards? Or is it similar to all the other threads where you take obscure quotes and, post by post, gradually turn them into proof over the course of a multi-page thread most people get bored of?

If it was their IP, there's really nothing anyone can "hang" over them to force them to do anything. They pay devs money for the TWIMTBP program; did Eidos threaten to, you know, stop accepting money from Nvidia...?

Any legal threats wouldn't be threatening unless they had done something wrong, and I really can't think of anything else it could be.

It's not actually buggy; it works fine when you trick the game into thinking your card is from nVidia.

It worked fine in the demo as well, before an extra layer of disabling was added for the final game, because it was far too easy to enable it in the demo.

I keep seeing Rroff say it's untested code, and keep seeing utter trash posted about how AMD aren't interested, don't get involved and don't do any testing to work out bugs. Other than Rroff saying this, I've seen no proof of that at all. There's the fact it worked fine, as does the rest of the game on AMD cards, the fact that no company is dumb enough not to run the game on a couple of computers with AMD cards in them, and the obvious fact they found no problems.
 
Yeah remember it wasn't coded by ATI...

Games SHOULDN'T be coded by graphics hardware makers for the most part.

It just brings this type of nonsense into the situation.

This is something all gamers should be against.
 
Drivers are never perfect or we wouldn't see constant driver updates - the graphics card vendors do have to get involved to a certain extent...

TBH, ideally the game developer should never be in a position where they even have to touch multisampling code - they should just be able to turn it on like a switch with one or two API calls and be done with it.
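Just to illustrate what that "switch" looks like in practice - purely a generic sketch against the stock Direct3D 9 API (the forward-rendering case where it really is this simple), not anything from the game itself:

```cpp
// Minimal sketch: enabling 4x MSAA through plain Direct3D 9.
// This is the simple "switch" case - it only works like this for a forward
// renderer; a deferred renderer like UE3's can't use it directly.
// (Link against d3d9.lib.)
#include <windows.h>
#include <d3d9.h>

bool CreateDeviceWith4xMSAA(IDirect3D9* d3d, HWND hwnd, IDirect3DDevice9** outDevice)
{
    // 1) Ask the driver whether 4x multisampling is supported for this format.
    DWORD qualityLevels = 0;
    HRESULT hr = d3d->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
        D3DFMT_X8R8G8B8, TRUE,                    // windowed back buffer
        D3DMULTISAMPLE_4_SAMPLES, &qualityLevels);
    if (FAILED(hr))
        return false;

    // 2) Create the device with a multisampled back buffer.
    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed               = TRUE;
    pp.SwapEffect             = D3DSWAPEFFECT_DISCARD;   // required for MSAA back buffers
    pp.BackBufferFormat       = D3DFMT_X8R8G8B8;
    pp.EnableAutoDepthStencil = TRUE;
    pp.AutoDepthStencilFormat = D3DFMT_D24S8;
    pp.MultiSampleType        = D3DMULTISAMPLE_4_SAMPLES;
    pp.MultiSampleQuality     = qualityLevels - 1;

    hr = d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hwnd,
                           D3DCREATE_HARDWARE_VERTEXPROCESSING, &pp, outDevice);
    if (FAILED(hr))
        return false;

    // 3) Flip the render state on - that's the whole "switch".
    (*outDevice)->SetRenderState(D3DRS_MULTISAMPLEANTIALIAS, TRUE);
    return true;
}
```

The catch, as the Sweeney quote later in the thread spells out, is that UE3's deferred lighting can't use this simple path as-is.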

There is very strong proof that ATI aren't willing to actually get their hands dirty...

It’s also worth noting here that AMD have made efforts both pre-release and post-release to allow Eidos to enable the in-game antialiasing code - there was no refusal on AMD’s part to enable in game AA IP in a timely manner.

If you can't work it out from that wording alone - without any other "evidence" - it's a pretty poor show.
 
Isn't Mirror's Edge a UE3 game (with PhysX), yet in-game AA options work perfectly fine with ATI cards? I played through the whole game at 8xAA with absolutely no issues.


What's so special about Batman?
 
For all the facts you've stated in this thread, is there any proof, beyond conjecture, that Nvidia created this AA code and didn't just stipulate, as part of the investment in the game, that AA would be disabled on ATi cards? Or is it similar to all the other threads where you take obscure quotes and, post by post, gradually turn them into proof over the course of a multi-page thread most people get bored of?

If you want proof, go ahead and benchmark with forced MSAA and in-game AA and see if there is any performance difference... though that can only conclusively prove that nVidia did write a multisampling path if there is a performance increase - a lack of a performance increase doesn't conclusively disprove it.
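To be clear about what that comparison involves: just the average frame time under each setting, same scene and settings, then compare. Something along these lines - renderFrame() here is a made-up stand-in for however you drive the run (in practice you'd use the game's built-in benchmark and a frame-time logger), and nothing in it is specific to Batman:

```cpp
// Toy frame-time average, purely to illustrate the comparison being suggested:
// run once with in-game AA and once with driver-forced MSAA, then compare.
#include <chrono>
#include <cstdio>
#include <thread>

// Stand-in for one frame of whatever is actually being measured.
static void renderFrame()
{
    std::this_thread::sleep_for(std::chrono::milliseconds(16));
}

static double averageFrameTimeMs(int frames)
{
    using clock = std::chrono::steady_clock;
    const auto start = clock::now();
    for (int i = 0; i < frames; ++i)
        renderFrame();
    const std::chrono::duration<double, std::milli> elapsed = clock::now() - start;
    return elapsed.count() / frames;
}

int main()
{
    // A clear gap between the two runs points at two different multisampling paths.
    std::printf("avg frame time: %.2f ms\n", averageFrameTimeMs(200));
    return 0;
}
```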
 
Give it a rest, guys. We don't know the details, so there's no point in debating in the mist. If there was conclusive evidence, there wouldn't be an argument to be had. :rolleyes:
 
Isn't Mirror's Edge a UE3 game (with PhysX), yet in-game AA options work perfectly fine with ATI cards? I played through the whole game at 8xAA with absolutely no issues.


What's so special about Batman?

Mirror's Edge didn't originally have AA either (the demo doesn't have it) - it was added at some point just before the game went retail. I'm assuming it's just a generic forced path though.

EDIT: Digging a little deeper, it seems the Mirror's Edge developers had already ripped out the entire lighting system in the UE3 build they started with, to suit the feel of the game better, and replaced it with "Beast", which does include an AA path... The version of UE3 that's released to developers does not have any AA path.
 
We are also missing the details of what went on between Rocksteady and ATI (which is where it all actually kicked off).
 
Rroff, if this really was IP, why have nVidia bowed to the pressure?

I mean surely it would be watertight in a courtroom, right? ;)

Well, it must be close to watertight, because all AMD could do was cry about it in public. In the end they've bowed to public pressure from ATI owners who are not getting the after-sale support they deserve.
 
Ah found it...

Tim Sweeney: Unreal Engine 3 uses deferred shading to accelerate dynamic lighting and shadowing. Integrating this feature with multisampling requires lower-level control over FSAA than the DirectX9 API provides.

So to steal someone else's wording... the multisampling code provided by nVidia doesn't disable AA on ATI cards... it enables it on nVidia cards.

Props to "the coca cola company" who has managed to explain this far better than I have.
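To put the Sweeney quote in concrete terms (a generic sketch only - nothing here is from Rocksteady's or nVidia's actual code): under stock D3D9 a multisampled render target can't be read in a shader; it first has to be resolved with StretchRect, which averages the samples together. For a deferred renderer like UE3's, that averaging happens to the G-buffer the lighting pass reads, which is exactly why it needs "lower-level control over FSAA than the DirectX9 API provides".

```cpp
// Sketch of the D3D9 limitation being described. The only stock-API way to get
// MSAA data into something a shader can sample is StretchRect, and that resolve
// averages the samples - fine for a finished image, but it destroys the
// per-sample depth/normal data a deferred lighting pass needs.
#include <windows.h>
#include <d3d9.h>

HRESULT ResolveGBuffer(IDirect3DDevice9* dev,
                       IDirect3DSurface9* msaaGBuffer,   // multisampled render target
                       IDirect3DTexture9* resolvedTex)   // plain texture the shaders sample
{
    IDirect3DSurface9* dst = nullptr;
    HRESULT hr = resolvedTex->GetSurfaceLevel(0, &dst);
    if (FAILED(hr))
        return hr;

    // Sample-averaging resolve: after this, the individual MSAA samples are gone.
    hr = dev->StretchRect(msaaGBuffer, nullptr, dst, nullptr, D3DTEXF_NONE);

    dst->Release();
    return hr;
}
```

Working around that - getting at the samples individually, or resolving after lighting - is the kind of vendor-specific path a hardware maker would have to supply, which fits the "enables it on nVidia cards" reading.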
 
Ah found it...



So to steal someone else's wording... the multisampling code provided by nVidia doesn't disable AA on ATI cards... it enables it on nVidia cards.

Props to "the coca cola company" who has managed to explain this far better than I have.

Which is what I said 5 days ago.
 
I thought I saw it posted that all the ATI fanboys were saying that PhysX was crap and there was no need for it?

Just goes to show a woman can never make her mind up :D
 