Why Is Nvidia So Nasty To ATI?

"Secondly, Eidos asked AMD to provide "robust sample code". To this date, AMD failed to do so, arguing that nVidia's method is the same as AMD's sample code. Given the fact that you can turn in-game AA by changing the vendor lD to nVidia, there is nothing given by Eidos nor nVidia that would prove otherwise."

And seeing as there's no point in using the same code twice, it's more a case of ATI not needing to do anything, because ATI's method was the same.
Now, if ATI's method had been different, then ATI would have needed to provide "robust sample code".
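
For anyone wondering what "changing the vendor ID" actually gets around: the gate itself is trivial. A rough sketch of that kind of check under D3D9 - purely illustrative, not the actual Batman AA code (0x10DE and 0x1002 are the real PCI vendor IDs for nVidia and ATI; the function name and logic are mine):

    #include <d3d9.h>

    // Purely illustrative vendor gate - NOT the shipped Batman AA code.
    // 0x10DE and 0x1002 are the real PCI vendor IDs for nVidia and ATI.
    bool AllowInGameMSAA(IDirect3D9* d3d)
    {
        D3DADAPTER_IDENTIFIER9 id = {};
        if (FAILED(d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id)))
            return false;

        // Gate the AA option on the reported vendor rather than on what the
        // hardware can actually do - which is why spoofing the vendor ID to
        // nVidia makes the in-game AA option work on ATI cards.
        return id.VendorId == 0x10DE;
    }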
 
You may not like it, you may not accept it, but you'll find I'm not far from the truth. I'm not going to pretend I'm an expert, but I have more than 10 years of technical experience in this field, both commercial and hobby, including coding my own DirectX 7 based game engine from scratch and developing on the latest DX9/10 based engines (building products on top of them, rather than writing the engine).

And there are plenty of others with just as much time & experience in the field, & more, who disagree.
 
And seeing as there's no point in using the same code twice, it's more a case of ATI not needing to do anything, because ATI's method was the same.
Now, if ATI's method had been different, then ATI would have needed to provide "robust sample code".

"I believe this technique is very closely related to a technique which we've seen NVIDIA recommend before now"

Richard Huddy's own words... I hope the implications of what I've put in bold don't need to be spelt out.

ATI couldn't be bothered to make an effort for their users, didn't care what the end results were... end of story.
 
Richard Huddy's own words... I hope the implications of what I've put in bold don't need to be spelt out.

ATI couldn't be bothered to make an effort for their users, didn't care what the end results were... end of story.

That quote says nothing of the sort.
 
Seems to me there's more to be gained by PC component manufacturers/developers all working together rather than trying to damage the platform...

The thing is, that would absolutely cripple the PC and slow down progress in technology. The hyper-competitive market in the PC space is what brought us all this crazy fast 3D tech in the first place. You would basically end up with an expensive console, with all the drawbacks that turn the average Joe away from gaming on a computer still there. Not to mention that if all the various players came together and decided the direction the industry should take, it would put the customer in an extremely weak position. We already pay so much for the hardware now; what would happen if there was essentially no choice?

It's an unfortunate truth, but ruthless competition is the most effective way to get humans to innovate. You only have to look at what came out of WW2 or the Cold War.

It sucks that the PC plays second fiddle to the consoles, but you have to look on the bright side: at least we do still get ports (we could be down to just a handful of games every year). Any vendor-specific fluff that gets added for PC is just that.

With everything in such a state of flux at the moment, both in terms of APIs fighting for dominance and GPUs & CPUs slowly converging, it's just impossible for developers to develop a game for the consoles and PC and truly take advantage of both systems. The hardware gap is just too wide, and the costs associated with writing tons of bleeding-edge PC-specific code don't pay off. That's why consoles dominate right now: you remove a massive amount of headaches from the equation.

Hopefully, when future hardware evolves out of Fusion/Larrabee/Fermi and what we are left with is many heterogeneous cores on one chip, a lot of the questions about APIs or software/hardware will become moot anyway. I don't expect that to happen for another 2 console generations or so at a guess, and it has to happen through survival of the fittest.
 
That quote says nothing of the sort.

It says to me that ATI didn't need to add anything, as the nVidia version would have done the job correctly...

BUT nVidia put in code where, if you didn't have an nVidia card, you would not be able to use the AA...

That's unfair and should be against the law :D

Roff, stop trying to back up nVidia... You can plainly see what they did, but you're blinded by your own ego.

BTW, if you pick out my bad grammar and spelling, it's due to you being totally silly and not finding something to say to prove me wrong :P
 
Who really cares? It's business, they stab each other in the back wherever possible. Just let them get on with it, buy whichever you like and live a long and happy life. :p
 
It says to me that ATI didn't need to add anything, as the nVidia version would have done the job correctly...

BUT nVidia put in code where, if you didn't have an nVidia card, you would not be able to use the AA...

That's unfair and should be against the law :D

To be honest, all that matters is the user experience, & the politics only matter when they directly affect it.

Taking away stuff that was seen to be working, in the name of it being untested & having no guarantees, will not cut it in a game, & people will get worked up.

AA in Batman.
PPU.
NV card for PhysX used with an ATI card.
DX10.1 patched out of Assassin's Creed.
PhysX too often running on only one core in games that also have GPU PhysX, though funnily enough many games that just have CPU PhysX don't have that issue.

People were happy with what they had, so excuses for taking it away are irrelevant; it worked fine in their eyes, & these are games, not critical software.
Somehow NV was doing everyone a favour, & it was all being done to prevent harm to users & hardware.
Please don't do me any favours.
 
Who really cares? It's business, they stab each other in the back wherever possible. Just let them get on with it, buy whichever you like and live a long and happy life. :p

Yep. If they just kept to slagging each other off, then few would care.
 
That quote says nothing of the sort.

Course it does... ATI are assuming what nVidia have done; they assume it's closely related to code that they have "seen" nVidia recommend before... they don't really know anything for certain - but they are happy enough if that's punted out untested to their users... says it all.

It says to me that ATI didn't need to add anything, as the nVidia version would have done the job correctly...

BUT nVidia put in code where, if you didn't have an nVidia card, you would not be able to use the AA...

That's unfair and should be against the law :D

That still leaves us with a situation where ATI were happy with untested code being punted to their users.

The next point is a common misunderstanding, as people are so used to MSAA just being there and working... look at Mass Effect 2, which uses the same engine as Batman AA - do you think they just "forgot" to add something as fundamental as well?

Roff, stop trying to back up nVidia... You can plainly see what they did, but you're blinded by your own ego.

BTW, if you pick out my bad grammar and spelling, it's due to you being totally silly and not finding something to say to prove me wrong :P

I am not blind to what's going on with this situation - I have a good grasp of the technical details and, contrary to what it might look like, I'm not trying to back up nVidia on this... what I'm trying to get people to see is where they should be asking questions, not blindly attacking nVidia on an incorrect principle.

I don't give a **** about spelling or grammar on the internet... and I don't (intentionally) attack people personally for having a different opinion.
 
Course it does... ATI are assuming what nVidia have done; they assume it's closely related to code that they have "seen" nVidia recommend before... they don't really know anything for certain - but they are happy enough if that's punted out untested to their users... says it all.

Far too many assumptions, & I would guess that ATI knows more than you about whether it needs testing, because it could have been tested numerous times before then.
As they said, they have seen the code before.
 
It would still need proper testing in the context of the specific application, even if it worked fine with other Unreal Engine based products, as each one uses different shader techniques, etc.

Although there is a reference in the last email (from ATI) to something that's either been removed from the email or an attachment that's not been forwarded, which makes me wonder if code or a reference to a specific technique was included and omitted by someone at nVidia... but forwarding code like that would be a bit unorthodox. It doesn't really change the point I've been trying to explain though. If you're implementing something like this, you need hands-on testing and documentation.
 
It would still need proper testing in the context of the specific application, even if it worked fine with other Unreal Engine based products, as each one uses different shader techniques, etc.

Although there is a reference in the last email (from ATI) to something that's either been removed from the email or an attachment that's not been forwarded, which makes me wonder if code or a reference to a specific technique was included and omitted by someone at nVidia... but forwarding code like that would be a bit unorthodox. It doesn't really change the point I've been trying to explain though. If you're implementing something like this, you need hands-on testing and documentation.

At the end of the day people only care that it did work & that's all.
 
The thing is, that would absolutely cripple the PC and slow down progress in technology. The hyper-competitive market in the PC space is what brought us all this crazy fast 3D tech in the first place. You would basically end up with an expensive console, with all the drawbacks that turn the average Joe away from gaming on a computer still there. Not to mention that if all the various players came together and decided the direction the industry should take, it would put the customer in an extremely weak position. We already pay so much for the hardware now; what would happen if there was essentially no choice?

It's an unfortunate truth, but ruthless competition is the most effective way to get humans to innovate. You only have to look at what came out of WW2 or the Cold War.

It sucks that the PC plays second fiddle to the consoles, but you have to look on the bright side: at least we do still get ports (we could be down to just a handful of games every year). Any vendor-specific fluff that gets added for PC is just that.

With everything in such a state of flux at the moment, both in terms of APIs fighting for dominance and GPUs & CPUs slowly converging, it's just impossible for developers to develop a game for the consoles and PC and truly take advantage of both systems. The hardware gap is just too wide, and the costs associated with writing tons of bleeding-edge PC-specific code don't pay off. That's why consoles dominate right now: you remove a massive amount of headaches from the equation.

Hopefully, when future hardware evolves out of Fusion/Larrabee/Fermi and what we are left with is many heterogeneous cores on one chip, a lot of the questions about APIs or software/hardware will become moot anyway. I don't expect that to happen for another 2 console generations or so at a guess, and it has to happen through survival of the fittest.

In theory yes, but when they are also holding back each other's hardware and crippling software, that's hardly progress. Look at Microsoft and Windows: it has its faults - too much bloat, and it tries to do too much IMO - but it gave a common platform for everyone to work on. The hardware manufacturers could do more, I think; we already have common standards for interfaces of varying types...

The constant competition argument is valid and does produce rapid change, but as we see now, the software can't keep up and the cost to the consumer is rising, if only to keep up with the pace of change. A slightly slower rate of change, with the hardware fully exploited, wouldn't go amiss. It makes me think the business models modern tech companies are using just aren't sustainable in the long term, especially when you factor in the current concerns over shortages of resources and energy. This comes from someone who is most definitely not a tree hugger! :D
 
At the end of the day people only care that it did work & that's all.


I don't have a problem with that aspect... I have a problem with people saying "nVidia intentionally disabled AA on ATI cards in Batman AA" when what they should be asking is:

Why did the developer not implement a generic MSAA resolve? Did nVidia actively block the developer from accepting a solution from AMD/ATI for implementing their own MSAA resolve?
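
To make that question concrete: under plain D3D9, a "generic" MSAA resolve is essentially a StretchRect from the multisampled render target into a single-sample surface. A hedged sketch - the function name and parameters are mine, not the game's or either vendor's code, and UE3's real rendering path is more involved than this:

    #include <d3d9.h>

    // Rough sketch of a vendor-agnostic MSAA resolve in D3D9 - illustrative
    // only. The engine renders into a multisampled target, then resolves it
    // into a plain surface it can read back as a texture.
    HRESULT ResolveSceneMSAA(IDirect3DDevice9* dev,
                             IDirect3DSurface9* msaaRT,      // multisampled RT
                             IDirect3DSurface9* resolvedRT)  // single-sample RT
    {
        // StretchRect from a multisampled surface performs the resolve the
        // same way on any D3D9 hardware - no vendor check required.
        return dev->StretchRect(msaaRT, NULL, resolvedRT, NULL, D3DTEXF_NONE);
    }

The catch, as above, is that an engine like UE3 doesn't always render in a way where that single call is enough, which is why it needs per-game implementation and testing.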
 
AMD have got other problems than nVidia, you know... they've gotta keep up with the chipsets as well, so AMD are taking it 3 ways when you think about it. They're doing the best they can though - they've turned some ex-NV games around.
 
I don't have a problem with that aspect... I have a problem with people saying "nVidia intentionally disabled AA on ATI cards in Batman AA" when what they should be asking is:

Why did the developer not implement a generic MSAA resolve? Did nVidia actively block the developer from accepting a solution from AMD/ATI for implementing their own MSAA resolve?

That's a question that you want to ask; all I care about is that it works - the how, what & why does not matter.
The majority only care about using the hardware & software that they paid for.

The trials & tribulations of how it came into being they could not care less about, & any reason given for why something that was working was then taken away will fall on deaf ears.

The only way to get away with not giving something under the circumstances is to make sure that it never worked in the first place, & that's where NV consistently slips up.
It's better not to give than to give & then take away.
 
That's entirely tangential to my point.

If the disable switch had not been put in, then there would be no need for any discussion on the matter.
The fact is many things work, & later NV comes along & gives lame technical reasons why they don't work now, even though people could clearly see that they were working.

The simple fact is that people don't believe them, so the reasons become irrelevant; they don't change what the end product is to the consumer.
People care about the end product & not the ifs, buts & maybes.
 