Ian McNaughton comes out against The Way It's Meant to Be Played

Associate
Joined
23 May 2008
Posts
420
Well, that's the kind of attitude that is costing ATI. If NVidia hadn't worked with the developer then they wouldn't have had support either; ATi can't expect their rival to fix things on their cards.

As SE-Lain pointed out, ATI have a LONG history of not fixing bugs until after a game is released (hotfix drivers etc); it seems they're only making an issue of these bugs because they affect benchmarks.

I've said it in an earlier post,

but what attitude??? I mean, developers have a common toolset called DirectX to try and apply things such as AA in their game engine.

It makes more sense to me that ATI are tweaking and fixing their "OWN" drivers rather than trying to tell game developers how they should be programming their game? :confused:
 
Associate
Joined
11 Nov 2003
Posts
128
Location
Glasgow, Scotland
I've said it in an earlier post,

but what attitude??? I mean, developers have a common toolset called DirectX to try and apply things such as AA in their game engine.

It makes more sense to me that ATI are tweaking and fixing their "OWN" drivers rather than trying to tell game developers how they should be programming their game? :confused:

DirectX just specifies that your drivers should comply with its API. How you implement the acceleration of those calls is down to the architecture of the GPU. Some of those API calls could very well be stubs: unimplemented functions put in place to satisfy the API spec.

The underlying hardware implementation, even if it complies with the DirectX spec, varies between ATI and Nvidia, hence the need for optimisations specific to each GPU.
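
To make the stub idea concrete, here's a minimal C++ sketch (all names hypothetical, nothing from any real driver): an entry-point table where one function is a real implementation and another merely satisfies the interface.

[code]
// Minimal C++ sketch with hypothetical names: a driver can fill an API's
// entry-point table and still leave some entries as stubs. The API spec is
// satisfied either way; what actually gets accelerated differs per GPU.
#include <cstdio>

enum Result { OK, NOT_SUPPORTED };

struct DriverVtable {
    Result (*Clear)();              // backed by real hardware
    Result (*EnableMultisample)();  // may be a stub on some architectures
};

static Result ClearImpl()       { return OK; }
static Result MultisampleStub() { return NOT_SUPPORTED; } // present, does nothing

static const DriverVtable gDriver = { ClearImpl, MultisampleStub };

int main()
{
    if (gDriver.EnableMultisample() == NOT_SUPPORTED)
        std::puts("the call exists in the API, but this driver only stubs it");
    return 0;
}
[/code]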

Take this quote for example:

When Doom 3 was released, unsurprisingly Nvidia held a large performance lead. ATI analyzed the code and found shader calculations were being performed manually through a look-up table, to compensate for the relatively poor performance of the FX series in this respect. Through application recognition, ATI reportedly found a 30% speed gain on the X800 card[citation needed], just by moving shader calculations to hardware. To this extent, the question of who has the best graphics architecture is becoming ever harder to answer, as with the increasing complexity of GPUs, performance becomes ever more optimization specific.

Doom 3 was programmed according to the OpenGL spec, yet performed differently on different GPUs. Optimisations are necessary, regardless of how standardised the DirectX / OpenGL spec is.
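
For a rough idea of the kind of optimisation the quote describes, here's a toy C++ sketch; the values and names are made up for illustration, this is not id's actual shader code.

[code]
// Toy C++ sketch of the trade-off described in the quote above; an
// illustration only, not id's actual shader code. A look-up table favoured
// the FX series, while computing pow() directly suited the X800 better.
#include <cmath>
#include <cstdio>

static float lut[256]; // precomputed pow(x, shininess), the "manual" route

void BuildLut(float shininess)
{
    for (int i = 0; i < 256; ++i)
        lut[i] = std::pow(i / 255.0f, shininess);
}

int main()
{
    const float shininess = 16.0f;
    const float ndoth     = 0.9f; // N.H term from the specular equation

    BuildLut(shininess);
    float viaTable = lut[static_cast<int>(ndoth * 255.0f)]; // table look-up
    float viaMath  = std::pow(ndoth, shininess);            // "moved to hardware"
    std::printf("table=%f direct=%f\n", viaTable, viaMath);
    return 0;
}
[/code]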
 
Soldato
Joined
22 Mar 2008
Posts
11,670
Location
London
Well, that's the kind of attitude that is costing ATI. If NVidia hadn't worked with the developer then they wouldn't have had support either; ATi can't expect their rival to fix things on their cards.

As SE-Lain pointed out, ATI have a LONG history of not fixing bugs until after a game is released (hotfix drivers etc); it seems they're only making an issue of these bugs because they affect benchmarks.


The main reason ATi even have to patch things is that developers go outside the DX and OpenGL specs and try to implement lower-level things.

AA, however, can be handled via DX, though at times it is beneficial to handle it outside the spec, directly on the cards.
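
For anyone wondering what "handled via DX" looks like, here's a minimal Direct3D 9 sketch (error handling trimmed): the same call path runs on any vendor's conformant driver.

[code]
// Minimal Direct3D 9 sketch (error handling trimmed): requesting 4x MSAA
// through the API itself, so the same path works on any conformant driver.
#include <d3d9.h>

bool RequestMsaa(IDirect3D9* d3d, D3DPRESENT_PARAMETERS& pp)
{
    DWORD quality = 0;
    // Ask the driver whether 4x multisampling is supported for this format.
    if (SUCCEEDED(d3d->CheckDeviceMultiSampleType(
            D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
            pp.BackBufferFormat, pp.Windowed,
            D3DMULTISAMPLE_4_SAMPLES, &quality)))
    {
        pp.MultiSampleType    = D3DMULTISAMPLE_4_SAMPLES;
        pp.MultiSampleQuality = quality - 1; // highest reported quality level
        return true;
    }
    return false; // fall back to no AA instead of a vendor-specific path
}
[/code]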

Again, however, it is the developer's choice to go outside the frameworks and develop features at a lower level, or with other vendor-specific frameworks (PhysX).
It is not for the hardware companies to then have to help every single developer who goes outside the specs get their game running on the hardware; that is the developer's job.

As shown, the AA works fine on ATi once it no longer says it's ATi, so it must have been disabled for one of two reasons:
1. The developer did not bother testing on ATi hardware and thus could not guarantee it would work.
2. The developer disabled it on purpose.

Thing is though, not testing on ATi hardware is not an excuse ... as a developer you should try to test things on as many of the available platforms as possible; the only reason not to test on ATi is because you got paid by ATi's competitor not to do so.
Which makes reason 1 the same as reason 2.
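
To illustrate what "once it no longer says it's ATi" implies, here's a hypothetical Direct3D 9 sketch of a vendor-ID gate; nobody has published the game's actual code, so this is only the kind of check the ID-spoofing result points at.

[code]
// Hypothetical Direct3D 9 sketch of a vendor-ID gate; the kind of check the
// hardware-ID spoofing result points at, not the game's actual code.
// 0x10DE and 0x1002 are the standard PCI vendor IDs.
#include <d3d9.h>

bool AllowInGameAA(IDirect3D9* d3d)
{
    D3DADAPTER_IDENTIFIER9 id = {};
    if (FAILED(d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id)))
        return false;

    // Gating on identity rather than capability: report a different
    // VendorId and the "unsupported" feature suddenly works again.
    return id.VendorId == 0x10DE; // Nvidia only; 0x1002 (ATi) is refused
}
[/code]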

Doom 3 was programmed according to the OpenGL spec, yet performed differently on different GPUs. Optimisations are necessary, regardless of how standardised the DirectX / OpenGL spec is.
Except that the DX spec now has AA included, whereas before I don't think it did.
 
Associate
Joined
6 Nov 2005
Posts
157
Validating AA on hardware just means that you need to test your game on an ATi card. How hard is it to get hold of an ATi card?
 
Associate
Joined
27 Jun 2009
Posts
790
Location
Preston, UK
Validating AA on hardware just means that you need to test your game on an ATi card. How hard is it to get hold of an ATi card?
Exactly. I find it hard to believe that the developers would forget to, or wouldn't want to, test a feature on cards from one of the two major graphics manufacturers, which leads me to believe it's down to money or contractual conditions with nVidia.
 
Associate
Joined
23 May 2008
Posts
420
Doom 3 was programmed according to the OpenGL spec, yet performed differently on different GPUs. Optimisations are necessary, regardless of how standardised the DirectX / OpenGL spec is.

That's what I am saying;

in this case id adhered to the OpenGL spec.

Optimizations in the hardware drivers are necessary to make the hardware run to THAT spec. However, what I am hearing in this forum is that a hardware company SHOULD collaborate with the developers directly, bypassing the said SPEC, and have parts of the game "hard-coded" to cater to specific hardware, ditching the rest.

It's funny... Say a game uses DirectX 10.1. If a video card is DirectX 10.1 compatible, it should be able to run it, whether it was made by ATI, nVidia, Intel, etc.
 
Soldato
Joined
7 Mar 2008
Posts
2,614
Location
Kent
I thought the only reason we have stable development is because of DirectX. The drivers are there to tell the PC what is there and how to use it.

The developers then use the functions in DirectX to call and access things on the graphics card; with DirectX the exact graphics card is irrelevant, that is the whole point of it.

One thing: ATi have had this tessellation (geometry from normal maps) feature, fully functional in the cards and in the drivers for a long time, but no game has used it because it is not implemented in DirectX until version 11.

If the developer implements their code using DirectX, as others have said, then it will work fine on either manufacturer's hardware. If they then get errors with a game after coding it in DirectX, it is the fault of the manufacturer and the drivers, and they do need to fix it.

I think that's right; if it is, then what Nvidia has been saying doesn't make sense on any level.
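
A minimal sketch of that point, assuming Direct3D 9: the spec-friendly way to vary behaviour is to query what the device reports it can do, not whose badge is on it.

[code]
// Minimal Direct3D 9 sketch: feature decisions driven by reported
// capabilities, which any conformant driver fills in the same way,
// rather than by the vendor's identity.
#include <d3d9.h>

bool SupportsShaderModel3(IDirect3D9* d3d)
{
    D3DCAPS9 caps = {};
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        return false;
    // Same check, same answer, whoever made the card.
    return caps.PixelShaderVersion >= D3DPS_VERSION(3, 0);
}
[/code]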
 
Soldato
Joined
16 Jan 2003
Posts
10,596
Location
Nottingham
For all those people saying Nvidia are perfectly right to do this: how would you like it if AMD paid a dev to only enable multi-core support when an AMD CPU was detected, so that on an Intel CPU it would only run single-threaded? I'm sure you'd all be complaining then.
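
Spelling the analogy out, here's a purely hypothetical C++ sketch (GCC/Clang on x86): thread count gated on the CPUID vendor string instead of the actual core count. No real product is known to do this.

[code]
// Purely hypothetical C++ sketch of the analogy above (GCC/Clang, x86):
// gating thread count on the CPUID vendor string rather than on the
// actual core count. No real product is known to do this.
#include <cpuid.h>
#include <cstring>
#include <string>
#include <thread>

unsigned WorkerThreads()
{
    unsigned a = 0, b = 0, c = 0, d = 0;
    __get_cpuid(0, &a, &b, &c, &d); // leaf 0: vendor string in EBX, EDX, ECX
    char vendor[13] = {};
    std::memcpy(vendor + 0, &b, 4);
    std::memcpy(vendor + 4, &d, 4);
    std::memcpy(vendor + 8, &c, 4);

    if (std::string(vendor) == "AuthenticAMD")
        return std::thread::hardware_concurrency(); // full multi-core
    return 1; // everyone else gets a single thread
}
[/code]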
 
Caporegime
Joined
18 Oct 2002
Posts
33,188
Validating AA on hardware just means that you need to test your game on an ATi card. How hard is it to get hold of an ATi card?

What's even more ludicrous is the suggestion that Nvidia spent tonnes of money testing all their cards for compatibility for the game makers, to make sure it all worked.

Yet AA worked with NO ISSUES at all in the demo, with apparently no testing whatsoever. So either Nvidia spent tonnes making it work because there was a problem that ATi somehow didn't have, or Nvidia didn't need to do tonnes of testing because it just worked... either way it worked fine on ATi, testing or not, and they took out a FULLY WORKING FEATURE with no bugs and no problems.

Even more ridiculous, Batman wasn't a "new" game when ported to the PC; it was already working on consoles, including a console with an ATi GPU in it, and didn't have issues.

Testing, incompatibility, paying for PhysX: none of this is an issue or remotely relevant to the thread. Nvidia did two things. First, they paid a company to remove an already working feature from ATi-only cards, a feature which comes straight back and works fine the second you change the hardware ID of the card; there is proof of this everywhere.

It clearly works on ATi hardware, so compatibility testing is a non-issue. Second, there's the tripe PhysX in the game, which clearly works fine on a CPU when not artificially limited to one core: you realise, again, that Nvidia are sabotaging their own API to run so slowly that it only works well, without tweaking, on their hardware. These are average effects we've had for years that they themselves sabotage to be purposefully slow, harming ALL gamers, just so people use their cards.
 
Soldato
Joined
29 May 2007
Posts
4,898
Location
Dublin
Or, a better option, don't buy the game or their hardware; that's what I do, lol.

I'm sure they'll say "it's just business", so there would have to be a financial incentive for them to stop this type of behaviour. I'm not going to say anything nuts like I'll never buy another Nvidia card again (my 8800 GTX is great), but I honestly do not want to support a corporation that behaves like this, so no more Nvidia cards for me for the next few years.
 
Associate
Joined
2 Apr 2006
Posts
1,190
Location
Somewhere Fabulous...
What's even more ludicrous is the suggestion that Nvidia spent tonnes of money testing all their cards for compatibility for the game makers, to make sure it all worked.

Yet AA worked with NO ISSUES at all in the demo, with apparently no testing whatsoever. So either Nvidia spent tonnes making it work because there was a problem that ATi somehow didn't have, or Nvidia didn't need to do tonnes of testing because it just worked... either way it worked fine on ATi, testing or not, and they took out a FULLY WORKING FEATURE with no bugs and no problems.

Even more ridiculous, Batman wasn't a "new" game when ported to the PC; it was already working on consoles, including a console with an ATi GPU in it, and didn't have issues.

Testing, incompatibility, paying for PhysX: none of this is an issue or remotely relevant to the thread. Nvidia did two things. First, they paid a company to remove an already working feature from ATi-only cards, a feature which comes straight back and works fine the second you change the hardware ID of the card; there is proof of this everywhere.

It clearly works on ATi hardware, so compatibility testing is a non-issue. Second, there's the tripe PhysX in the game, which clearly works fine on a CPU when not artificially limited to one core: you realise, again, that Nvidia are sabotaging their own API to run so slowly that it only works well, without tweaking, on their hardware. These are average effects we've had for years that they themselves sabotage to be purposefully slow, harming ALL gamers, just so people use their cards.

How do you know FOR CERTAIN that NV paid someone to remove this? All you have is a statement from an ATI guy's blog. It's hardly conclusive proof.

For all you know it was a decision taken by the devs with no influence from NV. Sure, it worked fine for the limited part the demo showed, but you don't know about the rest of the game. Please stop spouting your opinion as fact; we all know how you feel about NV, it's all over this forum!

In NV's statement they say UE3 doesn't support AA natively, which is true. BioShock didn't have it, UT3 didn't have it, so what makes it so hard to believe that NV worked with the devs to get in-game AA working on their hardware?

I agree with the PhysX side of things, however; limiting it artificially is shooting themselves in the foot.
 
Associate
Joined
13 Feb 2006
Posts
465
Location
UK
Half-Life 2 was optimised for ATI and had issues on nVidia at first release.

How people forget.

It's six of one, half a dozen of the other.
 