• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

AMD working with a major developer to create a never-before-seen DirectX® 11 technology.

I know what you're referring to; it's a poorly optimised mess of a tessellation implementation.
But it's not like that tessellation disappears if you're using an Nvidia card.
It's pretty much the same situation as the GTX 680 in Showdown in the end result: one vendor can't perform as well as possible due to a limitation in the card.

You can turn off the tessellation/use a less intensive setting.

I think it's sloppy, and backroom dealing aside, the end result is the same as Showdown.

Indeed. That's the way I view it as well.

@tommybhoy - By getting bogged down in endless "but this" and "yes, but they do this" you can end up getting mixed up in all the detail, not all of which is really that relevant, and which is open to (mis)interpretation.

I'm more interested in the high level approach - effects introduced by AMD/nVidia purposefully perform better on their own hardware.

As I said, it's not exactly AMD being altruistic in implementing these effects in titles - it's to emphasise that the experience is enhanced on AMD hardware which doesn't sound too dissimilar to TWIMTBP claims, no? :p

I understand the argument that it's nVidia's fault that these effects can't perform as well on nVidia cards, as they themselves chose to strip all the bloat out of the Kepler cards, but that still doesn't detract from the fact that these effects have obviously been implemented with the Kepler specs in mind. Anyone stating otherwise is being extremely short-sighted.
 
No they did not. Show me where AMD were given GPU PhysX and didn't take it up - go on, that's your challenge for the day.

In any case I never said anything in this thread about Nvidia being the evil one. What are you having a go at me for? You're just picking fights with random people, gregster.

They absolutely were given the chance to implement GPU PhysX. Unfortunately, finding the relevant hard quotes, etc. after all this time will be difficult and I don't remember specifics, but I have posted it on here before, many years ago, so if anyone wants to dig up my old posts on it, it's there somewhere (may have gone into the archive by now as I can't find them at a quick glance).

The gist of it was nVidia were "prepared" to open up GPU PhysX to *anyone* as long as they also ported CUDA - and ATI/AMD turned them down as they wanted to push their own compute system, Stream. Somewhat ironic that they then turned around and in the same breath stated that PhysX being proprietary had no future and that they were "committed" to open standards, when their real concern was that embracing CUDA would kill off Stream - which died a death anyway due to their lack of putting support behind it.

At the end of the day no one actually took nVidia up on their "offer", so we have no way to know how serious they were about it. Beyond AMD, the only other companies that would have been in any way interested at all would be Intel and possibly Sony, and neither are really big players in this area.
 

Then why are GPU PhysX and CUDA not just on the open market?

If Nvidia are not bothered about who uses it, why don't they make it open source?

You said:
The gist of it was nVidia were "prepared" to open up GPU PhysX to *anyone* as long as they also ported CUDA
Well that's great, so why did they not go ahead and do that?
 
I predict that they are working hard with the different vendors to ensure that all future graphics cards will have rainbow coloured coolers on them. CANNOT WAIT.
 
Then why are GPU PhysX and CUDA not just on the open market?

If Nvidia are not bothered about who uses it, why don't they make it open source?

You said: Well that's great, so why did they not go ahead and do that?

Because they had a temper tantrum after no one took them up on the offer and everyone they approached turned them down, so they basically did the "fine, we'll keep it to ourselves and you'll regret it in the long term" routine.
 
^
The end result is animosity over PhysX deployment. :(

@martini, rusty,

End result is how it plays out on each vendor's hardware at the time - 100% agreement. I'm talking about the morality of how it's employed; there's a big difference between having to adjust settings in-game and having to wait for driver teams to add an override option at the driver level/get lawyers involved/patch out DX features. :)
 
Because they had a temper tantrum after no one took them up on the offer and everyone they approached turned them down, so they basically did the "fine, we'll keep it to ourselves and you'll regret it in the long term" routine.

That's a shame.
 
If everyone turned them down then the deal offered was likely bad. Nvidia paid to acquire PhysX; it doesn't make sense for them to give the tech away, so it's more likely they wanted to license it out to other companies. We could have been looking at all the higher-spec AMD cards with CUDA/PhysX included costing something like £50 extra, and with more people using it that would have meant a permanent extra cost for all non-Nvidia GPUs. It's better these things are developed as open standards rather than giving one company control to tax the rest.
 
I'm talking about the morality of how it's employed; there's a big difference between having to adjust settings in-game and having to wait for driver teams to add an override option at the driver level/get lawyers involved/patch out DX features. :)

Maybe so but it's still just a variant of the same thing.

Good to see you guys are up to the usual and trashing a potentially interesting topic.

Thanks guys.

And that is an extremely helpful/worthwhile comment to add into the mixer.

Thanks Scougar.

:rolleyes:
 
Maybe so but it's still just a variant of the same thing.

They aren't the same though. Sleeping Dogs and AMD are a great example: yes, Extreme AA affects Nvidia cards badly, but it's an AMD Gaming Evolved game and it's as optional as it gets - it isn't forced on anyone, and isn't much better than standard. The only reason the setting runs so badly on Nvidia is because they abandoned the tech; nobody asked them to, they had it but got rid of it. AMD are under no obligation to cater to Nvidia, but they certainly don't try to force things they can do better.

If Nvidia did that, they'd be fine, but as it goes they don't. In Crysis 2 they made tessellation mandatory (let's face it, Crysis 2's tessellation quality on lowest is more than most games have on highest) and applied it to objects that didn't need it. They even did things such as forcing the ocean under the entire game to be rendered - which would affect even their own cards - just to further cripple AMD with heavy water tessellation, knowing AMD didn't have the tech to tessellate as well.

So yeah, if Nvidia just added the flashy and pretty effects as optional extras in their TWIMTBP titles, they wouldn't be so hated.

It's the same on the consoles, Sony vs Microsoft: Sony have an advanced tech (Blu-ray) which is a benefit to their system. They could just as easily request that multi-platform developers program their games to be as uncompressed as possible in order to require multiple DVDs, but they don't. A big reason for in-engine cutscenes is to save disc space, but Sony could have asked developers to enforce pre-rendered ones instead, to be a pain for the DVD platforms.

Nvidia seek out opportunities like that and take them, and pour millions into putting publishers in their pockets to do it as often as they can. It's bad ethics, and amazing that they get away with it tbh.
 
They aren't the same though. Sleeping Dogs and AMD are a great example: yes, Extreme AA affects Nvidia cards badly, but it's an AMD Gaming Evolved game and it's as optional as it gets - it isn't forced on anyone, and isn't much better than standard. The only reason the setting runs so badly on Nvidia is because they abandoned the tech; nobody asked them to, they had it but got rid of it. AMD are under no obligation to cater to Nvidia, but they certainly don't try to force things they can do better.

You're still completely missing the point.

You can read back a page or two to see this point countered. Anyway, as you've now brought it back full circle and we're covering the same points I think the discussion has probably run its course and it should revert to topic as requested by Jokester :)
 
I look forward to seeing what it is, even though the chances are not many games will ever use it.
 