
NVIDIA DirectX 11 Questionnaire – A Response to AMD’s Views

It will actually be interesting to see what Nvidia does next. Most of their cards are out of stock at OcUK, and probably at other shops too. Next week ATI will be releasing their mid-range cards, which are cheaper than the Nvidia equivalents, and Nvidia has nothing to counter with until early next year.
 
Yeah, it's bloody brilliant, isn't it? Nvidia getting owned, and fully deserved. Teach the ***** a lesson for all the crap they've been doing. Well done ATi, stick it to the ***** :D
 

I agree with "loadsa" in part: ATI has given Nvidia a real bloody nose, considering what Nvidia have done to their customers in the past (I won't go into it all here; it's all been well documented in posts on this forum). I can remember back to the days of the ATI 8500, when Nvidia released a driver set that suddenly improved performance by a good 10%-20%, so an older Nvidia card (GeForce 2 GTS, I think it was) ended up beating ATI's new one. What got me was that a load of people had bought their new Nvidia card and only found out later that its performance had been deliberately held back by the manufacturer, just so they could "get one over" on the competition. Some people might say, and probably will say, that's good business strategy; I don't. I don't like the idea of spending over £300 on a card with castrated performance for a time just so Nvidia can indulge in some dubious business tactics.
Anyway, the flip side is that Nvidia needs to keep going for the sake of competition, and I very much doubt they'll go out of business (not impossible, just improbable). They might change corporate strategy and go down a different path (consoles and what have you), which again would leave ATI with a clear field to themselves.
 
YouTube vids

These CPU effects are miles more fun than the PhysX effects; at least you can interact with them.

All I see from PhysX is the same as when it first came out as a PPU: loads of demos showing us what it can do, but when it comes down to it, what game actually makes use of useful physics, rather than smoke or shattering glass?
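For what it's worth, the kind of "CPU physics" those videos show boils down to a plain integration loop any CPU can run; here's a minimal sketch of one (all names and constants are illustrative, not taken from any shipped game or from PhysX):

```python
# Minimal CPU particle physics step: semi-implicit Euler integration
# with gravity and a simple ground bounce. Illustrative only.

GRAVITY = -9.81    # m/s^2, acting on the y axis
RESTITUTION = 0.5  # fraction of speed kept after bouncing off the ground

def step(particles, dt):
    """Advance each particle one timestep. A particle is a dict with
    'pos' and 'vel' as [x, y] lists."""
    for p in particles:
        p["vel"][1] += GRAVITY * dt      # apply gravity to velocity first
        p["pos"][0] += p["vel"][0] * dt  # then integrate position
        p["pos"][1] += p["vel"][1] * dt
        if p["pos"][1] < 0.0:            # hit the ground plane at y = 0
            p["pos"][1] = 0.0
            p["vel"][1] = -p["vel"][1] * RESTITUTION
    return particles

# Drop one particle from 1 m and simulate one second at 60 Hz.
ps = [{"pos": [0.0, 1.0], "vel": [0.0, 0.0]}]
for _ in range(60):
    step(ps, 1.0 / 60.0)
```

Scale that loop up to a few thousand particles and add collisions against level geometry, and you have the interactive debris in those videos, no dedicated hardware required.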
 
There is a patch for CPU PhysX, and a trick to get the in-game AA to work on ATI cards using 3DAnalyse.

I'm sure I posted it here already, but my post seems to have been deleted, and I would like to know why. If in fact I'm in error and have not posted it, I will do so later.

If it's the patch I'm thinking of, some of the files aren't strictly "legal", so that might be why the post/thread was deleted.
 
I don't know how people can slate PhysX in Batman, as it does add a good bit to the atmosphere.

So what if it's the only decent game that uses it.

Maybe when ATI can get a bit of it, most of the replies slating it will go away :P
 
It was slated well before Nvidia bought Ageia, so it's not as if it's because ATI cards can't run it. It will not be used properly unless ATI cards can run it anyway. Why are developers going to put lots of effort into PhysX when a big slice of the market can't use it? OpenCL is the way forward at the moment for game physics.
 
I don't get why people can't just embrace it; if it makes games look and feel good, surely it's a good thing alongside DX11 :confused:

Hopefully ATI will get in on PhysX, so this can push it on to newer and more titles.
 
It was slated well before Nvidia bought Ageia, so it's not as if it's because ATI cards can't run it. It will not be used properly unless ATI cards can run it anyway. Why are developers going to put lots of effort into PhysX when a big slice of the market can't use it? OpenCL is the way forward at the moment for game physics.

Yeah, totally agreed! OpenCL is the future! But have you noticed how much Nvidia doesn't want this to be? They think CUDA and PhysX are the future xD. Basically they hope for a future without ATI, where they've said enough bad things about DX11, about ATI hardware being slower than their old low-end stuff, etc., that people believe them. Haha, well, I can't see that day! I guess Nvidia think people are gullible, given some of the daft statements they come out with every week or so!
 
Yeah, totally agreed! OpenCL is the future! But have you noticed how much Nvidia doesn't want this to be?

From the article this thread is about.

question 5) No doubt the reason we're here today is the recent questions put to us by your Asia Pacific PR company, but we're sure the thoughts on their mind are no doubt on your mind as well. What are your thoughts on the response to the questions, with answers like "If NV really believes that DirectX11 doesn't matter, then we challenge them to say that publicly, on the record", "Proprietary standards punish gamers" and "GPU accelerated game physics will only be accepted in the marketplace when industry standards are embraced"?

answer to question 5) As I've stated earlier, DX11 is a very good thing. Anything that makes the PC gaming experience better is great.

We believe that innovation is good for gamers and not innovating punishes gamers. We support open standards, plus standards that allow NVIDIA to innovate in a timely fashion, the way CUDA and PhysX do. We want great features to come to games as quickly as possible. Via DirectX, OpenCL, Bullet or PhysX, it does not matter; we are still happy. We do not prefer one over the other.

The difference is that PhysX and CUDA are here TODAY. Even if no standard is available, we will continue to innovate for our gaming customers. It is a big differentiator for us. NVIDIA GPUs offer great graphics plus great features such as CUDA, PhysX, 3D Vision and SLI.

AMD has been talking about GPU physics for a year and a half, first with Havok and then with Bullet. In that time we have been working to make GPU physics a reality on PC games. For example, people with GeForce GPUs get an awesome in-game physics experience with Batman: Arkham Asylum TODAY. It is unfortunate for AMD GPU customers that AMD does not innovate at the same pace as NVIDIA and that they miss out on great features like PhysX and 3D Vision in blockbuster titles like Batman: Arkham Asylum.

When a game with Bullet Physics ships, NVIDIA customers will get the same great experience. Just as we support PhysX, we also support Bullet physics. In fact, it is being developed on NVIDIA GPUs and includes sample code we provided:

"ATI's Bullet GPU acceleration via Open CL will work with any compliant drivers, we use NVIDIA GeForce cards for our development and even use code from their OpenCL SDK, they are a great technology partner. " said Erwin."


It does make me laugh that even the GPU physics system ATI is eventually going to use is being developed on Nvidia GPUs.
But yes, the sooner ATI get it together, the better for all of us.
 
I don't get why people can't just embrace it; if it makes games look and feel good, surely it's a good thing alongside DX11 :confused:

Hopefully ATI will get in on PhysX, so this can push it on to newer and more titles.

All of what you said has already been answered many times over, and the likelihood is no.
 
I went to the 3D gaming event yesterday, and there was an nVidia representative there bigging up nVidia as much as he could.

He was a nice guy and all, before he started on the nVidia script.

He was raving about CUDA and how great it is, so I asked him, "well, isn't CUDA somewhat obsolete once OpenCL and DirectCompute are out?"

His response was "actually, that's a very good point, but you still need CUDA-approved hardware to run OpenCL and DirectCompute."

I asked him why he said this, because OpenCL and DC are designed as open standards that run on any hardware.

He had a mini rant about how ATi haven't brought out any drivers supporting them while nVidia has been supporting them since early 2009. I said that's irrelevant while Windows 7 isn't official yet; there's no reason to boast about supporting them before Win 7 has been released.
 
Ouch.
 
Sounds like they let the lamb in amongst the wolves :S If they're going to send a rep in amongst enthusiasts, you'd think they'd send someone who knew what they were talking about.
 
Nice one.
Putting it straight with no BS or mucking about...great.
 
Hehe, Kyle, nicely done.

Oh, and I suppose CUDA isn't obsolete yet. Well, not until Windows 7 actually ships, so it has a few weeks left to live... lol :p
 
Sounds like they let the lamb in amongst the wolves :S If they're going to send a rep in amongst enthusiasts, you'd think they'd send someone who knew what they were talking about.

Yeah, that's true; he didn't seem to have much of a clue.

He was going on about Fermi for a while too. I said to him that I don't believe they'll be out when he was claiming, if they don't have any real samples to show off.

He was genuinely oblivious to the fact that the Fermi board was a botched mock-up. He laughed when I told him about components not matching the solder points.

Then he started making excuses about how it's all right, everyone uses mock-ups for demos.

I think it caught him off guard and he didn't really know how to respond.
 