
AMD working with a major developer to create a never-before-seen DirectX® 11 technology.

Well again, it is vice versa. Omaeka is the one who wrongly accused Nvidia of gimping Dead Rising 2: Off the Record.

Time and time again, I have been told that PhysX just isn't worth it, blah blah, but as soon as it gets mentioned in a different thread, Nvidia are the bad guys for not allowing AMD to use PhysX. As I pointed out before, AMD had the opportunity to have this technology but, for whatever reason, didn't take it up.

Stand by what you guys say at least.

IIRC, it wasn't quite so clear cut as to what that offer entailed. The first piece you provided actually went against the thesis that it was a genuine offer, and the second piece again wasn't crystal clear.

I keep hearing about this PhysX 2.0, though, which is meant to be a lot better on the CPU.

EDIT: In your next piece, AMD aren't saying what you've implied, and they're dead right: pushing proprietary technology isn't the way forward. Hardware-accelerated physics are needed, but not in the form of what we have now.
That said, the open-source Bullet project didn't seem to go anywhere.

Ideally, a collaboration between Nvidia and AMD, pooling their resources and technology to create an open standard, would be much better.
 
Those things are powered by OpenCL; it just so happens that OpenCL on Kepler is pathetic, so those things will run much better on Fermi.

It's not AMD's fault Nvidia decided to strip out OpenCL on Kepler :)

Introducing effects which they know don't run well on the comparable nVidia cards is the same thing, just not as sneaky/obvious. It's not like they've introduced them as a show of altruism; it's to make their cards look like they're performing amazingly. Fair play, I don't have a problem with it... I'm just pointing out that it's hardly a one-sided phenomenon.
 

Because they offered extreme anti-aliasing? Regular looks fine, and IIRC the extreme anti-aliasing was just uber amounts of SMAA, something Nvidia cards did first.

Oh, btw, it's optional, thought I'd point that out; a lot of Nvidia's crippling effects were enforced.
 
Introducing effects which they know don't run well on the comparable nVidia cards is the same thing, just not as sneaky/obvious. It's not like they've introduced them as a show of altruism; it's to make their cards look like they're performing amazingly. Fair play, I don't have a problem with it... I'm just pointing out that it's hardly a one-sided phenomenon.

I said that exact thing before.
It's too much of a coincidence that they've been used at a time when Nvidia is weakest in that area, especially when AMD was always weaker in that area compared to Nvidia.

Before anyone mentions my prior point about AMD and tessellation: the Heaven benchmark was out long before Fermi cards were, and AMD chugged to hell and back on extreme tessellation in it.
 
AMD beat Nvidia in every single price range and offer much more power for the money; the only way Nvidia can compete with that is by doing everything they can to cripple AMD. Heaven forbid they should start offering actual hardware that has good value for money; no, instead they try their best to force the consumer into buying their overpriced garbage.

That, my friend, is the truth.

You're forgetting when Kepler cards were faster and cheaper than the equivalent Tahiti ones at release... D'oh.
 
IIRC, it wasn't quite so clear cut as to what that offer entailed. The first piece you provided actually went against the thesis that it was a genuine offer, and the second piece again wasn't crystal clear.

I keep hearing about this PhysX 2.0, though, which is meant to be a lot better on the CPU.

After Nvidia’s CEO, Jen-Hsun Huang, said that Nvidia planned to provide PhysX support in CUDA, many people (including us) thought this meant that Nvidia planned to keep PhysX all to itself. However, the company has confirmed that it’s going to stick to its guns by making PhysX a free API that’s available to anyone.

Nvidia’s director of product PR for EMEA and India, Luciano Alibrandi, told Custom PC that ‘We are committed to an open PhysX platform that encourages innovation and participation,’ and added that Nvidia would be ‘open to talking with any GPU vendor about support for their architecture.’

As well as this, Alibrandi also promised that the free PhysX SDK would continue to be available to game developers. ‘We plan to continue supporting all key gaming platforms, including the PC and all next-gen consoles, with free PhysX binaries,’ said Alibrandi. He also added that Nvidia planned to make this a ‘continually improving set of tools in an open development platform that encourages leading-edge partners to extend the PhysX eco-system.’

Nvidia is currently working on implementing PhysX into its CUDA language, which is supported by all GeForce 8-series GPUs. When this is ready to go, Alibrandi said that owners of these GPUs will ‘simply need to download the CUDA PhysX drivers from Nvidia,’ and that ‘hardware acceleration will then be transparently supported for applications making use of the PhysX SDK.’

Nvidia plans to support PhysX in a number of ways, and Alibrandi says that these ‘could include both single and SLI based options.’ He also confirmed that Nvidia’s relationship with Havok is now over, saying that ‘we are 100 per cent focused on enabling CUDA-based GPUs to accelerate PhysX processing.’

If you’re one of the rare owners of a PhysX card, then you’ll be pleased to know that Alibrandi also confirmed that Nvidia would ‘continue to support the PhysX processor as demand dictates,’ although he said that CUDA-enabled GPUs would ‘outperform the PPU.’ Interestingly, when we asked if Nvidia would finally reveal the details of the inside of the PhysX chip, he replied: ‘Maybe.’ Ageia was very secretive about the inner workings of the PhysX chip, and we’d love to know what was inside it.

Either way, it looks as though there’s hope for AMD / ATI getting a bite of the GPU PhysX pie after all; the guys at AMD just need to decide whether they want it.

http://www.bit-tech.net/custompc/news/602205/nvidia-offers-physx-support-to-amd--ati.html

PhysX 3.0?

http://techreport.com/news/21088/physx-3-0-adds-support-for-multi-core-cpus

This will be quite big in Arma III, unless I have read it wrong, and will be available on both AMD and Nvidia.
 
Well again, it is vice versa. Omaeka is the one who wrongly accused Nvidia of gimping Dead Rising 2: Off the Record.

Time and time again, I have been told that PhysX just isn't worth it, blah blah, but as soon as it gets mentioned in a different thread, Nvidia are the bad guys for not allowing AMD to use PhysX. As I pointed out before, AMD had the opportunity to have this technology but, for whatever reason, didn't take it up.

Stand by what you guys say at least.

No they did not. Show me where AMD were given GPU PhysX and did not take it up; go on, that's your challenge for the day.

In any case, I never said anything in this thread about Nvidia being the evil one. What are you having a go at me for? You're just picking fights with random people, gregster.
 
No they did not. Show me where AMD were given this technology and did not take it up; go on, that's your challenge for the day.

In any case, I never said anything in this thread about Nvidia being the evil one. What are you having a go at me for? You're just picking fights with random people, gregster.

Post number 87; I was using your post as a reference and wasn't having a dig at you :confused:
 
An article almost 5 years old doesn't really do much for me, as it isn't anywhere near as simple as that, which is what your original quote stated last time.

Bearing in mind Nvidia went against what they stated in that post about the PhysX cards.

Either way, I'm an advocate of hardware-accelerated physics; I just don't think PhysX is the way, unless it changes radically so that it's able to run on AMD GPUs as they are.
 
Yep, it wasn't that long ago that the 670 was the bang-for-buck recommendation, when a £300 card was nearly matching a £400 card :)

I'd sooner have a well-coded, well-supported great game than something half-arsed with some bonus features for a specific card.
 
Cheapest GTX 680 = £313

Cheapest 7950 = £220

Go figure.

The 680 may bench slightly higher in most TWIMTBP games, but the 7950 will leave the 680 behind at higher resolutions.

Come at me bro.

At launch the 7950 was upwards of 400 quid, the Sapphire model.
The GTX 680 was besting the 7970, even in the AMD titles: Shogun 2, Dirt 3.
 
An article almost 5 years old doesn't really do much for me, as it isn't anywhere near as simple as that.
Bearing in mind Nvidia went against what they stated in that post about the PhysX cards.

I couldn't care less either way, to be perfectly honest; whether ATI were given the opportunity and turned it down, or Nvidia openly lied, only a few people will ever know.

My head goes in my hands when Nvidia get accused of gimping a game by someone who is clueless. I know Nvidia are far from squeaky clean, but DR2 isn't something they did.

As for the Hitler picture: a little out of order :(
 
Launch doesn't mean anything; Nvidia aren't bothered about dropping their prices, whereas AMD will. Nvidia isn't where the value for money is at.




As for the Hitler picture: a little out of order :(

I know mate, no idea how Nvidia could endorse such a man. :/
 
Cheapest GTX 680 = £313

Cheapest 7950 = £220

Go figure.

The 680 may bench slightly higher in most TWIMTBP games, but the 7950 will leave the 680 behind at higher resolutions.

Come at me bro.

Now go look at the release-day pricing(s). AMD obviously wasn't confident in their ability to outsell nVidia at the same price point, hence the price drops.

Price drops are never a bad thing, of course...

The 680 is just outright faster than a 7950, actually, not just in TWIMTBP games. It's only when both are overclocked and at triple-screen resolution that the 7950 becomes noticeably faster.

I'm not sure what place "come at me bro" has in this debate; perhaps you aren't really suited to this kind of discussion.

Launch doesn't mean anything; Nvidia aren't bothered about dropping their prices, whereas AMD will. Nvidia isn't where the value for money is at.

Well, it does mean something, unless you're arguing fallaciously? :confused:

See above regarding the dropping price thing...
 