NVIDIA DirectX 11 Questionnaire – A Response to AMD’s Views

The GPU utilisation reading is possibly bugged... but with most of the frame time spent on the CPU processing physics, the GPU isn't getting much work to do, which is why you see very low to no utilisation - see the sketch at the end of this post...

The CPU is not doing any of the rendering in Batman at all, ever, period... that is all on the GPU.

The CPU tests in 3DMark06 are, I think, all GPU-rendered as well - just with extra (unusual) workload put on the CPU to process AI, etc.
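
To make that "starved GPU" argument concrete, here's a minimal Python sketch of a serialized frame loop. The millisecond costs are made-up assumptions, not measurements from the game:

```python
# Minimal sketch, assuming the CPU physics step and GPU render step run
# serialized within each frame. The millisecond costs are illustrative guesses.
def frame_stats(cpu_physics_ms, gpu_render_ms):
    frame_ms = cpu_physics_ms + gpu_render_ms   # GPU waits for the CPU each frame
    fps = 1000.0 / frame_ms
    gpu_util = gpu_render_ms / frame_ms         # fraction of each frame the GPU is busy
    return round(fps, 1), round(gpu_util * 100)

print(frame_stats(cpu_physics_ms=4, gpu_render_ms=10))    # (71.4, 71) - light physics
print(frame_stats(cpu_physics_ms=90, gpu_render_ms=10))   # (10.0, 10) - heavy CPU PhysX
```

Same render cost per frame in both cases, but when CPU physics dominates the frame, both the frame rate and the GPU utilisation reading collapse together.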

GPU utilization monitoring is not bugged; it works perfectly fine in all games. For the GPU to not be utilized at all with full PhysX on is BS - this has been implemented deliberately by NV to screw over ATI performance. OK, don't let the GPU do the PhysX, but let it render the rest of the game - but no, we get 0% GPU usage.
 
Well, if my GPU is doing 0%, I guess the whole game is running on the CPU, and from the looks of my CPU usage that is very unlikely. Batman has screwed ATI users over big time.

It's beyond ludicrous to me that you're moaning about an Nvidia technology running poorly on your ATi card. Not only is this not a genuine grievance, it's not even a grievance; it's akin to a child eating paste and then complaining that it tastes funny.

You bought the card knowing all of this. Even more damningly, anyone with the retail version would've had to voluntarily patch the game to enable the effects! Which even then can be disabled at any time! I mean, you really went the extra mile here; this situation isn't something that was thrust upon you out of nowhere.

Frankly, your dedication to meritless whining is some of the best I've ever seen.
 
^^ Another troll enters the forum. This has got nothing to do with which vendor you go with; it's about NV pushing developers to cripple ATI performance to the max. Only an NV fanboy would disagree, so move along.
 

If it's doing 10-15fps but not registering any GPU usage... either something is broken, or 10-15fps is so low it registers as less than 1% usage... which I don't believe with scenes as complex as Batman's... at a rough guess that's about 8%+ usage.

There is no way it's rendering on the CPU... absolutely no way - you'd be looking at around 1 frame every 4-5 seconds... that, and the fact that the engine has no working software renderer - so you'd have had to install your own software 3D renderer as a replacement for D3D9.
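
For what it's worth, that "about 8%" rough guess works out if you assume the card could otherwise render these scenes at around 150fps flat out. The ceiling is an assumption for illustration, not a measurement:

```python
# Back-of-envelope for the "about 8%" guess: busy fraction is roughly the
# observed frame rate over an assumed uncapped frame rate at full GPU load.
observed_fps = 12.0
assumed_max_fps = 150.0   # hypothetical full-load frame rate for these scenes
print(f"estimated GPU busy fraction: {observed_fps / assumed_max_fps:.0%}")  # 8%
```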
 

Hell, HEL hell! :)
 
Throwing money at a developer to make the game run great on your hardware is fine. If nVidia want to do that, more power to them. The thing is, there is no way in hell they would have let AMD anywhere near Batman to get it optimised for their hardware, even if AMD had wanted to.

TWIMTBP is a club. A club where you get piles of money so long as you don't liaise with the enemy.
 
Obviously it's broken... the card is getting warmer, and ~12-15fps in Batman is definitely more than 1% GPU usage... it's obviously not rendering on the CPU - that level of texture filtering would slow it to minutes per frame.
 
I must say, in Nvidia's defence, I am getting very tired of waiting for AMD to hurry up and get OpenCL GPU support. I mean, Nvidia already has DirectCompute 4 support for all of their hardware, plus OpenCL support. Meanwhile you're pretty much stuck with Brook+ and CAL if you're on ATi, which are more or less dead in the water, unless you have a £200+ 5800 series card, in which case you also get full DX compute support. I just hope AMD isn't going to forget about those of us with 'lesser' cards, especially after hearing they won't be implementing OpenCL on the 3800 series...
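
If you want to check what your drivers actually expose, a quick sketch using the third-party pyopencl package (pip install pyopencl) will list any OpenCL platforms and devices. On a card whose drivers don't support OpenCL the list simply comes back empty, or the call may raise if no OpenCL runtime is installed at all:

```python
# Enumerate whatever OpenCL platforms/devices the installed drivers expose.
import pyopencl as cl

for platform in cl.get_platforms():
    print(platform.name, platform.version)
    for device in platform.get_devices():
        print("   ", device.name, "-", device.version)
```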
 
It's getting warmer because the clocks are up and so are the volts; on the desktop with PowerPlay it's 33°C idle.

Just tried it with PhysX off and GPU usage was 40%; with 4xAA it was 70%, so monitoring is fine.
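
Those two numbers are actually consistent with a working monitor: at a CPU-limited frame rate, utilisation is roughly frames per second times GPU milliseconds per frame, so 4xAA (more GPU work per frame, same fps) pushes the reading up. A sketch with assumed numbers:

```python
# Hedged sketch: utilization from per-frame GPU time at a CPU-limited frame
# rate. The frame rate and per-frame costs are assumptions for illustration.
def gpu_utilization(fps, gpu_ms_per_frame):
    return min(fps * gpu_ms_per_frame / 1000.0, 1.0)   # busy fraction per second

fps = 60.0                                  # assumed CPU-limited frame rate
print(f"{gpu_utilization(fps, 6.7):.0%}")   # ~40% - no AA
print(f"{gpu_utilization(fps, 11.7):.0%}")  # ~70% - with 4xAA
```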
 

So in theory, PhysX on is hogging the CPU so much that there is not enough left to feed the GPU, so your GPU usage is down but your CPU usage should be way up.
 

There is a patch for the CPU PhysX, and a trick to get the in-game AA to work on ATI cards using 3DAnalyse.

I'm sure I posted it here already, but my post seems to have been deleted and I would like to know why.
If in fact I'm in error and have not posted it, I will do so later.
 
Why they didn't at least include a stock DX AA path I don't know - kinda lazy... and whether ATI could really have got their own path included, had they been motivated to do it, is another matter which I can't answer.

ATI claims they provided an AA solution, but the dev didn't include it. I wonder why... nothing to do with it being an Nvidia-sponsored game, surely.
 
Quote from AMD regarding performance issues on Need for Speed: Shift

Need for Speed: Shift

In another TWIMTBP title, we submitted a list of issues that we discovered during the game's development. These issues include inefficiencies in how the game engine works with our hardware, in addition to real bugs, etc. We have sent this list to the developer for review.

Unfortunately you will be unable to get a fair assessment of our hardware’s performance on this software until the developer releases a patch to address and fix our reported issues.

Wouldn't it be great to know if these "inefficiencies" are deliberate or just a side effect of coding it to run well on nVidia's hardware.

Guess we will never know for sure.
 
That response reeked of politics and a stale company. "We focus on graphics plus cool effects like physics and 3D stereo, whereas AMD just makes incremental changes to their graphics cards."
And disabling AA in a game for Radeon cards, then using that same game to promote your own card... is that legal? ;)
It's all really pathetic by Nvidia :o
 