Frame rate issues with The Witcher 2

Surely, my PC can run this game on max...

It's stable just running around town, but it doesn't like trees or foliage.

What settings will I have to turn down to play this game well?

A new graphics card is in order, it seems... sigh...

Suggestions? What's better than my 5870 but won't cost me a fortune?
 
If you are using Ultra + UberSampling in the graphics options, use Ultra instead!

Enabling UberSampling will cripple even a GTX 580 down to an unplayable frame rate (frame rate drops by 50-60% going from Ultra to Ultra + UberSampling):
http://www.benchmarkextreme.com/Articles/The Witcher 2 Performance Review/P3.html

The thing is, with UberSampling enabled the game looks "different" rather than actually "better":
http://www.benchmarkextreme.com/Articles/The Witcher 2 Performance Review/P2.html
...so I don't think any sane person should trade off 55-60% of their frame rate for that... it's just a pointless "demanding for demanding's sake" feature. If you compare the screenshots with and without UberSampling, I don't think you can actually come to a conclusion about which one looks better.
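To put that percentage into perspective, here's the rough arithmetic (the baseline frame rates below are made-up examples, not figures from the review linked above):

[code]
# Rough illustration of what a 55-60% frame rate drop means in practice.
# The baseline fps values are hypothetical examples, not benchmark data.
for ultra_fps in (60, 45, 35):
    for drop in (0.55, 0.60):
        uber_fps = ultra_fps * (1 - drop)
        print(f"Ultra {ultra_fps} fps -> Ultra+UberSampling ({drop:.0%} drop): {uber_fps:.0f} fps")
[/code]

Even a card holding a comfortable 60 fps on Ultra ends up in the mid-20s with UberSampling on, which is why switching it off is the first thing to try.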
 
It's nothing to do with ATI. I have a 6970 and it cripples that too.

I think sometimes with these games, it's not a case of: "Wow, the game is so good and it looks absolutely amazing, my PC can't handle such amazing graphics."

It's more a case of: the game is **** and poorly coded to run properly on lower-end hardware. It's the same with GTA IV. It looks nothing special compared with a lot of other games, yet requires so much juice to run. I personally think The Witcher 2 looks crap compared to a lot of games.

Bad Company 2 is visually stunning and it can be maxed out on any decent system. I personally don't think The Witcher 2 looks any better with its detailed backdrops etc.

It's all a load of rubbish, marketing tosh... milking more money out of hardcore gamers so they spend hundreds of pounds on graphics cards, when I'm quite sure that if it was coded properly it could run on a 5850 and still look amazing.

I could of course be completely wrong, however... I doubt it!
 
I think sometimes with these games, it's not a case of: "Wow, the game is so good and it looks absolutely amazing, my PC can't handle such amazing graphics."

It's more a case of: the game is **** and poorly coded to run properly on lower-end hardware. It's the same with GTA IV. It looks nothing special compared with a lot of other games, yet requires so much juice to run. I personally think The Witcher 2 looks crap compared to a lot of games.
Yeah... I get what you're saying. Poor coding aside, we also have this game-scaling thing going on, with games delivering a "bonus" frame rate for one side compared to the other.

From what I recall, The Witcher 2 runs better on Nvidia cards, whereas in Shogun 2 the game is so heavily optimised towards AMD that an overclocked 6950 is as fast as a GTX 580 (normally an overclocked 6950 matches GTX 480 speed).
 
In my opinion, the game looks amazing, but I play the likes of CS:S, so I guess.

Well, I might just Crossfire my card and see if that makes a difference.
 
In my opinion, the game looks amazing, but I play the likes of CS:S, so I guess.

Well, I might just Crossfire my card and see if that makes a difference.
To be honest, if most of the games you play are not that demanding, you'd probably be better off tweaking your graphics settings for The Witcher 2: reduce the settings a bit and see if you can find a good balance between frame rate and graphical detail. I just don't think it's worth getting another 5870 to Crossfire if The Witcher 2 is the only game you're having issues with; better to save the upgrade for the next-gen 28nm cards.
 
But that's the thing: if I had the Nvidia equivalent, this game would play fine, which is why maybe swapping is a good idea?
 
I've seen people getting 50-60 fps with that card @ 1080p on max settings without UberSampling, so I'm not entirely sure it's AMD's fault for not paying the developer to be biased towards their cards.
 
I've seen people getting 50-60 fps with that card @ 1080p on max settings without UberSampling, so I'm not entirely sure it's AMD's fault for not paying the developer to be biased towards their cards.

I'm not talking about bias; I'm talking about driver issues. My card is only showing 30% GPU activity while playing - it isn't even using all of my card.

You would have thought after all these years they could get it right...

 
I'm not talking about bias; I'm talking about driver issues. My card is only showing 30% GPU activity while playing - it isn't even using all of my card.
Unfortunately it's not perfect on the other side either: Nvidia cards suffer driver issues in Dragon Age II, so people have a similar problem with GPU usage being low...

So it's just the way it is...with games...
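For what it's worth, that 30% GPU activity figure is the classic symptom of the frame rate being limited by the CPU/driver side rather than by the card itself. A crude frame-time sketch (the numbers are made-up examples, not measurements from The Witcher 2) shows why a faster GPU, or a second one in Crossfire, may not help much:

[code]
# Crude single-frame model: if the GPU is only busy for part of each frame,
# the rest of the frame time is spent waiting on the CPU/driver, and a faster
# GPU cannot shrink that part. All numbers are hypothetical examples.

frame_time_ms = 33.3                      # observed ~30 fps
gpu_utilisation = 0.30                    # GPU busy for ~30% of each frame
gpu_busy_ms = frame_time_ms * gpu_utilisation     # ~10 ms of actual GPU work
cpu_driver_ms = frame_time_ms - gpu_busy_ms       # ~23 ms of CPU/driver time

# Even an infinitely fast GPU can't push the frame time below the CPU/driver part:
best_case_fps = 1000 / cpu_driver_ms
print(f"GPU busy: {gpu_busy_ms:.1f} ms, CPU/driver: {cpu_driver_ms:.1f} ms")
print(f"Best case fps with a faster GPU: ~{best_case_fps:.0f}")
[/code]

So until the driver (or the game's threading) improves, throwing more GPU power at it mostly just lowers the reported utilisation further.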
 