The Witcher 3 Benchmarks

No, I think you'll find individual games are where it matters.

Your results don't show anything different to what I've already shown.

The 290's performance is similar to the original Titan's.

It's no surprise that people are looking for a conspiracy though; it must be a boring time as an AMD user.

No, an average over a selection of games shows a lot more than individual games. We all know some games run faster on one vendor's hardware and some are on par, so an average over a decent selection should give you a better picture than just one or two games. I could find two games that run better on NV hardware and the results would look totally different.

Why is it boring to use a perfectly fine 290? It suits my needs and I get to game at decent settings, and I prefer gameplay over visuals anyway.
 
The 3 games I selected aren't games where AMD performs better, and the BF4 results were run in DirectX (not Mantle).

There's been little to no change outside of driver tweaks between the 290 launch review and the results in the TX review.
 
I see the whole review as the result, not the games you chose to suit your argument. Let's get this straight though: I don't know if it's gimping, but it does appear that Kepler performance has gone backwards.
 
I didn't choose them to suit my argument. You argued the 290 never competed with the Titan; I showed it did...

It wasn't just those games either; I could have shown more results where they were similar.
 
Over a selection of games the GTX 780 was more on par with a 290, and now the 290p has the upper hand over the faster Titan. The results of these reviews show that.
 
Could be anything. Anyone who writes one of these benchmark reviews injects their opinion into it, and every opinion has some sort of agenda behind it.

Also, they're not writing these pieces for free; I'm sure they're influenced by wherever their pay cheques are coming from.

My thoughts exactly. The only performance reviews and opinions I trust are my own and those of a few others on this forum and elsewhere.

At the end of the day they are paid to do what they do by the vendors they are reviewing.
 
I'd hazard a guess that if a really obscure game became a go-to for benchmarking AMD vs Nvidia, we'd see a lot of driver improvements for it. :D
 
Sure..

http://cl.ly/1b1i2E3R3F2c

Follow the readme file for the instructions. It's pretty easy.

Go into the SweetFX_settings file and change LumaSharpen to 0 instead of 1.
Turn off the sharpening feature in the game too, as it just adds false sharpening and makes it look awful.

Turn off Chromatic Aberration in the in-game options menu too.
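
In case anyone isn't sure which line to edit: it's the LumaSharpen toggle near the top of SweetFX_settings.txt. Going from memory, the exact comment text varies between SweetFX/ReShade versions, so treat this as a rough guide rather than copy-paste, but the line you want looks something like this (change the 1 to 0):

    #define USE_LUMASHARPEN 0   //[0 or 1] LumaSharpen : 0 = off, so the preset doesn't double up on the game's own sharpening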

Would like to try your settings but I think the link is broken?

Ty :)
 
I really hope they improve the performance of this title for Kepler owners. I bought a Titan a couple of months ago because I'd always wanted one, and seeing how well it ran GTA V while the game looked amazing made me super happy.

The Witcher 3, however, really struggles to get above 60fps at 1440p while not looking nearly as good, and changing settings barely alters how it performs or looks. That in turn severely affects my enjoyment of the game, as it feels sluggish and unresponsive.
 
I'll have to have a look through some of the older threads where Matt and co were banging on about how they're the greatest because they bought 290 cards that were beating £800 Titans.

Was probably dreaming though.

There are two 290 cards, one being the X model and the other the plain 290. The 290X was as fast as the Titan, with the plain 290 being slower for obvious reasons.
 
That ReShade SFX is super grainy. I customised it a little and modified some of the settings that NVIDIA mentioned in their guide. You should check it out if you haven't already.

I increased the grass draw range and density, bumped the shadow map resolution to 4084 (try 8k if you want, but it will kill performance), and extended the cascade shadow range.

[Screenshot: KXaDt9T.jpg]

But a configuration that works on three TITAN Xs may not give acceptable results for some people ;)
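
For anyone who wants to try similar tweaks, they're done in user.settings (in Documents\The Witcher 3) under the [Rendering] section. The key names and values below are from memory of the community/NVIDIA tweaking guides rather than copied from my own file, so double-check them against your own user.settings before editing; the notes after the arrows are just explanations, not part of the file:

    [Rendering]
    GrassDensity=3000                 <- denser grass, heavy on CPU/GPU
    GrassDistanceScale=1.5            <- pushes grass draw distance further out
    CascadeShadowmapSize=4096         <- shadow map resolution; 8192 looks sharper but kills performance
    CascadeShadowDistanceScale0=1.5   <- stretches each shadow cascade further out
    CascadeShadowDistanceScale1=1.5
    CascadeShadowDistanceScale2=1.5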
 
Quick question guys: my rig is running the game maxed, averaging high 30s fps with lows in the mid 30s and highs in the low 40s. Everything seems nice and smooth and I refuse to turn down settings, but will capping the fps at 30 make things smoother at all, or is a higher fps always better?
 
If you're not on a G-Sync monitor I'd recommend locking it at 30fps in-game and then turning on V-Sync. You should only get minimal input lag (it should be pretty much undetectable thanks to the fps cap) and you get rid of the tearing. On a standard 60Hz screen, 30fps also divides evenly into the refresh rate, so every frame is held for exactly two refreshes and the pacing feels more consistent than an uneven 35-40fps.
 
Nope. As you saw in the benchmarks I linked, the 290X beat the Titan in pretty much everything, and the custom 290 that most people bought (because of how bad the reference cooler was) beat it in some and lost in others.
 
Question for people running a 4k monitor setup!

I'm not getting a steady 60fps, and even with G-Sync I can notice lag on SLI 980s. For some reason at 4K I also get this weird strobe-like effect. I noticed that turning AA off (not sure why I had AA on at 4K) fixes it somewhat, but it's still visible.

However, I can max the game out at 1440p on Ultra with Nvidia HairWorks and AA and maintain a constant 60fps. The resolution difference is noticeable, but I think I prefer the more consistent smoothness of the gameplay!

So my question is: what would you guys do? 1440p maxed, or 4K with some concessions but without being able to maintain 60fps? Also, why is G-Sync not really working for me in this game? I notice a MAJOR difference between 60 and 40fps; I thought you shouldn't be able to tell the difference unless the fps dropped below 40?
 
Can't help you on the strobe effect; can't say I've seen that. But if you must keep a solid 60fps, I'd go 4K with a few settings turned down all day long. Personally I think the game plays smoothly at less than that, but each to their own. 4K with a couple of settings on High will still look better than 1440p Ultra.
 