Doesn't say what was deleted.

Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
You're coming across as very misinformed.
Doesn't say what was deleted, other than personal attacks, which is quite understandable.
You are calling me misinformed? OK then, answer this question.
If you are changing settings for a given set of hardware in order to obtain a certain amount of performance, what do you call that?
If you test something with everything at max and ignore any apparent architectural advantage or weakness, what does that entail more of?
These two scenarios are very different and give very different results. What Guru3D did was adjust settings because they felt the numbers seemed odd; to achieve that, they TWEAKED the settings away from max / everything on.
If you truly want a true performance benchmark, then you test with everything on, ideally on the newest hardware, and you ignore what the hardware is good or bad at. If you change things to get the best performance out of a given piece of hardware, that is literally called tweaking, which is what Guru3D did.
Hopefully a more credible site will benchmark the game and show the true performance of each card. Guru3D is now a proven shill, so it's pointless to rely on it.
Wow, he sure has. When I read it earlier, he told a guy to F... OFF when challenged. What a joker.
Here's the link to the discussion anyway.
http://forums.guru3d.com/showthread.php?t=412306&page=3
You asked me this plenty of times, so now it's my turn: how do YOU feel about giving up your 290X for the 1080?
Has anyone here got an RX480 and a 1070 to do some tests? Someone reliable, and not an Nvidia (gregster) or AMD (humbug) fanboy?
OK, I'll try to explain this better, and please try to have an open mind about it, because this is meant to help.
I'll use Vulkan in Doom as an example. Yes, there is a setting in the game which you can toggle to enable Vulkan or OpenGL, but this setting has nothing to do with the actual visual settings. Behind closed doors, Vulkan completely changes how the game is rendered compared to OpenGL. It merely takes advantage of features more prominent in AMD cards. It's not a trick, a cheat, or a lowered graphics setting. In other words, it unlocks the full potential of the GPU, just as DX11/OpenGL uses Nvidia cards to their max potential (open to debate). This is comparable to how different drivers give different results, only at a much lower level. It differs in that this is a totally different code path the game is written against, not how the card processes the code, like a driver.
Understand that nowadays there is no one definitive way to run games any more. With these cards becoming so complex and so different from each other, it makes sense to give users more choice in how they want the game to be rendered based on their hardware. Where you seem confused is that we have to choose which platform to run through the settings, but rest assured no graphics changes are being made, no tweaks, nothing except how your GPU renders... with any settings, be it max/lowest/whatever (see the sketch just below this post).
As for this shader cache, to my knowledge it simply allows cards to utilize more memory by storing unused textures for later use, or something to that effect. If you don't have the memory, it will clog up your VRAM and bog down performance. And that applies to both AMD and Nvidia cards.
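To illustrate that point, here is a minimal, hypothetical C++ sketch, assuming the Vulkan SDK headers and loader are available. It only shows the idea that an API toggle selects a rendering backend at startup and falls back if the driver can't provide it, without touching any image-quality settings; the `pick_backend` function and the overall flow are my own illustration, not how Doom or any real engine actually does it.

```cpp
// Hypothetical sketch: a user-facing "Vulkan / OpenGL" toggle only picks the
// rendering backend; it does not change any visual quality settings.
#include <vulkan/vulkan.h>
#include <cstdio>

enum class Backend { Vulkan, OpenGL };

static Backend pick_backend(bool user_prefers_vulkan) {
    if (!user_prefers_vulkan)
        return Backend::OpenGL;

    // Try to create a bare Vulkan instance; fall back to OpenGL if the
    // loader/driver refuses (e.g. no Vulkan-capable driver installed).
    VkApplicationInfo app{};
    app.sType = VK_STRUCTURE_TYPE_APPLICATION_INFO;
    app.apiVersion = VK_API_VERSION_1_0;

    VkInstanceCreateInfo info{};
    info.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
    info.pApplicationInfo = &app;

    VkInstance instance = VK_NULL_HANDLE;
    if (vkCreateInstance(&info, nullptr, &instance) != VK_SUCCESS)
        return Backend::OpenGL;

    vkDestroyInstance(instance, nullptr);
    return Backend::Vulkan;
}

int main() {
    // Same game settings either way; only the backend differs.
    Backend b = pick_backend(/*user_prefers_vulkan=*/true);
    std::printf("Rendering backend: %s\n",
                b == Backend::Vulkan ? "Vulkan" : "OpenGL");
}
```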
I'm not disputing that the benchmarks are a bit sus, but that statement has blown my mind, tbh.
When they test cards, should they remove memory/cores/clocks/whatever to bring them in line and make it fair?
I know I've referred to hardware, but surely the difference seen on AMD is down to a hardware advantage, or the lack of one on NV cards, given it's such a big difference?
The cache also gives no enhanced image (that I am aware of), and if it is detrimental to some cards, it makes sense to turn it off, no?
Original Guru3D test (Shadow Cache = ON) vs. updated Guru3D test, 1080p (Shadow Cache = OFF) [benchmark charts not reproduced]
The shader cache option has no impact on visual quality, and looking at the on vs. off comparison, the majority of cards from both brands gain FPS; only a few cards lose a few FPS.
What I find even more interesting is how the 970 and 980 are excluded from the benchmarks with shader cache on...
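For what it's worth, that sort of on/off comparison boils down to a per-card percentage change. A throwaway C++ sketch with made-up placeholder numbers (not Guru3D's data) just to show the arithmetic:

```cpp
// Made-up numbers purely for illustration: compute the % FPS change per card
// when the cache setting is turned off versus on.
#include <cstdio>
#include <string>
#include <vector>

struct Result {
    std::string card;
    double fps_cache_on;
    double fps_cache_off;
};

int main() {
    std::vector<Result> results = {
        {"Card A", 95.0, 99.0},   // hypothetical: gains FPS with cache off
        {"Card B", 88.0, 86.5},   // hypothetical: loses a little
    };

    for (const auto& r : results) {
        double change = (r.fps_cache_off - r.fps_cache_on) / r.fps_cache_on * 100.0;
        std::printf("%-8s  on: %5.1f  off: %5.1f  change: %+5.1f%%\n",
                    r.card.c_str(), r.fps_cache_on, r.fps_cache_off, change);
    }
}
```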
See, you can't have a reasoned debate/discussion if you are going to delete any comments that do not agree with your take on things. Granted, there's no need to call people nasty names or swear at them, etc., but this seems to be ridiculously one-sided on Guru3D and, in my opinion, reduces his credibility somewhat.
Another thing that sprang to my mind......
Within the last few months we have seen some review sites putting the DX11 results for Nvidia cards up against the DX12 results for AMD cards, simply because that was the fastest FPS each card got out of a game. OK, I am good with that.
However, with this approach in mind, why isn't the same being done now with shader cache on or off, to show the fastest FPS for each card in Res 7?
We must keep a level playing field and not move the goalposts when it suits us... and indeed that is what it looks like is happening here. It seems it is okay to mix and match DX versions to get the fastest results per card, but not with the Shader/Shadow Cache... when it doesn't suit the green team.
Also, if this is indeed a bug in Nvidia's drivers not using shader caching correctly, then when they do fix it and Nvidia cards do get a boost from it... that is when it will be okay to use it, yes? Mark my words, the same goes for Async Compute when Volta finally has it.
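To spell out the "fastest result per card" approach mentioned above, here's a tiny hypothetical C++ sketch (placeholder card names and numbers, nothing measured) that simply takes each card's best FPS across two render paths, whether that's DX11 vs DX12 or cache on vs off:

```cpp
// Hypothetical data: for each card, report the best FPS across two paths
// (e.g. DX11 vs DX12, or cache on vs cache off) instead of forcing one path.
#include <algorithm>
#include <cstdio>
#include <string>
#include <vector>

struct CardRuns {
    std::string card;
    double fps_path_a;   // e.g. DX11, or cache on
    double fps_path_b;   // e.g. DX12, or cache off
};

int main() {
    std::vector<CardRuns> runs = {
        {"Card A", 92.0, 101.0},
        {"Card B", 105.0, 98.0},
    };

    for (const auto& r : runs) {
        double best = std::max(r.fps_path_a, r.fps_path_b);
        const char* which = (r.fps_path_a >= r.fps_path_b) ? "path A" : "path B";
        std::printf("%-8s  best: %5.1f FPS via %s\n", r.card.c_str(), best, which);
    }
}
```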
Graphical/game performance is all that matters; run each card with whatever tuning you like if it has zero effect on visuals or any other aspect other than FPS. These caches are irrelevant to those qualities from what I understand, so use whatever.
When I do my own testing for my YouTube channel, I test with everything maxed out (everything), but would it be fair to do the same for a 970 or a Fury X, for example? I would be ripped a new one if I did that.
So would it have been fair to do that with the above cards? The cache also gives no enhanced image (that I am aware of), and if it is detrimental to some cards, it makes sense to turn it off, no?
Guru3D was spot on with turning the cache off, in my opinion. It is fair to all cards when it is off.