Resident Evil 7 Benchmarks

This confuses me. Most benchmarks I've seen at 1440p show the Fury ahead of the 390X, but is there a trend of newer games using more VRAM and favouring the 390X? They tend to be the same price so it's a tough decision between the two.

It's best to expect ported console games to start using more RAM than they used to, with developers relying on RAM-hogging high quality textures to improve their games' visuals. It's because the consoles have an abundance of RAM available and textures are the easiest option for improving the look of a game for only a small performance penalty. I read that around 5 GB of memory is available for the game devs to use.
 
I think you may be mixing up 'shader cache', available in AMD drivers, with the RE7 game option called 'shadow cache'. It may be doing the same thing, but it's a RE7 developer implementation which can be disabled in the game menu.
Afaik shadow cache is not exclusive to AMD, but it probably gives AMD a huge boost due to some architectural reason.

Right, I have an Nvidia card and know it as Shadow Cache; I haven't had an AMD card for a while.
 
So maybe it is the same thing, but it's sad that some reviewers disabled it to show Nvidia in a better light even though it is a feature that works on both vendors.
 
If this is the case, shouldn't the shader cache be left on for AMD 6GB+ cards? The question remains: what does the 480 get with shader cache on vs the 1070 with shader cache off?
 
In that case the RX 480 is nearly as fast as the 1070, which makes the 1070 look bad. With Shader Cache on @ 1440p the 390X is slightly faster than the GTX 1070.

For reviewers to physically go into the AMD driver to do something that lowers their performance and then publish that as a fair review is plain and simple Nvidia shilling.

It's just proof they behave in this way: fake reviews.
 
I don't think they actually disabled shader cache in the Crimson driver. They just disabled the shadow cache setting in the game options, which is still a bad thing to do since it gimps the performance.
 
For people to think that reviewers would do that is plainly silly, and you really don't know what you are talking about. But crack on, as I won't bother replying and will just play and enjoy it (from what I have read). A shame that you get so wrapped up in a GPU manufacturer that you lose all sense and post complete drivel. You used to be OK but have lost it lately :(
 
To me that distinction is just semantics :). Be it in the drivers or the game settings, they turned Shader Cache off and took down the review with Shader Cache on, which had the 390X faster than the GTX 1070.

No one with an RX 480 or R9 390X is going to run the game like that, especially knowing it reduces performance a lot; it's just plain fake.
 
Did you read what the Shader Cache was doing to cards with less than 4GB of VRAM, or did you ignore that for the sake of your own self-justification?

I think people have lost sight of what a benchmark is. If the setting impacts lower-VRAM cards then so be it; a benchmark isn't there to find the ideal settings for people to play at, it's to test the limits of a card beyond its comfort zone to see what's the best of the best.

The only time this becomes a grey area is when proprietary technology is used. The setting being discussed here is a neutral one which is neither AMD- nor Nvidia-supplied; it is simply a setting the developers deployed that benefits the look and style of the game. GameWorks and TressFX don't do any of this other than to put one over on the competition.

Imo the benchmark is only valid with shadow cache switched on, as it's not a grey-area setting; running with it off only benefits mindshare, as opposed to a proper benchmark, which should be neutral.
 
In the Guru tests it shows that Nvidia lose some performance as well with shader cache off, but it's not enough to affect the results much on NV cards. For AMD cards with 8GB, and the RX 470 with 4GB of VRAM, it knocks a big chunk of performance off and changes the chart completely, making Nvidia look stronger than they do with the option on.

You have to ask yourself why they would want Nvidia to lose a few fps and AMD to lose a much bigger chunk. Surely if all cards bar the odd few are showing higher gains with Shadow Cache on (the Fury series show the biggest gains with it off), it should be kept on. It just stinks of pandering to Nvidia, as one graph shows their cards in a much better light than the others.

I will say it again: it's only the Fury series that are affected here (@1080p), and it would have been easier just to write this at the bottom, or add a paragraph on why the Fury series were affected at 1080p.
 
Agreed that people have lost sight of what a benchmark is, but I think they choose to lose sight to suit agendas. The video I posted shows what is what and why it is best left off for those with 4GB or less.
 
The RX 470 4GB lost a huge chunk of performance with it off. It's not so black and white if you compare the chart, as even 2GB cards were getting a slight benefit. Fury cards were really the only cards to gain performance with it switched off. I haven't looked at the higher resolution but I think more cards were affected than at 1080p.
 
You are not daft; do you honestly believe that the original bench test was correct? The guy shows what happens when using shader cache off and on with his Fury X (4GB).
 
The Fury has a completely different memory design and architecture; the only thing it's relevant to is itself. It's different to every other consumer card in existence.
 
His video just backed up what I said :D:D:D. The Fury cards were the only ones that gained from having Shadow Cache off. Why that is I don't know (most likely the memory needs tweaking), but his video backed up the original Guru3d results, compared to their updated ones, in the fact that he saw gains. No other card in the Guru results lost performance; they all pretty much gained with it on. Look at the original and forget FPS: look at AMD's much cheaper cards above some really expensive Nvidia models. Now look at the results with Shadow Cache off and things look way better for Nvidia cards, as loads of AMD cards suddenly lost performance and slotted in where we would probably expect them.

It is what it is though, and I am not getting into conspiracies and all that, but AMD went from looking great to good and the big winner was the way Nvidia cards will be perceived.
 