The RT Related Games, Benchmarks, Software, Etc Thread.

The 3070 should've launched with 12GB, same with the 3080. Nvidia keep doing this every gen: too little memory unless you buy their silly-priced high-end models. 16GB for the 3070, though, is just a marketing checkbox.

Exactly. No doubt a 16GB 3070 Ti will be better than an 8GB 3070, because 8GB is not enough for "max" settings at 4K, or even 1440p, going forward. That, and the Ti is "slightly" more powerful anyway.



As per the question at hand, give me a 10GB 3080 over a 16GB 3070 Ti any day of the week; giving a 3070 Ti an extra 8GB of VRAM is not magically going to give it the grunt/perf. of a 3080.

https://gpu.userbenchmark.com/Compare/Nvidia-RTX-3080-vs-Nvidia-RTX-3070-Ti/4080vs4116


 
How do you think your old 2080ti would have performed on 8gb?

It doesn't need to use the full 16GB to better its 8GB limitation; frame times can be all over the place once an 8GB GPU goes north of ~7GB allocated.

After using a 3070 for more than a year, I can say it can be a horrible gaming experience at 1440p, never mind 4K.
Not had any issues yet myself. Playing Days Gone at the moment, maxed out (minus **** effects like motion blur etc.), and it runs smooth; I've not seen any drops. Great-looking game too.
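
Rather than trading anecdotes, VRAM usage is easy to check while playing: NVIDIA's NVML library exposes per-GPU memory figures. A minimal sketch using the pynvml Python bindings; the 1-second poll and the ~7GB warning threshold are just illustrative, matching the claim above (note NVML reports what's allocated, not what the game strictly needs):

Code:
import time
import pynvml  # pip install nvidia-ml-py

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        used_gb = mem.used / 1024**3
        # Flag the zone where, per the claim above, frame times go wild
        warn = "  <-- north of ~7GB on an 8GB card" if used_gb > 7.0 else ""
        print(f"VRAM used: {used_gb:.2f} GB{warn}")
        time.sleep(1.0)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()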
 
Most people cannot think that far ahead; they just see a bigger amount of memory and think "mOrE nUmbEr eQuAl bEtTeR !!!"

12GB should have been the magic number.

Playing it myself; agree that it's a good-looking game. It doesn't go much above 6GB, so you are never going to see a VRAM-specific performance drop-off in that title.

What I'm talking about is when you are trying for the highest playable settings (not maxing games out) and the 8GB limitation impacts performance before the core runs out of grunt.

I'm guessing the memory-management work increases the closer usage gets to the limit, creating a higher driver overhead that wouldn't be an issue if there was more VRAM available.
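
One way to put numbers on "frame times all over the place" is to log per-frame times (tools like PresentMon or CapFrameX can export them) and compare average fps against the 1% lows; a wide gap between the two is the stutter being described. A minimal sketch, assuming a plain text file with one frame time in milliseconds per line (the filename is hypothetical):

Code:
import statistics

def summarize(path):
    # One frame time in ms per line, e.g. exported from PresentMon/CapFrameX
    with open(path) as f:
        times = sorted(float(line) for line in f if line.strip())
    avg_fps = 1000.0 / statistics.mean(times)
    # 1% lows: the fps implied by the slowest 1% of frames
    worst = times[-max(1, len(times) // 100):]
    low_fps = 1000.0 / statistics.mean(worst)
    # A wide avg-to-1%-low gap is the "all over the place" feel
    print(f"avg: {avg_fps:.1f} fps | 1% low: {low_fps:.1f} fps")

summarize("frametimes.txt")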
 
Which games are you having an issue with? Was it a case that dropping a setting or two did not solve the problem?


It should've, but Nvidia like to milk until the cow is dust.
Indeed. That's the way they roll. Always been this way.
 
Off the top of my head: CoD Cold War (the older WWII is the worst offender because of the cutscenes), the OG Metro 3, WDL, RDR2. Yes, you could turn on DLSS with some of them, but it was a reduction in IQ even at 1440p.

You can't just fire these games up, run through a bit of them, say "it's fine" / "here is a (hardly any action) vid I made to prove you are talking nonsense", and come back with "problem your end, it's buttery smooth" etc. It takes proper playing sessions: different maps, heavy action scenes, etc.
 
Can't say for certain, but don't the CoD games have a shader cache/texture pool setting you can adjust?

Metro 3? Is that the Exodus one? IIRC, from my playthrough of the Enhanced version, VRAM usage was hardly anything, like 6-7GB all maxed at 3440x1440 with DLSS Quality (better than native+TAA, so no reason not to use it).

WDL: when I tried the free trial, I had to reduce settings at both 3440x1440 and 4K, not because of VRAM but because the card lacked the grunt for acceptable frame rates. WDL's ray tracing is extremely intensive, possibly even more so than Cyberpunk's RT. Whilst it definitely needs more than 8GB of VRAM, settings would have to be reduced even further on a 3070 because of its lack of grunt, which in return should reduce VRAM usage anyway.

RDR2: I have seen my game hit 9.5GB of VRAM usage at both 4K and 3440x1440 with no fps/latency issues, so I'm not surprised you're having issues with 8GB there. Although I would definitely be using DLSS in that one, as the IQ is noticeably superior compared to native+TAA, especially in motion.
 
I am not into CoD games, so no idea about that. I played the enhanced Metro 3 and had no issues whatsoever completing the game with my 3070. WDL I've not played and probably won't, as I am not a fan of Ubisoft, and apart from visually it did not do enough for me to want to give money to a company I am not a fan of. RDR2 I have; it played fine ages ago on my 3080 and played fine on my 3070, though that was indeed with DLSS on the 3070. Did not play much though, to be fair, as I get bored and move on to another game. Just something about that game that does not grab me. Waste of £30.

Not saying there aren't any games that suffer from 8GB not being enough, but thus far I have yet to have any issues, so I've been very happy with the card. 16GB is where we should be at now, and I am hoping next-gen cards of this class will come with that as standard. I mean, by the time next-gen cards are out we won't be far away from 2023; if the 4070 does not come with 16GB of GDDR6 as standard then they will definitely be taking the **** to a whole new level :p
 
@Nexus18 you've not got 8GB, so you are never going to know past your opinion that it's enough; both GPUs are being used here, so I've got a fair grasp of what they can and can't do.

@TNA probably knows, as he's used both like myself :p. The 80 isn't that much faster than a 70, therefore an 80 can't magically turn up loads of settings over the 70; it's marginal at best, but its extra 2GB is clearly beneficial.

DLSS, while a great feature to have, wasn't the magic bullet then. It's better now, but if you had used it from the start you would know it got switched off and settings got dialled back; it couldn't help the 8GB limitation, as you wouldn't be using it under 4K anyway.
 
True, I can't speak from first-hand experience any more since upgrading from my Vega 56 to the 3080, but I know the games well, what certain cards are capable of, and what settings to look out for :p

If you're having issues because of VRAM, it's rather simple to fix: you just reduce settings, mostly any texture and/or shader setting, to remove the VRAM bottleneck. Chances are you probably won't even notice the difference dropping from "ultra" to very high/high textures etc. anyway. When I played CP2077 on my Vega 56, I encountered VRAM issues, i.e. frame latency and micro-stuttering all over the place, but the fps I was getting was crap too due to the lack of grunt (even without ray tracing, since the Vega 56 doesn't support it): around 40 fps. So I had to reduce settings to get the fps range I was happy with, i.e. at least 60 fps, which in return solved the VRAM usage and removed the inconsistent frame latency too.
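
There's simple arithmetic behind why the texture setting is usually the biggest VRAM lever: texture memory scales with the square of the resolution, so each step down roughly quarters the cost. A back-of-envelope sketch, assuming uncompressed RGBA8 with a full mip chain adding about a third (real games use block compression, but the ratios hold):

Code:
def texture_mb(size, bytes_per_texel=4, mip_overhead=4/3):
    # Square texture, uncompressed RGBA8, full mip chain ~ +33%
    return size * size * bytes_per_texel * mip_overhead / 1024**2

for size in (4096, 2048, 1024):
    print(f"{size}x{size}: ~{texture_mb(size):.0f} MB")
# 4096x4096: ~85 MB, 2048x2048: ~21 MB, 1024x1024: ~5 MB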

Essentially, VRAM and raw grunt go hand in hand. Sure, it would have been nice for both the 70 and 80 to have more VRAM, but ultimately the 3070 having an extra 2GB would not have made that much of a difference to its perf., likewise the 3080 having 2GB more, as evidenced on this page with the comparisons of the 12GB model to the standard 10GB model (tl;dr: the performance difference between them comes entirely from the MSI AIB 12GB model having a much higher boost clock and better specs across the board)...

Just watch the videos above showing 3080 vs 3070 Ti and 3070 perf. too: a 3080 is a rather substantial improvement over an 8GB 3070 Ti regardless of the VRAM being used. A 3070/Ti will have to reduce settings for an acceptable experience (imo) regardless of its VRAM usage. Again, this is subjective, as everyone is happy with different fps levels, but the most noticeable and important gains are at the lower end, i.e. 40 to 50 fps is a noticeable jump, more so than going from 110 to 120 fps.
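
The 40-to-50 vs 110-to-120 point checks out if you look at frame times rather than fps; the same +10 fps buys far more milliseconds per frame at the low end:

Code:
def frametime_ms(fps):
    return 1000.0 / fps

for lo, hi in ((40, 50), (110, 120)):
    saved = frametime_ms(lo) - frametime_ms(hi)
    print(f"{lo}->{hi} fps saves {saved:.2f} ms per frame")
# 40->50 fps saves 5.00 ms per frame
# 110->120 fps saves 0.76 ms per frame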

Don't disagree; DLSS when it first launched was a POS, but what does that matter now? Not to mention that when it first launched, 8GB was perfectly fine, so there wasn't as much of a need for it... It's superb now.

DirectStorage should in theory reduce VRAM usage too. Problem is, we probably won't see any games using it any time soon, and we'll likely be on the 50xx series of cards by then...
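
The reason streaming tech like DirectStorage should reduce VRAM usage is that assets only have to be resident while they're actually in use, rather than pre-loaded wholesale. A toy illustration of that idea only; this is an LRU cache with a fixed budget, not the DirectStorage API, and all the names and sizes are made up:

Code:
from collections import OrderedDict

class TextureCache:
    """Toy residency model: evict the least-recently-used assets
    when a fixed VRAM budget (MB) would be exceeded."""
    def __init__(self, budget_mb):
        self.budget = budget_mb
        self.resident = OrderedDict()  # name -> size in MB

    def request(self, name, size_mb):
        if name in self.resident:
            self.resident.move_to_end(name)  # mark as recently used
            return
        while sum(self.resident.values()) + size_mb > self.budget:
            evicted, _ = self.resident.popitem(last=False)  # drop LRU
            print(f"evict {evicted}")
        self.resident[name] = size_mb  # "stream in" from disk

cache = TextureCache(budget_mb=6000)  # stay under an 8GB card's limit
for tex in ("rock", "road", "tree", "rock", "building"):
    cache.request(tex, 1800)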
 
I found Dying Light 1 to be very boring.

Hopefully 2 isn't.

As for RTX IO, we won't see anything about that until Microsoft announces that DirectStorage is available in Windows.
 
Not PC footage, but PS5 footage explaining the different modes:


So 4K and the quality/ray tracing modes look to be 30 fps, with performance mode being the only one to provide 60 fps? There's probably some adaptive resolution scaling going on, with the target res mostly being around 1200p, if that...

Looks great in all the modes, though.
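
Adaptive resolution scaling of that sort is usually a simple feedback loop: miss the frame-time budget and the render scale drops, have headroom and it creeps back up. A minimal sketch of the idea; the budget, step size, and clamps are illustrative, not the game's actual values:

Code:
def update_render_scale(scale, frame_ms, target_ms=16.7,
                        step=0.05, lo=0.5, hi=1.0):
    """Nudge render scale toward the frame-time budget
    (16.7 ms ~= 60 fps). Clamped to 50-100% of output res."""
    if frame_ms > target_ms:          # missed budget -> render fewer pixels
        scale -= step
    elif frame_ms < target_ms * 0.9:  # comfortable headroom -> sharpen up
        scale += step
    return max(lo, min(hi, scale))

# e.g. a 4K (2160p) output rendered at reduced internal resolution
scale = 1.0
for ms in (22.0, 20.0, 18.5, 17.0, 16.0):
    scale = update_render_scale(scale, ms)
    print(f"frame {ms} ms -> render at {int(2160 * scale)}p")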
 
I am sure with a few tweaks to settings and running DLSS on Performance mode I will be fine running the game on my 3070 personally. Would be surprised if not.
 
But but but..... 8gb vram.... ;) :cry: :p

I'm expecting even the best GPUs to have to sacrifice settings for 60+ fps at 4K, even with FSR/DLSS, tbh.
 