10GB VRAM enough for the 3080? Discuss..

OK, so when we hit a point where the "majority" of games are "literally" unplayable even with "appropriate" settings (as in not having to drop 90% of the settings below "high"), what do you think is going to be the cause: lack of VRAM or lack of grunt?

PS. We all know that 4 cores isn't enough; do you think 6 cores is? Bearing in mind a 6-core 5600X performs better in games than an 8-core 3700X....

In 1-2 years' time we'll be on the next gen of GPUs, and not because of more VRAM but because of far better ray tracing performance and performance all round....

Or.... are you telling me that a 3090 or a 6900 XT is going to perform better than Nvidia's 4080 and AMD's 7800 XT (or whatever they will be called)? If so, why isn't an RTX Turing Titan performing better than a 3080???




You mean the specs they completely lied about? :cry: According to them, I should have had no issues with my Vega 56 and 2600, but boy, I did...

"The Majority" why the majority? is this more "That'll do" philosophy? i have an RTX 2070 Super, it has 8GB VRam, i'm already finding that is not enough, no not in everything, or even the majority.

Want some cake?
 
Or.... are you telling me that a 3090 or a 6900 XT is going to perform better than Nvidia's 4080 and AMD's 7800 XT (or whatever they will be called)? If so, why isn't an RTX Turing Titan performing better than a 3080???

The 2080 Ti is a shade better than the 3070, no? A gen doesn't generally make a huge difference; it's more Nvidia crippling older tech through drivers to make you upgrade, if anything.
 
If its VRAM is failing at 4K, there's no reason it shouldn't fail at 1440p or 1080p. The graphs don't lie; there's not a huge reduction in VRAM usage between 4K and 1440p. He somehow accepts the 8 gigs failing at 4K but is unable to accept that the same fate can happen at 1440p/1080p with background apps.
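
Some back-of-the-envelope arithmetic supports this. The sketch below is my own illustration, not from any benchmark in this thread; the target count and pixel format are assumptions, as real engines vary widely. It shows that the resolution-dependent part of VRAM (the render targets) is only a few hundred MB even at 4K, while the multi-gigabyte texture pool doesn't shrink with resolution at all:

```python
# Rough render-target memory at different resolutions.
# Assumes ~6 full-screen targets at 8 bytes/pixel (e.g. RGBA16F);
# real engines differ, this is only to show the scale.
def render_target_mb(width, height, targets=6, bytes_per_pixel=8):
    return width * height * targets * bytes_per_pixel / 1024**2

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K":    (3840, 2160)}.items():
    print(f"{name}: ~{render_target_mb(w, h):.0f} MB of render targets")

# 1080p: ~95 MB, 1440p: ~169 MB, 4K: ~380 MB.
# The 4K -> 1440p saving is roughly 200 MB, tiny next to a multi-GB
# texture budget that stays the same, hence total VRAM use barely drops.
```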

Please don't let him derail the discussion. I guess I need to get a 5600X (which I don't need, because I get smooth, nice frames across a variety of games at GPU-bound settings) and prove the same VRAM-bound frame drops, and then he will say 8 GB sucks. The discussion shall go on forever.
As I said, he's firmly entrenched. But yes, a 5600X would be a better fit for you. However, it won't suddenly give you enough video memory to mitigate your issue.
 
As I said, he's firmly entrenched. But yes, a 5600X would be a better fit for you. However, it won't suddenly give you enough video memory to mitigate your issue.
I'm looking at options these days; I will most likely sell it at some point and try to find a 6700 XT instead. It should give me more VRAM to push more res scaling, and less CPU overhead. If I'd known about the CPU overhead beforehand, though, I wouldn't even have considered the 3070 to begin with.

It's the hassle that stops me from doing so for now... but the CPU overhead problem entices me to make the change. Damn Nvidia, really :) I'm still happy with what I got in the end; I consider myself lucky.
 
I'm looking at options these days; I will most likely sell it at some point and try to find a 6700 XT instead. It should give me more VRAM to push more res scaling, and less CPU overhead. If I'd known about the CPU overhead beforehand, though, I wouldn't even have considered the 3070 to begin with.

It's the hassle that stops me from doing so for now... but the CPU overhead problem entices me to make the change. Damn Nvidia, really :) I'm still happy with what I got in the end; I consider myself lucky.
Agreed. The 6700 XT is slightly slower at stock than a 3070, but it overclocks like crazy if you have a decent cooler, so it should be able to match it with a little tweaking. Something I intend to prove when mine arrives.

Yeah, the CPU overhead is definitely a factor with your current GPU, but I think you would have the same problem with a 5700 XT, for example, as even the best CPU won't help a lack of video memory.
 
I'm looking at options these days; I will most likely sell it at some point and try to find a 6700 XT instead. It should give me more VRAM to push more res scaling, and less CPU overhead. If I'd known about the CPU overhead beforehand, though, I wouldn't even have considered the 3070 to begin with.

It's the hassle that stops me from doing so for now... but the CPU overhead problem entices me to make the change. Damn Nvidia, really :) I'm still happy with what I got in the end; I consider myself lucky.

Right. If you were going Nvidia you would definitely need some very fast CPU cores, and more than 4 of them. Nvidia do thread scheduling in software, i.e. on the CPU; that's the driver using up CPU resources. If you don't have more cores than the game needs, Nvidia will steal your CPU cycles and, with them, frames.

However, even if you're going AMD, a fast 6-core is a good idea these days; more games use 6+ cores now.
 
If you were going Nvidia you would definitely need some very fast CPU cores, and more than 4 of them. Nvidia do thread scheduling in software, i.e. on the CPU; that's the driver using up CPU resources. If you don't have more cores than the game needs, Nvidia will steal your CPU cycles and, with them, frames.

However, even if you're going AMD, a fast 6-core is a good idea these days; more games use 6+ cores now.

I did some research and found that even if you have lots of cores, the scheduler will still steal per-core performance, no matter how many cores you throw at it:

https://forums.anandtech.com/thread...overhead-problem.2591624/page-4#post-40472727

The dreaded overhead is there even if you have a 3900X or a "5600X":

[benchmark screenshots]

The overhead is a bit overrated, though, especially at GPU-bound settings:

https://www.youtube.com/watch?v=8J7HubpZJL4&t=270s

https://youtu.be/4sXpZ5rTrGI?t=29

Again, though, surely the 6700 XT will be a better fit for my system; that I can't deny :)
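
For anyone who wants to measure rather than argue: here's a minimal sketch for pulling average FPS and 1% lows out of a PresentMon-style frame-time capture. It assumes a CSV with an MsBetweenPresents column; capture.csv is just a placeholder filename.

```python
# Average FPS and "1% lows" from a PresentMon-style frame-time log.
import csv

def fps_stats(path):
    with open(path, newline="") as f:
        frame_ms = [float(row["MsBetweenPresents"]) for row in csv.DictReader(f)]
    avg_fps = 1000.0 / (sum(frame_ms) / len(frame_ms))
    # "1% low" here = average FPS over the slowest 1% of frames.
    worst = sorted(frame_ms)[-max(1, len(frame_ms) // 100):]
    low_1pct = 1000.0 / (sum(worst) / len(worst))
    return avg_fps, low_1pct

avg, low = fps_stats("capture.csv")  # placeholder path
print(f"avg: {avg:.1f} FPS, 1% low: {low:.1f} FPS")
```

Run the same scene once CPU-bound (low settings) and once GPU-bound (high settings) and compare; that separates scheduler overhead from VRAM pressure far better than eyeballing a video.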
 
Perhaps people should create a separate thread if they want to discuss the 3070 and/or 8GB VRAM, as this thread is for the 3080 and 10GB VRAM....

For what it's worth, 8GB VRAM and 4K should not be in the same sentence (the same way people shouldn't have said the Fury X was for 4K gaming with its 4GB VRAM... :cry:)

If people thought 8GB was going to be enough for 4K, and even 1440p, going forward, then they only fooled themselves tbph.

"The Majority" why the majority? is this more "That'll do" philosophy? i have an RTX 2070 Super, it has 8GB VRam, i'm already finding that is not enough, no not in everything, or even the majority.

Want some cake?

As above, 8GB isn't for games like Cyberpunk at 4K, and in some cases even 1440p, and other games going forward, but then your 2070 also doesn't have enough grunt to get the best from more VRAM in the first place.... IIRC, you've had your 2070 for quite a while? So surely you'll be upgrading come the next gen of GPUs? If so, why are you so concerned with this gen, especially the 8GB cards????

Majority...... because most people don't upgrade just for one or two games (unless they are massive fans of said outlier games and want the best experience possible); they upgrade when games across the board are giving them issues due to lack of grunt and/or lack of VRAM.

Would you like some cake?

The 2080 Ti is a shade better than the 3070, no? A gen doesn't generally make a huge difference; it's more Nvidia crippling older tech through drivers to make you upgrade, if anything.

I don't know tbh, as I've got/had no interest in the 3070 because it was crippled with slow 8GB VRAM. Wasn't it basically like for like according to Jensen, though, with the main point being the considerably cheaper price tag?

Sadly, yup, Nvidia will cripple cards when the next gen or two come out :( Either that or they just don't give older cards any extra attention/optimisation, but sadly it's more likely the former!

As I said, he's firmly entrenched. But yes, a 5600X would be a better fit for you. However, it won't suddenly give you enough video memory to mitigate your issue.

Since oguouso is being petty by replying to you about me and not to me directly :D Riddle me this: is it not confirmed that new/powerful Nvidia GPUs don't play nicely with older/weaker CPUs, especially when enabling ray tracing????

 
Your overhead argument is pointless when I run GPU-bound settings. You know this. Please, stop.

[benchmark screenshot]

You're still not exempt from this issue, so stop pretending it doesn't happen on your CPU.

You barely get 60-70 frames in a crowded place in Cyberpunk. Once a game that's tougher on the CPU comes out, you will be driven to below 60 FPS, while an equivalent AMD GPU + 5600X will be driving 60+ FPS and you will be losing 20-25% of your CPU-bound frames to Nvidia's greedy scheduler.
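
To put numbers on that (taking the 20-25% scheduler cost as the figure asserted above, not something measured here):

```python
# If the driver's software scheduler eats 20-25% of CPU-bound throughput,
# a scene that manages 70 FPS CPU-bound drops under the 60 FPS line.
for overhead in (0.20, 0.25):
    print(f"{overhead:.0%} overhead: 70 FPS -> {70 * (1 - overhead):.1f} FPS")
# 20% overhead: 70 FPS -> 56.0 FPS
# 25% overhead: 70 FPS -> 52.5 FPS
```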
 
I did some research and found that even if you have lots of cores, the scheduler will still steal per-core performance, no matter how many cores you throw at it:

https://forums.anandtech.com/thread...overhead-problem.2591624/page-4#post-40472727

The dreaded overhead is there even if you have a 3900X or a "5600X":

[benchmark screenshots]

Yeah, though Zen 3 is MUCH faster per core than Zen+; a 5600X blows that 2700X out of the water, and in some cases just about everything else.

I have a 5800X running at 5GHz. Marvellous...

[screenshot]
 
Yeah, though Zen 3 is MUCH faster per core than Zen+; a 5600X blows that 2700X out of the water, and in some cases just about everything else.

I have a 5800X running at 5GHz. Marvellous...

[screenshot]
Indeed, they're marvellous. But I don't need that many frames for the games I like. As long as my CPU can push 45+ frames in AAA games, I'm satisfied. I'm a bit of a resolution sucker (playing RDR 2 at 2x res scale at 60 FPS when I could get 100+ frames at native res, but I prefer image sharpness over more frames; that's just me) :) The only thing that could drive me to upgrade my CPU is getting horrendous framerates below the 40s, but that has yet to happen (aside from running adamantly down Jig-Jig Street in Cyberpunk with high crowd density + RT, but that's acceptable for now. The above user, however, just upgraded their CPU for this alone and tries to justify their purchase, when I don't even care, because there's not a single FPS-reliant activity that happens on Jig-Jig Street. In 80% of the game I had smooth 60+ frames with high 1% lows on my 2700X with RT enabled, including combat. Maybe that's because I also run a highly tweaked 3466 CL14 kit, but that's another topic.)

So I still don't need to shell out 300 dollars for a 6-core CPU, whatever FPS boost it may bring. The PC is a scalable platform, just like how the PS4/Xbox One focused on graphics rather than high framerates due to their anaemic CPUs. Once it drops to the 150-200 dollar range, I will consider it. Then again, I would like to get the 64 MB cache beast 5900X instead of the 5600X, so I will wait for a price reduction across all CPUs instead (similar to how I got my 2700X for 125 dollars when it was released at 329 dollars). [He also doesn't understand that I don't experience as many frame drops as he does; he had a 2600X and I have a 2700X, so I technically still have more headroom =)]
 
Had a quick look on YouTube; the 3070 seems to perform OK for a card of its specs with the "appropriate" settings when paired with a newer/better CPU....


User and/or system error, it seems, on the other person's end.

But as said, please create another thread for said person's issues with a 3070/8GB GPU :)
 
Had a quick look on YouTube; the 3070 seems to perform OK for a card of its specs with the "appropriate" settings when paired with a newer/better CPU....


User and/or system error, it seems, on the other person's end.

But as said, please create another thread for said person's issues with a 3070/8GB GPU :)

You know what's really odd about that? In the first two sections with lower settings the GPU is loaded at 95%+; in the third section with higher settings the GPU is only loaded at 80%+, and I can see microstutter that was not there with the lower settings.

It's not the CPU; the CPU can drive the game at 140Hz with low settings with the GPU fully loaded, yet with higher settings the GPU is not able to run at peak load. There is a bottleneck there; what is it?
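
One way to pin this down is to just log GPU load and VRAM use while playing. A minimal sketch using the nvidia-ml-py (pynvml) bindings; polling the first GPU once a second for a minute is an arbitrary choice:

```python
# Log GPU utilisation and VRAM use; sustained dips below ~95% load while
# in-game point to a bottleneck outside the GPU core (CPU limit, VRAM
# pressure, or a frame cap).
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU (assumption)
try:
    for _ in range(60):  # sample for one minute
        util = pynvml.nvmlDeviceGetUtilizationRates(handle)
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"GPU {util.gpu:3d}%  VRAM {mem.used / 1024**2:6.0f} MiB")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```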
 
You know what's really odd about that? In the first two sections with lower settings the GPU is loaded at 95%+; in the third section with higher settings the GPU is only loaded at 80%+, and I can see microstutter that was not there with the lower settings.

It's not the CPU; the CPU can drive the game at 140Hz with low settings with the GPU fully loaded, yet with higher settings the GPU is not able to run at peak load. There is a bottleneck there; what is it?
Ha, I'm glad someone else said it. I didn't want to.

I'm not sure that video proves what Nexus thinks it does. :confused:
 
Had a quick look on YouTube; the 3070 seems to perform OK for a card of its specs with the "appropriate" settings when paired with a newer/better CPU....


User and/or system error, it seems, on the other person's end.

But as said, please create another thread for said person's issues with a 3070/8GB GPU :)
Does he have Chrome, Steam and Discord running in the background?
 
No... it doesn't.



Something like that would cause a problem when the CPU is at a higher load, i.e. low graphics settings, high FPS.
No, I mean from a VRAM standpoint. The video owner probably only runs the game and shuts everything else down, so the VRAM is not stressed by extra applications. I do run things in the background, so this video is not representative of what I experience. I don't want to have to shut down background apps just because of 8 GB of VRAM.

At 11:41, the game uses 7.7 GB of VRAM. When other apps use 700-800 MB of VRAM, the game will instead use 6.7 GB (as I've evidenced in my own video) and stutter due to shared memory usage.
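
If you want to see exactly how much VRAM your background apps are holding, something like the sketch below works, assuming the nvidia-ml-py (pynvml) and psutil packages. Note the per-process figure can come back as None on Windows under WDDM:

```python
# List per-process VRAM use, to see what Chrome/Discord/etc. take away
# from the game's 8 GB budget.
import pynvml
import psutil

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
for p in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
    try:
        name = psutil.Process(p.pid).name()
    except psutil.NoSuchProcess:
        name = f"pid {p.pid}"  # process exited between calls
    used_mib = p.usedGpuMemory / 1024**2 if p.usedGpuMemory else 0  # None under WDDM
    print(f"{name:30s} {used_mib:6.0f} MiB")
pynvml.nvmlShutdown()
```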
 
You know what's really odd about that? In the first two sections with lower settings the GPU is loaded at 95%+; in the third section with higher settings the GPU is only loaded at 80%+, and I can see microstutter that was not there with the lower settings.

It's not the CPU; the CPU can drive the game at 140Hz with low settings with the GPU fully loaded, yet with higher settings the GPU is not able to run at peak load. There is a bottleneck there; what is it?

Ha, I'm glad someone else said it. I didn't want to.

I'm not sure that video proves what Nexus thinks it does. :confused:

That a 3070 can play Cyberpunk OK when paired with a better/newer CPU AND with the appropriate settings, maybe, in response to someone's complaints about their FPS dropping to the 30s and claiming it is "entirely" down to the VRAM.... You know, this post here; now go and compare the frametime graph [although it's a bit of a pointless comparison given it's not like for like, but still, clearly a better CPU would help out the 3070 here]:



For any further replies on this and/or 3070/8GB cards, please use this thread instead:

https://www.overclockers.co.uk/forums/threads/8gb-vram-enough-for-the-3070-discuss.18929652/
 