
10GB VRAM enough for the 3080? Discuss...

Status
Not open for further replies.
That has nothing to do with my post above whatsoever, but they look pretty even to me?
Guess you need to crank up RT/DLSS for the 3080 to be faster.

Speaking of which, fancy comparing 6900 XT Vs 3080 performance in the Timespy, Firestrike, Forza, Shadow of the Tomb Raider, and Valhalla OcuK bench threads?

Would love to see how fast your 3080 is in comparison at the same settings - for a bit of fun. ;)

That's why I bought a 3080, next gen tech :D
 
Only person that seems to care about that game is LtMatt. Every time the '10GB is enough up until today' topic comes up… :p
The facts speak for themselves: to this day, the 3080 runs out of grunt before it runs out of VRAM.


I had to scroll back a few pages to remember why Godfall was mentioned. Then I saw your post and remembered the 3080 having its trousers pulled down on minimum FPS with RT on, despite having enough GPU grunt to run the game. :D
 
Yea, but as we all came to the conclusion: who cares about that game ;)
 
Yup. Especially after seeing how higher res solves the grain issue of SSR, RT reflections lost their biggest selling point for me (being able to get rid of reflection noise). Apparently pushing 1.5x res scale at 1080p or 1.25x at 1440p is enough to clear SSR grain mostly... And in that aspect, raster monster GPUs (RDNA2) should have an easy time pushing high FPS.
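For anyone wondering what those scale factors actually cost, the pixel maths is simple (a sketch, assuming the slider scales each axis, as most in-game render-scale options do):

```python
# Pixel cost of a render-scale slider, assuming the scale applies per axis.
def scaled_pixels(width: int, height: int, scale: float) -> int:
    return round(width * scale) * round(height * scale)

print(scaled_pixels(1920, 1080, 1.5))   # 2880x1620 -> 4,665,600 pixels
print(scaled_pixels(2560, 1440, 1.25))  # 3200x1800 -> 5,760,000 pixels

# Relative cost vs native rendering:
print(scaled_pixels(1920, 1080, 1.5) / scaled_pixels(1920, 1080, 1.0))   # 2.25x
print(scaled_pixels(2560, 1440, 1.25) / scaled_pixels(2560, 1440, 1.0))  # ~1.56x
```

So 1.5x at 1080p pushes more pixels than native 1440p (3.69M) but still far fewer than 4K (8.29M), which is why raster-strong cards can absorb it.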

The raster performance of the 3070 in Cyberpunk is really startling; I feel like the Series X can actually push more frames/res than the 3070. Ampere feels anemic when it comes to raster performance most of the time.

It comes alive with RT Psycho. Turn off screen space reflections, so that RT reflections are used. I admit I do remember rasterisation ;)

Speaking of next gen tech ....

Software raytracing.
Hardware raytracing.
 
Glad you’ve admitted it, nothing further to add, your honour. :)
Admitted what? I don’t care one way or the other. I am no Nvidia fanboy and don’t even have a 3080 anymore. I got what I wanted out of the card and, as predicted, never had an issue with it being 10GB during the time I did :D
 

The issue with the AMD 6800 XT/6900 XT cards is their roughly 500-550GB/s memory bandwidth.

The RTX 3080/3090 cards have 680-980GB/s.

Going above 10GB of VRAM does not currently show any performance difference. That bandwidth gap makes the 30-series cards perform better in almost all games, apart from the odd AMD-optimised title. Even miners say they get better performance from 30-series cards, and that's why. You can't even buy them in the shops because they are the better cards for gaming and so on.
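Those bandwidth figures fall out of a simple formula: per-pin data rate times bus width. A quick sketch, using the commonly published memory specs for each card:

```python
# Peak memory bandwidth = per-pin data rate (Gbps) * bus width (bits) / 8 bits-per-byte.
# Data rates and bus widths below are the commonly published specs for these cards.
def bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8

cards = {
    "RX 6800 XT / 6900 XT (GDDR6)": bandwidth_gb_s(16.0, 256),  # 512 GB/s
    "RTX 3080 10GB (GDDR6X)":       bandwidth_gb_s(19.0, 320),  # 760 GB/s
    "RTX 3090 (GDDR6X)":            bandwidth_gb_s(19.5, 384),  # 936 GB/s
}

for name, bw in cards.items():
    print(f"{name}: {bw:.0f} GB/s")
```

Worth noting that RDNA2's 128MB Infinity Cache raises effective bandwidth well above the raw GDDR6 figure, so the comparison isn't quite apples to apples.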
 
RE Village, again :) (from guru3d.com)

[RE Village VRAM usage and FPS benchmark charts]

2080 Ti at 78 FPS in 4K RT. Clearly, the "grunt" is there.

"We’re constantly analyzing memory requirements of the latest games and regularly review with game developers to understand their memory needs for current and upcoming games.....A few examples - if you look at Shadow of the Tomb Raider, Assassin’s Creed Odyssey, Metro Exodus, Wolfenstein Youngblood, Gears of War 5, Borderlands 3 and Red Dead Redemption 2 running on a 3080 at 4k with Max settings (including any applicable high res texture packs) and RTX On, when the game supports it, you get in the range of 60-100fps and use anywhere from 4GB to 6GB of memory. Extra memory is always nice to have but it would increase the price of the graphics card, so we need to find the right balance."

Nvidia's dreamland VRAM consumption idea = 4-6 GB memory at 4K
Reality after 8 months = 8.5-9 GB used memory at 1440p/1080p and pushing the boundaries of 10 GB at 4K
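Taking those reported numbers at face value, the headroom left on a 10GB card is easy to tally:

```python
# Headroom on a 10GB card at the usage levels reported in the charts above.
def headroom(total_gb: float, used_gb: float) -> tuple[float, float]:
    free = total_gb - used_gb
    return free, 100 * free / total_gb

for used in (8.5, 9.0):
    free_gb, free_pct = headroom(10.0, used)
    print(f"{used:.1f} GB used -> {free_gb:.1f} GB free ({free_pct:.0f}%)")
```

Caveat: allocated VRAM is not the same as actively used VRAM, so monitoring tools can overstate real memory pressure.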
 

Hmm, that's 1 game out of 10 thousand games. And the 10GB card still ran the game at 102fps at 4K, and well over 100fps at lower resolutions.
 
So... folks'll maybe need to significantly cut their reasonably expected settings/res with a 3080 sooner than they'd like?
Not tomorrow or next month obviously, but starting within 6 months or so, increasingly thereafter, and sooner than with a 6800 XT or 6900 XT.
I mean, lowering expectations comes to all GPUs eventually, it's just... my 1070 ran 2560x1080 very nicely for 2.5 years or so, until I dropped to 1080p for the last 2.5 years for the same perf, where it mostly still sits well now: high/ultra 60 fps or so in all but a few very recent games. That 8GB is being stretched to the limit now though.
But we're already talking about a 3080 not getting it done at 4K, and seeing signs that's not so far off?
Well, cool for the folks that buy anew every gen, or even multiple times within a gen. Not so great for the folks that want some kind of longevity from a card whose MSRP no longer exists, at a 200%+ markup. If you can even get one.
 
Good point, I forgot about Resident Evil, that’s another along with Godfall. :D

I’ve heard Watch Dogs can be similar at 4K with max settings too, but I have not really looked into it. Hopefully we see more games pushing the video memory limits going forward.
 

I'm not sure what you mean by this.
The 3080 is the 2nd best performing card at 4K; the 3090 is 1st, the 6900 XT is 3rd and the 6800 XT is 4th. So if the 3080 is not getting it done, what exactly are the 6900 XT and 6800 XT doing?
 

I read Godfall is a PS5 port that's poorly optimised for PC. Watch Dogs also has lazy optimisation. Even if they chuck 24GB on the card, it won't make a sizeable difference visually or performance-wise in those games.
 