
10GB VRAM enough for the 3080? Discuss...

Status
Not open for further replies.
Associate
Joined
25 Sep 2020
Posts
128
Damn, AMD really did something, huh...

I might still buy the 3080 10GB for the DLSS... not sure though, hope 10GB is enough for 1440p lol.

Cos I really feel like DLSS is going to be a very important tech going forward, and AMD might take a while to create their own version of it.
 
Soldato
Joined
19 May 2012
Posts
3,633
I might buy the 3080 because I have an LG OLED, and G-Sync on Nvidia cards seems to play very nicely with it.
Also, I see a GPU as a disposable toy and will just upgrade if VRAM becomes a big issue.


But if I were using my brain to buy a GPU and didn't have an LG OLED, I'd definitely go AMD.

The 3070 is a joke of a GPU atm.
 
Soldato
Joined
18 Feb 2015
Posts
6,485
The 3080 Ti is almost certainly going to be the 12GB TSMC 7nm card that's been rumoured, and will be faster than the 3090. IMO anyway - we'll see how much faster it needs to be once AMD show the performance of their top card today.

It can't be faster; it's cut down compared to the 3090.
 
Soldato
Joined
29 May 2007
Posts
4,898
Location
Dublin
If AMD can make a profit with 16GB on a card for $579, it does make you wonder: why are people so desperate for overpriced Nvidia? Is it the brand name they like?

Brand loyalty is real and very powerful. Some marketing really does work, and some people fall in love with symbols and... corporations. I'm not making this up; look at Apple. Maybe they think the company loves them back or something? Or that when it does well, they are somehow also doing well?
 
Joined
27 Jul 2005
Posts
13,051
Location
The Orion Spur
Brand loyalty is real and very powerful. Some marketing really does work, and some people fall in love with symbols and... corporations. I'm not making this up; look at Apple. Maybe they think the company loves them back or something? Or that when it does well, they are somehow also doing well?

It's not just brand loyalty. I have a G-Sync enabled monitor and a compatible TV; what choice do I have?

It's not as cut and dried as you think. There are 'matured' feature sets that Nvidia offer on their GPUs that also come into play. DLSS is an important one for me, as I want to hit 4K with more performance than native rendering gives.
 
Soldato
Joined
20 Aug 2019
Posts
3,031
Location
SW Florida
I'm in the EVGA queue for a 3080. If I can buy a 6800 XT before my spot in the queue comes up, I'm getting the 6800 XT (assuming independent benchmarks line up with what I saw from AMD today).

The cache/bandwidth voodoo seems to work. The raw performance is there, and the price is right.

Each brand's offering has different pros and cons, but I will be happy with either. Both brands seem to want my business this generation. Now it's just a race to see who can deliver an actual card to me first.
 
Associate
Joined
9 May 2007
Posts
1,284
They did not show performance in games like Control, which is the big ray tracing game, or Metro Exodus. My guess is the 6900 XT can't match the 3080 with RT and DLSS 2.1. Soon most newer games will have RT and DLSS features. People are guessing that RT performance for the 6800 and 6900 is better than Turing but worse than Ampere. I would wait for reviews once the cards get into the wild. As it stands, an RTX 2060 can run Control at 4K with DLSS rendering internally at 720p.
 
Associate
Joined
25 Sep 2020
Posts
128
They did not show performance in games like Control, which is the big ray tracing game, or Metro Exodus. My guess is the 6900 XT can't match the 3080 with RT and DLSS 2.1. Soon most newer games will have RT and DLSS features. People are guessing that RT performance for the 6800 and 6900 is better than Turing but worse than Ampere. I would wait for reviews once the cards get into the wild. As it stands, an RTX 2060 can run Control at 4K with DLSS rendering internally at 720p.
That's what I have been talking about with my friends: they didn't show any kind of ray tracing numbers, and I doubt they will have an answer to DLSS until RDNA3 - RDNA2 might not support the technology, who knows...

I am playing at 1440p, so it's easier for me to swallow the 10GB VRAM pill. Someone going 4K should definitely not buy the 3080 10GB; either wait for the Ti models or just go with the 6800 XT.
 
Associate
Joined
9 May 2007
Posts
1,284
That's what I have been talking about with my friends: they didn't show any kind of ray tracing numbers, and I doubt they will have an answer to DLSS until RDNA3 - RDNA2 might not support the technology, who knows...

I am playing at 1440p, so it's easier for me to swallow the 10GB VRAM pill. Someone going 4K should definitely not buy the 3080 10GB; either wait for the Ti models or just go with the 6800 XT.

Not even Port Royal, and the leaked benchmarks showed Port Royal. The 3080 is fine at 4K; DLSS has that covered.

VRAM Usage
You can see in my testing that having DLSS 2.0 enabled cuts down the amount of VRAM used at 8K by a big chunk. The GeForce RTX 2080 Ti uses over 10GB of VRAM at 8K without DLSS 2.0 enabled, but under 8GB of VRAM when DLSS 2.0 is set to the Quality preset. https://www.tweaktown.com/articles/...hmarked-at-8k-dlss-gpu-cheat-codes/index.html

VRAM Usage
Something that has come up with other games tested in 8K is VRAM usage. For some, astronomical numbers that consume the 24 GB memory of a TITAN RTX are not unheard of. Once again, DLSS 2.0 triumphs on the RTX 2080 Ti. Using it in quality mode lowered VRAM consumption by nearly 2 GB. After enabling it, usage went from 10.3 GB down to 8.4 GB, well within the limits of the card's 11 GB VRAM. By not hammering the card's VRAM limits, DLSS 2.0 has allowed it to perform to its fullest potential. In the end, Mr. Garreffa has likened DLSS 2.0 to GPU cheat codes, and we can now easily see why. https://www.thefpsreview.com/2020/0...8k-with-dlss-2-0-using-a-geforce-rtx-2080-ti/
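To make the mechanism behind those VRAM savings concrete, here is a rough sketch (not from the articles quoted above): DLSS renders internally at a lower resolution and upscales to the output. The per-axis scale factors below are NVIDIA's published preset values, and the GB figures are the 2080 Ti numbers reported at 8K.

```python
# Per-axis internal render scale for each DLSS 2.0 preset
# (NVIDIA's published values; treat as approximate).
DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w, out_h, preset):
    """Internal resolution DLSS renders at for a given output size."""
    s = DLSS_SCALE[preset]
    return round(out_w * s), round(out_h * s)

# 8K output with DLSS Quality renders at roughly 5K internally:
print(internal_resolution(7680, 4320, "Quality"))    # (5120, 2880)

# VRAM saving reported for the 2080 Ti at 8K (10.3 GB -> 8.4 GB):
print(f"{10.3 - 8.4:.1f} GB saved")                  # 1.9 GB saved
```

This also matches the earlier "2060 at 4K with DLSS set to 720p" remark: 720p is one third of 2160p per axis, i.e. the Ultra Performance preset.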

The issue with 8K is that games like Control are reported to use 20GB of VRAM, which means the 6900 XT with 16GB of VRAM may have issues keeping up. https://www.eurogamer.net/articles/digitalfoundry-2020-dlss-ultra-performance-analysis

With a small overclock, an RTX 2060 can render Death Stranding pretty much locked at 4K60.

More Death Stranding

Graphics memory (VRAM) usage

How much graphics memory does the game utilize versus your monitor resolution with different graphics cards and respective VRAM sizes? Well, let's have a look at the chart below compared to the three main tested resolutions. The listed MBs used in the chart are the measured utilized graphics memory during our testing. Keep in mind, these are never absolute values. Graphics memory usage can fluctuate per game scene and activity. This game will consume graphics memory once you start to move around in-game; memory utilization is dynamic and can change at any time. Often the denser and more complex a scene is (entering a scene with lots of buildings or vegetation, for example), the higher the utilization. With close-to-max "High" quality settings, this game tries to stay around a 5~6 GB threshold. We noticed that 4GB cards can have difficulties running the game at our settings, especially the Radeons.



Control, if you turn on everything at 8K, shows 19GB of VRAM usage.

8K gaming isn't ready for prime time but it's good to see just how far we'll need to be along the road for it now, with this test showing that even 8GB, 12GB, and hell even 16GB of VRAM might not be enough for the future of gaming.

Read more: https://www.tweaktown.com/articles/...gb-vram/index.html#RTX-Features-(Theres-LOTS)

[Chart: Control tested at 8K - NVIDIA TITAN RTX vs AMD Radeon VII showdown]

https://www.tweaktown.com/articles/9131/control-tested-8k-nvidia-titan-rtx-uses-18gb-vram/index.html

List of games at different resolutions including 4k. None going near 10GB.
https://www.tweaktown.com/tweakipedia/90/much-vram-need-1080p-1440p-4k-aa-enabled/index.html
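As a rough sense check on why the figures above climb with resolution (illustrative only; the RGBA8 buffer layout is an assumption, not from any of the linked articles): a single raw render target scales linearly with pixel count, so render targets alone cannot explain Control's ~18-19GB at 8K - high-resolution textures and ray tracing structures account for most of it.

```python
def framebuffer_mb(width, height, bytes_per_pixel=4):
    """Size of a single RGBA8 render target in MB (1 MB = 1024*1024 bytes)."""
    return width * height * bytes_per_pixel / 1024**2

# One colour buffer at each common resolution:
for name, (w, h) in {"1080p": (1920, 1080),
                     "4K":    (3840, 2160),
                     "8K":    (7680, 4320)}.items():
    print(f"{name}: {framebuffer_mb(w, h):.0f} MB per buffer")
# 1080p: 8 MB, 4K: 32 MB, 8K: 127 MB
```

Even a dozen such buffers at 8K is only ~1.5 GB, which is why the remaining VRAM budget is dominated by assets rather than resolution itself.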
 
Soldato
Joined
18 May 2010
Posts
22,382
Location
London
I did a quick Google yesterday on GDDR6X, and Micron's marketing material made it look like it is 30% faster than GDDR6.

Let's say that's true for a minute. 10GB of GDDR6X is then the equivalent of 13GB of GDDR6 in terms of bandwidth.

The issue is that some games have that little bar at the top that lets you set IQ settings, which they calculate won't exceed your VRAM limit.

To these programs, 10GB of GDDR6 vs 10GB of GDDR6X makes no difference. They will still count it as 10GB and thus set the limit to 10GB.

I'm not really arguing against the 3080 here, as I have one on order; just discussing potential issues.
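A quick back-of-envelope check on that 30% figure (a rough sketch; the 19 Gbps GDDR6X and 14 Gbps GDDR6 per-pin rates on a 3080-style 320-bit bus are assumptions, not from the thread). Note this is bandwidth, not capacity - which is exactly the post's point: a game's VRAM budget check still only sees 10GB either way.

```python
def bandwidth_gbs(gbps_per_pin, bus_width_bits):
    """Peak memory bandwidth in GB/s: per-pin data rate x bus width / 8."""
    return gbps_per_pin * bus_width_bits / 8

gddr6x = bandwidth_gbs(19, 320)   # 760.0 GB/s (assumed 3080-style config)
gddr6  = bandwidth_gbs(14, 320)   # 560.0 GB/s (same bus, GDDR6 rate)
print(f"GDDR6X advantage: {gddr6x / gddr6 - 1:.0%}")  # roughly 36%
```

So with these assumed clocks the gap is in the ballpark of Micron's ~30% claim, but a capacity-based settings slider would be unaffected.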
 