
10GB VRAM enough for the 3080? Discuss..

Any chance you still have Surge 2 installed? :p

Lol, I know I proper failed at testing them games.

I will play it today for ya, I wanna give the game a go anyway and haven't been back to it.

So 4K, ultra, vsync off, yeah? Any particular AA you want on, or any game-breaking settings you want off?
 
You nutter.

I never once said the LG CX OLED didn't have freesync.

Yes you did:

"Depends, Nvidia still has many plus points. A lot of people want the Nvidia for gsync to go with their LG OLED tvs."

If you don't see how this statement has already suggested, to the thousands who've read it, that only Nvidia cards support Gsync-type technology on the LG OLED range, you need to study the written word.
Where did he say the LG CX didn't have Freesync?

You seem to have a problem with projecting a lot. You read one thing and understand something else.
 
10GB is not enough. Heck, the 1080 Ti has 11GB, and ultrawide and 4K are becoming the norm.

It's borderline enough for 1440p; for 4K it's about the minimum you want entering next gen, so 12GB would be perfect.

https://videocardz.com/newz/nvidia-preparing-geforce-rtx-3080-ti-with-9984-cuda-cores

If this comes out for $749-$799 I'm buying this with 12GB VRAM :)

Although I will be playing at 1440p, it's never bad to get the higher-VRAM model, plus it should be 5-10% faster than a 3080, so not too bad a value either if this ends up being a real thing.
 
Doesn't look like 10GB will be a problem for Watch Dogs, seeing as it barely runs at 4K on a 3090.

4K, no DLSS, no RTX: 60 fps
4K, DLSS Quality, no RTX: 90 fps
4K, no DLSS, RTX on: 30 fps
4K, DLSS Quality, RTX on: 50 fps
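Those DLSS gains line up with simple pixel maths. A rough sketch, assuming DLSS Quality renders internally at about 67% scale per axis (the commonly cited figure) before upscaling to 4K:

```python
# Pixel-count math behind the DLSS Quality jump above.
# Assumption: Quality mode renders at ~67% scale per axis, i.e. a
# 2560x1440 internal frame upscaled to 3840x2160.
native = 3840 * 2160      # ~8.29M pixels
internal = 2560 * 1440    # ~3.69M pixels
ratio = internal / native
print(f"DLSS Quality shades only {ratio:.0%} of native 4K pixels")
# If cost scaled purely with pixel count, 60 fps would become ~135 fps;
# the measured 90 fps shows plenty of non-resolution-bound work too.
print(f"Naive estimate: {60 / ratio:.0f} fps vs the measured 90 fps")
```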

 
Fresh news today of the upcoming 3080 Ti having 12GB of VRAM. This will be priced much more competitively than a 20GB 3080 Ti would have been; it's most likely the 1080 Ti-type card of this generation, especially if it's on TSMC 7nm.
Priced competitively? Yeah, right. There's a huge gap in price between the 3080 and 3090; this is not going to be priced competitively. It's not 7nm either, and the 1080 Ti was way faster than the 1080. Any card squeezed into the small performance gap between the 3080 and 3090 is a cash grab for the people who thought the 3080 was too cheap.
 
If it's a response to AMD, the price will be tied to AMD's offering. Nvidia doesn't automatically make more money with higher prices; when prices get too high, they sell fewer cards. If Nvidia sets prices too high while there's real competition from AMD, they lose a LOT of sales.
 
So you don't see a 3080 Ti being priced at something like $799?

Maybe Nvidia will do something of that sort if the 6800 XT beats the 3080 and is priced low.
 
Please, I would like to know. Can you help us with a model for VRAM sizing?

No, because I'm not a games developer or qualified to do so. The only takeaway I can give you is that monitoring VRAM usage via tools such as Afterburner, which communicate with NVAPI, doesn't tell you anything worth knowing. DirectX 12 gives developers far more control over middleware than ever before, meaning that high allocation cannot always be attributed to what is needed for optimum performance.
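For anyone who wants to eyeball the allocation number anyway, here's a minimal sketch using the pynvml bindings (NVML is the same interface nvidia-smi uses; like the counter Afterburner reads via NVAPI, its "used" figure reports what's allocated, not what the game actually needs):

```python
# pip install nvidia-ml-py
# Reads per-GPU memory counters via NVML. Note: "used" means
# *allocated* on the device, not the game's true working set.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
gib = 2**30
print(f"total:     {mem.total / gib:.1f} GiB")
print(f"allocated: {mem.used / gib:.1f} GiB")
print(f"free:      {mem.free / gib:.1f} GiB")
pynvml.nvmlShutdown()
```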
 
All I see is 85 pages of people spouting conjecture and incomplete information, with no real-world experience (or understanding) of how DX12 middleware and NVAPI work.
Lol, but you don't need a technical understanding on modern internet forums. All that's required to prove your point is one or two outlier cases and gut feeling.
 
So we are basically a bit worried about 4K, but do we all agree 10GB is more than enough for 1440p?

From what I see, 99% of games are using about 4GB-6GB at 1440p at the highest settings, and around 6GB-8GB at 4K at the highest settings.

With true next-gen games, maybe that 1440p usage will move to 6GB-8GB, and 4K might move to something like 8GB-10GB or 8GB-12GB.

This is all purely speculative, but I'm hoping these cards last a while... As far as I know, even the 780 Ti lasted about three years before it could no longer run most games at ultra settings.

We are not being gimped as badly as we were by the 780 Ti: it had 35% of the PS4's total memory as VRAM, while the 3080 has 65% of the PS5's total memory pool.
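Quick back-of-envelope on those percentages (they round the real ratios of 37.5% and 62.5%), plus a look at why VRAM usage grows slower than pixel count; the RGBA8 render-target figure is an illustrative assumption, since real engines mix formats:

```python
# VRAM as a share of each console generation's unified memory.
pairs = {
    "780 Ti (3GB) vs PS4 (8GB)":  3 / 8,    # 37.5%, quoted above as ~35%
    "3080 (10GB) vs PS5 (16GB)":  10 / 16,  # 62.5%, quoted above as ~65%
}
for label, share in pairs.items():
    print(f"{label}: {share:.1%}")

# Resolution-dependent render targets are a small slice of VRAM.
# One RGBA8 target costs 4 bytes per pixel (illustrative assumption):
for name, (w, h) in {"1440p": (2560, 1440), "4K": (3840, 2160)}.items():
    print(f"{name}: {w * h * 4 / 2**20:.0f} MB per RGBA8 target")
# Even a dozen such targets is under 0.5GB at 4K; textures and meshes,
# which don't scale with output resolution, fill most of the rest.
```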

The next-gen consoles have a total of 16GB of RAM. Take away the OS, housekeeping features and game code/data, and you will be lucky to be left with 10GB free for graphics. The upcoming Xbox even has a 10GB/6GB split in its RAM.
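As a back-of-envelope sketch of that budget (the OS and game-data reservations below are illustrative assumptions, not published figures):

```python
# Rough next-gen console memory budget. Reserve figures are
# illustrative assumptions, not official numbers.
total = 16.0           # GiB of unified memory
os_reserve = 2.5       # OS + housekeeping (assumption)
game_code_data = 3.5   # CPU-side game code and data (assumption)
graphics = total - os_reserve - game_code_data
print(f"~{graphics:.0f} GiB left for graphics-type data")  # ~10 GiB
```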

Then consider that Microsoft, Sony, AMD, Nvidia and presumably Intel see streaming directly to the GPU as the way forward.

Ampere looks like a great GPU, but turn all the settings up and it still struggles with 4K. You need to turn the settings down and then rely on DLSS.

Then there is the horrible power usage and heat output from these cards.

Everything suggests you will want, if not need, a new GPU before you need more than 10GB, so we are not worried about 10GB for 4K. Only the muppets who know nothing about such things beyond seeing a number are worried.
 