
10GB VRAM enough for the 3080? Discuss..

I think you misunderstood what he was saying. It's very possible that the publisher and developer could have lowered the specs to get more people to buy the game.

Well, it seems we just ain't gonna agree on this subject, are we?

It does kinda feel like a conversation from 2016 or earlier.

Obviously, moving forward over the next 3-4 years I expect VRAM to become more and more important. But for the next two years I think 10GB will be fine. I'm also at 1440p, so it shouldn't be a big issue.
 
So that's the problem: you don't see anything beyond 2016. Okay then, here's the deal: developers are transitioning to the new consoles, which are far more bandwidth-sensitive than they have ever been. During that transition you see bandwidth-sensitive games like Flight Simulator 2020, Horizon Zero Dawn, etc. That is not a coincidence on the precipice of a new console release.

We are given a glimpse of that with the Unreal Engine 5 demo, which streams assets at an incredible rate, unheard of on the PC. We have also seen demos of console games such as Ratchet & Clank showing ray tracing and streamed assets in an open-world environment. None of this was available in 2016.

So it's not a matter of whether we agree or disagree; it's a matter of keeping up with the times and using critical thinking: making sound judgments by predicting market trends based on the observations noted in the above examples, seeing which way gaming will go in the very near future, and using that knowledge to make sound purchasing decisions.

It's not up for debate and you don't need anybody to agree or disagree with you about it.

Nvidia plainly stated that these new SKUs are a budget option. They charge you a premium and call it a flagship, and because they didn't add another 200 for the Nvidia tax you think it's a bargain. Not only is it less performance, but they used less VRAM, and by some early reports it does not seem to overclock very well. And they are having yield issues, which typically has an effect on the rate of return once the product becomes available for sale.

Because we know Nvidia is rushing these products to market to beat the console and RDNA2 releases. Based on these observations, it is very hard to take you seriously when you defend Nvidia in a way that contradicts these basic trends. LOL
 
You do have a valid point, but the trouble is, the way some people talk on here, no one is going to buy a 3080 with 10GB of VRAM because it's going to cause "problems". However, that's not going to be the case. Are these people fools who won't be able to play any games? Course not.

I'm still expecting them to be rather toasty, which may put people off more than the lack of VRAM, just based on the new FE cooler design, IMHO.
 

RTX IO and whatever AMD will call it are bringing this tech to the PC.

It's a platform. When the 3070 and the 3080 penetrate the market, developers of PC games will know that's what they are going to be targeting.

Game devs will not bring games to PC that use up 12GB of VRAM, because they know the vast majority of gamers out there on PC will not be able to play them.

Anyway, take that as an example: a game uses 12GB of VRAM on a console. That leaves just 4GB for the console's OS, AND the game needs RAM as well. The consoles are cutting it thin too with their combined 16GB.

It's more likely that 10GB of that RAM will be used as VRAM and the remaining 6GB for the OS and the game's RAM (which isn't a lot either).

I do believe that what you are arguing, but not saying very well, is that on the PC there will come a game or two which, at 4K with a high-res texture pack and all the other visuals, will need more than 10GB of VRAM. That I can understand and agree with. But we are talking about a PC-to-PC comparison here, not PC-to-console, because the console a) probably won't be able to reach the same visual fidelity as the PC and b) wouldn't have enough VRAM to run the high-res texture pack with all the bells and whistles on either.

Also, regarding the consoles: firstly, it's not apples to apples to compare console to PC, as they have different architectures. And secondly, the full 16GB will not be available to games as VRAM.

You've got the OS which needs RAM, the game needs RAM, and then what's left can be used as VRAM. Plus, as has been pointed out, not all the RAM in the consoles is of the same speed. I am presuming the faster RAM will be used for the VRAM allocation and the slower RAM for the OS and for the game's RAM itself.

So I think the thing here is to clarify whether we are talking PC vs console or PC vs PC, because they are two different conversations.

I am not denying that more RAM is always better. But that is a PC vs PC conversation, not PC vs console, because I believe the unified 16GB of RAM on the console is also a limiting factor when you consider it as I have laid out above.

A PC with 16GB of RAM and a 10GB 3080 actually has MORE total RAM available to it.
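To put rough numbers on that comparison, here's a minimal sketch; the 10 + 6 console split and the 16GB system RAM figure are this post's assumptions, not official specs:

```python
# Back-of-the-envelope memory budgets from the argument above (all figures in GB).
# The 10 + 6 console split is an assumption made in this post, not a spec.

console_total = 16                               # unified pool on the new consoles
console_vram = 10                                # assumed share used as VRAM
console_leftover = console_total - console_vram  # 6 GB left for OS + game RAM

pc_system_ram = 16                               # a typical gaming PC
pc_vram = 10                                     # RTX 3080
pc_total = pc_system_ram + pc_vram               # 26 GB combined

print(f"Console: {console_vram} GB as VRAM, {console_leftover} GB left for OS/game")
print(f"PC: {pc_total} GB combined vs the console's {console_total} GB unified pool")
```

On those assumptions, the PC side has 26GB to play with against the console's 16GB, which is the point being made.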

It's already £650 for a 10GB card. A card with more VRAM will be more expensive anyway.

It is what it is though, isn't it? Currently, if you want the fastest card Nvidia has to offer (excluding spending silly money), you get a 3080.

I'm on a 1080 and have been waiting impatiently for these cards to drop. I don't think it's right to wait even longer for cards we don't even know exist or when they might arrive.

I can't see new cards arriving before Xmas from Nvidia.
 
Game devs will not bring games to PC that use up 12GB of VRAM, because they know the vast majority of gamers out there on PC will not be able to play them.

It's more likely that 10GB of that RAM will be used as VRAM and the remaining 6GB for the OS and the game's RAM (which isn't a lot either).

I am not denying that more RAM is always better. But that is a PC vs PC conversation, not PC vs console, because I believe the unified 16GB of RAM on the console is also a limiting factor when you consider it as I have laid out above.

A PC with 16GB of RAM and a 10GB 3080 actually has MORE total RAM available to it.

I'm on a 1080 and have been waiting impatiently for these cards to drop. I don't think it's right to wait even longer for cards...
To your first point: you are completely wrong. This gen of consoles only had 8GB; the PC needed 16GB to compensate. Now they've doubled it to 16GB. Believe it or not, and I know you don't, consoles are the trendsetters. Nvidia has absolutely no sway in the console gaming market, except for a title here and there.

To the next point you made: it is not likely to be limited to 10GB. I provided a slide earlier in this thread, directly from Microsoft, which clearly states 10 + 3.5 for games. I believe you already know this but are still trying to spin it to justify the VRAM on the 3080, which Nvidia clearly stated was a budget option. They are simply charging you more for that budget option because they know you will buy it.
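For anyone who missed the slide, here's a small sketch of the split being referenced; the figures match Microsoft's published Series X memory specs (bandwidths included for context):

```python
# Xbox Series X memory layout per Microsoft's published specs (GB).
gpu_optimal = 10.0   # faster portion (560 GB/s), intended for GPU use
standard = 6.0       # slower portion (336 GB/s)
os_reserve = 2.5     # reserved for the OS out of the standard portion

game_budget = gpu_optimal + (standard - os_reserve)  # 10 + 3.5 = 13.5 GB
print(f"Games get {game_budget} GB of the {gpu_optimal + standard} GB pool")
```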

So your admission clearly shows that you are aware that in this instance more RAM is better. However, you omit that the Windows 10 operating system is a far cry from what the consoles use, which is more streamlined and efficient. The consoles do not require as much RAM as the PC does. This is basic, rudimentary knowledge. And it's also rudimentary to know that as Microsoft and Sony provide updates, the OS memory footprint will shrink, in which case developers will have more memory to work with.

And to your next point: that is a fundamental flaw in your understanding of DDR4 vs GDDR6. You are creating a false equivalence between the two.

And your final point clearly demonstrates a lack of restraint and a compulsion to buy something, which is driving you to do what you do.

Ladies and gentlemen who read this post: his last statement clearly demonstrates why he's defending Nvidia. He can't wait to buy from them and will only buy from them. This is called advertising. He's simply promoting Nvidia products. Lol

The cat is out of the bag now. :D

Again: Nvidia plainly stated that these new SKUs are a budget option. They charge you a premium and call it a flagship, and because they didn't add another 200 for the Nvidia tax you think it's a bargain. Not only is it less performance, but they used less VRAM, and by some early reports it does not seem to overclock very well. And they are having yield issues, which typically has an effect on the rate of return once the product becomes available for sale.

Because we know Nvidia is rushing these products to market to beat the console and RDNA2 releases. Based on these observations, it is very hard to take you seriously when you defend Nvidia in a way that contradicts these basic trends.

And as to why I didn't quote everything in your post? I cut out the bloat to get to the point. Just like what the next-gen consoles do compared to Windows 10, which is a service... the consoles are gaming systems. :p
 

Nope. I have a G-Sync monitor, so I can only buy Nvidia to make the most of my monitor. So my options are thus: a 3070, a 3080, or waiting even longer for a card we don't know is coming, or when.
 
As the old saying goes, "you knew what this was". You knew G-Sync was a closed standard, which tied you into Nvidia no matter what. Therefore that is not an excuse, just a bad purchase on your part. FreeSync has always been, and will always be, open to the market. And now that Nvidia supports FreeSync, you have no excuse.

You can do what everybody else has done: put it up for sale and buy a better FreeSync monitor. That way you're not locked into buying just Nvidia.
 

No, not a bad purchase at all. I believe G-Sync is superior to FreeSync, but that's another conversation entirely.
 
UPSCALED 8K. Not true 8K. Big difference.
Yup. Even makes 8K attainable on a budget.
(Image: Death Stranding 8K benchmark results)
 

Not sure if you're new to PC gaming or not? Generally, games have graphical presets for different system configurations. If you don't have enough VRAM, you'll be recommended the lower graphical presets, such as High instead of Ultra. This is how it's always worked.

There are millions of GPUs already out there with more than 10GB of VRAM, and several hundred million next-gen consoles (over a few years) about to be released, with 16GB of total memory. Game developers will of course add higher-resolution textures, effects, etc. that utilise > 10GB of memory.

How do we know this? They already do. At 4K (which will be mainstream upon next-gen console release) there are plenty of games that need 10GB or more. If you're still at 1080p or 1440p, of course you can get by with much less VRAM, as these are lower resolutions and much easier to render. I'd argue the 3080 is complete overkill for 1080p, and even a bit wasteful for 1440p, unless you have a 240Hz 1440p monitor.
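If you'd rather measure than argue, one way to watch actual VRAM use while a game runs is a quick sketch like this, assuming the pynvml package is installed (NVIDIA cards only, so it won't help on e.g. a Radeon VII):

```python
# Sketch: poll VRAM usage every few seconds via NVIDIA's NVML bindings.
# Requires: pip install pynvml. NVIDIA GPUs only.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        info = pynvml.nvmlDeviceGetMemoryInfo(handle)  # bytes: total/used/free
        print(f"VRAM used: {info.used / 2**30:.1f} / {info.total / 2**30:.1f} GiB")
        time.sleep(5)
finally:
    pynvml.nvmlShutdown()
```

Bear in mind this reports what's allocated, and games often allocate more than they strictly need, which is half the reason these threads never agree.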

There's a reason all of Nvidia's generational % improvements are measured at 4K... ;)
 

And on that quick point, what's the current consensus on 8GB for 1440p? I.e., the 3070?
 
The 3GB 780 Ti was enough in 2013 when the consoles launched with 8GB. That's 3GB of VRAM vs 8GB. It wasn't a problem.

Now, in 2020, 10GB of VRAM is suddenly a problem vs the consoles' 16GB? It's a smaller gap than in 2013.
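As a quick sketch of why that's a smaller gap, compare the card's VRAM as a fraction of the console's unified memory in each era:

```python
# VRAM as a fraction of the consoles' unified memory, then vs now.
ratio_2013 = 3 / 8    # 780 Ti's 3 GB vs the 2013 consoles' 8 GB
ratio_2020 = 10 / 16  # 3080's 10 GB vs the new consoles' 16 GB

print(f"2013: {ratio_2013:.0%} of console memory")  # ~38%
print(f"2020: {ratio_2020:.0%} of console memory")  # ~62%
```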

Doesn't anybody remember this??? It wasn't a problem. Stop panicking.

780 Ti performance ended up falling off a cliff compared to the 290X's 4GB.

There's no such thing as AMD "fine wine"; it's simply the framebuffer making itself heard as time and gaming technology move ever onward.
 
And on that quick point, what's the current consensus on 8GB for 1440p? I.e., the 3070?

I think 8GB is still okay for 1440p. I've personally not seen more than 8GB used at 1440p on my Radeon VII; I think the most I've seen is around 7GB, though more commonly 5-6GB.

Hard to say how PS5/Xbox Series X game ports will fare at 1440p a few months or a year from now, though; we'll have to see.
 