
10GB VRAM enough for the 3080? Discuss..

Status
Not open for further replies.
Not even so much that. I have a 1080 Ti currently and am looking to upgrade. It just throws me that they went from 11GB on the 1080 Ti and the 2080 Ti down to 10GB on the 3080, especially when there are already outliers approaching that limit.

I'm still unsure why they knocked it down if they deemed both previous flagships as warranting more. Either way, I'd be unhappy with a card hitting VRAM bottlenecks a year after spending £650+ on it, no matter what it had.

I think they are concerned about AMD, and about Turing's reputation as a rip-off. That, plus Ampere not being much of an architectural improvement.

My theory is as follows:

To stay ahead of AMD, they have rushed to market while only 1GB RAM modules are available.

To be competitive, they have cut costs on the 3080 by shipping it with 10GB. That lets them release at £650, roughly 30% ahead of the 2080 Ti in current games, and collect the great publicity. Be aware, though, that the 3080 isn't really 30% quicker: Nvidia have had to max these chips out to get that 30%, and OC against OC a 2080 Ti cuts the lead down to ~20%.

As the architecture hasn't improved much, they are left without much more to give from the full-fat chip, and without the margins they want. So they create a new tier with a large profit margin and new marketing (muddying gaming and production) to try to rake some of it back in: the 3090, with ample RAM for the coming years.

When 2GB modules are available, they will release a 20GB 3080 at £950 as a Super (the card they really wanted to release), plus a full-fat Titan with 48GB that is a few percent faster than the 3090.
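As a rough sanity check on the OC-against-OC claim above, here is a minimal arithmetic sketch. The performance indices and overclocking headroom figures are assumptions chosen to illustrate the reasoning, not benchmark results.

```python
# Illustrative arithmetic only; the numbers below are assumptions, not benchmarks.
# Normalise a stock 2080 Ti to 100 and take the quoted ~30% stock-vs-stock lead.
stock_2080ti = 100.0
stock_3080 = 130.0

# Assume a manual OC adds ~10% to the 2080 Ti but only ~3% to an
# already maxed-out 3080 (the "nothing left in the tank" claim above).
oc_2080ti = stock_2080ti * 1.10
oc_3080 = stock_3080 * 1.03

lead = (oc_3080 / oc_2080ti - 1) * 100
print(f"OC-vs-OC lead: {lead:.1f}%")  # roughly 20%, in line with the post
```

Under those assumed headroom figures, the 30% stock lead shrinks to just over 20%.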
 
[Quoting the "my theory is as follows" post above]

Sorry, but this reads like an AMD fanboy's reasoning.

1. The 3080 is a huge leap over the 2080, the largest generational increase in recent memory.
2. The fact that it's over 30% faster than the 2080 Ti is the icing on the cake.
3. The VAST majority do not overclock their GPUs.
4. 10GB is absolutely fine for 1080p and 1440p, which covers a large share of 3080 buyers from what I see. It's questionable for 4K: Doom Eternal already needs 9GB to run at 4K ultra. That doesn't leave much breathing room, but it will still likely be fine for at least a year or two.
5. Many don't understand how expensive GDDR6X is. Doubling the memory would push the 3080 closer to a £900 MSRP (so over £1,100 at current UK prices...), putting it out of range for many while providing little advantage to most.
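Points 4 and 5 above are both back-of-the-envelope arithmetic, sketched below. The Doom Eternal figure is the one cited in the post; the per-GB GDDR6X price is an assumption picked purely to reproduce the quoted £900 figure, not a known component cost.

```python
# Point 4: headroom left when a game already uses 9GB of a 10GB card.
vram_gb = 10
doom_eternal_4k_gb = 9  # usage figure cited in the post
headroom_pct = (vram_gb - doom_eternal_4k_gb) / vram_gb * 100
print(f"Headroom at 4K ultra: {headroom_pct:.0f}%")  # 10%

# Point 5: rough cost of doubling the GDDR6X, using an assumed per-GB price.
msrp = 650                 # GBP, 3080 launch price
assumed_price_per_gb = 25  # GBP, illustrative only
extra_cost = 10 * assumed_price_per_gb
print(f"20GB card lands near £{msrp + extra_cost} before margin")
```

Even if the per-GB figure is off, the shape of the argument holds: 10 extra GB of a premium memory type adds a triple-digit sum to the bill of materials.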
 
[Quoting the "my theory is as follows" post above]

That all makes sense; it's just odd to me, is all. I'm not sure they'll be able to add £300 to the price for another 10GB of VRAM though, and it looks like the chips can't be pushed much further, so unless they have a proper refresh coming I can't see that price happening.

I would consider a 20GB one at £750-£800 if AMD don't bring something else worthwhile, though.
 
[Quoting the "AMD fanboy's reasoning" post above]

Haha, maybe it does read like AMD. If you check my sig you'll see I'm a 2080 Ti owner. I don't take sides; I just like the high end, which is why my last AMD card was the 5870.

2. The leap isn't as huge from the Ti perspective. My 2080 Ti is under water and benches about 15-20% below the 3080's OC scores, so it's not very exciting. Rumours of the 3090 being only 10% quicker are even more disappointing.
3. I agree, but my theory is more an overall picture of the architecture and how it has moulded the pricing strategy.
4+5. Many high-end card buyers will also game at 4K. Many are concerned about 10GB, including me. I would gladly pay £950 for a 20GB variant today.
 
[Quoting the "my theory is as follows" post above]

It wouldn't surprise me if this is the reason for the 10GB release, and I do agree with you on some points.

But as has been brought up, most of us game at 1440p and 1080p, so 10GB is fine for most gamers.

And I believe the 30% figure is at 1080p only; the overall increase is about 70% at 4K and 50% at 1440p.
 
[Quoting the "10GB is fine for most gamers" post above]

The 30 percent from 2080 Ti to 3080 is at 4K. I think you are using the 2080-to-3080 comparison.
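The two posters' numbers are reconcilable once you notice the percentages use different baseline cards. The sketch below makes that explicit; the FPS figures are made up to reproduce the quoted ratios, not real benchmark data.

```python
# Why the quoted percentages differ: the uplift depends on the baseline card.
# FPS numbers are invented to match the thread's ratios, not benchmarks.
fps_4k = {"2080": 50.0, "2080 Ti": 65.0, "3080": 85.0}

def uplift(new: str, old: str) -> float:
    """Percent gain of `new` over `old` at the same settings."""
    return (fps_4k[new] / fps_4k[old] - 1) * 100

print(f"3080 vs 2080    at 4K: {uplift('3080', '2080'):.0f}%")    # ~70%
print(f"3080 vs 2080 Ti at 4K: {uplift('3080', '2080 Ti'):.0f}%")  # ~31%
```

The same 3080 result reads as a ~70% jump against a 2080 but only ~30% against a 2080 Ti, so both figures can be "right" at 4K.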
 
[Quoting the "10GB is fine for most gamers" post above]

Most people also don't spend £650 on a card, so it's a bit of a moot point to say most game at 1440p or 1080p.
 
[Quoting the "AMD fanboy's reasoning" post above]
1. Just saying, but the 2080 was TU104 and the 3080 is GA102, so the 3080 is also a step up in die class.
3. I agree in general, but I think people spending £600+ on a GPU are more likely to overclock it than people spending less.
4. I find it hard to believe that most people buying a 3080 will game at 1080p. The number gaming at 1440p vs 4K with these cards is probably quite close.
 
[Quoting the "11GB down to 10GB" post above]
I am sure AMD will have something better to cater for your needs ;)


Who is on 1080p? Yeah, I was sorta impressed over a decade ago. It's pretty much potato resolution now.
Agreed, anything under 4K is potato imo :p
 
We will only see an extra-VRAM variant if AMD pulls something out of the bag, IMO. If they do, it will most likely be close in price to the 3080, and I can't see Nvidia releasing a high-VRAM version of the 3080 at an uncompetitive price.
 
If Nvidia had released the 3080 with 12GB we would not be having this conversation. 10GB will likely be fine for most, but it will probably mean dialling down some settings in a couple of high-end titles at 4K within the next couple of years.
 
Who is on 1080p? Yeah, I was sorta impressed over a decade ago. It's pretty much potato resolution now.

People who suffer from eyestrain, and laptop gamers.

I might be in the minority, I don't know, but I think the lack of a big watt-for-watt 1080p performance boost is a bigger problem than some make out.

If the 3080 delivered what it does at stock with 1080-level power draw, I think it would be lauded as one of the best cards ever made. As it stands, placing a GTX 1080 into a laptop was already a problem at 185 watts; placing a 3080 into a laptop at 338 watts is going to nerf its performance significantly.

Actually, I'm quite interested to see what happens when they run the chip in the 185-200 watt laptop range. I don't think the results will be good, and Big Navi might make a very convincing argument at that sort of wattage, especially if they pair it with HBM, as seen in the latest MacBook Pro.
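The watt-for-watt point above can be made concrete with a quick performance-per-watt comparison. The power figures are the ones quoted in the post; the raw performance index for the 3080 is an assumption, so treat this as a sketch of the argument rather than a measurement.

```python
# Perf-per-watt sketch; the performance indices are assumptions.
# Board power figures are the ones quoted in the post above.
cards = {
    "GTX 1080": {"perf": 100.0, "watts": 185.0},
    "RTX 3080": {"perf": 220.0, "watts": 338.0},  # assumed ~2.2x raw perf
}

for name, c in cards.items():
    # A big raw lead shrinks to a modest efficiency lead once power
    # is factored in -- which is exactly the laptop problem above.
    print(f"{name}: {c['perf'] / c['watts']:.2f} perf/W")
```

A laptop power budget caps the watts, so what matters there is the perf/W column, and the gap between the two cards is far smaller than the raw performance gap.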
 
Short of your MS Flight Sims etc., are there even any mainstream games that come close to using 10GB of VRAM? I've got a Gigabyte RTX 2080 Gaming OC with 8GB, and for your CoD/Witcher 3/Rainbow Six/RPGs, even at max settings at 3440x1440 it hasn't seemed to be anywhere close to a limiting factor.

But if you use a lot of mods, you can go over 8GB, and apparently over 10GB.
 
My modded Skyrim SE, my most-played game of all time, uses 9GB of VRAM!

And I don't have that many mods installed compared to many people. This is at 3440x1440, by the way, not even 4K.

People can play mental gymnastics and new-toy self-delusion all they want, but 10GB for a new flagship card in 2020 just smells a bit 'off'.
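It may help to see why texture mods chew through VRAM so quickly. The sketch below sizes a single uncompressed 4K texture with a full mip chain; real games use block-compressed formats (BCn) that shrink this several-fold, and the mod-pack size is an illustrative assumption, not a measurement of anyone's load order.

```python
# Rough size of one uncompressed 4K RGBA8 texture with a full mip chain.
# Illustrative only: shipping games typically use compressed (BCn) textures.
width = height = 4096
bytes_per_pixel = 4  # RGBA8, 8 bits per channel
base_mb = width * height * bytes_per_pixel / 1024**2  # top mip: 64 MB
with_mips_mb = base_mb * 4 / 3  # the mip chain adds ~1/3 on top

print(f"One 4K RGBA8 texture: ~{with_mips_mb:.0f} MB")
# A hypothetical pack replacing ~100 textures at this size would want
# ~8.3 GB on its own, before base assets and framebuffers.
```

At ~85 MB per uncompressed texture, it only takes a modest replacer pack to explain a modded game sitting at 9GB.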
 