10GB VRAM enough for the 3080? Discuss...

Status
Not open for further replies.
Soldato
Joined
12 May 2014
Posts
5,235
I'd have thought that consoles would need to be over-specced in the first instance, as they're designed to last a lot longer than a GPU and need to be able to offer the promise of later games for much longer. This would be a contributing factor to them being loss-leaders, so whether they felt it was too much RAM at the moment is not the point, I'd guess. They need to look longer term.
For the record, I don't think the consoles have too much VRAM. I was merely pointing out that if they thought they had too much, they had options. At this point your fight is with princessfrosty. Tagging you in. Have fun :D.

A GPU is not designed to stay cutting edge for the same length of time. 20GB of VRAM is overkill and would cost more than is currently needed. If it's a choice between 10GB and 20GB, from a purely bang-for-buck view, 10GB is about right.
I don't think a GPU should hit a VRAM bottleneck when gaming (within reason).

The reason the 3080 has 10GB is that they didn't want to give consumers too good a deal, and they got caught with their pants down.

Realistically, Nvidia probably wanted to aim for at least 12GB (most likely 16GB before they heard about AMD's comeback) but couldn't cut the die in a way that didn't block off their options if AMD released something faster a few months later. Hence we have this situation. Nvidia did the maths and balanced cost against the need to screw over gamers :D.
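For context on why the realistic choice is 10GB or 20GB rather than 12GB or 16GB: each GDDR6X chip connects over a 32-bit interface, so the 3080's 320-bit bus takes exactly ten chips, and total capacity is ten times the per-chip density. A minimal sketch of that arithmetic (the bus and chip figures are public specs; the script itself is just illustrative):

```python
# Why a 320-bit GDDR6X bus yields 10GB or 20GB, never 12GB or 16GB.
BUS_WIDTH_BITS = 320        # RTX 3080 memory bus width
CHIP_INTERFACE_BITS = 32    # interface width of one GDDR6X chip
CHIP_DENSITIES_GB = (1, 2)  # densities available around launch

chips = BUS_WIDTH_BITS // CHIP_INTERFACE_BITS  # -> 10 chips
for density_gb in CHIP_DENSITIES_GB:
    print(f"{chips} chips x {density_gb}GB = {chips * density_gb}GB total VRAM")

# 10 chips x 1GB = 10GB total VRAM
# 10 chips x 2GB = 20GB total VRAM
# A 12GB card needs a 384-bit bus (as the later 3080 12GB used),
# so hitting 12GB or 16GB means a different die cut entirely.
```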
 

TNA

Caporegime
Joined
13 Mar 2008
Posts
27,508
Location
Greater London
Cool. It just seems a bit odd to say that the cost increase of adding a 1GB module (if that even works with the architecture; I have no idea on this) would be minimal, when cards cost an insane amount at the moment. There are obviously a lot of factors driving that cost, but saying more should be added seems really strange.

Also, pricing-wise, £649 for a high-end GPU with 10GB (at MSRP; not realistic at the moment, but it was the design goal of the card) seems reasonable progress compared to a 2080 Ti with 11GB of slower memory, which cost a damn sight more at launch (was it £1,100?).

Price-wise: expensive, but progress. Would 1GB extra make that much of a difference? I've yet to see our 3080 struggle at 4K with anything mainstream except Cyberpunk, as far as I'm aware.
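Taking the thread's figures at face value (the £649 MSRP, and the poster's uncertain £1,100 recollection for the 2080 Ti launch price), the per-gigabyte comparison works out as below; a quick illustrative sketch, not an authoritative pricing analysis:

```python
# Price per GB of VRAM using the figures quoted above
# (£1,100 for the 2080 Ti at launch is the poster's uncertain recollection).
cards = {
    "RTX 2080 Ti (11GB, launch)": (1100, 11),
    "RTX 3080 (10GB, MSRP)": (649, 10),
}
for name, (price_gbp, vram_gb) in cards.items():
    print(f"{name}: ~£{price_gbp / vram_gb:.0f} per GB")

# RTX 2080 Ti (11GB, launch): ~£100 per GB
# RTX 3080 (10GB, MSRP): ~£65 per GB
# One less GB, but roughly a third cheaper per GB, and the 3080's
# GDDR6X is faster per pin than the 2080 Ti's GDDR6.
```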

I'd have thought that consoles would need to be over-specced in the first instance, as they're designed to last a lot longer than a GPU and need to be able to offer the promise of later games for much longer. This would be a contributing factor to them being loss-leaders, so whether they felt it was too much RAM at the moment is not the point, I'd guess. They need to look longer term.

A GPU is not designed to stay cutting edge for the same length of time. 20GB of VRAM is overkill and would cost more than is currently needed. If it's a choice between 10GB and 20GB, from a purely bang-for-buck view, 10GB is about right.
Well said. You seem to have more common sense than some of our regular posters here. Nice to have you on board, Mr Turnip :D
 
Associate
Joined
1 Oct 2020
Posts
1,145
For the record, I don't think the consoles have too much VRAM. I was merely pointing out that if they thought they had too much, they had options. At this point your fight is with princessfrosty. Tagging you in. Have fun :D.

No need to fight. I think the gist of his argument was that it was a considered engineering/accounting decision (the best they can do vs. what they can afford to lose), and I completely agree. Same with the Nvidia GPU: it's not yet showing a VRAM-related issue.

3080 will be EOL soon and people are still debating the memory setup.

The original models are, aren't they? :p
 

TNA

Caporegime
Joined
13 Mar 2008
Posts
27,508
Location
Greater London
3080 will be EOL soon and people are still debating the memory setup.
Exactly that. 10GB is not a big issue. Next year people will just buy the 4070 instead, which will have more power and more VRAM anyway. In the meantime the 3080 will have done its job over its life cycle. But as I say, that won't stop people dying to say "I told you so"; from what I can see that battle is already lost anyway, as I can't see anything coming out in the next year that will push it.
 
Caporegime
Joined
4 Jun 2009
Posts
31,016
The 3080 is serving me far better than I expected, tbh, at 3440 @ 144Hz and 4K @ 60, so unless ray tracing performance jumps significantly I'll probably skip RDNA 3 and Nvidia's 40xx cards. I certainly won't be upgrading for more VRAM anyway ;)
 

TNA

Caporegime
Joined
13 Mar 2008
Posts
27,508
Location
Greater London
The 3080 is serving me far better than I expected, tbh, at 3440 @ 144Hz and 4K @ 60, so unless ray tracing performance jumps significantly I'll probably skip RDNA 3 and Nvidia's 40xx cards. I certainly won't be upgrading for more VRAM anyway ;)
What?? Ya mad bro? 10GB ain't enough today (need 24GB at least), so how will you make do with it for another 3-4 years???

Some will have you believe that every new game out in the near future will be unplayable due to only having 10GB :p
 
Soldato
OP
Joined
26 Aug 2004
Posts
5,032
Location
South Wales
The 3080 is serving me far better than I expected, tbh, at 3440 @ 144Hz and 4K @ 60, so unless ray tracing performance jumps significantly I'll probably skip RDNA 3 and Nvidia's 40xx cards. I certainly won't be upgrading for more VRAM anyway ;)
Since getting my new 4K screen I've noticed 60fps looks a little choppy. I don't know if it's the screen or the fact that I've grown so used to high refresh rates that dropping to around 70fps or lower doesn't look all that smooth now. But then this is my third high-refresh screen since 2013, so I've probably just got an eye for it now.
My 3080 handles what I play fine at this res, mostly with high frame rates, but I hope the next cards are a big leap, to drive 144+Hz screens.

The 4080 will surely have 16GB of VRAM, I'd have thought?
 
Soldato
Joined
24 Sep 2013
Posts
2,890
Location
Exmouth, Devon
Since getting my new 4K screen I've noticed 60fps looks a little choppy. I don't know if it's the screen or the fact that I've grown so used to high refresh rates that dropping to around 70fps or lower doesn't look all that smooth now. But then this is my third high-refresh screen since 2013, so I've probably just got an eye for it now.
My 3080 handles what I play fine at this res, mostly with high frame rates, but I hope the next cards are a big leap, to drive 144+Hz screens.

The 4080 will surely have 16GB of VRAM, I'd have thought?


Once you've 'seen'/experienced high refresh coupled with high FPS, and now with VRR, there's no going back. If you have VRR and your fps is 70, the panel will refresh at 70Hz, so it sticks out a mile (see the sketch below). One of the biggest visual upgrades is a high-refresh panel.

I take it your previous two panels were lower resolution?
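To make the VRR point concrete: with variable refresh active, the panel's refresh rate tracks the frame rate inside the display's supported window. A simplified model is sketched below; the 48-144Hz window is an assumed, typical range (not stated in the thread), and real low-framerate-compensation heuristics are driver-specific:

```python
# Simplified model of VRR: the panel refreshes in step with the frame rate,
# clamped to an assumed 48-144Hz VRR window.

VRR_MIN_HZ = 48    # assumed lower bound of the panel's VRR range
VRR_MAX_HZ = 144   # assumed upper bound (the panel's native refresh)

def effective_refresh_hz(fps: int) -> int:
    """Refresh rate the panel actually runs at for a given frame rate."""
    if fps < VRR_MIN_HZ:
        # Low framerate compensation: repeat each frame enough times
        # to bring the refresh back inside the window (30fps -> 60Hz).
        multiple = -(-VRR_MIN_HZ // fps)  # ceiling division
        return fps * multiple
    return min(fps, VRR_MAX_HZ)

for fps in (30, 70, 144, 200):
    print(f"{fps}fps -> panel refreshes at {effective_refresh_hz(fps)}Hz")

# 70fps -> 70Hz is exactly the 'sticks out a mile' case described above:
# motion is only as smooth as the frame rate, never the panel maximum.
```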

With your games on a 4K panel, do you go into the graphics options and whack everything up to max, or select the ultra preset for everything?
 
Caporegime
Joined
4 Jun 2009
Posts
31,016
I don't see the point in keeping a high-end GPU for 4 years.

Depends on how much games advance, tbh.

As we have seen already:

- DLSS/FSR (and soon Intel's version) are giving GPUs a much longer lifespan now, and no doubt this kind of tech will improve further over the years
- as evidenced by many titles, ray tracing will be held back for the majority of games until the next-gen consoles are out

The 30xx and 6800 XT/6900 XT are still quite a bit more powerful than the PS5/Xbox Series X, so I reckon this gen of GPUs will last a decent while yet.
 
Soldato
Joined
22 Nov 2018
Posts
2,715
Depends on how much games advance, tbh.

As we have seen already:

- DLSS/FSR (and soon Intel's version) are giving GPUs a much longer lifespan now, and no doubt this kind of tech will improve further over the years
- as evidenced by many titles, ray tracing will be held back for the majority of games until the next-gen consoles are out

The 30xx and 6800 XT/6900 XT are still quite a bit more powerful than the PS5/Xbox Series X, so I reckon this gen of GPUs will last a decent while yet.

Yeah, but if somebody is happy with 4-year-old performance, why not buy a 4-year-old GPU?

Some people are still using really old Titans, etc. If you're happy to use a GPU of that age, then why not buy a GPU of that age and save a lot of money?
 
Associate
Joined
4 Jun 2021
Posts
459
Location
Yorkshire
Yeah, but if somebody is happy with 4-year-old performance, why not buy a 4-year-old GPU?

Some people are still using really old Titans, etc. If you're happy to use a GPU of that age, then why not buy a GPU of that age and save a lot of money?

I think I'm missing your point.

As the owner of a GPU that's nearly 4 years old, buying a 4-year-old GPU would just leave me exactly where I am now, wouldn't it? Not buying a GPU at all seems the better course.

On the other hand, I could buy a current top-end GPU and keep that for another 4-5 years.
 

TNA

Caporegime
Joined
13 Mar 2008
Posts
27,508
Location
Greater London
In most cases it is better, in my opinion, to sell and buy when new cards come out. You have something new to play with, you have current hardware, and usually there is not a whole lot of difference between the xx70 and the last generation's top offering, apart from the price, that is, which is usually a half or a third :p

Take a look at the Members Market: plenty of people are willing to pay top dollar for old used hardware, so an upgrade need not cost much.

@Th0nt, you should sell your 3090 in about a year's time for £1,199 on the Members Market. Then either get a 4080, which will no doubt be better, and pocket the rest, or add a few more quid and get a 4090 if you cannot stomach less than 24GB of VRAM ;)
 
Associate
Joined
1 Oct 2020
Posts
1,145
Yeah, but if somebody is happy with 4-year-old performance, why not buy a 4-year-old GPU?

Some people are still using really old Titans, etc. If you're happy to use a GPU of that age, then why not buy a GPU of that age and save a lot of money?

Agreed, but these people probably already own a 4-year-old GPU, so they wouldn't look to buy one; they'd be satisfied with what they have.

You would buy a new GPU with the intention of it lasting a good number of years. DLSS and FSR are tools which may help lengthen the life of these GPUs, but we'll need to see how widely implemented they become.
 