
10GB VRAM enough for the 3080? Discuss..

To quote HUB

"I did warn sticking 8 gigs of VRAM on a 800USD card is a bad idea"

Finally someone addressing the VRAM to price/power mismatch from Nvidia.

Timestamped link.

 

800USD???? 8GB, so we're referring to the 3070, I presume? Wasn't the MSRP $499? If someone chose to pay a scalper price of $800+, that's their fault, not Nvidia's, the etailers' or the AIBs'..... And those who did pay scalper prices for RDNA 2 and Ampere are the reason we're in the mess we're in now :o
 
I felt that 10GB was going to be an issue for my 3080 at 4K if I had planned to hold on another year. There were a few games where it caused issues: FPS was 50-60+ and then tanked from hitting the VRAM limit (DCS and FC6 for me; a quick way to watch VRAM usage while playing is sketched below). Yet it was still argued that the 3080 would run out of rendering power before the smallish VRAM became a problem at 4K.

So despite these issues, many of us 3080 owners running 4K were told our GPU was fine, and that no matter what game we had VRAM issues with, it was the game that was the problem. Or, even more laughably, we were told "that game is crap anyway", or "nobody plays that so it doesn't count". Yet we were trolled over and over with shill posts about Cyberpunk with RT and DLSS being all that mattered.
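
If anyone wants to check whether drops like that really line up with running out of VRAM, one rough option is to poll the card's memory counters while the game runs. Below is a minimal sketch in Python, assuming the nvidia-ml-py (pynvml) bindings are installed; note that NVML reports allocated VRAM, which isn't always exactly what the game strictly needs.

import time
import pynvml  # pip install nvidia-ml-py

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    # Print used/total VRAM once a second; watch for 'used' pinning near 'total'
    # at the same moments the FPS tanks.
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        used_gb = mem.used / 1024**3
        total_gb = mem.total / 1024**3
        print(f"VRAM used: {used_gb:.1f} GiB of {total_gb:.1f} GiB")
        time.sleep(1.0)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()

An on-screen overlay such as MSI Afterburner/RTSS shows much the same figure in-game, for anyone who would rather not run a script.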
Well, lazy devs are definitely an issue, but of course we all know we're not going to get the industry to change. For a long time now the fact is that optimising a game comes at the end of development, as it's considered low priority, and by the time they get to that point they seem to just make sure it at least runs OK on the top-end hardware available at the time.

As developer friends often tell me, "upgrading the hardware is the fix".
For those of us who play RPGs and like textures that don't look like they're from the PS2 era, we knew the VRAM was an issue. I remember thinking it over when I got my 3080, but I went ahead, as the price point at the time for anything better was ridiculous, and I would only have had one extra gig by keeping my 1080 Ti. I don't regret it, but I have always felt 10 gigs of VRAM with the power of a 3080 was a massive mismatch.

Even in the 4000 series, the 4070 Ti really should be a 16 gig card. Nvidia are still being tight, just not as extreme.

Meanwhile Intel managed to supply 8 gigs on a £250 card, so VRAM is definitely not as expensive as Nvidia give the impression it is. I think what's hindering Nvidia is the dedicated hardware for RT; that's clearly adding to the cost of the cards, and they are mitigating that by slicing away VRAM. That's why I want them to release non-RT variants of their cards with more VRAM at the same price point. But they are all in now on their current idea.
 
800USD???? 8GB, so we're referring to the 3070, I presume? Wasn't the MSRP $499? If someone chose to pay a scalper price of $800+, that's their fault, not Nvidia's, the etailers' or the AIBs'..... And those who did pay scalper prices for RDNA 2 and Ampere are the reason we're in the mess we're in now :o
No idea which SKU he was referencing but he said it.

I remember the days when Nvidia used to send a rep to YouTubers; the guy was on with the guy from PCPer and said Nvidia's biggest problem was getting customers on their previous gen to upgrade. I wonder if they think they've cracked it via planned obsolescence on VRAM.
 
Well, lazy devs are definitely an issue, but of course we all know we're not going to get the industry to change. For a long time now the fact is that optimising a game comes at the end of development, as it's considered low priority, and by the time they get to that point they seem to just make sure it at least runs OK on the top-end hardware available at the time.

As developer friends often tell me, "upgrading the hardware is the fix".
For those of us who play RPGs and like textures that don't look like they're from the PS2 era, we knew the VRAM was an issue. I remember thinking it over when I got my 3080, but I went ahead, as the price point at the time for anything better was ridiculous, and I would only have had one extra gig by keeping my 1080 Ti. I don't regret it, but I have always felt 10 gigs of VRAM with the power of a 3080 was a massive mismatch.

Even in the 4000 series, the 4070 Ti really should be a 16 gig card. Nvidia are still being tight, just not as extreme.

Meanwhile Intel managed to supply 8 gigs on a £250 card, so VRAM is definitely not as expensive as Nvidia give the impression it is. I think what's hindering Nvidia is the dedicated hardware for RT; that's clearly adding to the cost of the cards, and they are mitigating that by slicing away VRAM. That's why I want them to release non-RT variants of their cards with more VRAM at the same price point. But they are all in now on their current idea.

Typical developers....... :D

[image]

:cry:

But they are all in now on their current idea.

Which, looking at this from Nvidia's POV, is the right move. It's obvious which way the game development world (well, not even just the gaming industry, but I digress....) is heading now and long term, and this is going to pay off for Nvidia. AMD are a gen behind in both hardware RT and software/drivers for RT and general AI advancement; if AMD aren't careful, they could find themselves back in a similar situation to when they had no chance with their 380/480/5700 XT GPUs.....

I'm actually most excited for Intel GPUs in the future, tbh, as they are making big strides in the AI and RT departments with their first dGPU.

No idea which SKU he was referencing but he said it.

I remember the days when Nvidia used to send a rep to YouTubers; the guy was on with the guy from PCPer and said Nvidia's biggest problem was getting customers on their previous gen to upgrade. I wonder if they think they've cracked it via planned obsolescence on VRAM.

Well, whoever said it, it's still a silly statement tbh; no one forced customers to overpay for any hardware.

I never get this "planned obsolescence" argument either; technically every single product has "planned obsolescence". Maybe the equivalent of AMD's planned obsolescence is their lack of RT hardware/grunt?
 
AMD have shown you don't need dedicated RT hardware to use RT. If it were required I would agree, but as it stands the dedicated hardware is just a modest framerate boost in RT games; it's not a requirement to play them.

Which card aged better: the 16 gig AMD 6800 XT without dedicated RT cores, or the 10 gig 3080 with dedicated RT cores?
 
Well, lazy devs are definitely an issue, but of course we all know we're not going to get the industry to change. For a long time now the fact is that optimising a game comes at the end of development, as it's considered low priority, and by the time they get to that point they seem to just make sure it at least runs OK on the top-end hardware available at the time.

As developer friends often tell me, "upgrading the hardware is the fix".
For those of us who play RPGs and like textures that don't look like they're from the PS2 era, we knew the VRAM was an issue. I remember thinking it over when I got my 3080, but I went ahead, as the price point at the time for anything better was ridiculous, and I would only have had one extra gig by keeping my 1080 Ti. I don't regret it, but I have always felt 10 gigs of VRAM with the power of a 3080 was a massive mismatch.

Even in the 4000 series, the 4070 Ti really should be a 16 gig card. Nvidia are still being tight, just not as extreme.

Meanwhile Intel managed to supply 8 gigs on a £250 card, so VRAM is definitely not as expensive as Nvidia give the impression it is. I think what's hindering Nvidia is the dedicated hardware for RT; that's clearly adding to the cost of the cards, and they are mitigating that by slicing away VRAM. That's why I want them to release non-RT variants of their cards with more VRAM at the same price point. But they are all in now on their current idea.

Either 10GB is enough at 4K or it isn't, and as you allude to, blaming lazy devs is a cop-out and always was. I was always of the opinion that while 10GB was just enough at 4K a few years ago, it would become a problem for those on a three or four year upgrade cycle.

I also strongly disagreed with the nonsense that the 3080 would run out of GPU grunt long before VRAM became an issue at 4K.
 
I never get this "planned obsolescence" argument either; technically every single product has "planned obsolescence". Maybe the equivalent of AMD's planned obsolescence is their lack of RT hardware/grunt?

Until AMD release a card with dedicated RT, that's a clear no. I think they won't release a card with dedicated RT, as it's extra cost for something with only moderate benefit. For the same reason, they proved you don't need expensive G-Sync modules to use VRR. Nvidia always seem to go the whatever-costs-the-most route.
 

I don't think it's planned obsolescence with the VRAM; I think it's just Nvidia giving the minimum to maximise profits. They are very adept at selling locked "software features" as reasons to upgrade, and that is where the planned obsolescence comes in.
 
AMD have shown you don't need dedicated RT hardware to use RT. If it were required I would agree, but as it stands the dedicated hardware is just a modest framerate boost in RT games; it's not a requirement to play them.

Which card aged better: the 16 gig AMD 6800 XT without dedicated RT cores, or the 10 gig 3080 with dedicated RT cores?

Except for the fact that their latest and greatest is only just matching two-year-old GPUs in RT perf, and at times not even matching the two-year-old flagship. So if you care about RT, you're going to have to upgrade an AMD GPU sooner rather than later due to the lack of RT performance compared to the competition, and RT is only gaining more momentum as time goes on; so far nearly every major title released or due this year is using some form of RT.

Well based on the games that I have played over the past 2 years, the 3080 has aged a million times better for my "needs":

- having DLSS from the start; AMD not having FSR 2+ for so long meant even light RT was a complete no-go on RDNA 2, and even now FSR is still very hit and miss
- having the RT grunt there has meant I haven't had to make the same sacrifices to graphical settings as the 6800 XT has

Daniel Owen did a good video on this recently, although sadly it was the 12GB 3080 model; however, the games he showcased wouldn't have benefited much from the extra 2GB anyway (as shown by other tech reviewers):


Of course, if you're someone who mods games with high-res texture packs and never uses RT, then a 6800 XT will have aged better; each to their own and all that.

Either 10GB is enough at 4K or it isn't, and as you allude to, blaming lazy devs is a cop-out and always was. I was always of the opinion that while 10GB was just enough at 4K a few years ago, it would become a problem for those on a three or four year upgrade cycle.

I also strongly disagreed with the nonsense that the 3080 would run out of GPU grunt long before VRAM became an issue at 4K.

I have been using DLSS where possible, as the grunt isn't there for gaming at 3440x1440 175Hz. Funnily enough, it's been less of an issue on my 4K display as it's only 60Hz, so I lock the FPS to 60 there; had I had a 4K 144Hz screen, I would definitely be buying a 4090 for the grunt it has at that res in the newer games. I've had to use DLSS performance mode a bit more at 4K due to lack of grunt, and even with a 3090 I would also have had to use performance mode, since it didn't have the grunt for certain games at 4K native, nor DLSS quality either, e.g. Dead Space:

[image: Dead Space benchmark]

Until AMD release a card with dedicated RT, that's a clear no. I think they won't release a card with dedicated RT, as it's extra cost for something with only moderate benefit. For the same reason, they proved you don't need expensive G-Sync modules to use VRR. Nvidia always seem to go the whatever-costs-the-most route.

Intel have had no issues here with decent pricing for a dGPU with dedicated RT hardware......

And we have been through this a million times with the G-Sync module; there were many reasons Nvidia had to go with a module, and it proved, and arguably still does offer, some big advantages over FreeSync. Don't take my word for it though:

 
Nexus, do you think Nvidia can do no wrong? :)

I didn't say G-Sync had no benefit; I said AMD proved it wasn't required, because before AMD released VRR, Nvidia only allowed VRR with a G-Sync module. They only widened their support when forced to by their main competitor.

Will we see Nvidia release a 16 gig card for under £800?
 
Either 10GB is enough at 4K or it isn't, and as you allude to, blaming lazy devs is a cop-out and always was. I was always of the opinion that while 10GB was just enough at 4K a few years ago, it would become a problem for those on a three or four year upgrade cycle.

I also strongly disagreed with the nonsense that the 3080 would run out of GPU grunt long before VRAM became an issue at 4K.

Same. It's irritating because some don't keep an open mind and it conveniently becomes the game's fault (it can't be the hardware). Now, two years on, with the games that were being made to take advantage of the gen, we are seeing the limitations that were being pointed out in the earlier days. Sure, after a year had passed Nvidia brought out the 12GB version, which was the sweet spot it should have released with.
 
Nexus, do you think Nvidia can do no wrong? :)

I didn't say G-Sync had no benefit; I said AMD proved it wasn't required, because before AMD released VRR, Nvidia only allowed VRR with a G-Sync module. They only widened their support when forced to by their main competitor.

Will we see Nvidia release a 16 gig card for under £800?

They do plenty wrong, but at the same time they aren't as "evil" as some like to make out :)

I'll be honest, I don't like the persona AMD have about them these days, where it always appears they are making out they're the good guys for the gaming community with their "closed source is bad!!! open source is for the good of everyone" spiel, or rather their loyalists who know no better pushing it.... Yes, open source is generally the preferred approach, but at the same time there are plenty of reasons why closed source is also good and can have many benefits over an open source approach.

I also don't like how people think Nvidia getting involved and providing support to game developers to implement their tech is necessarily a bad thing, if it benefits a game's performance or/and visuals and thus the customer's/gamer's experience. That is better than AMD's over-the-fence approach of "it's in the wild, figure it out yourself and do as you please", which gets you the mess we have seen a lot of the time with FSR implementations or/and the lack of uptake of their tech (I deal with similar situations and third-party vendor solutions like this on a daily basis, so I know the pain first hand when dealing with companies that take this approach). And the best part of it all is that when push came to shove and they had the chance to contribute to an open source solution which would have benefited everyone, they declined to get involved....

That, and how 99% of the time they follow what Nvidia do rather than being at the forefront, yet apparently people who pay the "premium" to get a better experience in a certain area or/and pay to enjoy something a year or two before AMD get to it are in the wrong and supporting an "evil/bad company"....

Just my 2 cents. But as you know, I work in the development industry, so I look at this from more than just a hobby/passion/end-user POV.

I could be wrong on G-Sync now, but IIRC at the time none of Nvidia's dGPUs had the required VESA input/hardware spec, so they couldn't use the Adaptive-Sync feature/hardware, which was the primary reason they made the G-Sync module. However, going by the first year of FreeSync monitors (lots of issues with flicker, black screens and low/poor VRR ranges), it turned out to be a good thing for Nvidia in the end.
 
Daniel Owen did a good video on this recently, although sadly it was the 12GB 3080 model; however, the games he showcased wouldn't have benefited much from the extra 2GB anyway (as shown by other tech reviewers):

Of course, if you're someone who mods games with high-res texture packs and never uses RT, then a 6800 XT will have aged better; each to their own and all that.
Looking at the timestamps, 3 of the 8 games he tested were released in 2022. The rest are games that released around the time these cards launched or, in the case of RDR 2, before they launched. Unless you expect old games to be updated to use more VRAM or something, I would expect them to work with the "flagship" card of the time.
 

Watch the video :)
 
Looking at the timestamps, 3 of the 8 games he tested were released in 2022. The rest are games that released around the time these cards launched or, in the case of RDR 2, before they launched. Unless you expect old games to be updated to use more VRAM or something, I would expect them to work with the "flagship" card of the time.

Few understand this.
 