10GB vram enough for the 3080? Discuss..

Status
Not open for further replies.
A grand total of one owner having issues proves nothing, and it's just as likely to be a problem elsewhere in his setup. Besides, if it was that bad he would have upgraded to one of the 16GB AMD cards now that the price has crashed and they can't even give them away.
In his defence, it's a grand total of at least two owners having issues, as I found the exact same issues he was having in Far Cry 6.

To be clear, 10GB is enough for me. I've found one situation where I find it to be insufficient, but I've had to go out of my way to do so. It won't be enough one day, but right now it's fine.
 
Any link showing that a 2GB IC only costs $2 more than a 1GB one?

To be fair this is exactly why I am annoyed with AMD. They should have spent just a bit more and released a card with better RT that can match or beat Nvidia. They are clearly doing this to get us all to upgrade all the time… :p
 
What annoys me about it is it's such a simple thing.

Quite so. It obviously touches a nerve, hence pages of nonsense refusing to accept it. Again, it is a tiny minority, fewer than ten people in the band of merry men; the majority of folk seem to acknowledge it but it doesn't bother them, and everyone is fine with that. :)
 
In his defence, it's a grand total of at least two owners having issues, as I found the exact same issues he was having in Far Cry 6.

To be clear, 10GB is enough for me. I've found one situation where I find it to be insufficient, but I've had to go out of my way to do so. It won't be enough one day, but right now it's fine.

To be clear Bill, there were more than two, but thanks to guys like yourself we knew about it ages ago and had the evidence. Now that the influencers are covering it late in the lifecycle, no longer fearing punishment from their contacts or viewer base, it is quite clear most enthusiasts, from a hardware standpoint, expect more than was being offered on some models of cards. Especially when these cards were commanding a high price.

To simply refuse to accept it is the jovial aspect.
 
Quite so. It obviously touches a nerve, hence pages of nonsense refusing to accept it. Again, it is a tiny minority, fewer than ten people in the band of merry men; the majority of folk seem to acknowledge it but it doesn't bother them, and everyone is fine with that. :)

I'm good with that too.

@Armageus We aren't bickering, it's just a conversation.
 
In his defence, it's a grand total of at least two owners having issues, as I found the exact same issues he was having in Far Cry 6.

To be clear, 10GB is enough for me. I've found one situation where I find it to be insufficient, but I've had to go out of my way to do so. It won't be enough one day, but right now it's fine.

However, according to the usual suspects, there are loads of people having issues, yet they fail to list anything concrete, let alone post anything at all to show all these people and their issues :p

FC 6 is potentially problematic, but the problem is I never experienced any issues once the patch to fix the texture rendering arrived, except when I enabled ReBAR and played at max settings @ 4k with no FSR; then I encountered the fps crashing. Going by other people's posts and footage, a 3090 also has fps dropping after a period of time, hence why I am still not convinced it is a "100% vram" related issue.

I can't recall, but didn't you state you had some stutters but didn't encounter the complete fps plummet to 1-3fps like Tommy did/does? Which again, that kind of behaviour has never manifested itself in any other scenario where there is a genuine vram shortage.....

Any link showing that a 2GB IC only costs $2 more than a 1GB one?
I wasn't even going to respond because you didn't back up your claim.

Funnily enough, we had a similar discussion on that aspect, and someone else noted that humbug wasn't correct on that bit about the 1GB vs 2GB thing:

Oh... almost forgot, the 3090 also has 2GB IC's, but that's a £1,500 GPU, spend that to get plenty :D

The 3090 actually has 1GB ICs, 24 of them; the 3090 Ti has 2GB ICs, 12 of them. This is why I stated there should have been 24/48GB versions of both these cards for the pro users too, as really that's who they were aimed at, and why I called the 3080 a disgrace of a card with 10GB of VRAM. Some keep defending this, but the reality is it should have come with a minimum of 12GB from the start, and 16GB for the 3080 Ti.
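The IC counts above follow directly from the memory bus layout, which is easy to sanity-check. A minimal sketch, assuming the standard GDDR6/GDDR6X arrangement of one 32-bit channel per IC, with clamshell mode placing two ICs per channel (one on each side of the PCB); the configurations are the publicly known ones, not vendor-confirmed:

```python
# Sketch: total VRAM follows from bus width, IC density, and clamshell mode.
# Assumes each GDDR6/6X IC presents a 32-bit interface (the standard layout).

def ic_count_for_bus(bus_width_bits: int, clamshell: bool = False) -> int:
    """Each IC occupies one 32-bit channel; clamshell mode puts two ICs
    per channel, one on each side of the PCB."""
    channels = bus_width_bits // 32
    return channels * (2 if clamshell else 1)

def vram_capacity_gb(ic_count: int, ic_density_gb: int) -> int:
    """Total VRAM is simply IC count times per-IC density."""
    return ic_count * ic_density_gb

# 3090: 384-bit bus, clamshell, 1GB ICs -> 24 ICs, 24GB
assert ic_count_for_bus(384, clamshell=True) == 24
assert vram_capacity_gb(24, 1) == 24

# 3090 Ti: 384-bit bus, single-sided, 2GB ICs -> 12 ICs, 24GB
assert vram_capacity_gb(ic_count_for_bus(384), 2) == 24

# 3080: 320-bit bus, single-sided, 1GB ICs -> 10GB
assert vram_capacity_gb(ic_count_for_bus(320), 1) == 10
```

This is also why the capacity jumps argued about in this thread come in fixed steps: with a 320-bit bus you get 10GB or 20GB, never 16GB, without changing the bus itself.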

To be fair this is exactly why I am annoyed with AMD. They should have spent just a bit more and released a card with better RT that can match or beat Nvidia. They are clearly doing this to get us all to upgrade all the time… :p

Exactly but when it comes to amd it's not "planned obsolescence" :cry: You literally can't make this **** up :D

To be clear Bill, there were more than two, but thanks to guys like yourself we knew about it ages ago and had the evidence. Now that the influencers are covering it late in the lifecycle, no longer fearing punishment from their contacts or viewer base, it is quite clear most enthusiasts, from a hardware standpoint, expect more than was being offered on some models of cards. Especially when these cards were commanding a high price.

To simply refuse to accept it is the jovial aspect.

No one ever disputed their issues; the fingers-in-ears happens when you point out that said issues were also affecting other platforms/hardware..... That's the problem: said people don't want to be open and accept that there might just be something else causing the issues.....

Still waiting on all these other issues and evidence to be listed and again:

PS. Whereabouts did HU specifically mention 10GB having issues? They seemed to only refer to 8GB. I think we all agreed on 8GB having issues at some point when trying to play at 4k max settings with no FSR/DLSS, although @TNA seems to have not encountered any serious issues?

insert johntravolta.gif

But keep on ignoring that.....
 
The RTX 3060 is a 12GB card, that's good; ironically I bet it's smoother than the RTX 3070.

Which should have been a 16GB card; my hope was the 3070 Ti would be a 16GB card, but no...
The 3080 a 20GB card; then there would have been no need for the 3080 12GB, and little value in the RTX 3090, which at $800 more would have looked completely idiotic.
 
The RTX 3060 is a 12GB card, that's good; ironically I bet it's smoother than the RTX 3070.

Which should have been a 16GB card; my hope was the 3070 Ti would be a 16GB card, but no...
The 3080 a 20GB card; then there would have been no need for the 3080 12GB, and little value in the RTX 3090, which at $800 more would have looked completely idiotic.

In what games, and with what settings at what res., would a 3060 be smoother than a 3070? Maybe in the 0.5/1% of games out there it will be smoother, i.e. FC 6 max settings @ 4k (although neither are 4k-max-settings kinds of GPUs, hence settings will be dropped anyway....)

And how do you propose that nvidia could have achieved that at the time of release whilst still hitting the £650 mark?
 
I can't recall, but didn't you state you had some stutters but didn't encounter the complete fps plummet to 1-3fps like Tommy did/does? Which again, that kind of behaviour has never manifested itself in any other scenario where there is a genuine vram shortage.....
It wasn't stuttering; sometimes if you went in and out of the menu, a game that was running at around 70fps would suddenly drop and only perform at around 20fps and stay there. No VRAM warning came up, and this was at 3840 x 1600, not full 4k. I put this down to the engine, not vram, but I'm unsure.

EDIT - It would also do this on occasion without having to go to the menu, but the menu made it happen more often than not.

The benchmark would perform, at 4k with all the bells and whistles and no FSR, at between 2 and 4 fps on average, with warnings saying vram was being exceeded. On occasion, it would start at around 50fps (from memory), then as the camera got to the island it would give a vram warning, plummet to 2 to 4 fps and stay there.
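For anyone wanting to check whether a slowdown like this is genuinely VRAM-driven rather than an engine bug, logging memory use alongside the fps drop is the usual sanity check. A minimal sketch assuming nvidia-smi's standard CSV query output; the 90% threshold is an illustrative cutoff, not a rule:

```python
# Sketch: flag near-VRAM-exhaustion from nvidia-smi's CSV output.
# Capture lines once a second while playing with:
#   nvidia-smi --query-gpu=memory.used,memory.total --format=csv,noheader,nounits -l 1
# The 0.9 threshold below is an arbitrary illustrative cutoff.

def vram_pressure(csv_line: str, threshold: float = 0.9) -> bool:
    """Return True if used/total VRAM exceeds the threshold.
    Expects a line like '9734, 10240' (MiB used, MiB total)."""
    used, total = (int(x.strip()) for x in csv_line.split(","))
    return used / total > threshold

assert vram_pressure("9734, 10240") is True    # ~95% of a 10GB card
assert vram_pressure("6100, 10240") is False   # comfortably under
```

If the fps plummets while the logged usage sits well below the card's total, the shortage explanation gets much weaker.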
 
In what games, and with what settings at what res., would a 3060 be smoother than a 3070? Maybe in the 0.5/1% of games out there it will be smoother, i.e. FC 6 max settings @ 4k (although neither are 4k-max-settings kinds of GPUs, hence settings will be dropped anyway....)

And how do you propose that nvidia could have achieved that at the time of release whilst still hitting the £650 mark?

In the same way they hit the $329 mark on the RTX 3060, despite it having 50% more VRAM than the RTX 3070.

In the same way that AMD use 2GB ICs on all their GPUs.
 
They could have put 16GB of GDDR6 on the 3070 for a reasonable cost, but then it would look odd having only 10GB of GDDR6X on the 3080. And there are 100% VRAM issues in Far Cry 6 at 4K with the HD texture pack on, and also at 4K max settings in Cyberpunk. They don't bother me too much right now, as the games would have poor frame rates regardless, but I look forward to the 10GB 4070 :p
 
In his defence, it's a grand total of at least two owners having issues, as I found the exact same issues he was having in Far Cry 6.
Seems like it's three users in this thread, Bill (four if you count Stooeh above, I guess :p).

There's more on YT and Reddit too, but I think we've discovered that it doesn't matter how many individual users come forward.

If you run out of video memory then it falls into one of these categories.
  1. A local system issue as it works fine on my system (even though I won't upload a long gameplay video proving this) with everything stripped out of the driver and no background apps running.
  2. Using the wrong settings: turn them down/disable the HD Texture pack (which goes against the theory of testing max settings to saturate memory).
  3. Rebar is enabled and this causes FPS drops to single digits.
  4. It's a game issue. The developers have acknowledged it; a fix is coming any day now, even a year on from release. Plus there's a guy on the Ubi forums with a 3090/6900 XT who has stuttering too, so it's all related.
  5. Stuttering is caused by GPU power limits, not memory related.
  6. The game doesn't count because it was developed purely to harm GPUs with less than 12GB video memory (even though the texture pack is an optional download not included with the base game by default)
Did I miss any, lads? :cry:
 
The RTX 3060 is a 12GB card, that's good; ironically I bet it's smoother than the RTX 3070.

Which should have been a 16GB card; my hope was the 3070 Ti would be a 16GB card, but no...
The 3080 a 20GB card; then there would have been no need for the 3080 12GB, and little value in the RTX 3090, which at $800 more would have looked completely idiotic.
Had nvidia given the 3070 16GB, it would have ended up priced around £600 with the 3080 bumped to around £800; nvidia isn't going to give you the VRAM for free.

Nvidia didn't really have many options for the 3080:

Option 1: give the 3080 16GB of GDDR6 on a 256-bit bus, in which case it would have performed worse than its main competitor, the 6800 XT. Using GDDR6X would have meant needing ICs on both sides of the PCB, which would have meant a large additional cost.

Option 2: give the 3080 20GB of VRAM. This would drive up the cost, as again ICs would be needed on both sides of the PCB, while also making the 3090 pointless.

Option 3, and probably the only reasonable one, would be increasing the bus from 320-bit to 384-bit and going with 12GB, but then the 3080 Ti would be pointless and the 3090 would have looked even sillier than it did given the small performance gap, while people would still have complained that 12GB isn't enough.
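The bandwidth side of those three options can be put in rough numbers. A minimal sketch, assuming the launch-spec per-pin data rates (19 Gbps for the 3080's GDDR6X, 16 Gbps for the 6800 XT's GDDR6); peak bandwidth is just bus width in bytes times the per-pin rate:

```python
# Sketch of the bus-width/bandwidth trade-off behind the three options.
# Data rates are the launch-spec figures (19 Gbps GDDR6X, 16 Gbps GDDR6).

def bandwidth_gbps(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s: (bus width / 8 bits-per-byte) * per-pin rate."""
    return bus_width_bits / 8 * data_rate_gbps

# Shipping 3080: 320-bit GDDR6X -> 760 GB/s
assert bandwidth_gbps(320, 19) == 760.0
# Option 1: 256-bit GDDR6 (6800 XT-style) -> 512 GB/s, a big step down
assert bandwidth_gbps(256, 16) == 512.0
# Option 3: 384-bit GDDR6X with 12GB -> 912 GB/s (the later 3080 12GB config)
assert bandwidth_gbps(384, 19) == 912.0
```

Option 1's one-third bandwidth cut is why a 256-bit GDDR6 3080 would likely have trailed the 6800 XT (which leans on a large cache to compensate), while option 3 is essentially the config nvidia later shipped as the 3080 12GB.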
 
Had nvidia given the 3070 16GB, it would have ended up priced around £600 with the 3080 bumped to around £800; nvidia isn't going to give you the VRAM for free.

Nvidia didn't really have many options for the 3080:

Option 1: give the 3080 16GB of GDDR6 on a 256-bit bus, in which case it would have performed worse than its main competitor, the 6800 XT. Using GDDR6X would have meant needing ICs on both sides of the PCB, which would have meant a large additional cost.

Option 2: give the 3080 20GB of VRAM. This would drive up the cost, as again ICs would be needed on both sides of the PCB, while also making the 3090 pointless.

Option 3, and probably the only reasonable one, would be increasing the bus from 320-bit to 384-bit and going with 12GB, but then the 3080 Ti would be pointless and the 3090 would have looked even sillier than it did given the small performance gap, while people would still have complained that 12GB isn't enough.
No.
The 3070 has the same type of memory as the RTX 3060 and all of AMD's GPUs; sticking with the "it would have pushed up the cost" line seems counter-intuitive to me at this point, given AMD do it and even Nvidia do it on cheaper cards.
-------------------

My RTX 2070S, Maximum Settings at 1440P

With the HD texture packs on, one cannot say the performance here isn't good enough, but the card is so starved for memory it actually locks up.

The PS5 has more memory than my GPU, and than the RTX 3070/Ti, the RX 6700 XT, the RTX 3080, the RTX 3060, the RTX 3080 12GB and the RTX 3080 Ti; perhaps this is why FC6 has a separate installable HD texture pack? PC Master Race????

PS: the PS5, with its 16GB of GDDR6 VRAM, costs..... $500. The whole thing costs the same as the RTX 3070 despite having twice as much memory. The same memory.

 
Seems like it's three users in this thread, Bill (four if you count Stooeh above, I guess :p).

There's more on YT and Reddit too, but I think we've discovered that it doesn't matter how many individual users come forward.

Precisely (Tom, Steve, Tim), although maths isn't a strong point, as we have found out, nor even language. Goalposts though, they are good at moving those! :cry:
 
It wasn't stuttering; sometimes if you went in and out of the menu, a game that was running at around 70fps would suddenly drop and only perform at around 20fps and stay there. No VRAM warning came up, and this was at 3840 x 1600, not full 4k. I put this down to the engine, not vram, but I'm unsure.

EDIT - It would also do this on occasion without having to go to the menu, but the menu made it happen more often than not.

The benchmark would perform, at 4k with all the bells and whistles and no FSR, at between 2 and 4 fps on average, with warnings saying vram was being exceeded. On occasion, it would start at around 50fps (from memory), then as the camera got to the island it would give a vram warning, plummet to 2 to 4 fps and stay there.

Ah yes, that was an acknowledged bug by Ubisoft too (entering any kind of menu screen), but of course people here who know better than the developers put it down to the vram :cry:

IIRC, I had the same issue with the benchmark but not in game; quite a few reviewers noted issues with the benchmark, hence why they used their own scenarios, at least on release. Not sure whether that has been fixed now or not.

They could have put 16GB of GDDR6 on the 3070 for a reasonable cost, but then it would look odd having only 10GB of GDDR6X on the 3080. And there are 100% VRAM issues in Far Cry 6 at 4K with the HD texture pack on, and also at 4K max settings in Cyberpunk. They don't bother me too much right now, as the games would have poor frame rates regardless, but I look forward to the 10GB 4070 :p

This is using mods? If so, I can agree there.

If with no mods, I don't recall any issues; can you link some footage to show this? Although you'll have to make sacrifices somewhere, i.e. settings and/or a higher DLSS preset, given not even a 3090 is capable of achieving good fps:


No.
The 3070 has the same type of memory as the RTX 3060 and all of AMD's GPUs; sticking with the "it would have pushed up the cost" line seems counter-intuitive to me at this point, given AMD do it and even Nvidia do it on cheaper cards.
-------------------

My RTX 2070S, Maximum Settings at 1440P

With the HD texture packs on, one cannot say the performance here isn't good enough, but the card is so starved for memory it actually locks up.

The PS5 has more memory than my GPU, and than the RTX 3070/Ti, the RX 6700 XT, the RTX 3080, the RTX 3060, the RTX 3080 12GB and the RTX 3080 Ti; perhaps this is why FC6 has a separate installable HD texture pack? PC Master Race????

PS: the PS5, with its 16GB of GDDR6 VRAM, costs..... $500. The whole thing costs the same as the RTX 3070 despite having twice as much memory. The same memory.


Really.... The PS5 does not have a total of 16GB of VRAM...... It is shared memory, i.e. it acts as system RAM too, so you are not getting that 16GB fully dedicated as vram.

You're still ignoring the fact that the PS5/XBX also reduce several settings and/or disable RT entirely, as well as running at resolutions even below 1440p most of the time, in order to hold 60 fps. Come on hum, you're more knowledgeable than that......
 