
10GB vram enough for the 3080? Discuss..

What if I told you.. some couldn't read the instructions and complained it was the game's fault! Would you believe me?

Ok sure, yes I would.

Ah yes, that was an acknowledged bug by Ubisoft too (triggered by entering any kind of menu screen), but of course people here who know better than the developers put it down to the VRAM :cry:

IIRC I had the same issue with the benchmark but not in game. Quite a few reviewers noted issues with the benchmark, hence why they used their own test scenarios, at least on release; not sure if that has been fixed now or not.



Is this using mods? If so, I can agree there.

If with no mods, I don't recall any issues; can you link some footage to show this? Although you'll be having to make sacrifices somewhere, i.e. settings and/or a more aggressive DLSS preset, given not even a 3090 is capable of achieving good fps:




Really.... The PS5 does not have 16GB of VRAM; it is shared memory, i.e. it acts as system RAM too, so you are not getting that 16GB fully dedicated as VRAM.

You're still ignoring the fact that the PS5/Xbox Series X also reduce several settings and/or disable RT entirely, as well as running at resolutions below 1440p most of the time, in order to hold 60 fps. Come on hum, you're more knowledgeable than that......
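Just to put rough numbers on the shared-pool point (purely illustrative; the OS reservation and CPU-side split below are assumptions, not official Sony figures):

```python
# Rough illustration of the shared-memory point: a unified 16GB pool is not
# the same as 16GB of dedicated VRAM. The OS reservation and CPU-side share
# below are placeholder assumptions, not official figures.
TOTAL_POOL_GB = 16.0    # PS5 unified GDDR6 pool
OS_RESERVED_GB = 2.5    # assumption: OS / background tasks
CPU_SIDE_GB = 4.0       # assumption: game code, logic, audio, streaming buffers

available_to_game = TOTAL_POOL_GB - OS_RESERVED_GB
gpu_asset_budget = available_to_game - CPU_SIDE_GB

print(f"Available to the game: {available_to_game:.1f} GB")
print(f"Left for GPU assets:   {gpu_asset_budget:.1f} GB")
```

Under those placeholder numbers, the part of the pool actually acting as "VRAM" lands much closer to a 10GB card than to 16GB, which is the point being made above.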

You're missing the point; I'm talking about the cost argument.
 
No.
The 3070 has the same type of memory as the RTX 3060 and all of AMD's GPUs. Sticking with the "it would have pushed up the cost" line seems counterintuitive to me at this point, given AMD do it, and even Nvidia do it on cheaper cards.
-------------------

My RTX 2070S, Maximum Settings at 1440P

With the HD texture pack, one cannot say the performance here isn't good enough, but it is so starved for memory it actually locks up.

The PS5 has more memory than my GPU, and than the RTX 3070/Ti, the RX 6700 XT, the RTX 3080, the RTX 3060, the RTX 3080 12GB and the RTX 3080 Ti. Perhaps this is why FC6 has a separate installable HD texture pack? PC Master Race????

PS: the PS5, with its 16GB of GDDR6 VRAM, costs..... $500. The whole thing costs the same as the RTX 3070 despite having twice as much memory. The same memory.

I do agree that the 3070 could have used a bit more, but 16GB would have been overkill for a GPU of that level. If I was product manager at Nvidia I'd have gone with 10GB on a 320-bit bus, which would have also bumped up the performance a little and brought it closer to the 3080.
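For anyone wondering why a wider bus would have "bumped up the performance", the arithmetic is simple: peak bandwidth is the bus width in bytes times the per-pin data rate. A quick sketch (the 320-bit 3070 is only the hypothetical suggested above, not a real SKU):

```python
def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth = (bus width in bytes) * per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

print(peak_bandwidth_gb_s(256, 14.0))  # RTX 3070: 256-bit GDDR6 @ 14 Gbps       -> 448 GB/s
print(peak_bandwidth_gb_s(320, 14.0))  # hypothetical 10GB / 320-bit 3070        -> 560 GB/s
print(peak_bandwidth_gb_s(320, 19.0))  # RTX 3080 10GB: 320-bit GDDR6X @ 19 Gbps -> 760 GB/s
```

A 320-bit bus is ten 32-bit channels, which is exactly why it naturally pairs with ten memory chips and hence 10GB (or 20GB with 2GB chips).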
 
I do agree that the 3070 could have used a bit more, but 16GB would have been overkill for a GPU of that level. If I was product manager at Nvidia I'd have gone with 10GB on a 320-bit bus, which would have also bumped up the performance a little and brought it closer to the 3080.

It's not even overkill for a GPU at 2070S level; 8GB was probably about right for the GTX 1070.

You don't really believe that 2GB GDDR6X ICs cost $100 more than 1GB ones, let alone GDDR6.
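If anyone wants to sanity-check that "$100" figure, the back-of-envelope maths is trivial; the per-GB prices below are purely assumed scenarios, since actual GDDR6/GDDR6X contract pricing isn't public:

```python
# Back-of-envelope BOM delta for doubling memory on a 256-bit card
# (8 chips, 1GB -> 2GB each). The $/GB figures are placeholder scenarios,
# not real contract pricing.
def extra_memory_cost(chips: int, extra_gb_per_chip: float, price_per_gb: float) -> float:
    return chips * extra_gb_per_chip * price_per_gb

for price_per_gb in (4.0, 8.0, 12.5):
    delta = extra_memory_cost(chips=8, extra_gb_per_chip=1.0, price_per_gb=price_per_gb)
    print(f"assumed ${price_per_gb}/GB extra -> roughly +${delta:.0f} on the BOM")
```

In other words, the extra 8GB only adds up to around $100 if the incremental memory costs roughly $12.5/GB; plug in whatever number you believe.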
 
Ok sure, yes I would.



You're missing the point; I'm talking about the cost argument.

No, I'm not missing the point. If you want a lesser experience then go for a console, but don't act like the console is providing you a better experience just because it has more VRAM when it's not:


Oh, and just for fun's sake, let's remind ourselves of the consoles also being affected by the supposed VRAM issue of textures not rendering :cry:

[screenshot: missing textures on console]







Also, whilst we're on the topic of FC6, let's look at our very own AMD rep's footage to show the ray tracing issues with AMD on launch ;)

I also just remembered you benched the 3090, 6800 and 6900 XT; had a quick look and it looks like you were suffering the same issue too:

[benchmark screenshots: earlier 3090 / 6800 / 6900 XT runs]


However, your latest run with the 6900 XT:

[benchmark screenshot: latest 6900 XT run]


So it seems it was a game(s) and/or AMD thing.

Sorry, forgot: "video encoding issues" (even though they come from the same user) :cry:
 
@Joxeon I'm not talking to you as a product manager at Nvidia; we all know Nvidia's job is to squeeze as much money out of us for as little effort as possible, and as often as possible, same for AMD.
I'm talking to you as someone at the receiving end of that, as a consumer.
 
Is this using mods? If so, I can agree there.

If with no mods, I don't recall any issues; can you link some footage to show this? Although you'll be having to make sacrifices somewhere, i.e. settings and/or a more aggressive DLSS preset, given not even a 3090 is capable of achieving good fps:

That's a pretty old video and it's using DLSS Quality. One of the patches increased RT, IIRC, and I'm pretty sure max settings now with DLSS Quality at 4K will overrun VRAM in some areas. You can see it happen in the bar area of the built-in benchmark.
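For context on why DLSS Quality changes the VRAM picture at all: it renders internally at a lower resolution before upscaling, so the biggest render targets shrink. A quick sketch of the arithmetic (2/3 per axis is the standard Quality scale, 1/2 is Performance):

```python
def dlss_internal_res(out_w: int, out_h: int, scale: float = 2 / 3) -> tuple[int, int]:
    """Internal render resolution for a given DLSS per-axis scale factor."""
    return round(out_w * scale), round(out_h * scale)

print(dlss_internal_res(3840, 2160))       # Quality at 4K     -> (2560, 1440)
print(dlss_internal_res(3840, 2160, 0.5))  # Performance at 4K -> (1920, 1080)
```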
 
That's a pretty old video and it's using DLSS Quality. One of the patches increased RT, IIRC, and I'm pretty sure max settings now with DLSS Quality at 4K will overrun VRAM in some areas. You can see it happen in the bar area of the built-in benchmark.

Hard to find a video with both side by side, without DLSS and in benchmark mode.....

Seems a 3090 Ti doesn't do too well at staying stutter-free in those scenarios though, even with DLSS:



Having a hard time finding 3080 footage of the benchmark, got any links? I can fire it up and try it myself later but will need to un-mod the game first.
 
Having a hard time finding 3080 footage of the benchmark, got any links? I can fire it up and try it myself later but will need to un-mod the game first.
Just gave it a try myself and tbf I didn't get any 'obvious' VRAM stutter visible with DLSS Quality, though the frame rates were pretty bad (saw it briefly dip under 20) and it is clearly lacking GPU power too. With DLSS off the VRAM stutter is evident and results in an average of 5fps, though frame rates would suck even with double the VRAM. Would be interested to see how the 12GB 3080 fares with DLSS off and with the DLSS Quality preset :p
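If anyone wants to separate "out of VRAM" from "out of GPU grunt" while reproducing this, the simplest check is to log dedicated memory use during the run. A minimal sketch (assumes a single Nvidia GPU and that nvidia-smi is on the PATH):

```python
import subprocess
import time

# Poll dedicated VRAM usage once a second while the benchmark runs.
while True:
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    used_mib, total_mib = (int(v) for v in out.split(", "))
    print(f"{used_mib} / {total_mib} MiB used")
    time.sleep(1)
```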
 
Just gave it a try myself and tbf I didn't get any 'obvious' VRAM stutter visible with DLSS Quality, though the frame rates were pretty bad (saw it briefly dip under 20) and it is clearly lacking GPU power too. With DLSS off the VRAM stutter is evident and results in an average of 5fps, though frame rates would suck even with double the VRAM. Would be interested to see how the 12GB 3080 fares with DLSS off and with the DLSS Quality preset :p

Kind of figured, given the 3090 Ti is ******** itself even with DLSS :p

Had a quick look for 4K footage of the 3080 12GB but couldn't see anything meeting the requirements except @ 1440p. Could try it and see if there is much difference, although the 3080 12GB model has better hardware beyond just 2GB more VRAM, so I would be expecting slightly higher fps; but if VRAM is indeed a factor then, in theory, the frame latency of the 12GB model should be flat/smoother compared to the 10GB model:


This is where I am looking forward to the 4070/4080, as they will hopefully have the grunt to play CP2077 again with max settings and DLSS Quality @ a locked 60+ fps!
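On the "flat/smoother frame latency" comparison between the 10GB and 12GB cards above, the usual way to quantify it is to log frame times and compare the 1% lows rather than averages. A minimal sketch; the CSV column and file name are assumptions for illustration, so check the header of whatever capture tool you actually use:

```python
import csv
import statistics

def one_percent_low_fps(frametimes_ms: list[float]) -> float:
    """FPS equivalent of the worst 1% of frame times (higher = smoother)."""
    worst = sorted(frametimes_ms, reverse=True)
    return 1000.0 / statistics.mean(worst[: max(1, len(worst) // 100)])

def load_frametimes(path: str, column: str = "MsBetweenPresents") -> list[float]:
    # Column name is an assumption based on typical PresentMon-style logs.
    with open(path, newline="") as f:
        return [float(row[column]) for row in csv.DictReader(f)]

ft = load_frametimes("3080_10gb_benchmark.csv")  # hypothetical capture file
print(f"average fps: {1000.0 / statistics.mean(ft):.1f}")
print(f"1% low fps : {one_percent_low_fps(ft):.1f}")
```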
 
@Joxeon I'm not talking to you as a product manager at Nvidia; we all know Nvidia's job is to squeeze as much money out of us for as little effort as possible, and as often as possible, same for AMD.
I'm talking to you as someone at the receiving end of that, as a consumer.
It is, but that's why I was pleased with the 3080: we got basically top-tier performance for £650 with the only catch being 10GB of VRAM, which is fine for 99% of use cases, at least for those who were lucky enough to get one early on.

When Turing came out, the 2080 was around 30% behind the 2080 Ti, had 8GB and cost more than the 3080 did, so I kept the cash in my wallet.
 
It is, but that's why I was pleased with the 3080: we got basically top-tier performance for £650 with the only catch being 10GB of VRAM, which is fine for 99% of use cases, at least for those who were lucky enough to get one early on.

When Turing came out, the 2080 was around 30% behind the 2080 Ti, had 8GB and cost more than the 3080 did, so I kept the cash in my wallet.
Few understand.
 
Few understand.

[laughing gif]
 
It is, but that's why I was pleased with the 3080: we got basically top-tier performance for £650 with the only catch being 10GB of VRAM, which is fine for 99% of use cases, at least for those who were lucky enough to get one early on.

When Turing came out, the 2080 was around 30% behind the 2080 Ti, had 8GB and cost more than the 3080 did, so I kept the cash in my wallet.

Earlier you said more VRAM would make the 3070 $100 more expensive.

Where does that come from, and why would you use that as an argument against it having more VRAM?
 
Because it seems to me there is some Stockholm syndrome going on here.

Please don't be critical of Nvidia, they are not really so bad, but if you upset them they might be worse, so it's best just to leave it be.

You don't actually believe the 2GB ICs add $100 to the cost, do you?
 
Because it seems to me there is some Stockholm syndrome going on here.

Please don't be critical of Nvidia, they are not really so bad, but if you upset them they might be worse, so it's best just to leave it be.

You don't actually believe the 2GB ICs add $100 to the cost, do you?
Don't forget that back in 2020, when these cards released, there was a global shortage of GDDR6, so using double the memory would not only have cost more but would likely have impacted how many cards could be produced, which would also have affected prices.

Even with just 8GB being used, supply couldn't meet demand, so imagine a further reduction in supply.
 