10GB vram enough for the 3080? Discuss..

I am a 3080 FE owner and I realised that 10GB could be an issue when I bought it, particularly at 4K, but for £650 in the current crazy market it was a compromise worth making. GeForce NOW is actually quite good, but I do get some packet loss using Sky Broadband, which is annoying (as do others with ISPs that use the Openreach infrastructure); I don't know whether this is down to Nvidia's servers or my ISP. If they can resolve this and bring much more of my game library to the platform, it may be a viable alternative to local gaming in a few years.

This is my stance: even though I have noticed quality problems with the FF7 remake, I don't regret buying the card. £650 for this vs £1500 for a 3090 is a no-brainer.
 
16GB RAM... as I was saying, the patch which fixed the texture issue increased RAM usage considerably; you can see I am pushing 16GB on my 32GB setup. So whatever they did to fix the issue must have offloaded something to system RAM, otherwise the RAM usage would not have increased after that patch.
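
If anyone wants to sanity-check that kind of claim rather than eyeball Task Manager, a minimal sketch along these lines would log the game's actual system RAM usage before and after a patch. It assumes the psutil package is installed, and the executable name below is a hypothetical placeholder you would swap for the real one.

```python
# Minimal sketch: report how much system RAM a game process is using.
# Assumes the psutil package is installed; the process name below is a
# hypothetical placeholder and should be replaced with the real executable.
import psutil

GAME_PROCESS = "ff7remake.exe"  # hypothetical name, adjust to the real executable

def game_ram_usage_gb(process_name: str) -> float:
    """Sum the resident (working set) memory of all matching processes, in GB."""
    total_bytes = 0
    for proc in psutil.process_iter(["name", "memory_info"]):
        name = proc.info["name"]
        if name and name.lower() == process_name.lower():
            total_bytes += proc.info["memory_info"].rss
    return total_bytes / (1024 ** 3)

if __name__ == "__main__":
    used = game_ram_usage_gb(GAME_PROCESS)
    total = psutil.virtual_memory().total / (1024 ** 3)
    print(f"{GAME_PROCESS}: {used:.1f} GB of {total:.0f} GB system RAM in use")
```

Run it at the same spot in the game before and after a patch and the difference is directly comparable, rather than relying on overall system usage, which includes everything else that happens to be open.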

Isn't that pretty much what the Far Cry game world is? My footage shows that kind of gameplay rather well, i.e. a big open jungle/tropical area with running and gunning across a wide area, unlike yours where you're stuck in one spot :p

What happens when you enable FSR? Bearing in mind you would "have" to use it if you want more than 50/55 fps at 4K anyway....

Like I have always said, FPS dropping to 2 and staying there is not your usual "VRAM" issue (again, see my Cyberpunk video for a true showcase of VRAM limitation). You have my footage, and other people's posts in the 70+ page thread saying they aren't having problems. Going by your posts in the RTX thread, IMO you have some other system/driver issue if you're seeing all these kinds of problems.

That 3700X will also be holding back the 3080 by quite a bit.

I think this is the problem FF7 Remake has: it needs over 3GB of VRAM just to get to the title screen, and there is no way 3GB of textures is being displayed there, while the game barely uses any system RAM. Devs seem to be moving towards storing non-graphical data in VRAM (probably due to how console development now works).
 
I think this is the problem FF7 Remake has: it needs over 3GB of VRAM just to get to the title screen, and there is no way 3GB of textures is being displayed there, while the game barely uses any system RAM. Devs seem to be moving towards storing non-graphical data in VRAM (probably due to how console development now works).

This makes great sense.
 
This is my stance: even though I have noticed quality problems with the FF7 remake, I don't regret buying the card. £650 for this vs £1500 for a 3090 is a no-brainer.
It's just a shame that gamers aiming for the top end have to make these decisions, due to the fact that Nvidia and AMD have us over a barrel.
 
How do you quantify that?

It's something I have done for a long time. The busiest drive in your system is almost always your C: unless you're doing a lot of file transfer with other drives, so a D: that's sitting idle will respond faster than a C: that's doing little bits here and there.

Honestly not sure how big of a difference there would be with an SSD now but I have done it since before SSDs were a thing and just kept doing it.
 
Pretty much!

FC3 had beyond awful stuttering (widely acknowledged amongst PC gamers regardless of hardware), which required a ton of faffing with config files to get it running somewhat smoothly (I actually tried it not so long back on my 3080 and the stuttering was still present :o); likewise with FC4. Primal, FC5 and New Dawn performed very well though, and I don't recall many issues. FC6 on the whole has been very good for me tbf (ignoring the texture issue, which has now been fixed, and the crashing after the start credits, which was fixed by a patch).
Hopefully they get a new engine as the game needs a serious change on the whole.

If interested, here is the fix for stuttering in Far Cry 3, Crysis 3 and some other games.

Far Cry 3 has an FPS problem on Windows 10/11. I am confident the fix will work in a few other games with bad FPS too; it is such an off-the-wall fix.
What I actually did on an Nvidia GPU:
Turn on Ultra Low Latency mode.
Set Max Frame Rate to 1000 FPS (this is the only setting that needs changing, and any frame rate cap fixes the stuttering).
Keep V-Sync on in the Nvidia Control Panel so G-Sync stays enabled. Worked great and now the game is playable.

Far Cry 3 No Stuttering Fix
 
If interested, here is the fix for stuttering in Far Cry 3, Crysis 3 and some other games.

Far Cry 3 has an FPS problem on Windows 10/11. I am confident the fix will work in a few other games with bad FPS too; it is such an off-the-wall fix.
What I actually did on an Nvidia GPU:
Turn on Ultra Low Latency mode.
Set Max Frame Rate to 1000 FPS (this is the only setting that needs changing, and any frame rate cap fixes the stuttering).
Keep V-Sync on in the Nvidia Control Panel so G-Sync stays enabled. Worked great and now the game is playable.

Far Cry 3 No Stuttering Fix

Oh! Nice, I'll give that a shot. I remember having to faff with loads of things like triple buffering, pre-rendered frames and a certain FPS cap (IIRC back at release the magic number for me was 57 fps). Absolute madness, the trouble we have to go to at times in order to get games running acceptably.
 
Here is a good video reviewing the Radeon VII 16GB, showing AMD's review guide on how to make cards with 8GB or less struggle, like the 8GB RTX 2080 used in the review. If you watch the video, AMD clearly knows which games will use more VRAM if allowed and which games will struggle on cards with 8GB or less. Anyone remember Far Cry 6 and similar shenanigans? This shows it has been going on for a good while.

The video also shows that even in 2016-2019, 8GB was really not enough for 4K in some games, and we know what will happen to 10GB, and maybe even 12GB, cards soon.

Remember the Radeon VII came out at the start of 2019.


From 8 minutes 40 secs

Quick link to the time stamp.

https://youtu.be/7j2PFKkYrVg?t=520

From AMD's Review Guide for the Radeon VII.

[Screenshots from AMD's Radeon VII review guide]


Watch the video fully to get a good idea of what is going on and why Nvidia plays the low-VRAM game on some cards. It is basically planned obsolescence to make you upgrade sooner if you want high resolutions like 4K with higher in-game settings. You can see even in the screenshots from the review guide that there were games using way more than 8GB when they came out in 2016 (Rise of the Tomb Raider is the example in the screenshot, using 9.8GB at its highest settings mode; hmm, that's basically 3080 10GB levels back in 2016). Watch how Rise of the Tomb Raider in the video makes the RTX 2080 struggle, even though the 2080 was doing better than the Radeon VII in most other games. Also in the test is a 1080 Ti with 11GB, and it was almost twice as fast (playable, no stutters) in the same test thanks to its 11GB of VRAM, while the 8GB RTX 2080 was unplayable and stuttered like crazy. The RTX 2080 and 1080 Ti are basically the same performance cards, just with different VRAM amounts.

Now the question is: was it a game issue, or also an Nvidia driver issue at the time of testing? I wonder if anyone with Rise of the Tomb Raider, an RTX 2080 and a 4K screen can test it in 2022, with all the game updates and the latest drivers, on the same part of the game he shows having problems. That would tell us whether it was the game that needed patching or the Nvidia drivers of the time, or whether it is simply not enough VRAM or VRAM bandwidth, given that the Vega 64 with 8GB of HBM2 had no problems in the same test.
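
For anyone who does retest it, a minimal logging sketch like the one below (assuming nvidia-smi is installed and on the PATH, as it is with the standard Nvidia driver package) would make the result more useful than eyeballing an overlay: run it while replaying the problem section and you get a per-second record of how close the card sits to its VRAM limit. Bear in mind the reported figure is allocated memory, not strictly what the game needs.

```python
# Minimal sketch: log GPU memory usage once per second to a CSV while you
# replay the problem section of the game. Assumes nvidia-smi is on the PATH.
import subprocess
import time

def sample_vram():
    """Return (used_mb, total_mb) for GPU 0 as reported by nvidia-smi."""
    out = subprocess.check_output(
        [
            "nvidia-smi",
            "--query-gpu=memory.used,memory.total",
            "--format=csv,noheader,nounits",
        ],
        text=True,
    )
    used, total = out.strip().splitlines()[0].split(",")
    return int(used), int(total)

if __name__ == "__main__":
    with open("vram_log.csv", "w") as log:
        log.write("time_s,used_mb,total_mb\n")
        start = time.time()
        while True:
            used, total = sample_vram()
            log.write(f"{time.time() - start:.1f},{used},{total}\n")
            log.flush()
            time.sleep(1)
```

If the used figure is pinned at (or just below) the total while the stutter happens, low VRAM is the likely culprit; if there is plenty of headroom, a game or driver issue looks more plausible.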
 
@Purgatory Maybe some of my posts (here, here and here to name just a few) dating back years will start to make a little more sense now to some with regards to planned obsolescence, VRAM requirements, and how the latter is rarely tested properly by the tech press/end users - which I've been saying for a long time.

This is pretty standard in reviewers' guides, Purgatory. The aim is to show your product in the best light, in scenarios that favour it vs the competition. The results are never false; they just paint the product in the best possible light in a scenario that favours it. Think of it like this: the Radeon VII has double the VRAM of the 2080, so if the FPS are playable, crank the image quality up to use as much VRAM as possible, which will highlight the limitations of 8GB vs the Radeon VII, as seen in the results and frame times of the two GPUs.

You can be sure the opposite is true on the other side too. If the game supports it, crank up RT: FPS might be unplayable for everyone, but "we'll be faster by X amount". If DLSS is available, use that and be faster by XX amount with playable FPS. This will not be as straightforward now that FSR is available and well supported/received by game devs and gamers, so the focus will probably shift to 400% zoomed-in stills to highlight subtle image quality differences, as well as FPS advantages in heavy RT scenarios with image reconstruction off.

Some tech press will take a guide and follow it to the letter, think Digital Foundry. Some will ignore it completely, think GN. Most fall somewhere in the middle.
 
The video also shows that even in 2016-2019, 8GB was really not enough for 4K in some games, and we know what will happen to 10GB, and maybe even 12GB, cards soon.
...and why Nvidia plays the low-VRAM game on some cards. It is basically planned obsolescence to make you upgrade sooner if you want high resolutions like 4K with higher in-game settings. You can see even in the screenshots from the review guide that there were games using way more than 8GB when they came out in 2016 (Rise of the Tomb Raider is the example in the screenshot, using 9.8GB at its highest settings mode; hmm, that's basically 3080 10GB levels back in 2016

Just waiting to see what reaction you get. I know that when I post this sort of inference or open-ended question, the reaction is completely different. :cry:
 
Of course, it is perfectly valid to say it is planned obsolescence. Heck, we could say the same for any product on the market (look at the article about Apple and their products; didn't they even admit it?). After all, you're talking about companies worth billions: they are all in it to make as much money as possible and none of them are your friend, despite what some think :o We could say exactly the same about AMD and their lack of RT performance and of a DLSS competitor (FSR is not even in the same league, and AMD themselves have said it is not to be compared to DLSS due to the differences in how they work). How many RDNA 2 users will be upgrading for those two things alone, assuming RDNA 3 is better? The same way Turing owners upgraded to Ampere just for the better RT perf.

The point I always refer back to is: why do people upgrade to newer GPUs? Is it purely for VRAM (just look back at when the 4GB Fury X came out and how many people switched to it from 8GB cards ;) :p), or is it for rasterisation and/or ray tracing perf? I think you'll find very few people upgrading just to get more VRAM, since in the vast majority of cases the card runs out of grunt first.

Personally, my next upgrade will largely come down to the gain in RT and overall performance; as several of us 3080 owners have noted, we are already having to drop settings in order to hit the FPS figure we like.

Then you can always throw in the fact that "ultra/epic/very high" settings generally bring little to nothing for a huge perf hit, so essentially, if you adjust settings properly in the first place and know what the limits of the card are, you can quite easily make a product last much longer.


As matt once said, no one is forcing anyone to use certain settings.

The main problem with Nvidia is that once their new line of GPUs comes out, they stop focusing on, or rather "prioritising", previous gens, so drivers aren't optimised as well as they potentially could be, whereas AMD have better support here, which is generally why their cards end up doing better in the long run. Of course, the other side of the coin is that AMD take a good 6-12/24 months to get the best from their drivers....
 
Who buys a flagship product to compromise with it? If I expected a flagship card to have to turn down details in games not much more than a year after launch, I know for sure I wouldn't be buying it.

Yes, it's easy, but it defeats the point of a flagship product; if you were expecting to lower settings anyway, then buy a lower-priced card.
 
Who buys a flagship product to compromise with it? If I expected a flagship card to have to turn down details in games not much more than a year after launch, I know for sure I wouldn't be buying it.

Yes, it's easy, but it defeats the point of a flagship product; if you were expecting to lower settings anyway, then buy a lower-priced card.

I don't entirely disagree, but even the "flagship" cards are having to reduce settings depending on the game/requirements. You buy what is best at the time and what suits your needs. As it stands right now, the only cards universally regarded as better overall are the 6900 XT, 3080 Ti, 3080 12GB (since it has better specs all round, essentially a "Super") and the 3090, but then you have to factor in whether they are worth double the price, and I think it is safe to say everyone agrees they aren't if it is just for gaming.

That's why it makes no sense buying a 6900 XT/3090 if you're just a gamer, as come next gen you're going to have £400-600 4070/7800 cards beating the previous flagship-gen cards, which originally cost £1000-1400 (unless you paid more :o).
 
Some people just want the best, though. Those people buy a 6900 XT or a 3090, so it makes sense to them. That said, I won't deny that the increase in performance doesn't justify the extra outlay in terms of bang per buck.
 
I don't entirely disagree, but even the "flagship" cards are having to reduce settings depending on the game/requirements. You buy what is best at the time and what suits your needs. As it stands right now, the only cards universally regarded as better overall are the 6900 XT, 3080 Ti, 3080 12GB (since it has better specs all round, essentially a "Super") and the 3090, but then you have to factor in whether they are worth double the price, and I think it is safe to say everyone agrees they aren't if it is just for gaming.

That's why it makes no sense buying a 6900 XT/3090 if you're just a gamer, as come next gen you're going to have £400-600 4070/7800 cards beating the previous flagship-gen cards, which originally cost £1000-1400 (unless you paid more :o).

My point isn't really about cost, it's about Nvidia themselves making sure we knew that as of October 2020 this was their flagship offering and it was the best they could do (the 3090 was not a gaming card if you recall).

Sure, a while down the line, as new product lines come out and support starts to end, I can understand lowering settings; nothing lasts forever. It was clear to me though that the 3080 was not designed to last that long.
 
My point isn't really about cost, it's about Nvidia themselves making sure we knew that as of October 2020 this was their flagship offering and it was the best they could do (the 3090 was not a gaming card if you recall).

Sure, a while down the line, as new product lines come out and support starts to end, I can understand lowering settings; nothing lasts forever. It was clear to me though that the 3080 was not designed to last that long.

It was absolutely the flagship, announced by Huang himself.

Who buys a flagship product to compromise with it? If I expected a flagship card to have to turn down details in games not much more than a year after launch, I know for sure I wouldn't be buying it.

Yes, it's easy, but it defeats the point of a flagship product; if you were expecting to lower settings anyway, then buy a lower-priced card.

This is why other people, elsewhere and in reviewer circles, have pointed out the introduction of more cards which just so happen to have more memory this time round. I guess unless you have the educated background, it's mind-blowing!

Some people just want the best, though. Those people buy a 6900 XT or a 3090, so it makes sense to them. That said, I won't deny that the increase in performance doesn't justify the extra outlay in terms of bang per buck.

Apparently it's only 5% (or as high as 20% when reviewers benchmark the games..). It can be the difference between having to compromise on a setting or two to increase the FPS, or not. :cry:
 
Then you can always throw in the fact that "ultra/epic/very high" settings generally bring little to nothing for a huge perf hit, so essentially, if you adjust settings properly in the first place and know what the limits of the card are, you can quite easily make a product last much longer.


This is good advice from HU and sensible tbh. Games tend not to offer great scaling options, and as they have shown, "nightmare" or whatever the highest setting is called in the menus seems to barely differ from, say, ultra. Environment settings seem to be a good target (clouds, reflections, draw distance), so it's down to how real you want it to look; adjust for your own experience.

The thing is, with this flexibility you ruin the point of benchmarking - the control aspect of any test/experiment is keeping all conditions the same. If users drop settings to gain higher FPS, then surely they won't run into the issues others experience, which is where the comparisons are getting woolly.
 
Some people just want the best, though. Those people buy a 6900 XT or a 3090, so it makes sense to them. That said, I won't deny that the increase in performance doesn't justify the extra outlay in terms of bang per buck.
Yep. I have never been that way. I almost always look at the best price for performance; that is why I almost exclusively had AMD GPUs for nearly a decade. In many cases AMD would have 95% of the performance Nvidia had and offer it at much better bang for buck, which was a no-brainer for me. Now they have got greedy, but business is business in the end, and that is why I don't have any loyalty to anyone :)

Here is my desktop GPU history:

RIVA TNT
GeForce4 MX440

Radeon 9700 Pro
GeForce 7900 GT
GeForce 8800 GTS

Radeon 4870
Radeon 5770
Radeon 5850
Radeon 6970
Radeon 7870
Radeon 7950

GeForce 670
Radeon 7970
Radeon 290
Radeon 295x2

GeForce 1070
GeForce 1080

Radeon R9 Nano
Radeon RX Vega Liquid

GeForce 1050
GeForce 980Ti
Nvidia Titan XP

Radeon RX 580
GeForce 970
Radeon RX Vega 56
Radeon RX Vega 64

GeForce 3080
GeForce 3070
 
Yep. I have never been that way. I almost always look at the best price for performance; that is why I almost exclusively had AMD GPUs for nearly a decade. In many cases AMD would have 95% of the performance Nvidia had and offer it at much better bang for buck, which was a no-brainer for me. Now they have got greedy, but business is business in the end, and that is why I don't have any loyalty to anyone :)


Don't want to send the thread off topic so replied here.
 