10GB VRAM enough for the 3080? Discuss..

Consoles age much, much better than their PC counterparts. Why is this so hard to accept? You will have changed your GPU 2-3 times by the time the PS5/Xbox Series X go obsolete. There is no point in arguing.
 
https://www.youtube.com/watch?v=sxExRN1TcY8

timagamer
3 weeks ago (edited)
I've been testing your tip about changing the texture setting from High to Ultra on my GTX 780 3GB with different settings, and it seems like 3072 MB is not enough for RDR2 with Ultra textures AND the Win10 system plus some definitely required apps like MSI Afterburner and Razer Booster, as those consume 500+ MB of VRAM with no game launched at all. With the High texture setting (quite poor and ugly compared to Ultra, as I see it) I got around 33-52 fps at Medium-High shadow/light/LOD settings, compared to 22-37 fps with just Ultra textures changed for comparison: no sudden stutters, but an overall decrease in framerate, to my own surprise. The game runs from an SSD; the CPU is a 4670K at 4.4 GHz with 2x4 = 8GB Kingston HyperX at 2133 MHz. This 3072 MB VRAM insufficiency is clearly visible in the video of RDR2 running on a 1060 6GB, where 3200+ MB is used at Medium graphics. I would like to investigate this a bit.

UPDATE: Tried decreasing the overall VRAM load by manually shutting down desktop apps, and it worked. I got the same 34-50 fps while using the Ultra texture setting at 1680x1050; it seems this game requires at least 2.8GB of VRAM exclusively for itself, more likely 3GB for resolutions like 1920x1080.

UPDATE #2: The game is unstable with Ultra textures on the GTX 780 3GB. My VRAM optimisation has little to no effect on this issue; the game crashes after 5-15 minutes of continuous gameplay :(
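For anyone who wants to reproduce that kind of baseline measurement, here is a minimal sketch using the pynvml NVML bindings (assuming an NVIDIA GPU at index 0; illustrative, not a tested tool):

```python
# Minimal VRAM baseline check via NVML (pip install pynvml).
# Run with no game open to see how much VRAM the desktop and overlay
# apps (Afterburner, browsers, etc.) already consume.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)    # first NVIDIA GPU
info = pynvml.nvmlDeviceGetMemoryInfo(handle)    # values are in bytes

mib = 1024 ** 2
print(f"total:                {info.total / mib:7.0f} MiB")
print(f"used by desktop/apps: {info.used / mib:7.0f} MiB")
print(f"free for a game:      {info.free / mib:7.0f} MiB")

pynvml.nvmlShutdown()
```

On a 3GB card, anything over ~500 MiB of idle usage leaves less than the ~2.8GB this commenter estimates RDR2 wants for itself.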


---

Just 5 years after the mighty GTX 780's release, it is unable to compete with a PS4 in terms of texture fidelity. How surprising.

And the brutal truth:

"quite poor and ugly compared to Ultra, as I see it"


This is not just RDR2, of course. The GTX 770 and 780 have played with inferior, ugly textures since 2016, while the PS4 and Xbox One still boast rock-solid Ultra textures and superior texture fidelity.

The same will happen to the 3070 and 3080 in relation to the PS5/Xbox Series X.

AC Origins, NieR: Automata, Wolfenstein, Rise of the Tomb Raider, Shadow of the Tomb Raider: countless AAA games ran on low/medium textures on a 770 or 780, while the PS4/Xbox One provided Ultra textures for all of them with their 8GB of total RAM.

Imagine buying an overly expensive 770 or 780 and, 3 years later, the PS4/Xbox One runs games with higher-quality textures. Yeah, the same will happen to the 3070/3080, whether you like it or not.

On the PS4/Xbox One, devs manage to fit Ultra textures into their memory budget somehow. I don't care how they do it. Do it with a 780 or 770 and come back to me; then we can discuss. You can't. Tweak and optimize all you want, you can't. Devs and Nvidia will want you to move on from those cards, and you will. This is the actual discussion: if Nvidia and devs want you to move on from 8-10GB VRAM GPUs, you will have to, eventually. Medium and High textures look UGLY and WAY worse than Ultra. It's a huge compromise that no other graphical setting can offset.

You can run everything else on Ultra with the 3080's extra "power", but it is meaningless if it has to push Medium/High textures along with it.

See, the GTX 780 is 2-3 times faster than the PS4 on paper. It may even manage 1080p Medium at 35-45 fps (barely). But it will do so with Low-Medium textures, and the game will look horrible.

What you are doing is the same thing 770 and 780 owners did in 2013-2015. They boasted about their higher settings and higher fps in games, about how they experienced games better than on a PS4, about how devs pushed low settings to accommodate the PS4.

3 years later? The 770 and 780 became OBSOLETE, while the PS4 kept rocking, with HIGHER fidelity settings thanks to its bigger memory budget.

Once 2016 and next-gen games started to hit the deck, all of them scurried to newer-gen cards, because their puny 770s and 780s were unable to cope with the next-gen high-quality textures and big memory budgets that the PS4 had no problem dealing with.
 
Been following this thread for what seems like months, and just want to say that PrincessFrosty has nailed this. The RAM is fine for the GPU power. The question is: enough for what? Is it running RDR2 at 4K at 144 Hz? No. Are any of the AMD cards with 16GB doing it? No. Fair enough; is the 3090 doing it with 24GB? No. The RAM is not the limit.

Yes, you can put more in, but it doesn't seem that it's actually needed yet. Will it be enough in the future? Well no, of course it won't. How long will it last? How long has a top GPU ever lasted? Who knows, but the number of people who couldn't get a latest-gen GPU and say they're fine with their 980 etc. suggests that games keep working on older cards for a long time yet.

Would it be better if they had put in 16GB, charged more, and you received no benefit? AMD have put more RAM in, but it is slower RAM. Is the speed of the RAM holding it back? Doesn't seem to. At what point do we say "Actually, that's the right amount and a reasonable balance between price and performance", assuming RRPs were reasonable?
 
So was there an answer to the original question?

228 pages? Must be a lot of people chipping in with opinions.

We're still waiting on someone to post hard evidence showing that a 3080 is struggling at 4K because of its 10GB of VRAM (outside of running at silly-high VR resolutions and/or using 50+ mods).

In short, 10GB of GDDR6X is still coping perfectly OK right now, even at 4K. In fact, the 3080 is arguably a better card at 4K than some cards with more VRAM, but I digress...
 
I thought the PS5 shares system RAM with VRAM? So if a game developer chooses to allocate 10GB as system RAM, for example, that leaves just 6GB of VRAM for the GPU.
Yes, and the PS4 worked the same way. On the PS5 that would leave 6GB for the GPU, since it has 16GB total. The huge memory advantage just isn't there this time.
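To make that split concrete, here is a toy sketch of a unified memory pool; the 16GB total matches the PS5, but the OS reserve in the second call is an assumed figure, not an official one:

```python
# Toy model of a unified (shared) memory pool, PS5-style: whatever the
# game allocates for CPU-side data comes out of the same 16GB pool the
# GPU draws its texture/asset memory from.
TOTAL_GB = 16.0

def gpu_budget_gb(cpu_side_alloc_gb: float, os_reserve_gb: float = 0.0) -> float:
    """Memory left for GPU assets after CPU-side data and any OS reserve."""
    return max(0.0, TOTAL_GB - os_reserve_gb - cpu_side_alloc_gb)

print(gpu_budget_gb(10.0))        # 6.0 GB -> the split discussed above
print(gpu_budget_gb(10.0, 2.0))   # 4.0 GB with an assumed 2GB OS reserve
```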
 
So was there an answer to the original question?

228 pages? Must be a lot of people chipping in with opinions.
Deathloop stutters hard and provides a lesser experience on a 3080 compared to a PS5, which boasts rock-solid performance with the best possible textures. Yes, it has since been patched, but that doesn't change the point.
 
Would it be better if they had put in 16GB, charged more, and you received no benefit? AMD have put more RAM in, but it is slower RAM. Is the speed of the RAM holding it back? Doesn't seem to. At what point do we say "Actually, that's the right amount and a reasonable balance between price and performance", assuming RRPs were reasonable?
Last-gen consoles had half the shared memory (8GB).

PC GPUs (that played ports of last gen's consoles) had ~8GB on mainstream cards.

This gen's consoles have 16GB of shared memory.

PC GPUs are to stick with ~8-10GB?

The point at which you start to wonder what's going on is probably when you see the consoles doubling their memory and nVidia doing everything in their power to avoid putting more in PC GPUs.

And it's bitten you/us in the ass before, with the 780 3GB, etc.
 
Fair point, fellow Cornwaller. But what is the correct amount? How long should this GPU last at the very top? Not trying to be awkward; these are genuine questions (my first time with a top GPU).
 
Last-gen consoles had half the shared memory (8GB).

PC GPUs (that played ports of last gen's consoles) had ~8GB on mainstream cards.

This gen's consoles have 16GB of shared memory.

PC GPUs are to stick with ~8-10GB?

The point at which you start to wonder what's going on is probably when you see the consoles doubling their memory and nVidia doing everything in their power to avoid putting more in PC GPUs.

And it's bitten you/us in the ass before, with the 780 3GB, etc.
Flagship GPUs at the PS4's launch had 3GB of memory, which is 62.5% less than the console's 8GB. The 3080 has 10GB, which is only 37.5% less than the new consoles' 16GB. Nvidia has done the math.

The PS4 had 16x more memory than the PS3, while the PS5 has only 2x more than the PS4. Memory-requirement growth has slowed down severely.
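For anyone checking those ratios, a quick sketch (note it treats each console's total as the whole shared pool, which overstates what games actually get):

```python
# Console memory totals across generations (shared CPU+GPU pools), in GB,
# against flagship GPU VRAM at each console's launch.
ps3, ps4, ps5 = 0.5, 8.0, 16.0      # 512MB, 8GB, 16GB
gtx780, rtx3080 = 3.0, 10.0

print(ps4 / ps3)                     # 16.0x jump from PS3 to PS4
print(ps5 / ps4)                     # 2.0x jump from PS4 to PS5
print(f"{1 - gtx780 / ps4:.1%}")     # 62.5% less VRAM than the PS4 pool
print(f"{1 - rtx3080 / ps5:.1%}")    # 37.5% less VRAM than the PS5 pool
```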
 
Fair point, fellow Cornwaller. But what is the correct amount? How long should this GPU last at the very top? Not trying to be awkward; these are genuine questions (my first time with a top GPU).
The GTX 780, which sold for 3-4 times the price of a PS4, shouldn't have provided a hugely inferior gaming experience just 3-4 years later. I'm not even asking for 6-7 years, which the PS4 still handles fine. I want to understand why it's tolerable and acceptable that the 770 and 780 are unusable in modern games. I don't care if it's 166% or 60%; less is less. In the end, it is what it is: inferior. The 770 and 780 are inferior to the PS4 in EVERY aspect outside their initial 2 years on the market.

The 3070, which barely manages to match a PS5, and the 3080, which barely matches the Series X, will be inferior 2-3 years later. The 770 and 780 were 2-5 times stronger than the PS4/Xbox One, yet their fall was huge.

This is why I sold my 3070, bought a Series X, and have been happy ever since. Tbh it was the best decision I could ever have made (thanks to the PC zealots here, I managed to see things clearly). I now understand that the people here bickering and defending their precious GPUs are also the ones who will jump ship and buy the newest, shiniest GPUs when they're released. They're practically fighting against themselves. If their GPUs are so futureproof, why are they already planning their exit? As you can see, some of these defenders of low VRAM are already saying they got their money back by mining and are ready to buy the next big thing.

If you're going to shell out big bucks for the new Nvidia 4000-series GPUs, what is the point of defending 8-10GB of VRAM? It will be obsolete 2-3 years later in actual next-gen games, and the PS5/Xbox Series X will have superior texture fidelity.

 
There is so much misinformation in this thread it's insane. I've seen console forums populated by 12-year-olds spreading less misinformation.

It's almost like certain posters have an agenda.
 
With regard to the PS5's "rock solid" performance...

https://youtu.be/jWH1f9TeW0I?t=761

And with regard to native "4K" and hitting the 60 fps target:

https://youtu.be/jWH1f9TeW0I?t=413

[attached screenshot: resolution and frame-rate figures]

And with regard to the PS5 having the best possible texture quality:

[attached screenshot: texture quality comparison]
Yeah, a lower resolution gives a less sharp image. How smart you are.

The RTX 3080 will barely be able to push native 1080p 2-3 years from now. Enjoy your limited 2-3 years of premium gaming experience; it will be short, just like how the GTX 770 and 780 ended up inferior to the PS4/Xbox One.

Your precious 3080 will be destroyed by Nvidia's own 5050 Ti. Once that happens, devs won't care how much optimization your cupcake GPU gets; they will care about consoles. To understand this you need an IQ higher than 70, but since you lack it, I don't expect you to understand.
 
The GTX 780, which sold for 3-4 times the price of a PS4, shouldn't have provided a hugely inferior gaming experience just 3-4 years later.

The PS4 sold at about £350, didn't it? Did the GTX 780 sell for £1,050 to £1,400?

I only came to PC gaming after being happy with an Xbox 360. My brother and I both played Skyrim, but he did it on a PC, and that really showed how frame rate improves the game. I've been on PC since. The new consoles are definitely brilliant, as they should be at only 1 year old; they even have very few next-gen games at the moment. I expect my GPU to last at 4K 60 fps (basically as advertised) for about 4 years. It's been a year already, and it's still fine.

Edit: When the 4 years are up, it doesn't mean I will look at upgrading; it means I will start playing with settings. I'm realistic enough to understand that what I buy now is not forever and will comparatively deteriorate. Consoles get dedicated development and are fine things if you can afford the games. Whilst consoles tend to improve as developers eke out every last drop of performance, some aspects definitely do not improve, and you are stuck with them long term. Installing and loading GTA V on a 360 hard drive immediately springs to mind...
 