
10GB VRAM enough for the 3080? Discuss..

One page later and people are bringing this up again? lol.

A 1060 cannot run Borderlands anywhere near 4K like the PS5 does.
Devil May Cry 5 doesn't run at 1080p; it runs at a higher resolution even in the 120Hz mode on the PS5, making the comparison skewed for obvious reasons. Besides, it has ray tracing options and a 4K option which, well, a 1060 can't run.
Dirt 5 has no RT options on the 1080; besides, ironically enough, the game for some odd reason has been losing performance with every console update. The game is a janky mess, to be honest.
Funny how he left out games like COD, where it almost matches a 3070 lol.


I couldn't care less about console vs PC wars etc (I own both a PS5 and a high-end gaming PC), but at least get your facts straight. I can't believe people genuinely think a 1060 or a 1070 matches any of the new consoles; certain YouTubers should be ashamed of the BS they spout, honestly, making people believe that a 1060 will match a console lol.

Back to the topic: how have people's experiences been with Resident Evil 8 on a 3080? I heard it's really close to using 10GB of VRAM; I can see plenty of folks trying to upgrade to a 3080 Ti very soon.

Yeah

An even funnier point of view:


[image: benchmark chart]



On paper, the 1080 Ti should decimate the 2060, no? Well, that used to happen. But this game seems to favor Turing, huh? Maybe it benefits from what Turing brings to the table.

This alone proves that TFLOPS are mostly a bad metric for comparing GPUs, especially between different architectures.
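For illustration, here's a quick sketch of where those paper numbers come from (theoretical FP32 throughput is 2 FLOPs per CUDA core per clock; the boost clocks below are nominal reference values, so treat the exact figures as assumptions):

```python
# Theoretical FP32 throughput: one FMA = 2 FLOPs per CUDA core per clock.
def fp32_tflops(cuda_cores: int, boost_clock_ghz: float) -> float:
    return 2 * cuda_cores * boost_clock_ghz / 1000

# Nominal reference specs; real cards boost higher or lower.
print(f"GTX 1080 Ti: {fp32_tflops(3584, 1.582):.1f} TFLOPS")  # ~11.3
print(f"RTX 2060:    {fp32_tflops(1920, 1.680):.1f} TFLOPS")  # ~6.5
```

That's roughly a 75% paper advantage for the 1080 Ti, yet in this game the real gap is nowhere near that, which is exactly the point.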

------

From what I've seen in videos, the PS5 runs high-to-ultra focused settings, 1224-1440p dynamic resolution, and a locked 60 FPS.

https://www.pcgamer.com/cyberpunk-2077-cinematic-rtx-mode/

Create a shortcut to the game, then in its properties add the parameter "ConsoleEarlyNextGen" to the shortcut's target. This should correspond to the console performance mode.

Then you practically get console settings, but the actual settings are not shown. You must not touch any of the settings or it breaks, and you must set your resolution beforehand. For example, try setting 1440p and see if you can get 60 FPS with your GPU. If you can, you're performing equal to a Series X in this game. If you can't, the XSX is simply superior.
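For clarity, the shortcut target ends up looking something like this; the install path is just an example, and I'm assuming the parameter takes the usual leading-dash launch-option form:

```
"C:\Program Files\Cyberpunk 2077\bin\x64\Cyberpunk2077.exe" -ConsoleEarlyNextGen
```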

With the exact same settings but at 1080p, I used to get 43-45 FPS with a GTX 1080. That is a huge difference compared to the PS5's 1224-1296p locked 60 FPS.
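To put a number on "huge", here's a crude estimate that assumes performance scales linearly with pixel count (it never does exactly, so take it as a ballpark):

```python
# Crude FPS estimate across resolutions, assuming linear pixel scaling.
def scale_fps(fps: float, from_px: int, to_px: int) -> float:
    return fps * from_px / to_px

# GTX 1080 at ~44 FPS, 1080p -> estimate at the PS5's ~1296p output.
est = scale_fps(44, 1920 * 1080, 2304 * 1296)
print(f"~{est:.0f} FPS")  # ~31 FPS vs the PS5's locked 60
```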



In short, I really think the new-gen consoles are powerful, but that's my opinion. For me, the SX is equal to an upper-midrange or entry-level high-end gaming machine of 2021 with an RTX 3070 (but with proper VRAM) and a 5800X, because I personally think console optimization will help consoles reach 5800X-level performance on PC.
 
I couldn't care less about console vs PC wars etc (I own both a PS5 and a high-end gaming PC), but at least get your facts straight.

Well, I mean, I did. I went to a pretty reputable hardware review channel and watched the entire video to gauge all of the benchmarks presented, so as not to cherry-pick anything. I summed up the findings. The reason they used the games they did, as I understand it, is that they could guarantee a settings match between the PC and PS5 for a fair comparison of the workloads. Anyway, I didn't say I thought the PS5 was as fast as a 1060; I said the most appropriate comparison was a 2060. That doesn't buy you anything, though; a 2060 is still a lot slower than a 3080 and pretty squarely in the mid-range for video cards on PC right now.

Yes, it's a digression, but it's an important one, because the bugaboo here is always future games. People want to future-proof as much as possible, both in terms of GPU power and things like VRAM, and like it or not, the evolution of PC games is affected very strongly by the console life cycle. You cannot enter into a discussion of "we'll need X in 3 years' time" without also discussing the consoles, since they're typically such an artificial hindrance. The more salient point here is that because the 3080's release coincided with consoles that are inferior in speed by a non-trivial margin, it will have quite a lot of headroom for the next 2-3 years.

I've seen Resi 8 videos and screenshots, and while some areas look not half bad, they're kinda few and far between; the vast majority of it looks truly terrible for a AAA game in 2021. TechPowerUp has VRAM usage on a 3090 at 7.8GB in 4K, and that's measuring malloc. I emailed W1zzard, their reviewer, about this months ago, and he basically denied/ignored the idea of measuring real VRAM usage and (to my knowledge) has stuck to the old methodology. If I had to bet purely on gut judgement, I'd doubt it's actually using all of that even at 4K; more like 6 to 6.5GB, or maybe a tad more if you have the RT features on.

https://www.techpowerup.com/review/resident-evil-8-village-benchmark-test-performance/6.html
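That malloc-only methodology isn't really the reviewer's quirk; it's all the driver will tell you. A minimal sketch of what monitoring tools actually read, using NVML through the pynvml package (assuming an NVIDIA card and the package installed):

```python
# NVML reports *allocated* VRAM (malloc), not what the game actively
# touches - which is exactly the malloc-vs-used gap discussed here.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU
info = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"allocated: {info.used / 2**30:.1f} GiB of {info.total / 2**30:.1f} GiB")
pynvml.nvmlShutdown()
```

Anything built on the driver's counters (Afterburner et al included) inherits the same limitation.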
 
I summed up the findings. The reason they used the games they did, as I understand it, is that they could guarantee a settings match between the PC and PS5 for a fair comparison of the workloads.

Thing is... they didn't. DMCV doesn't even run at 1080p on the next-gen consoles no matter what settings you change; it's pure clickbaiting.
On the latest COD my GPU always allocates like 13GB of VRAM, but I have no idea how much it's actually using. 7.8 on Resi 8 already seems to be getting too close for comfort. I'm looking to get a 4K monitor very soon, so I will definitely try some of the heavier stuff like Cyberpunk.
 
Love or hate consoles, at least they go for MSRP when in stock and represent good VFM, whereas GPUs are now 2-3x MSRP and have a worse price-to-performance ratio than cards released 4 years ago.
 
It's obvious consumers are being taken for a ride here, as there is no way the cost to manufacture and distribute has doubled in the space of 6 months.
 
Absolutely they are. But for GPUs it's "blame the miners" (they have some part to play in it, yes), while when consoles and other goods are really hard to secure it's down to lack of stock, manufacturing capacity, slow distribution, COVID shutdown periods, and then the scalpers that soak up what's left, making this new tech overpriced.
 
Absolutely they are. But for GPUs it's "blame the miners" (they have some part to play in it, yes), while when consoles and other goods are really hard to secure it's down to lack of stock, manufacturing capacity, slow distribution, COVID shutdown periods, and then the scalpers that soak up what's left, making this new tech overpriced.
I think the issue with the consoles is that there is simply too much stuff being made on TSMC 7nm to go around, so everything was squeezed; even CPUs suffered.

Nvidia, on the other hand, has Samsung 8nm all to itself, so it shouldn't be affected so much by supply issues, and looking at their past two quarters of gaming card revenues, they are obviously producing a lot of chips. But these don't seem to be finding their way to companies like OCUK, as they haven't even been able to fulfil day-one orders on some cards, and we are now 7 months from launch.
 
Nvidia, on the other hand, has Samsung 8nm all to itself, so it shouldn't be affected so much by supply issues, and looking at their past two quarters of gaming card revenues, they are obviously producing a lot of chips. But these don't seem to be finding their way to companies like OCUK, as they haven't even been able to fulfil day-one orders on some cards, and we are now 7 months from launch.

Yeah, that is strange. I also can't understand why their competitor gets sole dibs on selling the FEs, unless that's common in other countries or it's just a hassle to sell them as there's no profit margin.
 
Love or hate consoles, at least they go for MSRP when in stock and represent good VFM, whereas GPUs are now 2-3x MSRP and have a worse price-to-performance ratio than cards released 4 years ago.

Very true, and the few times consoles have spiked in price it's never been for so long nor by so much.

But still, I'd have stuck with consoles before now IF the biggest difference and draw of PC for me wasn't the (admittedly few) games I favour that aren't on console (not now and probably not ever). Pretty much all the other things a PC can do that I'd make regular use of, I can do on other devices, but those few games have always been the #1 dealbreaker for me re consoles. That aside, I had a lot of good times (during some bad times) with a PS4 early on, so there is some nostalgia there.
 
Pretty sure consoles have been scalped too, and people were paying double for them as well..

He's saying that retail stores who get stock direct from Sony are selling the PS5 at RRP; that's because their contract with Sony legally binds them to sell at RRP. Nvidia and AIBs don't bind PC stores to sell at RRP; they can sell for whatever they want.

Sony has an image to maintain, so they are very much against retail stores adding extra margin on top of the RRP. But these contracts only apply if your stock is from Sony; some stores might try to source the PS5 through a third party, and those stores could sell the PS5 for whatever they wanted, but this is a small minority of all stock.
 
Thing is... they didn't. DMCV doesn't even run at 1080p on the next-gen consoles no matter what settings you change; it's pure clickbaiting.
On the latest COD my GPU always allocates like 13GB of VRAM, but I have no idea how much it's actually using. 7.8 on Resi 8 already seems to be getting too close for comfort. I'm looking to get a 4K monitor very soon, so I will definitely try some of the heavier stuff like Cyberpunk.

It's a well-respected channel, so I don't think anything is being clickbaited here; if there's a mistake, I think it's a genuine error. Either way, if what you're saying is true (and I have no real way to confirm this myself), DMC is just one error in a bunch of games they reviewed. I think the overall point still stands, and again it sits well within the range of expectations if you're familiar with this kind of hardware and what performance to expect. It's actually fairly surprising to me that anyone on a forum like this would think the consoles have anything close to parity with a card like the 3080.

COD doesn't use 13GB of VRAM, it just allocates that; there's a massive disconnect between malloc and actual usage, as has been demonstrated all throughout this thread. The latest CoD, Cold War, quite literally has a slider in the graphics menu that lets you set the memory usage of your card, and the values, if memory serves me correctly, are something like 70%, 80%, 90%. So if you're using something like a 3090 with a crapload of memory (24GB), you can be allocating about 21GB of that, while the game is using some much smaller amount, like 6GB or so. I did test this way, way back and can't remember the numbers, but it's in this thread somewhere.
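To put rough numbers on that slider (the percentages and card sizes are just the ones mentioned above, so purely illustrative):

```python
# CoD's slider caps the *allocation* budget as a fraction of total VRAM;
# it says nothing about how much the game actively uses.
def allocation_budget_gb(vram_gb: float, slider_pct: int) -> float:
    return vram_gb * slider_pct / 100

for card, vram in [("RTX 3080", 10), ("RTX 3090", 24)]:
    budgets = ", ".join(f"{pct}% = {allocation_budget_gb(vram, pct):.1f}GB"
                        for pct in (70, 80, 90))
    print(f"{card} ({vram}GB): {budgets}")
```

A 3090 at 90% comes out to 21.6GB allocated, which lines up with the ~21GB figure above.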

As for comfort: well, <8GB is certainly getting towards the high end of the 10GB limit... but that's what we want, right? We want enough memory on the card that it doesn't cause us any problems running games, but we don't want a huge excess of memory that we cannot use, because that's memory we have to pay for with our own money while it provides zero benefit. For example, in my line of work in IT, when we provision servers for certain workloads, we work out what memory the servers need and allocate slightly above that. If we have a small server that really only needs 2-3GB of RAM, for example, we might allocate 4GB. We're not going to allocate 128GB of RAM, because all we'd be doing is wasting money.
 
It would be really useful to have some respectable devs speak up on this with their own thoughts. Right now, understanding allocated vs used VRAM is difficult enough, but texture streaming is going to muddy that discussion so much more when it becomes more prevalent. It would be nice to know what's really going on inside their games rather than assumptions, albeit somewhat educated ones, based on what Afterburner et al are telling us, because we really aren't getting the full picture.
 
He's saying that retail stores who get stock direct from Sony are selling the PS5 at RRP; that's because their contract with Sony legally binds them to sell at RRP. Nvidia and AIBs don't bind PC stores to sell at RRP; they can sell for whatever they want.

Sony has an image to maintain, so they are very much against retail stores adding extra margin on top of the RRP. But these contracts only apply if your stock is from Sony; some stores might try to source the PS5 through a third party, and those stores could sell the PS5 for whatever they wanted, but this is a small minority of all stock.
OK, this is too much common sense and logic... who are you and what have you done with grim5! :D
 
It's a well-respected channel, so I don't think anything is being clickbaited here; if there's a mistake, I think it's a genuine error. Either way, if what you're saying is true (and I have no real way to confirm this myself), DMC is just one error in a bunch of games they reviewed. I think the overall point still stands, and again it sits well within the range of expectations if you're familiar with this kind of hardware and what performance to expect. It's actually fairly surprising to me that anyone on a forum like this would think the consoles have anything close to parity with a card like the 3080.

COD doesn't use 13GB of VRAM, it just allocates that; there's a massive disconnect between malloc and actual usage, as has been demonstrated all throughout this thread. The latest CoD, Cold War, quite literally has a slider in the graphics menu that lets you set the memory usage of your card, and the values, if memory serves me correctly, are something like 70%, 80%, 90%. So if you're using something like a 3090 with a crapload of memory (24GB), you can be allocating about 21GB of that, while the game is using some much smaller amount, like 6GB or so. I did test this way, way back and can't remember the numbers, but it's in this thread somewhere.

As for comfort: well, <8GB is certainly getting towards the high end of the 10GB limit... but that's what we want, right? We want enough memory on the card that it doesn't cause us any problems running games, but we don't want a huge excess of memory that we cannot use, because that's memory we have to pay for with our own money while it provides zero benefit. For example, in my line of work in IT, when we provision servers for certain workloads, we work out what memory the servers need and allocate slightly above that. If we have a small server that really only needs 2-3GB of RAM, for example, we might allocate 4GB. We're not going to allocate 128GB of RAM, because all we'd be doing is wasting money.


It's clickbait because he didn't even bother to test, you know... ANY 4K game, not a single one, because he 100% knows the 1060 is absolutely rubbish at it. For those kinds of comparisons I would stick with Digital Foundry, to be honest; they are not perfect, but they seem the best when it comes to that kind of stuff.
As for the VRAM, we should have a better picture in the second half of this year when we start getting some heavier AAA games; the new Battlefield should be interesting.
 
It's clickbait because he didn't even bother to test, you know... ANY 4K game, not a single one, because he 100% knows the 1060 is absolutely rubbish at it. For those kinds of comparisons I would stick with Digital Foundry, to be honest; they are not perfect, but they seem the best when it comes to that kind of stuff.
As for the VRAM, we should have a better picture in the second half of this year when we start getting some heavier AAA games; the new Battlefield should be interesting.
VRAM would surely be an issue too at 4K with the 3GB and 6GB 1060 variants.
 
It would be really useful to have some respectable devs speak up on this with their own thoughts. Right now, understanding allocated vs used VRAM is difficult enough, but texture streaming is going to muddy that discussion so much more when it becomes more prevalent. It would be nice to know what's really going on inside their games rather than assumptions, albeit somewhat educated ones, based on what Afterburner et al are telling us, because we really aren't getting the full picture.

I agree, although I'd argue it's really above game developers' heads: unless the team is working with a proprietary engine rather than licensing one (most are just licensed), probably even the devs don't know. You need to be an engineer on the actual engine itself to really get to grips with this. But there is a lot of info out there on how this works, and texture streaming has been around for much longer than most people realize. There's still a very pervasive belief that VRAM is more or less a cache of textures: you request a texture at the last second, get a cache miss, and then get VRAM stutter hell as the texture is fetched from disk in real time. I'm not sure when that first came to an end, but my general suspicion is somewhere around the original Crysis era, when open worlds were really starting to take form and streaming in assets was becoming a thing. That's a long time back now.

One argument I've made over and over is to check the size on disk of games. A lot of the sheer size of modern games comes down to two high-memory asset types: audio (music, sounds, etc.) and textures. Games have been pushing even 100GB for a while, and long before that they were 20-80GB for maybe the last decade or so. You simply cannot fit that much data into VRAM. The way we've managed this has largely been through advanced texture streaming and pre-loading. At first it was done with "zones": zones were defined in-game, and as you passed from one to another the textures were swapped out. But modern games are much more granular than that; they're doing all sorts of predictive math to work out where you are, which direction you're going, and where you're likely to be, and therefore what to pre-load (see the sketch below). If you watch memory usage in something like Far Cry 5, which I did quite intently myself as testing, you'll see that as you fly around the map in a plane the textures are almost in a constant state of streaming; VRAM usage shoots up and down rapidly as the game swaps out what it needs, and it does that pretty much seamlessly.
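To make that "predictive math" concrete, here's a toy sketch of velocity-based tile pre-loading; the grid size, look-ahead window, and radius are made-up illustrative values, not anything from a real engine:

```python
# Toy predictive texture streamer: pre-load map tiles around where the
# player *will* be, based on current position and velocity.
from dataclasses import dataclass

TILE_SIZE = 256.0   # world units per tile (illustrative)
LOOKAHEAD_S = 3.0   # how far ahead to predict, in seconds
RADIUS = 2          # pre-load ring around the predicted tile

@dataclass
class Player:
    x: float
    y: float
    vx: float
    vy: float

def tiles_to_preload(p: Player) -> set[tuple[int, int]]:
    # Predict the future position, then collect the tiles around it.
    fx = p.x + p.vx * LOOKAHEAD_S
    fy = p.y + p.vy * LOOKAHEAD_S
    cx, cy = int(fx // TILE_SIZE), int(fy // TILE_SIZE)
    return {(cx + dx, cy + dy)
            for dx in range(-RADIUS, RADIUS + 1)
            for dy in range(-RADIUS, RADIUS + 1)}

# Flying fast in a plane: the predicted tile set races ahead of the
# player, so textures stream in before the camera arrives.
print(sorted(tiles_to_preload(Player(x=1000, y=1000, vx=400, vy=0))))
```

In a real engine the resident set is also evicted against a VRAM budget, which is why usage bounces up and down rather than just climbing.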

VRAM would surely be an issue too at 4K with the 3GB and 6GB 1060 variants.

Maybe; I'd argue not necessarily. I know we had it out in the past about FC5 and memory usage, and I showed 4K usage and even 8K usage (via 4K with 2.0x upscaling): the actual raw memory buffer itself is tiny, like 100MB or less for 4K. Other memory usage also scales up, such as the memory required for AA and a few other per-pixel operations, but it's not really a huge increase, although it's tempting to think it is. What gets hit the hardest, or at least disproportionately hard, at higher resolutions is the GPU; it just has way more raw work to do per pixel.
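The back-of-the-envelope maths supports the "100MB or less" figure; here's a quick sketch (the buffer count and 4-bytes-per-pixel formats are assumptions for illustration, and real engines vary a lot):

```python
# Rough size of raw render targets at a given resolution.
def buffer_mb(width: int, height: int, bytes_per_pixel: int) -> float:
    return width * height * bytes_per_pixel / 2**20

# Assume double-buffered 4-byte colour plus one 4-byte depth/stencil buffer.
w, h = 3840, 2160
total = 2 * buffer_mb(w, h, 4) + buffer_mb(w, h, 4)
print(f"4K colour + depth: ~{total:.0f} MB")  # ~95 MB
```

Meanwhile the shading work per frame grows with every one of those ~8.3 million pixels, which is why the GPU core, not VRAM, takes the disproportionate hit.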
 