10GB VRAM enough for the 3080? Discuss..

Yep. My monitor does not have HDR, so all this extra VRAM business does not apply to me or 99% of the market, I guess. I do have a 4K 120Hz G-Sync OLED TV with proper HDR, but I rarely bring my PC downstairs to connect to it.
 
This is just you pivoting the argument. If I was talking about price to performance, or if I questioned why it was faster than a majority of gaming PCs, you would have a leg to stand on. But I was very clear in what I was referring to, and that was performance only.

Right, and this is my point: you're deliberately referring to only one thing, which is the upside of such a business model, and you're ignoring the downside. Obviously any sane person is going to consider both, and not only that, but the weight of those things changes with time. It's a big benefit to the consoles during their early years and a downside later on.

The consoles at this moment in time are fast bits of kit. You were wrong when you said "it's no secret to anyone even remotely technical that console generations do not have fast APUs, they're sharing the same die for CPU and GPU functions and all in all they aren't that fast". If you have an actual argument for why they are not fast then lay it out; otherwise stop bringing up price.

Depends what you mean by fast; that's a loosey-goosey word. The best estimates we have for relative performance put them at about a 2060S level. This was confirmed by the direct comparison of hardware with equivalent settings in WD:L, where Digital Foundry found it performed about the same, except the PC was at 4K while the consoles used dynamic scaling and sat between 1080p and 1440p. So what does "fast" mean? Is a 2060S fast?

You know that 1GB modules exist, right? My assumption is that they are cheaper. They could have used those to balance cost against VRAM need. As far as I am aware you can mix 1GB and 2GB modules. Are there penalties to doing this on a closed ecosystem? But that's not important; there are many options available to balance these requirements, so to assume that there is simply too much VRAM in the consoles is stupid. How do you know it's not the other way around? For all we know the guys over at Sony and MS wanted more VRAM but found it was too costly, or that it would give them excessive bandwidth; that could equally be true.

You cannot mix memory module sizes, because the way that memory bandwidth is raised above the basic rate of any individual chip is by writing to the memory in parallel, kind of like how HDDs are written to in parallel in RAID. With a 32-bit interface on each chip you can get something like a 256-bit total width if you write to 8 chips in parallel. If some chips are larger they cannot usefully be used in parallel with smaller chips. I state again that in general these things are engineering trade-offs; there's no one best way to do something, there are just upsides and downsides of doing it a specific way. It's not that the consoles were given, say, 2-4GB of additional memory that I personally don't think will be useful; it's that the architecture limited them to either that amount or some lower amount which was unacceptably small.
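To put rough numbers on the parallelism point, here's a back-of-envelope sketch in Python. The 14 Gb/s per-pin rate and the chip counts are illustrative assumptions, not the consoles' or any specific card's actual figures:

```python
# Back-of-envelope: per-chip bus width aggregates into total bandwidth
# when chips are accessed in parallel. All figures are illustrative.

CHIP_BUS_WIDTH_BITS = 32   # each GDDR6 chip exposes a 32-bit interface
DATA_RATE_GBPS = 14        # effective per-pin data rate in Gb/s (assumed)

def total_bandwidth_gb_s(num_chips: int) -> float:
    """Aggregate bandwidth in GB/s when num_chips are written in parallel."""
    total_bus_bits = CHIP_BUS_WIDTH_BITS * num_chips
    return total_bus_bits * DATA_RATE_GBPS / 8   # bits -> bytes

for chips in (8, 10, 16):
    print(f"{chips} chips -> {CHIP_BUS_WIDTH_BITS * chips}-bit bus, "
          f"~{total_bandwidth_gb_s(chips):.0f} GB/s")
# A mixed-capacity setup can't interleave evenly across all chips,
# which is why matching module sizes matters for sustaining this rate.
```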

Let's be honest: if there is "too much" VRAM it isn't by a significant amount. Take your 10GB figure; it's probably 2GB tops, if I was being generous. But as has been mentioned and ignored, game devs can always cram in better textures and models for a cheap performance penalty relative to everything else, but let's ignore that final bit.

Pretty much agree here. My estimate is that the card will probably never make sensible use of 12GB or more, so maybe 4GB will go to waste. A bit like the 1080 Ti having 11GB of VRAM but probably never making sensible use of more than 7-8GB, with, say, 3GB going to waste.

I just didn't see the need to fully explain my line of thinking, because I prefer to keep my posts lean (you should try it). When I talk about engineering around the problem I am talking about more than just finding a way around bandwidth bottlenecks. There are ways to balance an engineering problem; it's not just VRAM and bandwidth forming this equation. There are variables on the other side, like the GPU cores themselves, that they can balance to reduce as much waste as possible, and I will confidently say that that is what both Sony and MS have done within the scope of their design brief. This is custom silicon, not an off-the-shelf part.

We're basically in agreement here: engineering is trade-offs, or what you call balance. There is more to the problem, as you stated; for example, the number of memory modules you can physically fit around the outside of your GPU matters, and whether you can put them on both the front and back of the PCB. There are endless trade-offs. Sony and MS have made the trade-offs they think are worth it, as you'd expect.

There are no console fanboys here *******. Oh look, stating your opinion as facts again, while ignoring anything that runs contrary to it (read above). Standard.

There definitely are. I won't name names, but there are obviously people here who think the consoles are running the same settings/code the PC is running, when a detailed breakdown of games and their relative settings on each platform has shown this is categorically not the case. That's not opinion in the slightest; the actual settings configs used to power games on both platforms have been published.

It reduces power and heat output by not needing highly clocked memory, as well as reducing the amount of data needing to be shuffled across the memory interface. Princess Fiona over here declares it a waste of die space. If you had any integrity or an actual appreciation of engineering you wouldn't refer to it as a waste of die space. Don't worry, you'll sing a different tune as soon as Nvidia announces they are doing the same thing.

I never said that it was a "waste of space"; I said they waste more die space, by which I mean the valuable area on the chip itself goes more to cache rather than to circuits for calculations that increase performance. Every transistor the cache uses is a transistor that can't be used for the calculations which produce additional performance. This is likely why the Radeon series struggles (relatively) at both 4K and with RT effects: the die space spent on Infinity Cache eats into the budget for everything else. Again, I'm not saying this is inherently bad; it's just an engineering trade-off.
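To make that trade-off concrete, here's a crude toy model in Python. Every number is invented purely for illustration (real hit rates and bandwidths aren't public), and real effective bandwidth depends on far more than a linear blend:

```python
# Crude toy model: a cache hit is served from on-die cache, a miss
# falls through to external DRAM. Effective bandwidth is modelled as
# a simple weighted blend. All numbers are invented for illustration.

def effective_bandwidth(hit_rate: float, cache_bw: float, dram_bw: float) -> float:
    """Blend of cache and DRAM bandwidth, weighted by hit rate."""
    return hit_rate * cache_bw + (1.0 - hit_rate) * dram_bw

DRAM_BW = 512.0    # GB/s over the external memory bus (assumed)
CACHE_BW = 1600.0  # GB/s from on-die cache (assumed)

# Bigger working sets (e.g. 4K frames) tend to lower the hit rate,
# eroding the benefit - one plausible reason 4K hits such designs harder.
for hit_rate in (0.7, 0.5, 0.3):
    bw = effective_bandwidth(hit_rate, CACHE_BW, DRAM_BW)
    print(f"hit rate {hit_rate:.0%} -> ~{bw:.0f} GB/s effective")
```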

Point me to the exact part where I said the contrary, or are you just writing for the sake of it?

You said, and I quote: "Maybe you didn't notice but i was mocking the silly notion that games developers can't use the slower memory for graphics items." To which I replied "They can use it but it incurs performance penalties", which is true. The GPU can only process data from memory; if the memory becomes a bottleneck then the performance of the GPU is lowered.

Citation on the bolded section. Give me actual stats from an XSX game.

I don't have a citation, largely because the consoles, being closed platforms, deny us the ability to collect statistics directly. However, we can look at the performance of these titles on the PC and infer their demands on the console hardware. A game on the PC that uses, say, 6GB of system RAM will need 6GB of memory available on the consoles; there's not a lot of scope for consoles magically using less memory than PCs in this regard. They are some variant of PC (Linux, Windows) running libraries of DLLs that use up memory.

Was this before or after calling Infinity Cache a waste of space? Just for reference.

As you can see from my prior posts, I never referred to it as a "waste of space"; this is you creating a straw man of my argument. I said that they waste more of the die on cache, when that space could be spent on processing. But I acknowledged this specifically as a trade-off: more die space spent on cache is less die space spent on processing (speed), but ultimately it's a trade-off versus what memory you can use and the benefits that brings.

Preaching to the choir, I see. And you're not even completely right, but I'll give it to you.

Not sure what you mean by this. Almost everything in engineering is a trade-off of some kind.

I post laughing images and memes as a reminder not to take this **** seriously, and because it is quick and easy. If I'm posting memes then you should re-evaluate what you wrote. For example, using WD:L as the benchmark for what the consoles can achieve.

It's a modern AAA title that pushes the limits not just on the consoles but on the PC; it uses a lot of new tech, including ray tracing, and so is a perfect test case to compare the two platforms. Not only that, but we have the actual config files, which tell us the specific differences in the visual settings each platform uses. You can laugh at this all you like, but what's the theory here, that this specific game performs radically differently on PC versus the consoles?

The reason for your wall of text is that you use it as a way to slowly move the goalposts. It makes it hard to spot because people get lost in the waffle. If people follow the chain of posts you often end up arguing either something you didn't start with (console price) or a vague statement (like your final statement about textures). It is a great technique for getting someone to argue something they were not originally arguing, but I simply do not take the bait, which is why I will ignore vast swathes of your posts. Moving a conversation along has its uses, but you just take the ****. You also seem to think too highly of yourself and feel the need to "teach people", to the point of making assumptions and viewing people as below you. Like trying to tell a mechanical engineer that engineering is about trade-offs.

I'm simply responding to comments you've made. Discussions often wander through multiple topics and become large, and so my posts become large, because I have a lot to respond to.

Can you give an example of people "getting lost in the waffle" in my discussion? I'd be interested in any examples you can give.

I know you ignore vast swathes of my posts, because you reply to straw men of my arguments, which indicates you're scanning my posts, constructing some straw-man variant of them which you think I believe, and then attacking that.

I truly wanted to thank you for wasting my time @PrincessFrosty.

No one is wasting anyone's time. We're here to discuss topics that interest us, and the discussion itself is intrinsically rewarding. We both get a kick out of debating one another, otherwise we wouldn't do it.
 
Yep. My monitor does not have HDR, so all this extra VRAM business does not apply to me or 99% of the market, I guess. I do have a 4K 120Hz G-Sync OLED TV with proper HDR, but I rarely bring my PC downstairs to connect to it.

VRAM doesn't affect HDR.

You can just run a fibre optic HDMI 2.1 cable to your TV from the PC.
 
The only game I have played thus far where HDR looked noticeably better has been SOTTR; I just don't get the screen snobbery.


HDR unfortunately is not some magic trick like the marketing would have you believe; the quality of the image varies dramatically between games because of how the game was designed and how HDR was implemented.


There are some games where the developers took the time to make it look good, and you do have to do that. HDR is not a magic button that makes everything awesome. Some games with really good HDR that I've played: The Last of Us remake, The Last of Us 2, Uncharted 4, Horizon Zero Dawn, SOTTR (yes), the RE3 remake, Lost Odyssey, Sea of Thieves, and Ori and the Will of the Wisps.
 
HDR unfortunately is not some magic trick like the marketing would have you believe; the quality of the image varies dramatically between games because of how the game was designed and how HDR was implemented.


There are some games where the developers took the time to make it look good, and you do have to do that. HDR is not a magic button that makes everything awesome. Some games with really good HDR that I've played: The Last of Us remake, The Last of Us 2, Uncharted 4, Horizon Zero Dawn, SOTTR (yes), the RE3 remake, Lost Odyssey, Sea of Thieves, and Ori and the Will of the Wisps.
Hello dear sir, what do you think of Dolby Vision?

https://twitter.com/Riddimz4/status/1443607876346556421

These comparisons seem cool, but I feel like it's a weird color adjustment/filter. How do you feel about it? I want to get knowledgeable. To me it feels like the image was supposed to look like the one on the left, but the right ones seem unnatural, but I don't know.
 
It's not HDR requiring more VRAM, it's FidelityFX LPM.

Hello dear sir, what do you think of Dolby Vision?

https://twitter.com/Riddimz4/status/1443607876346556421

These comparisons seem cool, but I feel like it's a weird color adjustment/filter. How do you feel about it? I want to get knowledgeable. To me it feels like the image was supposed to look like the one on the left, but the right ones seem unnatural, but I don't know.

That's not really Dolby Vision. Real DV is only for videos, and the one game that had some sort of native support, Andromeda, didn't really change much and was mostly a failed experiment. What's happening in those images is that the TV settings are getting changed; if you equalise those, DV will reveal no difference. In other words the tone-mapping algo doesn't really change, and that's the important part. That's what something like FidelityFX LPM does, but as stated above it is not free (performance-wise, at least for now; it used to have a cost on Pascal et al. too), unlike the default HDR presentation.
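For anyone wondering what the "tone-mapping algo" part means in practice, here's a minimal sketch of the classic extended Reinhard operator in Python. It's not what LPM or DV actually use (those are far more sophisticated); it's just the simplest illustration of compressing HDR values into a displayable range:

```python
import numpy as np

def reinhard_extended(luminance: np.ndarray, white_point: float = 8.0) -> np.ndarray:
    """Map linear HDR luminance into [0, 1] for display.

    Classic extended Reinhard: values at white_point map to exactly 1.0,
    and brighter highlights are compressed instead of hard-clipped.
    """
    return luminance * (1.0 + luminance / white_point**2) / (1.0 + luminance)

# Dark values barely move; an 8x "paper white" highlight lands at 1.0:
print(reinhard_extended(np.array([0.25, 1.0, 8.0])))
```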
 
VRAM doesn't affect HDR.

You can just run a fibre optic HDMI 2.1 cable to your TV from the PC.
I am only going by what Mr Humbugs is banging on about. I don't use HDR, so I've not looked into it, to be fair. The only time I used HDR in gaming was when I had my PS4 Pro last year.
 
Yeah, we'll see what it is like once proper comparisons etc. come out. As it is right now, the biggest problem Nvidia will have is their driver overhead issue, not VRAM, from the look of it.

Either way, if the HD texture pack does indeed "need" 11+GB of VRAM, hopefully it actually delivers a greater graphical difference/improvement than previous HD texture packs:

 
So much for 10GB VRAM being an issue again eh....... ;) :p :cry: Onto the next title please! :D

Looks like my original expectations were correct, although I was expecting AMD to have more of a lead when ray tracing isn't used. But alas, there's still time for that to change with future patches and drivers:


[benchmark charts]

Such a shame ray tracing is being held back so much in AMD-sponsored games. They should at the very least give us a choice of how intensive we want the ray tracing effects to be, for obvious reasons, but that will never happen. Hopefully Nvidia keep sponsoring more games to push the tech to its max.
 

Dis gona be gud.

Naysayers must be scouring the web looking for something to fit their narrative right now :D

Edit: Had a quick flick through the video. Yep, definitely another case of people getting sucked in by marketing claims that 10GB was not enough. How many times before people learn? I do not believe it until I see it. They said Watch Dogs Legion needed 11GB too, then changed it later on :p
 