RTX 3080 10GB vs RTX 3080 10GB (at half the RAM speed).
Would the halved RAM speed cause a game to run out of VRAM quicker than it would with the full-speed RAM?
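For what it's worth, memory speed and memory capacity are separate things: halving the clock halves bandwidth, but the card still has the same 10GB to fill, so it wouldn't run out any sooner, it would just feed the GPU more slowly. A back-of-the-envelope sketch (stock 3080 figures from the public spec; the half-speed case is purely hypothetical):

```python
# Back-of-the-envelope only. Bandwidth scales with memory clock; capacity doesn't.
# Stock RTX 3080 figures are from the public spec; the half-speed case is hypothetical.

def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps_per_pin: float) -> float:
    """Peak bandwidth in GB/s = (bus width in bits / 8) * per-pin data rate in Gbit/s."""
    return bus_width_bits / 8 * data_rate_gbps_per_pin

print(bandwidth_gb_s(320, 19.0))   # stock 3080: 320-bit GDDR6X @ 19 Gbps -> 760 GB/s
print(bandwidth_gb_s(320, 9.5))    # hypothetical half-speed memory       -> 380 GB/s
# Capacity is 10 GB in both cases, so textures overflow at exactly the same point;
# the slower card just streams them in and out more slowly.
```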
https://www.youtube.com/watch?v=_L_dAynbKbo&t=240s
Turns out tensor memory compression really works...
6800 XT vs 3090: the 3090 uses 10%-20% less VRAM (they promised 20%-40%, but it's never quite what they promise).
So an Nvidia card will hold up a lot better when it runs up against a VRAM limit.
Does this mean the 3080's 10GB can be said to be roughly equivalent to a hypothetical 6700 XT's 12GB?
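As a rough sanity check on that last question (assuming the 10%-20% saving quoted from the video holds, and applies uniformly across a game's whole allocation, which is a big assumption), the effective capacity works out like this:

```python
# Back-of-the-envelope: how much "uncompressed" data fits in 10 GB if the
# compression saving really is 10-20%. The savings figures are the ones quoted
# from the video above, not measurements of mine.
for savings in (0.10, 0.15, 0.20):
    effective_gb = 10 / (1 - savings)
    print(f"{savings:.0%} saving -> ~{effective_gb:.1f} GB effective capacity")
# 10% -> ~11.1 GB, 15% -> ~11.8 GB, 20% -> ~12.5 GB, i.e. roughly 12 GB territory
# at the upper end, but nowhere near 16 GB.
```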
The architecture is normally designed to cope with multiple cards at multiple performance levels, with multiple variants of the GPU and different memory capacities. It's all a big trade-off. You can see that AMD targeted 16GB, for example, which is more than 10GB, but they've also probably over-provisioned their cards: certainly the lower models, but probably the 6900 XT as well. That thing will never use 16GB while keeping playable frame rates; they could have got away with 12GB or maybe even 10GB, but odds are their architecture would only allow 8GB as the next lowest vRAM config, and that would certainly not be enough for the 6800 XT or 6900 XT. So you're paying more for memory you can't use. Does that make it an idiotic design decision? No, it's a trade-off of the architecture, and these kinds of trade-offs are made with every architecture in both camps.
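To put some numbers on the "next lowest config" point: capacity steps are dictated by bus width and chip density, since each GDDR6/GDDR6X chip hangs off a 32-bit channel and (as of 2020) ships in 1 GB or 2 GB densities. A minimal sketch (card names in the comments are just for orientation):

```python
# Which VRAM totals are even possible for a given bus width, assuming one chip
# per 32-bit channel and no clamshell/mixed-density tricks.
def possible_capacities_gb(bus_width_bits: int, chip_densities_gb=(1, 2)) -> list:
    chips = bus_width_bits // 32          # one memory chip per 32-bit channel
    return [chips * density for density in chip_densities_gb]

print(possible_capacities_gb(256))   # [8, 16]  -> e.g. the 6800 XT / 6900 XT bus
print(possible_capacities_gb(320))   # [10, 20] -> e.g. the 3080's bus
print(possible_capacities_gb(384))   # [12, 24] -> a full 384-bit bus
```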
Games like CoD Cold War have high-resolution texture packs which, when installed, push the install to 130GB on disk, and the game runs in 10GB of vRAM just fine even with ray tracing and all the effects on Ultra. In fact it's a good example of a modern game that is GPU/compute bottlenecked: you can't run it maxed out at 4K on a 6800 XT, for example, you get about 20fps, and even on the 3080 with its better RT performance you still only get about 40fps. Yet we're not exceeding 10GB of vRAM. This follows the trend established by most of the very newest AAA titles: the GPU bottlenecks before vRAM does.
What's interesting is we now have a 16GB VRAM GPU in the 6800 XT which has comparable performance to the 3080 at 4K.
We can now easily test the hypothesis that VRAM becomes the limit at 4K. Has anyone yet provided an example or benchmark where the 6800 XT vastly outperforms the 3080 at 4K due to a VRAM bottleneck?
If not, then it seems as if the two cards are going to become outdated at around the same time for 4K (which I personally think is the resolution where VRAM becomes an issue), and if this is the case, then... well, I just don't see it as much of a problem.
I can see, 24 months from now, the 3080 struggling in 2-5 titles due to a VRAM limitation which requires NVIDIA users to use the high rather than ultra texture pack.
In the same breath, I can see AMD users having to skip RT in (potentially a lot of) titles due to AMD's inferior RT architecture, which results in turning RT down from high to low.
So both cards are potentially going to have some form of visual compromise as they age (as you'd expect), but I think both will end up outdated at about the same time.
I wasn't totally sold on RT adoption, but seeing PS5 games, including first-party titles from Sony, include it, it's pretty clear it's a feature that will be widely adopted from this year onwards.
There's nothing so far that I've seen or tested which uses over 10GB of vRAM. I've tested every claim in this thread of a game needing more than 10GB, and none holds up when you measure the vRAM actually in use rather than merely allocated.
We don't even have to wait for games that will be GPU bottlenecked at 4K, we already have a load: FS2020, Avengers, Crysis Remastered, Watch Dogs Legion and even CoD Cold War are all unplayable at 4K Ultra right now (assuming no DLSS usage), yet none of them exceeds a 10GB vRAM budget. As always, I remain open-minded about testing new things.
Tangentially, the new MSI Afterburner beta has been updated and now enables per-process memory usage in the performance list by default, so there's no more messing about to get it working: grab the latest beta and you can enable it with one click. https://download-eu2.guru3d.com/afterburner/[Guru3D.com]-MSIAfterburnerSetup463Beta4Build15910.rar
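If anyone would rather script the measurement than use Afterburner, NVML exposes similar per-process numbers on Nvidia cards. A rough sketch assuming the pynvml package and a recent Nvidia driver are installed (on Windows/WDDM the per-process figure isn't always reported):

```python
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)          # first GPU in the system

# Whole-card view (this is the "allocated" style number most overlays show).
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"card: {mem.used / 2**30:.1f} / {mem.total / 2**30:.1f} GiB used")

# Per-process view, closer to what a single game is actually holding.
for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
    if proc.usedGpuMemory is not None:                 # may be unavailable on WDDM
        print(f"pid {proc.pid}: {proc.usedGpuMemory / 2**30:.2f} GiB")

pynvml.nvmlShutdown()
```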
None of those games are next gen though. Next gen means PS5 games made, designed and released after 2020, so from Q1 2021 we will start to see new games designed for PC and the next-gen consoles. What are the actual numbers for the games you mention? Is "under 10GB" 9GB?
And what if someone wants a locked 8K 30fps experience? If the console can do 4K 30fps modes, a few niche users will want 8K 30fps (a rough render-target calculation is sketched after this post). Some might want 4K 60fps on low, and others will want max textures with RT off in next-gen titles. 10GB does not inspire me with confidence to last 24 months. And saying CoD at max only runs at 20fps and never uses over 10GB totally skews the argument; who runs at max for 20fps? The settings are there so a user can pick and choose.
Next gen, someone will decide what they want: effects, textures, ray tracing. For 10GB to be workable, we need to see proof of a very good-looking game with the highest textures and see how much VRAM is being used. Hardly any of these games exist yet; nearly all current games were in development before the new consoles arrived. Cyberpunk will probably be the first next-gen game, and I have yet to see any VRAM numbers for it.
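As mentioned above, here is a rough render-target calculation showing why 8K leans on VRAM harder than 4K. The bytes-per-pixel figures and target counts are illustrative assumptions, not measurements from any specific game:

```python
# Render targets scale with pixel count, and 8K has 4x the pixels of 4K.
def target_mb(width: int, height: int, bytes_per_pixel: int) -> float:
    return width * height * bytes_per_pixel / 2**20

for name, w, h in (("4K", 3840, 2160), ("8K", 7680, 4320)):
    gbuffer = 6 * target_mb(w, h, 4)    # assume ~6 RGBA8-class G-buffer targets
    depth   = target_mb(w, h, 4)        # 32-bit depth/stencil
    hdr     = target_mb(w, h, 8)        # 16-bit-per-channel HDR colour buffer
    print(f"{name}: ~{gbuffer + depth + hdr:.0f} MB for the main targets alone")
# Roughly 285 MB at 4K vs 1.1+ GB at 8K with these assumptions. Textures (usually
# the bulk of VRAM) don't grow with output resolution, so 8K adds hundreds of MB
# up to a couple of GB on top, not a doubling of total use.
```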
I agree with this also, but think Cyberpunk is a bad game to use as an example as it is an Nvidia sponsored title and they would have made sure it worked well on the 3080 'flagship'.
It would be a PR disaster if it were breaching 10GB of VRAM.
The 3090s already can't do 8K30 without DLSS. Let that sink in.
Is that with or without Tensor compression?
I'm going to assume even with. I'm sure video games will progress to the point that a 3080 can't play them at ultra on everything, textures included. That is, to a degree, the natural evolution of GPUs, generation to generation, and definitely console generation to console generation. If anyone is expecting their 3080 to do 4K/60 at ultra in every single title from now until the end of the PS5's life span, they're going to be disappointed.
A) That doesn't exist.