
10GB VRAM enough for the 3080? Discuss..

Say what you want about Cyberpunk's performance, but it does not look terrible. That's just a flat-out lie, regardless of what you think of the game. With ray tracing on ultra at 4K and everything maxed, it is stunning to look at.
Compare it to Red Dead Redemption 2 from years ago. Yes, RTX gives it an edge, but for the performance cost, the game is ugly. It reminds me of Skyrim in both performance and looks (dated). If you don't agree, that's your opinion and that's fine, but mine is that it looks terrible for 2021.
 
I swear my expectations of video game graphics must be out of whack compared to everyone else's, because I just don't see it. I agree it isn't terrible, but I don't see it as anything to write home about.
 
I was cheering you on from the sidelines until you falsely claimed this. :p
:cry:.. Dang and I was trying hard not to annoy the AMD fans...

But you know what I mean: Nvidia titles are done the same way too, they favour their own cards. Sadly that's just a fact of life with GPUs and titles sponsored by the GPU makers. Both are as bad as each other when it comes to their sponsored titles.
 
Hehe, it's okay. Let's just remember that you won't find any AMD-sponsored titles containing code, effects, etc. that hurt Nvidia performance. We can all agree the same cannot be said of the other side, as we have seen numerous times before.

All you see is a game well optimised for RDNA; close collaboration helps, but it does not harm the competitor.
 
I think some people are just mad that Nvidia, aside from specific RT games, cannot make any game run faster on their GPUs.

Even at their best attempt, it's a 3-5% lead at most.

Yeah, RT can change things, but in terms of pure raster performance, I never see Nvidia-sponsored games running this much faster on Ampere. Even if they wanted to, they can't, because that's the limitation of the design they chose to go with.

Even Cyberpunk without RT runs faster on RDNA 2, which suggests Nvidia have lost their touch and now rely on RT performance and specific RT code paths to look better against the competition.

It's impossible to race with RDNA 2 cards when they clock at 2.5GHz and all you have are slower 1.9GHz compute-heavy cards.

The RTX 3070 can actually beat the 6700 XT in Valhalla, that's not a joke, but that only happens when both cards are below 40fps at 4K...

It's clear that Ampere has real scaling issues at lower resolutions. Even if they somehow fix that and bolster performance there, it will only be in the specific games they choose, if they can at all (so far they haven't).

This shows that Ubisoft or AMD can find ways to make a game run much faster and more efficiently on RDNA 2 GPUs at lower resolutions, but neither developers nor Nvidia have achieved a similar effect in any raster title yet.
 
Jensen announced 'NVIDIA GeForce RTX 3070 Faster Than RTX 2080 Ti at $499' in his kitchen.
He also said the 3080 is 2x as fast as a 2080. Jensen is a salesman, and as with all salesmen, it's best not to believe everything they say.
It could be a sign that the 3070 is unbalanced. You want your card to run out of fps grunt before being limited by VRAM; Doom suggests that VRAM could be the limiting factor before fps is.

3070 cards are 4K-capable cards, but I don't think we have hit a true 4K card yet.
Whether a card runs out of VRAM or rasterisation grunt, the answer is always the same: turn down settings. You could have 56GB of VRAM on a 3070 and it's still going to struggle for fps in the most demanding titles at 4K, and will need settings lowered to hit 60fps.
 
Nvidia is essentially a node behind AMD and still matches them in rasterisation, which just shows how good Nvidia would be if they were on TSMC 7nm. In fact, Samsung 8nm isn't much better than the TSMC 12nm that Turing was on when you look at power-to-performance scaling.
 
Remember that Cyberpunk is about as unfavourable a title as you can get for RDNA, and that's before you enable RT. With RT disabled, RDNA2 is still pretty competitive in that game despite the extremely close ties between ISV and IHV.
 
Maxed out at 4K with DLSS 2.0 on Quality, it does look very good. Pity the gameplay is the opposite.

GTA V with mods also looks great;)
 
We know what VRAM is used for; well, some of us do.

You seem to be ignoring the fact that most GPUs have 10GB or less and that consoles will target 10GB, yet you claim people should get the card with the most VRAM. I'm guessing by that you mean RDNA2, which is already a lost cause due to poor ray tracing performance. This makes no sense at all.

What size of market would a title requiring 16GB of VRAM reach compared to one that targets 8 and 10GB?

RT is still primitive as implemented in DirectX 12 at the moment, with so few rays per pixel that it's not much different to raster shading (just a gimmick for now).
RDNA 2, 3080 Ti, 3090 or 3080 should all be good for about 1.5 to 2 years with 10GB, but avoid 8GB cards if you want to play with max texture quality at 4K. The industry is shifting towards 4K gaming as the 'next gen' standard.
I think the industry will set the bar at 12GB for the maximum bells and whistles, so both 12GB and 16GB cards can run it, with a reduced texture quality setting for 8GB cards. It will still run just fine on lower-VRAM cards and consoles, but the trailer will show the maximum eye-candy 4K version; that's just the way the sneaky marketing industry works! (Sort of like what happened with the Cyberpunk 2077 marketing!)
 
I suggest anyone using Doom Eternal as a benchmark for anything look into the texture pool size setting, swap between High and Ultra Nightmare, and see if they can actually spot differences in game. During all my testing I couldn't, even with identical side-by-side screenshots and pixel peeping. Most of the people writing optimised-settings guides for the game recommend the same; there's no benefit above High.

Under the hood this setting just controls an engine variable called is_poolsize, which is the number of MB to set the texture pool size to. Past a certain value it doesn't improve quality, and I'd argue that limit is High. Beyond that you're reserving more memory but not doing anything helpful. You could type is_poolsize 20480 into the console to assign 20GB of VRAM and make the game unplayable on any video card, but it's not actually giving you any benefit.

Don't trust me though, just test for yourself. If you're on an 8GB card and you're hitting VRAM limits, then lowering this value to High won't impact visuals but will make the game playable. Just keep in mind that I believe High maps to a 2GB pool size and Ultra Nightmare to 4.5GB, so you can see how much of an insane leap that is.
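
A quick back-of-the-envelope sketch of what that means for headroom (Python, purely illustrative; the pool sizes are the rough figures quoted above, and the "everything else" allowance is just a guess):

```python
# Rough headroom estimate for Doom Eternal's texture pool presets (is_poolsize).
# Pool sizes are the approximate values quoted above; not measured or verified here.
POOL_SIZE_MB = {
    "High": 2048,             # ~2GB, per the post above
    "Ultra Nightmare": 4608,  # ~4.5GB, per the post above
}

def headroom_gb(card_vram_gb, preset, other_usage_gb=3.5):
    """VRAM left after the texture pool and a guessed allowance for
    everything else (render targets, geometry, driver overhead)."""
    return card_vram_gb - POOL_SIZE_MB[preset] / 1024 - other_usage_gb

for preset in POOL_SIZE_MB:
    print(f"8GB card, {preset}: ~{headroom_gb(8, preset):.1f} GB to spare")
# The jump from High to Ultra Nightmare reserves ~2.5GB more for no visible gain.
```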
 
What game is this? It looks like some weird Unity asset-flip game you find on the Steam store for £2. If you look at the visuals, they're awful; there are obviously no advanced rendering effects being used at all. There are shadows on some objects, but it's almost completely uniformly lit. What settings are being used here in terms of lighting, filtering, anti-aliasing, global illumination, dynamic lighting, etc.?

Way back near the start of the 8/10GB threads, someone posted results of memory used in (I think) Doom Eternal, based on a tool that can hook the game's exe and inspect the entire rendering pipeline, what textures are in memory and stuff like that. I downloaded it back when testing the Resident Evil memory usage claims, but I must have uninstalled it and I can't for the life of me remember what it's called. I want to grab it again and just run through some games to make a more detailed comparison of memory used on textures, because something doesn't add up here: many of these AAA games are getting away with relatively small texture pools relative to total memory usage, and I just want to test whether that's accurate when setting these via engine vars.

If someone can remember the name, then PM or @ me please.

Yep, the visuals look better in 4K; I recorded in 1080p, so they look a bit simple. Point lights, directional lights, HBAO, TAA, GI etc. were mostly maxed out.
Doom gives you a better view using the engine variables, but it's just one game, and the VRAM meter in most games' graphics menus is a bit of a guessing game.
I would say the biggest users of VRAM are changing the output resolution from, say, 1080p to 4K, lots of 4K and 2K textures, and some shader effects like supersampling AA. Reflections and geometry can use a fair bit too; geometry is still being kept low-poly by software houses for now, with a high-res texture skin over it. Still, 2D textures are going to use most of the VRAM.
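
To put rough numbers on that, here's a small illustrative sketch (Python); the bytes-per-pixel and mip-chain figures are standard ballpark values, not measurements from any particular game:

```python
# Back-of-the-envelope texture memory, to show why lots of 4K/2K textures dominate VRAM.
# Assumed formats: uncompressed RGBA8 = 4 bytes/pixel, BC7-compressed = 1 byte/pixel.
def texture_mb(width, height, bytes_per_pixel, mipmaps=True):
    size = width * height * bytes_per_pixel
    if mipmaps:
        size *= 4 / 3  # a full mip chain adds roughly one third
    return size / (1024 ** 2)

print(f"4K RGBA8: {texture_mb(4096, 4096, 4):.0f} MB")  # ~85 MB
print(f"4K BC7:   {texture_mb(4096, 4096, 1):.0f} MB")  # ~21 MB
print(f"2K BC7:   {texture_mb(2048, 2048, 1):.0f} MB")  # ~5 MB
# A few hundred compressed 4K textures resident at once is already several GB,
# before render targets, geometry and everything else are counted.
```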
 
I doubt it, as Nvidia will have to compete with all the cheap used mining cards that will flood the market next year.

My trading instinct tells me that even if most of these mining cards end up in the hands of gamers, demand for GPUs will still be off the charts for about three years.
There are millions of GeForce gamers out there wanting to upgrade, and lots of people who weren't impressed by the 2060/70/80 and skipped that generation.
 
A 3080 or a 6900 XT won't have the firepower to run at 4K in future. Take Metro Exodus Enhanced Edition: a 6900 XT can't run that at 4K. In Cyberpunk 2077, a 6900 XT can't run at 4K with RT on.

[Benchmark image: the RX 6900 XT unable to handle Cyberpunk 2077's ray tracing mode]


I am playing the start of Cyberpunk 2077 right now at 4K on a 2060 with DLSS 2.2.9 on Ultra Performance. I expect at some point I will have to lower the resolution, but its performance is better than a 6900 XT's at 4K. The RTX 2060 has 6GB of VRAM.
What will happen when even more demands are made on current GPUs? The 6900 XT is not a 4K card: it has 16GB of VRAM but lacks the performance. If the future is AMD tech and console-style games, then AMD could do 4K, but if future games are heavy on RT, then the 6900 XT is not a 4K card.
Basically, Unreal Engine 5 games will run with Lumen and Nanite, and so will most PC GPUs. Top-end PC games are going to add RT on top of that. Very few cards are going to run future full-RT games at a high resolution like 4K.

If we're talking about a future of Cyberpunk 2077-type games, then native 4K is not possible on a 6900 XT. If we're talking about games like Dirt 5 or Godfall, then you can run at 4K on a 6900 XT. VRAM does not appear to be the main problem with RT here.
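
For context on running "4K" with DLSS Ultra Performance on a 2060, here's a rough illustration (Python) of the internal render resolution per DLSS mode; the per-axis scale factors are the commonly cited approximations, not official figures:

```python
# Internal render resolution for a 3840x2160 output at each DLSS mode.
# Scale factors are commonly cited per-axis ratios (approximate, not from NVIDIA docs).
DLSS_SCALE = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5, "Ultra Performance": 1 / 3}

out_w, out_h = 3840, 2160
for mode, s in DLSS_SCALE.items():
    print(f"{mode:>17}: {round(out_w * s)}x{round(out_h * s)}")
# Ultra Performance: 1280x720 - the GPU is really shading ~720p, then upscaling to 4K.
```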

So what do AMD think?



List of games benchmarked with the RTX 3080:
  • Marvel’s Avengers
  • Cyberpunk 2077
  • Watch Dogs Legion
  • Call of Duty Warzone
  • Crysis Remastered
  • Project CARS 3
  • Horizon Zero Dawn
  • Red Dead Redemption 2
  • Quantum Break
  • Borderlands 3
  • CONTROL
  • Assetto Corsa Competizione
  • Kingdom Come Deliverance
  • Ghost Recon Wildlands
  • Anthem
  • Metro Exodus
  • Assassin’s Creed Valhalla
  • Immortals Fenyx Rising
  • Godfall
Call of Duty Warzone (with ray tracing enabled) pushed VRAM usage to 9.3GB.
Watch Dogs Legion required 9GB at 4K/Ultra (without ray tracing).
Marvel's Avengers used 8.8GB in the scene where the RTX 2080 Ti was using more than 10GB of VRAM.
Crysis Remastered used 8.7GB of VRAM.
Quantum Break used 6GB of VRAM. All the other games were using between 4 and 8.5GB of VRAM.
Is there a game that can't run at 4K maximum settings on a 3080?
https://www.dsogaming.com/articles/are-10gb-of-gddr6x-of-vram-enough-for-4k-ultra-gaming/
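
A quick sanity check of those quoted figures against the 3080's 10GB (Python, using only the numbers above; note that reported usage is often allocation rather than what the game strictly needs):

```python
# Reported VRAM usage at 4K Ultra (from the dsogaming article linked above) vs a 10GB budget.
REPORTED_GB = {
    "Call of Duty Warzone (RT on)": 9.3,
    "Watch Dogs Legion (no RT)": 9.0,
    "Marvel's Avengers": 8.8,
    "Crysis Remastered": 8.7,
    "Quantum Break": 6.0,
}

BUDGET_GB = 10.0
for game, used in sorted(REPORTED_GB.items(), key=lambda kv: -kv[1]):
    print(f"{game}: {used:.1f} GB reported, {BUDGET_GB - used:.1f} GB headroom")
# Even the worst case in that list leaves ~0.7GB spare on a 10GB card.
```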

AMD will have to up their RT performance for the RX 7000 series, but you have to realise RT is just a gimmick (rubbish!) as it's implemented in DXR at the moment, with such a low number of rays per pixel that you get nothing like 3ds Max (V-Ray) RT. For example, reflections are just a simple mirror on/off effect; where are the glossy reflections?
The 3080 should be good for about 1.5 to 2 years, but games will likely tax it after that.
 
Ampere is only "impressive" at MSRP and even then, only on some of the lineup.

Demand for cards like a $1500 3070 should *not* be high.

The demand will be there from users with an old card, but whether they buy when they see the price tag is another question!
Ampere is still impressive well above MSRP if you're holding an older GPU and looking at the performance charts.
I paid £700 for my 6700 XT, quite a bit above MSRP, but I'm totally happy with the upgrade.
 