
10GB vram enough for the 3080? Discuss..

The demand will be there from users with old cards, but whether they'll actually buy once they see the price tag is another question!
Ampere is still impressive even at prices above MSRP if you're holding an older GPU and looking at the performance charts.
I paid £700 for my 6700 XT, quite a bit above MSRP, but I'm totally happy with the upgrade.
Tbh the 6700XT isn't even great at msrp.
 
Yes, and remember that Cyberpunk is about as unfavourable as you can get for RDNA, and that's before you enable RT. If you disable RT, RDNA2 is pretty competitive in that game despite the extremely close ties between ISV and IHV.

RT performance comes down to the GPU hardware. Nvidia just has better RT hardware. With AMD there are no dedicated RT cores to speak of; the functionality is integrated into the TMUs.

Nvidia has an RT core which performs ray-box and ray-triangle intersection tests, but the RT cores cannot be used for anything other than these intersection calculations.

AMD's design is more efficient, Nvidia's more powerful.
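To make the intersection work concrete, here is a minimal CPU-side sketch of the ray-box "slab" test that RT hardware accelerates while walking a bounding-volume hierarchy; the structs and values are purely illustrative, not any real API.

Code:
#include <algorithm>
#include <cfloat>
#include <cstdio>

struct Ray  { float ox, oy, oz;  float dx, dy, dz; };            // origin and direction
struct AABB { float minx, miny, minz, maxx, maxy, maxz; };

// Classic slab test: intersect the ray against the three pairs of axis-aligned
// planes and check whether the resulting intervals overlap. An RT core does
// (roughly) this, many times per ray, while traversing a BVH.
bool rayIntersectsBox(const Ray& r, const AABB& b)
{
    float tmin = 0.0f, tmax = FLT_MAX;

    const float o[3]    = { r.ox, r.oy, r.oz };
    const float d[3]    = { r.dx, r.dy, r.dz };
    const float bmin[3] = { b.minx, b.miny, b.minz };
    const float bmax[3] = { b.maxx, b.maxy, b.maxz };

    for (int axis = 0; axis < 3; ++axis)
    {
        const float invD = 1.0f / d[axis];            // +/-inf when the ray is parallel is fine here
        float t0 = (bmin[axis] - o[axis]) * invD;
        float t1 = (bmax[axis] - o[axis]) * invD;
        if (invD < 0.0f) std::swap(t0, t1);

        tmin = std::max(tmin, t0);
        tmax = std::min(tmax, t1);
        if (tmax < tmin) return false;                // slabs don't overlap: miss
    }
    return true;                                      // overlap on all three axes: hit
}

int main()
{
    Ray  r{ -2.0f, 0.5f, 0.5f,  1.0f, 0.0f, 0.0f };  // shooting along +X
    AABB b{  0.0f, 0.0f, 0.0f,  1.0f, 1.0f, 1.0f };  // unit box at the origin
    std::printf("hit: %s\n", rayIntersectsBox(r, b) ? "yes" : "no");   // prints "hit: yes"
    return 0;
}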

4K, RT medium (all effects), Cyberpunk 2077 with DLSS 2.2.9 on Ultra Performance, on an RTX 2060. Faster than a 6900 XT.



AMD will have to up their RT performance for the RX 7000 series, but you have to realise RT is just a gimmick as it's implemented in DXR at the moment: with such a low number of rays per pixel you get nothing like 3ds Max (V-Ray) RT. For example, reflections are just a simple mirror on/off effect; where are the glossy reflections?
The 3080 should be good for about 1.5 to 2 years, but games will likely tax it after that.
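To put the "low rays per pixel" point in rough numbers, here is a back-of-the-envelope calculation; the per-pixel ray count and the offline samples-per-pixel figure are my own illustrative assumptions, not measured values.

Code:
#include <cstdio>

int main()
{
    const double pixels4k     = 3840.0 * 2160.0;  // ~8.3 million pixels
    const double fps          = 60.0;
    const double raysPerPixel = 1.0;              // assumed per-effect budget in current DXR titles
    const double offlineSpp   = 500.0;            // assumed samples per pixel for an offline render

    const double realtimeRays = pixels4k * fps * raysPerPixel;  // ~0.5 billion rays/s
    const double offlineRays  = pixels4k * fps * offlineSpp;    // what offline quality at 60 fps would need

    std::printf("real-time budget: %.2f Grays/s\n", realtimeRays / 1e9);
    std::printf("offline quality : %.0f Grays/s (~%.0fx more)\n",
                offlineRays / 1e9, offlineSpp / raysPerPixel);
    return 0;
}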

RT is such a gimmick that DX12 Ultimate was built to provide the features for DXR games? Major 3D engines like Unreal Engine 5 are built to provide an RT gaming experience, with Lumen being a form of software ray tracing and the engine itself also supporting hardware RT. Basically, RT is the standard with Unreal Engine 5. DXR/RT is so far from being a gimmick that calling it one harms your credibility.

AMD has to up their RT performance; Nvidia is rumoured to be doubling their cores and moving to 5nm. In true rumour-mill fashion: "Nvidia Ada Lovelace: next-gen graphics could be 71% more powerful than the RTX 3080." With the move to a smaller node, Nvidia is also said to be packing up to 18,432 CUDA cores into Ada. https://www.digitaltrends.com/computing/nvidia-ada-lovelace-next-gen-gpu-architecture/

Meanwhile the Radeon RX 7900 XT could cost as much as $2,000.


AMD could potentially use a multi-chip module (MCM), or chiplet, design borrowed from its Ryzen processors for RDNA 3. This change would allow the company to double the number of compute units, for a total of 160 compute units and 10,240 stream processors, in effect double the Radeon RX 6900 XT. AMD will likely have several configurations in the series, and a Navi 32 SKU could contain 120 to 140 compute units using the same MCM design.

If all else is the same, we can expect AMD's GPUs to ship with at least 12GB of VRAM. The company had previously claimed that 12GB of VRAM (https://www.digitaltrends.com/computing/amd-radeon-rx-6700-xt-12gb-gddr6-vram/) is the minimum requirement to be future-proof when playing AAA titles. https://www.digitaltrends.com/computing/amd-radeon-rx-7000-series-everything-you-need-to-know/

Wait until they come out; the games they have to run will dictate the winner.
 
RT as implemented in DirectX 12 is still primitive: with so few rays per pixel it is not much different to raster shading (just a gimmick for now).
RDNA 2, a 3080 Ti, 3090 or 3080 should be good for about 1.5 to 2 years with 10GB, but avoid 8GB cards if you want to play with max texture quality at 4K. The industry is shifting towards 4K gaming as the "next gen".
I think the industry will set the bar at 12GB for the max bells and whistles, so both 12GB and 16GB cards can run it, with a reduced texture quality setting for 8GB cards. It will still run just fine on lower-VRAM cards and consoles, but the trailer will show the max-eye-candy 4K version; that's just the way the sneaky marketing industry is! (Sort of like what happened with the Cyberpunk 2077 marketing!)
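To give a rough sense of why texture quality is the main VRAM lever here, this is a small, illustrative calculation for a single 4096x4096 texture, uncompressed vs BC7-compressed; the figures are generic, not tied to any particular game.

Code:
#include <cstdio>

int main()
{
    const double texels      = 4096.0 * 4096.0;  // one "4K" texture
    const double mipOverhead = 4.0 / 3.0;        // a full mip chain adds roughly a third

    const double rgba8MiB = texels * 4.0 * mipOverhead / (1024.0 * 1024.0);  // uncompressed, 4 bytes/texel
    const double bc7MiB   = texels * 1.0 * mipOverhead / (1024.0 * 1024.0);  // BC7 compressed, 1 byte/texel

    std::printf("4096x4096 RGBA8 + mips: ~%.0f MiB\n", rgba8MiB);  // ~85 MiB
    std::printf("4096x4096 BC7   + mips: ~%.0f MiB\n", bc7MiB);    // ~21 MiB
    return 0;
}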

Both Quake 2 RTX and Minecraft RTX perform very well as full path tracers. Metro Exodus Enhanced again shows what is possible with limited RT use. CP2077 wraps ray-traced GI, shadows, AO and reflections around legacy rasterisation and again performs very well. I've just finished playing Control, which adds a good amount of physics destruction on top of a beautifully ray-traced environment and again performs very well. There has to come a time when people realise it's not just a gimmick.

1.5 to 2 years should give us direct I/O streaming data straight into VRAM, with the result that less VRAM is required. Engines will adapt from loading a level to loading a chunk. I really can't see requirements rising until 4K becomes the norm and 8K not unusual. Even then, don't expect a huge jump, as development is already going into AI texture reconstruction to ease transmission and storage. Eventually this will improve cloud gaming as well.
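As a toy illustration of that level-to-chunk shift (not how any particular engine or DirectStorage actually works; the class, grid and sizes here are made up), a streamer can keep only the chunks around the player resident and evict everything else as they move:

Code:
#include <cstdio>
#include <map>
#include <utility>
#include <vector>

struct Chunk { std::vector<unsigned char> data; };       // stand-in for textures/meshes

class ChunkStreamer {
public:
    explicit ChunkStreamer(int radius) : radius_(radius) {}

    // Call whenever the player's grid cell changes.
    void update(int px, int py) {
        std::map<std::pair<int, int>, Chunk> next;
        for (int y = py - radius_; y <= py + radius_; ++y)
            for (int x = px - radius_; x <= px + radius_; ++x) {
                auto it = resident_.find({x, y});
                next[{x, y}] = (it != resident_.end()) ? std::move(it->second)
                                                       : loadChunk(x, y);
            }
        resident_ = std::move(next);                      // anything not kept is evicted
    }

    std::size_t residentCount() const { return resident_.size(); }

private:
    static Chunk loadChunk(int x, int y) {                // pretend disk / direct-I/O read
        std::printf("streaming in chunk (%d, %d)\n", x, y);
        return Chunk{std::vector<unsigned char>(1024)};   // pretend 1 KiB of assets
    }

    int radius_;
    std::map<std::pair<int, int>, Chunk> resident_;
};

int main() {
    ChunkStreamer streamer(1);   // keep a 3x3 window of chunks resident
    streamer.update(0, 0);       // initial load: 9 chunks stream in
    streamer.update(1, 0);       // player moved one cell: 3 new chunks, 3 evicted
    std::printf("resident chunks: %zu\n", streamer.residentCount());
    return 0;
}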

I don't understand your dislike for CP2077, especially as you claim to have worked as a 3D programmer for 20+ years. New tech should excite, while creating a challenge not seen for some time in 3D graphics. I'm sure I watched Hardware Unboxed describe it as one of the best looking titles, and they have been pretty negative on RT.
 
Tbh the 6700XT isn't even great at msrp.

Yep, exactly right; it's about 2080 Super performance, but it lets me use 8K textures in my programming, fill VRAM and test things out, and it's okay for 4K gaming if you can accept 35+ fps.
I'm not dropping 1 or 2k on a GPU until Microsoft implements proper ray tracing in DXR, so I'll wait with a lower-spec card until real RT appears.
 
DXR/RT is so far from being a gimmick that calling it one harms your credibility.

Yes, RT is an industry standard, but if you look at scenes rendered with 3ds Max it's nothing like that; it's a poor man's version!
I had a look at CP2077 and the cars look like they're made of shiny chrome and the streets like shiny plastic! As for Watch Dogs Legion, the puddles look like mirrors stuck to the road!
I don't work in the industry; I only program interesting projects for myself, but I talk to programmers and artists in the industry and we solve game programming issues etc. I did well when I was a trader so I don't have to work at all; I just enjoy 3D programming and rendering.
We need much faster RT GPUs before MS will implement 3ds Max-style RT, and that's about 5 years away at the least.
 
I don't understand your dislike for CP2077, especially as you claim to have worked as a 3D programmer for 20+ years.

CP2077 looks good enough with the coloured lighting and so on, but you can only do so much with mirror effects. Cars should not look like shiny chrome. The low-rays-per-pixel versions of RT GI, shadows, AO and reflections don't look much different to what we can do with raster (that's why people see it as a gimmick); real RT as in 3ds Max looks a LOT better than game RT.
Direct IO sounds promising.
 
There is one game I had to turn the texture quality down on.

RE3.

But I don't read too much into it, as it was a remake of a 1999 game.
You only did that because the broken in-game menu said it needed more VRAM. It was the same in RE2, and that ran fine even on 8GB maxed out, as I recall.

I played RE3 with everything maxed out and had no issues with 10GB. Go install it and see for yourself.

In conclusion, that game runs fine maxed out with 10GB at 4K, never mind 1440p ;)
 
Honestly, there are one or two individuals in this thread who are absolutely trolling while not providing a single example of an existing game that the 3080 is struggling with regarding VRAM requirements.
Then you have people claiming to be devs when they have zero idea what RT is; you can smell the BS from a mile away.
 
There are signs that 8GB is maybe not enough in Doom Eternal. This reviewer states, "I don't claim performance is all about VRAM - at 1440p I believe it is why the RDNA 2 GPUs are faster than the 3060 Ti/3070. I also show Ampere doing best, relatively speaking, at 4K which is likely a result of those architectural features you've highlighted above" https://youtu.be/bPezogcMGgA

In Doom Eternal at 4K with RT on Vulkan, the 6900 XT 16GB is slower than the 3070 8GB. At 1440p the 3070 falls behind the 6800 XT and the 6900 XT. At 1080p the 3070 beats both the 6800 XT and 6900 XT.

Doom Eternal is interesting because it asks for 11GB of VRAM for 4K with RT, yet the benchmarks show the 3080 far ahead of the 6900 XT at all resolutions.

Same in Cyberpunk 2077: the 6900 XT on medium RT can't match the 3060 Ti. https://youtu.be/2IoyeNqt1qI?t=226 3:46

A 6900 XT is the same as an RTX 3060 Ti at RT medium, 1440p. https://youtu.be/2IoyeNqt1qI?t=409 6:49

From the review, "I wasn’t too critical of the RX 6800 XT for its ray tracing performance, but for a £900 GPU, the RX 6900 XT is not that much faster than the £369 RTX 3060 Ti in ray traced workloads. I don’t think the RX 6900 XT is fast enough to justify its asking price, when the RTX 3080 is snapping at its heels (and outpacing it in ray traced workloads) for £250 less cash." https://www.kitguru.net/components/graphic-cards/dominic-moass/amd-rx-6900-xt-review/33/

So for RT games at 4K, the extra 6GB of VRAM over the RTX 3080 is useless. There just isn't the level of performance needed from the 6800 XT or 6900 XT to run an RT game at 4K; in fact 1080p, or maybe 1440p, would be the most likely resolution. You can see the performance issue in 3DMark, where the 6900 XT can't beat the 2080 Ti in the DXR benchmark.

You don't need very high resolution textures at 1080p. I can tell you that 29.21fps https://www.kitguru.net/components/graphic-cards/dominic-moass/amd-rx-6900-xt-review/5/ in an RT game at 1440p is not a great experience.
 
Yep, visuals look better in 4K; I recorded in 1080p, so they're a bit simple. Point lights, directional lights, HBAO, TAA, GI etc., mostly maxed out.
Doom gives you a better view using its console variables, but it's just one game, and the VRAM meter in most games' graphics menus is a bit of a guessing game.
I would say the biggest users of VRAM are: changing the screen mode from, say, 1080p to 4K; lots of 4K and 2K textures; and some shader effects like supersampling AA. Reflections and geometry can use a fair bit too, though geometry is still being kept low by software houses for now, with a high-res texture skin on top. Still, 2D textures are going to use most of the VRAM.
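As a rough illustration of the resolution part, here is a quick calculation with an assumed, fairly typical set of render targets (about 20 bytes per pixel; real games vary a lot):

Code:
#include <cstdio>

int main()
{
    // Assumed, illustrative per-pixel cost: HDR colour (RGBA16F, 8 bytes),
    // depth (4 bytes) and two 4-byte auxiliary/post targets = 20 bytes/pixel.
    const double bytesPerPixel = 8.0 + 4.0 + 4.0 + 4.0;

    const double pix1080p = 1920.0 * 1080.0;
    const double pix4k    = 3840.0 * 2160.0;

    std::printf("render targets @ 1080p: ~%.0f MiB\n", pix1080p * bytesPerPixel / (1024.0 * 1024.0)); // ~40 MiB
    std::printf("render targets @ 4K   : ~%.0f MiB\n", pix4k    * bytesPerPixel / (1024.0 * 1024.0)); // ~158 MiB
    return 0;
}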

It is very simple... it just obviously lacks the kind of shadow and lighting detail that people expect from modern games. There are only very basic shadows, I can't see any GI (no surfaces picking up colours from nearby objects), there's definitely no self-shadowing on most of the objects, like buildings, and there's obviously no SSAO or ambient occlusion of any kind. That's why it looks so flat and uniformly lit. I'd need to know the game and probably test it myself to see what is going on and what settings are really used. It's not a game I've ever seen before; it looks like some amateur thing, no offence to whoever made it, it's just not really representative of a modern production-ready game. It doesn't have much to do with screen resolution either; the primary culprit in why it kind of looks like one of those asset-flip games is the total lack of any real, proper lighting and shadows.
 
CP2077 looks good enough with the coloured lighting and so on, but you can only do so much with mirror effects. Cars should not look like shiny chrome. The low-rays-per-pixel versions of RT GI, shadows, AO and reflections don't look much different to what we can do with raster (that's why people see it as a gimmick); real RT as in 3ds Max looks a LOT better than game RT.
Direct IO sounds promising.

Cyberpunk 2077 is a hybrid RT game; most of what you see is raster. In Cyberpunk 2077, most of the RT comes via the sun and is used for GI, then reflections.

An AAA game that uses path tracing is Metro Exodus Enhanced Edition, with talk of infinite bounces for the lighting etc. Every RT game makes some compromises for performance reasons.
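To show where those compromises live, here is a heavily simplified, purely illustrative path-tracer loop (not CDPR's or 4A's code; the scene-query and sampling functions are stubs). The samples-per-pixel and bounce counts are exactly the knobs a hybrid game turns way down and an offline renderer turns way up.

Code:
#include <cstdio>
#include <random>

// Illustrative stand-ins so the skeleton compiles; a real renderer would
// trace rays against the scene's BVH and importance-sample materials here.
struct Vec3 { float x = 0, y = 0, z = 0; };
struct Hit  { bool valid = false; Vec3 emitted, albedo, point, normal; };

Hit  intersectScene(const Vec3& /*origin*/, const Vec3& /*dir*/) { return {}; } // stub
Vec3 sampleHemisphere(const Vec3& n, std::mt19937& /*rng*/)      { return n; }  // stub

// Follow one camera ray for at most maxBounces bounces. Offline renderers run
// this with hundreds of samples per pixel and many bounces; hybrid games use
// roughly one ray per pixel per effect and 1-2 bounces, then denoise the result.
Vec3 tracePath(Vec3 origin, Vec3 dir, int maxBounces, std::mt19937& rng)
{
    Vec3 radiance;                 // light accumulated along the path
    Vec3 throughput{1, 1, 1};      // how much each further bounce can still contribute

    for (int bounce = 0; bounce < maxBounces; ++bounce)
    {
        const Hit hit = intersectScene(origin, dir);
        if (!hit.valid) break;                           // ray escaped the scene

        radiance.x += throughput.x * hit.emitted.x;      // pick up emitted light
        radiance.y += throughput.y * hit.emitted.y;
        radiance.z += throughput.z * hit.emitted.z;

        throughput.x *= hit.albedo.x;                    // attenuate by the surface colour
        throughput.y *= hit.albedo.y;
        throughput.z *= hit.albedo.z;

        origin = hit.point;                              // bounce: continue from the hit point
        dir    = sampleHemisphere(hit.normal, rng);
    }
    return radiance;
}

int main()
{
    std::mt19937 rng{42};
    const int maxBounces = 2;      // hybrid-RT-like budget; an offline render might use 8+
    const Vec3 c = tracePath({0, 0, 0}, {0, 0, -1}, maxBounces, rng);
    std::printf("radiance: %.2f %.2f %.2f\n", c.x, c.y, c.z);
    return 0;
}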
 
Honestly, there are one or two individuals in this thread who are absolutely trolling while not providing a single example of an existing game that the 3080 is struggling with regarding VRAM requirements.
Then you have people claiming to be devs when they have zero idea what RT is; you can smell the BS from a mile away.
Probably because this is a strawman made by people intentionally misconstruing the other side's argument.

But this has been pointed out before and ignored, so go ahead and ignore it again.
 
Cyberpunk 2077 is a hybrid RT game; most of what you see is raster. In Cyberpunk 2077, most of the RT comes via the sun and is used for GI, then reflections.

An AAA game that uses path tracing is Metro Exodus Enhanced Edition, with talk of infinite bounces for the lighting etc. Every RT game makes some compromises for performance reasons.

If you did that RT GI in raster it would look just about the same; that's the point. We need proper high-rays-per-pixel GI, reflections and shadows, so it looks like this!

Plant_Models.bmp


It took quite a few seconds in 3ds Max to render this (I can't remember if I used GPU or CPU), but the 3090 would have no chance of doing this in real time, so maybe 3nm chips in 5 years or so?
If MS implements this quality level of RT in DirectX then I will gladly drop £2K or more on a GPU, even to play at 1080p, and I hate dropping down to 1080p from 4K.
Basically, today's RT effects do not look much better than raster because of the low-quality RT used; today's game RT does not have the "WOW" factor, and we need much faster GPUs to run quality RT.
Hey, I remember the ZX Spectrum and other models from when I was a kid; I must have been just a few years old. Yes, 64K and 128K of memory then!
 
Honestly, there are one or two individuals in this thread who are absolutely trolling while not providing a single example of an existing game that the 3080 is struggling with regarding VRAM requirements.
Then you have people claiming to be devs when they have zero idea what RT is; you can smell the BS from a mile away.
Yeah, and it seems the people complaining either don't actually have the card, or had to pay out a lot more for an AMD card or a 3090 and are now trying to justify their purchase.
 
If you did that RT GI in raster it would look just about the same; that's the point. We need proper high-rays-per-pixel GI, reflections and shadows, so it looks like this!

You don't know what you are talking about.

 