
10GB VRAM enough for the 3080? Discuss..

Pardon.

What possible reason could you have for saying that? All modern games out right now run fine in 4k on the 3080. I have a 3080 and a 4k monitor and I play all of my games in 4k, with no exceptions.
His reason probably is to put down people who cannot afford or justify buying a 3090 and make himself feel good.

If you didn't know better, you would think the 3090 has a huge performance lead over the 3080. Nope, 15-20%. Is that the difference in performance needed to go from 1080p to 2160p these days? :p
 

Yeah, well not only is the 3090 really not significantly faster than the 3080, but the 3080 obliterates most games in 4k in the first place. I don't understand, what games are people playing that don't run at 4k with a 3080?
 


Indeed, the 3080 is a great card for 4k. The 3090 is a single-digit fps increase at 4k for double the money.

The reason people still don't get 4k is that they look at the apples-to-apples comparisons from reviewers who turn ALL settings to max, when in fact many game settings are there to enhance lower resolutions, so you turn those off. People who don't know what they're doing just buy the max and turn it to max and think that's the top end of PC gaming. Thus a well set up 3080 at 4k using the correct video settings will outperform a 3090 where someone just turns all settings to max, as they waste that extra 15% of performance rendering settings designed to enhance lower resolutions.

Some people are easily parted from their hard-earned cash due to lack of understanding. If the 3090 was WORTH it then I'd have bought one. 15% more performance for 100% more money. Er, nah.
 

It's funny because a lot of 3090 owners only have the card due to shortages of the 3080, which is obviously the better value.

If you can afford it then why not, especially with pricing and supply getting increasingly worse. Some definitely feel they have to justify their purchase, and some are just trolling.

I’m loving my 3080 for high fps 1440p gaming. VRAM usage hasn’t been an issue.
 
The reason people still don't get 4k is that they look at the apples-to-apples comparisons from reviewers who turn ALL settings to max, when in fact many game settings are there to enhance lower resolutions, so you turn those off. People who don't know what they're doing just buy the max and turn it to max and think that's the top end of PC gaming. Thus a well set up 3080 at 4k using the correct video settings will outperform a 3090 where someone just turns all settings to max, as they waste that extra 15% of performance rendering settings designed to enhance lower resolutions.

I'm not even sure this is true though? I mean, what benchmarks are being run that rule out 4k on a 3080? Certainly any games not running 4k well are exceptions rather than the rule, and generally for specific reasons. For example, I know FS2020 is one of the worst of the bunch, but MS actually have a reputation for releasing games in that franchise with future-proofed graphics, because they generally intend the game to last many years into the future, profiting primarily off DLC.

I do get the point that some visual settings are there to offset otherwise low resolutions; you don't need 8xMSAA at 4k, for example, it's just overkill. I've been replaying the Assassin's Creed games and you can use 8xMSAA at 4k in AC II and it has literally no visual benefit over 4xMSAA; it's just too hard to actually make out the differences. Going into the menu and setting 4k resolution plus 8xMSAA just felt wrong in the moment. As a gamer there was something intrinsically kinda gross about how overkill that is, but it runs at 60Hz just fine at about 30% GPU usage, which is hilarious.
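Rough back-of-the-envelope numbers on that, just for a sense of scale (this assumes plain, uncompressed render targets with 4 bytes of colour and 4 bytes of depth per sample; real GPUs compress MSAA surfaces, so actual usage is lower):

```python
# Rough, uncompressed MSAA render-target cost at 4k:
# 4 bytes per colour sample + 4 bytes per depth/stencil sample.
def msaa_target_mb(width, height, samples):
    return width * height * samples * (4 + 4) / (1024 ** 2)

for samples in (2, 4, 8):
    print(f"4k, {samples}x MSAA: ~{msaa_target_mb(3840, 2160, samples):.0f} MB")
# ~127 MB at 2x, ~253 MB at 4x, ~506 MB at 8x: doubling the sample count doubles
# the memory and bandwidth cost for edge quality you can barely see at 4k.
```

Not a dealbreaker next to 10GB, but it shows why 8x is pure overkill at that pixel density.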

It's funny because a lot of 3090 owners only have the card due to shortages of the 3080, which is obviously the better value.

If you can afford it then why not, especially with pricing and supply getting increasingly worse. Some definitely feel they have to justify their purchase, and some are just trolling.

I’m loving my 3080 for high fps 1440p gaming. VRAM usage hasn’t been an issue.

Yeah, the price delta enforced with these cards is often highly non-linear with respect to their performance; you pay over the odds to get the last few % of performance. And oddly enough the scalping has actually worked in a way to correct that imbalance. But I think the 3090 is definitely a red herring in this discussion; not sure what it has to do with the capabilities of the 3080 as a 4k card. The 3080 handles 4k just fine. It seems to me that some people expect the memory demands of games at 4k to be steep vs 1080p, and they're really not. The frame buffer is still under 100MB at 4k, and even if you add in AA it doesn't jump much. It's definitely higher, but it's not like you need the 24GB of a 3090 to run 4k compared to the 10GB of the 3080. All games at 4k right now on a 3080 happily run under the 10GB limit, when measuring the actual usage rather than the allocation.
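If anyone wants to sanity-check the frame buffer claim, the arithmetic is simple (a sketch assuming 4 bytes per pixel for colour and depth and a double-buffered swap chain; actual VRAM usage is dominated by textures and other render targets, not the swap chain):

```python
# Back-of-the-envelope swap-chain memory: colour buffers + one depth buffer,
# at 4 bytes per pixel each (RGBA8 colour, 32-bit depth/stencil).
def swapchain_mb(width, height, colour_buffers=2, bytes_per_pixel=4):
    colour = width * height * bytes_per_pixel * colour_buffers
    depth = width * height * bytes_per_pixel
    return (colour + depth) / (1024 ** 2)

for name, (w, h) in {"1080p": (1920, 1080), "1440p": (2560, 1440), "4k": (3840, 2160)}.items():
    print(f"{name}: ~{swapchain_mb(w, h):.0f} MB")
# 1080p ~24 MB, 1440p ~42 MB, 4k ~95 MB: going from 1080p to 4k adds well under
# 100 MB of plain frame buffer, a rounding error against 10 GB.
```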
 
Owned them both; I'm using a 3090 currently since I managed to sell my 3080 for crazy money and upgraded to a 3090 for zero cost. Personally I think the 3090 is a waste of money and a 3080 is more than enough for any game you throw at it.
 
The latest versions can actually approximate the vRAM that is in use. It was added into the beta a while back and you had to fiddle around to enable it, but it's there in the latest betas right out of the box and you can just enable it on the OSD. I actually reached out to a lot of the mainstream reviewers back when this feature was added and asked if they were going to change their stance on measuring vRAM requirements, especially those reviewers who were taking 24GB 3090s and claiming that some games required 23GB of vRAM (which is obviously stupid and wrong). But it was mostly radio silence. Interestingly enough, it was the lesser-known hardware enthusiasts benchmarking their own cards on places like YouTube who were ahead of the curve and adopted a stance of measuring both.
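For anyone who wants to check actual usage rather than allocation themselves, here's a rough sketch using NVIDIA's NVML bindings (assumes an NVIDIA card and the nvidia-ml-py / pynvml package; NVML's per-process figure is close to, but not exactly, the counter the overlay tools show):

```python
# Quick look at total VRAM in use vs what each process is actually using.
# Assumes an NVIDIA GPU and `pip install nvidia-ml-py` (imported as pynvml).
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"Total: {mem.total / 2**30:.1f} GB, in use: {mem.used / 2**30:.1f} GB")

# Per-process breakdown; the game shows up here once it's running.
for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
    used_gb = proc.usedGpuMemory / 2**30 if proc.usedGpuMemory else 0.0
    print(f"PID {proc.pid}: {used_gb:.2f} GB dedicated")

pynvml.nvmlShutdown()
```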

It looks like this thread has more or less gone full circle. In all of this arguing, has anyone actually come up with a game that requires more vRAM than the 3080 has (10GB) which doesn't already have a GPU bottleneck on the frame rate? My guess would be no, but I'm just interested in people's findings. I finally got around to playing RDR2, and at 4k on my 3080, maxed out, it's using something like 6GB.
I debunked the myth of allocation:

https://www.youtube.com/watch?v=sJ_3cqNh-Ag

Allocated VRAM shows as 7.7GB and "dedicated usage" shows as 6.8GB, yet the VRAM-related FPS drop happens anyway, and hugely. You would expect that with a "real" 6.8GB of VRAM usage the game would not tank the performance. Yet it does.

(7.7GB is allocated for Godfall alone, according to Afterburner.)

If it's really just allocation in the sense of "empty VRAM", why does the VRAM spill over to RAM and tank the performance?

There is something different going on here with this VRAM debacle. After this incident, I stopped trusting the dedicated VRAM value, because it had no meaning whatsoever for this game.

This game also debunks the myth that "GPU performance becomes the bottleneck before VRAM is filled".

An RTX 3070 can run this game at 1440p at 50-60 fps ultra with ray tracing enabled, but once VRAM goes overboard, performance tanks to the 30s. That is a huge performance deficit.

I tried lowering everything to medium and it still didn't help. It simply couldn't handle both 1440p and ray tracing at the same time.

--

This benchmark was done with everything disabled in the background: no Discord, no Chrome, no extra VRAM-consuming apps.

With Discord enabled, even at 1080p you get frame drops.

I would like someone to test with a pretty normal PC config that has Discord open in the background with 10-15 servers, plus GeForce Experience, and try running Godfall at 4k ultra with RT on. I bet it will experience the same frame drops.
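If someone does try that, a simple way to line the drops up against memory pressure is to log VRAM once a second while the game runs (a rough sketch, assuming an NVIDIA card and the same nvidia-ml-py / pynvml package as above; do one run with everything closed and one with Discord etc. open, then compare):

```python
# Log total VRAM in use once per second while the game/benchmark runs, so FPS
# drops can be matched against memory pressure afterwards.
# Assumes an NVIDIA GPU and `pip install nvidia-ml-py`.
import csv, time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

with open("vram_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["elapsed_s", "vram_used_gb"])
    start = time.time()
    try:
        while True:  # stop with Ctrl+C when the benchmark run is over
            used_gb = pynvml.nvmlDeviceGetMemoryInfo(handle).used / 2**30
            writer.writerow([round(time.time() - start, 1), round(used_gb, 2)])
            f.flush()
            time.sleep(1)
    except KeyboardInterrupt:
        pass

pynvml.nvmlShutdown()
```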
 
Owned them both; I'm using a 3090 currently since I managed to sell my 3080 for crazy money and upgraded to a 3090 for zero cost. Personally I think the 3090 is a waste of money and a 3080 is more than enough for any game you throw at it.
Yeah, the 3080 is the one to get, but unfortunately it's not the one Nvidia wants you to buy.

The RADEON RX 6000 series is by far the best option this gen unless you really need G-Sync.

It's a decent option but certainly not "by far the best".
 
The RADEON RX 6000 series is by far the best option this gen unless you really need G-Sync.
Not at all lol. NO DLSS and ****** RT performance is a big no from me personally. Don't get me started on how bad the drivers are when it comes to older or indie titles. Not only that, but they aren't even cheaper than their Nvidia counterparts, so you're effectively paying the same for a product with fewer features.
 

DLSS is used by only a few games and has no benefit if you're already getting 150FPS in pure rasterization. RT is a big hype job that barely anyone cares that much about in real-world gaming right now. The drivers are flawless for me. The software is much nicer than Nvidia's bloatware. Radeon 6000s could be had for less than their Nvidia counterparts when I was looking. Big Navi's 7nm process is far superior and uses less power compared to Ampere's Samsung 8nm bodge job.
 

Johnny, please stop talking fanboy rubbish. DLSS is a significant feature that is now gaining a lot of traction and provides a huge boost to framerates, especially with RT enabled, without much of a visual impact. It provides a significant boost at 4k for 3080/3090 owners and will do a lot to lift the performance of lower-tier cards too. That is a cold, objective fact.

The only notable disadvantage Nvidia has this generation is in VRAM amounts... in other respects they are overall the better performer and the more sensible purchase vs AMD. I get that you like your AMD card and are happy with your new purchase, but biased nonsense like this has to be called out.
 
Or ray tracing, or DLSS

DLSS is only required to increase RT performance, which is mostly needed in big triple-A games that utilize that feature with all the bells and whistles at max. The Radeon cards can also make use of ray tracing. The 6800 XT, for example, has equivalent ray-tracing performance to a 2080 Ti, which is not too shabby IMO.

low latency

The Radeon cards have tons of options to reduce latency.

or video encoding, or streaming friendly services

Are you a streamer?

Basically, after reading from Jensen's propaganda script, you only have one single semi-relevant feature advantage over AMD this generation :D
 
Or ray tracing, or DLSS, or low latency, or video encoding, or streaming friendly services, etc. Really anything that you would want from a PC GPU.

If you do want an AMD GPU, get a console :D
DLSS is used by only a few games and has no benefit if you're already getting 150FPS in pure rasterization. RT is a big hype job that barely anyone cares that much about in real-world gaming right now. The drivers are flawless for me. The software is much nicer than Nvidia's bloatware. Radeon 6000s could be had for less than their Nvidia counterparts when I was looking. Big Navi's 7nm process is far superior and uses less power compared to Ampere's Samsung 8nm bodge job.


Every day more and more AAA titles are getting DLSS support, and please don't come at me with the 150 fps logic; there's no way you are playing games like Control or Cyberpunk at 150 fps. So you're telling me you can get AMD GPUs for less than a £650 3080? (They are hard to get but very possible, meanwhile AMD direct doesn't even ship cards to the UK.) And if we are going by pure rasterisation the 3090 is still king anyway, making your argument fall apart. Not a fanboy or anything like that, but I would find it very hard to justify paying the same for an AMD card knowing that Nvidia is offering much more for the same price at the moment.
 
DLSS is only required to increase RT performance, which is mostly needed in big, poorly optimized triple-A games that utilize that feature at ultra settings.

Basically, after reading from Jensen's propaganda script, Nvidia only has one semi-relevant advantage over AMD this generation :D
I can see you have zero interest in engaging in any logical discussion anyway... one for the ignore list methinks.
 
DLSS is only required to increase RT performance, which is mostly needed in big triple-A games that utilize that feature with all the bells and whistles at max. The Radeon cards can also make use of ray tracing. The 6800 XT, for example, has equivalent ray-tracing performance to a 2080 Ti, which is not too shabby IMO.

The Radeon cards have tons of options to reduce latency.

Are you a streamer?

Basically, after reading from Jensen's propaganda script, you only have one single semi-relevant feature advantage over AMD this generation :D
You must be trolling at this point, but please tell us, what advantages does AMD have then? Because if you think that a feature that actually lets you fully use all the bells and whistles in games is useless, I don't know what to say lol.
 