10GB VRAM enough for the 3080? Discuss...

Status
Not open for further replies.
Every day more and more AAA titles are getting DLSS support, and please don't come at me with the 150 fps logic; there's no way you are playing games like Control or Cyberpunk at 150 fps.
So you're telling me you can get AMD GPUs for less than a £650 3080? (They are hard to get but very possible; meanwhile AMD direct doesn't even ship cards to the UK.)
And if we are going by pure rasterisation, the 3090 is still king anyway, which makes your argument fall apart. Not a fanboy or anything like that, but I would find it very hard to justify paying the same for an AMD card knowing that Nvidia is offering much more for the same price at the moment.

Maybe I am not interested in the likes of Control (more of an Nvidia pure VFX showcase than a game) or Cyberpunk 2077 (one of the biggest letdowns of the modern gaming industry)? I don't think the majority of gamers are either, to be honest. Yes, I get 150 fps in actually good, fun games like Hell Let Loose on my Radeon at max settings.
No, you can never get any 3080 for £650. Try doubling that figure and you're in the right ballpark. Meanwhile OCUK themselves had a number of Radeon 6000 GPUs for sale a bit above MSRP a few months ago. (Much better than Nvidia scalper prices.)
The 3090 is not a good GPU in terms of performance/price/power consumption when compared to Big Navi (the real king of this gen).
 
Maybe I am not interested in games like Control (a purely VFX-showcase Nvidia game) or Cyberpunk 2077 (one of the biggest letdowns of the modern gaming industry)? I don't think the majority of gamers are either, to be honest. Yes, I get 150 fps in actually good games like Hell Let Loose on my Radeon at max settings.
No, you can never get any 3080 for £650. Try doubling that figure and you're in the right ballpark. Meanwhile OCUK themselves had a number of Radeon 6000 GPUs for sale a bit above MSRP a few months ago. (Much better than Nvidia scalper prices.)
The 3090 is not a good GPU in terms of performance/price/power consumption when compared to Big Navi (the real king of this gen).


I got a 3080 for that price and so did my mates, so I know it is very possible... You still haven't told me what AMD's advantage is, other than the power consumption.
 
Not a fanboy or anything like that, but I would find it very hard to justify paying the same for an AMD card knowing that Nvidia is offering much more for the same price at the moment.
Nvidia isn't really offering anything when there's no availability and God knows how long the queues are.
There are likely people who ordered soon after release who are still waiting for their cards.
And what little stock some shops have, like some 3090 models, is simply insanely expensive, costing 2.5 to 3 times as much as the rest of the PC.

No one deserves any compliments for this round of GPUs!
 
I got a 3080 for that price and so did my mates, so I know it is very possible... You still haven't told me what AMD's advantage is, other than the power consumption.

Literally no one is going to believe you were one of the 1% of people who were lucky enough to obtain a 3080 at MSRP without proof... let alone your "friends".

You still haven't told me what Nvidia's advantage is, other than DLSS. ;)
 
DLSS is only required to increase RT performance, which is mostly needed in big triple-A games that use the feature with all the bells and whistles maxed out. The Radeon cards can also make use of ray tracing; the 6800 XT, for example, has ray-tracing performance roughly equivalent to a 2080 Ti, which is not too shabby IMO.

You should take some time to learn what DLSS is before posting.

Even on a smaller node and at higher clock frequencies, RT on the 6000 series is garbage. That's the reason why many, including myself, skipped Turing.

The Radeon cards have tons of options to reduce latency.

Nvidia provides Reflex, which outperforms AMD's 'tons of options'.

Are you a streamer?

No, I'm a software developer, but I do stream video across the pond from time to time.

Basically, after reading from Jensen's propaganda script, you only have one single semi-relevant feature advantage over AMD this generation :D

At the end of the day, I have a PC GPU and you, who didn't do your homework, ended up with a console chip :cry:
 
Literally no one is going to believe you were one of the 1% of people who were lucky enough to obtain a 3080 at MSRP without proof... let alone your "friends".

You still haven't told me what Nvidia's advantage is, other than DLSS. ;)


For starters, better 4K performance. Believe it or not, DLSS is really nice when paired with 4K resolution, so playing Warzone at 4K at around 150 fps is not possible at all on an AMD GPU. I only use my GPU for gaming, so performance and bells and whistles are all that matter to me.
I guess I could say Nvidia has other advantages, like better mining, but it's something I don't intend to use so I can't comment on it.
 
Literally no one is going to believe you were one of the 1% of people who were lucky enough to obtain a 3080 at MSRP without proof... let alone your "friends".

You still haven't told me what Nvidia's advantage is, other than DLSS. ;)

I got my 3080 FE for MSRP straight from Scan a couple of weeks ago as well. I got the notification on my phone (from a Discord) and purchased it straight away from my phone. Didn't even check out very quickly...
 
You should take some time to learn what DLSS is before posting.

Even on a smaller node and at higher clock frequencies, RT on the 6000 series is garbage. That's the reason why many, including myself, skipped Turing.

I am very aware of what DLSS is. It's a proprietary AI rendering technique used to upscale resolution with minimal fidelity loss. RT on the 6000 series is not garbage; it also works perfectly fine in Unreal Engine.
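For anyone following along, the basic idea being argued about here is simple: the game renders each frame at a lower internal resolution and the upscaler reconstructs a larger one. A minimal Python sketch using plain bilinear interpolation as a stand-in (DLSS itself is a proprietary neural network, so this only illustrates the upscaling concept, not Nvidia's actual algorithm):

```python
def upscale_bilinear(frame, scale):
    """Upscale a 2D grid of brightness values by an integer factor
    using bilinear interpolation (a crude stand-in for a real upscaler)."""
    h, w = len(frame), len(frame[0])
    H, W = h * scale, w * scale
    out = []
    for y in range(H):
        # Map the output row back to a fractional source row.
        fy = y * (h - 1) / (H - 1) if H > 1 else 0.0
        y0, ty = int(fy), fy - int(fy)
        y1 = min(y0 + 1, h - 1)
        row = []
        for x in range(W):
            fx = x * (w - 1) / (W - 1) if W > 1 else 0.0
            x0, tx = int(fx), fx - int(fx)
            x1 = min(x0 + 1, w - 1)
            # Blend the four neighbouring source samples.
            top = frame[y0][x0] * (1 - tx) + frame[y0][x1] * tx
            bot = frame[y1][x0] * (1 - tx) + frame[y1][x1] * tx
            row.append(top * (1 - ty) + bot * ty)
        out.append(row)
    return out

# Render "cheap" at 2x2, display at 4x4.
small = [[0.0, 2.0],
         [4.0, 6.0]]
big = upscale_bilinear(small, 2)
```

The selling point of DLSS is that the trained network recovers detail that a simple filter like this one blurs away, which is why the fidelity loss stays small despite the lower render cost.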

The 6000 series has higher clock speeds and more VRAM.

Nvidia provide Reflex, which out performs AMD's 'tons options'.

In real-world usage many people would disagree. A quick Google search will reveal that Ampere GPUs are not immune from "stutter".

At the end of the day I have a PC GPU and you, who didn't do your homework, ended up with a console chip :cry:

And yet my rig blows consoles out of the water... In fact, if you did your research you'd find Nvidia wanted to use the same node as AMD but tried to price-gouge TSMC; it backfired, so they were forced to go with Samsung's crappy node.
 
And yet my rig blows consoles out of the water... In fact, if you did your research you'd find Nvidia wanted to use the same node as AMD but tried to price-gouge TSMC; it backfired, so they were forced to go with Samsung's crappy node.

Wrinkly is fond of spouting that "it's a console chip" crap; it doesn't say much for Nvidia if their top end is only marginally faster.
 
I am very aware of what DLSS is. It's a proprietary AI rendering technique used to upscale resolution with minimal fidelity loss. RT on the 6000 series is not garbage; it also works perfectly fine in Unreal Engine.

The 6000 series has higher clock speeds and more VRAM.



In real-world usage many people would disagree. A quick Google search will reveal that Ampere GPUs are not immune from "stutter".



And yet my rig blows consoles out of the water... In fact, if you did your research you'd find Nvidia wanted to use the same node as AMD but tried to price-gouge TSMC; it backfired, so they were forced to go with Samsung's crappy node.



The 3090 has more VRAM; not only that, it's faster than AMD's VRAM. One minute of research would show you that RT performance on the 6000-series cards is much, much worse. Even in games like DIRT 5 it was surpassed by Nvidia after the latest driver updates (and that was an AMD-sponsored title). Not just that, but AMD, even with higher clock speeds, still loses in 4K gaming.

Nvidia Reflex has nothing to do with stuttering.
 
I am very aware of what DLSS is. It's a proprietary AI rendering technique used to upscale resolution with minimal fidelity loss. RT on the 6000 series is not garbage; it also works perfectly fine in Unreal Engine.

People get so defensive over AMD RT; it's their first attempt, you can't expect it to be good on the first go, and it isn't: there are plenty of videos showing it.

The 6000 series has higher clock speeds and more VRAM.

And? Just because a car can rev higher or has more displacement doesn't mean it's faster...

In real-world usage many people would disagree. A quick Google search will reveal that Ampere GPUs are not immune from "stutter".

A quick Google search can reveal problems with anything.
 
The 3090 has more VRAM; not only that, it's faster than AMD's VRAM. One minute of research would show you that RT performance on the 6000-series cards is much, much worse. Even in games like DIRT 5 it was surpassed by Nvidia after the latest driver updates (and that was an AMD-sponsored title). Not just that, but AMD, even with higher clock speeds, still loses in 4K gaming.

Nvidia Reflex has nothing to do with stuttering.

The 3090 comes in at double the price of a 6800 XT and for not much benefit. Most people don't NEED a 3090 at all.

I don't think anybody is arguing AMD has better ray-tracing performance, being one generation behind, but the 6800 XT/6900 XT ray-tracing performance is not "crap" like you and others are making out.

Nobody, for example, argued the 2080 Ti's ray-tracing performance was crap before; it's still not crap, just not as good as the newer gen from Nvidia. If, for example, you plan to work with UE4's ray tracing, the 6000 series is good enough. If you just want to play that Cyberpunk flop at ultra max RT settings in 4K, you may be slightly disappointed with the Radeon 6000.
Or you could just turn the settings down a bit and only enable RT reflections...
 
Literally no one is going to believe you were one of the 1% of people who were lucky enough to obtain a 3080 at MSRP without proof... let alone your "friends".

You still haven't told me what Nvidia's advantage is, other than DLSS. ;)


Nvidia doesn't have this https://www.guru3d.com/news-story/r...es-with-high-certain-fps-intensive-games.html :D There must be a 'shrout' somewhere behind this.

If you are lucky enough to have either a 3090/3080 or a 6900 XT/6800 XT then, apart from a few frames here and there, you are up there in today's PC gaming, able to push high fps at high resolutions.

DLSS 2.0 works well when it has been well implemented, and I hope AMD's equivalent will come to fruition soon. Get whichever you can at the best price; anyone who has any of the top four cards for a decent price is lucky and can game at high resolutions and high FPS NOW.
 
I am very aware of what DLSS is. It's a proprietary AI rendering technique used to upscale resolution with minimal fidelity loss. RT on the 6000 series is not garbage; it also works perfectly fine in Unreal Engine.

As I already said, go read up about DLSS first.

RDNA2 is ~50% slower than Ampere at ray tracing, and ~75% slower when Ampere is using DLSS. That's garbage for a new GPU.

The 6000 series has higher clock speeds and more VRAM.

Higher clock speeds, more VRAM and a smaller node, yet it only competes with Ampere in rasterisation.

In real-world usage many people would disagree.

Both Steam and Twitch would put them in the minority.

A quick Google search will reveal that Ampere GPUs are not immune from "stutter".

I don't think we have ever had a system that was immune to stutter, though I'm not sure what that has to do with latency. https://www.overclockers.co.uk/foru...ues-with-unstable-fps-in-many-games.18927118/

And yet my rig blows consoles out of the water... In fact, if you did your research you'd find Nvidia wanted to use the same node as AMD but tried to price-gouge TSMC; it backfired, so they were forced to go with Samsung's crappy node.

We know Nvidia went with the cheaper solution, yet still managed to come out on top. Could you imagine where AMD would be today had Nvidia gone with 7nm?
 
Wrinkly is fond of spouting that "it's a console chip" crap; it doesn't say much for Nvidia if their top end is only marginally faster.

Marginally ....

DirectX Raytracing Feature Test

1 GPU
  1. Score 65.23, GPU 3090 @2250/5512, CPU 10900k @5.3, Post No.0491, Jay-G25 - Link Drivers 460.89
  2. Score 64.34, GPU 3090 @2235/5344, CPU 7820X @4.7, Post No.0489, anihcniedam - Link Drivers 460.89
  3. Score 63.87, GPU 3090 @2205/5328, CPU 6950X @4.401, Post No.0496, FlyingScotsman - Link Drivers 460.89
  4. Score 63.14, GPU 3090 @2265/4876, CPU 5950X @4.8, Post No.0462, OC2000 - Link Drivers 460.79
  5. Score 62.98, GPU 3090 @2205/5328, CPU 9900KF @5.0, Post No.0379, spartapee - Link Drivers 457.09
  6. Score 62.38, GPU 3090 @2160/4976, CPU 9900k @5.0, Post No.0480, Raiden85 - Link Drivers 460.89
  7. Score 61.68, GPU 3090 @2130/5076, CPU 5950X @4.949, Post No.0531, Grim5 - Link Drivers 466.11
  8. Score 61.61, GPU 3090 @2115/5128, CPU 9980XE @4.5, Post No.0487, Greebo - Link Drivers 460.89
  9. Score 60.23, GPU 3090 @2145/5176, CPU 3175X @4.8, Post No.0415, sedy25 - Link Drivers 457.30
  10. Score 58.58, GPU 3090 @2100/5276, CPU 3600X @4.4, Post No.0445, Bickaxe - Link Drivers 457.51
  11. Score 55.57, GPU 3090 @1980/4876, CPU 5950X @4.1, Post No.0429, Kivafck - Link Drivers 457.30
  12. Score 55.57, GPU 3090 @1995/4876, CPU 10900k @5.1, Post No.0357, Sedgey123 - Link Drivers 457.09
  13. Score 55.50, GPU 3090 @2085/5076, CPU 3800X @4.7, Post No.0450, ChrisUK1983 - Link Drivers 457.51
  14. Score 55.47, GPU 3090 @2040/4876, CPU 5900X @3.7, Post No.0423, atomic7431 - Link Drivers 457.30
  15. Score 54.39, GPU 3090 @1905/5176, CPU 10900k @5.2, Post No.0446, kipperthedog - Link Drivers 457.51
  16. Score 52.24, GPU 3080 @2235/5252, CPU 3900X @4.649, Post No.0413, haszek - Link Drivers 457.09
  17. Score 50.56, GPU 3080 @2145/5248, CPU 3600 @4.4, Post No.0411, TNA - Link Drivers 457.30
  18. Score 34.15, GPU 6900XT @2625/4280, CPU 5800X @5.049, Post No.0477, 6900 XT - Link Drivers 20.12.2
  19. Score 33.31, GPU 3070 @2085/4050, CPU 3175X @4.12, Post No.0392, sedy25 - Link Drivers 457.09
  20. Score 32.54, GPU 2080Ti @2130/3500, CPU 3950X @4.301, Post No.0357, Grim5 - Link Drivers 452.06
  21. Score 29.91, GPU 2080Ti @1980/3500, CPU 8700 @4.3, Post No.0391, Quartz - Link Drivers 456.55
  22. Score 23.96, GPU 6800 @2295/4220, CPU 3900X @4.541, Post No.0459, Chrisc - Link Drivers 20.12.1
  23. Score 21.36, GPU 2080 @2025/4050, CPU 9900k @5.0, Post No.0365, Cooper - Link Drivers 457.09
 