• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

*** NVIDIA GEFORCE RTX 3080 SERIES STOCK SITUATION - NO COMPETITOR DISCUSSION ***

Associate
Joined
20 Oct 2020
Posts
269
I saw a few on here the other day concerned about 3080 performance because of Cyberpunk, e.g. it's already hammering it and it's only just come out, so is the VRAM enough for future titles... If you can afford a Ti and that's what you want then by all means, but I wouldn't let a game like Cyberpunk decide it for you. It's clearly a very hard game to run, like Crysis was in the past, or GTA V, which even now could probably still make a 3080 sweat. Some games are just hard to run, especially open world games. Personally I don't think 10GB of GDDR6X is going to be a problem for quite a while. Sure, as time goes on you'll have to turn more and more settings down, but that's no different from any other launch. I'd be very surprised if in 4 years' time the 3080 isn't comparable to how the 1080 handles games today...

I agree that the 3080 will be just as competent in 4 years as a 1080 is now - and that is more competent than most people realise.

What cracks me up is that some people hold two contradictory beliefs at the same time. They'll say that a 3080 is overkill unless you game at 4k, and then later say that 10gb vram isn't enough for 4k going forward. LOL So maybe it's the perfect 1440p card then? Maybe the issue is the unrealistic demands of 4k - go buy a 3090 then, problem solved.

There are a few other things that a lot of people misunderstand.
1. Just because a game shows it's 'using' the full complement of vram, doesn't mean that it is. Usually it's just pre-caching what's available.
2. People are completely ignoring the bandwidth and just focusing on the amount of vram.
3. Game development is mostly capped by the consoles' capabilities, and by the most common PC GPUs - which is usually a 60 series by a country mile. An 8-core PC with a 3080 easily outperforms a next gen console now, and still will in 8 years' time at the end of the current gen - if there even is a next gen... could be all cloud based by then. Games will still be playable on a 3060 in 4 years' time, just like they are playable on a 1060 now.
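The bandwidth point can be made concrete with a bit of arithmetic: peak theoretical memory bandwidth is just bus width times per-pin data rate. A quick sketch (the card figures below are the commonly quoted launch specs, not numbers from this thread):

```python
# Back-of-the-envelope memory bandwidth: bus width (bits) x per-pin data rate.
def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak theoretical bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

# Commonly quoted launch specs (assumed figures for illustration):
rtx_3080 = bandwidth_gbs(320, 19.0)   # GDDR6X, 320-bit -> 760.0 GB/s
gtx_1080 = bandwidth_gbs(256, 10.0)   # GDDR5X, 256-bit -> 320.0 GB/s

print(rtx_3080, gtx_1080)
```

So the 3080 moves data at well over twice the rate of the 1080, which matters just as much as the size of the pool it's moving it from.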

I have a 3GB 780 and a 4GB 770, both the same age. If you were stuck and needed a temporary GPU to get by on, you'd definitely want the 3GB 780. I can make similar comparisons in other gens - 6GB 980 Ti vs plenty of 8GB cards. It's always been like this.

Imo the 3080, just like the 1080, will be enough for most people to skip a gen.
 
Last edited:
Associate
Joined
18 Sep 2020
Posts
558
Location
England
I agree that the 3080 will be just as competent in 4 years as a 1080 is now - and that is more competent than most people realise.

What cracks me up is that some people hold two contradictory beliefs at the same time. They'll say that a 3080 is overkill unless you game at 4k, and then later say that 10gb vram isn't enough for 4k going forward. LOL So maybe it's the perfect 1440p card then? Maybe the issue is the unrealistic demands of 4k - go buy a 3090 then, problem solved.

There are a few other things that a lot of people misunderstand.
1. Just because a game shows it's 'using' the full complement of vram, doesn't mean that it is. Usually it's just pre-caching what's available.
2. People are completely ignoring the bandwidth and just focusing on the amount of vram.
3. Game development is mostly capped by the consoles' capabilities, and by the most common PC GPUs - which is usually a 60 series by a country mile. An 8-core PC with a 3080 easily outperforms a next gen console now, and still will in 8 years' time at the end of the current gen - if there even is a next gen... could be all cloud based by then. Games will still be playable on a 3060 in 4 years' time, just like they are playable on a 1060 now.

I have a 3GB 780 and a 4GB 770, both the same age. If you were stuck and needed a temporary GPU to get by on, you'd definitely want the 3GB 780. I can make similar comparisons in other gens - 6GB 980 Ti vs plenty of 8GB cards. It's always been like this.

Imo the 3080, just like the 1080, will be enough for most people to skip a gen.
sorry but you'll have to leave, no logic allowed here.
 
Associate
Joined
3 Oct 2020
Posts
13
MSI 3080 Trio X. I went from 36th, which I've been on for at least the past month or 2, to 18th in the recent email update. Though the last update on these had 50pcs expected incoming, so I wonder what's happened there, or if most of them just haven't been shipped yet.
 
Associate
Joined
25 Nov 2020
Posts
82
My Zotac Holo 3080 arrived this morning, and 5 mins later a DPD delivery dropped with a Gigabyte Aorus 3070 from a long forgotten late night panic buying session a few months back. And an email just informed me I have a 3060ti on the way.

The wife has been giving me filthy looks all day. Think she's seen the credit card balance for the month...
 
Soldato
Joined
26 Apr 2013
Posts
4,831
Location
Plymouth
Seems like ASUS has the best supply by far compared to others, while having the best PCB design and no corners cut in the power delivery (best power stage design).
+10 to ASUS street cred

Except if you ordered the Strix. From the update thread, that seems to be one of the worst supplied GPUs.
 
Associate
Joined
29 Jan 2017
Posts
186
Location
Scotland
MSI 3080 Trio X. I went from 36th, which I've been on for at least the past month or 2, to 18th in the recent email update. Though the last update on these had 50pcs expected incoming, so I wonder what's happened there, or if most of them just haven't been shipped yet.

You obviously don't look at these forums much do you:D Apparently only 20 came in, although I only went down 18 places myself, to position 21. The rest are probably coming after Christmas, possibly even into the new year now...
 
Associate
Joined
20 Oct 2007
Posts
41
3. Game development is mostly capped by the consoles' capabilities, and by the most common PC GPUs - which is usually a 60 series by a country mile.

Another thing people seem to keep missing. I've seen a lot of people complaining that RTX isn't worth it because of the FPS hit. I don't think they realize that ray tracing isn't so much a feature for users as it is a feature for developers. Before, AAA games would have entire lighting departments to simulate realistic lighting with lots of neat little tricks, which added to the cost and time of development. Now, with the flick of a switch (an oversimplification), any dev can access that quality.

For now it certainly won't make a huge difference to devs, as they'll need to spend extra time optimizing in order to keep frame rates high (or use DLSS), but next gen, or maybe the gen after, when it becomes the standard, it's going to cut dev costs by a fair chunk.
 
Associate
Joined
20 Oct 2020
Posts
269
Another thing people seem to keep missing. I've seen a lot of people complaining that RTX isn't worth it because of the FPS hit. I don't think they realize that ray tracing isn't so much a feature for users as it is a feature for developers. Before, AAA games would have entire lighting departments to simulate realistic lighting with lots of neat little tricks, which added to the cost and time of development. Now, with the flick of a switch (an oversimplification), any dev can access that quality.

For now it certainly won't make a huge difference to devs, as they'll need to spend extra time optimizing in order to keep frame rates high (or use DLSS), but next gen, or maybe the gen after, when it becomes the standard, it's going to cut dev costs by a fair chunk.

Absolutely. Spend a few hours with free software like Blender's Cycles and it doesn't take long to realise that most people are capable of creating photorealism with ray tracing, and that it doesn't take an artist anymore. It's a major milestone.
 
Associate
Joined
5 May 2016
Posts
626
Location
Wales UK
If 50 people below you cancelled, but they were above the other people in the queue then you would move but the others would not. Unlikely, yes, but not impossible.

His issue is that people below him have moved down the queue twice as far as him, i.e. over 100 places, while he only moved 51, hence the big discrepancy. If they moved at least 100, so should he.
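The queue arithmetic being argued about works out simply if you assume your position only changes when someone ahead of you drops out (or is fulfilled). A toy sketch of that assumption - the function name is mine, not the retailer's actual system:

```python
def new_position(my_pos: int, cancelled_positions: list[int]) -> int:
    """New queue position after cancellations: only people ahead of you
    (lower position numbers) move you up; people behind you don't."""
    ahead = sum(1 for p in cancelled_positions if p < my_pos)
    return my_pos - ahead

# e.g. 36th in line, the first 18 places ahead clear out:
print(new_position(36, list(range(1, 19))))  # -> 18
```

Under that model, two people in the same product queue can only move by different amounts if different numbers of cancellations happened ahead of each of them, which is what makes the discrepancy above look odd.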
 
Associate
Joined
25 Sep 2020
Posts
169
I agree that the 3080 will be just as competent in 4 years as a 1080 is now - and that is more competent than most people realise.

What cracks me up is that some people hold two contradictory beliefs at the same time. They'll say that a 3080 is overkill unless you game at 4k, and then later say that 10gb vram isn't enough for 4k going forward. LOL So maybe it's the perfect 1440p card then? Maybe the issue is the unrealistic demands of 4k - go buy a 3090 then, problem solved.

There are a few other things that a lot of people misunderstand.
1. Just because a game shows it's 'using' the full complement of vram, doesn't mean that it is. Usually it's just pre-caching what's available.
2. People are completely ignoring the bandwidth and just focusing on the amount of vram.
3. Game development is mostly capped by the consoles' capabilities, and by the most common PC GPUs - which is usually a 60 series by a country mile. An 8-core PC with a 3080 easily outperforms a next gen console now, and still will in 8 years' time at the end of the current gen - if there even is a next gen... could be all cloud based by then. Games will still be playable on a 3060 in 4 years' time, just like they are playable on a 1060 now.

I have a 3GB 780 and a 4GB 770, both the same age. If you were stuck and needed a temporary GPU to get by on, you'd definitely want the 3GB 780. I can make similar comparisons in other gens - 6GB 980 Ti vs plenty of 8GB cards. It's always been like this.

Imo the 3080, just like the 1080, will be enough for most people to skip a gen.

I couldn't agree more. I've just bought myself an HDMI 2.1 cable ready for when this non-OC TUF shipment arrives; can't wait to check out some 4K 120 FPS gaming. I bought an LG 144Hz 1440p monitor not long back (the GL850 I think). Then literally the following week the screen died on my Samsung KS9000 for a second time. Anyway, it was covered, and a certain retailer that shares its name with an Indian dish took it off for repair. They couldn't repair it, so they replaced it with a 52" LG OLED which allows 120 FPS @ 4K. I probably wouldn't have bothered with the monitor if I'd known, but oh well, it saved me going without a screen for 2 weeks. Going from 60 FPS to 144 FPS with lower latency was pretty awesome - I was all of a sudden good at Modern Warfare rather than pretty average, although I have to play at 1080p for now.
 
Associate
Joined
3 Oct 2020
Posts
13
True, I wish I was more like you tbh. Looking at this thread so often has been a form of torture:D
I lurked damn near constantly for the first month or so and just couldn't take it anymore; I was just giving myself that awful sense of agitation and anticipation. So I just gave myself **** to look forward to that I actually had a decent amount of control over.
 
Associate
Joined
25 Sep 2020
Posts
169
Seems like ASUS has the best supply by far compared to others, while having the best PCB design and no corners cut in the power delivery (best power stage design).
+10 to ASUS street cred

Prior to these incoming shipments I would've disagreed with that, as the OC TUF was seeing a massive bias over non-OC cards, with about 8x as many being delivered. But they seem to be on the way to rectifying that bias.
 