NVIDIA 4000 Series

A simple test, then: how does the RTX 3050 compare with the GTX 980? By that metric (one tier of uplift per generation), the 3050 = 2060 = 1070 = 980.

And it more or less works out: https://www.gpucheck.com/compare/nv...core-i7-4770k-3-50ghz-vs-intel-core-i9-10900k

It seems the 3050 is more equivalent to the 980 Ti: https://www.gpucheck.com/compare/nv...core-i7-4790k-4-00ghz-vs-intel-core-i9-10900k

Note that the CPU used for the 3050 is rather better than those used for the 980 & 980 Ti, so some of the improvement can be attributed to that.

Funnily enough, I have a 3050 and a Titan (really a 980 Ti with extra VRAM) so I might give it a whirl.
Okay, but what about the release dates?
Wiki (https://en.wikipedia.org/wiki/List_of_Nvidia_graphics_processing_units) says:
980 Ti - June 2015
1070 - June 2016 - 366 days
2060 - Jan 2019 - 944 days
3050 - Jan 2022 - 1096 days
Which might throw out any predictions about performance in my previous post.
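
For anyone who wants to check or extend those gaps, here's a quick Python sketch. The day-of-month values are my placeholders; only the months come from the Wiki list:

```python
# Rough gaps between launches, using the dates above.
# Day-of-month values are placeholders; only the months are from Wiki.
from datetime import date

launches = [
    ("980 Ti", date(2015, 6, 1)),
    ("1070",   date(2016, 6, 1)),
    ("2060",   date(2019, 1, 1)),
    ("3050",   date(2022, 1, 1)),
]

for (prev_name, prev_date), (name, launch) in zip(launches, launches[1:]):
    print(f"{prev_name} -> {name}: {(launch - prev_date).days} days")
```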

Anyway, I just realised that even assuming new gens every 2 years, my previous "start with the 3070" suggestion was wrong: its upgrade cycle was 2.5 years, so eventually an upgrade would land a whole generation out of step. This gets complex to calculate.
 
Might be off topic, but what makes more purchasing sense over, say, a 10-year period:

5x 3060 tier performance (2 years) ~ £2,000
4x 3070 tier (2.5 years) ~ £2,200
3x 3080 tier (3 to 3.5 years) ~ £2,300
2x 3090 tier (5 years) ~ £3,500

(time between upgrades)

I should probably tweak this so the total costs are more in line, but you get the gist. Do you think there is a sweet spot of GPU performance vs. game requirements? A 3090 at year 1 would crush it, but how would it compare at, say, year 4 vs a GTX 7060 or whatever is out in 2026?
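
To put rough per-year numbers on the four options above, a minimal Python sketch (prices and cadences as listed; the 3 to 3.5-year option uses the midpoint):

```python
# Cost per year over the decade for each upgrade strategy above.
# (name, buys per decade, years between upgrades, total £ over 10 years)
options = [
    ("3060 tier", 5, 2.0, 2000),
    ("3070 tier", 4, 2.5, 2200),
    ("3080 tier", 3, 3.25, 2300),  # midpoint of 3 to 3.5 years
    ("3090 tier", 2, 5.0, 3500),
]

for name, buys, cycle, total in options:
    print(f"{name}: {buys} buys every {cycle} years, "
          f"£{total} total, £{total / 10:.0f}/year")
```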

I'm weighing up what level of card to buy when the 4000 series is out. My goal is settings on high at 4K @ 120 Hz, which I believe the 4000 series will achieve at the 4070 and higher.

It will depend on what resolution you game at and what your needs are. For instance, I prefer gaming at 5784x1080 (I find it a far better experience than 4K or whatever), but I can always drop to 1080p if needed. So with an RTX 2080 (which back in the day could be called overkill for such a low res), I can play RT games nicely (with some DLSS), and it will probably last for this entire console generation despite being bought well before the consoles launched.
Right now, I think a 3060 Ti is a bit better than the RTX 2080, meaning a 4060 Ti will probably offer close to 3080-level performance, if not better. That will last a long time for lower-res gaming.

On the other hand, if you're into 4K and/or high FPS, and then 8K and so on, you don't have a choice but to buy the best there is. Staying on a lower-resolution display might mean buying only two, maybe three times in a decade, while chasing the best means buying (almost) the best there is each time.
 
And as with most of those kinds of shows, who's going to make sure there's no cheating? Someone says they're going to smash their 3090 with a hammer, but curiously bought a no-longer-working, sold-as-parts one on flea bay shortly before... A bit of editing and voilà, junk 3090 smashed!


So that makes it far harder to calculate.
Searching for GPU depreciation rates, I found a Techspot article from 2020 saying 15% per year, so went with that:
[image: cost table using 15%/year depreciation]
Obviously 20% per year would totally change this. And the true rate of depreciation is likely highest in the first year (the moment you unwrap it, it's used goods).

What this doesn't take into account is the performance you're getting. If the cards go up one tier every two years, then the 3060 buyer gets 3070 performance when they buy the 4060, then 3080 performance when they buy the 5060, etc.
So for the *60 buyers:
1 to 2: 3060 performance
3 to 4: 3070 performance
5 to 6: 3080 performance
7 to 8: 3090 performance
9 to 10: 3090+ performance.

And for the *70 buyers:
1.0 to 2.5: 3070 performance
2.5 to 5.0: 3080 performance
5.0 to 7.5: 3090 performance
7.5 to 10.0: 3090+ performance.
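
If it helps anyone fiddle with the assumptions, both schedules above fall out of a few lines of Python (one tier of uplift per upgrade, cadences as quoted; purely illustrative):

```python
# Which tier's performance each buyer enjoys across the decade,
# assuming every upgrade moves you up one tier of performance.
TIERS = ["3060", "3070", "3080", "3090", "3090+"]

def schedule(start_tier: int, cycle_years: float) -> None:
    year, gen = 0.0, 0
    while year < 10.0:
        end = min(year + cycle_years, 10.0)
        tier = TIERS[min(start_tier + gen, len(TIERS) - 1)]
        print(f"  {year:.1f} to {end:.1f}: {tier} performance")
        year, gen = end, gen + 1

print("*60 buyers (every 2 years):")
schedule(start_tier=0, cycle_years=2.0)
print("*70 buyers (every 2.5 years):")
schedule(start_tier=1, cycle_years=2.5)
```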

Don't think I've made any decisions easier!

The 3090 pricing is slightly off - MSRP is £1400

| Tier | Buys per Decade | Years Kept | Total Buy (£) | Cost Each (£) | Depreciation | Used (£) | Net Cost (£) | Total Cost (£) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 3090 | 2 | 5 | 2800 | 1400 | 15%/yr | 621 | 779 | 1558 |
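
For anyone checking the row, the Used/Net/Total figures follow from compounding 15% depreciation per year; a quick sketch:

```python
# Resale value after 5 years at 15% compound annual depreciation.
msrp = 1400            # £, 3090 MSRP
rate = 0.15            # annual depreciation (Techspot's 2020 figure)
years_kept = 5
buys_per_decade = 2

used = msrp * (1 - rate) ** years_kept   # ~£621 resale value
net = msrp - used                        # ~£779 net cost per buy
total = net * buys_per_decade            # ~£1,558 over the decade
print(f"Used £{used:.0f}, Net £{net:.0f}, Total £{total:.0f}")
```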

Not to mention the price between releases for the top end is going up about 50%, but let's use 30%:

2080 Ti - £900
3090 - £1,400
4090 - £1,820 (the 3090 Ti is £1,879, so I don't think this is too far off)
5090 - £2,366
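
Same 30% compounding, in sketch form (purely a projection from the figures above, not anything confirmed):

```python
# Flagship MSRP projected at +30% per generation from the 3090's £1,400.
price = 1400.0
for card in ("4090", "5090"):
    price *= 1.30  # assumed 30% jump per generation
    print(f"{card}: ~£{price:.0f}")
```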

I'd imagine lower-end prices will increase less, but it still changes the payoff matrix slightly.
 

Sorry, yes, I'd started off with @Gothamgoblin's figures but had just filled the cost down in the Total Buy column; corrected, it looks like this:
[image: corrected cost table]
But anyway, there is no predicting the future and it gets very hard to calculate correctly, not to mention that those who start off on the x60/x600 tier are going to have to wait a very long time until they get the performance the original x90/x900 tier buyers got.
 

As long as you had fun running the numbers. The last 18 months or so have shown that you can only come up with a traditional average: this gen it was sometimes possible to sell new and old cards for more than you paid for them and end up with a free GPU upgrade, e.g. stories of miners offering a new 6700 XT in exchange for a 5700 XT.
 
Only if you insist on the highest quality settings. I was gaming at 4K many years ago.
Used to be true, but no longer the case since the advent of ray tracing. It used to be that you'd halve the shadow map size and get twice the FPS (and barely notice a difference), and other such ridiculous scenarios, but now the visual difference can be staggering (see the example below), so it's not so easy to just "drop the settings from ultra". In some cases it can be almost like playing two different games. Obviously this mostly applies to single-player AAA-type games, but still. On the positive side, we can make do with lower-resolution rendering much better than in the past, and AA has much improved (which used to be a main reason for bumping resolution).

[GIFs: ray tracing on/off comparison]
 

Yes, RT makes an incredible difference in visuals, but I still class RT as 'highest' quality. When the RTX 20 series came out I opined that RT was basically a demo at the time and that it would take three generations (i.e. the 40 series) for RT to hit the mainstream. I'm still hopeful that will be the case, but game developers are still concentrating on non-RT methods, so I may have to push it back a generation. The games which fully commit to RT are few and far between.
 

Really? Interesting opinion. I have seen quite a few people champion the RT settings on most games lately, and this would be on current-gen cards. Maybe it's to justify having it since it's available, but many recommendation posts slip in "...get the RTX 30x0 as it's much better at ray tracing" when there is an AMD alternative.
 
I will believe it when I see it. I stopped taking notice of what MLID and the like say, as they often get things wrong. I just hope we get it this year and at least get a 4070 hitting 3090 performance with much better RT for around £499.
 

Don't think £500 will happen, expecting closer to £600. But still, even at £700 it would be a bargain :p :D
 

I wouldn't hold my breath. I expect pricing will be £650, £850 and £1,500 for the 4070, 4080 and 4090. Performance could even be better than the 3090, though.

Yes, we had this conversation last generation as well :D
 
Just remember £650-700 could have got you 90% of a 3090's performance a year and a half ago.

So a £600 4070 edging out a 3090 in Q4 2022 isn't that great at all. Be careful of Nvidia smoke and mirrors; the 3090 is a very, very poor-value GPU for gamers and always was.

2x a 3080's performance at £700, including additional VRAM, would be compelling, but you can bet Nvidia will do everything in their power not to give that to you, even though they could.
 
With AMD being competitive and the crypto cycle coming to an end, Nvidia will need competitive products, otherwise AMD can take a big chunk of their market.

I'll be disappointed if the 4070 doesn't end up beating the 3090 comfortably, and I expect the 4080 to be at least 2x a 3070. The move from Samsung 8nm to TSMC 4nm should enable most of the gains, with IPC improvements providing the rest.
 

No doubt if you picked up a 3090 at launch or within the first few months, the 4070/4080 won't be as lucrative; only a 4090 will be, or ideally those with 3090s would wait for the 50xx series. That is why the likes of @TNA and myself always point out that the xx70/xx80 are the best cards you can get: the money you save (for what, a 10-15% performance loss? If that...) can buy the next-gen xx70/xx80, which will ultimately match or beat the previous top-end card. HUB also said as much recently: in their latest video on value, they didn't even recommend the 6900 XT or 3090 at MSRP.

3090/4090-class GPUs are for people who don't care about value and/or have money to burn.


Impossible to say, but going by history a 4070 should in theory match the 3090 in raster, maybe be a bit worse, while RT should improve by a decent chunk; I'm expecting at least a 30% gain in RT alone. Given how much RT has taken off with this gen of GPUs, I can only see it gaining far more traction going forward, especially if RDNA 3 can match/beat Ampere RT at the very least.
 