NVIDIA 4000 Series

Used values are definitely going to be interesting following the 50-series launch, especially as 4090 prices are so strong now and everything is pointing to the 5080 being slower and likely around the £1100-1200 mark.

Chinese uni students are inflating the 2nd-hand market at the moment :D

I can see some 4090s dropping to £1K or lower if the market gets flooded with 2nd-hand stock; prices are only so high at the moment due to limited availability.
 
They're all over the place. I nearly went for the waterblocked one that went for £1000 on the MM. Then there are the ones people in this thread have sold for £1500 plus, some apparently to Chinese students taking them home. I hope they're happy when the Chinese Cyberdyne AI rules the world!

Edited for clarity of reference
 
If/when I upgrade, mine will go on the MM, partly because I'd rather sell it to a forum member than a Chinese student (lol), but also because I wouldn't sell anything worth three figures on eBay, let alone four.
 
Out of curiosity, was the jump from the 30 series to the 40 series one of the biggest ever?
It seems like it was, and I'm hoping the 50 series has a similar gen-to-gen uplift.
I would say the jump from the 2000 to the 3000 series was bigger.
Maybe even 900 to 1000.

I must admit that, having a 3090, only the 4090 remotely interested me, and even that I decided was an entirely skippable card. I had planned on jumping to a 5080 (going one step down the stack, but a two-gen improvement), but if the rumours are true it may not be worth it either.

With PC gaming generally stagnating, tbh I'm not optimistic about next gen either. Maybe I'll just wait and get a 6080, or just wait for my 3090 to pop.
 
It seems like everyone is jumping in on this question, so I'll also put in my 2 cents (and try to stick to the facts as best as possible). It really depends on resolution, game, etc. As cards get older they age more poorly, and newer software separates them out more and more (due to things like increasing VRAM requirements). Any general claim that this GPU is x% better than that GPU is usually an overall average at a given resolution.

This video by Hardware Unboxed does a very factual numbers-to-numbers comparison between generations and shows how bad things are, especially as of the RTX 4000 series:

Otherwise, this chart gives a quick overall view of how things look, though admittedly at 1080p; keep in mind that newer games were also used, so older GPUs may perform disproportionately worse, and there could be CPU bottlenecking.
[Tom's Hardware chart: GPU generational performance averages, 1080p]
Historically, the jumps have been significant all the way down the stack.

980ti > 1080ti ~85% (35.9fps vs 66.4fps)
1080ti > 2080ti ~45% (66.4fps vs 96.3fps)
2080ti > 3090 ~30% (96.3fps vs 125.5fps) -I used the 3090 since the top card went from x80ti to 90
3090 > 4090 ~23% (125.5fps vs 154.1fps)

970 > 1070 ~69% (26.5fps vs 44.7fps)
1070 > 2070 ~56% (44.7fps vs 69.8fps)
2070 > 3070 ~41% (69.8fps vs 98.8fps)
3070 > 4070 ~23% (98.8fps vs 122.0fps)

1060* > 2060 ~72% (32.1fps vs 55.5fps) -6GB model used, since the 3GB gimped version came out later
2060 > 3060 ~30% (55.5fps vs 72.3fps)
3060 > 4060 ~17% (72.3fps vs 84.9fps)
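
If you want to sanity-check or extend these percentages, the maths is just the standard percentage-gain formula. Here's a minimal Python sketch using the 1080p flagship averages quoted above; the same formula reproduces the 1440p and 4k jumps further down too.

```python
def uplift(old_fps: float, new_fps: float) -> float:
    """Percentage gain of new_fps over old_fps."""
    return (new_fps / old_fps - 1) * 100

# Tom's Hardware 1080p averages quoted above (flagship tier).
pairs_1080p = [
    ("980ti -> 1080ti", 35.9, 66.4),
    ("1080ti -> 2080ti", 66.4, 96.3),
    ("2080ti -> 3090", 96.3, 125.5),
    ("3090 -> 4090", 125.5, 154.1),
]

for label, old, new in pairs_1080p:
    print(f"{label}: ~{uplift(old, new):.0f}%")
# 980ti -> 1080ti: ~85%
# 1080ti -> 2080ti: ~45%
# 2080ti -> 3090: ~30%
# 3090 -> 4090: ~23%
```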

The higher-end cards make bigger jumps at 1440p, in recent generations at least.
[Tom's Hardware chart: GPU generational performance averages, 1440p]

980ti > 1080ti ~89% (26.6fps vs 50.2fps)
1080ti > 2080ti ~50% (50.2fps vs 75.6fps)
2080ti > 3090 ~40% (75.6fps vs 106.0fps)
3090 > 4090 ~37% (106.0fps vs 146.1fps)

1070 > 2070 ~56% (33.1fps vs 51.8fps)
2070 > 3070 ~50% (51.8fps vs 77.7fps)
3070 > 4070 ~25% (77.7fps vs 97.8fps)

1060 > 2060 ~74% (23.0fps vs 40.1fps)
2060 > 3060 ~34% (40.1fps vs 54.0fps)
3060 > 4060 ~13% (54.0fps vs 61.2fps)

So what about 4k?
The older GPUs drop off like a rock, and some perform far worse than their segment due to things like VRAM. Unfortunately Tom's Hardware didn't include numbers for most of the older GPUs, so I can't put in much of a comparison, but the 1440p vs 1080p figures should give you an idea of what to expect. The 4090 improves on the 3090 by 62% (114.5fps vs 70.7fps), and the jump from the 2080ti to the 3090 is 63% (43.5fps vs 70.7fps). I'm guessing part of the big jumps here is that the flagships are the only cards to get lots of memory and decent memory bandwidth, enough to handle 4k titles well.

Even though I've just done all of those very rough approximate calculations above, I don't feel they paint a decently representative picture of what to expect. Nvidia messing around with segmentation and naming makes like-for-like generational comparison tricky, especially as, with newer games tested, the older architectures tend to do more poorly. E.g. it appears as though a 1070 usually performs better than a 980ti, even though back at release it was more neck and neck between the two cards.

With older GPUs being gimped on newer software, the trend of seemingly getting less performance with each generation would actually be amplified, as the older GPUs would have held up better at the times when their follow-ups released.

Perhaps back in the day one might have expected 50% better performance from the next gen. But we can see that with the 40 series we're lucky to get significant gains over the previous gen: the higher-tier cards get the best gains, mainly at the highest resolutions and settings, while the mainstream cards mostly stagnate. The Supers obviously hold up better, but there's no guarantee there will be a Super refresh each generation.

A 5060 with a disproportionate 50% performance boost over a 4060 would still struggle to beat a current-gen 4070 at 1440p (61.2fps x 1.5 is roughly 92fps vs the 4070's 97.8fps). It looks even worse compared to the 4070 Super.
A 5070 getting a 40%+ boost over the 4070, like we used to see, would certainly get close to a 4090. But in reality, we'd be lucky for a 5070 to beat a 4080.
For 4k, sure, a 5090 should be able to get ahead of a 4090 by at least 30-40%, if not the usual 60% like we've had before. A rough sketch of these scenarios is below.
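
Here's that rough sketch in Python, using the 1440p averages from the chart above. To be clear, the 50% and 40% uplifts are purely hypothetical assumptions for illustration, not leaks or predictions.

```python
# 1440p chart averages quoted earlier in this post.
fps_1440p = {"4060": 61.2, "4070": 97.8, "4090": 146.1}

def project(base_fps: float, assumed_uplift_pct: float) -> float:
    """Hypothetical next-gen fps from an assumed uplift percentage."""
    return base_fps * (1 + assumed_uplift_pct / 100)

# Assumed uplifts are illustrative only.
hypothetical_5060 = project(fps_1440p["4060"], 50)  # ~91.8fps
hypothetical_5070 = project(fps_1440p["4070"], 40)  # ~136.9fps

print(f"5060 at +50%: {hypothetical_5060:.1f}fps vs 4070 at {fps_1440p['4070']}fps")
print(f"5070 at +40%: {hypothetical_5070:.1f}fps vs 4090 at {fps_1440p['4090']}fps")
# Even a generous +50% leaves the hypothetical 5060 behind the 4070,
# and +40% leaves the 5070 short of the 4090 on these numbers.
```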

But overall, this is just speculation based on previous performance, and as we've seen, what happened before means bugger all when Nvidia can choose to mix things up and shift naming, pricing and segmentation around.

TL;DR: I've spent a decent number of words showing past performance increases, and they might not necessarily be indicative of what to expect from the 5000 series.
 
Those charts are useless for the top cards, as there is clear CPU bottlenecking going on. Sure, in many cases you will be CPU limited with a new card, but IMO there is no point titling a chart "GPU Generational Performance" if you're showing only a 23% uplift for the 4090 because the CPU isn't fast enough :) The 4K results are better, presumably because it's now GPU limited.
 
This was super informative. It looks like what I assumed was the complete OPPOSITE of what the facts show!
 