NVIDIA 5000 SERIES

I don't really care about power consumption, my Taichi XTX pulls 400w+ at full load and I couldn't care less as the actual cost per year for me is minimal with my usage
 
In recent years that's correct, although I did upgrade from the 1080ti to the 3080 on launch day.
1080ti 16nm
2080ti 12nm
3090 8nm
4090 5nm
5090 4nm

I think we've reached the end of big generational jumps. I expect the 60 series to be around the same percentage uplift over the 5090, so upgrading every 2-3 generations seems better.
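To put rough numbers on that upgrade-cadence point, here's a minimal sketch of how per-generation uplifts compound when you skip generations; the ~30% per-generation figure is purely illustrative, not a measured uplift.

```python
# Minimal sketch (assumed numbers): how a modest per-generation uplift
# compounds when skipping generations instead of upgrading every cycle.

def compound_uplift(per_gen_uplift: float, generations: int) -> float:
    """Total uplift after `generations` steps, e.g. 0.30 -> +30% per step."""
    return (1 + per_gen_uplift) ** generations - 1

# Assuming ~30% per generation (illustrative only):
for gens in (1, 2, 3):
    print(f"skip ahead {gens} gen(s): +{compound_uplift(0.30, gens) * 100:.0f}%")
# 1 gen: +30%, 2 gens: +69%, 3 gens: +120%
```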
 
I don't really care about power consumption, my Taichi XTX pulls 400w+ at full load and I couldn't care less as the actual cost per year for me is minimal with my usage
I couldn't care less about the cost either; it's more that I don't want my house to burn down from one silly connector pulling over half a kilowatt. Not to mention the GPUs are getting obnoxiously large as well and need bolting to your case with Frankenstein brackets haha.
 
Going from a 6800XT to a 5090, I expect at least 100% more performance in raster. That's absolutely fine by me. The 6800XT has been a real trooper for the last four years; I gave it a good sendoff to my IRL bestie for half what I paid for it.
I'm really looking forward to checking out the NVIDIA feature set; I have always looked upon DLSS upscaling/framegen with a little bit of envy.
 
1080ti 16nm
2080ti 12nm
3090 8nm
4090 5nm
5090 4nm

I think we've reached the end of big generational jumps. I expect the 60 series to be around the same percentage uplift over the 5090, so upgrading every 2-3 generations seems better.
There is still room for leaps. The smaller the node gets, the more transistors fit in the same space compared with previous nodes. After 3nm, maybe 2nm if they can hit reliable yields, it will have to be new technology.
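As a very rough illustration of why the node treadmill is slowing: if the names still meant real feature sizes (they don't any more, they're marketing labels), density would scale with the square of the linear shrink. A hedged sketch:

```python
# Rough upper-bound illustration only: modern "node names" (16nm, 4nm, 2nm)
# no longer correspond to physical feature sizes, so real density gains are
# much smaller than this ideal square-law scaling.

def ideal_density_gain(old_nm: float, new_nm: float) -> float:
    """Ideal transistor-density gain if features shrank linearly with the node name."""
    return (old_nm / new_nm) ** 2

print(f"16nm -> 4nm (ideal): {ideal_density_gain(16, 4):.0f}x")  # 16x
print(f" 4nm -> 2nm (ideal): {ideal_density_gain(4, 2):.0f}x")   #  4x
```

Each remaining step buys less and less, which lines up with the point above about needing new technology after the 2nm-class nodes.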
 
As mentioned on here, the 3DMark Time Spy uplifts between generations:

1080ti to 2080ti: 34% uplift
2080ti to 3090: 48% uplift
3090 to 4090: 73% uplift
4090 to 5090: 22% uplift

I'm confused. So 48% wasn't a "big jump"?

I get that the 4090 was a beast and has kind of spoiled us and our expectations a bit but the 20-30 series was a decent jump IMO.
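For what it's worth, compounding the Time Spy figures quoted above (a quick sketch using the posted numbers as-is, nothing re-measured) shows why even a "middling" step still adds up:

```python
# Sketch: compound the per-generation Time Spy uplifts quoted in the post.
uplifts = [
    ("1080 Ti -> 2080 Ti", 0.34),
    ("2080 Ti -> 3090",    0.48),
    ("3090 -> 4090",       0.73),
    ("4090 -> 5090",       0.22),
]

total = 1.0
for step, uplift in uplifts:
    total *= 1 + uplift
    print(f"{step:19s} +{uplift * 100:.0f}%  (cumulative vs 1080 Ti: {total:.2f}x)")
# Ends up around 4.2x a 1080 Ti, so a 48% generation is a big jump in absolute terms.
```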
 
I have 0 issues upgrading every gen if the bump is worth it but for me my goal is a minimum of 50-60% bump in raw power.
Mine is the same, which is why I went from the 1080ti to the 3080 and am now looking for my next upgrade. If you upgraded to something like the 4090, then you sort of signed up to a limited upgrade path, i.e. 5090 or nothing.
 
I'm confused. So 48% wasn't a "big jump"?

I get that the 4090 was a beast and has kind of spoiled us and our expectations a bit but the 20-30 series was a decent jump IMO.
Of course, I was just responding to Adrian, who mentioned "my goal is a minimum of 50-60% bump in raw power."
 
As mentioned on here, the 3DMark Time Spy uplifts between generations:

1080ti to 2080ti: 34% uplift
2080ti to 3090: 48% uplift
3090 to 4090: 73% uplift
4090 to 5090: 22% uplift
The 3090 wasn't the halo card though, the 3090 Ti was. I thought the 5090 wasn't out yet?
 
20% uplift from 40X0 to 50X0 would be seen as disappointing I believe

30% and above would be viewed favourably

35%+ and everyone will be pretty pleased and all complaints will be related to stock and scalping issues imo
I'd want at least +50% if I was spending 1k and probably closer to 100% for 2k
 
You'd think GPUs coming out in 2025 could run 2020's Cyberpunk at more than 28fps :cry:

I'm amazed any card can run a modern 4K game path traced. It's a marvel compared to where we were 5 years ago.

I think people are deluded.

Also, you might want to check again when Cyberpunk's path tracing came out.

 
If it's not a competitive online shooter then I'm fine with the devs targeting a smooth 60fps with really good image quality. Yes, I know this is 60fps on a 5090, but I come from the days of Crysis, when PC gamers loved that there were titles with options to push the hardware super hard. I have 144Hz monitors and for most games I don't care much beyond 60fps.

This will only be accurate if the frame time and frame pacing are good; otherwise it's going to be a sloppy experience, as has been witnessed with so many Unreal Engine games over the last few years requiring months of patches regardless of GPU. They can target 60fps, and it might be smooth, but smooth doesn't mean optimised, as we all know. Frame time and frame pacing matter more than actual fps targets. I can only think of maybe 3-4 games in the last few years that released with no launch-day optimisation issues, where the only things patched were actual game bugs that in some cases blocked late-stage progression.
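To illustrate the frame time / frame pacing point with made-up numbers (nothing here is from a real capture), two traces with near-identical average fps can feel completely different:

```python
# Made-up frame-time traces (milliseconds): same-ish average fps, very
# different pacing. Average fps alone hides the periodic spikes.
from statistics import mean, pstdev

smooth  = [16.7] * 100                   # steady ~60fps
stutter = [12.0] * 95 + [110.0] * 5      # similar average, periodic 110ms spikes

for name, frames in (("smooth", smooth), ("stutter", stutter)):
    avg_fps = 1000 / mean(frames)
    print(f"{name:7s} avg {avg_fps:5.1f} fps | "
          f"frame-time stdev {pstdev(frames):5.1f} ms | "
          f"worst frame {max(frames):5.1f} ms")
```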

I'd want at least +50% if I was spending 1k and probably closer to 100% for 2k

Yes, pretty much this. People are forgetting that here we have bigger power draw to get to those % uplifts, which is not what we should want. The 30 to 40 series was an actual generational uplift, for example: my 3080 Ti FE would draw around 350W at times in the most demanding games like Cyberpunk, yet here I am running double the raw framerate and typically drawing less power with the 4090. That in my eyes is an acceptable gain given the increase in cost (3080 Ti was £1000, 4090 was £1600).
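Putting the post's own figures into a quick perf-per-watt / perf-per-pound sketch; the 4090's typical draw (~320W) is my assumption, and the 2.0x relative performance comes from the "double the raw framerate" claim above:

```python
# Back-of-the-envelope using the figures quoted in the post above.
# The 4090 power number is an assumption, not a measurement.
cards = {
    #             rel perf, watts, price (GBP)
    "3080 Ti FE": (1.0,      350,  1000),
    "4090":       (2.0,      320,  1600),  # ~320W assumed typical draw
}

for name, (perf, watts, price) in cards.items():
    print(f"{name:10s} perf/W x1000: {perf / watts * 1000:.1f}   "
          f"perf/GBP x1000: {perf / price * 1000:.2f}")
# On these numbers the 4090 roughly doubles perf-per-watt and still comes out
# ahead on perf-per-pound despite the higher price.
```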
 
2080 Ti can't even run Quake 2!! /s

[Screenshot from a "Tech Focus: Cyberpunk 2077 Overdrive" analysis video]
 