NVIDIA 5000 SERIES

Aren't the supporting numbers from NVIDIA's chart for Far Cry 6, 3DMark Time Spy and Speed Way?


Looks like they estimated four games, including Wukong. The Wukong results are the weirdest: they make it look like the RTX 5080 hates that game while every other RTX 5000 card loves it.
 
Currently have a 3070 which has served me well for high-fps 1080p gaming and some 4K gaming. I have a nice 4K projector setup which I use sometimes (RDR2 looks stunning!). But I find I have to crank most things down to obtain a steady 60 fps.

So I am looking to move to a 5000 series card so I can play the majority of my games in 4K on the projector. These will all be single-player games, so even with frame gen and DLSS increasing latency by a bit it shouldn't kill the experience, right?

I am eyeing the 5070 Ti or 5080; I'd sell my current 3070 (for £200 maybe?), so it'd be a £700-£800 upgrade. Is this worth it coming from a 3070? I'm trying to figure out what performance increase I'd get. Maybe 2.5x?
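For anyone wanting to sanity-check that "maybe 2.5x?" guess, here's a rough sketch of the maths. Every number in it is a placeholder, not a benchmark: the hypothetical PERF_INDEX values, prices and resale figure are assumptions, so plug in figures from whichever 4K review index you trust.

```python
# Back-of-envelope upgrade maths. The index values are PLACEHOLDERS,
# not benchmarks: substitute relative-performance numbers from whichever
# 4K review index you trust before trusting the output.
PERF_INDEX = {
    "RTX 3070": 1.0,       # baseline
    "RTX 5070 Ti": 2.2,    # assumption for illustration only
    "RTX 5080": 2.5,       # assumption for illustration only
}

def upgrade_summary(current, new, new_price_gbp, resale_gbp):
    """Print the rough uplift and the net cost of swapping cards."""
    uplift = PERF_INDEX[new] / PERF_INDEX[current]
    net_cost = new_price_gbp - resale_gbp
    print(f"{current} -> {new}: ~{uplift:.1f}x performance for a net £{net_cost}")

upgrade_summary("RTX 3070", "RTX 5070 Ti", new_price_gbp=750, resale_gbp=200)
upgrade_summary("RTX 3070", "RTX 5080", new_price_gbp=1000, resale_gbp=200)
```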
 
By the time 16GB isn't enough, the 5080 won't be able to run the latest games at 4K native. The same is true for 4080(S) and 7900 XTX.

People still find ways to make it work and use these cards for far longer. Actually, as much as I hate things like upscaling and frame-gen technologies, I must admit that those technologies help the longevity of the GPUs they're on. By the time 16GB isn't enough, die-hard 5080 owners (who don't want to upgrade to a £2k 7080 for a 40% performance boost and 24GB VRAM) can use DLSS4 to keep up, albeit with a loss of visual quality.

I mean, look at the number of folks still running 1080 Tis, or even folks on low-end, low-VRAM GPUs, somehow still scraping by.
 
That's the way it should be/always has been. (Apart from 4000 series)
I really don't get this take about the 40 series being one of the worst gens. Sure, nothing else in the lineup made sense other than the 4090, but the 4090 obliterated the 30-series top end, including the 3090 Ti, and was one of the biggest uplifts I have seen, not to mention the price was almost the same.

Hence the reason I shelled out for it: you're talking about a 60-80%+ uplift in raw performance for the same price.
 
People still find ways to make it work and use these cards for far longer. Actually, as much as I hate things like upscaling and frame-gen technologies, I must admit that those technologies help the longevity of the GPUs they're on. By the time 16GB isn't enough, die-hard 5080 owners (who don't want to upgrade to a £2k 7080 for a 40% performance boost and 24GB VRAM) can use DLSS4 to keep up, albeit with a loss of visual quality.

I mean, look at the number of folks still running 1080 Tis, or even folks on low-end, low-VRAM GPUs, somehow still scraping by.
This is very real. I'm on a 3070 Ti and 8GB is a struggle: in Indiana Jones and Stalker 2 I've been pretty much forced to use either the lowest settings or DLSS to actually play the game. There's a bunch of games where VRAM is my limiting factor at 1440p. I'm definitely contemplating a 5070 Ti; I think at least 16GB should see me by for a good while at 1440p.
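If you want to confirm VRAM really is the limiting factor, a minimal logger like this sketch will show usage while the game runs. It uses NVIDIA's NVML Python bindings (pip install nvidia-ml-py) and assumes GPU index 0 is the gaming card.

```python
# Minimal VRAM logger using NVIDIA's NVML bindings (pip install nvidia-ml-py).
# Run it in a second window while the game is up: if "used" sits pinned at the
# card's total (~8 GiB on a 3070 Ti), VRAM is very likely the bottleneck.
import time

import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # assumes GPU 0 is the gaming card

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"VRAM used: {mem.used / 2**30:.1f} / {mem.total / 2**30:.1f} GiB")
        time.sleep(1.0)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```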
 
FAEIK FRAEIMS!!!!!!!!!!!
lol, NVIDIA saying 80% of people use DLSS like people don't care about IQ.
99% of games turn it on by default?

Most people probably don't realise it kills IQ.

I'm with Linus: you don't buy a £2k graphics card because you want to use DLSS and frame gen.
These same people likely bought OLED monitors for the near-0ms response times and uber-low input lag...

I wanna see people use 4x frame gen in something like iRacing and compare lap times with and without it.

Frame gen would be amazing for consoles; it means we could get better-looking games on PC without consoles holding us back graphically.

But on PC, I'm not convinced.
 
I really don't get this take about the 40 series being one of the worst gens. Sure, nothing else in the lineup made sense other than the 4090, but the 4090 obliterated the 30-series top end, including the 3090 Ti, and was one of the biggest uplifts I have seen, not to mention the price was almost the same.

Generally, only three models seem to be available at launch, and one of them was almost double the price. It took another year for the SUPER refresh to reset prices to near where they should have been, and that was still a 40% increase in price. Then you have cards lower down the stack like the 4060, which is more like a xx50-class card in performance, and people tend to notice this.
 
Wishing you'd kept your 4090 now? :D

Well, 4090 values are certain to drop compared to what they've been selling for in recent months, but if these performance figures prove accurate, I think it'll maintain a used value of over £1k, especially the FE, which is usually in higher demand.

Those hoping for sub-£500 4090s were always living in a dream world.
 
I really don't get this take about the 40 series being one of the worst gens. Sure, nothing else in the lineup made sense other than the 4090, but the 4090 obliterated the 30-series top end, including the 3090 Ti, and was one of the biggest uplifts I have seen, not to mention the price was almost the same.

Hence the reason I shelled out for it: you're talking about a 60-80%+ uplift in raw performance for the same price.

The 4090 was literally the only decent card in the initial line-up. The SUPER refresh addressed that to a large degree.
 
Question for the folk who are against DLSS/fake frames.

If we get to the point where the fake frames/pixels are 95% accurate to a natively generated frame and the input lag penalty drops to, say, half of what it is now - would you still not use it?

Where is the line drawn, at what point does 4K or 1440p with every setting cranked up at 200+ fps become enticing?
I am super latency sensitive and play mostly first-person games. The latency increase needs to be reduced by a factor of 3-5 before I consider using it. This is on a 4090 as well, so I already have the highest possible base frame rate. The only game I've used it in is MSFS 2024. It's fantastic there and has very few artefacts.
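To put rough numbers on that trade-off: with frame generation the counter shows roughly base fps times the multiplier, but input is still sampled at the base rate, and holding a real frame back for interpolation adds about one base frame time of latency. A toy model, with that one-frame penalty as a simplifying assumption (the figures are illustrative, not measurements of any real game):

```python
# Toy model of frame generation's fps/latency trade-off. Treating the added
# latency as one base frame time (held back for interpolation) is a
# simplification; the numbers are illustrative, not measurements.

def framegen_estimate(base_fps, multiplier):
    base_frame_ms = 1000.0 / base_fps
    presented_fps = base_fps * multiplier  # what the fps counter shows
    added_latency_ms = base_frame_ms       # ~1 real frame withheld for interpolation
    print(f"{base_fps} fps base, {multiplier}x FG: "
          f"~{presented_fps:.0f} fps shown, ~+{added_latency_ms:.0f} ms input lag")

framegen_estimate(base_fps=60, multiplier=4)  # ~240 fps shown, ~+17 ms
framegen_estimate(base_fps=30, multiplier=4)  # ~120 fps shown, ~+33 ms: feels bad
```

This is why a low base frame rate feels so unresponsive with FG enabled: the multiplier raises the number on the counter, not the rate at which your inputs reach the game.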
 
Currently have a 3070 which has served me well for high fps 1080p gaming and some 4k gaming. I have a nice 4k projector setup which I use sometimes (RDR2 looks stunning!). But I find I have to crank most things down to obtain a steady 60fps.

So I am looking to move to a 5000 series card so I can play the majority of my games in 4k on the projector. These will all be single player games so even with frame gen and DLSS increasing latency by a bit it shouldn't kill the experience right?

I am eyeing the 5070Ti or 5080, I'd sell my current 3070 (for £200 maybe?) so it'd be a £700-£800 upgrade. Is this worth it coming from a 3070? I'm trying to figure out what performance increase I'd get. Maybe 2.5x?
If you're planning to game at 4K I wouldn't go for anything less than the 5080. You really don't want to rely on frame gen... it's adequate if your base frame rate is acceptable, e.g. 60 fps, but otherwise it can feel unresponsive, e.g. in Black Myth: Wukong. Any of the 5000 series cards will be a decent upgrade for you, but the resolution you plan to play at has a big bearing on which card you buy; 4K is very demanding - only the 4090 currently handles it effortlessly.
 
I won't pay the 5090 price; I love gaming, but not that much. The price is a joke. I considered waiting for the SUPER, but I'm due an upgrade and CBA to wait any longer.
 
I am super latency sensitive and play mostly first-person games. The latency increase needs to be reduced by a factor of 3-5 before I consider using it. This is on a 4090 as well, so I already have the highest possible base frame rate. The only game I've used it in is MSFS 2024. It's fantastic there and has very few artefacts.
I wouldn't say I'm super sensitive, but the input latency is very noticeable in some games, e.g. BM Wukong. So for action games, Souls-likes etc. I wouldn't want to use any FG/MFG if I can avoid it; any increase in latency is undesirable.
 
Generally, only three models seem to be available at launch, and one of them was almost double the price. It took another year for the SUPER refresh to reset prices to near where they should have been, and that was still a 40% increase in price. Then you have cards lower down the stack like the 4060, which is more like a xx50-class card in performance, and people tend to notice this.
Sure, the full lineup is rarely if ever available at launch, but comparing apples to apples, the 4090 was and still is 70% faster than a 3090, and in some cases it's twice as fast.

I think some are forgetting they cost almost the same (£100 more in reality), whereas this time the 5090 costs a minimum of £500 more for 20%, not 70-80%+.

Some will say this is the way forward, and in most cases with Nvidia it is, because they like to make £££ for their shareholders. But there still is, and always will be, a right time to upgrade at the right cost, and I think users are going to see that when the benchmarks leak.

Everyone's expecting to get a 4090 on the cheap; it ain't going to happen. There won't be much difference, and they will still slot towards the higher end of the 5080-5090 gap, with more VRAM on the 4090 vs the 5080, naturally.

But in terms of recent generational uplift in pound per frame, nothing has beaten the 3080 compared to the 2080, and the same goes for the 3090 to the 4090.
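"Pound per frame" is simple enough to check yourself. In this sketch every price and fps figure is a placeholder, not a benchmark claim: fill in real numbers from one consistent review before drawing conclusions.

```python
# "Pound per frame" sketch. Every number below is a PLACEHOLDER: fill in the
# price you'd actually pay and fps figures from one consistent benchmark.
cards = [
    # (name, price_gbp, fps_in_your_chosen_benchmark)
    ("previous-gen card", 650, 100),
    ("new-gen card", 1000, 170),
]

for name, price, fps in cards:
    print(f"{name}: £{price / fps:.2f} per frame")

# The comparison the post above is making: uplift vs. price increase.
(_, old_price, old_fps), (_, new_price, new_fps) = cards
print(f"uplift: {new_fps / old_fps - 1:.0%}, price increase: {new_price / old_price - 1:.0%}")
```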

It was obvious this time though, because of the lack of competition from AMD; I didn't expect anything different.
 
The 4090 was literally the only decent card in the initial line-up. The SUPER refresh addressed that to a large degree.
It was. With our heads screwed on, nothing else in that lineup made sense. Obviously you can blame lack of competition or Nvidia's greed; that's up for debate, but both are true lol.
 
Even if it was, I wouldn't be supporting them after what they did to their workers on pay and such, and would rather go with Alphacool or similar. Also, Gigabyte are doing a waterblocked GPU.
I would hope by now they have their house in order; according to them, they sorted this out months ago, and I imagine if they still weren't paying their staff they'd have no one working there and would have disappeared. I wouldn't work at a company that isn't paying its staff, and neither would most, I imagine. But if they've learnt from their mistakes and are genuinely trying to get back on top, supplying stock to companies and paying their staff, then I have no problem buying a waterblock through OcUK myself... I don't want the company to go down the toilet; my PC is stuffed with around £2,000 of their cooling gear :cry:

But yes, other options will be available from Optimus, Watercool, Alphacool, Heatkiller, Bykski, Barrow etc., so there are other options to go with, but I've always liked the design and aesthetics of the EK blocks myself.
 
lol, NVIDIA saying 80% of people use DLSS like people don't care about IQ.
99% of games turn it on by default?

Most people probably don't realise it kills IQ.
They're right. The percentage of PC gamers that even look at the settings menu in a game is probably very small.

When the 5060 lands, PC builders are going to advertise machines for £800 that get '200 FPS' in Fortnite, and they're going to fly off the shelves.
 
If you're planning to game at 4K I wouldn't go for anything less than the 5080. You really don't want to rely on frame gen... it's adequate if your base frame rate is acceptable, e.g. 60 fps, but otherwise it can feel unresponsive, e.g. in Black Myth: Wukong. Any of the 5000 series cards will be a decent upgrade for you, but the resolution you plan to play at has a big bearing on which card you buy; 4K is very demanding - only the 4090 currently handles it effortlessly.
Good points. The 5080 is the one I'm eyeing the most. If I can get one, that is!
 
Question for the folk who are against DLSS/fake frames.

If we get to the point where the fake frames/pixels are 95% accurate to a natively generated frame and the input lag penalty drops to, say, half of what it is now - would you still not use it?

Where is the line drawn, at what point does 4K or 1440p with every setting cranked up at 200+ fps become enticing?
The few times I've enabled FG, it was like having motion smoothing turned on on my TV. Hated it.

It needs to be 90% the same across the board as traditionally rendered frames for me to want to use it. And even then, my 3440x1440 monitor only goes up to 175Hz.

I'm not against the technology in principle, but as it stands I've no interest in it.
 