
NVIDIA 5000 SERIES

Had that 5080 had more than 16gb of VRAM then the FOMO would have been present for me.

However, 4080s to 5080 looks like a complete waste of time at this stage.

Easy decision though. You already have a 4080. I've been on a 3090 for 4+ years, where anything 4070 Ti or lower wasn't worth the upgrade.
 
You're over-estimating ordinary folks with the 5080 there; there's a reason the most popular GPUs being sold are the x60 tiers.
A decade ago none of the ordinary folks could afford a £1k Titan, and I don't think things have improved since.


I do agree. I have a 4090, but most people I know are running 60/70-class cards. I'm only able to justify the extra because it's both a work tool and entertainment for me. If I only used my PC for gaming, I don't know if I could justify the spend.
 
Slowly becoming like phone upgrades, going from the last model to the latest there is not as big a jump.
Intel will catch up to them eventually if they keep going, and so would AMD if they want to compete again.

They just have to target gamers. The 4090 and 5090 are basically workstation cards doubling up as graphics cards.

Do we really need all the hardware that's on a 5090 for gaming only?
 
Easy decision though. You already have a 4080. I've been on a 3090 for 4+ years, where anything 4070 Ti or lower wasn't worth the upgrade.
For those on a 3090 or 3080, a 5080 should offer around +55% and +75% respectively, so not terrible. But for those already on a 4080, this gen is one to skip unless you want to move up to the 90.
 
To be fair, this is all smoke and mirrors to sell MFG on cards below the 5090, and/or to upsell the 5090. There's nothing to say the raw performance won't improve in Dune on release; Nvidia may be manipulating it. It's more than likely another Cyberpunk situation.
That's my point though: just like it's an issue on the 40 series, you need an acceptable baseline frame rate (60) before frame gen is enabled in order to get a decent level of input latency, and only the 4080 Super and 4090 offer that in certain game engines like UE5, which is what's being demonstrated here, or when path tracing.

Nvidia can market to the lower-card audience, but that doesn't change the fact that only a handful of cards can make use of frame gen without a latency compromise. That has always been the case, and it's where all the frame gen hate stemmed from in the first place: lower-tier cards don't deliver the best experience with it enabled in practice. Sure, it looks good in videos showing the FPS, but that means nothing as a player lol.

I have a 4090, so I don't really care for my own gaming; everything runs great at the settings I prefer, with low latency, either way. But if we want better games going forwards we NEED to be pointing these things out so everyone benefits.
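The baseline-frame-rate argument above can be sketched with some rough arithmetic. This is a simplified model, not a measurement: it assumes interpolation-style frame gen holds back one real frame, so input latency tracks the base frame time rather than the displayed frame rate. Real pipelines add render-queue and display latency on top.

```python
# Rough sketch of why frame generation wants a decent base frame rate.
# Interpolating frame gen must hold back one real frame to generate the
# in-between frame, so input latency is tied to the *base* frame time,
# not the displayed one. All numbers are illustrative.

def frame_time_ms(fps):
    return 1000.0 / fps

def fg_latency_ms(base_fps, held_frames=1):
    # latency ~= one frame to render + held_frames waiting for interpolation
    return frame_time_ms(base_fps) * (1 + held_frames)

# 60 fps base: 2x FG displays 120 fps, but input latency is ~33 ms
print(round(fg_latency_ms(60), 1))   # 33.3
# 30 fps base: 2x FG displays 60 fps, but input latency is ~67 ms
print(round(fg_latency_ms(30), 1))   # 66.7
```

The displayed frame rate doubles in both cases, but the 30 fps base card carries twice the latency, which is the "needs a 60 fps baseline" complaint in numbers.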
 
For those on a 3090 or 3080, a 5080 should offer around +55% and +75% respectively, so not terrible. But for those already on a 4080, this gen is one to skip unless you want to move up to the 90.

I just saw a DF post where it's ~17%, which is poor, but if that's the improvement over the 4080 then a jump of just over 50% is good enough for me.
 
That's my point though: just like it's an issue on the 40 series, you need an acceptable baseline frame rate (60) before frame gen is enabled in order to get a decent level of input latency, and only the 4080 Super and 4090 offer that in certain game engines like UE5, which is what's being demonstrated here, or when path tracing.
I've been trying more games with the new LS update, and with multi frame gen now I'm struggling to see whether going from a 4090 to a 5090 is justified. Sure, it will probably have fewer artifacts, but from what I'm experiencing, Nvidia's solution had better be damn near perfect to justify the price. LSFG was meh before in my experience, but artifacting and latency are noticeably improved now.
 
I just saw a DF post where it's ~17%, which is poor, but if that's the improvement over the 4080 then a jump of just over 50% is good enough for me.
The 4080 at launch would have given you around +30-35% for $1200, so +55% for $1000 is definitely an improvement, while certainly not spectacular after 4.5 years. That said, if I were still on the 3080, I'd probably bite on a 5080.
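A quick sanity check on how those percentages fit together. Generational uplifts compound multiplicatively rather than adding, so the ~+30-35% (3080 to 4080) and ~+17% (4080 to 5080) estimates quoted in this thread do land around +55% over a 3080. The figures below are those forum estimates, not benchmark results.

```python
# Generational uplifts compound multiplicatively.
# Figures are the rough estimates quoted in this thread, not benchmarks.
uplift_3080_to_4080 = 0.325   # midpoint of ~+30-35%
uplift_4080_to_5080 = 0.17    # ~+17% per the DF estimate

total = (1 + uplift_3080_to_4080) * (1 + uplift_4080_to_5080) - 1
print(f"5080 over 3080: ~+{total:.0%}")  # ~+55%

# Relative performance per dollar vs a 3080 baseline, at the launch
# MSRPs mentioned above ($1200 for the 4080, $1000 for the 5080)
perf_per_dollar_4080 = (1 + uplift_3080_to_4080) / 1200
perf_per_dollar_5080 = (1 + total) / 1000
print(perf_per_dollar_5080 > perf_per_dollar_4080)  # True
```

So on these assumed numbers the 5080 is the better value of the two at launch pricing, which matches the "improvement, but not spectacular" verdict.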
 
The 4080 at launch would have given you around +30-35% for $1200, so +55% for $1000 is definitely an improvement, while certainly not spectacular after 4.5 years. That said, if I were still on the 3080, I'd probably bite on a 5080.

The 5070 Ti is still tempting. I'll also check out the 9070, but it's less likely overall.

That's my point though: just like it's an issue on the 40 series, you need an acceptable baseline frame rate (60) before frame gen is enabled in order to get a decent level of input latency, and only the 4080 Super and 4090 offer that in certain game engines like UE5, which is what's being demonstrated here, or when path tracing.

Nvidia can market to the lower-card audience, but that doesn't change the fact that only a handful of cards can make use of frame gen without a latency compromise. That has always been the case, and it's where all the frame gen hate stemmed from in the first place: lower-tier cards don't deliver the best experience with it enabled in practice. Sure, it looks good in videos showing the FPS, but that means nothing as a player lol.

I have a 4090, so I don't really care for my own gaming; everything runs great at the settings I prefer, with low latency, either way. But if we want better games going forwards we NEED to be pointing these things out so everyone benefits.

I have been posting this for ages tbf.
 
Custom AIB prices were £100 more. It's been like that for every generation.

It hasn't been like that for every generation. The first Founders Edition card was the GTX 1080, where the FE was priced at $699 against a $599 MSRP for AIB cards, a deliberate strategy by Nvidia to position the Founders Edition as a premium product. The FE variant remained more expensive than AIB cards until lockdown, when there was a component shortage. During the shortage, AIBs were forced to increase prices because their own costs had also increased; Nvidia, however, stuck to their own MSRP for PR reasons. The AIBs learned that consumers are happy to pay over the FE MSRP, so AIB cards have been more expensive ever since.
 
This will be the first time an 80-class card hasn't matched or beaten the top card from the previous generation. At least we know why Nvidia didn't go for a $1200 MSRP.

Indeed, I wasn’t just hoping it would be the case, but expecting it based on historical evidence. I can’t see myself selling my 4080 and putting the guts of £500+ towards a 15-20% faster GPU that relies on fake frames to “win”.
 
Ordinary people aren't buying the 90-series cards (albeit a few enthusiasts will). There'll be demand from companies and AI individuals; the sheer AI performance of these cards is becoming insane (almost 2x the 4090), so there very much will be demand from companies that don't have enough cash or infrastructure to buy the Blackwell AI chips.
 
I have been posting this for ages tbf.

You weren’t the only one. The irony is that FG is practically worthless for those who actually need it. The only time I found it acceptable was when using an Xbox controller. It was for people running GPUs that already got 60+ FPS. Anyone with sub-50 FPS was getting excessive amounts of additional lag.
 
Intel will catch up to them eventually if they keep going, and so would AMD if they want to compete again.

They just have to target gamers. The 4090 and 5090 are basically workstation cards doubling up as graphics cards.

Do we really need all the hardware that's on a 5090 for gaming only?
I hope so, because a monopoly is the worst thing for us end consumers.
 
From a gaming-only perspective, I personally think anyone with a 4090 should be waiting for reviews. Remember, we are all getting DLSS 4 (apart from MFG) and FG, which Nvidia have already said will be getting a performance bump.

And from a raster PoV, if it's only a 25-30% performance uplift, then whether that's enough justification comes down to personal preference.

It was also interesting listening to the WAN Show. They were limited in what they could say, but apparently Nvidia have said that just because you can turn on 4x MFG doesn't mean you should; you should just be aiming for slightly above your monitor's refresh rate. So if you are going to hit your monitor's refresh rate with just normal FG, there is literally zero reason to purchase, in my opinion.
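The "aim slightly above your refresh rate" advice works out to a simple division. This is an idealised sketch: it assumes the displayed frame rate is exactly the base rate times the frame gen factor, where in practice enabling frame gen costs some base performance.

```python
# Base frame rate needed for frame gen output to just cover a monitor's
# refresh rate. Assumes displayed fps = base_fps * factor, which is an
# idealisation; real frame gen overhead lowers the base rate slightly.
import math

def base_fps_needed(refresh_hz, fg_factor):
    return math.ceil(refresh_hz / fg_factor)

print(base_fps_needed(144, 2))  # 72: plain 2x FG already covers 144 Hz
print(base_fps_needed(144, 4))  # 36: 4x MFG only needs a 36 fps base,
                                # which sits below the 60 fps latency
                                # baseline discussed earlier in the thread
```

Which illustrates the point above: on a 144 Hz monitor, a card holding 72+ fps covers the refresh rate with ordinary 2x FG, so 4x MFG adds nothing except frames you can't display.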
 