
NVIDIA 5000 SERIES

Unfortunately, I can still see it selling well because the competition is not there. These kinds of prices have been normalised over the past few years, and Nvidia will know exactly where to price them.

Disappointments re: gen on gen performance aside, these are nevertheless going to be the most powerful / third most powerful GPUs that you can buy.

I can't really understand people queuing for something to give them 20 extra frames on a game.

What games are you so desperate to use it on at the moment?

It depends on what people are upgrading from but yes I doubt many 4080 users will be queuing up, if any.
 
The 5080 isn't a great uplift, but the 4080 Super is the same price and slower, so what is the better option? I don't see anything better at the moment for people to upgrade to. Maybe the 5070 Ti. The AMD cards will have the usual lack of features and won't be cheap enough to make up for that.
The 5000 series has some nice improvements in other areas as well: it adds MV-HEVC (Multiview), which could bring some VR improvements; AV1-UHQ, which could improve latency; and 4:2:2 encode/decode, which could improve quality. I look forward to seeing what improvements can be made with Virtual Desktop VR streaming. It's not all about rasterisation.
 
I can't really understand people queuing for something to give them 20 extra frames on a game.

What games are you so desperate to use it on at the moment?
I think there's also the excitement/act of doing it itself. I watched episode 7 at a midnight release despite having to be up early the next day when it came out with some friends, made no difference in theory vs watching it at a more reasonable time the next day or whatnot but the act of being one of the first/it being a 'not normal' thing to do had an appeal.
 
I don’t want this to sound like I am defending Intel or AMD. But it’s not up to them to make Nvidia lower their prices.

It really is utterly illogical to assume that two companies with less than 10% combined market share are the reason the dominant player is shafting us on pricing.

Is this one of those “my Nvidia GPU would be cheaper if AMD lowered prices” things? Apologies if not but we sure do see that failed logic a lot.

I mean anyone ready to drop over £1000 on a 5080, should maybe think “paying this tells Nvidia they have the pricing right”.

Exactly this, totally agree. I would also add that buying into the 5000 series is like saying the consumer will literally buy anything at any price from Nvidia.
 
The 5080 isn't a great uplift, but the 4080 Super is the same price and slower, so what is the better option? I don't see anything better at the moment for people to upgrade to. Maybe the 5070 Ti. The AMD cards will have the usual lack of features and won't be cheap enough to make up for that.
The 5000 series has some nice improvements in other areas as well: it adds MV-HEVC (Multiview), which could bring some VR improvements; AV1-UHQ, which could improve latency; and 4:2:2 encode/decode, which could improve quality. I look forward to seeing what improvements can be made with Virtual Desktop VR streaming. It's not all about rasterisation.

Another bonus, potentially minor for some, is finally getting the new display port standard to avoid DSC.

I've had DSC issues on my current monitor because, unfortunately, there's no option to disable it (unlike on some premium models). I've ordered a monitor with the new standard, so that should be a nice pairing. Edit: not cheap though...!

Edit: for anyone interested.... see my posts from here (in that thread) with the solution below:

Well, I finally figured out what has been going on. Maybe!

In short, the cables I've been using haven't been able to support the bandwidth. The monitor can run 4K at 160Hz... but a DisplayPort 1.4a socket cannot carry that... at least not without falling back to DSC.
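
The numbers back this up. A rough sketch of the maths (assuming 10-bit RGB and ignoring blanking overhead, so the real requirement is a bit higher; the link-rate figures are the published DP 1.4 HBR3 and HDMI 2.1 FRL rates after line-code overhead):

```python
# Rough bandwidth check: does 4K @ 160Hz fit down each link uncompressed?
def required_gbps(width, height, refresh_hz, bits_per_pixel):
    """Raw pixel bandwidth in Gbit/s, ignoring blanking overhead."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

need = required_gbps(3840, 2160, 160, 30)  # 10-bit RGB = 30 bits per pixel

dp14 = 32.4 * 8 / 10    # HBR3 32.4 Gbit/s raw, 8b/10b encoding -> ~25.9 usable
hdmi21 = 48 * 16 / 18   # FRL 48 Gbit/s raw, 16b/18b encoding -> ~42.7 usable

print(f"needed:   {need:.1f} Gbit/s")                      # ~39.8
print(f"DP 1.4a:  {dp14:.1f} Gbit/s, DSC needed: {need > dp14}")
print(f"HDMI 2.1: {hdmi21:.1f} Gbit/s, DSC needed: {need > hdmi21}")
```

So 4K 160Hz 10-bit needs roughly 40 Gbit/s: too much for DP 1.4a without DSC, but within what HDMI 2.1 can carry uncompressed, which lines up with the HDMI cable fixing it.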

I think what has been happening is that the monitor has been flipping between using DSC and not, depending on the use case... this squares with the sort of 'monitor changing display' glitch that happens when alt-tabbing out of a game on an Nvidia card. DisplayPort 1.4 with a DP 2.1-rated cable was a massive improvement, but the glitch did occur again.

I'm now using the HDMI 2.1 cable that came with the monitor and it's been solid since... here's hoping I solved it. The issue has never arisen when using an older non-2.1 HDMI cable because my other monitors are only 1440p.

My hunch is that my PC can detect what the monitor can support but has no idea what cable is being used, which enables idiot users like me to make a mismatch. It then got choked out when the bandwidth got exceeded and it couldn't compress enough to get through the 1.4a display port socket.

.... what I'm describing here could explain what was going on for you, since it's Nvidia cards that apparently struggle with DSC turning on and off (i.e. when alt tabbing out of games). Thought I'd mention it if that's helpful!

Since then, it's still happened a few times, so I've been locking the monitor at 144Hz (140fps via G-Sync) when running anything in HDR.
 
Disappointments re: gen on gen performance aside, these are nevertheless going to be the most powerful / third most powerful GPUs that you can buy.
It depends on what people are upgrading from but yes I doubt many 4080 users will be queuing up, if any.
I know someone (IRL) with a 4090 who is planning to drive from Edinburgh to queue at a shop in the Bolton area.

That's about a 4 hour drive.
 
Another bonus, potentially minor for some, is finally getting the new display port standard to avoid DSC.

I've had DSC issues on my current monitor because, unfortunately, there's no option to disable it (unlike on some premium models). I've ordered a monitor with the new standard, so that should be a nice pairing. Edit: not cheap though...!

Edit: for anyone interested.... see my posts from here (in that thread).
Yes, true, the new DisplayPort standard is quite a nice feature upgrade as well. We also don't really know yet how much FP4 precision will help with AI. At the moment the AI benchmarks don't look super impressive, but I guess it will take time for new optimisations to fully utilise FP4/FP6.
 
As a 3080 owner I'm on the fence regarding the 5080. I was expecting more and honestly, if the uplift really is less than 10% over a 4080, I might just hang on with the 3080 until something better comes along.

Should have just bought a 4090 a year ago for MSRP :(
 
These early leaked figures for the 5080 look so bad.

Unless people REALLY need a new card I would wait for the inevitable Ti
Like the "inevitable" 4080 Ti you mean? Best of luck waiting for that. Nvidia aren't going to sell you 5090 dies on the cheap, no matter how cut down they are, because they can use them in workstation cards instead and charge more than for a 5090. AD102 dies that didn't make it were turned into various workstation cards like the RTX 5000 Ada Generation. Only 12800 CUDA cores enabled and a cut-down 256-bit bus, but a price tag north of £4000. Why would they sell the same die in a gaming card for a quarter of that figure?
 
i've got a golden ticket.

 