• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

NVIDIA 4000 Series

Pricing for the new Zotac Super cards has been updated at OCUK: £959.99 to £1,019.99 for the 4080 Super, £779.99 for the 4070 Ti Super, and £578.99 to £599.99 for the 4070 Super.

I like the look of the 4070 Ti Super.

Still too expensive. Even more so now that proper next-gen cards aren't far off.

My 4070 Ti cost me £575 and will still be better than a 4070 S :D
 
Is there a big step up between a 3080 FE and a 4070 Ti? On paper there is, but in Cyberpunk, in certain locations/scenes, I am only getting 23 fps at 1440p with all settings on high, and I want to be getting at least 60 fps.
This is on a Ryzen 5 7600, an Asus B650 and 32GB DDR5.
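
As a rough back-of-the-envelope check (a sketch assuming the frame rate scales more or less linearly with GPU throughput and the CPU isn't the limiter; the 3080-to-4070 Ti gap used below is an assumed ballpark, not a measured figure):

```python
# Back-of-the-envelope check: what GPU uplift would 23 -> 60 fps need,
# assuming fps scales roughly linearly with GPU throughput and nothing
# else (CPU, engine, settings) is the bottleneck? Illustrative only.

current_fps = 23.0
target_fps = 60.0

required = target_fps / current_fps
print(f"Required uplift: {required:.2f}x")  # ~2.61x

# Reviews tend to put the 4070 Ti somewhere around 20-35% ahead of a
# 3080 at 1440p raster (an assumed range, not a quoted benchmark).
for uplift in (1.20, 1.35):
    print(f"{uplift:.2f}x faster GPU -> ~{current_fps * uplift:.0f} fps")
```

So if those scenes really are GPU-bound, even a 4070 Ti wouldn't get near 60 fps on its own; you'd be relying on DLSS/frame generation or lower settings to close the rest of the gap.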

Joxeon will be here soon to tell you there is hardly any difference :p
 
AFAIK it was from Nvidia. I don't watch or read anything from MLID; historically all he does is take a rumour and label it "my sources told me". He just lies for clicks, along with all the other rumour channels.

Seems odd, though, that Nvidia would say anything at all. But yeah, those dates sound right based on past experience.
 
According to Nvidia, they will be releasing the 5000 series in Q4, which starts in October. Depending on what the competition is doing, if AMD and Intel don't offer anything competitive at the upper mid to high end, we likely won't see a release until Q1 2025, so it'll likely be a similar release timeframe from the 4000 Super series to the 5000 series.
Do you have any links to where Nvidia have stated this? I can't find it anywhere. I don't think they will be released as late as Q2/Q3 2025 myself; if the rumours about AMD having no high-end next-gen GPUs are true, then they just would not need to do anything.
 
Is there a big step up between a 3080 FE and a 4070 Ti? On paper there is, but in Cyberpunk, in certain locations/scenes, I am only getting 23 fps at 1440p with all settings on high, and I want to be getting at least 60 fps.
This is on a Ryzen 5 7600, an Asus B650 and 32GB DDR5.
No.
Joxeon will be here soon to tell you there is hardly any difference :p
Steve’s got the numbers ;)
Screenshot-666.png

Screenshot-665.png

Screenshot-690.png
 
Agree.

If AMD were competing in all areas, Nvidia wouldn't be able to set the price, which is why I never get why people don't point the blame at AMD and push them to do better. It's simply how business works, and if the situation were reversed, it's Nvidia that people should be blaming to do better against AMD.

Never really get the driver UI argument, tbh. Yes, AMD's is nicer, cleaner and faster, but how often are people actually using it? Perhaps with AMD you have to use the driver control panel more often. I can somewhat understand not wanting to install MSI Afterburner to tweak the GPU, but it's a great piece of software, whereas when I had AMD, the built-in overclocking/undervolting was temperamental: half the time the settings would reset on every reboot.

As for performance overlays, AMD's and Nvidia's are good, but RivaTuner with MSI Afterburner is much better: more accurate, more options to tweak and more useful metrics to show. In some ways I actually find Nvidia better for usability and choice of tweaking options, especially combined with the Nvidia Profile Inspector app. I do think Nvidia really needs to combine all their apps into something more unified and modern, but as the old saying goes, "if it ain't broke, don't fix it".

Reflex, DLDSR and DLSS are things I use all the time when gaming (although, as said, they're tied to the hardware too).

Back in the R290 days, you could downsample from a higher resolution in Eyefinity. I'm not sure it's still a thing now, but with nVIDIA Surround I can't do that. It's the biggest annoyance so far.

Other than that, AMD will do nothing (compete), because it can't. So far they only make major changes to the architecture when a new generation of consoles comes out, so until the PS6 and whatever the next Xbox is arrive, AMD will keep lagging behind, and they should be glad the law keeps them in business or else nVIDIA would have pushed them out.

What's the point?

They presumably did the best they could with RDNA 2 and 3. There are scaling issues with the compute units that might only be significantly improved with better transistor/process technology.

AMD doesn't produce its own GPUs; they rely on companies like TSMC.

AMD doesn't compete at the super high end (RTX 4090) because it would require a lot of power, and they don't want their designs to rely on 850 W power supplies like the RTX 4090 does.

What AMD offers is competitive in terms of performance per watt, which we see between the RX 7900 XTX and the RTX 4080 (although they might need to drop the price of the XTX when the Super model is released).

hmmm
4080 304 W vs 361 W 7900 XTX
4080 294 W vs 360 W 7900 XTX in RT
4080 68 W vs 129 W in 60 Hz gaming...


power-raytracing.png
power-gaming.png
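
To put rough numbers on the efficiency gap those charts suggest, here is a sketch assuming the two cards land within a few percent of each other in raster performance (an assumption for illustration, not a measured result):

```python
# Crude performance-per-watt comparison using the gaming power draws quoted
# above, assuming roughly equal raster performance (an assumption).

power_w = {"RTX 4080": 304, "RX 7900 XTX": 361}
relative_perf = {"RTX 4080": 1.00, "RX 7900 XTX": 1.00}  # assumed parity

perf_per_watt = {card: relative_perf[card] / power_w[card] for card in power_w}
baseline = perf_per_watt["RX 7900 XTX"]

for card, ppw in sorted(perf_per_watt.items()):
    print(f"{card}: {ppw / baseline:.2f}x the XTX's perf/W")
# On those assumptions the 4080 works out roughly 1.19x more efficient.
```

And the gap would only widen in RT, where the 4080 is drawing less (294 W vs 360 W) while generally also being faster.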
 
Was half tempted to upgrade from my 3080 Ti to a 4080 Super when I saw this, but I'm not so sure; tbh there isn't anything I can't already play maxed out at acceptable frame rates, so it's obviously just the upgrade itch!

The key thing is whether there are any games each person plays or wants to play which they can't currently run "sufficiently well".

There are obviously games which will make a 4090 sweat, let alone a 3080Ti.
 
I don't think many are gonna choose the RTX 4080 just to save a few watts (~50 W) in gaming compared to the RX 7900 XTX. The XTX is still around £100 cheaper and is likely to fall further due to price competition from the Super cards.

AMD was already hitting the power consumption limits of what they deemed reasonable with the RX 7900 XTX. There comes a point where lots of customers would need to replace their potentially expensive PSU, which would doubtless be a pain point.

I was referring to the TDP of these cards; they have similar power supply requirements. Both cards use significantly less power than the RTX 3090 Ti or the RTX 4090.
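
As a rough illustration of how board power feeds into PSU sizing (all the wattages below are ballpark, assumed figures rather than official requirements):

```python
# Ballpark PSU sizing: add up the big consumers and leave headroom for
# transient spikes. All numbers here are illustrative estimates.

gpu_board_power = {"RTX 4080": 320, "RX 7900 XTX": 355, "RTX 4090": 450}
cpu_power = 150          # assumed mid-range CPU under gaming load
rest_of_system = 75      # fans, drives, RAM, motherboard (rough guess)
headroom = 1.3           # ~30% margin for spikes and efficiency sweet spot

for gpu, watts in gpu_board_power.items():
    recommended = (watts + cpu_power + rest_of_system) * headroom
    print(f"{gpu}: ~{recommended:.0f} W PSU recommended")
```

Which lines up with the ~850 W supplies mentioned earlier for the 4090, while 4080/XTX-class cards typically get by on 750-800 W units.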
 
What bonobos are gaming at 60hz? Can we run a poll and then ban them?
Probably the vast majority. I wouldn't cap at 60 fps though (nor is it a good idea even to show that in benchmarks); that's just a bad idea in lots of games.

60 hz on 4K monitors isn't uncommon, alas I'm too much of a peasant to replace my display.
 
No.

Steve’s got the numbers ;)
Screenshot-666.png

Screenshot-665.png

Screenshot-690.png

Looking at that it seems like I'll see a nice performance bump with the 4080 (if I can get hold of one of the 10 FE models available from Nvidia at launch :D).

In reality if people are going for the 4080S/90 now, how much of a performance boost will the 5XXX series see? And how much are they going to cost?
 