NVIDIA 4000 Series

this comparison could well be heavily biased because the person is only using one generation to estimate those benchmark numbers.. can someone work out those numbers across five generations, starting with maxwell?
When you look at Turing, the 2080 is closer to the 2080 Ti than the 4080 16GB is to the 4090, and bear in mind the 2080, priced at around £700, was regarded as a poor card.
 
i just looked at the pascal numbers.. the 1080 had about 70% of the 1080 Ti's cuda core count
even the 2080 sits in that 70% ballpark..

so yeah, 70% of the top-end card is probably where the xx80 tier should sit. there's some obfuscation around the xx80 Ti and xx90 tiers - it could be marketing, or there could really be some upward segmentation going on. that should become clearer once amd releases its products, which should offer some validation
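
for what it's worth, here's a quick sanity check of that 70% figure using the published cuda core counts (numbers from memory, so double-check them against the spec sheets before quoting) - a throwaway python snippet:

```python
# Rough xx80-vs-flagship CUDA core ratios, Maxwell through Ada.
# Core counts are quoted from memory of the public spec sheets - verify before quoting.
cards = {
    "Maxwell": ("GTX 980", 2048, "GTX 980 Ti", 2816),
    "Pascal":  ("GTX 1080", 2560, "GTX 1080 Ti", 3584),
    "Turing":  ("RTX 2080", 2944, "RTX 2080 Ti", 4352),
    "Ampere":  ("RTX 3080", 8704, "RTX 3090", 10496),
    "Ada":     ("RTX 4080 16GB", 9728, "RTX 4090", 16384),
}

for gen, (lo_name, lo, hi_name, hi) in cards.items():
    print(f"{gen:8s} {lo_name} / {hi_name}: {lo / hi:.0%}")

# The 4080 12GB (7680 cores) would land at roughly 47% of the 4090,
# which is where the "48% of full fat" figure later in the thread comes from.
```

that spits out roughly 73% / 71% / 68% / 83% / 59%, so ada is the one that breaks the pattern (ampere went the other way, presumably because the 3090 was pitched as a titan-class card)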

edit: the 4080 12gb though, i am not sure jensen is even keen on selling it - maybe it's a tier reserved for oems, dell or lenovo pre-builts that don't operate in the enthusiast segment
 
I heard someone else mention this, and if true it would be an incredibly foolish move by Nvidia not to see the bubble bursting.
It wouldn't have been finalised until quite recently, but they definitely designed it 2-3 years ago.
Yeah, I can't think of a reason why they wouldn't have seen it coming. Maybe they thought another crypto would take ETH's place, maybe they were enthralled by the record profits. I honestly don't know, but the 4000 series definitely seems over-engineered for current markets.
I honestly think 4080 prices are subsidising 4090s, and that gamers as a whole are subsidising R&D for enterprise and professional.
 
I honestly think 4080 prices are subsidising 4090s, and that gamers as a whole are subsidising R&D for enterprise and professional.
You'd think, but from what I was reading last night it seems it may be the other way around, or at least Nvidia may have much smaller margins on the smaller dies.
Due to high wafer costs, GPU die costs are up massively, but the die is only a portion of a GPU’s total bill of materials (BOM). The BOM of a GPU also includes memory, packaging, VRMs, cooling, and various other board-level costs. When moving from the previous generation 3090/3090ti (GA102) to the new 4090 (AD102), these board-level costs remain the same. As such, the MSRP increase from $1499 to $1599 is enough for Nvidia to maintain margins and deliver substantial gains in performance per dollar. The MSRP cannot be compared directly as the 3090ti GPU sells for $999, or even less, meaning performance per dollar in traditional rasterization rendering is flat.

More significant issues arise when we look further down the stack to the 379mm2 AD103 and 295mm2 AD104. This is where Nvidia faces the big crunch in costs. AD103 and AD104, alongside their accompanying packaging, memory, VRM, board, and cooler BOM, must sell in high-end GPUs for Nvidia to maintain margins.
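
To put some entirely made-up numbers on that (none of these are real Nvidia BOM figures, it's just the shape of the calculation that matters):

```python
# Toy sketch of the "must sell in high-end GPUs to maintain margins" point.
# Every dollar figure below is invented for illustration - NOT real BOM data.

def gross_margin(msrp, die_cost, board_bom):
    return (msrp - die_cost - board_bom) / msrp

def msrp_for_margin(die_cost, board_bom, target_margin):
    # Price needed so that (price - costs) / price equals the target margin.
    return (die_cost + board_bom) / (1 - target_margin)

# Hypothetical last-gen mid-range card: $599 MSRP, cheap die, flat board BOM.
old_margin = gross_margin(599, die_cost=120, board_bom=200)
print(f"old margin at $599: {old_margin:.0%}")

# Same board-level BOM, but the die now costs roughly twice as much per chip.
needed = msrp_for_margin(die_cost=250, board_bom=200, target_margin=old_margin)
print(f"MSRP needed to keep that margin: ${needed:.0f}")
```

With those placeholder inputs the "same" card has to go from $599 to roughly $840 just to stand still, which is basically the squeeze being described for AD103/AD104.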
 
If I had money now I'd get a used 3080 and a UW monitor to keep my PC relevant, flog the 2070S and the BenQ 32 inch, and buy a PS5 for my LG C2 later. I'd still like to play some games like rally sims or fighting games on PC even with a console, but that depends on the price, and I refuse to pay MSRP for a used 3000 series card, so it's not going to be easy... I'll wait to see what AMD has to offer, but I'm not getting my hopes up.

Either way, I don't really have any money to spare for this now. Or rather, I don't think it's sensible. The 2070S turned out to be the best GPU I've ever had - got it below MSRP and it's been happily chugging along at 1440p since 2019. Nothing will give me that sort of value with the current state of things, and the 4000 series is just a big fat joke.

Heck, even PS5s are sold for way over retail in my country and it's hardly possible to buy one without a bundle.
 
If "high end" $900 GPU is only 48% of full fat... There's a problem indeed

I think using reference points other than money and performance plays into the companies' hands.

They control the naming schemes. They control how they name the dies. They control how they segment the stack. They control which cards get the "flagship" moniker.

They manipulate all the factors they control to convince us to spend more money.

How does it perform? What does it cost? Everything else is just noise.
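
To make that concrete (all the prices and frame rates below are placeholders, not benchmark results):

```python
# The only two inputs that matter per card: price and measured performance.
# These fps/price figures are placeholders, not real benchmark data.
cards = {
    "card A": {"price": 899,  "avg_fps": 100},
    "card B": {"price": 1199, "avg_fps": 125},
    "card C": {"price": 1599, "avg_fps": 160},
}

for name, c in cards.items():
    print(f"{name}: {100 * c['avg_fps'] / c['price']:.1f} fps per $100")
```

Die names, tier labels and "flagship" badges don't enter into it.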
 
I think I'm going to wait till next year now - see who comes out with the most insane top-end card, then get Overclockers to build me a new rig around it. There should be more mature PSU, CPU, GPU and DDR5 offerings by then. Hopefully the UK is still in one piece, which is a tough ask with our current pathetic PM and government.
 
An interesting video about the potential power problems people could have with the 4090.

TL;DW for those who haven't seen it yet: the new 12-pin plug has a limited plug/unplug life cycle (30 cycles). It has been observed that when this rating is exceeded, the pins may overheat and melt.
During the course of the video, Jay plugs and unplugs the cable and you can see the fit of the plug change on camera.
ATX 3.0 GPUs can communicate with ATX 3.0 PSUs to work out how much power they can draw.

What a massive fail on the plug lifecycle.
 
TL;DW for those who haven't seen it yet: the new 12-pin plug has a limited plug/unplug life cycle (30 cycles). It has been observed that when this rating is exceeded, the pins may overheat and melt.
During the course of the video, Jay plugs and unplugs the cable and you can see the fit of the plug change on camera.
ATX 3.0 GPUs can communicate with ATX 3.0 PSUs to work out how much power they can draw.

What a massive fail on the plug lifecycle.

It's just a clickbait video. That 30 plug-in cycle spec has been like that for the last 20 years with crimp terminals.

See here:

We have confirmed with NVIDIA that the 30-cycle spec for the 16-pin connector is the same as it has been for the past 20+ years. The same 30-cycle spec exists for the standard PCIe/ATX 8-pin connector (aka mini-fit Molex). The same connector is used by AMD and all other GPU vendors too so all of those cards also share a 30-cycle life. So in short, nothing has changed for the RTX 40 GPU series.
 