NVIDIA 4000 Series

We can't directly compare these. The big difference with the 40x0 cards - and the thing that's really jacked up the price - is the size of the L2 cache. Even the badly named card has 8x the cache of the 3090 Ti. That cache makes brute memory bandwidth less significant.
The extra L2 cache is also likely a cost-saving measure, as it lets NVIDIA cut down the bus width.
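
A rough sketch of why a bigger L2 lets them get away with a narrower bus. The bus widths and memory speeds below are my own illustrative assumptions, not official specs:

```python
# Theoretical peak memory bandwidth: bits on the bus * transfer rate / 8.
def bandwidth_gbs(bus_width_bits: int, mem_speed_gbps: float) -> float:
    """Peak GDDR bandwidth in GB/s."""
    return bus_width_bits * mem_speed_gbps / 8

# Illustrative assumptions, not official specs.
wide = bandwidth_gbs(384, 21.0)    # ~1008 GB/s on a 384-bit bus at 21 Gbps
narrow = bandwidth_gbs(192, 21.0)  # ~504 GB/s on a 192-bit bus at 21 Gbps
print(f"384-bit: {wide:.0f} GB/s, 192-bit: {narrow:.0f} GB/s")
# Half the raw bandwidth; the bet is that a much larger L2 catches enough
# traffic on-die that the narrower (cheaper) bus doesn't hurt in practice.
```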
 
Conversely, it's fortunate that PC games have enough settings to let you play new releases on your current hardware for years to come. I get that people want all the resolution, framerate and whatever, but that doesn't exactly exclude anyone who can't afford or justify the latest hardware.
Well, I'm one of those who would rather just not play than play at a substandard level. It's like clay shooting: when I can't afford to do that any more, I'm not going to go paintballing just to experience something similar.
 
Jensen is right; he just failed to explain his position properly, answering as if he's talking to 12-year-olds.

The price of high-end chips is not going down - just take a look at TSMC's pricing schedule. You can't manufacture the same-size wafer on 5nm and expect it to be cheaper than on 12nm; it ain't happening. It used to work that way in the past: smaller node processes were not only denser and more efficient, they were also cheaper to use.

We're just talking about the silicon here, which is one of many costs that go into a GPU, and just because the silicon cost has gone up doesn't mean the GPU price needs to. Nvidia is purposefully designing expensive GPUs when it doesn't need to, and purposefully chasing very high margins - and those decisions have a greater impact on GPU prices than the cost of making a smaller transistor.



Yes, that is all true. They also didn't really hide it - when J.H. had a call with shareholders just a few weeks ago, he explained NVIDIA's plans clearly: manipulate the market, make sure the 3000 series sells without dropping prices or margins, and by doing that avoid destroying the high margins on their future products (the 4000 series and later). In short, they want to uphold the high margins from the mining era, just because they think they can - regardless of any production cost increases. Satisfying the consumer wasn't mentioned at all. It's not really a secret, it's been posted all over the internet, but it had oddly small reach amongst gamers - most people seem not to have heard about it. Though NVIDIA loves to forget that they do have growing competition (AMD and, hopefully soon, Intel), which will hopefully force them to adjust their thinking to be more realistic.
 
A quick question, and I guess it's more of a 'what do you think': I am going for the ZOTAC, which has 4 cables into the graphics card. Corsair have a cable for their PSUs which is a 2-cable one that can handle the full 600W. I'll be OK to use either, right? I could use the Corsair one if/when it comes to the UK.
I was sent an email from them a few hours ago notifying me of new stock. I opened the email 50 minutes later and it was already sold out... smh
 
It's a dangerous game, because Nvidia run the risk of pricing gamers out of PCs altogether; once they are lost to consoles it will be hard to get them back. The variable missing from the Nvidia equation is people's disposable income - in lockdown everyone had more money, and that's very much over now. We can only hope sales numbers are so low they have to adjust pricing.
 
The only reason the 4080s are so overpriced for what you get is the vast stock of 30-series NVIDIA still need to sell. That's it. That's the only reason. Not Moore's law (dead or otherwise), not L2 cache. Just an overabundance of 30-series they have to make look appealing price-wise, or they will never sell.

Get a 4090 if you can afford it, otherwise wait till the backlog of 30-series is sold through. Only then will the price come down to where it should be.
In my view, get a 4090 only if you actually make money with it, or are an enthusiast with enough money that it doesn't matter much to you. If you'd have to take out credit for it, want it more than need it, and it brings you no money... that would be a really silly thing to do. In that case it's just a toy that will lose value rapidly, bought for a bit of entertainment. I reckon each person has to actually think it over before buying, or face potentially huge buyer's remorse later.
 
Nvidia has released benchmarks for Overwatch 2

These are average FPS at 2560x1440, max settings.

Using the RTX 3080 as a base, the RTX 4080 12GB is 19% faster than the 3080, the 4080 16GB is 47% faster, and the RTX 4090 is 105% faster.



They also suggest/recommend in the table in that article that the 3000 series (even the 3080 Ti, and by extension the 3090) is only good for 1080p these days. Want 1440p or higher? Only the 4000 series will do, apparently. Never mind that that's at 360 FPS+ (the number of people owning such monitors is minuscule) on full details only. :) Ah, the PR/marketing BS in full force.
 
You only need a 4090 card if you play at 4K or UW and want to chase FPS to match a high-Hz monitor/TV.

Other than that, pick up a 3090 or 6900 XT for very high performance at lower resolutions - these cards do well at 4K anyway. DLSS 2.0 does a good job in games where it's implemented well.
 
They also suggest/recommend in the table in that article that the 3000 series (even the 3080 Ti, and by extension the 3090) is only good for 1080p these days. Want 1440p or higher? Only the 4000 series will do, apparently. Never mind that that's at 360 FPS+ (the number of people owning such monitors is minuscule) on full details only. :) Ah, the PR/marketing BS in full force.

To be fair, that's tagged as "competitive play" rather than for normal muggins.
 
To be fair, that's tagged as "competitive play" rather than for normal muggins.
Competitive peeps usually lower details a lot - and not just to pump FPS up, but also to remove visual clutter from the screen. So that table would be of no use to them. Also, just looking at the table's latency per frame (which feeds directly into overall latency whilst playing), at such high FPS numbers it's a 1-2 ms difference... No human being can perceive such a small latency change, not even super-human competitive players.
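
To put numbers on that (the FPS figures below are illustrative, not the ones from NVIDIA's table): frame time is just 1000 / FPS, so the gap between very high frame rates is tiny in milliseconds.

```python
# Frame time in milliseconds for a given average FPS.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

# Illustrative figures only - not the actual numbers from NVIDIA's table.
for fps in (240, 360, 500):
    print(f"{fps} FPS -> {frame_time_ms(fps):.2f} ms per frame")
# 240 FPS -> 4.17 ms, 360 FPS -> 2.78 ms, 500 FPS -> 2.00 ms:
# going from 360 to 500 FPS saves well under a millisecond per frame.
```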
 
Competitive peeps usually lower details a lot - and not just to pump FPS up, but also to remove visual clutter from the screen. So that table would be of no use to them. Also, just looking at the table's latency per frame (which feeds directly into overall latency whilst playing), at such high FPS numbers it's a 1-2 ms difference... No human being can perceive such a small latency change, not even super-human competitive players.

All fair points. Kind of a silly table really.
 
You only need a 4090 card if you play at 4K or UW and want to chase FPS to match a high-Hz monitor/TV.

Other than that, pick up a 3090 or 6900 XT for very high performance at lower resolutions - these cards do well at 4K anyway. DLSS 2.0 does a good job in games where it's implemented well.

1440p at 144 FPS too - I don't think there's any game out there with RTX on that runs at 144 FPS at 1440p.
 
You only need a 4090 card if you play at 4K or UW and want to chase FPS to match a high-Hz monitor/TV.

Other than that, pick up a 3090 or 6900 XT for very high performance at lower resolutions - these cards do well at 4K anyway. DLSS 2.0 does a good job in games where it's implemented well.

I want a 4090 for VR purposes.
 
I found an image showing cost per gate on 10nm and 7nm.

[Image: cost per gate on 10nm vs 7nm]

Can't find details about the cost for 6nm, 5nm, 4nm and 3nm, but found an image showing TSMC's cost per wafer and per chip.

[Image: TSMC cost per wafer and per chip]

The 5nm wafer is really very expensive at $16,988 each.
Smaller chips mean more chips per wafer.
This is because smaller fab processes provide more performance per unit of chip area, thanks to the increased density.
I hope I didn't bork the wording, but I'm sure you know what I mean.

I think wafer cost doesn't compare to R&D cost; it only gets the focus because it sounds like a lot to you and me when we see the wafer price.
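
A rough illustration of the 'more chips per wafer' point, using the standard dies-per-wafer approximation. The die areas are made-up examples; the wafer price is the 5nm figure from the image above.

```python
import math

# Rough dies-per-wafer estimate (standard approximation, ignores defects/yield).
def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    radius = wafer_diameter_mm / 2
    return int(math.pi * radius**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

WAFER_COST_USD = 16_988  # 5nm wafer price quoted above

# Die areas are illustrative assumptions, not actual GPU die sizes.
for area_mm2 in (300, 600):
    n = dies_per_wafer(area_mm2)
    print(f"{area_mm2} mm^2 die: ~{n} dies/wafer, ~${WAFER_COST_USD / n:,.0f} of silicon per die")
# Halving the die area roughly doubles the dies per wafer and halves
# the raw silicon cost per chip (before yield losses).
```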
 
Smaller chips mean more chips per wafer.
This is because smaller fab processes provide more performance per unit of chip area, thanks to the increased density.
I hope I didn't bork the wording, but I'm sure you know what I mean.

I think wafer cost doesn't compare to R&D cost; it only gets the focus because it sounds like a lot to you and me when we see the wafer price.

Going back in time, what was happening is that smaller nodes would cost more in nominal terms, but the increase in density would far exceed the increase in price - effectively, every new node reduced the price per transistor. That's not happening anymore: the price per transistor on the latest nodes is now higher, because the additional wafer cost exceeds the increase in density.
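
As a made-up worked example of that crossover (the multipliers below are illustrative assumptions, not TSMC figures): the cost per transistor changes by (wafer price multiplier) / (density multiplier) when you move to a new node.

```python
# Relative cost per transistor after a node shrink:
# (wafer price multiplier) / (transistor density multiplier).
def cost_per_transistor_change(price_mult: float, density_mult: float) -> float:
    return price_mult / density_mult

# Illustrative multipliers only - not actual TSMC numbers.
old_style = cost_per_transistor_change(price_mult=1.3, density_mult=1.9)  # ~0.68x, cheaper
recent    = cost_per_transistor_change(price_mult=1.7, density_mult=1.5)  # ~1.13x, more expensive
print(f"old-style shrink: {old_style:.2f}x cost per transistor")
print(f"recent shrink:    {recent:.2f}x cost per transistor")
# When the price multiplier overtakes the density multiplier,
# each transistor on the new node costs more than on the old one.
```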
 