NVIDIA 4000 Series

If Nvidia wants to maximize profit, they're going to need a bigger market.

I wonder what proportion of Nvidia's profits comes from each category. The xx80 and xx90 cards presumably have much higher margins but smaller volume, whereas they surely shift many, many more xx50, xx60 and xx70 cards and laptops but make less from each sale.
 
From memory it's the low end that makes the big profits, by a long way. Around 30% of graphics card sales are laptops, a share that rises every year and grew by about 30% last year.

I think Nvidia sold around 42m graphics cards in 2021. If you look at the Steam hardware survey, most of those will be mid- and lower-end models. The high-end gamer cards may make more profit per card, but they don't sell that many of them comparatively.
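A back-of-the-envelope way to see how the low end could out-earn the halo cards even with much smaller profit per card. Every unit count and per-card profit below is made up purely for illustration; none of them are Nvidia's real figures:

```python
# Toy numbers only -- NOT Nvidia's real figures.
segments = {
    # name: (units_sold, profit_per_card_usd) -- hypothetical values
    "xx50/xx60 (volume)": (30_000_000, 50),
    "xx70 (mid-range)":   (8_000_000, 120),
    "xx80/xx90 (halo)":   (2_000_000, 350),
}

for name, (units, margin) in segments.items():
    print(f"{name:>20}: ${units * margin / 1e9:.2f} bn total profit")
```

Whether that matches reality depends entirely on the real margins, which none of us know, but it shows why sheer volume at the low end can beat the fat margins at the top.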
 
The 3000 series launch was a one-off: miner demand and scalpers fuelled the high prices, and the pandemic, furlough demand and the scarcity of the high-end 3090s pushed prices even higher. This time around Nvidia has its allocation from TSMC, which it tried to cut back on but was not allowed to. That leaves Nvidia with a large allocation of GPU chips it has to pay for.
They are hoping they can shift this allocation, but it's not a given they will; they might even end up with a surplus near the end of the 4000 series. They will try to make as much profit as they can from early adopters of the 4000 series, and later we shall see the 4090 Ti and maybe a full-fat Titan in the 4000 range.

Nvidia needs to sell the 4000 series allocation, but must also know that without miner demand, and with plentiful stock, scalpers will not be driving prices either. People waiting for the supposedly cheaper 4000 series will now look at the cheaper 3000 series cards instead, further diminishing 4000 series demand. AMD are in a better position: new Ryzen CPUs are due any time, along with consoles, updated GPUs and the new 7000 series GPUs. If AMD play their cards right, with no massive price hikes, they could steal away Nvidia's custom and sell all their allocation, provided the 7000 series GPUs perform to expected levels without being 600W power munchers needing new PSUs to run them.

It will be interesting to see how many will hang onto the 3090s and 3090 Tis they have now until the better 4090 Ti, or maybe a full-fat Titan card, is released this time around. One thing is for sure: those who don't need the latest and greatest generation will get the best deals on the second-hand market, with cheap 3090 cards at £500 to £600.
I personally don't have an issue with Jensen's so-called layering strategy to try to shift 3000 series stock, but I do have an issue with him trying to sell mid-range cards at premium prices. IMO what he should have done is release a 4080 Ti with around 13,000-14,000 CUDA cores for $1,200 and then either held back the 4080 16GB or released it for $900, even though I still think that would have been overpriced for the specs.
 
Is it bad that I really want a 4090 even though I already have a 3090 and have been using the PS5 much more recently? :D

I wonder if I can lose the extra cost as stationery...
 
We should also consider the possibility that Nvidia just couldn't innovate enough to offer value and make money at the same time. It's not always an easy task. They may have just come up short this generation.
 
It matters not what we think about the dodgy naming and prices; Nvidia's next couple of financial quarters will tell whether Jensen has played a blinder or not. Personally I think he will have to drop Ada prices sooner rather than later.
 
Can we go back to the DLSS 3 point about fake frames?
People don't like the idea of computer-generated frames being inserted into the pipeline in between the other computer-generated frames. You all seem to forget that all the frames are computer generated in the first place.
OK, so they are generated in a different way, but does that make them any less relevant? We had the same argument back when CPU hyper-threading started: "oh no, they're not real cores, that will never do". Technology moves on, new things emerge, some catch on and stick around, others die off quite quickly. Upscaling is here to stay, as Nvidia, AMD and even Intel now use it.

If this way of increasing frame rates proves to be successful, then the others will follow suit; if it doesn't, it will die off.
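To be clear about what's being argued over: the sketch below is not how DLSS 3 actually does it (Nvidia's version uses game motion vectors and the optical flow accelerator on the Ada GPUs), it's just a toy illustration of the basic idea of slotting a synthesised frame between two rendered ones, with a naive blend standing in for the real generation step:

```python
import numpy as np

def render(t):
    """Stand-in for the game engine rendering a frame (a tiny fake image)."""
    return np.full((4, 4, 3), float(t), dtype=np.float32)

def generate_between(prev_frame, next_frame):
    # Naive average as a placeholder for the real motion-vector /
    # optical-flow based frame generation.
    return (prev_frame + next_frame) / 2

rendered = [render(t) for t in range(4)]       # frames the engine actually drew
displayed = []
for a, b in zip(rendered, rendered[1:]):
    displayed.append(a)                        # real rendered frame
    displayed.append(generate_between(a, b))   # inserted generated frame
displayed.append(rendered[-1])

print(f"{len(rendered)} rendered frames -> {len(displayed)} displayed frames")
```

Note that the in-between frame can only be produced once the next rendered frame exists, which is where the extra latency people are complaining about comes from.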
That's great and all, but what you end up with is unbalanced cards that perform well in games that support DLSS 3.0 and not so well in ones that don't.

The 4080 12GB isn't much faster than a 3080 in games that don't support DLSS 3.0, which right now is over 99% of games, yet you're being charged for the card's performance with DLSS 3.0 enabled.
 
How many 4K gamers do you think there are?

According to the Steam hardware survey, 2.54% of users game at 4K. According to Dataprot.net there are 1.7 billion PC gamers, and 2.54% of 1.7 billion gives a figure of about 43 million. This also ignores those with 4K displays who game at lower resolutions (typically 1080p). There are apparently 120 million people who use Steam, so roughly 3 million Steam gamers game at 4K.

Neither is a trivial market.
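Just to show the working, here is a quick sketch using only the figures quoted above (whose accuracy gets questioned further down the thread):

```python
# Figures as quoted in the post above -- their accuracy is debated below.
pc_gamers   = 1_700_000_000   # Dataprot.net's claimed PC gamer count
steam_users = 120_000_000     # rough Steam user count
share_4k    = 0.0254          # Steam hardware survey: 2.54% game at 4K

print(f"4K gamers out of all PC gamers: {pc_gamers * share_4k / 1e6:.0f} million")
print(f"4K gamers on Steam alone:       {steam_users * share_4k / 1e6:.1f} million")
print(f"PC gamers as a share of ~8bn:   {pc_gamers / 8e9:.0%}")
```

That last line (about 21%) is also why the "almost a quarter of the planet" objection below is in the right ballpark.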
 
With around 8 billion people on the planet, I find it hard to believe that almost a quarter are PC gamers, unless they count Minesweeper and Solitaire etc.
 
Yeah, I don't trust the 1.7 billion number; I don't recall getting surveyed. That number is close to just assuming every household in the world that has a PC is a gamer.

In reality, Steam has 120 to 200 million users depending on who's active and when, and most PC gamers, apart from those who exclusively pirate games, will have a Steam account.
 
At first I was going to point out that you cannot take the Steam data and extrapolate it onto the Dataprot article. Then I clicked on the article, saw some weird numbers, including that stupid stat that 50% of gamers are women, and realised that the entire article is questionable.
 
Actually, I've seen that stat elsewhere. Women are - IIRC - primarily mobile gamers. Candy Crush and the like.
Yeah, I know where it comes from. I just don't consider someone who plays a mobile game because they are bored on a bus to be a gamer.

A person isn’t a footballer because they happen to kick a ball every now and then.
 
According to Dataprot.net there are 1.7 billion PC gamers.

Absolute nonsense - they're probably counting people who play Candy Crush, Angry Birds or whatever in that figure. Yes, it says PC gamers, but it's so obviously nonsense, and mobile gaming has been used to skew demographic results before, i.e. the "women make up 50% of all gamers" nothing-stat.
 
Considering raw raster performance, if the TechPowerUp article earlier is anything to go by, it's up to 59fps in Cyberpunk with DLSS turned off, and then you have the increased latency too. Reading that, it doesn't seem like a big jump in raw raster performance from my 3080 Ti, let alone from a 3090 Ti, which would be a bit higher still.

Is that difference worth the cost of a 4090, plus a PSU upgrade? That can only be answered on an individual basis, I suppose. To me it doesn't seem cost-effective. DLSS is the future, but the DLSS 2 core components will continue to be developed, as they form a subset of DLSS 3 anyway, so both will continue to improve. With those laid on the table, I think I would rather have the higher fps and lower latency.

In my example above I can feel the latency difference between no DLSS, DLSS Quality and DLSS Performance. You should be able to anyway, as those latency figures are quite big jumps. Granted, I do not have a Reflex-compatible mouse, so I cannot speak for how that component of the technology changes the experience.

Once the 40XX prices are slashed, this view might change, though, as cost vs relative performance is a big consideration for many, as already mentioned in the thread; regardless of affordability, it's more about the principle.

I do however want to see the same test from a 4080 16GB...
To be honest, the way they pushed up the wattage and the price, I'm guessing they just lost control of this series of cards and didn't manage it well. Obviously Nvidia always have good software support, so they relied on that to get the best out of it, but we're looking at chips that were inefficient and not well designed, so they had to crank up the power use and, even then, still didn't get the kind of gains you'd expect in proper rasterised performance. It's definitely a strange release when you look at it.
 