
NVIDIA 4000 Series

Was it all "enthusiasts" though?

Once mining profitability tanked, so too did Nvidia's sales.

3090 Tis have been sitting on shelves at almost half price for a while now. I'm sure Nvidia has noticed.

*edit* quoted from page one of the thread. Point still stands though.
You always have to question how popular the Ti cards, and especially the 3090 Ti, were with big miners.

Sure, as a gaming card whose OTT price could be offset somewhat by mining with it for a few months, they made some sense. But for a warehouse of mining cards? They only came out this year, when profitability was already way down, so it's hard to see anyone getting their costs back on a £1,700 card. But then there are plenty of irrational miners out there!
 

And they never sold as well as cards did at the peak of the mining boom. That's my point.

When profitability was high, almost anything with "GPU" sold out fast. Once profitability tanked, sales tanked.

The "enthusiast" market for $2k 3090Ti's was not big enough to sell through inventory. It wasn't even close.
 
That core count difference between the 80/90 is huge.

3080 to 3090 was just over a 20% jump in core count, but 4080 to 4090 is around 68%!

Sounds like they are purposely gimping both variants of the 4080 in an effort to push more people into the ultra-expensive 4090…
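For anyone who wants to check the maths, here's a quick sanity check of those percentages. The core counts used (8704 for the 3080, 10496 for the 3090, 9728 and 16384 from the leak further down) are the commonly reported figures, assumed here:

```python
# Sanity-check the core-count jumps quoted above.
# CUDA core counts are the commonly reported figures (assumed, not from this thread's data alone).
cards = {
    ("RTX 3080", "RTX 3090"): (8704, 10496),
    ("RTX 4080 16GB", "RTX 4090"): (9728, 16384),
}
for (lower, higher), (a, b) in cards.items():
    jump = (b - a) / a * 100
    print(f"{lower} -> {higher}: +{jump:.1f}% CUDA cores")
# RTX 3080 -> RTX 3090: +20.6% CUDA cores
# RTX 4080 16GB -> RTX 4090: +68.4% CUDA cores
```

So the "just over 20%" and "around 68%" figures both hold up.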
 
The 3080 and 3090 are the same 102 die, just cut down. The supposed 4080 16GB and 4090 are different dies, 103 vs 102, so as to differentiate the two even more. The 4080 12GB is a 104 die, so it should really be called the 4070, but Nvidia are hoping to charge an extra £100 by calling it the 4080.
 

Indeed, but even the 'real' 4080 has a massive gap between it and the 90, much larger than in the 30 series. At least it leaves more room for a Ti and a bit more product differentiation than we sometimes saw with the 30 series.
 

People were banking on the 4070 being the one to get this gen, at least among those wanting something better than the 3070 and hoping the 4070 would be a cheap and powerful upgrade. It will be interesting to see how they fare considering, as mentioned above, the 80 series is on a weaker SKU this round.
 
Coming soon...

GeForce RTX 4090

Starting with the RTX 4090: this model features the AD102-300 GPU, 16384 CUDA cores and a boost clock of up to 2520 MHz. The card features 24GB of GDDR6X memory, supposedly clocked at 21 Gbps. This means it will reach 1 TB/s of bandwidth, just as the RTX 3090 Ti did. Thus far, we have only heard about a default TGP of 450W, but according to our information, the maximum configurable TGP is 660W. Just note, this is the maximum TGP that can be set through the BIOS, and it may not be available for all custom models.

GeForce RTX 4080 16GB

The RTX 4080 16GB has the AD103-300 GPU and 9728 CUDA cores. The boost clock is 2505 MHz, so just about the same as the RTX 4090. This model comes with 16GB of GDDR6X memory clocked at 23 Gbps, and as far as we know, this is the only model with such a memory clock. The TGP is set to 340W, and it can be raised up to 516W (again, that's the max power limit).

GeForce RTX 4080 12GB

The GeForce RTX 4080 12GB is what we knew as the RTX 4070 Ti or RTX 4070; NVIDIA made a last-minute name change for this model. It is equipped with the AD104-400 GPU with 7680 CUDA cores and a boost of up to 2610 MHz. Memory capacity is 12GB, using 21 Gbps GDDR6X modules. The RTX 4080 12GB's TGP is 285W, and it can go up to 366W.

As you can see, there is no RTX 4070 listed for now, and AIBs do not expect this to change (after all, it's just 6 days until the announcement). As far as the launch timeline is concerned, the RTX 4090 is expected in the first half of October, while the RTX 4080 series should launch in the first two weeks of November. We are waiting for detailed embargo information, so we should have more accurate data soon. Still to be confirmed are PCIe generation compatibility and, obviously, pricing.
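The leak's 1 TB/s figure for the 4090 checks out if you assume the usual memory bus widths for these dies (384-bit for AD102, 256-bit for AD103, 192-bit for AD104 — the bus widths aren't stated in the leak, so treat them as assumptions). A quick check, using bandwidth = (bus width in bits ÷ 8) × data rate in Gbps:

```python
# Memory bandwidth in GB/s = (bus width in bits / 8) * data rate in Gbps.
# Bus widths (384/256/192-bit) are assumed, not stated in the leak above.
def bandwidth_gbs(bus_bits: int, gbps: float) -> float:
    return bus_bits / 8 * gbps

print(bandwidth_gbs(384, 21))  # RTX 4090: 1008.0 GB/s, i.e. ~1 TB/s as quoted
print(bandwidth_gbs(256, 23))  # RTX 4080 16GB: 736.0 GB/s (assumed 256-bit bus)
print(bandwidth_gbs(192, 21))  # RTX 4080 12GB: 504.0 GB/s (assumed 192-bit bus)
```

If those bus widths are right, the 4080 12GB would have half the memory bandwidth of the 4090, which lines up with it originally being a 104-die xx70-class part.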

If the above is true, then it seems to me Nvidia want to charge 3080 money for the 4080 12GB, which is essentially a 4070, and they will probably look to charge 3080 Ti money for the proper 4080 (if you can even call it that). Sad, but that's what it looks like.

There is a small chance that the 4080 16GB will be priced at the 3080's price or thereabouts, and that they are using the 4080 name because the xx80 cards are very popular while trying to hit $500–550 with it, but I would not hold my breath.

I mean, look at the CUDA core difference between the 4090 and the 4080 12GB. It is huge. It will be interesting to see the pricing and performance on these :D
 
I think the 4090's performance will be incredible; looking at the core count, it's massive. But yeah, expensive across the board. The 4080 12GB could still be a decent card, especially for smaller builds with its lower power requirements, and I expect it will be very fast. But it's just sad they've labelled it an 80-series card, because you know they've ONLY done that because they've figured out they can charge an even higher premium for it.
 

Yup, as mentioned further back, if Nvidia do try anything scummy, I'll simply see what RDNA 3 offers, and if nothing compelling comes of that either, I'll skip a gen, which is no hardship since all the games I was looking forward to have been pushed back to 2023 and 2024.

It all depends on price and ray tracing performance; I couldn't care less about rasterization nowadays given the massive and quick shift to RT we are seeing, but I won't be paying more than £800 for that upgrade. I certainly won't be upgrading for an extra 2GB or 6GB of VRAM given I'm still waiting on all these supposed "vram" issues to manifest :cry:
 

Who knows, it's not official yet; it might still be called something else. But yeah, sad if they call that a 4080. As usual, I will look at price-to-performance.
 
I know no one knows yet, but do you think this is going to be like the 3000-series launch, in that they will be as hard as rocking horse poo to get hold of? Or do you think, with the cost of living etc., that demand for them will be lower, especially the 4090? I just don't know if I should try for an FE like last time or get an AIB one.
 

Recent years haven't felt great for buying games. The only appealing must-haves have been the Sony ports: God of War, Spider-Man, The Last of Us, the Uncharted collection. As well as Cyberpunk, which I've still never gotten back into. I look at everything else and think, do I need this? Nothing makes me say "God, I cannot miss out on this, I have to have it" or demands new hardware to run. I certainly don't feel like I'm struggling with 12GB.

Plus, Battle Royale killed gaming. They're all clones; everything follows the herd, built around selling cosmetic skins.
 
Of course revenue drops for Nvidia; that's because the mining bubble has burst, so purchases for mining purposes have dropped off a cliff. It has no reflection on gaming.
 
If you have an NVMe drive with a heatsink, you will definitely need to relocate it, as the card will not slot in otherwise; the heatsink on the NVMe will obstruct it. It happened to me with a 3090.
 
Of course revenue drops for Nvidia; that's because the mining bubble has burst, so purchases for mining purposes have dropped off a cliff. It has no reflection on gaming.
It demonstrates that the sky-high demand and sky-high prices were not attributable to the current gaming market. The real demand, and the prices gamers are willing to pay, are… less.
 
I don't think you actually read those articles if you think PC gaming in general is on the decline. No surprises there.
 