
NVIDIA 4000 Series

More space for the bend, plus the card will fit into smaller cases
The 4090s are extremely long cards which almost reach the front fans in most cases, so having the connector there will cause a severe bend in the cable. For instance, it doesn't work in my Lancool II Mesh case, as my Zotac card is almost touching the front fans.
 

The EVGA FTW3 card is much shorter than other 4090s. They made the card wider but shorter than other AIB 4090 cards, so it's a full 4-slot card, but its length is 5 cm/2 inches less than the others.
 
Great chart. Really drives home that the 4080 16GB is nothing more than a 4070 in disguise.

As usual, things are more nuanced than this. For a start, that chart is missing all the past xx90 cards, the dual-chip cards.

TSMC 4nm is still a cutting-edge node. Traditionally, Nvidia's new chips are on nodes that Apple has already been using for two years; i.e. Apple had already moved on from Pascal's 16nm when Nvidia and AMD moved in. And the 1080 Ti came out nine months behind the 1080.

TSMC 3nm is just coming out, but to me 4nm still feels a lot 'newer' than the nodes Nvidia normally gets to play with. What do other people think? To me, if the raw 4K performance* is at least 40% above the 3080, then the name '4080' is valid. The thing that's wrong is the price. Could the 4080 be under $1000 in a year? I think so. That would still make it as horrible value-wise as a 2080/2080 Super after taking inflation into account, but it's 'back in line' price-wise.

* With the big cache size, I don't think there is any fear of 2K (1440p) performance struggling here.
 
I got an email from Nvidia showing huge double-digit performance improvements in 30-series cards. Is that a general average trend across all games, or a case of cherry-picking?
 
TSMC 4nm is still a cutting edge node.

It's a custom TSMC 5nm, not 4nm; Nvidia calls it '4N', which confuses customers into thinking it's 4nm.

The jump in performance is decent because they come from Samsung 8nm, which is really equivalent to TSMC's 10nm. But other than the 4090, it looks like it'll be quite a poor uplift for a two-node jump on everything further down the stack, especially with the increase in prices, so the node benefit isn't being passed on to customers.

For $1200 we should really be getting a 4080 Ti. The 3080 Ti was $1099, and given the 4090's uplift was around 70% over a 3090 for $100 more, we should expect to see at least a 60% improvement over a 3080 Ti for that same $100 increase.
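The price-to-uplift arithmetic in that argument can be sketched in a few lines. The figures below are the ones quoted in the thread (forum recollections, not benchmark data), and `value_ratio` is a made-up helper for the comparison:

```python
# Rough generational value comparison using the launch prices and
# uplift figures quoted in the posts above (illustrative only).

def value_ratio(old_price, new_price, uplift):
    """Performance-per-dollar of the new card relative to the old one.

    uplift is the fractional performance gain, e.g. 0.70 for +70%.
    """
    old_perf_per_dollar = 1.0 / old_price
    new_perf_per_dollar = (1.0 + uplift) / new_price
    return new_perf_per_dollar / old_perf_per_dollar

# 4090 vs 3090: ~70% faster for $1599 vs $1499 (the "$100 more" case)
print(f"4090 vs 3090: {value_ratio(1499, 1599, 0.70):.2f}x perf per dollar")

# The hoped-for case: +60% over a $1099 3080 Ti for $1199
print(f"+60% for $100 more: {value_ratio(1099, 1199, 0.60):.2f}x perf per dollar")
```

Both ratios come out well above 1.0, which is the poster's point: if the 4090 can deliver roughly 1.6x the performance per dollar of its predecessor, a $1200 card should be expected to do something similar rather than merely matching the old tier.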
 
It's a custom TSMC 5nm, not 4nm; Nvidia calls it '4N', which confuses customers into thinking it's 4nm.

Potentially, yes, you are right; it's like TSMC 12nm vs 14nm. And 5nm was 2020 for Apple. So it boils down to whether Nvidia's 4nm offers any improvement over AMD's 5nm here. If there is no difference, then it would invalidate the point I was trying to make in my post above.

Both Shrimp and Kuo indicate that the 4nm process isn't significantly better than the 5nm approach, and any improvements would be marginal. However, this claim appears contradictory, because Qualcomm's most recent flagship chip, the Snapdragon 8 Plus Gen 1, is said to have benefited from TSMC's 4nm technology. (gizmochina)
 
Price must be a typo, because it says £2,249.99. Pre-scalped for your convenience.
Wow, I pre-ordered a Strix for £400 less... that TUF price is a crazy markup for what is supposed to be one grade above a Zotac!
I got an email from Nvidia showing huge double-digit performance improvements in 30-series cards. Is that a general average trend across all games, or a case of cherry-picking?
What are you blathering about? Your question makes no sense.
 
The jump in performance is decent because they come from Samsung 8nm, which is really equivalent to TSMC's 10nm.

And yet AMD still couldn't beat them last gen on a supposedly superior node. AMD's gaming rasterisation performance and power efficiency were better, but their computational performance, i.e. ray tracing in games and productivity, was worse. We got a _lot_ of transistors from Samsung vs Turing. For people like me interested in productivity ray tracing, the move to Samsung was a good outcome.

This generation, it's looking like Nvidia will do OK on performance per watt but suffer on build cost vs AMD. People like MLID keep telling us Nvidia is paying more for TSMC 4nm (apparently, from other sources, $7 billion was shelled out) and cards will cost more to make than AMD's.
 
And yet AMD still couldn't beat them last gen on a supposedly superior node.
AMD was a generation, if not more, behind Nvidia, with a much smaller R&D budget, so just being close was a good result. This gen they look like they will comfortably beat Nvidia in raster while offering similar RT at every price point they are competing at, so it looks like a good result for them.
 
Pricing is crazy for the partner cards. I don't blame OcUK for this. But when you're talking £400-500 extra on some cards, something has to give. I realise we're still in the supply-and-demand phase, so pricing may drop when AMD's cards are released, but I'm not holding my breath. I hope AMD have a crazy supply of cards ready to go.
 
Pricing is crazy for the partner cards. I don't blame OcUK for this.
Why don't you blame OcUK for this? What is the difference from someone selling on eBay at hugely inflated prices? They are both taking the ****.
 