RTX 4070 12GB, is it Worth it?

Status
Not open for further replies.
How did you conclude that AMD are better engineers? Seriously, that's crazy to suggest. Usually they deliver less with more. AMD stands absolutely no chance without TSMC. None whatsoever. They would have been eaten alive by Intel if Intel had access to 5nm TSMC and AMD was using Intel's fabs. You know the last time AMD didn't have the node advantage they delivered BULLDOZER. Just saying.

Even their GPUs have more of EVERYTHING: RAM, transistors, bus width, shaders, yada yada yada. Since we had the same discussion a few days ago, and die size is SO important: the XTX has a 50% bigger die than the 4080, and a 50% wider bus, and we can all see the results of that..

Ah...... TSMC, just go with TSMC and THEY will make all the best stuff for you.

The last guest Tom (MLID) had on was a foundry guy, go have a look. Tom put it to this foundry guy that, given the clear advantages of chiplets, now that Nvidia are back at TSMC why don't they make chiplet GPUs?
The foundry guy looked a little perplexed and said "TSMC don't design chips, they simply make them". Tom, having not listened to a word this foundry guy had just said, explained that "Nvidia told him that they don't see any benefits in chiplets and will continue with monolithic because that suits them better".
The foundry guy suggested that chiplets are the way forward for everyone.

He went on to explain something a little deeper: when Apple need something manufactured they have specific demands of the node, in their case very low-power efficiency. Apple, being the designers of their architecture, help TSMC design and optimise the node for their architectural needs.
With that, TSMC add that innovation and engineering to their portfolio as a diverse foundry with a set of skills.

AMD are No. 2 for TSMC and have very different demands of the node to Apple: AMD need power and frequency scalability. AMD add their own expertise to help TSMC optimise their node to suit AMD, and TSMC again add that innovation and engineering to their portfolio.

The foundry guy added that Xilinx at some point had specific packaging demands for their products, and Xilinx designed those packaging technologies for TSMC's node.
Both Apple and AMD now use those TSMC packaging technologies.

TSMC are the best foundry because they have the best customers designing their stuff for them. TSMC are not architectural designers; they are manufacturers with a portfolio of innovation created by their core customers.
 
Not at all true. First of all, the fact that you are suggesting chiplets are the way to go for everyone, yet Nvidia is curb-stomping AMD without chiplets, is a perfectly clear example of Nvidia doing a better job of designing the products. They got more with much less. Spec-wise the 7900 XTX is monstrous next to a 4080, yet performance is mediocre. RDNA 2 was able to compete with Ampere, but that's because of a huge gap in node. Imagine RDNA 2 on Samsung's 8nm and Ampere on TSMC. Yeah... well.

Furthermore, there would be no 3D V-Cache CPUs without TSMC. That's all TSMC's patent. AMD would be losing on every measurable metric to Intel while Intel is on an inferior node with monolithic chips (which, again, you said are not as good as chiplets).
 

I'm not suggesting it; Tom's guest, a foundry insider, suggested it.

You're also not listening, or deliberately being obtuse. Intel or Nvidia cannot just demand that TSMC make them chips like AMD's, with their chiplets and 3D stacking; they would be told that's not how it works: you design it and we make it.
 
Of course Intel and Nvidia have to design it. So? That's not the point. The point is you can design all you want, but without TSMC you cannot produce it. Only TSMC can make 3D-stacked chips.

You have to remember that AMD only started actually competing in the CPU space after they moved to TSMC. When they were using GloFo they were the better choice, of course, but that's because they were undercutting Intel by a lot. They were selling their 4-core CPU for less than half the price of a 7700K, and the 7700K was miles ahead in performance, especially gaming. MILES. They reached parity in core performance with Zen 2, built on TSMC. And just to remind you, the 7700K was a 2014 architecture built on a 2014 node. If AMD were the better designers they should have matched that day one on GloFo.
 
Half the forum is :cry:

Right......

Anyway.

I think Nvidia are a very different kettle of fish to Intel, and AMD are also in a very different situation now vs pre-Ryzen.

AMD cannot best Nvidia. AMD are semiconductor engineers, as are Intel. I am going to upset melmac, and he can rant on his own in a dark corner all he likes...... AMD are better semiconductor engineers than Intel; AMD bested Intel by being better.
AMD know this. They have always been better, they just needed a chance to recover. Hence my thread in the CPU room: I have been following this for a long time, and I knew what was coming just as soon as AMD quit dying.

I also think AMD are better semiconductor engineers than Nvidia. However, Nvidia are better software engineers than AMD. I'm not talking about driver UIs and features, I'm talking about the really complicated stuff in the back end that makes things work.
IMO Nvidia are about as good as it gets at that; one has to admire their skill, it's phenomenal.

AMD will never be as good in RT, in image upscaling, in image encode/decode.... AMD are playing to their strengths as best they can, but having the most advanced architectural design matters for absolutely nothing when, because of the software, you get what matters to you better from Nvidia.

Sure, there is a point at which people will buy the AMD GPU over the Nvidia one, but at a $250 6750 XT vs a $500 3070..... that's not sustainable, and the instant AMD do that they will never recover from it, because from that moment on it's what will be expected from AMD. That's the real problem with that.

Yes, I know, 7900 XT..... I have said many times that was bad, just as bad as Nvidia's 4080 stunt, and it's right that AMD are called out on it. I hope they never try to pull a stunt like that again.

So TLDR:

Nvidia are better in pretty much every area, hence why they can charge a premium. Good to know we have finally got there :cry:
 

It's exactly the point.

On competitiveness in foundries, he said that after selling off what became GlobalFoundries, AMD dumped their remaining shares for TSMC because GloFo didn't invest in the latest EUV machines and manufacturing.
It's not just one thing. In that sense, right, AMD cannot make the same chips at GloFo, but equally TSMC cannot take advanced architectural designs from one customer and use them for another, because those things have to be designed by said customer.

He also brought up Samsung: Nvidia, in their time there, did nothing to help make them any better.

He has hopes for Intel's foundries, but they will need help from their competitors, which means they are going to have to be quite friendly, and Intel have been courting AMD.
 
Our disagreement lies in your claim that AMD has better chip designs. All evidence, at least the way I see it, points to the exact opposite. Now, sure, you can claim that their hardware severely underperforms due to bad software, but there is no way to know that.

All I can see is AMD delivering less with more. The XTX should be dwarfing the 4080 in performance, and yet it's barely ahead in some scenarios and a lot behind in others. Is their software really THAT bad?
 
Whether or not it's 'better' is one thing, but you can't argue against the fact that AMD definitely have the most innovative design. IMO this is AMD's first-gen Ryzen moment in the GPU sector. Let's see how they compare to Nvidia in a gen or two once the kinks are ironed out. Chiplets are the future, and it's really only a matter of time before Nvidia take the same approach.
 
Until others catch up, they do. AMD have been making MCM x86 CPUs for 6 years now; Intel still talk more about MCM concepts than actually doing them.
No one else does 3D stacking.
AMD were first with MCM GPUs. I'm still waiting for Intel or Nvidia to do that; Intel had MCM GPU concepts they talked a lot about, until one day they were quietly dropped, never to be spoken of again.

Also don't forget, AMD's first MCM CPUs were made at GlobalFoundries, not TSMC.

IMO Apple arguably have some of the most advanced core designs, yet they are not the fastest.
 

I was wondering: how much longer can the nodes keep shrinking?
 
I don't think we will ever agree, cause AMD hur dur. I'm going by the results: they have the cutting edge, and their 50%-more-of-a-GPU 7900 XTX barely beats, if even that, the 4080, which is half the chip.

And the same applies to the CPU space, where it basically took a move to TSMC's 7nm to actually match Intel's ancient 2014 node and CPU design.

Even their Zen 4 part that moved to 5nm: compare that jump to Intel's jump between 12th and 13th gen, which was on the same node.
 
Chiplet designs have the disadvantage of needing more transistors and more power, because you need more complex interconnects. But they make manufacturing easier, because you can use smaller dies and can even put some parts on lagging nodes. Nvidia and Intel will have to make the move at some point too and navigate the same issues; Intel already started with Lakefield. Nvidia won't get the same process-node jump next generation to 3nm. Ironically, AMD will get a bigger boost next generation, as their current generation is on worse nodes than Nvidia.
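The "smaller dies make manufacturing easier" point above can be sketched with the classic Poisson die-yield model: one defect anywhere scraps a monolithic die, whereas with chiplets only the faulty small die is discarded. The defect density and die areas below are illustrative assumptions, not real TSMC figures:

```python
import math

def die_yield(area_mm2: float, d0_per_mm2: float) -> float:
    # Simple Poisson yield model: P(die has no defects) = exp(-D0 * A)
    return math.exp(-d0_per_mm2 * area_mm2)

D0 = 0.001       # assumed 0.1 defects/cm^2, purely illustrative
TOTAL = 600.0    # total silicon needed per product, mm^2

# Monolithic: one defect anywhere scraps the whole 600 mm^2 die
mono_yield = die_yield(TOTAL, D0)
mono_wafer_area_per_good = TOTAL / mono_yield

# Chiplets: six 100 mm^2 dies, bad ones discarded individually (known-good-die)
chiplet_yield = die_yield(TOTAL / 6, D0)
chiplet_wafer_area_per_good = TOTAL / chiplet_yield

print(f"monolithic yield {mono_yield:.1%}, "
      f"wafer area per good product {mono_wafer_area_per_good:.0f} mm^2")
print(f"chiplet die yield {chiplet_yield:.1%}, "
      f"wafer area per good product {chiplet_wafer_area_per_good:.0f} mm^2")
```

With these made-up numbers the monolithic design burns roughly 1093 mm² of wafer per good product against roughly 663 mm² for the chiplet version, which is the cost lever the post is describing; the interconnect power and transistor overhead is the price paid for it.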
 
 

To use an example:

Analogue semiconductors have stopped shrinking entirely at about 7nm; the scaling had already tailed off quite drastically at 14nm.

Memory controllers and memory cache are analogue semiconductors. Because they no longer shrink, anything on the logic die that is the same size on 4nm as it was on 7nm takes up proportionally more die space; more of the die space you have to work with is given over to these things.
This is what Jensen is talking about when he says Moore's Law is dead.

To stop the size of the die from ballooning out of control, and costing more just because of the dead space the analogue semiconductors take up, Nvidia are cutting down on how much of that they put on those dies. So 3070 vs 4070: a 256-bit bus becomes 192-bit. It's narrower, and you can only add memory capacity in multiples of 3.

AMD with MCM move all of those analogue semiconductors off the logic die. They make all of that stuff on cheap older 6nm nodes in 64-bit blocks, which they then glue to the logic die in whatever quantity they choose: 4x for 256-bit, 6x for 384-bit..... any number, really, within reason.
Another added bonus is that the logic die, the most important bit, is much smaller than it otherwise would be: the 5nm 7900 XTX logic die is literally half the size at ~300mm² vs the 4090 at ~600mm² on 4nm; it's smaller than a 6700 XT.

Don't tell me Nvidia aren't doing MCM because they chose not to.
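The 64-bit-block scaling described above can be shown with a toy calculation. The function is mine, purely for illustration, and assumes one 2 GB GDDR6 chip per 32-bit channel (a common configuration on current cards):

```python
def gpu_memory_config(mcd_count: int) -> tuple[int, int]:
    """Bus width and VRAM for a GPU built from 64-bit memory chiplets.

    Illustrative sketch of the scaling described above, not AMD's
    actual design tooling. Assumes one 2 GB GDDR6 chip per 32-bit channel.
    """
    bus_bits = mcd_count * 64          # each memory die adds a 64-bit slice
    channels = bus_bits // 32          # 32-bit channels on that bus
    vram_gb = channels * 2             # 2 GB GDDR6 chip per channel
    return bus_bits, vram_gb

for n in (3, 4, 6):
    bus, vram = gpu_memory_config(n)
    print(f"{n} memory dies -> {bus}-bit bus, {vram} GB VRAM")
```

Six dies gives the 384-bit/24 GB configuration of the 7900 XTX and four gives 256-bit/16 GB, while a monolithic design has to commit to one of these points at tape-out; three 64-bit slices also reproduces the 192-bit/12 GB shape of the 4070, which is where the "multiples of 3" constraint on its capacity comes from.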
 
I find the delusion on both sides rather amusing when PCMR thinks these companies can do whatever they want. Every one of them thinks it's a seller's market. It's a seller's market for the FOMO crowd.

So, last December (the quarter when most sales tend to happen):


This March:

Graphics card sales are in the toilet for both. Even though Nvidia sells far more dGPUs, AMD's console revenue was almost the same as Nvidia's consumer graphics revenue last quarter! :cry:


Their last quarter's consumer graphics revenue was down 46% YoY to $1.83 billion.



Nvidia has massively cut down on TSMC orders:
https://twitter.com/SKundojjala/status/1640405109359378433


Then look at the collapse in CPU sales from Intel and AMD. Yet the PCMR forum guys think Nvidia, AMD and Intel can price their dGPUs and CPUs at any price they want. AMD has already had to reduce prices of its dGPUs and CPUs.

Now it appears the RTX4080 and RTX4090 prices are being reduced too:

The RTX4070 was reduced in price within a week of launch.


Samsung etc. are saying they are making losses on NAND/DRAM. Prices of SSDs, DDR4, DDR5, etc. are collapsing.

PC sales are collapsing:

Gartner Says Worldwide PC Shipments Declined 30% in First Quarter of 2023


High Inventory and Low Demand Led to Second Consecutive Quarter of Historic Year-Over-Year Decline


But iTs A sElLeRs MaRkEt right? But Ai MeAnS tHeY cAn SeT aNy PrIcE! PCMR are delusional! :cry:

It's all going according to plan Rodney!

 

Yep. That is why I have not touched any of the cards this gen. Same as Nexus, poor fella, still stuck on his 10GB 3080. I fear he will give in and get a 4090 next year, closer to the end of that card's life cycle :cry:
 
When Nvidia does MCM designs it will be the second coming!



His fault, with his obsession with RT. If a game is not good enough to hold my attention on medium settings, even if it looks fantastic on the highest settings, it's a tech demo for me. I can always run it again on better settings when I have faster hardware a few years later.


And he loved the smackdown melmac provided you :p:cry:
Because he was right about AMD repeatedly screwing up launches and doing dumbarse things, something I have moaned about for years. He also missed some even more egregious screwups during that period, which I almost felt like chiming in on. Even Zen 1 had a disastrous launch when it came to the platform.

AMD has the ability to snatch defeat from the jaws of victory.
 