
NVIDIA ‘Ampere’ 8nm Graphics Cards

Then maybe you should try and understand how it works.

Pascal dies were tiny. Margins were enormous. Maxwell dies (most of them) were tiny, margins were enormous. Kepler dies were mostly tiny too, margins were slightly less.

Tiny dies with high clocks will not work any more. For RT they need to be enormous, or at least contain a lot of stuff, so even shrunk they end up quite large. That is partly why Turing saw an enormous price hike, to new levels. Were the margins still there for Nvidia? Of course they were! *BUT* the part you are trying to dismiss is very real. Manufacturing costs were higher than ever. The cooler alone cost them $50+ a piece TO MAKE. All of that gets tacked on at the end, then the margin is added.
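To make the "costs get tacked on, then the margin is added" point concrete, here is a rough back-of-the-envelope sketch. Every number in it is an assumption I made up purely for illustration, not Nvidia's actual bill of materials, but it shows how something like a $50 cooler snowballs once each layer adds its cut:

    # Illustrative bill-of-materials stack; every figure is an assumption,
    # not Nvidia's real costs. The point: a fixed cost like a $50 cooler
    # gets marked up at every layer before it reaches the shelf.
    bom = {
        "GPU die": 150,
        "GDDR6 memory": 120,
        "PCB, VRM and misc parts": 80,
        "Cooler": 50,
    }
    build_cost = sum(bom.values())            # $400 to build the card
    price_to_partners = build_cost * 1.35     # assumed ~35% margin added on top
    shelf_price = price_to_partners * 1.25    # assumed ~25% distribution/retail cut

    print(f"Build cost:        ${build_cost}")             # 400
    print(f"Price to partners: ${price_to_partners:.0f}")  # ~540
    print(f"Shelf price:       ${shelf_price:.0f}")        # ~675

Swap in bigger assumed numbers for a huge RT die and an elaborate cooler, and the gap at the shelf widens fast.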

That is why, even if the 3080 is faster than the 2080Ti in *everything*, it "only" costs £800. That is a saving of what, £600 over the 2080Ti, and in RT it will absolutely blow it away.

Whether you like that price? I don't really care, man. I don't like it either and I won't be paying it. BUT that does not dismiss the facts.

AMD won't do you any favours either. They'll just price slightly less than Nvidia, and still charge far too much. Get used to it or, this December, buy a console and walk away from it.

I understand all that. You have said nothing that is not known. Lol.

I am not trying to dismiss anything. I am pointing out that huge increases in prices cannot be just put down to increased costs alone.
 
Problem is "RTX" is what is pushing up the price but it has been a failure so far. No one really cares about it and it's not the direction things are going at the moment. Though now that they have included it, they can't backtrack and remove it to cut costs. AMD could use that to u

Ray-tracing is behind photogrammetry (looks better imo and has much lower requirements) and VR on the feature priority list. Way behind. Nvidia is having to pay developers to include RTX, but even when they invest the time in it, it's not enough of an improvement to warrant the massive fps hit, so most people turn it off.
 
If Tensor cores are the excuse for the high prices, why would 15-20% more performance cost $1,400?

They can't have it both ways. A 2080Ti without ray tracing would have been okay at $700.

OK, look, let me explain how this works once. I'm not being sarcastic, but try and understand. This goes for anyone else wondering why Turing was so expensive.

It had an absolutely ENORMOUS die: 754mm2. Want me to put that in perspective? OK. The Titan XP had a die size of 471mm2, so the Turing die is roughly 60% larger. Following so far? Right. This is a silicon wafer.

[Image: a silicon wafer, showing the grid of individual dies]

They are round. OK, so see all of the little squares? Those are CPU or GPU dies. See all of the ones cut off around the edge? Those go in the bin. However, not all of the remaining dies will work either. Silicon wafers are not perfect, and any defect will lead to a dead die. The larger the die, the higher the chance it lands on a bad area.

These large dies are called monolithic dies, because they are one single die, and they are huge. And they are very expensive, because like I said the success rate is much lower. If you look at how AMD made Ryzen (well, Jim Keller) you can see what genius went into it. Why? Because the CPU is built from small dies that are then "glued" together, as Intel put it, with Infinity Fabric. Each die (core count depending on the shrink and how much they can fit in the area) is tiny, which means the success rate is off the chart.

That is why the 18-core Intel costs a ****** fortune, yet the 3950X in comparison costs them bugger all. Because the 3950X is made from much smaller dies, the percentage of working dies per wafer is enormous lol.
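If you want to see why die size hammers cost so hard, here is a crude sketch using the classic Poisson yield model. The wafer cost and defect density below are made-up illustrative numbers (real foundry figures are confidential), and it ignores edge loss and scribe lines, but the trend is the point:

    import math

    WAFER_DIAMETER_MM = 300   # standard 300mm wafer
    WAFER_COST = 8000         # assumed cost per wafer in USD (illustrative only)
    DEFECT_DENSITY = 0.001    # assumed defects per mm^2, i.e. 0.1 per cm^2 (illustrative only)

    def cost_per_good_die(die_area_mm2):
        """Rough cost per *working* die, ignoring edge loss and scribe lines."""
        wafer_area = math.pi * (WAFER_DIAMETER_MM / 2) ** 2
        dies_per_wafer = wafer_area / die_area_mm2
        # Simple Poisson yield: chance a die lands on zero defects.
        yield_fraction = math.exp(-DEFECT_DENSITY * die_area_mm2)
        return WAFER_COST / (dies_per_wafer * yield_fraction)

    for name, area in [("GP102 / Titan XP, 471mm2", 471),
                       ("TU102 / 2080Ti, 754mm2", 754),
                       ("Ryzen chiplet, ~74mm2", 74)]:
        print(f"{name}: roughly ${cost_per_good_die(area):.0f} per working die")

With those made-up inputs the big Turing die works out at roughly double the cost per working die of the Pascal one, and an order of magnitude more than a small chiplet, which is exactly the 18-core Intel vs 3950X point above.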

THAT is why Turing was expensive to produce. Even if TSMC charged them "per working die", the price would have had to cover the dies that didn't work too, otherwise TSMC would have gone bust. THAT is why Nvidia went to Samsung. THAT is why the 3080 has most probably been made in two smaller parts (do you understand the logic now?) and that is why it costs £600 less than a 2080Ti even with a cooler that costs them around $100 to make.

Without ray tracing? PAH. They could have stuck to cards like the 670, 680, 970, 980, 1070... You get my drift, yeah? Tiny weeeeny little dies that yield really well and are cheap, and then charge a fortune. But that categorically WILL NOT WORK for ray tracing. For that you need serious muscle. Had they not bothered? Then AMD *would have*, and Nvidia would have looked really stupid.

Again I will reiterate, I think RT is awful as it sits. BUT those Tensor cores are *AMAZING* in DLSS 2.0. It's genius-level stuff.
 
I understand all that. You have said nothing that is not known. Lol.

I am not trying to dismiss anything. I am pointing out that huge increases in prices cannot be just put down to increased costs alone.

Titan XP 471mm2.
2080Ti 754mm2.

Roughly 60% more silicon at the same launch price. Meaning the 1080Ti, with exactly the same size die (but not fully enabled, hence the cheaper price), was what? £699 launch price, or about £750. Add ~60% and a much more expensive cooler and where do you end up?
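Quick sanity check on that maths: 754 / 471 ≈ 1.6, and £699 × 1.6 ≈ £1,120; add a far pricier cooler and the lower yields on a die that big, and you land more or less exactly where the 2080Ti launched.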

I knew as soon as Nvidia dropped their tiny dies and went back to big tank dies that things would get expensive. I just knew it. The GTX 480 almost bankrupted them, because it was HUGE and they couldn't sell it for what they wanted to because of ATI and the 5000 series cards, which were not huge and were mostly as fast. That is where the light bulb went off in Nvidia's head (Kepler: tiny little dies, huge clocks), while AMD, who had bought ATI, screwed the pooch and did the opposite LMFAO, and kept doing it ever since, until Polaris and Navi. Huge tank dies are all well and good, but in DX11 games? Clock speed mattered the most.

That all changes when you add RT.

If you had any sense (or more money than sense, like I did lol) you would have avoided Turing completely and waited for the real deal, Ampere. That was what was coming; Turing was a stand-in 'cause Jen must have been bored or something.
 

good read, thanks
 
I did avoid it, way overpriced for what you got ;)

But now looking forward to Ampere and will grab a 3070 or maybe even a 3080 as I have many games I look forward to in the next 12 months :D
 

As always the value will be lower down the stack. If you want bleeding edge you have to pay for it.

I would never, ever have bought a 2080Ti (let alone a £1800 Kingpin lmfao) had I not had the cash literally fall in my lap. I was good: no debt, savings in the bank, and I don't drink, smoke, or anything else (no drug habits or whatnot, and I don't gamble either; I used to write the software so I know how it works :D ). But yeah, I had £7k fall in my lap unexpectedly and so I bought a new PC, mostly because I knew that otherwise I would have wasted it all on tat and had nothing to show for it. I am not a "usual" top-end card buyer. I had already bought my upgrades (I have two rigs, so a 2070 for £350 in one, and a 2070S for £418 in the other) and I was happy.

Before that? A used Titan XP for £675 (less than a new 1080Ti, and it was 3 months old), and before that a used Titan XM for £400 right before the 1070 came out for a bit more, yet the XM was faster because it had a Hybrid cooler. My second rig got a mining Vega 64 for £230.

I never ever buy bleeding edge GPUs. Not ever. Last time I did? was three Titan Black, and again that money came out of thin air too unexpectedly.
 
Nvidia net and gross margins rose during the Turing era, especially when, 1 year ago, they had mining-boom margins with very little actual mining and had 2/3 of their revenue from gaming.

Edit!!

Any cost increase has been massively outweighed by price increases to end consumers.

People also forget that TSMC 12nm is basically TSMC 16nm, which was an old process node by then, and relatively high-yielding and cheap.

This is why Intel is still making decent profits with 14nm++++++++++++++++++++ despite using larger dies than AMD, i.e., it's probably very cheap and high-yielding by now.
 
Problem is "RTX" is what is pushing up the price but it has been a failure so far. No one really cares about it and it's not the direction things are going at the moment.

Computer Graphics folks, from academics to games designers, have been 'caring' about real-time raytracing for decades. It's in the new consoles, it's in AMD's next gen, I'd call that a pretty big part of "the direction things are going", personally.
 
Nvidia net and gross margins rose during the Turing era, especially when 1 year ago, they had higher than mining margins and had 2/3 of their revenue from gaming.

Yeah they did well. And yeah, they did put prices up but that was because other avenues were failing for a while.

But even with the inevitable pee take because they were ahead there is no denying Turing was expensive to make.

Good to see you, BTW :)
 
Where is it you hang about that cash keeps falling out of thin air like this? I may hang around there too when I get the chance, though I would stick the money on overpayments on my mortgage, as like you I have no bad habits like the ones you mentioned :D


Nvidia net and gross margins rose during the Turing era, especially when 1 year ago, they had higher than mining margins and had 2/3 of their revenue from gaming.
Lies :p
 
Yeah they did well. And yeah, they did put prices up but that was because other avenues were failing for a while.

But even with the inevitable pee take because they were ahead there is no denying Turing was expensive to make.

Good to see you, BTW :)
So in the end you agree with me then. Haha.
 

No I don't, dude. Not in the logic you were talking. Bottom line? The 3090 is smaller than the 2080Ti, and it's £1400, or more. However, they need to make the co-processor too...

RT is going to be expensive. As I have explained that was a foregone conclusion as soon as Nvidia switched back to making tanks. They got ahead first (by a long way) and then they dropped the tank RT GPUs before AMD could. And yeah, I saw the comment about consoles and yeah, they have it, but Nvidia beat all of that to the punch by over two years (remember the Volta Titan? how much was that again? lmfao).

Where is it you hang about that cash keeps falling out of thin air like this? I may hang around there too when I get the chance, though I would stick the money on overpayments on my mortgage, as like you I have no bad habits like the ones you mentioned :D

No kids, no mortgage, no wife, single etc. And like I said, no habits at all.

Things happen in life :) we all have good luck at least sometimes.
 

Good to see you too!:)


The issue is Nvidia keeps increasing prices, but also tends to stick to older and cheaper nodes. There is a reason why AMD/ATI tended to have lower margins: they moved quicker to newer, lower-yielding nodes, and priced the GPUs cheaper too.
 
So in your mind RT is "eye-candy over substance", but all the other techniques are not?
Should we all be happy with Commodore-64 levels of graphics?

Personally I love the idea of more accurate reflections and better modelling of transparent and semi-transparent materials, it can all add to the immersion.
(I also have a lot of time for C64 games, before anyone asks...)

At the moment you don't need ray tracing to enjoy a game, but you had to pay for the hardware to use it. The "other techniques", as you call them, were not premium extras added to the price when you bought into the hardware.
Did someone playing Battlefield, for instance, enjoy the game more because it looked better with RTX enabled? From what I read it was turned off because of the performance hit.
 
Computer Graphics folks, from academics to games designers, have been 'caring' about real-time raytracing for decades. It's in the new consoles, it's in AMD's next gen, I'd call that a pretty big part of "the direction things are going", personally.

Only a handful of games are actually using it and it's not worth having on in any of them due to the performance hit. It really doesn't add anything to the experience. If Nvidia wasn't giving them cash they wouldn't have bothered. It's not sustainable in this form, just like PhysX and 3D Vision weren't.

Give it 5 years maybe and it will just become a basic feature. Once it runs well and costs drop. But right now there are other priorities which have a far bigger impact on the experience and sales numbers.
 
No I don't, dude. Not in the logic you were talking. Bottom line? The 3090 is smaller than the 2080Ti, and it's £1400, or more. However, they need to make the co-processor too...

RT is going to be expensive. As I have explained that was a foregone conclusion as soon as Nvidia switched back to making tanks. They got ahead first (by a long way) and then they dropped the tank RT GPUs before AMD could. And yeah, I saw the comment about consoles and yeah, they have it, but Nvidia beat all of that to the punch by over two years (remember the Volta Titan? how much was that again? lmfao).



No kids, no mortgage, no wife, single etc. And like I said, no habits at all.

Things happen in life :) we all have good luck at least sometimes.
In the end all I am saying is their profit margins are up, and that can't all be put down to increased costs like some here like to do. But you chose to interpret it differently.
 

Nvidia marketing is cunning. Do you think it was a coincidence that, before we saw the first price rise (i.e., the first Titan), a leak apparently showed Nvidia complaining about high node pricing, etc.? Ever since then, as Nvidia posts ever more record gross/net margins and revenue, people keep thinking Nvidia is only raising prices to cover rising costs. Before that, no one gave two craps about production costs; it shows you how well guerrilla marketing works.

If that was the case, Nvidia margins and revenue from the consumer space would remain flat. Like I said 12 months ago, with mining petering out, they made well over 60% margins with consumer graphics revenue making up 2/3 of all their revenue. That is more than Apple or Intel, BTW.

TSMC 12nm/16nm was a lagging node, so Nvidia was doing TSMC a favour by using it.

The only reason margins dropped a bit recently is because they had to pay billions of USD for Mellanox.

This is what gamers don't appreciate: Nvidia is using all the money from rising gaming GPU prices to fund things such as Tegra (which I believe lost them at least $1 billion at the start) and other purchases. So they are losing money on, or subsidising, those ventures from consumer sales. This is exactly the same reason Intel milked consumer sales, i.e., to fund contra-revenue for Atom and other purchases.

So all those high Intel/Nvidia prices gave people cheaper Atom/Tegra CPUs in cheap consumer products (do people think Nintendo wants expensive parts?). Even AMD is probably doing it to a degree, i.e., console SoCs will be lower margin than the consumer products. The irony: their loyal fans get taken for a ride, while the rest, who only wanted some cheap tablets or consoles, got them cheaper thanks to gamers helping out.
 