NVIDIA ‘Ampere’ 8nm Graphics Cards

Caporegime
Joined
18 Oct 2002
Posts
39,322
Location
Ireland
It has been said, by legit sources, that the cooler alone costs $150 each to make. That's not the marked-up price it's sold on to us at; that is the cost to make each one.

When you make a cooler like that, at that price, you are so far up your own a**e that all reality is gone. Nvidia want to be Apple *so* bad. However, to actually get away with that you need to offer something your buyers will see as truly exclusive: not just the fastest (which I have absolutely no doubt it will be) but also the most exquisitely built.


Basically all speculation, "sources" or not. Throw enough **** at a wall and some of it is gonna stick. Remember how all the "sources" got the Fury X wrong in terms of HBM? One minute it was 4GB, then 6GB, then 8GB, then up and down until it was revealed to be 4GB. Then they can point to their 4GB "source story" and proclaim they were right, while sticking their fingers in their ears and whistling Dixie if anyone brings up the articles the "sources" got wrong. It just seems more like tech sites pulling numbers out of the air once they spot an unusual cooler design, citing a random "$150 to make" for clicks++.
 
Soldato
Joined
6 Feb 2019
Posts
17,596
According to that article the FE has the cut-out for the cooler, but the ones for board partners will be the regular rectangular shape. They also claim the cost of the FE cooler is $150 USD.

$150 sounds excessive, way OTT; it's hard to believe, and they don't back it up with any further details. But I do believe Nvidia has had to change their cooler because of the TDP. The most recent leaks show the 3080 Ti/Titan shipping with a 350W TDP, which is significantly more than anything Turing had to cool.
 
Soldato
Joined
6 Feb 2019
Posts
17,596
With the fins in the red area curved like that, air isn't going to be able to travel through to the other fan.

If the idea is to have a through-flow then it seems badly designed to me. I mean, a fan on the back, unless it is deeply embedded, will make the card very thick.

I reckon it's a bad idea and probably completely bogus, but as always only time will tell.

It's a split-PCB design: one fan and heatsink cool just the power-delivery PCB, and the second fan and heatsink cool just the GPU core and memory PCB.
 
Caporegime
Joined
18 Oct 2002
Posts
39,322
Location
Ireland
It's a split-PCB design: one fan and heatsink cool just the power-delivery PCB, and the second fan and heatsink cool just the GPU core and memory PCB.


Well, maybe, maybe not. They're saying there are cut-outs in the PCB, which would make it similar to the original GTX 295 before they went with the single-PCB variant.

You can only tell so much from the pics, and they don't really show enough detail to know what is really going on underneath it all. It would be a really odd move to dedicate only a single axial fan to cooling the GPU and memory if this is indeed meant to be a power hog; in cooling terms that would be a downgrade from the 2080 FE cooler, and that was only just enough to keep Turing "cool". And that heatsink design was honestly crap, as it pretty much blocked off air getting to the PCB.
 
Soldato
Joined
23 Apr 2010
Posts
11,896
Location
West Sussex
Basically all speculation, "sources" or not. Throw enough **** at a wall and some of it is gonna stick. Remember how all the "sources" got the Fury X wrong in terms of HBM? One minute it was 4GB, then 6GB, then 8GB, then up and down until it was revealed to be 4GB. Then they can point to their 4GB "source story" and proclaim they were right, while sticking their fingers in their ears and whistling Dixie if anyone brings up the articles the "sources" got wrong. It just seems more like tech sites pulling numbers out of the air once they spot an unusual cooler design, citing a random "$150 to make" for clicks++.

There is speculation and there are facts. The leak apparently came out of Foxconn, or somewhere else where these parts are being made. And I can well believe what the coolers cost, because the 2080 Ti FE cooler cost a bomb too: about $70, IIRC.

I'm not debating the cost of the GPU itself, just the cooler, as that is all we have any factual information on ATM.

If someone had told you, back when Bulldozer and then Sandy Bridge launched, what would happen afterwards, you would never have believed them. Intel keeping us jammed on four cores, progressively upping the price until an i3 costs £185, and so on. But that is what happened, and it is what will continue to happen any time one company gets to dominate.

Don't think for one moment that Nvidia will start selling cheap GPUs. That isn't how greed works.

The $150 figure was put out by someone on the ground. It all depends on how the cooler is manufactured. If, for example, it is carved from billet, it will be considerably more expensive than some plastic gash. Remember, we live in a world where Apple charges $1,000 for a monitor stand and, what is it, $500 for case wheels?
 
Soldato
Joined
23 Apr 2010
Posts
11,896
Location
West Sussex
@ALXAndy
Just spitballing some ideas here. Let us assume it is going to cost $150 for the cooler alone. Why would they design such a cooler unless they needed it? They could have literally slapped the current FE coolers on, added a few visual tweaks, and job done, keeping all the extra money for themselves. But they didn't. They designed a new cooler from the ground up.

What if they simply cannot use the current FE coolers on these new 3000 series cards?

Comparing the renders and images of the 3080 cooler to the RTX 2080 Ti cooler (link), it seems like there is more fin surface area on the 3080 cooler.

If you scroll down the chart here you will see that the TDP for the A100 chip is 100W higher than the V100 it is replacing. Edit: and that is with a 20% reduction in boost speeds.

If the assumptions are correct, I reckon they've had to do this to keep temperatures in check.
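To put a very rough number on the cooling argument, here is a back-of-the-envelope sketch; the die temperature limit, ambient figure and both TDPs are illustrative assumptions, not leaked specs:

```python
# Back-of-the-envelope only: every figure here is an assumed round number.
# A cooler's required case-to-ambient thermal resistance scales inversely
# with the power it must move, so a ~100W TDP jump demands a beefier heatsink.

def max_thermal_resistance(tdp_watts, t_die_max=85.0, t_ambient=35.0):
    """Highest acceptable thermal resistance (degC/W) for a given TDP."""
    return (t_die_max - t_ambient) / tdp_watts

for label, tdp in [("Turing-class, ~250W", 250), ("rumoured Ampere, ~350W", 350)]:
    print(f"{label}: cooler must achieve <= {max_thermal_resistance(tdp):.3f} degC/W")
```

On those assumed numbers the target drops from 0.200 to about 0.143 degC/W, roughly 40% more cooling capability, which is exactly the sort of gap extra fin area (and a pricier cooler) would be aimed at.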

Nvidia and need. Have they ever gone together? Did Apple need to over-engineer the Mac Pro so that it costs $5k for a load of garbage spec? Did they have to make a $1,000 monitor stand? No, of course not.

Could've, would've, should've. Jen has already proven, thanks to us, that people are more than willing to pay £1,300 for a GPU, so why not?

Did Nvidia need to make Founders Editions in the first place? No. Greed. Now they are just showing off, just like the people who will buy them. Remember, when someone buys an FE, 100% of the profits go to Nvidia, so that "gap" they leave for board partners like MSI and so on doesn't need to be there. So on something like a 2080 Ti, a $70 cooler made little difference given the profits they were raking in (record ones, remember? just like they boasted). Now see, I was not expecting that, but it just shows that whilst I am rather cynical, I am also rather naive. I thought the lack of people paying those daft prices for the 20 series would hurt them, but it seems I had it all wrong.

"What goes up but doesn't come down?" prices.
 
Associate
Joined
15 May 2020
Posts
387
I think the partner cards will take twice as long to appear as in the past.

Anyway, you never know: AMD might do a Ryzen and bring out a product that sways the masses away, like they did against Intel this time round.

Fingers crossed for that from AMD.

I think that there are quite a few like me hanging out for an HDMI 2.1 card to use for TV gaming. Without a competitor they're gonna absolutely scalp us.
 
Caporegime
Joined
8 Sep 2005
Posts
27,421
Location
Utopia
If you read his post he said

AMD will need a strong lineup across the board to gain any meaningful market share, not just at the high end, which the average Joe doesn't really care about.
I read what he said, and it was basically "if AMD isn't faster they will not gain any market share". It doesn't matter whether he said across all product lines or not, as the principle is the same, and AMD will in any case be judged on the performance of the initial RDNA2 releases, which will be at the high end. They just have to be as competitive as possible, not lag too far behind, and offer better value, and they will sell a bucketload.
 
Associate
Joined
21 Apr 2007
Posts
2,487
Did Nvidia need to make Founders Editions in the first place? No. Greed. Now they are just showing off, just like the people who will buy them. Remember, when someone buys an FE, 100% of the profits go to Nvidia, so that "gap" they leave for board partners like MSI and so on doesn't need to be there. So on something like a 2080 Ti, a $70 cooler made little difference given the profits they were raking in (record ones, remember? just like they boasted). Now see, I was not expecting that, but it just shows that whilst I am rather cynical, I am also rather naive. I thought the lack of people paying those daft prices for the 20 series would hurt them, but it seems I had it all wrong.

"What goes up but doesn't come down?" prices.

No, you weren't wrong IMHO. Turing volume sales were down until the 2070S shifted things back up, but the generation obviously still did OK on GP because of the margins (still below expectations, as I recall), just like Apple with falling iPhone sales but higher prices. And now is when Apple decided to release the iPhone SE 2 (their best-selling line because of its lower cost), which they could have done ages ago; it's not unlike the 2070S in many ways. Nvidia, as you were saying earlier, is looking at Apple, and they are doing everything they can to de-commoditise GPUs: to turn the GPU from a black box into some religious artefact of value, purely to drive up prices; to make it "the best" so that people want it and pay the premium. It works whilst enough people can afford it and your customers continue to upgrade.

At the end of the day, who except a very small handful of owners actually cares what the damn thing looks like, or what its theoretical potential is? It's the gaming experience that matters most, and personally I think more PC gamers wake up to that reality each generation, because it has a cost. You will always have a percentage that don't give a damn, for a variety of reasons I won't spell out here, because there is always one salty sea dog amongst them ;)
 
Associate
Joined
16 Jan 2010
Posts
1,415
Location
Earth
I do hope the 850W PSU I bought in March is gonna be up to the task of powering an RTX 3080!

(I had bought a new case, so I thought I'd grab a new PSU at the same time. My old one was from 2012 but still going.)

I was told at the time that 850W was more than enough.

There's no such thing as more than enough with PSUs. I learnt that when my EVGA G2 1300W struggled and caused crashes running 295X2 quadfire, and I had to get a G2 1600W.

Minimum 1600W, it's the only way to be sure ;)
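For what it's worth, a crude power-budget sketch (every figure below is an assumption; the 350W GPU number is just the leak discussed earlier, nothing confirmed) suggests a single card should sit comfortably inside 850W:

```python
# Rough single-card power budget; all figures are assumed, not measured.
GPU_W   = 350   # rumoured 3080 Ti / Titan board power (unconfirmed leak)
CPU_W   = 150   # assumed high-end gaming CPU under load
OTHER_W = 100   # rough guess: motherboard, RAM, drives, fans, losses
SPIKE   = 1.3   # crude allowance for short transient power spikes

sustained = GPU_W + CPU_W + OTHER_W          # ~600W steady draw
peak = sustained * SPIKE                     # ~780W during spikes
print(f"sustained ~{sustained}W, spikes ~{peak:.0f}W, versus an 850W unit")
```

On those assumptions one card fits with headroom to spare; it's the 295X2-quadfire style multi-GPU builds mentioned above where even a 1300W unit ran out of road.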
 