NVIDIA ‘Ampere’ 8nm Graphics Cards

They took a new node and they still only match something from two years before on older tech.

As for the 50% claim, we only have AMD's word for that. It's not like they've ever shovelled coal into the hype train before, or like vendor slides always tell the truth. They will need a 50% performance-per-watt boost for sure to stay competitive.
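To put the 50% performance-per-watt claim in concrete terms, here's a rough bit of arithmetic (plain Python, with purely illustrative baseline numbers rather than anything AMD has published) showing the two ways such a claim can cash out: roughly 1.5x the performance at the same power, or the same performance at roughly two thirds of the power.

```python
# Rough arithmetic behind a "+50% performance per watt" claim.
# Baseline numbers are purely illustrative, not published figures.

baseline_fps = 100.0      # hypothetical performance of a current card
baseline_power_w = 225.0  # hypothetical board power in watts

ppw_now = baseline_fps / baseline_power_w   # ~0.44 fps per watt
ppw_claimed = ppw_now * 1.5                 # the claimed +50%

# Reading 1: keep the same power budget, get more performance.
fps_at_same_power = ppw_claimed * baseline_power_w   # ~150 fps

# Reading 2: keep the same performance, draw less power.
power_at_same_fps = baseline_fps / ppw_claimed       # ~150 W

print(f"Same {baseline_power_w:.0f} W budget -> ~{fps_at_same_power:.0f} fps")
print(f"Same {baseline_fps:.0f} fps target  -> ~{power_at_same_fps:.0f} W")
```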

It was what was available at the time. It's irrelevant how each team achieved the performance, just that they did achieve it. As stated, Navi is more efficient clock for clock at similar power to a 2070 Super, which is current tech.
 
How does it look like a gimmick? What about the cooler do you feel doesn't look practical or like it is designed to dissipate large amounts of heat? Why would they need to "distract" us with a heatsink?

Sounds like you're just making things up, because Nvidia are not going to spend a lot of money on R&D for a beastly cooler like that if it has no practical purpose.

What about that design makes you think it works any better than stuff that's on the market now?

The cooler doesn't look "beastly" to me. There are simpler/cheaper solutions on the market now that would probably dissipate just as much heat.

It looks like complexity for the sake of complexity. Or even worse, an excuse to charge a lot more money for a little more performance... again.
 
What about that design makes you think it works any better than stuff that's on the market now?

The cooler doesn't look "beastly" to me. There are simpler/cheaper solutions on the market now that would probably dissipate just as much heat.

It looks like complexity for the sake of complexity. Or even worse, an excuse to charge a lot more money for a little more performance... again.


TechPowerUp are of the opinion it's similar to what Sapphire do with an overhanging heatsink, just Nvidia have slapped another fan on the backside. So it looks different, but the idea has been around for quite a while.
 
TechPowerUp are of the opinion it's similar to what Sapphire do with an overhanging heatsink, just Nvidia have slapped another fan on the backside. So it looks different, but the idea has been around for quite a while.

If that's the case, then I hope both fans are moving air in the same direction so one isn't pulling hot air in from the exhaust of the other.

It would also mean the fin layout doesn't need to be as complicated as it is.
 
It was what was available at the time. It's irrelevant how each team achieved the performance, just that they did achieve it. As stated, Navi is more efficient clock for clock at similar power to a 2070 Super, which is current tech.

It's not irrelevant, since you stated the 5700XT is more efficient when it really isn't: it's 7nm vs 12nm, with a smaller die and no dedicated RT hardware. With those facts in mind your statement is incorrect, and 12nm isn't current tech.

Hopefully AMD can close the gap, but it'll be interesting to see; 320W for Ampere sounds unrealistic. Looking at historical releases, Nvidia have been pretty impressive when it comes to efficiency considering the die sizes.
 
It's not irrelevant, since you stated the 5700XT is more efficient when it really isn't. The facts are 7nm vs 12nm, a smaller die size, and no dedicated RT hardware; with that information being fact, your statement is incorrect, and 12nm isn't current tech.

I said it's more efficient clock for clock. I also said they are around the same power draw, so take from that what you will. The numbers are there, so why does it matter that AMD chose 7nm and Nvidia chose 12nm? RIGHT NOW they are roughly equivalent.

You can say all the "but if" things you want; that's just how it is. Soon Nvidia will move to 7nm and AMD claim a 50% efficiency increase, so we will see where the chips lie at that point, but for now they are roughly even in performance per watt.
 
Hopefully AMD can close the gap, but it'll be interesting to see; 320W for Ampere sounds unrealistic. Looking at historical releases, Nvidia have been pretty impressive when it comes to efficiency considering the die sizes.

The A100 increased the TDP by 100W versus the V100 and lowered the boost clock. It is possible that TDP will go up for Nvidia's gaming cards as well.
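Going by the commonly quoted SXM specs (from memory, so worth double-checking): V100 around 300W with a ~1530MHz boost, A100 around 400W with a ~1410MHz boost. A quick check of the deltas:

```python
# Back-of-envelope check of the A100 vs V100 TDP and boost clock deltas.
# Figures are the commonly quoted SXM specs, from memory; treat as approximate.

v100 = {"tdp_w": 300, "boost_mhz": 1530}
a100 = {"tdp_w": 400, "boost_mhz": 1410}

tdp_delta_w = a100["tdp_w"] - v100["tdp_w"]                          # +100 W
clock_delta_pct = (a100["boost_mhz"] / v100["boost_mhz"] - 1) * 100  # about -7.8%

print(f"TDP: +{tdp_delta_w} W, boost clock: {clock_delta_pct:+.1f}%")
```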
 
I said it's more efficient clock for clock. I also said they are around the same power draw, so take from that what you will. The numbers are there, so why does it matter that AMD chose 7nm and Nvidia chose 12nm? RIGHT NOW they are roughly equivalent.

You can say all the "but if" things you want; that's just how it is. Soon Nvidia will move to 7nm and AMD claim a 50% efficiency increase, so we will see where the chips lie at that point, but for now they are roughly even in performance per watt.

Not much more I can say, as you don't appear to be looking at the facts. I'm really confused how they can be equivalent right now, with the 2070 having a larger die with dedicated RT hardware on an older process node whilst consuming 10W less than the 5700XT?

The A100 increased the TDP by 100W versus the V100 and lowered the boost clock. It is possible that TDP will go up for Nvidia's gaming cards as well.

This is very true; that's why I said it'll be interesting to see. I'm curious to see the improvements AMD make from being on the 7nm node longer. Exciting times ahead! It would be really nice if AMD pulled another 9700-type release out of the bag; I really want to upgrade this year.
 
Not much more I can say, as you don't appear to be looking at the facts. I'm really confused how they can be equivalent right now, with the 2070 having a larger die with dedicated RT hardware on an older process node whilst consuming 10W less than the 5700XT?

I know it has a larger die, but the RT cores do nothing the majority of the time anyway. 7nm vs 12nm will always mean a smaller die. The performance is the performance and the power draw is the power draw, though.

Being on 12nm was a choice Nvidia made, and that's that; AMD are not bad for using an advanced node to offer the performance instead of a bigger die.
 
I know it has a larger die, but the RT cores do nothing the majority of the time anyway. 7nm vs 12nm will always mean a smaller die. The performance is the performance and the power draw is the power draw, though.

Being on 12nm was a choice Nvidia made, and that's that; AMD are not bad for using an advanced node to offer the performance instead of a bigger die.

No one said AMD are bad for using a newer process. I'm just looking at your original statement about efficiency, then looking at the facts, and it doesn't quite add up, that's all. What's funny is that your argument holds up even less if you compare the 5700XT to the 2080Ti, which consumes only 25W more with its 12nm 754mm² die compared to the 7nm 251mm² die on the 5700XT. Let that sink in; it makes you wonder how much less power Turing would use if it were stripped of the RT and Tensor cores.
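To put rough numbers on that comparison, here's a quick sketch using the reference board powers quoted above and the commonly cited die sizes (measured draw varies by review and partner model, so treat it as ballpark only):

```python
# Quick ratio check on the board power vs die size point above.
# Board powers are the reference TDPs quoted in the thread; die sizes
# are the commonly cited figures for TU102 and Navi 10. Ballpark only.

cards = {
    "RTX 2080 Ti (12nm TU102)": {"power_w": 250, "die_mm2": 754},
    "RX 5700 XT (7nm Navi 10)": {"power_w": 225, "die_mm2": 251},
}

for name, c in cards.items():
    density = c["power_w"] / c["die_mm2"]
    print(f"{name}: {c['power_w']} W over {c['die_mm2']} mm^2 "
          f"-> {density:.2f} W per mm^2")
```

Roughly three times the silicon for about 10% more board power, which is the point; of course the bigger die also holds far more functional units, so watts per mm² is a crude way to look at it.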
 
No one said AMD are bad for using a newer process. I'm just looking at your original statement about efficiency, then looking at the facts, and it doesn't quite add up, that's all. What's funny is that your argument holds up even less if you compare the 5700XT to the 2080Ti, which consumes only 25W more with its 12nm 754mm² die compared to the 7nm 251mm² die on the 5700XT. Let that sink in; it makes you wonder how much less power Turing would use if it were stripped of the RT and Tensor cores.

I said efficiency per clock, not power efficiency. I have stated that several times. Though looking at reviews, the power draw of a 2070 Super is usually about 20W more, not less, than the 5700XT.
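Since the thread keeps sliding between the two metrics, here's a minimal sketch that keeps them separate (the numbers in the example calls are placeholders, not figures from any review):

```python
# Two different "efficiency" metrics that keep getting conflated here.

def perf_per_clock(fps: float, clock_mhz: float) -> float:
    """'Clock for clock' efficiency: frames per second per MHz of core clock."""
    return fps / clock_mhz

def perf_per_watt(fps: float, board_power_w: float) -> float:
    """Power efficiency: frames per second per watt of board power."""
    return fps / board_power_w

# Placeholder numbers; plug in figures from whichever review you trust.
# A card can lead on one metric while trailing on the other.
print(perf_per_clock(fps=100.0, clock_mhz=1800.0))    # ~0.056 fps per MHz
print(perf_per_watt(fps=100.0, board_power_w=220.0))  # ~0.45 fps per W
```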
 
Already did that. Titan V and RTX Titan.

Coming back to prices: how many times have we been told that new NAND is much cheaper to produce, and how many times have SSD prices immediately dropped because of it? None. Just because these new cores may be cheaper for Nvidia to make doesn't mean they will sell them cheaper.

The original Titan was £999. The Black was £699 or £799 (because it was just a nail for the 29x coffin). The Maxwell Titan was £1000 with no DP, the Titan XP was £1300, and the Titan RTX was £2300, was it? Or £2400? Something like that.

Now I am sure that some of those were far cheaper to make, mostly because in the case of the original Titan it was literally just a 6GB 780. Didn't stop them though, did it? Same goes for the Maxwell one and all of the others.

Quite a few facts are wrong in this post, starting with the original Titan price, which was £829.99 give or take a couple of quid depending on where and who you got them from; as usual, the Asus ones were the rip-offs.

Not going to bother pointing out the other price errors, but they are easy for anyone to look up.

The real point of the post above is correct, though, in as much as Nvidia cards have become very overpriced. This is not really a problem with the Titan V or RTX Titan; what idiot buys those anyway? The real problem is when the excessive pricing extends to the cards below the Titan. This is not because Nvidia are milking their customers; it's because Turing is a crap product that costs too much to produce for very little extra benefit.
 
Two games that I've bought recently have made me feel that my 1080 Ti is starting to show its age: Control and Star Wars Jedi. RDR2 is also pushing it to the limit. I'll throw some cash at a new GPU, but it has to be a proper performance boost this time, with none of that triangles BS like with RTX. If Nvidia pull another stunt like last time then I'll need to find some decent series on Netflix, or maybe even start talking to the wife of an evening.

I'm getting a little hyped as always. :)
 
Quite a few facts are wrong in this post, starting with the original Titan price, which was £829.99 give or take a couple of quid depending on where and who you got them from; as usual, the Asus ones were the rip-offs.

Not going to bother pointing out the other price errors, but they are easy for anyone to look up.

The real point of the post above is correct, though, in as much as Nvidia cards have become very overpriced. This is not really a problem with the Titan V or RTX Titan; what idiot buys those anyway? The real problem is when the excessive pricing extends to the cards below the Titan. This is not because Nvidia are milking their customers; it's because Turing is a crap product that costs too much to produce for very little extra benefit.

It was a mix-up of £ and $, which itself has changed since then, given the pound has done nothing but drop against the dollar.
 
Two games that I've bought recently have made me feel that my 1080 Ti is starting to show its age: Control and Star Wars Jedi. RDR2 is also pushing it to the limit. I'll throw some cash at a new GPU, but it has to be a proper performance boost this time, with none of that triangles BS like with RTX. If Nvidia pull another stunt like last time then I'll need to find some decent series on Netflix, or maybe even start talking to the wife of an evening.

I'm getting a little hyped as always. :)

At 1440p my experiences with my 1080 in those 3 games:

Control: I played this over Xmas 2019. Was expecting it to be a system destroyer. I maxed it out at 1440p.
RDR2: Yes. System destroyer. But using the Digital Foundry settings it looked good and played well.
Star Wars: I had no issues maxing this out at 1440p, even if my fps was only around 60 (according to online benchmarks).
 
At 1440p my experiences with my 1080 in those 3 games:

Control: I played this over Xmas 2019. Was expecting it to be a system destroyer. I maxed it out at 1440p.
RDR2: Yes. System destroyer. But using the Digital Foundry settings it looked good and played well.
Star Wars: I had no issues maxing this out at 1440p, even if my fps was only around 60 (according to online benchmarks).

I'm looking for 75+ @ 3440*1440
 