
NVIDIA ‘Ampere’ 8nm Graphics Cards

People waiting for the 3080 Ti should do a reality check.

Sure, if Nvidia brings out a 2080 Ti-sized chip, it's going to take the ultimate performance crown unless the architecture is a flop.
But the fact remains that the 2080 Ti is a huge monolith at around 750 mm².
That requires a very mature manufacturing node to be viable for the consumer market.

And 7nm is still a new node for high-performance parts.
While AMD may have been producing Zen 2 on it for a year and a half, that means very little:
the Zen 2 chiplet is a tiny 74 mm², which doesn't need a mature process to give good yields.


Hence I wouldn't expect much more than ~500 mm² 7nm GPUs in the near future.
In a year or two 7nm should be mature enough for bigger high-end chips.
But TSMC, for example, is pushing hard for the next smaller node, which might offer a performance boost without increasing die size.
(and the long-term unknown is whether GPUs move to a chiplet design)
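The yield argument above can be sketched with a toy Poisson defect model, where the chance a die comes out defect-free falls exponentially with its area. The defect density below is a made-up illustrative figure, not a published number for any real node:

```python
import math

def poisson_yield(area_mm2: float, d0_per_mm2: float) -> float:
    """Fraction of defect-free dies under a simple Poisson model: exp(-D0 * A)."""
    return math.exp(-d0_per_mm2 * area_mm2)

# Hypothetical defect density of 0.2 defects/cm^2 (0.002 per mm^2) on a young node.
d0 = 0.002
for name, area in [("Zen 2 chiplet", 74), ("mid-size GPU", 500), ("2080 Ti-class", 750)]:
    print(f"{name:14s} {area:4d} mm^2 -> {poisson_yield(area, d0):.0%} defect-free")
```

With the same defect density, the 74 mm² chiplet comes out around 86% defect-free while a 750 mm² die drops to roughly 22%, which is why a huge monolith wants a mature node.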
 
It was a driver mess as well; I regretted everything about it.

I'll never have AMD again, including the CPU.

GPUs I get, I've had annoying issues with mine too, but why the CPU? I've had a Ryzen platform for three years now and it's never put a foot wrong; it's been immaculate, perfect.
 
Doesn't matter; even if your rig had been 300% better/faster/quieter for the same or less money, some people will just refuse to touch anything with red on it ;).
Yeah, I don't get it, but to each their own.


Fortunately Gibbo managed to unhitch some of the carriages from the Hype Train. Saved a number of lives :p
Indeed :)


Still? Good Grief... :p
Haha. I will never forget it. It was that good :p:D
 
People waiting for the 3080 Ti should do a reality check.

Sure, if Nvidia brings out a 2080 Ti-sized chip, it's going to take the ultimate performance crown unless the architecture is a flop.
But the fact remains that the 2080 Ti is a huge monolith at around 750 mm².
That requires a very mature manufacturing node to be viable for the consumer market.

And 7nm is still a new node for high-performance parts.
While AMD may have been producing Zen 2 on it for a year and a half, that means very little:
the Zen 2 chiplet is a tiny 74 mm², which doesn't need a mature process to give good yields.


Hence I wouldn't expect much more than ~500 mm² 7nm GPUs in the near future.
In a year or two 7nm should be mature enough for bigger high-end chips.
But TSMC, for example, is pushing hard for the next smaller node, which might offer a performance boost without increasing die size.
(and the long-term unknown is whether GPUs move to a chiplet design)

I'm not bothered how big the die is as long as the performance is there and the heat is manageable.
 
Die size affects how many transistors fit, but then so does transistor density. A die shrink should allow more work to get done in less die space.
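That density point can be put in rough numbers by treating transistor count as density times area. The logic densities below are ballpark figures often quoted for TSMC 16nm (~29 MTr/mm²) and 7nm (~91 MTr/mm²), used purely as illustrative assumptions:

```python
def transistors_mtr(density_mtr_per_mm2: float, area_mm2: float) -> float:
    """Transistor budget in millions: density (MTr/mm^2) times die area (mm^2)."""
    return density_mtr_per_mm2 * area_mm2

big_old = transistors_mtr(29, 750)    # large die on the older node
small_new = transistors_mtr(91, 500)  # smaller die on the denser node
print(f"750 mm^2 old node: {big_old:.0f} MTr, 500 mm^2 new node: {small_new:.0f} MTr")
```

Even a die a third smaller on the denser node carries roughly twice the transistor budget, which is the "more work in less space" point.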
 
As I said, as long as the performance is there I don't care what's in it.

Obviously within reason for cost and power consumption.
And how are you supposed to determine that if you have nothing to compare it to? For example, the difference between the 2070 Super and the 2070 is pretty enormous in some games. However, if you only had the 2070 you would have thought the performance was there.

https://www.guru3d.com/articles-pages/f1-2020-pc-graphics-performance-benchmark-review,6.html
Lol, the 2060 Super is beating a 2070 in this example. If there were no Super variant you would never know how gimped the xx70 was, something Nvidia did in the past with the 1070 vs the 1070 Ti. And I'm sure they will do it again with the 3070, because they know they've got people like you believing the "performance is there", not realizing there is more performance to be had at the same price point.

IMHO, if Nvidia felt you would consider AMD, they wouldn't pull this kind of trick. They would just give you a 2070 that performs like a 2070 Super now, versus giving you the performance of a 2060 Super, calling it a 2070 and charging you more.

It's time to learn from history.
 
Doesn't matter; even if your rig had been 300% better/faster/quieter for the same or less money, some people will just refuse to touch anything with red on it ;).

Yeap, vendor loyalty is a silly thing. I understand not buying AMD when there's a big performance gap, a big performance-per-watt gap, or bad driver issues. What I don't understand is why NV fans refuse to buy AMD even in the few points in history when AMD has been superior (9700 Pro, 9800 Pro, 4870, 5870, 7970). These cards were far ahead of their NV counterparts, yet the majority still bought NV. IMO this is what caused AMD to do so badly after the 7970: they simply lost too much market share, and revenue for R&D suffered. Hawaii, Fiji, Polaris, Vega and Navi have all been terrible, for one reason or another.

I fully believe this is partly why PC components are now so expensive, as NV's prices have forced more and more people onto consoles. The irony is that AMD powers the consoles... I expect consoles to continue to eclipse PC gaming going forward, and the overall PC gaming market share to decrease.

Very curious to see what NV prices their 3000 series at; this may have implications for the future of PC gaming as a whole. If the PS5 and new Xbox are half the price of a 3070/3080, consoles will dominate even further. PC gaming will have less innovation and attention, until it's mostly old men bickering on forums and playing old games, dreaming of the good old PC master race.
 
And how are you supposed to determine that if you have nothing to compare it to? For example, the difference between the 2070 Super and the 2070 is pretty enormous in some games. However, if you only had the 2070 you would have thought the performance was there.

https://www.guru3d.com/articles-pages/f1-2020-pc-graphics-performance-benchmark-review,6.html
Lol, the 2060 Super is beating a 2070 in this example. If there were no Super variant you would never know how gimped the xx70 was, something Nvidia did in the past with the 1070 vs the 1070 Ti. And I'm sure they will do it again with the 3070, because they know they've got people like you believing the "performance is there", not realizing there is more performance to be had.

It's time to learn from history.
Why are the 2070 and 3070 being brought into the discussion? @Too Tall was responding to a post about the 2080 Ti / 3080 Ti.
 
Why are the 2070 and 3070 being brought into the discussion? @Too Tall was responding to a post about the 2080 Ti / 3080 Ti.
"As long as the performance is there" (meaning any fps higher than the previous-gen GPU) is why we've seen folk buy, and create threads asking whether to buy, a 1070, then a 1070 Ti, then a 2070, then a 2070 Super.

What I need to ask you is why you would assume the performance is there between the higher-tier cards when you don't have the information to suggest otherwise. Or are you implying that the 1070 vs 1070 Ti and 2070 vs 2070 Super situations didn't happen, because you believe his discussion was only about the 3080 Ti?

In the most recent benchmark we have a 2060 Super on par with or beating a 2070, which shows me how devalued that card is. This proves that the 2070 was never the tier Nvidia charged you for, but a tier below it. The 2070 was the card showcased at the release of the 2000 series, and few expected it to be superseded later by a GPU die that is faster at the same price point, with Nvidia sandbagging the "Super" dies for a release date after the initial launch. Which brings me to the other point...

If this is the tactic Nvidia is using for the 3000 series, and you want to get the best performance for the money you're spending, it's best to wait a while after release. That way you aren't buying the same card twice, as others have done.

So... would Nvidia use this tactic for a 3080 Ti, releasing something higher at the same price and calling it a 3090? I hope not. But history does tell us they did it with the xx70 series...

I believed I needed to expound on it using the xx70 series as the example of what people will buy, so as not to confuse the statement between tiers of performance.
 
To expand, when I say "as long as the performance is there" I mean that if the performance of the 3080 Ti is around 25% or more above that of the 2080 Ti, I will buy one.

If it were hypothetically only around 10% then I wouldn't be interested, as it wouldn't be enough of an upgrade. I'm just talking in terms of pure gains of the 3080 Ti over the 2080 Ti.
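That 25% threshold is just a relative-uplift calculation; the fps figures below are made up purely for illustration:

```python
def uplift_pct(new_fps: float, old_fps: float) -> float:
    """Percentage gain of the new card over the old one."""
    return (new_fps / old_fps - 1) * 100

# Hypothetical numbers: 2080 Ti at 100 fps, 3080 Ti at 126 fps in the same test.
gain = uplift_pct(126, 100)
print(f"{gain:.0f}% uplift -> worth buying: {gain >= 25}")
```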

Die size affects how many transistors fit, but then so does transistor density. A die shrink should allow more work to get done in less die space.

But as a consumer I don't really care how many transistors are in a GPU; all I really care about is how many FPS it's going to get.

If card A gives 165 FPS with 3 billion transistors on a 750 mm² die and card B gives 150 FPS with 6 billion transistors on a 500 mm² die, I'm going to buy card A all day long, price and wattage being similar.
 
People waiting for the 3080 Ti should do a reality check.

Sure, if Nvidia brings out a 2080 Ti-sized chip, it's going to take the ultimate performance crown unless the architecture is a flop.
But the fact remains that the 2080 Ti is a huge monolith at around 750 mm².
That requires a very mature manufacturing node to be viable for the consumer market.

And 7nm is still a new node for high-performance parts.
While AMD may have been producing Zen 2 on it for a year and a half, that means very little:
the Zen 2 chiplet is a tiny 74 mm², which doesn't need a mature process to give good yields.


Hence I wouldn't expect much more than ~500 mm² 7nm GPUs in the near future.
In a year or two 7nm should be mature enough for bigger high-end chips.
But TSMC, for example, is pushing hard for the next smaller node, which might offer a performance boost without increasing die size.
(and the long-term unknown is whether GPUs move to a chiplet design)

Two things: Nvidia's next-gen GPUs seem to be coming on Samsung's 8nm node, which is really an improved 10nm node and a pretty mature process at this stage.
And even if they were using TSMC's 7nm, it's a very mature process now; it has been out for nearly 18 months at this stage.
 
Yeap, vendor loyalty is a silly thing. I understand not buying AMD when there's a big performance gap, a big performance-per-watt gap, or bad driver issues. What I don't understand is why NV fans refuse to buy AMD even in the few points in history when AMD has been superior (9700 Pro, 9800 Pro, 4870, 5870, 7970). These cards were far ahead of their NV counterparts, yet the majority still bought NV. IMO this is what caused AMD to do so badly after the 7970: they simply lost too much market share, and revenue for R&D suffered. Hawaii, Fiji, Polaris, Vega and Navi have all been terrible, for one reason or another.

Same could be said for AMD fans; there are plenty of people who wouldn't buy anything else either. Not sure what you mean by people not buying AMD/ATI. They bought the 9700 Pro, and they bought the 9800 Pro, because they were the best cards at the time. And that goodwill held over for the X800 cards too; even though they were worse than Nvidia's cards, they still sold well. Market share for that period was pretty even.

The 4870 sold well but it wasn't the fastest card.
The 5870 had the market to itself for six months, so it sold well too.
The 7970? You can't really include this card. It launched at a really high price for the time. It was more expensive than the 680 and slower; they had to release the GHz Edition just to match the 680. It was only in November of that year, eleven months after release, that they got the performance drivers out that made the 7970 by far the best card of the two. But by that stage the news about Nvidia releasing a Titan card in February had started.

They have had great cards. The 290 cards were brilliant, but the first two or three months were marred by black screens, bad Elpida memory and terrible reference coolers.
 
Are you honestly expecting the 2080 Ti to keep the performance crown? So the 3080 Ti is going to be slower?

That's not what he is saying. He is saying that if Nvidia brings out a chip on the new node that's as large as the 2080 Ti is now, it will take the performance crown unless Ampere sucks.
 