
NVIDIA ‘Ampere’ 8nm Graphics Cards

Sorry Twinz but I think your logic is flawed. In this case it's not about how much faster the 3090 is than the previous flagship card, but how much faster it is than the 3080... which is 10% more performance for double the price. The 3090 is almost completely redundant for gaming, as the 3080 is already filling so much of its performance bracket for half the price... that is why it is by far the worst value card ever released.
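
To put rough numbers on that (a quick sketch using the US launch MSRPs and the ~10% figure above; these are illustrative inputs, not benchmark data):

```python
# Rough performance-per-dollar comparison using launch MSRPs and the
# ~10% 3090-over-3080 uplift mentioned above (illustrative numbers only).
cards = {
    "RTX 3080": {"price": 700, "perf": 1.00},   # baseline
    "RTX 3090": {"price": 1500, "perf": 1.10},  # ~10% faster, ~2x the price
}

base = cards["RTX 3080"]["perf"] / cards["RTX 3080"]["price"]
for name, c in cards.items():
    value = (c["perf"] / c["price"]) / base
    print(f"{name}: {value:.2f}x the 3080's performance per dollar")
# RTX 3080: 1.00x, RTX 3090: ~0.51x -- roughly half the value per dollar
```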

So if Nvidia had screwed us, like they did with Turing, and made the 3080 only provide 2080Ti performance for 2080Ti money... that would make the 3090 a better value? Do you not see the problem with this thinking? By your "logic", Nvidia need only offer *worse* value down the stack to make the 3090 a "better value".

The 2080 Ti was 40% faster than a 2080 for double the price and was, depending on the resolution, 25-35% faster than a 1080Ti.

You are just making my point that neither the 2080 nor the 2080Ti was ever a good value. Neither of them was ever worth its price. Ignoring Pascal, and the price/performance improvements that came with pretty much every generation before it, was the only way to even try and pretend that the 2080Ti was anything other than a terrible value.
It's pretty obvious how overpriced the 2080Ti was in danlightbulb's Timespy chart:
Timespy (based on Jayz2cents' published Timespy score of 18125):

[chart: Timespy score vs price]

To be clear, I'm not arguing that the 3090 is a good value. I'm arguing that it uses the same generational-progress recipe that the 2080Ti used. Therefore, it caters to the people who bought into the 2080Ti's value proposition as a proper generational improvement. Some people bought into it so hard that they were lecturing us on how Nvidia was not a charity and more performance was going to cost more money. "Gotta pay to play." and such. Well, Nvidia offered up the 3090 to that camp, and the 3080 to those of us who thought real generational progress meant getting meaningful performance uplifts *at given price points*.

The 3080 caters to the people who thought the 2080Ti wasn't worth the money to begin with.

As I pointed out before, the reasonably-priced 3080, sitting at the $700 price point in the same stack with the 3090, just makes the 3090's terrible value more obvious than the 2080Ti's terrible value was. The (also terrible) 2080 merely acted as "cover" for the 2080Ti's crap value. That cover isn't available this time because Nvidia decided to get back on track with generational progress.

Look at the chart. The 2080Ti's performance level should have been $700. (Maybe $750-$800 for Ray-Tracing beta testing) At the $700 price point, it would have been an "okay" performance bump, and the 3080 would just be following suit with another "okay" bump in performance at that price point.
 
You think the 2080Ti was overpriced? It's nothing compared to the 3090.
[chart: Timespy score vs price, including the 3090]


Its Timespy score is about 10% higher than the 3080's.

At least the 2080 Ti had a good 30% performance boost over its sibling 2080. The 3090 is only 10% ahead.
 

You are comparing within the same generation. Nvidia can make any single card look like a good value by simply making another card in the same stack an even *worse* value.

The 2080's value was crap. Using it to justify the 2080Ti's price/performance is folly.

Now that Ampere has a card with proper generational improvement, the 2080Ti's replacement has nothing to hide behind the way the 2080Ti hid behind the 2080's crap value.

Comparing a new generation to previous generations is a better way to judge progress.

The 3080 fits with Pascal and the generations before it.

The 3090 fits with Turing's formula for generational "progress".

People who didn't buy into Turing's formula now have the 3080. People who thought the 2080Ti was an acceptable generational improvement now have the 3090.

The 3090 offers a better *generational* improvement than Turing did.
It also does so for less of a *generational* price increase.
 
Can't find the link at the moment comparing the Strix 3090 OC at 480W to a 3090 FE, but just take a look at the TechPowerUp reviews:

Zotac 3090

https://www.techpowerup.com/review/zotac-geforce-rtx-3090-trinity/22.html

71.5 fps in RDR2

Asus Strix OC at 480W

https://www.techpowerup.com/review/asus-geforce-rtx-3090-strix-oc/22.html

80.2 fps

That's a 12% performance increase over an ordinary 3090.

A lot of the games, like Jedi and Control, are over 14% faster on the OC Strix 3090 vs the Zotac 3090.

They might have had an unlucky chip in their Strix 3090, as they were only getting to a 1960MHz boost when running at 480W, whereas in the review I saw, the Strix card was running at a 2000MHz boost with 380W.
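
(For anyone checking the maths, the 12% is just the ratio of the two RDR2 numbers from the TechPowerUp links above:)

```python
# Percentage uplift from the two RDR2 results linked above.
zotac_fps = 71.5  # Zotac RTX 3090 Trinity
strix_fps = 80.2  # Asus Strix 3090 OC at 480W
uplift_pct = (strix_fps / zotac_fps - 1) * 100
print(f"{uplift_pct:.1f}% faster")  # ~12.2%
```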

Very interesting, mate. Thanks for that. Although we have to admit that it does strongly depend on the game. From their conclusion:

"We also did a test run with the power limit at the 480 W maximum ASUS provides. 480 W is much higher than anything available on any other RTX 3090, so I wondered how much more performance can we get. At 4K resolution, it's another 2%, which isn't that much, but it depends on the game, too. Only games that hit the power limit very early due to their rendering design can benefit from the added power headroom."

Still, if I was going to splurge on a 3090, it does seem worth saving up for the Strix.
 
My personal belief is that Jensen has made some very, very bad executive decisions for this generation and when the dust has settled we will see him coming under fire from within Nvidia (in addition to the media) for this release.

Depends how you look at it. From a technical PoV it's a bit of a cluster*, but from a marketing and potential-sales PoV it's clearly been an enormous success, and looking at it from the all-important shareholders' PoV, with the cash coffers rapidly filling, I don't think Jensen will be fearing for his job anytime soon.
 
So two questions that just popped into my head.

1. Will the 3070 launch be the same as the 3080's, i.e. poor stock levels and a scramble to buy?

2. Will anyone actually want a 3070? £470 is still not cheap, and the 3080 is only £180 more and offers a very good performance level compared to currently available cards (albeit with the competition as yet unknown).

If the 3070 is 2080 Ti performance, it's actually overpriced compared to the 3080.
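
A quick way to make that comparison concrete (a sketch using the two UK prices above; where 2080 Ti performance actually sits relative to the 3080 is left to your benchmark of choice):

```python
# Break-even point: what fraction of 3080 performance does a £470 3070
# need to match the £650 3080 on performance per pound? (prices as above)
price_3070, price_3080 = 470, 650
breakeven = price_3070 / price_3080
print(f"The 3070 needs over {breakeven:.0%} of 3080 performance "
      f"to match its performance per pound")  # ~72%
```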
 
1. Yes.

2. You might be right. A lot of potential 3070 buyers seem to have jumped a level and sprung for the 3080, if they managed to grab a £650 one. There are some 3070 boards that cost more money than a 3080, which is ridiculous. Plus you keep seeing more and more evidence of a 3070Ti and a 3080 20GB being out in October, so maybe come Q1 the answer might be no. Plus there will be more leaks about Navi performance. I saw yet another today that puts Big Navi at 3080 performance with ONLY 250W of power, and cheaper. That makes AIB 3070s even more unpalatable if true.
 
1. Yes, unless through binning the factory has a heap more weaker silicon (3080/90 rejects), in which case they might have more to begin with.
2. The 3080 is out of stock with a backlog, so there's no choice until we start to see shipments coming in to be listed.


Seen similar Big Navi leaks too. 250W even within 5% of the 3080 would be a winner in my eyes. If it pegs it level or edges it in certain titles, it's a massive scalp IMO.

Don't even mention that glorious fine wine - mmm mmmm. :D
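
(A quick sanity check on the efficiency that rumour implies -- the 250W and "within 5%" figures are the leak's numbers; 320W is the RTX 3080's official board power:)

```python
# Relative performance per watt under the rumoured Big Navi numbers.
navi_perf, navi_watts = 0.95, 250        # "within 5%" of a 3080, at 250W
rtx3080_perf, rtx3080_watts = 1.00, 320  # RTX 3080 reference board power
ratio = (navi_perf / navi_watts) / (rtx3080_perf / rtx3080_watts)
print(f"~{ratio:.2f}x the 3080's performance per watt")  # ~1.22x
```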
 
Navi vs the 30 series isn't going to be a straight, easy comparison IMO. It's likely going to depend on a lot of variables, including RT and DLSS, and then what games people play or are likely to play and using what tech. More debate than ever before, and probably lots of forum member temporary bans :D
 
The 3070 is a different core, so it won't be made from 3080/90 rejects.
 

Why does this keep being mentioned? No games are using them, and even the ones that are supposed to (Cyberpunk, for example) are delayed, or surprisingly bad when they do release (Horizon Zero Dawn). It shouldn't be a deal swinger this generation at least, as again it's just poor and won't be a concrete feature until Hopper.
 
My bad, you're right. That means 1. is a definite yes.

It is a 392.5mm² die with 2 SMs disabled - they shouldn't have any issues producing larger quantities, and they've also taken longer building up stock while pushing people who might have been on the fence towards the 3080 (well, if those were available). Whether demand and/or the coronavirus situation will have an impact on availability is another matter.

I'd assume they are probably not producing whole wafers of GA102 and GA104 but producing a mix to reduce the cost of defects - at something like a 7-8:1 ratio unless there are other products on there as well.
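
To illustrate why the smaller GA104 die helps on the defects front, here's a minimal sketch using the classic Poisson yield model. The defect density is a made-up placeholder (foundries don't publish real numbers); the die areas are the commonly cited figures for GA102 (628.4mm²) and GA104 (392.5mm², as above).

```python
import math

# Poisson yield model: the fraction of dice with zero defects is
# exp(-A * D), where A is die area in cm^2 and D is defect density
# in defects per cm^2. D below is an illustrative guess, not real data.
DEFECT_DENSITY = 0.1  # defects per cm^2 (assumed)

def defect_free_fraction(die_area_mm2, d=DEFECT_DENSITY):
    return math.exp(-(die_area_mm2 / 100.0) * d)

for name, area_mm2 in [("GA102 (3080/3090)", 628.4), ("GA104 (3070)", 392.5)]:
    print(f"{name}: {defect_free_fraction(area_mm2):.0%} of dice defect-free")
# GA102: ~53%, GA104: ~68% -- and partially defective dice can still be
# salvaged as cut-down parts (e.g. the 2 disabled SMs mentioned above).
```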
 
But why ignore it? :) Each to their own, but I think the majority now want RT and accelerated performance using DLSS. It's only going to get used more in future. See what I mean by debate :). Comparing them IMO is going to be an impossible mission, as what matters to people is going to differ a lot, so switching off DLSS to compare with an AMD GPU some might see as ridiculous, others may see as fair. The feature sets are probably not going to be comparable like in the good old days of raster and just raw FPS.

I'm looking forward to what AMD come up with - more so than with the last gen of AMD GPUs. So let's see.
 