
NVIDIA ‘Ampere’ 8nm Graphics Cards

Speaking of value, I've added price/perf (where the number is the dollar amount per 1% of performance, using the 2080 Ti as the baseline).
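For what it's worth, that metric can be sketched in a few lines of Python; the function name, the example price, and the 120% performance figure below are hypothetical illustrations, not values from the table:

```python
# Sketch of the price/perf metric described above: dollars paid per
# 1% of performance, with the 2080 Ti as the 100% baseline.
# (Names and example numbers are hypothetical.)

def dollars_per_percent(price_usd: float, perf_vs_2080ti_pct: float) -> float:
    """Cost in dollars for each 1% of 2080 Ti-relative performance."""
    return price_usd / perf_vs_2080ti_pct

# A hypothetical $999 card at 120% of 2080 Ti performance costs about
# $8.33 per performance point, versus $9.99 for the 2080 Ti itself
# at an assumed $999 price.
print(round(dollars_per_percent(999, 120), 2))
print(round(dollars_per_percent(999, 100), 2))
```

A lower number means better value, since you pay fewer dollars for each point of performance.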


FWIW, Nvidia are supposedly getting rid of the Ti suffix and just going with the Super branding because of customer confusion, which is why we're getting the 3090 rather than the 3080 Ti.
 
I would love to see Nvidia charge 20% less for 40% more performance, but I doubt they will. AMD are also not going to charge $800 for their fastest card; it'll be more than that.
Yeah, I'm not wed to the prices (nor the performance), it's just some provisional values I'm going with for now.

> FWIW, Nvidia are supposedly getting rid of the Ti suffix and just going with the Super branding because of customer confusion, which is why we're getting the 3090 rather than the 3080 Ti.

Yeah, I've heard, but for now I'm going with the old naming until we get more confirmation. An Nvidia GPU at $999 by any other name... would burn my pocket just as much. :)
 
There will always be enough mega-rich PC people to buy the top-tier Nvidia card at any price.

It's like football... pay the players millions, the ticket prices go up, and it prices the poorer fans out of the game.

However, I don't want to see this lifting the bar across the board, as it shrinks the player base and harms the PC gaming industry.

Nonsense. The average PC gamer does not buy top-end cards; the average PC gamer is still on 1080p, with an increasing number of people upgrading to 1440p. I know 4K is becoming less niche, but in the grand scheme of things it's still niche, and if the 3070 is anything like the 2080 Ti then it will be a capable 4K card too. Nvidia's continued improvement of DLSS, and its adoption in more games, will only help with this, and it will become increasingly pointless to buy the top-end cards unless you want 4K 120fps with DLSS.
 
> Nonsense. The average PC gamer does not buy top-end cards; the average PC gamer is still on 1080p, with an increasing number of people upgrading to 1440p. I know 4K is becoming less niche, but in the grand scheme of things it's still niche, and if the 3070 is anything like the 2080 Ti then it will be a capable 4K card too. Nvidia's continued improvement of DLSS, and its adoption in more games, will only help with this, and it will become increasingly pointless to buy the top-end cards unless you want 4K 120fps with DLSS.

The low end is basically nailed though, with the 570 et al. Most of us are just waiting for a card that can cope with 4K without having to turn down settings. Yes, 4K is niche, but if you don't play competitive FPS shooters there are plenty of displays to choose from that just need a powerful card. The 2080 Ti almost cuts it but is still overpriced, and the newer-gen stuff should be able to beat it for less further down the stack, i.e. around the 3070.
 
I don't care so much about the performance; it's the name that sells it for me. A Titan Super Ti 3070 with 2070S +10% performance and I'm there, with 2x RTX RT at 1080p. £799 and not a penny more.

That sounds awful; 2070S +10% isn't anywhere near fast enough for a 3070.
That would put it at about 2080S performance, well down on what most would expect, and certainly not for £799.
 
> Nonsense. The average PC gamer does not buy top-end cards; the average PC gamer is still on 1080p, with an increasing number of people upgrading to 1440p. I know 4K is becoming less niche, but in the grand scheme of things it's still niche, and if the 3070 is anything like the 2080 Ti then it will be a capable 4K card too. Nvidia's continued improvement of DLSS, and its adoption in more games, will only help with this, and it will become increasingly pointless to buy the top-end cards unless you want 4K 120fps with DLSS.

Nonsense... I see. When low-end GPUs are expensive and get destroyed by consoles, that doesn't harm PC gaming?

If the 3070 is a big chunk more expensive than the 2070 was at launch, then we get further and further away from a level playing field.

If things keep moving in the same direction the 20 series took us, then we'll have £1k GPUs at the low end of the gaming tier.
 
You are forgetting that all the previous cards are still out there to buy; these new cards are the top-tier, bleeding-edge ones. It won't price people out of the game, as you can still play, just not at 4K ultra 100+ fps.

It will, as they'll just stick to consoles; the jump in cost to even a good 1080p card won't be worth it.

Cyberpunk is not going to run well on lower end cards.
 
People who think Nvidia can keep jacking prices (like they did with Turing) indefinitely need to think ahead a bit. A 43% increase in price every two years would put the 2032 8080 Ti at what, $12,000?

How many of those do we think gamers will buy?
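The compounding behind that figure can be sanity-checked with a quick sketch; the $1,199 2080 Ti Founders Edition launch price and the six two-year steps to 2032 are my own assumptions for illustration, not the poster's exact inputs:

```python
# Sanity check of the compounding claim above: +43% per two-year
# generation, starting from an assumed $1,199 2080 Ti Founders
# Edition launch price (the baseline is an assumption, not from the post).

def extrapolate(base_price: float, generations: int, increase: float = 0.43) -> float:
    """Price after `generations` compounding steps of `increase`."""
    return base_price * (1 + increase) ** generations

# Six two-year generations from 2020 to 2032 lands around $10,000,
# the same ballpark as the post's rough ~$12,000 figure.
for gen in range(1, 7):
    print(f"{2020 + 2 * gen}: ${extrapolate(1199, gen):,.0f}")
```

Whether the exact endpoint is $10k or $12k depends on the baseline you pick; either way, the point stands that 43% compounding quickly produces absurd flagship prices.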
 


Hmm, interesting rumor...
So Cyberpunk 2077 will heavily use TAA, and most games Nvidia partners up with may specifically be using TAA for their DLSS 3.0 implementation.
And it's rumored that DLSS 3.0 will be on by default in the drivers. Sneaky... I wonder if they will backport this for Turing users?
NvCache will compete with HBCC, but AMD is also working on something new as well. No announcement yet.

Hmm, using tensor cores for compression/decompression. Now that's neat. And I'm sure it will come in handy for RT applications.
And the fabled driver GUI overhaul is coming as well. Is that for Ampere or all Nvidia GPUs?
 
> People who think Nvidia can keep jacking prices (like they did with Turing) indefinitely need to think ahead a bit. A 43% increase in price every two years would put the 2032 8080 Ti at what, $12,000?
>
> How many of those do we think gamers will buy?
The main thing that seems to happen with very high-priced things is that people just hang onto them for far longer and don't upgrade or replace them as often...
Like what's happening now with mobile phones, with high-end models costing £1k+.

The more something costs, the longer you try to make it last.
 
> People who think Nvidia can keep jacking prices (like they did with Turing) indefinitely need to think ahead a bit. A 43% increase in price every two years would put the 2032 8080 Ti at what, $12,000?
>
> How many of those do we think gamers will buy?

They can’t do it indefinitely. It will have to stop at some point. Doesn’t mean it will stop right now.
 
> The main thing that seems to happen with very high-priced things is that people just hang onto them for far longer and don't upgrade or replace them as often...
> Like what's happening now with mobile phones, with high-end models costing £1k+.
>
> The more something costs, the longer you try to make it last.

True, but there is another reason why companies have been doing it: lack of innovation. Phones were innovating at a rapid pace in the first few years after the iPhone launched, and because consumers were upgrading every single year, it made sense to charge, say, $500 for a phone. But because there is no longer any innovation, it makes more sense to charge $1,000, because the consumer isn't going to upgrade any faster at a lower price anyway, so you can make more money up front.

This is a common business tactic. Early in a business's life cycle, your products are constantly innovating and you keep prices lower to entice more demand, more upgrades, and market-share gains. Later in the life cycle, the innovation dries up, upgrades dry up, and market share is saturated; the only way you now make more margin is to charge higher prices every year.

> They can't do it indefinitely. It will have to stop at some point. Doesn't mean it will stop right now.

They can only do it until sales numbers no longer reflect consumer acceptance. E.g. according to Mindfactory, most GPUs purchased are now in the $450 to $550 bracket, whereas a few years ago it was the $200 to $350 bracket. Nvidia may not know if it's hit the ceiling yet; perhaps they can get those $550 GPU buyers to pay $650? They won't know until they try, and if it fails, pricing reverts to the last level where the consumer price was in equilibrium.

So I am fully expecting Nvidia and AMD to both raise GPU prices this year, because they don't yet know if they've hit the ceiling of consumer willingness to pay.

I'm also fully expecting AMD to raise its CPU prices for Ryzen 4000 for the exact same reasons, though the CPU price rises will be more muted, because most sales are still closer to the lower end of the market, probably due to mid- and high-end parts not offering much performance difference. Which is interesting in itself: it means consumers are still shopping for value, yet they seem to believe $550 for a 2070 Super is good value.
 
> They can only do it until sales numbers no longer reflect consumer acceptance. E.g. according to Mindfactory, most GPUs purchased are now in the $450 to $550 bracket, whereas a few years ago it was the $200 to $350 bracket. Nvidia may not know if it's hit the ceiling yet; perhaps they can get those $550 GPU buyers to pay $650? They won't know until they try, and if it fails, pricing reverts to the last level where the consumer price was in equilibrium.
>
> So I am fully expecting Nvidia and AMD to both raise GPU prices this year, because they don't yet know if they've hit the ceiling of consumer willingness to pay.
>
> I'm also fully expecting AMD to raise its CPU prices for Ryzen 4000 for the exact same reasons.
I fully expect prices to rise as you say, and I fully expect people to keep paying.

Look at all the people here with £1500+ LG OLED TVs. They didn't bat an eyelid spending that kind of cash on a TV.

Nvidia and AMD probably (correctly) think CPUs and GPUs could command higher prices than they currently do.

People have completely accepted the new mid-range GPU price of £400+, and like you say, who knows what the limit is... I don't think we've found it yet!
 
About a week ago there was a rumour that Ampere cards (the high-end ones) will use a 12-pin power cable; then a couple of days later it was somehow labelled as fake.

But it's just been confirmed by Steve at Gamers Nexus that it's 100% real; he's confirmed it with AIBs and with companies that make power cables.

Also confirmed is that not all high-end cards will use it: only the OEM Nvidia cards will use a single-cable 12-pin for power. AIB cards can use it too, but Steve says most said they are continuing to use dual 8-pins instead.
 