Will Nvidia start to lose market share?

Before Nvidia lose market share, other manufacturers must actually start producing cards in bulk. AMD finally caught up performance-wise in the 6xxx generation, and rather than mass produce they sent their marketing manager online to say how he got a card on launch day. With that type of commitment it's no surprise people steered clear of the 7xxx generation, as it launched later. Why would you hold off for a card that may or may not be available? I had bought AMD for many generations up to the 6xxx but ended up going for a 3080. Why? Because it was easier to get one. Once bitten...
 
And it makes sense, outside of the 4090. The 4090 is the only Nvidia card that makes sense: if you want the best and you don't care about money, fine, AMD cannot rival it in terms of pure performance. Not that I would spend that much money on a GPU, but I can at least understand why people would.

Anything below that makes no sense whatsoever, and I don't remember a time (at least in recent years) when there has been such a big gap.

AMD is the budget king at the moment, yes. But the only reason anything below the 4090 doesn't make sense on the Nvidia side is because of the 2nd hand 3080/3090 market.

If people want DLSS and if people want to be able to turn on ray tracing but don't want a 2nd hand card then the 40 series makes a lot of sense for them. Just because you might not see the benefit or care about ray tracing and DLSS doesn't mean nobody else will care either. There are thousands of people out there who want to use these options in their games and don't trust the 2nd hand market enough to grab a cheaper 30 series card.

In my view, the top-end AMD cards make no sense because I wouldn't want to spend the best part of a grand on a graphics card and still not be able to use ray tracing. I can't imagine spending that much and then still not being able to run my games with those top graphical effects. After all, isn't the whole point of a graphics card to make a game look nice/realistic and play smoothly? But a lot of people don't care for DLSS and ray tracing, so the AMD cards make sense to them. There is no right and wrong in this, it's down to personal preference.
 
You do realise AMD cards have had RT ability since the 6000 series, with the 7900 XTX performing roughly the same as a 3080/3090? No, it's not as fast as Nvidia, but people act like they can't do anything. Which I guess shows the weakness of AMD marketing and the Nvidia consumer mindshare at work.
 
Like I said - Switch 2.

NV make the SOC in the current Switch. It's sold 126 million units so far, and counting.

Switch 2 will also be using an NV SOC.

It's interesting because Nintendo seem to have a bit of a negative effect around their successor consoles. The 3DS was quite similar to the DS and sold about half as many units, and the Wii U was an absolute flop compared to the Wii. This makes me wonder if Nintendo are going to try to anticipate this market behaviour and do something unexpected.
 
The only way for Nvidia to lose market share is for AMD or Intel to offer something that is better and cheaper than Nvidia at multiple tiers, with the software to match. Until that is the case, Nvidia will remain dominant.
 
Don't Nvidia have better power usage this generation? Feels like the better fit for consoles especially when you consider how impactful DLSS and Frame Gen would be for console performance.

Doubt Nvidia are that interested though as there's probably significantly more profit to be had with data centres and AI.
 
I don't think any console maker is going to switch from AMD to Nvidia mid-generation.

I also think switching to Nvidia next generation is highly unlikely, because no one wants $700 consoles anymore. Nvidia doesn't make x86 CPUs, but AMD does, so building a console around an AMD APU is much cheaper than an Nvidia + Intel solution that needs two processors. Unless console makers want to switch from x86 to Arm, or use Intel and deal with potential backwards compatibility issues, I can't see them moving to Nvidia for a high-end gaming console.
 
Yep, Der8auer did an interesting test: with a special BIOS allowing for more voltage, higher clocks etc., a 7900 XTX in standard rasterisation needs to run at over 600 watts to match what a 4090 does at just over 300 watts.
 
Which is totally irrelevant in a console level GPU. Like for like the power consumption in gaming is not that different.

7900 XT 20GB - 306W
4070 Ti 12GB - 289W

Most of the difference is for the extra VRAM.

 
Nope, they hit gold with the Switch. Even they have been shocked by its success. And, so far, MS and Sony don't want to compete in that handheld battleground. Switch 2 will be more of the same, just "better"; Nintendo have learnt their lessons.

It will be interesting to see what Nvidia come up with for them, as the current design, based off an old Tegra chip, is very long in the tooth. It will also be interesting to see if Nvidia at some point offer an updated Shield based on that tech; they do still have cloud gaming aspirations, after all.
 

What idiot is buying a 3080 with 10GB of VRAM?

The 12GB variant is only just passable at this point, and really 12GB should just be for mid-range now, like Nvidia's 60 and 70 cards and AMD's alternatives.
 
The only way for Nvidia to lose market share is for AMD or Intel to offer something that is better and cheaper than Nvidia at multiple tiers, with the software to match. Until that is the case, Nvidia will remain dominant.
You realise that this is actually the case? And it's not enough.
 