
NVIDIA ‘Ampere’ 8nm Graphics Cards

It may be that Nvidia simply couldn't innovate effectively with the 20xx series and was just incapable of making a profit at reasonable prices.

In the absence of competition, we don't really know how low they could set prices before taking a loss on the 20xx series.

They had better get their $h!t together for the 30xx series or they may find themselves in the same situation as Intel (the other company that overcharged customers for years while underestimating AMD).
 
Two reasons: because people keep paying the increased prices, and because AMD want to be "premium" too. So no price war, no price competition. Where Nvidia leads price-wise, AMD follow.

e: Also I'm going to laugh my ass off when the 3070 is £550+ :p


I think it's more a case of AMD not wanting to be perceived as cheap sh*te than of trying to be seen as a premium brand. I mean, if performance was similar but AMD were much cheaper, people would question build quality etc. as the reason why.

I think AMD also realised a while ago that it doesn't really matter how cheaply they priced their GPUs; everyone would still buy Nvidia, even when Radeon cards were better.

So if AMD are going to sell the same volume regardless of where they price them, why not make as much money as you can out of the sales you do get?
 
Well... it's subjective and depends on how much someone is willing to sacrifice. For instance, I don't think an RTX 2080 can hold 60fps@4K all the time in RDR, even on the lowest settings, if you're not willing to lower the render resolution. It can't hold 60fps (all the time, not just on average!) even at 75% of 4K unless it's on the lowest settings, and even then it may struggle.



No, they do not offer that much performance if you're not willing to lower settings. Pump those settings up and that $1200 card can't even do 60fps@1080p at native resolution in some games ;). Even in rasterization alone you may be at the limit with a 2080 Ti.
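
To put rough numbers on the render-resolution trade-off, here's a minimal sketch (Python; the 75% scale figure comes from the post above, the rest is plain arithmetic):

    # Pixels rendered at a given render scale of a base resolution.
    # The scale applies per axis, so the pixel count falls with its square.
    def pixels(width, height, scale):
        return int(width * scale) * int(height * scale)

    native_4k = pixels(3840, 2160, 1.00)  # 8,294,400 pixels
    scaled_75 = pixels(3840, 2160, 0.75)  # 4,665,600 pixels
    full_hd   = pixels(1920, 1080, 1.00)  # 2,073,600 pixels

    print(scaled_75 / native_4k)  # 0.5625 - "75% of 4K" is ~56% of the pixel work

So even at a 75% render scale the card is still pushing roughly double the pixels of 1080p, which is why it can struggle to hold 60fps.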

When Crysis came out, nothing could run it properly even if you had the best Nvidia cards in SLI. Does that mean both Nvidia and AMD were failures at the time?

I've heard developers say things like: they could make game graphics right now that would make a top graphics card run at 1fps. Does that mean Nvidia and AMD aren't doing well enough?
 
I think it's more a case of AMD not wanting to be perceived as cheap sh*te than of trying to be seen as a premium brand. I mean, if performance was similar but AMD were much cheaper, people would question build quality etc. as the reason why.

I think AMD also realised a while ago that it doesn't really matter how cheaply they priced their GPUs; everyone would still buy Nvidia, even when Radeon cards were better.

So if AMD are going to sell the same volume regardless of where they price them, why not make as much money as you can out of the sales you do get?

That was a different time with different market conditions. You have to go back a good six years, to the 290X era, for that to hold any substance.
 
I think it's more a case of AMD not wanting to be perceived as cheap sh*te than of trying to be seen as a premium brand. I mean, if performance was similar but AMD were much cheaper, people would question build quality etc. as the reason why.

I think AMD also realised a while ago that it doesn't really matter how cheaply they priced their GPUs; everyone would still buy Nvidia, even when Radeon cards were better.

So if AMD are going to sell the same volume regardless of where they price them, why not make as much money as you can out of the sales you do get?

They don't sell as much due to their perceived image, their reputation. If they actually put effort into releasing good products from the start, stop overhyping them to the Moon and back, and give it time, they will sell more. An example of a good product: the R9 290(X), which was let down at launch by its reference blower. Fury and Vega were overhyped.

When Crysis came out, nothing could run it properly even if you had the best Nvidia cards in SLI. Does that mean both Nvidia and AMD were failures at the time?

I've heard developers say things like: they could make game graphics right now that would make a top graphics card run at 1fps. Does that mean Nvidia and AMD aren't doing well enough?

I did not say that. All I said is that current cards are by no means "more than enough" in general, and that 4K may be problematic if you don't have the best.
 
They don't sell as much due to their perceived image, their reputation. If they actually put effort into releasing good products from the start, stop overhyping them to the Moon and back, and give it time, they will sell more. An example of a good product: the R9 290(X), which was let down at launch by its reference blower. Fury and Vega were overhyped.



I did not say that. All I said is that current cards are by no means "more than enough" in general, and that 4K may be problematic if you don't have the best.

Don't forget late. There's no point releasing a competitive product years after the competition.
 
Fury and Vega were overhyped.

Navi too. It was going to beat the 2080 Ti at £200 if you listened to the build-up hype. And the RX 480 before that, prior to launch, was going to be on a par with Nvidia's x80 for half the cost. They've all turned out to be decent cards at fair prices, but nothing like the nuclear bomb that Zen set off in the CPU market.
 
Well, damn. At those prices I'm going to have to start buying previous-gen cards second-hand. Either that or join the console crowd.

I think a lot of people (including me) are thinking along those lines.

I recently fired my PS4 up after a couple of years, though, and in that period I've gone from a 16:9 monitor to a 21:9 one, and boy, do I miss the ultrawide when playing the PS4 on a 16:9 TV. That's something to consider.

I thought it was fantastic when I first had the PS4, now it looks squashed!
 
I think a lot of people (including me) are thinking along those lines.

I recently fired my PS4 up after a couple of years, though, and in that period I've gone from a 16:9 monitor to a 21:9 one, and boy, do I miss the ultrawide when playing the PS4 on a 16:9 TV. That's something to consider.

I thought it was fantastic when I first had the PS4, now it looks squashed!

By the time any new Nvidia cards hit the shelves, consoles will be better :D
 
Both MS and Sony have confirmed 4K @ 120fps. While that could be over DisplayPort (which I'm currently using), it's almost certainly HDMI 2.1.

I would put that in the same category as "1080p/60 on the PS3/X360": possible on paper, but the reality is that to get the fidelity, most games will run much lower. 4K 30fps will be the norm, with the odd 4K 60fps game.
 
Both MS and Sony have confirmed 4K @ 120fps. While that could be over DisplayPort (which I'm currently using), it's almost certainly HDMI 2.1.

Playback of video most likely, not games. I'm sure there'll be some DLSS-type trickery going on too, rather than native 4K.
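
For what it's worth, the bandwidth arithmetic does point at HDMI 2.1 for native 4K @ 120. A rough sketch, assuming uncompressed 10-bit (HDR) colour and ignoring blanking overhead:

    # Raw video data rate = width * height * refresh * bits per pixel.
    def gbit_per_s(width, height, hz, bits_per_pixel=30):  # 10-bit RGB
        return width * height * hz * bits_per_pixel / 1e9

    uhd_120 = gbit_per_s(3840, 2160, 120)  # ~29.9 Gbit/s of pixel data

    # HDMI 2.0 carries ~14.4 Gbit/s of data; HDMI 2.1 carries up to ~42.6,
    # so native 4K @ 120 needs HDMI 2.1 (or DisplayPort 1.4 with DSC).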
 
An RTX 2080 - the estimated performance level of the X1X and PS5 - has little problem with 4K 60fps. Unless you enable RTX, of course. And remember that consoles have less overhead than PCs.

"estimated", and people have been doing that estimation by comparing Tflops, which is usless when you are comparing different architectures let alone different architectures and vendors.
 
An RTX 2080 - the estimated performance level of the X1X and PS5 - has little problem with 4K 60fps. Unless you enable RTX, of course. And remember that consoles have less overhead than PCs.

From personal experience, an RTX 2080 has quite a lot of problems even below 4K if you don't want to sacrifice settings.
 