
Why are GPUs so expensive?

@FoxEye
I think there's a lot of room to shave £ off the platform, i.e. CPU + mobo + RAM, because chances are most games won't push the CPU to its limits, in which case you don't need to match the PS5's raw power so much as its delivered performance, if they continue not offering unlocked fps modes. So getting enough CPU power for 60 fps is stupidly cheap, and on the GPU side (assuming the PS5 is basically a 5700 XT modified with RT + VRS), the £250 5700 would've been a great competitor if not for the console getting VRS + RT as well (and we don't have those desktop GPUs yet, but then the PS5 isn't out either).

(attachment: Raxthkml.png)


Those are brand-new prices (and not always the best deals - e.g. like I said, there's another £50 saving on the 5700 during sales versus what I listed). But if you already have an SSD, case, PSU and maybe a half-decent CPU, then you really only need a GPU upgrade to get to parity. And if you factor in that you don't need PS+ to play online or upload saves, the price difference isn't as great. And again, with a year still to go, there's always more £ to save as prices drop and we find out what the PS5 can actually do.
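To put rough numbers on that parity argument, here's a quick back-of-envelope sketch in Python. Every figure below is a made-up placeholder (parts prices, console price, subscription cost), not a real quote:

```python
# Hypothetical parts list (GBP) for a console-parity build; every price is an assumption.
parts = {"CPU": 150, "Motherboard": 80, "RAM": 60, "GPU": 250,
         "SSD": 80, "PSU": 60, "Case": 50}
pc_total = sum(parts.values())

console = 450            # guessed console launch price
ps_plus_per_year = 50    # assumed yearly online-play subscription
years = 5
console_total = console + ps_plus_per_year * years

print(f"PC build: £{pc_total} vs console over {years} years: £{console_total}")
```

Swap in real prices as they firm up; the point is just that the subscription cost narrows the gap over a console's lifetime.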

I think the value proposition will be tougher vs the XSX, because that's looking to pack a lot more GPU power & the desktop equivalents are still stupidly overpriced. That one's gonna take longer to surpass, but who knows, it's still all rumours atm.
 
I've not followed the XSX as I'm more a Jap console guy vs the ShooterBox :p

Will defo need a new CPU... have a 2500k lol

Actually I will need a new PSU as well, come to think of it. Oh well, there goes another £100 :p You don't scrimp and save on your PSU, boys and girls.
 
We'll see what happens in the run-up to (and after) the next console launch. That will be make or break.

If mid-range GPUs are going for £400 at that time, then that's going to hurt the PC market. If people end up paying a chunk of cash on a PS5 or Xbox SX, and it delivers a kick-ass 1080p/1440p experience, it's going to be brutal for PC. This gen's consoles were pretty weak with their Jaguar cores and so-so GPUs. But we're looking at a big boost next gen. And an uncertain future for both nVidia's and AMD's new cards, as shrinks get harder and harder, and extra raw perf is more difficult to find (and more difficult to fully exploit on Windows).

Basically mid-range needs to return to sane levels before then. And the mid-range needs to keep up with PS5/XSX in terms of perf offered too. No more stagnation in the mid-range, trying to re-sell the same perf at the same price every damn year for half a decade. If mid-range PC GPUs get trounced by the consoles it almost wouldn't be surprising, tbh. PC is a cash cow now.

So in short, the next mid-range GPU can't be £400+ from either team. And it needs to be good. Better-than-console good.

Otherwise, yeah, watch this space.

If what Digital Foundry said about a lower-spec console from Microsoft, around 4TF, is correct, then I wouldn't expect too much from the next-gen consoles, as everything would need to run on that first.
Also, if they want to target 4K@60fps, the jump in image quality won't be that big either, so playing at a lower resolution on lower-end hardware will be possible.
Keep in mind that only towards the end of a console's life do you get games that make the best of it - and which also demand more hardware on the PC side. If that 12.3TF monster console, with an 8c/16t CPU (above 3GHz, and it would be nice if it had 24 or 32GB RAM), were to target 1080p 60fps or even 1080p 30fps only, then yes, it would be something wonderful indeed! ;)

The biggest jump would be in CPU performance. If the 8c/16t 3GHz+ spec turns out to be true, then all the CPUs below that will have a hard time keeping up (although 1-2 cores will be kept away from gaming and dedicated to the OS and other stuff).

As it stands now, I don't see much of a problem in the PC space. It will get bad for PC and consoles alike when 5G has enough penetration and can provide low-lag, high-bandwidth connections to a significant crowd, which will enable cloud gaming to take off properly in those areas. Until then... same ol', same ol'...

Yeah, trying to sell what should be a £200/250 GPU at £400+ is just taking the proverbial ****.

I just get the feeling AMD would rather people buy a console, since they produce the tech for those, than try to be more competitive in the GPU market.

Let's be honest, Radeon is a ***** brand and has a stigma attached to it. I know people who wouldn't go near their GPUs. It's an issue AMD need to sort out. Quick-style.

Some people want AMD to be competitive, especially in the high-end market, just so they can buy nVIDIA products cheaply. That can't go on forever. AMD have been competitive in at least some areas most of the time (if not always), but never had the success people handed to nVIDIA and Intel.

Of course they want people to buy the consoles - it's a double win! Triple or even more if they buy both the console and the PC parts (at higher-than-normal prices)!
 
When it comes to pulling the wool over people's eyes, they only have themselves to blame - I can't believe how much focus people put on the naming convention, as though it matters a jot what the number on the end is. I'll happily buy an RTX 3010 if it offers great performance at a good price, even if that price is higher than a 1660S or whatever.

Don't forget the "GTX" moniker itself at one time was used to denote high end parts costing many hundreds of dollars. Worry about the performance for the price, not the labels. People get all hot under the collar about the "70" range or whatever getting more and more expensive, just imagine it was called the "95 Super-Ultra-Ti-XXXTREME" range or whatever if it makes it easier.
I think you are completely missing the point.

Traditionally it's been like this:
80= flagship
70= high-end
60/60Ti= Mainstream/Mid-range
50= Entry level

But now it is like:
2080ti= flagship at premium price
2080= high-end at flagship price
2060/60ti/70= mid-range at high-end price
1660/60ti (which should have been the 50/50ti)= entry level at mid-range price

The point is that people are getting less performance for their money, because Nvidia has pushed pricing across the board one tier up. The 2060 6GB is the perfect example of everything that is wrong: you've got some people arguing that the 2060 having 6GB is fine because it is a "1080p card", but last I checked a "1080p card" had no business launching at $349+, and when even the 2+ year old 1070 had 8GB, it's pretty apparent Nvidia is trying to milk the consumer - on one hand they can cut corners on the cost of VRAM, and on the other it's the perfect excuse to push people into buying the even more expensive 2070 8GB at £100 more.
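The "one tier up" claim is easy to sanity-check with approximate US launch MSRPs. The figures below are non-Founders prices from memory, so treat them as assumptions and check launch reviews:

```python
# Approximate launch MSRPs (USD, non-Founders editions); figures are assumptions.
pascal = {"1060 6GB": 249, "1070": 379, "1080": 599, "1080 Ti": 699}
turing = {"2060": 349, "2070": 499, "2080": 699, "2080 Ti": 999}

# Compare each Turing card against the same-numbered Pascal card.
for (p_name, p_price), (t_name, t_price) in zip(pascal.items(), turing.items()):
    bump = 100 * (t_price - p_price) / p_price
    print(f"{p_name} ${p_price} -> {t_name} ${t_price} (+{bump:.0f}%)")
```

On those assumed prices, each tier comes out roughly 17-43% dearer than its same-numbered predecessor, which is the "one tier up" effect in a nutshell.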
 
I think you are completely missing the point.

Traditionally it's been like this:
80= flagship
70= high-end
60/60Ti= Mainstream/Mid-range
50= Entry level

But now it is like:
2080ti= flagship at premium price
2080= high-end at flagship price
2060/60ti/70= mid-range at high-end price
1660/60ti (which should have been the 50/50ti)= entry level at mid-range price

The point is that people are getting less performance for their money, because Nvidia has pushed pricing across the board one tier up. The 2060 6GB is the perfect example of everything that is wrong: you've got some people arguing that the 2060 having 6GB is fine because it is a "1080p card", but last I checked a "1080p card" had no business launching at $349+, and when even the 2+ year old 1070 had 8GB, it's pretty apparent Nvidia is trying to milk the consumer - on one hand they can cut corners on the cost of VRAM, and on the other it's the perfect excuse to push people into buying the even more expensive 2070 8GB at £100 more.

It has been said before, but this is the perfect summary - except that they also charged more for RT when it was in its infancy and not worth paying for. I skipped the 20 series and await the 30.
 
I think new games suck compared to the playability of games from 10-15 years ago. It's all about graphics these days and little thought goes into the storyline/playability. Witcher 3 is perhaps the best of the modern games for playability.

Skyrim, Fallout 3, etc....all have masses of playability compared to the new stuff.
 
If I'm missing the point, it's because people still insist on tying it to the naming convention and basing their expectations on their perception of what that should mean, rather than talking in terms of relative performance abstracted from the branding. Prices are going up, but they were doing that since before the RTX generation anyway - the 1080 Ti has always been expensive, it launched at £700, but then it was also significantly faster than the 1080 and indeed competitive with Titan cards from previous generations that had cost even more.

I've not tried to figure out whether people really are getting less for their money or not, but they are getting smaller increments than in yesteryear. It's why, after 20+ years of buying 3D accelerator cards brand new, I got a second-hand one last year - I wasn't satisfied with how much I'd have to spend to get a significant improvement. If people are expecting something different based on the nametag, rather than checking benchmarks and deciding for themselves whether it's worth the money, then that's their lookout.

As for the 2060 6GB vs 1070 8GB (a newer part in what you describe as the mid-range tier having less VRAM than a previous-gen part in what you describe as the high-end tier), that has been going on for ages. The 8800 GTX had more VRAM than the 9800 GTX. The GTX 970 had more VRAM than the 1060 3GB.
 
I still think the 1080Ti > 2080 best illustrates what's wrong with this generation of GPUs.

Same money. Same performance. The naming scheme doesn't matter.
 
I think you are completely missing the point.

Traditionally it's been like this:
80= flagship
70= high-end
60/60Ti= Mainstream/Mid-range
50= Entry level

But now it is like:
2080ti= flagship at premium price
2080= high-end at flagship price
2060/60ti/70= mid-range at high-end price
1660/60ti (which should have been the 50/50ti)= entry level at mid-range price

The point is that people are getting less performance for their money, because Nvidia has pushed pricing across the board one tier up. The 2060 6GB is the perfect example of everything that is wrong: you've got some people arguing that the 2060 having 6GB is fine because it is a "1080p card", but last I checked a "1080p card" had no business launching at $349+, and when even the 2+ year old 1070 had 8GB, it's pretty apparent Nvidia is trying to milk the consumer - on one hand they can cut corners on the cost of VRAM, and on the other it's the perfect excuse to push people into buying the even more expensive 2070 8GB at £100 more.

Except all that changed with the release of Kepler. @HangTime is right. You are stuck on naming conventions - naming conventions that have meant nothing since Kepler was released. The naming scheme now means nothing between different generations of cards.

Your summary is all wrong. The RTX 2xxx cards are all high end to extreme high end. The GTX 1xxx cards are the low to mid range cards. The 2060 is not a replacement for the 1060. The actual replacement for the 1060 is the 1660Ti.
 
Except all that changed with the release of Kepler. @HangTime is right. You are stuck on naming conventions - naming conventions that have meant nothing since Kepler was released. The naming scheme now means nothing between different generations of cards.

Your summary is all wrong. The RTX 2xxx cards are all high end to extreme high end. The GTX 1xxx cards are the low to mid range cards. The 2060 is not a replacement for the 1060. The actual replacement for the 1060 is the 1660Ti.
My summary is based on the tier of the chips being used, not on whatever Nvidia's BS marketing calls them or the names they label them with.
 
Except all that changed with the release of Kepler. @HangTime is right. You are stuck on naming conventions - naming conventions that have meant nothing since Kepler was released. The naming scheme now means nothing between different generations of cards.

Your summary is all wrong. The RTX 2xxx cards are all high end to extreme high end. The GTX 1xxx cards are the low to mid range cards. The 2060 is not a replacement for the 1060. The actual replacement for the 1060 is the 1660Ti.

An RTX 2060 with 1080p30 performance - a high-end card? Nvidia has a very skewed view of the world!
 
An RTX 2060 with 1080p30 performance - a high-end card? Nvidia has a very skewed view of the world!

And you definitely don't have a clue. Do you really want to be embarrassed in another thread? Like in the image quality thread, where you didn't know that the fog in your Counter-Strike was caused by your GPU not being able to run DX9.
 
I still think the 1080Ti > 2080 best illustrates what's wrong with this generation of GPUs.

Same money. Same performance. The naming scheme doesn't matter.

On average the 2080 is 10-15% faster than the 1080 Ti, especially in new games - on a per-game basis there are a couple of titles where the lead goes even above 15%.
 
And you definitely don't have a clue. Do you really want to be embarrassed in another thread? Like in the image quality thread, where you didn't know that the fog in your Counter-Strike was caused by your GPU not being able to run DX9.

Maybe you are embarrassed. At least, when you speak about graphics cards, check their performance first!

(attachment: rtx-poor-performance.png)
 
Given you are quoting a DXR benchmark, you should also look at prior gen DXR performance. And from the benchmarks I've seen, the RTX2060 is actually very competitive with the 1080Ti in DXR mode. I personally wouldn't label it a high end card overall, but it is a high end card when it comes to DXR.
 
Your summary is all wrong. The RTX 2xxx cards are all high end to extreme high end. The GTX 1xxx cards are the low to mid range cards. The 2060 is not a replacement for the 1060. The actual replacement for the 1060 is the 1660Ti.

I don’t agree with that at all.

The 1080 Ti was last gen's ultra high end, and its replacement is the 2080 Ti, the latest ultra high end, with pretty weak performance gains and a ton more $$$ for what, v0.5 ray tracing support?

Or, if you want to claim the 2080 is the successor based on price rather than model number, you got what, a woeful 10%? If that, once you account for overclocking headroom.

Either way it's rubbish for the consumer. The 2000 series is ridiculously overpriced for what it is, and consumers are being ripped off something silly - no doubt aggravated by controlled supply in the channels, completely manufactured by an Nvidia too paranoid to oversupply and have to reduce prices. The GPU market is broken right now imho.
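A quick perf-per-pound sketch makes the point concrete. The numbers below are illustrative assumptions (performance indexed to the 1080 Ti, USD launch MSRPs from memory), not benchmark results:

```python
# Illustrative figures only: perf is indexed to 1080 Ti = 100, prices are assumed launch MSRPs (USD).
old = {"price": 699, "perf": 100}   # 1080 Ti (assumed)
new = {"price": 999, "perf": 130}   # 2080 Ti, assuming roughly 30% faster

perf_gain = 100 * (new["perf"] - old["perf"]) / old["perf"]
price_gain = 100 * (new["price"] - old["price"]) / old["price"]
# Ratio of perf-per-dollar, new vs old, expressed as a percentage change.
value_change = 100 * ((new["perf"] / new["price"]) / (old["perf"] / old["price"]) - 1)

print(f"perf +{perf_gain:.0f}%, price +{price_gain:.0f}%, perf-per-$ {value_change:+.0f}%")
```

On those assumed numbers you pay ~43% more for ~30% more performance, so perf-per-dollar actually drops about 9% generation on generation - the "ripped off" feeling in one line.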
 
Maybe you are embarrassed. At least, when you speak about graphics cards, check their performance first!

(attachment: rtx-poor-performance.png)
Lol. You always ignore the stuff you get wrong, which is 90% of what you say these days, then move on to attack. Very embarrassing indeed.

I don't think I have seen anyone embarrass themselves in the 10+ years I have been a member of this forum like you have done over the past couple of months. You have set the bar high, sir. At least you are winning at something :p
 