
Why are GPUs so expensive?

Some people don't want speed.
Exactly. Too many people just make assumptions like this, as if everyone has the same wants and needs.

Most people seem to think everyone needs at least 144fps or something. Nope; the games I play are plenty good enough at 60fps for me, so I can get away with a cheaper graphics card than someone who games at 1440p and needs 144fps.
 
If the mining boom is irrelevant, then so is pricing during a recession.
Then next-gen consoles should have doubled in price because of "teh recession!!!". Wrong: they have stayed the same price, because even a $100 price hike has spelled doom for a console in the past. Contrary to gullible PC gamers, console gamers do indeed vote with their wallets.
 
The GTX 470, GTX 570, GTX 670, GTX 770, GTX 970 and GTX 1070 are all mid-range cards with prices starting from $349.99....
The GTX 480 was $499.99.
Actually, no.

The 470 and 570 were the second-fastest cards from the top, built on the same big chip as the fastest card of their generation.

Then came the misdirection: Nvidia labelled the mid-size-die GTX 6"70" (really a 60) and GTX 6"80" (really a 60 Ti), and presented the first-gen Titan and the 780 series as two separate generations despite them being the same generation in reality. That allowed Nvidia to sneakily push its product range one price bracket higher. It all went according to (their) plan.

The GTX 670/680/780/780 Ti should have been released as one generation, traditionally named 60/60 Ti/70/80, but Titan happened in between and served as an excuse to delay the 780 by nine months, and the naming schemes have been reshuffled ever since to pull the wool over people's eyes about which tier of card they "think" they're getting.

Just look at how Nvidia actually ran out of model numbers counting down this generation, desperately clinging to the "60" branding to avoid calling its slower cards a 50 or 50 Ti.
 

There are also the GTX 490 (even though only one prototype existed, the Asus ROG Mars II) and the GTX 590, so the GTX 470 and GTX 570 were the third-fastest Nvidia cards.
 
I don't think anyone here is talking about dual-GPU SLI or Crossfire cards on a single PCB.

The point remains that the 470/570 used the top-tier GPU of their time, unlike the sorry state of the current example, the 20"70", which uses the third-tier chip from the top.

Titan has been used as the excuse to push what was traditionally the "80" card from a regular flagship to an ultra-premium gaming product.
 
There's no such thing as a bad card, only a bad price. And right now Nvidia aren't even an option for me because they're so expensive, unfortunately.

Then next-gen consoles should have doubled in price because of "teh recession!!!". Wrong: they have stayed the same price, because even a $100 price hike has spelled doom for a console in the past. Contrary to gullible PC gamers, console gamers do indeed vote with their wallets.

You can sell any old ***** to a PC gamer.
 
I still maintain my claim: the 2070 Super and 5700 XT are decent 4K cards.

If you disagree, then your standards are too high, and that means you should be happier to cough up; you can't expect bleeding edge and pay peanuts for it.


You can’t play RDR2 at 4K at 60fps at all.
 
Yeah, a crazy part of me would like to see Nvidia's market share hit 100% so that AMD gives up and quits the GPU business. Then, when people complain about the total lack of progress and ultra-high prices, I can smile and say to myself, "you've made your bed, now lie in it".
Cos AMD are giving out their GPUs for free?
 
4K gaming doesn't have to be expensive. I'm thinking about getting a 4K TV for gaming. I'm playing older games like Battlefield 4 anyway, so they'll look good at 4K. I'll be playing today's games like RDR2 on my next PC in a few years' time; they'll fly on future hardware and the games will be very cheap to buy.
 
In a single crap game with crap optimisation. Pretty good, actually: double the Xbox One X's performance.

Of course, you can always just climb back on the high horse waiting for 8K/60 ultra at $100.


No high horse. I was genuinely asking for those 4K/60 settings so I can use them on my 2080.

I watched a Digital Foundry video which suggested the X isn't running on low settings, as far as I know. Sadly, I think you hit the nail on the head with optimisation.

Also, I'm not actually saying you're wrong... I think for older games the 2070S is a great 4K card.

Edit: I've been playing around with the settings. I can get a decent-ish smooth frame rate on my 2080 now. Can you give me any suggestions? I need TAA on, as without it the game looks too mushy. It's definitely playable, but not 60fps.

If only my monitor had G-Sync... I feel like it would make it palatable. God damn, why didn't I just buy a 2080 Ti instead of the 2080 at release?
 
4K gaming doesn't have to be expensive. I'm thinking about getting a 4K TV for gaming. I'm playing older games like Battlefield 4 anyway, so they'll look good at 4K. I'll be playing today's games like RDR2 on my next PC in a few years' time; they'll fly on future hardware and the games will be very cheap to buy.
I think this is the smartest decision. I can play games from a couple of years ago at 4K on my 2080, for example GTA V. To consistently play the best games at 4K, you just need to resign yourself to playing older titles. Given our backlogs are normally so huge, it's not a big deal.

Sadly for me, in RDR2's specific case, I don't want to wait to play it; I did that with GTA V and now can't find the time to get round to it.
 
I watched a Digital Foundry video which suggested the X isn't running on low settings, as far as I know. Sadly, I think you hit the nail on the head with optimisation.

The Digital Foundry videos admit RDR2 isn't poorly optimised on PC, though?
 
4K gaming doesn't have to be expensive. I'll be playing today's games like RDR2 on my next PC in a few years' time; they'll fly on future hardware and the games will be very cheap to buy.

Great strategy; I used to do that too. Games a couple of years old are cheap, have all their updates, and run with great performance.
 
Also, I'm not actually saying you're wrong... I think for older games the 2070S is a great 4K card.

Even in relatively older games that card should have problems. I play on 3×1080p screens (so only 75% of the resolution of 4K) on a 2080, and there are problems if you want top settings at 60fps. Adding that extra resolution on top means dropping the settings even lower to compensate. Sure, some settings can be turned off or down without much loss (if any) of quality during gameplay, but still, I for one would not call that a 4K card unless you're willing to drop settings (potentially quite low) or accept a frame rate lower than 60fps.

Right now the only card I would consider OK for 4K is the 2080 Ti, and only if you don't use RT; if you want RT, it's at best a 1080p card. And please don't bring up DLSS or other methods that run the game at a lower resolution.

As a consumer, I'm fascinated by how some people "defend" the price increases. I would get it if you had some interest as a seller, but as a buyer?!

And let's not forget that the world isn't just western Europe and the Americas; other people don't have the financial power to buy expensive hardware. Yes, some will reply with "but that's not a necessity of everyday life, it's a luxury!" and I would (mostly) agree. However, having more people with stronger hardware available would let devs target that hardware with more commitment (so a lower price is good for everyone), meaning we might actually get bigger steps in game development year after year, not just when one generation of consoles changes into another.
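For reference, the "75% of 4K" figure above can be sanity-checked with a quick pixel-count sketch (assuming standard 1920×1080 panels and a 3840×2160 UHD target):

```python
# Three 1080p screens vs. a single 4K (UHD) panel, by raw pixel count.
triple_1080p = 3 * 1920 * 1080   # 6,220,800 pixels across the surround setup
uhd_4k = 3840 * 2160             # 8,294,400 pixels on one 4K panel

ratio = triple_1080p / uhd_4k
print(f"{ratio:.0%}")            # → 75%
```

So a triple-1080p surround setup really does push only three-quarters of the pixels of a single 4K screen.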
 
I’ve managed to play Witcher 3, GTA V, Dishonored 2 and Divinity: Original Sin 2 at 4K on the 2080 without ever having to stray too far from ultra.

Just realised I haven’t gamed that much... yeah, I’m not really defending anything, lol.
 
Great strategy; I used to do that too. Games a couple of years old are cheap, have all their updates, and run with great performance.


I used to do this, and still do, but for the odd amazing game such as RDR2 I just play it now.

Otherwise I have to listen to other people saying how amazing the games are, and also life is sometimes too short.
 