
Why GPU prices are NOT likely to drop significantly EVER!

I'm thinking about just forgetting about the 6000/3000 series, mothballing my PC until the 4000 series and selling the Series X around then. My concern is a 3700X CPU and 3200MHz RAM (32GB) coming into 2022. Hoping it would not be a bottleneck at 4K by then.
 
I'm thinking about just forgetting about the 6000/3000 series, mothballing my PC until the 4000 series and selling the Series X around then. My concern is a 3700X CPU and 3200MHz RAM (32GB) coming into 2022. Hoping it would not be a bottleneck at 4K by then.


Just sell everything and start fresh in 1 year

no point holding onto stuff that will devalue
 
I'm thinking about just forgetting about the 6000/3000 series, mothballing my PC until the 4000 series and selling the Series X around then. My concern is a 3700X CPU and 3200MHz RAM (32GB) coming into 2022. Hoping it would not be a bottleneck at 4K by then.

4K is all on the graphics card, for the most part. Any six-core/twelve-thread CPU right back to the Ryzen 5 1600 will do for a few years at least.
 
I'm thinking about just forgetting about the 6000/3000 series, mothballing my PC until the 4000 series and selling the Series X around then. My concern is a 3700X CPU and 3200MHz RAM (32GB) coming into 2022. Hoping it would not be a bottleneck at 4K by then.

4K is all on the graphics card, for the most part. Any six-core/twelve-thread CPU right back to the Ryzen 5 1600 will do for a few years at least.

Yah, 4K is mostly GPU bound. Why buy a 3090 to play at 1080p? A waste if you ask me.

Totally false. You still run into CPU walls even at 4K, more so if you are talking raytracing, and particularly if you use an Nvidia GPU. Reviewers are still slow on the uptake & mostly incompetent, so you aren't going to see proper benchmarks on CPU RT, but ask the users out in the wild.

  • So, if you're buying an Nvidia GPU, then the 3700X will 100% be a bottleneck, with or without raytracing (see HW Unboxed for tests).

I can tell you from my own experience with an RX 6800 & i7 6800K that in Cyberpunk the fps plummets and more than halves once raytracing kicks in (this is at 360p or so, so it's not the GPU), staying closer to 45-50ish fps than 60 fps. In general open world games are particularly brutal on the CPU once you also add RT, WD:L was also quite hard to run properly initially until they did some more work, and it's still not perfect (esp. if you start ungimping the streaming settings that are meant for consoles).

I would say that for next-gen games you'll most likely want to upgrade the CPU too, with a Zen 3 at the bare minimum, if you get yourself a nice beefy RDNA 3 or other GPU. The CPU requirements will only go up from here on out. Games like Avatar? That's gonna be an absolute bloodbath with its RT foliage, heck even TD2 already gives CPUs a very nice workout.
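
To make the "CPU wall" point concrete, here's a toy sketch of the usual bottleneck reasoning (every number is made up for illustration, not a benchmark): the frame rate you actually get is roughly whichever of the CPU and GPU ceilings is lower, and dropping the resolution only lifts the GPU ceiling, which is why a 360p test exposes the CPU side.

```python
# Toy bottleneck model: delivered fps is roughly min(CPU ceiling, GPU ceiling).
# Every number below is an illustrative guess, not a measurement.

def delivered_fps(cpu_fps: float, gpu_fps: float) -> float:
    """Whichever side is slower sets the frame rate you see."""
    return min(cpu_fps, gpu_fps)

cpu_fps_raster = 120.0  # hypothetical CPU ceiling without raytracing
cpu_fps_rt = 50.0       # hypothetical CPU ceiling with RT (extra BVH/driver work)

gpu_fps_4k = 55.0       # hypothetical GPU ceiling at 4K
gpu_fps_360p = 300.0    # dropping to 360p all but removes the GPU limit

for label, gpu_fps in (("4K", gpu_fps_4k), ("360p", gpu_fps_360p)):
    print(f"{label:>4} | raster: {delivered_fps(cpu_fps_raster, gpu_fps):>3.0f} fps"
          f" | RT: {delivered_fps(cpu_fps_rt, gpu_fps):>3.0f} fps")
```

Under that toy model, the 360p RT number still tops out around the CPU's RT ceiling, which is exactly the kind of wall described above.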
 
I'm thinking about just forgetting about the 6000/3000 series, mothballing my PC until the 4000 series and selling the Series X around then. My concern is a 3700X CPU and 3200MHz RAM (32GB) coming into 2022. Hoping it would not be a bottleneck at 4K by then.

The Ryzen 7 3700X will be fine for a few years IMHO, especially if you are not buying £1000+ GPUs for RT:

https://www.youtube.com/watch?v=AlfwXqODqp4&t=1455s


The fact is every generation there is a "better" CPU coming out, so if you measurebate every year on CPU graphs, you will never keep a CPU for longer than a year or so. Hardware forums are full of people who eternally need to justify upgrading, and I tend to be wary of reading too much into things. You can always tweak settings, etc., and stuff like FreeSync/G-Sync helps a lot if your monitor supports it.

The games which I found to be very CPU limited at higher resolutions are all based on older engines designed with Intel CPUs in mind, so they are latency dependent and push one or two cores a lot. CPUs like SKL-X had issues with the same games, but consumer Skylake did not. Even the so-called modern CPU-limited games also tend to be very GPU limited, and realistically in most cases, as a person who does not upgrade CPUs that much, I found that 90% of the time I have been GPU limited. I remember being on SB/IB for years, and yes, you could force a CPU-limited scenario using lower-resolution testing, but in the real world you still need a fast enough GPU to see it at higher resolutions. I had an IB Xeon E3 1230 V2 and went from an HD5850 1GB initially to a GTX1080 finally, and only with the latter (which was several times faster) did I start to see some CPU issues at qHD.

ATM many games are intergenerational, so they are really only taxing a few threads heavily; in these very same games you can see a Ryzen 5 5600X/Core i5 10600K matching their 8~12 core counterparts. You see this with CB2077, which runs better on Intel CPUs than on Zen 3.
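
A rough way to see why the extra cores don't buy much in these cross-gen titles is Amdahl's law: if most of the frame work sits on one or two heavy threads, adding cores barely moves the ceiling. The 60% "serial" share below is a guessed figure for illustration, not a profile of any real game.

```python
# Amdahl's law: speedup = 1 / (serial_fraction + parallel_fraction / cores)
# The 0.6 serial fraction is an assumed stand-in for a game that leans on
# one or two threads; it is not measured from any title.

def amdahl_speedup(serial_fraction: float, cores: int) -> float:
    parallel_fraction = 1.0 - serial_fraction
    return 1.0 / (serial_fraction + parallel_fraction / cores)

for cores in (6, 8, 12, 16):
    print(f"{cores:>2} cores -> {amdahl_speedup(0.6, cores):.2f}x over one core")
```

With that kind of split, going from 6 cores to 12 is only worth a few percent, which is consistent with a 5600X/10600K keeping pace with much bigger chips in those games.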

Once games move to the next-generation consoles, literally all the console dev kits will be based around Zen 2 - this is the CPU in the PS5 and Xbox Series X, in a lower-clocked and L3-cache-limited form (similar to the CPU part of the Zen 2 APUs). Games on the next-generation consoles will have to take the dual-CCX design into account, which games currently really don't need to. If they don't, the consoles will hit more CPU issues.

Also WRT RT, you are still massively limited by the GPU hardware, especially the number of RT cores, memory bandwidth and VRAM. This is why, with my RTX 3060 Ti, performance at 1080p in CB2077 is still significantly higher than at qHD when you switch on all the RTX effects.
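
For a back-of-envelope sense of that resolution gap when you are GPU limited: qHD pushes roughly 78% more pixels per frame than 1080p, so a GPU-bound frame rate can be expected to fall by a similar order (only a rough estimate, since RT cost does not scale exactly with pixel count).

```python
# Rough pixel-count comparison; RT cost does not scale perfectly linearly with pixels.
pixels_1080p = 1920 * 1080  # 2,073,600
pixels_qhd = 2560 * 1440    # 3,686,400
print(f"qHD pushes {pixels_qhd / pixels_1080p:.2f}x the pixels of 1080p per frame")
```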

I am also seeing nearly double the performance at qHD in some of the most CPU- and GPU-demanding areas of the game with the previous non-RT settings (lots of NPCs and lighting effects) when compared with my GTX1080. FPS went from around 30FPS to around 50~55FPS.

This is despite there being "faster CPUs" for the game.

So personally I think you are OK for the time being, unless you get a really cheap deal on a Zen 3 CPU. 2023 will have new-generation DDR5 platforms out with even more performance, so everyone saying to get Zen 3 will then say Zen 3 is too slow and that you need Zen 4, etc.

You can then carry over your RTX 3080. However, if you don't really use the PC for gaming and use other platforms more, then yes, it might make sense to sell it off while the parts have value and use those systems.
 
I would say that for next-gen games you'll most likely want to upgrade the CPU too, with a Zen 3 at the bare minimum, if you get yourself a nice beefy RDNA 3 or other GPU. The CPU requirements will only go up from here on out. Games like Avatar? That's gonna be an absolute bloodbath with its RT foliage, heck even TD2 already gives CPUs a very nice workout.
At the current rate of CPU progress we're at 15 or 20% gains per new generation. If that holds, I'd upgrade at least every 2 generations, because that's a fair leap, at least compared to how it was for a while before that.
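
As a rough illustration of the "every two generations" logic (the 15-20% per-generation figure is the estimate above, not a measured constant), the gains compound, so two generations stack up to roughly a 32-44% total uplift:

```python
# Compounded uplift after n generations at a fixed per-generation gain.
# The 15% and 20% figures are the ballpark quoted above, not measurements.
for per_gen_gain in (0.15, 0.20):
    for generations in (1, 2, 3):
        uplift = (1.0 + per_gen_gain) ** generations - 1.0
        print(f"{per_gen_gain:.0%}/gen x {generations} gen(s) -> {uplift:.0%} total")
```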
 
You think this is bad? Wait till prices return to normal. Think about it: thread after thread of moaning about low GPU prices after folk shelled out insane amounts. Personally not sure what's worse, that or scalpers :):p:cry:
 
These CPU-limited stories are all the same. Take the fastest GPU money can buy, run games at embedded-CPU graphics settings (= ultra bad), and then declare you're CPU limited because a hugely artificial scenario like that can show some differences. Run a game at higher res and nicer settings and you're looking at single-digit speed differences. Except for cherry-picked games, perhaps.

I went from a 2500K to a 5600X and that does actually give tangible improvements depending on the game, but it's a huge leap in capability.
 
I'm thinking about just forgetting about the 6000/3000 series, mothballing my PC until the 4000 series and selling the Series X around then. My concern is a 3700X CPU and 3200MHz RAM (32GB) coming into 2022. Hoping it would not be a bottleneck at 4K by then.

What makes you think the 4000 series and RDNA3 cards won't be even more expensive and just as hard to find for MSRP?

I don't think GPU prices will ever return to Pascal levels of normality. Nvidia Turing was the beginning of the end of reasonable prices for high-end GPUs.
 
Totally false. You still run into CPU walls even at 4K, more so if you are talking raytracing, and particularly if you use an Nvidia GPU. Reviewers are still slow on the uptake & mostly incompetent, so you aren't going to see proper benchmarks on CPU RT, but ask the users out in the wild.

  • So, if you're buying an Nvidia GPU, then the 3700X will 100% be a bottleneck, with or without raytracing (see HW Unboxed for tests).
I can tell you from my own experience with an RX 6800 & i7 6800K that in Cyberpunk the fps plummets and more than halves once raytracing kicks in (this is at 360p or so, so it's not the GPU), staying closer to 45-50ish fps than 60 fps. In general open world games are particularly brutal on the CPU once you also add RT, WD:L was also quite hard to run properly initially until they did some more work, and it's still not perfect (esp. if you start ungimping the streaming settings that are meant for consoles).

I would say that for next-gen games you'll most likely want to upgrade the CPU too, with a Zen 3 at the bare minimum, if you get yourself a nice beefy RDNA 3 or other GPU. The CPU requirements will only go up from here on out. Games like Avatar? That's gonna be an absolute bloodbath with its RT foliage, heck even TD2 already gives CPUs a very nice workout.


Almost all reviewers have agreed that the CPU is not significant at 4K in almost all the games they tested.

So say you use an 8th Gen i7 K CPU with a 6800 XT rig and also with an RTX 3080 rig, playing games at 4K in, say, RDR2. The 6800 XT would perform worse and lose fps. Swap only the 6800 XT rig's CPU for a new 10th Gen i7 K CPU and the 6800 XT rig will still perform worse.

Now, swap out the 6800 XT for an RTX 3090 (with the 8th Gen CPU) and you would definitely see better performance.
 
What makes you think the 4000 series and RDNA3 cards won't be even more expensive and just as hard to find for MSRP?

I don't think GPU prices will ever return to Pascal levels of normality. Nvidia Turing was the beginning of the end of reasonable prices for high-end GPUs.

I'm sure the 1080 Ti was hitting £900 at one point; I remember paying £620 for a 1080.
 
My EVGA GTX 1080 SC ACX cost me £516 in 2017.

I believe Jonny Silverhand above is correct and we will never see Pascal prices again. The 4000/7000 series cards next year are likely to be much more expensive than the current ones, now that the manufacturers have seen people will pay any price for the latest gfx cards.
 