
Why GPU prices are NOT likely to drop significantly EVER!

Status
Not open for further replies.
Associate
Joined
6 Nov 2018
Posts
313
Location
UK
Totally false. You still run into CPU walls even at 4K, more so if you are talking raytracing, and particularly if you use an Nvidia GPU. Reviewers are still slow on the uptake & mostly incompetent, so you aren't going to see proper benchmarks on CPU RT - but ask the users out in the wild.

  • So, if you're buying an Nvidia GPU, then the 3700X will 100% be a bottleneck, with or without raytracing (see Hardware Unboxed for tests).
I can tell you from my own experience with an RX 6800 & i7-6800K that in Cyberpunk the fps plummets and more than halves once raytracing kicks in (this is at 360p or so, so it's not the GPU), staying closer to 45-50ish fps than 60 fps. In general, open-world games are particularly brutal on the CPU once you also add RT; WD:L was also quite hard to run properly at first until they did some more work, and it's still not perfect (esp. if you start ungimping the streaming settings that are meant for consoles).

I would say that for next-gen games you'll most likely want to upgrade the CPU too, with Zen 3 as the bare minimum, if you get yourself a nice beefy RDNA 3 or other GPU. The CPU requirements will only go up from here on out. Games like Avatar? That's gonna be an absolute bloodbath with its RT foliage; heck, even TD2 already gives CPUs a very nice workout.

I am the "users out in the wild" and have been happily 4K gaming with a Ryzen 5 2600 (2080 Ti/3080 Ti).

I can't run GTA 5 with grass on Ultra so I guess I should just throw it in the bin.
 
Soldato
Joined
18 Feb 2015
Posts
6,484
Almost all reviewers have agreed that the CPU is not significant at 4K in almost all the games they tested.
No, they haven't. In fact I've already mentioned to you major reviewers who did thorough testing and showcased exactly the type of scenarios that will get you bottlenecked at 4K.

So say you use an i7 8th Gen K CPU with a 6800XT rig and also an RTX 3080, to play games at 4K, in say RDR2. The 6800XT would perform worse and lose fps. Swap the CPU of only the 6800XT rig for a new i7 10th Gen K CPU and the 6800XT rig will still perform worse.

Now swap out the 6800XT for an RTX 3090 (with the 8th Gen CPU) and you would definitely see better performance.

No it wouldn't, read & re-read: Nvidia GPUs are more CPU bound than AMD ones, so without top-tier current-gen CPUs you'll run into a CPU bottleneck even at 4K. Will it be every game? No, but it happens often enough for a >$1500 GPU that you should think about it, and going forward CPU demands will only increase.

Besides that, you also have scenarios like turning raytracing on, which further exacerbates CPU requirements, and in that case BOTH AMD & Nvidia will run headlong into CPU bottlenecks (non-Zen 3/equivalent Intel etc.), again even at 4K. Mind you, when I say 4K+RT I of course assume DLSS/FSR is on, otherwise you'll run the game at 10 fps regardless of GPU and the discussion is pointless - I'm assuming that from a high-end setup you'll want a smooth 60.

Even more importantly, because it's something you WON'T see in most reviews, I'm also taking into account frametime graphs & finer measurements across the board so as to properly evaluate a CPU bottleneck - just having a high average doesn't mean you have a smooth experience, even above 60 fps, and 1% lows alone aren't enough.
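To show the kind of number-crunching beyond averages I mean, here's a minimal sketch in Python with made-up frametime values (the numbers are purely illustrative, not any reviewer's actual data or method): the average can look fine while the slowest frames expose the stutter.

```python
# Minimal sketch with invented frametimes (ms). The point: a run can average
# above 60 fps while the slowest 1% of frames drag the "1% low" figure down
# to console territory - which is what a CPU bottleneck feels like in play.
import statistics

def summarise(frametimes_ms):
    total_s = sum(frametimes_ms) / 1000.0
    avg_fps = len(frametimes_ms) / total_s

    # "1% low": average fps of the slowest 1% of frames.
    slowest = sorted(frametimes_ms, reverse=True)
    worst_1pct = slowest[: max(1, len(slowest) // 100)]
    low_1pct_fps = 1000.0 / statistics.mean(worst_1pct)

    return avg_fps, low_1pct_fps, max(frametimes_ms)

# 95 smooth frames at ~14 ms plus 5 CPU hitches at 45 ms (hypothetical).
avg, low, worst = summarise([14.0] * 95 + [45.0] * 5)
print(f"avg {avg:.0f} fps, 1% low {low:.0f} fps, worst frametime {worst:.0f} ms")
```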

re NV CPU:
https://youtu.be/nIoZB-cnjc0
https://youtu.be/JLEIJhunaW8
https://youtu.be/G03fzsYUNDU

and a whole bunch of other stuff I won't bother linking, you can find it if you care, otherwise "just trust me bro". The first video is the most important.

Remember also where the discussion started: a future high-end GPU (so likely >RTX 3090 in performance) paired with a Zen 2 CPU. There's NO DOUBT that even at 4K that will mean a CPU bottleneck, particularly if it's going to be an Nvidia GPU.

I am the "users out in the wild" and have been happily 4K gaming with a Ryzen 5 2600 (2080 Ti/3080 Ti).

I can't run GTA 5 with grass on Ultra so I guess I should just throw it in the bin.

GTA 5 is a different situation, because it's on DX11. But if you don't care that your 2080 Ti/3080 Ti is held back and performing more like a 2070 then... good for you? People are also happily playing that game on consoles with Xbox 360-like settings in 2021, so what.
 
Associate
Joined
6 Nov 2018
Posts
313
Location
UK
Pretty sure we could cherry-pick examples of times when there is some bottleneck; games are never coded perfectly. We only suggested the guy wait a few years, which, despite your ramblings, is still the best advice, as a 6c/12t CPU can handle nearly all games at 4K. I would use neither Cyberpunk nor a technology still in its infancy such as ray tracing as a reason to be worried about CPU bottlenecks in 4K PC gaming.

Why do you want to die on this hill? It's an established fact that 4K is all about the graphics card over the CPU, by professional benchmarks and real-world users. Did you drop a grand on a Threadripper thinking it would improve 4K performance and now need to convince yourself and others?

And my 3080 Ti is not running like a 2070 when I am getting 4K60 in all the games I throw at it with settings at their highest (bar a couple of poorly optimised ones that need reducing no matter the CPU).
 
Permabanned
Joined
20 Jan 2021
Posts
1,337
https://youtu.be/JLEIJhunaW8
https://youtu.be/G03fzsYUNDU

No it wouldn't, read & re-read: Nvidia GPUs are more CPU bound than AMD ones, so without top-tier current-gen CPUs you'll run into a CPU bottleneck even at 4K.

You mention that a GPU absent a "Top Gen CPU" will give you a bottleneck at 4K, but as far as I am aware, an overclocked 5GHz Core i7 9th or 10th Gen Intel "K" CPU was definitely a "Top Gen CPU" when the Nvidia 3 Series/AMD 6 Series cards were released.

His review is all at 1080p with some games at 1440p. Who buys a 3080/3080 Ti or 3090 to play games at 1080p :cry:? That's just overkill if you ask me. You might as well just go and get a 4K monitor and enjoy 4K UHD gaming, where the CPUs are not bottlenecking.
 

V F

Soldato
Joined
13 Aug 2003
Posts
21,184
Location
UK
You mention that a GPU absent a "Top Gen CPU" will give you a bottleneck at 4K, but as far as I am aware, an overclocked 5GHz Core i7 9th or 10th Gen Intel "K" CPU was definitely a "Top Gen CPU" when the Nvidia 3 Series/AMD 6 Series cards were released.

His review is all at 1080p with some games at 1440p. Who buys a 3080/3080 Ti or 3090 to play games at 1080p :cry:? That's just overkill if you ask me. You might as well just go and get a 4K monitor and enjoy 4K UHD gaming, where the CPUs are not bottlenecking.

The same people that play 1080p at 240-360 Hz.
 
Soldato
Joined
19 Sep 2009
Posts
2,747
Location
Riedquat system
When talking about CPU bottlenecks, for me it is just a matter of how many fps the CPU can put out, not a question of resolution. If your current CPU can push 60 fps+ minimums, for example, and you are happy with that, then there is no need to get a new one. If you want 120 fps and it can't push that many but your GPU can, then you may decide an upgrade is in order.
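A toy way to put that framing (illustrative numbers only, not a benchmark): the fps you actually see is roughly the lower of what the CPU can prepare and what the GPU can render at a given resolution, so changing resolution only moves the GPU side of that pair.

```python
# Toy model with invented numbers: observed fps is roughly min(CPU cap, GPU cap).
# Resolution changes the GPU figure but not the CPU figure, so a CPU that tops
# out at ~70 fps is invisible against a 60 fps target yet blocks a 120 fps one.
cpu_fps_cap = 70                                            # hypothetical CPU limit
gpu_fps_by_res = {"1080p": 200, "1440p": 140, "4K": 75}     # hypothetical GPU limits

for res, gpu_fps in gpu_fps_by_res.items():
    observed = min(cpu_fps_cap, gpu_fps)
    limiter = "CPU" if cpu_fps_cap <= gpu_fps else "GPU"
    print(f"{res}: ~{observed} fps ({limiter}-bound)")
```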

I don't think the CP2077 article has enough info - only two CPUs, and IIRC that game ran better on Intel CPUs anyway. I think most games would have run okay on my ancient 5820K, but a few struggled with its single-thread performance.
 
Soldato
Joined
9 Aug 2013
Posts
2,663
Location
S. Wales
there u go :D
 
Caporegime
Joined
20 May 2007
Posts
39,703
Location
Surrey
Looks like they are already dropping. Deals at or around RRP today from this place.

RRP might return over the coming months in more volume.
 
Associate
Joined
31 Dec 2010
Posts
2,441
Location
Sussex
ETH is on the way up again, so it may encourage miners to become active again in buying, seeing as the 3060 Ti is available - can they resist?
Ah, that explains it.
I was surprised when I looked at WhatToMine while commenting about the RX 7900XT rumours: all the cards showed big profitability.
No wonder, ETH and BTC are up again - around 25% over the last 7 days.
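To show why a price jump like that flows straight through to the profitability numbers on WhatToMine, here's a rough back-of-the-envelope sketch in Python - every figure in it is invented for illustration, not a real card's hashrate or payout.

```python
# Back-of-the-envelope with invented numbers: coin earned per day is fixed by
# hashrate, so fiat revenue scales with coin price while the power bill stays
# flat - a ~25% price rise fattens the daily margin by even more than 25%.
coin_per_day = 0.002        # hypothetical coin mined per card per day
power_kwh_per_day = 3.6     # e.g. a 150 W card running 24 h
electricity = 0.20          # price per kWh (hypothetical)

def daily_profit(coin_price):
    return coin_per_day * coin_price - power_kwh_per_day * electricity

for price in (2400, 3000):  # hypothetical coin price before/after a ~25% rise
    print(price, round(daily_profit(price), 2))
```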
 
Soldato
Joined
9 Aug 2013
Posts
2,663
Location
S. Wales
Ah, that explains it.
I was surprised when I looked at WhatToMine while commenting about the RX 7900XT rumours: all the cards showed big profitability.
No wonder, ETH and BTC are up again - around 25% over the last 7 days.
Yep, and tbh I don't think it's a trap; I think it's weathered a little rocky patch and up we go. It will become increasingly hard to mine as time goes on, though, so I'm not sure how much effect it will have on the GPU market, especially after the crackdown in China.
 
Permabanned
Joined
22 Oct 2018
Posts
2,451
Looks like they are already dropping. Deals at or around RRP today from this place.

RRP might return over the coming months in more volume.

Yes, but I think we are unlikely to see many cards that have a low margin. You'll note there are plenty of the high RRP-to-performance cards like the 3070 Ti and the 3080 Ti, but absolutely none of the low-margin 3080s.
 
Permabanned
Joined
20 Jan 2021
Posts
1,337
ETH is on the way up again, so it may encourage miners to become active again in buying, seeing as the 3060 Ti is available - can they resist?

Oh well, at least more GPUs are in stores for everyone, and I think most gamers have already bought a GPU, so if the miners want to buy them, they should do.
 
Soldato
Joined
9 Aug 2013
Posts
2,663
Location
S. Wales
Oh well, at least more GPUs are in stores for everyone, and I think most gamers have already bought a GPU, so if the miners want to buy them, they should do.
I won't be buying any to mine; it's just spending money for me. I'm using my Vega, using the APU to browse and the Series S to game until prices of mid-tier cards fall back to normality.
 