I think he said his games are dipping by 10 FPS, not down to 10 FPS.
Yes, dipping by 10 FPS sometimes, from 60 down to around 50 FPS in some locations in games.
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
You haven't mentioned what CPU you have, and yes it sounds like a CPU bottleneck especially since you talk of single core performance.
It's an i7 8700K. From looking at benchmark scores it hardly makes any difference, maybe 1 FPS more in 4K gaming, so it's 100% not the problem here and 100% not worth upgrading just yet. ( https://cpu.userbenchmark.com/Compare/Intel-Core-i7-8700K-vs-Intel-Core-i7-10700K/3937vs4070 )
A post from a position of ignorance by me, but isn't a quick drop in FPS often a sign of hitting a VRAM limit?
You meant to say:
The lower the res, the more likely you'll bottleneck the CPU.
The higher the res, the more the GPU becomes the bottleneck and the less work the CPU does.
Ssssshhhhh. I think you might have gotten away with mentioning it just this once.
Just to school me on this one too: why does the CPU do less work at higher resolutions? I would have expected it to do the same amount of work or more, with only the GPU limiting frame rates. Or is it the case that if the GPU is only able to generate 60 FPS, the CPU will only do 60 FPS worth of work rather than the 150 FPS it's potentially capable of?
Interested as that has practical implications that the latest and greatest CPU is less relevant to gaming at higher resolutions.
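That second guess is essentially right: the CPU prepares frames only as fast as the GPU consumes them, so when the GPU is the slower side, the CPU sits partly idle. A toy model (my own sketch, with made-up numbers, not measured data) makes the "min of the two rates" behavior concrete: CPU time per frame is roughly resolution-independent, while GPU time scales with pixel count.

```python
# Toy frame-pacing model (illustrative numbers only, not benchmarks):
# the delivered frame rate is capped by whichever side takes longer per frame.

CPU_FRAME_MS = 6.7          # hypothetical CPU time per frame (~150 FPS capable)
GPU_MS_PER_MEGAPIXEL = 2.0  # hypothetical GPU cost per million pixels rendered

RESOLUTIONS = {
    "1080p": 1920 * 1080,
    "1440p": 2560 * 1440,
    "4K":    3840 * 2160,
}

for name, pixels in RESOLUTIONS.items():
    gpu_frame_ms = GPU_MS_PER_MEGAPIXEL * pixels / 1e6
    # Frame time is set by the slower of the two stages.
    fps = 1000 / max(CPU_FRAME_MS, gpu_frame_ms)
    limiter = "CPU" if CPU_FRAME_MS >= gpu_frame_ms else "GPU"
    print(f"{name}: {fps:.0f} FPS ({limiter}-bound)")
```

With these made-up costs, 1080p comes out CPU-bound (~149 FPS) while 4K is GPU-bound (~60 FPS), and notice the CPU's own numbers never changed; at 4K it simply does 60 frames' worth of work instead of 150. That's why a faster CPU matters less the higher the resolution goes.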
I still don't get the impression that the new gen is really ready for 4K anyway. Older titles maybe; newer (next-gen) titles, no chance. Unless you're looking for a 30 FPS experience. Especially if RT gets involved.
I'm referring to next gen with RT enabled.
It will be a 30fps job.
I still stand by my statement that 4k isn't here yet.
Older titles sure and especially those with no RT. But with it and the latest titles with all the bells and whistles I can't see it happening.
We'd need DLSS 3.0 or a new generation.
---
I mean, tell me how Watch Dogs Legion runs at 4K ultra settings with RT on max?
And Cyberpunk?
They will both be 30-40 FPS if you're lucky. But with G-Sync it will still be a great experience.
(I speak with a 3080 in mind)
62.6 for my 3080 with those settings. Just benchmarked RDR2 with [email protected] and a 3090 FE on stock clocks. Avg of 66 FPS with pretty much everything on ultra etc., plus additional settings as per GamesRadar's suggestions for max visuals. Basically all out, and I can turn down quite a few things no problem. Your 3080 shouldn't be far behind...
Are you reading the actual thread, mate? RTX is a gimmick in most games. I personally have it on low/medium, sometimes disabled. Then I get high frame rates (60-120 FPS) in the vast majority of games.
I think many, including yourself, just want to believe 4K isn't ready yet and make up excuses. I mean, your GTX 1080 isn't doing ray tracing either, so why only consider 4K viable if it can run max ray tracing? That's really flawed logic.