Right, let's talk everyone - have we all just been psychologically tricked into "needing" new graphics cards?

The great irony is that generated frames need a decent framerate in the first place. Congrats, you can do 60 real fps - let's smooth that with another 180 generated frames for a slightly smoother image with the responsiveness of 60fps.

Meanwhile the 30fps strugglers could be smoothed all the way to 240fps with generated frames, and while it makes the picture prettier, the game is still only reacting at 30fps...

It belongs next to motion blur.
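The arithmetic behind that complaint is easy to sketch. This is a toy model (it ignores the extra latency frame generation itself adds, which makes things worse, not better): displayed fps scales with the generation ratio, but responsiveness stays tied to the real frame rate.

```python
def frame_gen_summary(real_fps: float, gen_ratio: int) -> dict:
    """Toy model: displayed fps scales with generated frames,
    but input latency is still tied to the real frame rate."""
    displayed_fps = real_fps * gen_ratio
    # One real frame of latency, in milliseconds (real pipelines add more).
    input_latency_ms = 1000.0 / real_fps
    return {"displayed_fps": displayed_fps,
            "input_latency_ms": round(input_latency_ms, 1)}

# 30fps smoothed 8x looks like 240fps but still responds like 30fps.
print(frame_gen_summary(30, 8))  # displayed 240, latency ~33.3ms
print(frame_gen_summary(60, 4))  # displayed 240, latency ~16.7ms
```

Both examples display "240fps", but one feels like 30fps and the other like 60fps - which is the whole complaint.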
 
I can't help but feel we've stagnated in everything.

1440p and 4K have been around for a long time now (4K TVs have been commercially available since 2013 and mainstream since 2016 - that's 9-12 years). We should really be able to have decent frame rates at 4K as a minimum by now.

Graphics don't appear to be significantly better than the Xbox 360 days. They are better, but compared to what developers were doing with the hardware then versus what they are doing now, it seems to me that games just aren't optimised. Maybe because there's lots of RAM and lots of compute power and capacity, it has instilled laziness in the coding - I don't know.

I don't like the idea of fake frames, whenever I've played with them on it feels unnatural, almost like everything is delayed.
 
No, I understand CPU bottlenecking. It's just that it isn't always apparent that a system is bottlenecked by the CPU.

The i7 8700K wouldn't always show 100% load even when the GPU was sitting below 100%.

With a GPU, you see 100% load and know it's being pushed to its limit. In that case a new GPU would increase performance.

A CPU, on the other hand, plays a massive part in overall PC performance. Without doing a lot of digging through logging software you would believe the CPU is doing fine.
Then you have another limiting factor: RAM speed, where slow DDR4 can further hold back a struggling CPU.

You can then also blame a game's bad optimisation, which I feel a lot of users do without realising the impact of their own system.

But if the GPU is not hitting at least 99% and you're not using some frame-capping software, then this would point to the CPU being the issue, no? The CPU does not need to be hitting 100%.

Even then, there is more to it like cache.
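That rule of thumb can be sketched as a small decision function. The thresholds (99% GPU, 95% on a single core) are hypothetical round numbers for illustration, not values from any tool:

```python
def likely_bottleneck(gpu_load: float, per_core_load: list[float],
                      fps_capped: bool) -> str:
    """Rough rule of thumb: a GPU sitting below ~99% without a frame
    cap points upstream, and one saturated core is enough to be CPU
    limited even when total CPU load looks low."""
    if fps_capped:
        return "frame cap / vsync"
    if gpu_load >= 99:
        return "GPU"
    if max(per_core_load) >= 95:
        return "CPU (single-thread limited)"
    return "unclear: RAM speed, cache, game code, drivers..."

# GPU at 70%, total CPU load looks low, but one core is pegged:
print(likely_bottleneck(70, [98, 20, 15, 10], False))
```

The last branch is the honest one: as the thread says, plenty of cases don't resolve cleanly to "CPU" or "GPU" at all.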

I have a 5900X. I know I would get higher fps if I upgraded. But as I tend to be OK as long as I can hit 60fps, I am not too bothered and would rather wait for the AM6 platform and jump on that.

So far I can only think of a handful of games that would improve a lot. One of the big ones is the Spider-Man games. Even at 4K.
 

No, because as I pointed out in another reply, so many factors can cause a GPU not to hit 100% load, and it isn't always the CPU to blame.

Poor game coding, software, even user error.
JayzTwoCents just released a video yesterday, I believe, on Intel PresentMon - this software can help users find the issue with their games.
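As a quick sketch of what that kind of capture analysis looks like, here's a minimal Python summary of a PresentMon-style CSV. The `MsBetweenPresents` column name is an assumption based on older PresentMon output - check your version's headers before relying on it:

```python
import csv
import statistics

def frame_stats(csv_path: str, col: str = "MsBetweenPresents") -> dict:
    """Summarise a PresentMon-style frame-time capture.
    NOTE: the column name is assumed from older PresentMon CSVs."""
    with open(csv_path, newline="") as f:
        times = sorted(float(row[col]) for row in csv.DictReader(f))
    avg_fps = 1000.0 / statistics.mean(times)
    # 99th-percentile frame time -> a "1% low" style fps figure
    p99 = times[min(int(len(times) * 0.99), len(times) - 1)]
    return {"avg_fps": round(avg_fps, 1),
            "low_1pct_fps": round(1000.0 / p99, 1)}
```

A big gap between the average and the 1% low figure is exactly the kind of stutter that utilisation graphs alone won't show you.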

Jay rightly talks at times about CPU bottlenecks, and the term is definitely used wrongly in the industry.

People believe that if their CPU isn't hitting 100%, then they aren't CPU limited.
That is far from the truth.
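A toy illustration of why the aggregate number hides it - on a hypothetical 24-thread CPU, one pegged game thread barely moves the overall figure:

```python
# Hypothetical per-thread loads on a 24-thread CPU: the game's main
# thread is pegged at 100% while the other threads mostly idle.
per_thread = [100.0] + [5.0] * 23
overall = sum(per_thread) / len(per_thread)
print(f"overall CPU load: {overall:.1f}%")  # reads ~9%, yet CPU limited
```

Task Manager's headline number is that average, which is why a fully single-thread-limited game can look like the CPU is loafing.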
 

This is not news though. Old news, in fact.

Assuming you watch tech channels and their reviews, like HUB, it would be obvious from their benches alone when something is CPU bottlenecked. I know for a fact that if I went for a 9800X3D I would see a sizeable uplift in the Spider-Man games, both because of the CPU and the faster RAM.
 
Which will happen if your buddy Lisa prices the 9070 XT wrong. Lol.

If she does it right it will swing in the other direction.
My concern with the 9070 is likely the complete opposite of everyone else's. I'm concerned that whilst the raster performance may be at 4080 levels, which would likely be OK for me, FSR support in games will just be missing. If, for those fancy AAA RPGs that I enjoy, DLSS provides smooth and pretty enhancements while FSR support is lacking or less accomplished, I'm missing out on a better experience.

But I'm ever less convinced that raster fps is a metric that I care about once my games are past 60fps. I'm more excited by what the next generation will do with mega geometry and the like.
 
The most demanding thing I do with my rig is sim racing in VR on an HP Reverb. I decided to launch fpsVR and see how hard I was straining my hardware, and the 4090 was coasting along between 50 and 60% load with GPU frame times consistently in the 5-6ms range. Since my headset runs at 90Hz, I just need to stay under 11.1ms.

I am basically display-bottlenecked. Lol.

My recent upgrade from a 5800X3D to a 9800X3D was also totally unnecessary with CPU frame times around 3ms on the 9800X3D. (The 5800X3D was also comfortably under 11.1ms but I don't recall the exact number.)
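That 11.1ms figure is just the per-frame budget of a fixed-refresh display, 1000/Hz. A quick sketch using the numbers from this post:

```python
def frame_budget_ms(refresh_hz: float) -> float:
    """Per-frame time budget for a fixed-refresh display or headset."""
    return 1000.0 / refresh_hz

budget = frame_budget_ms(90)   # 90Hz headset -> ~11.1ms per frame
gpu_ms, cpu_ms = 6.0, 3.0      # frame times quoted in the post
# CPU and GPU work mostly overlap, so the slower of the two gates you.
headroom = budget - max(gpu_ms, cpu_ms)
print(f"budget {budget:.1f}ms, headroom {headroom:.1f}ms")
```

With roughly 5ms of headroom every frame, "display-bottlenecked" is about right.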

The tiny FE cooler on the 5090 does give me a little bit of the upgrade itch, as I find it very impressive (assuming it's able to cool 575W effectively), but I don't know that I want to drop $2k to tinker with an impressive cooler. I would probably end up spending additional money to convert everything over to an SFF build and get no visual improvement in my gaming experience.

I thought the looming Windows 10 cutoff would spur me to upgrade my headset to one that doesn't require WMR, but the BSB runs at 72Hz and the Crystal Light is big and bulky.

A friend of mine has pre-ordered the Pimax Super so I'll be able to try it once it releases. Maybe that will be the headset that pushes me into upgrading, but I think I'm going to just grab some popcorn and watch from the sidelines this GPU cycle.
 
Also I can't believe people still watch jay2c :cry:

I stick to the two Steves lol
Yeah, the Steves are the best. Tech Jesus is all in on integrity, no matter the target - you have to respect that.

And HUB's work ethic is the highest amongst all those channels - when questions pop up, they'll spend 40 hours straight benchmarking. Honestly, I might rather have a jobbyjob than benchmarking duty.
 
'dynamic resolution scaling'

64fps average, 55fps 99th-percentile low
There's your problem - this wouldn't be acceptable to lots of people.

I currently have an RTX 3080, and in a few of the games I play I'm having to heavily mess about with settings to get the frame rates I want. Four years is a decent run for it, but yes, I do want an upgrade. Going from roughly 8,700 cores to 9-10k doesn't seem like a great upgrade either, but the roughly 20k on the 5090 will suit me fine. Knowing it's likely to be another four-plus years before a suitable upgrade from there also makes it easily justifiable within my hobby budget.
 