
NVIDIA 4000 Series

Might do, name them...
Why would you do that? Why would you ask me to list them so that you can go through them and say they don't count? How does that help or achieve anything? The fact that you're already acknowledging you'll do that proves you're still trying to make a different argument to the one I'm making.

How about this? Your Desktop environment.
 
I went from a 75 Hz to a 165 Hz screen, smoooooth....
Well, how much can you do on the 4090, 144 Hz? What's the fastest panel out there that's also high quality for colour reproduction and viewing angles, not just FPS? Can you really feel a significant difference between 120/144 and 165? And then again, in gaming it's really, really difficult to push those numbers without sacrificing details/image quality.
 
:D
 
In a desktop environment anything above 120 Hz is just adding numbers without any noticeable gain. 60 Hz to 120 Hz is the biggest gain for sure; 144 Hz seems the sweet spot as a balance for excellent gaming smoothness (remember, the higher the fps you can squeeze out, the lower the render latency and the better the VRR experience becomes).
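The diminishing returns are easy to see in the frame times themselves. A minimal sketch, assuming an ideal pipeline that delivers exactly one frame per refresh (real render latency also depends on the game and GPU):

```python
# Frame time saved per refresh-rate step, under the idealised
# one-frame-per-refresh assumption.
def frame_time_ms(hz: float) -> float:
    return 1000.0 / hz

steps = [(60, 120), (120, 144), (144, 165)]
for lo, hi in steps:
    saved = frame_time_ms(lo) - frame_time_ms(hi)
    print(f"{lo} Hz -> {hi} Hz: {saved:.2f} ms shorter frame time")
```

The 60-to-120 step shaves over 8 ms off each frame; 144 to 165 saves under 1 ms, which is why the gains flatten out.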

I'm running my 175 Hz panel at 144 Hz because this generation of QD-OLED has less VRR flicker when playing dark scenes in games or using certain desktop apps with heavy shades of grey, like Photoshop, and I get native 10-bit colour instead of 8-bit + FRC, even if 8-bit on an Nvidia card is just as good as native 10-bit.
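For context, a quick sketch of the tonal resolution at stake. This is illustrative arithmetic only: 8-bit + FRC temporally dithers between adjacent 8-bit levels to approximate the extra gradations rather than displaying them natively.

```python
# Distinct tonal levels per channel at each bit depth, and the
# resulting total colour count across three channels.
def levels(bits: int) -> int:
    return 2 ** bits

for bits in (8, 10):
    n = levels(bits)
    print(f"{bits}-bit: {n} levels per channel, {n ** 3:,} total colours")
```

So native 10-bit offers four times the levels per channel; whether FRC's approximation of that is visibly different depends on the panel and the GPU's dithering.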
 
I thought we were doing so well with no Nvidia vs AMD for a while.
One having the faster port inevitably brings that into it, even if that's not the point.

For me it's that constant regression in what you're getting for your money in the name of increased profits and planned obsolescence.

When people pointed that out with 8 GB £500 GPUs in 2020 it became AMD vs Nvidia because, unfortunately, they are the only alternative and they had double the VRAM.

If we are to remain silent on these things because calling it out makes it a hot topic then they will walk all over us.

I spent nearly 10 years with Nvidia before switching to AMD, and I only did it because Nvidia had become too complacent and offered me much less than what I wanted compared to AMD.

It's a £2000 card, put a better DP on it....
 

Just on this: 8-bit + FRC actually appears to be better than native 10-bit on a variety of displays, although this might apply more to Nvidia, since their FRC handling is better than AMD's.

 
Money well spent :))

I'm really curious which games, relatively recent ones, can go beyond 120/144 fps, and whether such a big "speed" is actually useful.
Honestly, World of Warcraft would.

Getting a new expansion this year and they always update asset visuals.

Before anyone moans, it's likely to outsell any other PC game.
 
Honestly, there's no excuse for Nvidia omitting DP 2.0, or whatever, from the 4000 series.

I don't really care about your use case; spending the amount of money you did on a 4090, even with a bit of tax evasion, and having something outdated shouldn't be defended in any way.
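A back-of-envelope bandwidth check shows why the port matters at the resolutions a 4090 targets. The link-rate figures below are approximate payload rates after line coding, and the calculation ignores blanking overhead, so real requirements are somewhat higher:

```python
# Uncompressed video bandwidth vs approximate DisplayPort payload rates.
def signal_gbps(width: int, height: int, hz: int, bits_per_channel: int) -> float:
    # Three colour channels; ignores blanking intervals.
    return width * height * hz * bits_per_channel * 3 / 1e9

DP14_HBR3 = 25.92    # Gbit/s usable payload, DP 1.4 HBR3 (approx.)
DP2X_UHBR20 = 77.37  # Gbit/s usable payload, DP 2.x UHBR20 (approx.)

need = signal_gbps(3840, 2160, 240, 10)  # 4K, 240 Hz, 10-bit
print(f"4K 240 Hz 10-bit needs ~{need:.1f} Gbit/s uncompressed")
print(f"DP 1.4 HBR3 ~{DP14_HBR3} Gbit/s -> needs DSC; UHBR20 ~{DP2X_UHBR20} Gbit/s -> fits")
```

Roughly 60 Gbit/s uncompressed comfortably exceeds DP 1.4's payload, which is why high-refresh 4K on the 4000 series relies on DSC.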
 
For me, if you'd made that your point from the start, you'd probably have had plenty of people, myself included, agreeing with you.

Because that is very true. Sometimes I think you like a bit of back and forth :D
 