Thinking Of Upgrading To A High Definition Monitor

Hi,

I'm thinking about buying a new monitor with a higher resolution for gaming. I'm not sure what the different types are, or what the terminology means?

I currently have a 23" Dell 2311H monitor. I don't want anything huge, 25-26" max really. I don't want to spend a huge amount of money either...

What would you suggest?

Thanks

Viper10



Corsair Graphite 600T Case
MSI B550-MAG (Socket AM4) DDR4 ATX Motherboard
AMD Ryzen 5 3600 Six Core 4.2GHz (Socket AM4) Processor - Retail
Patriot Viper Steel 16GB (2x8GB) DDR4 PC4-28800C17 3600MHz Dual Channel Kit
Powercolor Radeon RX 5700 XT Dual Fan 8GB GDDR6 PCI-Express Graphics Card (top PCI)
Crucial P5 Plus 1TB M.2 PCIe x4 NVMe SSD/Solid State Drive (top location)
Seagate Barracuda 7200.12 1TB SATA 6Gb/s 32MB
Seasonic Prime Ultra Snow Silent 650W 80 Plus Platinum Modular Power Supply
Arctic Freezer 34 ESports Duo White CPU Cooler - 2 x 120mm
Dell 2311H Monitor (DisplayPort)
 
Basically it's like this... your monitor is old; anything you get now will look miles better. You'll want IPS panel technology. Any monitor you get will connect via DisplayPort.

You need to choose whether you want to stick with 1920x1080 (aka Full HD, 1080p) resolution, which tends to come in around 24-25 inches, or step up to 2560x1440 (aka 1440p), which tends to be 27 inches.
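If it helps to put numbers on the sharpness difference, pixel density (PPI) is the easy comparison. A quick back-of-the-envelope sketch in Python (the sizes are just the typical pairings mentioned above):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(f'23" 1080p (your 2311H): {ppi(1920, 1080, 23):.0f} PPI')  # ~96 PPI
print(f'24" 1080p:              {ppi(1920, 1080, 24):.0f} PPI')  # ~92 PPI
print(f'27" 1440p:              {ppi(2560, 1440, 27):.0f} PPI')  # ~109 PPI
```

So 1440p at 27" is noticeably denser than either 1080p option, despite the bigger panel.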

You'll want to choose a variable refresh rate technology that's compatible with your graphics card, and with any graphics card you're likely to buy in future. This is FreeSync for AMD cards, and G-Sync or G-Sync Compatible for NVIDIA cards. Thankfully, there are plenty of "FreeSync with G-Sync Compatible" monitors, which give you full compatibility. These monitors will give you a refresh rate of about 144 Hz, up from your current 60 Hz, which is a much smoother gaming experience. They also let the monitor vary its refresh rate to match the fps output by the graphics card, and because the two are synchronised you don't get screen tearing. This all makes your gaming experience way better.
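To put rough numbers on "smoother", here's a tiny sketch of what the refresh rate change means in frame time terms:

```python
def frame_time_ms(rate_hz: float) -> float:
    """How long each refresh is on screen at a given rate."""
    return 1000.0 / rate_hz

for hz in (60, 144):
    print(f"{hz} Hz -> {frame_time_ms(hz):.1f} ms per refresh")

# 60 Hz -> 16.7 ms, 144 Hz -> 6.9 ms: each frame is replaced in well
# under half the time. With VRR (FreeSync / G-Sync Compatible) the
# monitor refreshes whenever the GPU finishes a frame, anywhere in its
# supported range, rather than on a fixed tick -- hence no tearing.
```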

Have a browse around here:
 
Or use lots of upscaling (DLSS/FSR) features to get you by!
I think "get you by" is the operative phrase. I remember when all the arguments were about IQ; now it's turn on ray tracing, lower the frame rate, then use DLSS to make the image worse than it was at native. Hence I'll stick with a 1440p rasterised native image until nice 4K native ray tracing is viable on an xx70-class card. Might be a while!
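For context on the native-versus-upscaled argument: DLSS and FSR render internally at a lower resolution and reconstruct up to your output resolution. A rough sketch (the per-axis scale factors are my approximations of typical "Quality" and "Performance" presets, not exact vendor numbers):

```python
# Approximate per-axis render scales for common upscaler presets.
PRESETS = {"quality": 0.67, "performance": 0.50}

def internal_res(out_w: int, out_h: int, preset: str) -> tuple[int, int]:
    """Resolution actually rendered before the upscaler reconstructs."""
    scale = PRESETS[preset]
    return round(out_w * scale), round(out_h * scale)

for preset in PRESETS:
    w, h = internal_res(3840, 2160, preset)
    print(f"4K output, {preset}: rendered at {w}x{h}")

# "4K with Performance upscaling" is really a ~1920x1080 render being
# reconstructed -- which is the nub of the argument above.
```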
 

Couldn't agree more! I recall the odd person laughing at the statement "better than native", and now I think even they say it. While ray tracing is improving, I can currently only see it working in storyline or RPG games, where the pace allows the environment to immerse you. For the majority of games it just takes too much effort, or it hasn't been implemented well enough.
 
Weren't there similar discussions when rasterisation first came in? It was a long time ago, so I'm not sure. It takes quite a few generations before these things become mainstream; past experience makes me wary of jumping on new stuff.
 
But it's not really new stuff; NVIDIA championed it with Turing, and games have been weaving it in, or having it from the beginning, ever since. I guess it's newer for AMD, as this generation will be their second chance to evolve it, but even still the hardware is hitting four years with it and the principles have been available for ages.

What we are seeing and experiencing is too much of a trade-off, so as you say it can be accepted as being too early. But also, as we mentioned, some harp on about how great it is while compromising in other areas to maintain a reasonable frame rate. I think you're right though: once an xx70-class card can use it without breaking too much of a sweat, maintaining the usual settings, we can class it as standard, with the bar forever set on recommended labels in game releases!
 
OK, so 1440p would be ideal for a 27" screen. I suppose 4K would be better, but mainly you'd see more benefit if you went for a bigger screen (with more cost).
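To put the "more benefit (with more cost)" in numbers: the pixel count roughly tracks the extra GPU work per frame. A quick comparison:

```python
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.1f} MP, {pixels / base:.2f}x the pixels of 1080p")

# 1440p is ~1.78x the pixels of 1080p, 4K is 4x -- so 4K needs far more
# GPU grunt for the same frame rate, on top of the dearer panel.
```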

I like ....


I had an LG (standard monitor) in the past and it's still glowing... any other suggestions?

I take it these (1440p) use a bit more energy than standard monitors?

Do they come with webcams?

Thanks

Viper10
 