Anyone gone 1440p/4K and been unimpressed?

Got a 4K monitor a few months ago..

- Only play games at 2560x1440
- Watch all media at '4k' (Really just 1080p..)
- Tried watching YouTube on 4k: Lol..

Love the monitor for productivity.. Anything else? Not for a long time :/
 
You're not unimpressed; you just made the mistake of not having a system that performs decently at 4K, and you're misplacing your judgement onto it.

Also, are you really running 1080p on a 4K monitor? I'm not familiar with any sort of scaling tech, but won't that look extremely ugly compared to native?
 
Gaming at 4K now, I think it makes a huge difference over 1080p, and I'll happily sacrifice settings to make it run on my GTX 760.

A lot depends on how close you are to the screen as well. If you're on the other side of a living room, then it's not going to make much difference. If you're right in front of it, then it's much more noticeable.
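To put rough numbers on the distance point, here's a minimal pixels-per-degree sketch; the ~60 px/degree figure is just the common 20/20-acuity rule of thumb, and the 40" size and 2 ft / 8 ft distances are made-up examples, not anything quoted above:

```python
import math

# Rough pixels-per-degree for a 16:9 screen, to show why viewing
# distance matters. ~60 px/degree (1 arcminute per pixel) is the
# usual rule of thumb for 20/20 acuity - an approximation only.

def pixels_per_degree(diagonal_in, horizontal_px, distance_in):
    width_in = diagonal_in * 16 / math.hypot(16, 9)                    # screen width
    fov_deg = 2 * math.degrees(math.atan(width_in / (2 * distance_in)))  # horizontal field of view
    return horizontal_px / fov_deg

for label, px in (("1080p", 1920), ("4K", 3840)):
    for dist in (24, 96):  # desk (~2 ft) vs across the room (~8 ft)
        print(f'{label} on a 40" screen at {dist}": '
              f'{pixels_per_degree(40, px, dist):.0f} px/degree')
# At 8 ft even 1080p comes out above ~60 px/degree, so 4K adds little;
# at 2 ft 1080p falls well short of it and the 4K difference is obvious.
```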
 
Got a 1440p monitor a few years ago from our friends in Korea and have just upgraded to a 980ti - this seems to be the sweet spot for this card, as you can crank the settings up and still get a smooth frame rate in most games.

It was a big jump for me from 1080p and I am very happy with my investments.
 
Likewise, I went from gaming at 1080p@120Hz to both 4K@60Hz and 1440p@120+Hz, and was much more impressed by the experience at 1440p. One or two games are immense at 4K, but overall it didn't feel like the best fit for me.
 
I went from 1200p to 1440p and I much prefer 1440p. Obviously it annoyed me when I couldn't play games on as high a setting as before, but now I have a 980ti so that doesn't factor in anymore.

Trying to go back to 1200 or even 1080 would be pretty tough I think.
 
That post reads awfully like you're sniping at people for having more money than you.

I have never tried 4K, but I learned years ago to place high-quality settings and good performance firmly ahead of high resolution.

Moore's law is the great equaliser. Meanwhile, there is probably an old game with good gameplay that doesn't especially need a new setup. I like old games that scale really well; I think Doom 3 is one of them.


CS:GO plays at 4K; how many fps can you get on that? It needs a lot, but in theory it's a basic game to run.
 
That post reads awfully like you're sniping at people for having more money than you.

Don't you hate it when people spend the money they've earned on a hobby they enjoy!?
People should make do with the bare minimum that they need, just to make sure nothing goes to waste or gets underutilised...
 
It all depends on the game. Try playing something like Half-Life 2 at 4K and the difference won't be great compared to 1080p. But try something like Crysis 3 and you'd be blind not to notice a massive difference in detail. Games like this have more texture, shader and geometry detail than 1080p can show, so when you go to 4K, that detail becomes visible.

Personally, I won't go anywhere near 4K until I can build a rig at a reasonable cost where I don't have to lower the quality settings to accommodate the higher resolution. Having a higher resolution only really allows you to see more detail. If the actual image being rendered looks crap, it will still look crap at 4K. The source material needs to look good first and foremost, good enough to saturate all the resolution available at 4K. It's like using a 144Hz monitor on a system that can only push out 50fps. It's not going to look any smoother than it would on a 60Hz monitor (apart from maybe a reduced appearance of tearing).
 
That post reads awfully like you're sniping at people for having more money than you.

Ha Ha, but not quite.

I am not that much better than the hapless gaming tech geek who puts far too much time, energy, and money into an activity that is best suited to teenage boys.

I know very well how this 'hobby' can exploit the OCD tendency in people, and sneering at people who are slightly worse affected than I am is my way of drawing a line under how much time and energy I am willing to put into PC gaming tech, and of course how much money I am willing to blow on all that stuff. (I might be a bit futile and pathetic, but at least I am not as bad as *them*.)

So there is that, and then there is the pragmatic aspect of 4K gaming: it is just not worth the outlay and the hassle. Even if money wasn't an issue, I would rather game at a maxed-out, solid 1080p60 than at sub-60fps 4K, with all kinds of graphical compromises having to be made, and worrying how my system will run the next AAA title that I really want to play.

Besides that, 1080p is just too damn 'compatible'. My gaming laptop is 1080p and runs games well at that resolution; I had it hooked up to my big LCD TV downstairs, which is also 1080p. However, since I am/was playing Witcher 3, I decided I wanted the graphical power of my desktop for that one, downstairs on the big LCD TV (1080p). I therefore moved my laptop upstairs to my 24" IPS monitor, which, you guessed it, is also 1080p, and I play online BF4 (100% maxed) and CS:GO through my laptop hooked up to all my desktop peripherals. Also, what is the standard resolution for Blu-ray and/or high-def video files? 1080p.

Best bang for buck, least hassle, best compatibility and longevity of equipment are undeniably found at 1080p. So for someone to aim for 4K right now, when they will pay through their bloody noses for a big pile of equipment (GPUs n panels n all) that will be classed as mid-range junk within 3 years or less, they either have both plentiful amounts of disposable income and spare time on their hands, or they suffer from some form of OCD illness. I suspect that most on here fall somewhere rather more towards the latter end of that spectrum than the former.

When 1080p first hit the scene, the wise money kept away from it. If you look at the earlier 1080p monitors or TVs from the mid 2000s, you will know that they absolutely suck by today's standards and if you bought one back then, you will know how extortionately expensive they were compared with modern day high end panels. Same thing will apply to 4K without a shadow of a doubt.
 
Remember, the total number of pixels means nothing really. A 40" 4K display is around 110 PPI; a 22" 1080p display is around 102 PPI. 4K is good for large screens as it allows a higher PPI (pixels per inch), meaning a better-quality picture. I've stuck with 22-24" 1080p monitors for my PC, as going any larger just results in a lower PPI and therefore a worse "quality" picture. You would see a much larger change if you used Dynamic Resolution to render a game at 4K on a 1080p display.
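To make the PPI arithmetic concrete, here's a minimal sketch: PPI is just the diagonal pixel count divided by the diagonal size in inches (the 28" case is an extra illustration of a desktop-size 4K panel, not a figure quoted above):

```python
import math

# Pixels per inch = diagonal resolution in pixels / diagonal size in inches.

def ppi(horizontal_px, vertical_px, diagonal_in):
    return math.hypot(horizontal_px, vertical_px) / diagonal_in

print(f'40" 4K:    {ppi(3840, 2160, 40):.0f} PPI')   # ~110 PPI, as above
print(f'22" 1080p: {ppi(1920, 1080, 22):.0f} PPI')   # ~100 PPI
print(f'28" 4K:    {ppi(3840, 2160, 28):.0f} PPI')   # ~157 PPI for a desktop-size 4K panel
```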
 
I went from a 1440 monitor to an ultrawide 3440x1440.
There's no difference in the picture; there's just more of it. I think it feels much more immersive on the wider screen.
I've got no intention of buying a 4k screen at the minute. My single Titan X probably isn't enough to do it justice.
 
This is a forum full of extremist gaming tech consumers who fall for every bank account raping, credit card maxing 'reason' to upgrade that they come across. One should take opinions such as the following with a pinch of salt:

  • "once you have tried 4K, you just can't go back"
  • "my GTX 980 plays everything great at 4K"
  • "You need to go Sli"
  • "My Sli configuration works flawlessly"
  • "you need Titans"
  • "Worth every penny!"
  • "Who needs two kidneys anyway!"

Still doing 1080p nicely with my Athlon X4 645 3.1GHz OC'd to 3.4GHz, 8GB RAM and HD 6970.

So no, not everyone does what you said.. ;)
 
1024x768 on an 8800GT right now, though I have a different setup that needs an RMA. I was hoping they'd put Crysis on for 50p or something, as I never played that when this card was king. I'm sure it's still good now.

Remember, the total number of pixels means nothing really. A 40" 4K display is around 110 PPI; a 22" 1080p display is around 102 PPI. 4K is good for large screens
That's a good point; my first CRT was 14" and mostly seemed fine for a very long time, as I was sitting two inches away from the screen.
 
TBH, I find 4k utterly pointless at this point in time. You have to invest heaps of money into hardware and have an oven in your case to maintain decent framerates at higher details.

Because, let's face it, running 4k at the expense of in-game quality settings is a completely daft trade-off. I'd take maxing games at 1080p over playing at medium at 4k any day. The resolution may be higher but it sure as hell won't compensate for the effects you had to disable to make games run acceptably.

1080p is still the most sensible choice. It looks very good and doesn't require monstrous hardware to play games nearly maxed.
Maybe it's because I'm not an avid gamer, so I'm more than happy with what my setup offers in terms of gaming quality. The point still stands, though.

4K is a pipe dream which is best left in peace until more powerful GPUs arrive and games start using textures that can actually hold their own at such ridiculously high resolutions.
 
Ha Ha, but not quite.

I don't entirely disagree with you - I tend to buy a little behind the bleeding edge and usually get 90+% of the performance for half the price.

The generic poking at SLI made me laugh a bit. For years I bought carefully selected SLI setups where scaling would be optimal, i.e. one card was already getting, say, 70% of my target framerate, to keep things smooth and make the most out of my 120Hz panels, and by and large I've had very few issues with my SLI setups (GTX 470 SLI, GTX 260 SLI, etc.). Sadly, in many cases they ran into VRAM headroom issues before running out of core performance, but I probably spent less than, or the same as, someone buying the next card up when it came to it: GTX 280 SLI on release would have set you back about £1100, and even after the price drop over £600, whereas 2x 260 and 2x 470 cost me £650 in total.
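As a rough illustration of that "one card already near target" rule of thumb, here's a minimal sketch; the 80% scaling figure and the 120 fps target are assumptions for the example, not measurements:

```python
# Rough two-card SLI estimate from a single-card result. Real scaling
# varies wildly with the game, driver profile and VRAM headroom.

def sli_estimate(single_card_fps, scaling_efficiency=0.8):
    """Estimated two-card fps; efficiency 1.0 would be perfect doubling."""
    return single_card_fps * (1.0 + scaling_efficiency)

target_fps = 120                    # e.g. a 120Hz panel
single_fps = 0.7 * target_fps       # one card already hits ~70% of target
print(f"Single card: {single_fps:.0f} fps")
print(f"SLI (assumed 80% scaling): {sli_estimate(single_fps):.0f} fps")
# ~84 fps on one card, ~151 fps in SLI under these assumptions -
# comfortably past the 120 fps target.
```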
 
With the next machine I'm building, I'm looking to go 1440p ultrawide, as I feel 21:9 is more visually impressive than just 4K, and then eventually upgrade to an ultrawide 4K screen when it becomes a bit more mainstream.
 
With the next machine I'm building, I'm looking to go 1440p ultrawide, as I feel 21:9 is more visually impressive than just 4K, and then eventually upgrade to an ultrawide 4K screen when it becomes a bit more mainstream.

+1

I only bought my super ultrawide last year and, going by my past monitor upgrades, I do it about every 4 years. So in maybe 3 years' time I'll have to see if the hardware is up to 4K. Even then, I think I could never give up a super-wide monitor now.
 
Ha Ha, but not quite.

I am not that much better than the hapless gaming tech geek who puts far too much time, energy, and money into an activity that is best suited to teenage boys.

I know very well how this 'hobby' can exploit the OCD tendency in people, and sneering at people who are slightly worse affected than I am is my way of drawing a line under how much time and energy I am willing to put into PC gaming tech, and of course how much money I am willing to blow on all that stuff. (I might be a bit futile and pathetic, but at least I am not as bad as *them*.)

So there is that, and then there is the pragmatic aspect of 4K gaming: it is just not worth the outlay and the hassle. Even if money wasn't an issue, I would rather game at a maxed-out, solid 1080p60 than at sub-60fps 4K, with all kinds of graphical compromises having to be made, and worrying how my system will run the next AAA title that I really want to play.

Besides that, 1080p is just too damn 'compatible'. My gaming laptop is 1080p and runs games well at that resolution; I had it hooked up to my big LCD TV downstairs, which is also 1080p. However, since I am/was playing Witcher 3, I decided I wanted the graphical power of my desktop for that one, downstairs on the big LCD TV (1080p). I therefore moved my laptop upstairs to my 24" IPS monitor, which, you guessed it, is also 1080p, and I play online BF4 (100% maxed) and CS:GO through my laptop hooked up to all my desktop peripherals. Also, what is the standard resolution for Blu-ray and/or high-def video files? 1080p.

Best bang for buck, least hassle, best compatibility and longevity of equipment are undeniably found at 1080p. So for someone to aim for 4K right now, when they will pay through their bloody noses for a big pile of equipment (GPUs n panels n all) that will be classed as mid-range junk within 3 years or less, they either have both plentiful amounts of disposable income and spare time on their hands, or they suffer from some form of OCD illness. I suspect that most on here fall somewhere rather more towards the latter end of that spectrum than the former.

When 1080p first hit the scene, the wise money kept away from it. If you look at the earlier 1080p monitors or TVs from the mid 2000s, you will know that they absolutely suck by today's standards and if you bought one back then, you will know how extortionately expensive they were compared with modern day high end panels. Same thing will apply to 4K without a shadow of a doubt.


I agree and all, but you forget: if it weren't for these "OCD types" who plow money into every new tech/venture, we wouldn't get every new tech/venture, as it's that initial money, plowed into whatever new tech it may be, that funds the rest of its development, which makes it cheaper for the rest of us.
 