Hello, if I buy a 4K TV, do you get an option to select 2560x1440 (1440p)?
Thanks
You could run 1440p on a 4K screen. It wouldn't look very good though; it would look a bit blurred.
1080p would work better due to the pixel scaling ratio: 1080 is exactly half of a 4K screen's 2160 lines, so each 1080p pixel can map onto a clean 2x2 block of panel pixels, whereas 1440p doesn't divide evenly (2160/1440 = 1.5) and the scaler has to interpolate.
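A quick bit of arithmetic shows why (an illustrative Python snippet, not anything the TV itself runs):

# Vertical lines of the panel vs. the source signal.
panel = 2160
for source in (1080, 1440):
    ratio = panel / source
    if ratio.is_integer():
        print(f"{source}p -> {panel}p: {ratio:g}x, integer, clean pixel duplication")
    else:
        print(f"{source}p -> {panel}p: {ratio:g}x, non-integer, the scaler must interpolate")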
Depending on how good the TV is and how close you're sitting to it, it can be quite difficult to discern much difference between 4K and 1080p, so I wouldn't get too uptight about a high res, particularly if you're playing games from your couch.
That doesn't sound very intuitive. Why would they leave out such a simple and obvious feature? And isn't one of the core features of TVs' "Computer modes" to bypass the scaler's post-processing, so smoothing etc. shouldn't be an issue?

Every resolution on a TV goes through the scaler, with all the artefacts and smoothing that brings. There is no 1:4 pixel mapping (1080p -> 2160p). It's why many cheap TVs look fine at 4K but absolutely atrocious with normal SD/HD TV content.
Integer-ratio scaling is something many people have asked AMD and Nvidia for over the years. There's even a petition set up.
http://tanalin.com/en/articles/lossless-scaling/
[Feature Request] Nonblurry Fullscreen Upscaling at Integer Ratios
At integer scaling ratios, full-screen upscaling should be done just by duplicating pixels, with no blur at all. For example, a Full HD (1920×1080) image could be displayed on a 4K display (3840×2160) with no blur, just by displaying each image pixel as a group of exactly four (2×2) identical physical pixels with no interpixel diffusion whatsoever.
In contrast, full-screen upscaling via the graphics driver is currently blurry even when the scaling ratio is an integer (e.g. 2x, 3x, 4x). For FHD (1920×1080) and HD (1280×720) images on 24-27-inch 4K (3840×2160) monitors, where individual image pixels are almost indistinguishable, such blur unreasonably decreases perceptible sharpness without adding anything useful.
There should be a driver option to disable the blur, or at least to switch between bilinear/bicubic and nearest-neighbour interpolation.
For a better understanding of what nonblurry integer-ratio scaling is, please see the demo. Thanks.
See also the “Nonblurry integer-ratio scaling” article, which attempts to explain the blur issue and to collect and summarize all the important information about it, with nonblurry integer-ratio scaling by pixel duplication as the solution.
See also a corresponding petition on Change.org. https://www.change.org/p/nvidia-amd-nvidia-we-need-integer-scaling-via-graphics-driver
Update (2017-07-06): The feature is now supported by nVidia GeForce driver 384.47 (Beta) for Linux via the “Nearest” transform filter.
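To make the pixel-duplication idea concrete, here's a rough sketch of what nearest-neighbour upscaling at an integer ratio actually does (my own illustration in Python/NumPy, not the driver's real code):

import numpy as np

def integer_upscale(image, factor):
    # Duplicate every pixel into a factor x factor block: pure
    # nearest-neighbour at an integer ratio, so no blur is added.
    return image.repeat(factor, axis=0).repeat(factor, axis=1)

# A 1080p RGB frame becomes a 4K frame at exactly 2x:
frame_1080p = np.zeros((1080, 1920, 3), dtype=np.uint8)
frame_4k = integer_upscale(frame_1080p, 2)
print(frame_4k.shape)  # (2160, 3840, 3)

You can't do this for 1440p -> 2160p: the ratio is 1.5, and you can't duplicate a pixel one and a half times, which is why non-integer ratios go through bilinear/bicubic filtering and pick up blur.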
Thanks guys, I'll be sitting about 7 feet away from the screen. I just don't want to game at 1080p after using 1440p for some time now. Surely 1440p on a 4K screen will still look much better?