NVIDIA Volta with GDDR6 in early 2018?

A lot of stuff isn't even that - loads of it is 480p at 1690kbps, and some is a weird in-between resolution of around 540p at ~2500kbps.

EDIT: Edge does indeed have higher bitrates - though I've not found anything above 4860kbps so far, and most of it looks pretty much identical to the same video in Firefox at 2490kbps :s
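To put rough numbers on it, here's a quick back-of-the-envelope bits-per-pixel sketch (assuming 30fps streams and guessing 960x540 for the odd in-between one; the Edge/Firefox comparison is just a bitrate ratio since I don't know the exact resolution there):

```python
# Rough bits-per-pixel comparison of the stream figures quoted above.
# Frame rate and the exact dimensions of the ~540p stream are assumptions.

def bits_per_pixel(width, height, kbps, fps=30):
    """Average bits available per pixel per frame at a given bitrate."""
    return kbps * 1000 / (width * height * fps)

print(f"480p  @ 1690 kbps: {bits_per_pixel(854, 480, 1690):.3f} bits/px/frame")
print(f"~540p @ 2500 kbps: {bits_per_pixel(960, 540, 2500):.3f} bits/px/frame")

# At the same (unknown) resolution, Edge's 4860 kbps stream simply has
# roughly twice the bit budget of the 2490 kbps Firefox stream.
print(f"Edge vs Firefox bitrate ratio: {4860 / 2490:.2f}x")
```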
 
The original argument wasn't 4K vs high refresh - my claim was that, compared to previous jumps in resolution, where people almost universally appreciated the extra screen estate, opinion is more split at 4K, and for reasons beyond just the refresh rate or the hardware needed to power it.
Yeah. Due to the bigger jump in hardware requirements and not being able to have higher refresh rates, some people have chosen to go into denial about the image quality benefits that 4K brings. Either that or they need to go to Specsavers :p
 

You jest - but a not insignificant number of people who've looked at my panels and gone for a lower resolution have done so due to eyesight, and because they don't want to go above a ~28" monitor on their desks. Windows just isn't there yet with UI scaling to fully take advantage of the extra resolution purely for making stuff look nicer via text anti-aliasing, etc., and people underestimate how much desktop software still works at a level where you are interacting not that far from a per-pixel basis (so there is a limit to how fast that will change), unlike with Android, etc.
 
I have found the latest version of Windows 10 has helped with that, but it's still not perfect.
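For anyone wondering what "not there yet" means in practice: apps have to opt in to DPI awareness themselves, or Windows just bitmap-stretches them. A minimal sketch of that opt-in, assuming Windows 10 and Python with ctypes (standard shcore/user32 calls):

```python
# Minimal sketch of opting a process into per-monitor DPI awareness on Windows.
# Apps that don't do this get bitmap-stretched by the OS on a scaled 4K desktop.
import ctypes

PROCESS_PER_MONITOR_DPI_AWARE = 2  # value from the shellscalingapi.h enum

try:
    # Windows 8.1+; must be called before any window is created.
    ctypes.windll.shcore.SetProcessDpiAwareness(PROCESS_PER_MONITOR_DPI_AWARE)
except (AttributeError, OSError):
    # Older fallback: system-wide awareness flag.
    ctypes.windll.user32.SetProcessDPIAware()

# Once aware, the app can query the real DPI and scale its own UI.
dpi = ctypes.windll.user32.GetDpiForSystem()  # 96 = 100%, 192 = 200% scaling
print(f"System DPI: {dpi} ({dpi / 96:.0%} scaling)")
```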
 
That is the thing - with previous jumps in resolution it was almost purely about gaining extra pixel screen estate, but that isn't quite the case with 4K, where increasingly there is an element of using the extra pixels to increase the density, and hence the quality, of on-screen elements. Which makes for a more complex story.
 
Be that as it may, the way I see it is this: I upgrade my graphics card every year or two on average. The main reason I do it is for better graphics in games. I am not fixated on fps - as long as it feels smooth to me, I couldn't give a monkey's if it is 60 or 30fps.

Some people have only been moving to 1440p in the past couple of years, but I had 1600p nearly a decade ago and felt we were stuck at that resolution, so I was very happy at the arrival of 4K. The only downside to it for me is Windows not being able to handle scaling as well as macOS does. But it keeps improving and I am happy at the moment using 200% scaling.
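To put numbers on the density-vs-estate trade-off, here's a quick sketch of what a 4K panel gives you at the usual Windows scale factors (nothing monitor-specific, just the arithmetic):

```python
# Logical desktop estate a 3840x2160 panel gives at common Windows scale factors.
# At 200% you get a 1080p-sized workspace, but every UI element is drawn with
# 4x the pixels - the density-vs-estate trade-off discussed above.

NATIVE_W, NATIVE_H = 3840, 2160

for scale in (1.00, 1.25, 1.50, 2.00):
    logical_w = round(NATIVE_W / scale)
    logical_h = round(NATIVE_H / scale)
    print(f"{scale:.0%} scaling -> {logical_w}x{logical_h} workspace, "
          f"{scale ** 2:.2f}x pixels per UI element")
```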
 
I do not use AA, which saves fps, and I also turn off all the undesirable effects like depth of field, motion blur, etc.

AA is required less and less at higher resolutions, as the resolution itself helps to combat jaggies. Supersampling (SSAA) is a form of AA that just renders at a higher resolution before downscaling the image to fit the resolution in use - and that is how I play my games anyway. Heck, jaggies are even less noticeable if you're sitting further away from the screen (as you tend to with larger screens).
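If it helps make the "render high, scale down" point concrete, here's a minimal NumPy sketch of plain supersampling (render at 2x, average each 2x2 block down to one output pixel). It's only the principle - actual driver DSR/VSR uses fancier filters:

```python
# Minimal supersampling sketch: render at 2x the target resolution, then
# box-filter each 2x2 block down to one output pixel.
import numpy as np

def downsample_2x(rendered: np.ndarray) -> np.ndarray:
    """rendered: (2H, 2W, 3) float image -> (H, W, 3) averaged image."""
    h2, w2, c = rendered.shape
    return rendered.reshape(h2 // 2, 2, w2 // 2, 2, c).mean(axis=(1, 3))

# Stand-in for a frame "rendered" at 2x a 1920x1080 target.
hi_res = np.random.rand(2160, 3840, 3)
output = downsample_2x(hi_res)
print(output.shape)  # (1080, 1920, 3)
```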

With Volta, 4K power will be here imo. But there will always be a handful of people who need a minimum of 100fps, in which case they will have to wait another 2-3 years.

Agreed, a Volta 1180/2080 (whatever you want to call it) will probably be the comfortable spot for 4K, giving a little bit of extra headroom over 60fps. As for higher refresh rates/fps... that's probably me lol. At least with a monitor, I won't buy a 4K screen till the 144Hz models become more common. GPUs aren't quite there yet for 4K, but I feel that way about 1440p, which is why I'm hesitant over a 1080 Ti or Vega.
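A crude way to compare the targets being weighed up here is raw pixels per second - it ignores that per-pixel cost varies with settings, but it shows why 1440p at 144Hz sits in the same ballpark as (actually slightly above) 4K at 60Hz:

```python
# Crude pixels-per-second comparison of the display targets discussed above.
targets = {
    "1080p @ 60 Hz":  (1920, 1080, 60),
    "1440p @ 144 Hz": (2560, 1440, 144),
    "4K    @ 60 Hz":  (3840, 2160, 60),
    "4K    @ 144 Hz": (3840, 2160, 144),
}

base = 1920 * 1080 * 60  # 1080p60 as the reference point

for name, (w, h, hz) in targets.items():
    pps = w * h * hz
    print(f"{name}: {pps / 1e6:6.0f} Mpixels/s ({pps / base:.1f}x 1080p60)")
```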

You can run Forza 6 at 4K maxed with an RX 480, so it proves nothing about what a card can or cannot do. It really needs to be a far more demanding game to have any credibility.

Indeed, some games are optimised very well and others are not so well coded for PC. I remember someone in the PC Gaming section of these very forums (or perhaps this section?) claiming that higher GPU grunt just gives devs an excuse to put in less effort and be sloppy with game optimisation, and that some devs already do. All I know is, the visuals in games like Ghost Recon Wildlands and Dishonored 2 don't seem to justify their increased performance demands on PC.
 
I am expecting a single Volta Ti-based card in 2019 to give the same performance as two 980 Tis in SLI.
 
I appreciate it isn't for everyone, but having owned many different monitors, I can honestly say that 'ultra wide screen' has blown me away. I have owned 4K, and whilst the IQ was very good, I can't say that my eyes picked up much of a noticeable difference between 4K and 1440p, and I would choose 144Hz over 60Hz 4K every day of the week. Of course that is my opinion and my opinion alone, but UWS IPS at 100Hz is a sight to behold.
 
Yeah, everyone is different and has different preferences. The next monitor I can see truly blowing me away will be an OLED one. The colours on that new Dell OLED are so nice. I am waiting for a 4K OLED 120Hz FreeSync 2/G-Sync monitor with HDR to come out. That will be the ultimate monitor for me and will likely last a good 5 years at least. With OLED we will have no more backlight bleed or IPS glow, and even better colours :D

In the meantime I may even give ultrawide another go if I fancy something new while waiting. But chances are I would end up sending it back once I see the lower IQ, like I did in the past. If I can live with it, it would be nice though, as I could go back and play some older games in 21:9 for a new experience :)


Damn, y'all are still going on about this? I was hoping if I stepped away, the damage would be reduced. lol
:D
 

It's a very common opinion, and one that's got me thinking 1440p 144Hz over 4K 60Hz. Then there is the ultrawide option like you say, which is where I think I will go. It's the support that has me worried. I am in no rush atm, so by the time I do get around to it by the end of this year I think I will have a more informed opinion. Hopefully I can get an ultrawide, high refresh rate monitor with FreeSync 2. Will see how rich I am feeling at the time.
 

Kind of my original point though - come the leap to 4K, we are seeing far more people taking different options, rather than it just being the next logical step upwards, and for much more diverse reasons than purely performance or refresh rate. GPU manufacturers can't simply say "we will concentrate on 4K and leave it at that, because that is where everyone is going to be in 2-3 years' time".
 
I am not sure why this would even be a thing for GPU manufacturers. They just keep pumping out better cards with more performance. Maybe you meant game developers?
 

The inference I was responding to was that 4K would be where the optimisation/focus would land before long - it wasn't a claim on my side.
 
In the near future that will be the case for game developers imo. Right now they all target 1080p; soon they will switch straight from that to 4K. This will mainly start happening around when the PS5 is out. As for GPU manufacturers, they will keep making more powerful cards, and the ones more powerful than today's will likely be aimed at 4K, as the fastest cards today, like the Titan Xp and 1080 Ti, already comfortably deal with anything under 4K imo.

Currently I can understand why some people are unable to see the difference with 4K, or see very little. In my experience, in some games it is much easier to appreciate the difference 4K brings than in others. When game developers finally make games targeting 4K, there will be a very noticeable difference in IQ vs 1440p in every new game made. By then the GPU power will be there for those who need the extra fps, and so will 144Hz 4K monitors, at which point most people here will likely start making the switch over.
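For what it's worth, pixel density is the easiest way to reason about when the IQ difference shows up - a quick sketch with some common resolution/size combinations (the screen sizes are just illustrative examples, not anyone's specific monitor):

```python
# Pixel density (PPI) for some common resolution/size combinations.
# Screen sizes are illustrative examples only.
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch for a given resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

examples = [
    ("1440p     27in", 2560, 1440, 27),
    ("4K        28in", 3840, 2160, 28),
    ("4K        40in", 3840, 2160, 40),
    ("3440x1440 34in", 3440, 1440, 34),
]

for name, w, h, d in examples:
    print(f"{name}: {ppi(w, h, d):3.0f} PPI")
```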
 