NVIDIA Volta with GDDR6 in early 2018?

Yeah right Gregster, you have had about 4 Titans by my count mate ;)

I have not had one yet, let me catch up no? Haha :)
3 original Titans and 2 Maxwell Titans. Certainly no plans to go SLI until they can get it running like a single GPU though.
 
3 original Titans and 2 Maxwell Titans. Certainly no plans to go SLI until they can get it running like a single GPU though.
Yeah, I remember your original 3, but I never knew you had 2 Maxwells; thought it was just one.

I won't touch multi-GPU; did that with a 295X2 and was forever at the mercy of drivers with new games. It was great when it worked though. Was 4K gaming on Crysis 3 and loved the graphics :D
 
So you do care for 4K gaming?
Because if you didn't care at all, you would say "couldn't care less".
I thought the same. But I thought I would leave it out :P

Plus, at the end of the day, most people who make out like 4K is nothing special will be on 4K in the next few years and will be singing its praises :D
 
mGPU is pretty much dead in the water atm imo. Shame, had a good run.
If the future is multiple smaller dies on a single PCB, then perhaps multi-GPU (though not the same thing) could itself make a comeback.

I guess it depends on whether they can actually make this vision of the future work properly. But then they don't have much choice, as we approach the limit of how small we can make the transistors.

Multi-GPU could well come back.
 
If the future is multiple smaller dies on a single PCB, then perhaps multi-GPU (though not the same thing) could itself make a comeback.

I guess it depends on whether they can actually make this vision of the future work properly. But then they don't have much choice, as we approach the limit of how small we can make the transistors.

Multi-GPU could well come back.

Exactly that: for many small dies to work, they have to be seamless to the end user, and from what we have seen so far, multi-GPU has quite a long way to go. But of course there are some very clever people working on these things, so they will get it right eventually.
 
I'm not so sure on that. 4K is a bit marmite: some love it, some prefer 1440p, and not just because of refresh rates.

Sorry, did you vomit nonsense or is it just me? How on earth can ANYONE prefer 1440p over 4K when refresh rate doesn't come into it? Of course people would prefer 4K at 60, 100 or 144 over 1440p at 60, 100 or 144. That's not even a comparison.
 
Sorry, did you vomit nonsense or is it just me? How on earth can ANYONE prefer 1440p over 4K when refresh rate doesn't come into it? Of course people would prefer 4K at 60, 100 or 144 over 1440p at 60, 100 or 144. That's not even a comparison.

Vomit nonsense, LOL... what is it with people? Maybe spend a while reading the monitor forum or something. 4K isn't for everyone.
 
Sorry, did you vomit nonsense or is it just me? How on earth can ANYONE prefer 1440p over 4K when refresh rate doesn't come into it? Of course people would prefer 4K at 60, 100 or 144 over 1440p at 60, 100 or 144. That's not even a comparison.

Must be just you then. To state "how can ANYONE" prefer one thing over another is laughable; there will always be people out there who prefer one thing to another, for some reason or other. And you accuse Rroff of vomiting nonsense, as you put it. :rolleyes:
 
Must be just you then. To state "how can ANYONE" prefer one thing over another is laughable; there will always be people out there who prefer one thing to another, for some reason or other. And you accuse Rroff of vomiting nonsense, as you put it. :rolleyes:

It's akin to someone preferring a 1 bedroom apartment in Luton over a 10 bedroom mansion in Beverly Hills... or someone who prefers a 20p an hour salary over a 20k per hour salary. You can prefer one over the other, but there is only ONE obvious choice that doesn't make you seem like a complete dumbass :)
 
Maybe they meant it from a performance standpoint, with having one card in their system to power a 1440p res. Otherwise it would be a bit mind-boggling.
 
Maybe they meant it from a performance standpoint, with having one card in their system to power a 1440p res. Otherwise it would be a bit mind-boggling.
Also, some say it without realising, because that is what they have and they do not want to feel like they are missing out on anything.

The way I see it is: if you want better graphics, then 4K is no doubt better; I love the sharpness. But if you are happy with less IQ in order to have a higher refresh rate, then 1440p is a no-brainer. There is no right or wrong choice, just what suits you best. I hardly ever play online and love better graphics, so I have had 4K for three years now and love it.
 
You can't detach cost/benefit from any comparison, because (almost) nobody is getting their monitor given to them for free :p

Thus, yes, it's possible to prefer 1440p to 4K from a cost/benefit perspective.
 
It's akin to someone preferring a 1 bedroom apartment in Luton over a 10 bedroom mansion in Beverly Hills.

The cleaner, or the painter and decorator, springs to mind. Bottom line is, you cannot just use blanket statements such as "how can ANYONE", as there will always be exceptions to what most would expect. :)
 
It's akin to someone preferring a 1 bedroom apartment in Luton over a 10 bedroom mansion in Beverly Hills... or someone who prefers a 20p an hour salary over a 20k per hour salary. You can prefer one over the other, but there is only ONE obvious choice that doesn't make you seem like a complete dumbass :)

It is nothing like that. Up to and including 1440p-type resolutions, extra resolution on a PC/Windows platform is generally used to increase the amount of screen estate. Once you go above that, the extra resolution is most effective when used to increase the density of UI elements: icons represented using, say, 256x256 pixels instead of 64x64, and text looking crisper with better-defined curves, while proportionally using the same space in the UI.

This isn't a straightforward story with Windows, as a lot of legacy use is designed around working with pixels 1:1, Windows DPI scaling is somewhat less than great, and making the best use of 4K resolution really requires a 40+ inch display, while there are many reasons people don't want, or don't have space for, anything bigger than the normal range of 24-30".

If you play a lot of older games, and even quite a few recent games, the UI doesn't scale well at 4K, leaving you with problems like tiny UIs or elements poorly positioned on the screen. And in some games (something that is never really going to be solved) it's hard to find a good compromise for mouse input that lets you retain precision when making small mouse movements while effortlessly covering a huge number of pixels at the same time.

This isn't going to be the same story for everyone, but it certainly makes 4K somewhat marmite as things stand. While it would be nice to have all those extra pixels, the reality is there might not be a solution that satisfies every user's requirements for Windows desktop usage, compared to other OSes like Android where 4K can be useful even on relatively small devices.

Dunno why I dignified it with a longer response, but maybe next time try asking people what their reasoning is before whirling at them like some kind of ****.
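
(A quick back-of-the-envelope sketch of the pixel-density point above, assuming typical panel sizes; the figures are illustrative, not taken from the thread.)

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch for a panel of the given native resolution and size."""
    return math.hypot(width_px, height_px) / diagonal_in

# A 40" 4K panel has roughly the same pixel density as a 27" 1440p panel,
# which is why 4K at 100% scale tends to want a 40+ inch display.
print(f'27" 1440p: {ppi(2560, 1440, 27):.0f} PPI')  # ~109 PPI
print(f'27" 4K:    {ppi(3840, 2160, 27):.0f} PPI')  # ~163 PPI
print(f'40" 4K:    {ppi(3840, 2160, 40):.0f} PPI')  # ~110 PPI
```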

Maybe they meant it from a performance standpoint, with having one card in their system to power a 1440p Res. Otherwise it would be a bit mind boggling.

Performance can be a reason, but ultimately a more temporary one, as GPUs slowly catch up with increases in resolution. There was a time when 1080p, for instance, was looked at from a performance standpoint the way 4K is today.

I think people miss that previous increases in resolution in Windows were used to increase the amount of real estate you are working with on the screen, but at 4K-type resolutions that largely becomes less useful. A lot of people will simply run 125-150% UI scale, using the increased pixels to make things look nice while retaining the same amount of usable screen estate. That comes with a split in what people use an OS for, especially as Windows is a bit hit and miss when it comes to DPI scaling, etc.
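
(To put the 125-150% scaling point in rough numbers; a minimal sketch, and the helper name here is made up for illustration, not a Windows API.)

```python
def effective_workspace(width_px: int, height_px: int, scale_pct: int) -> tuple[int, int]:
    """Approximate usable desktop 'real estate' once UI scaling is applied."""
    factor = scale_pct / 100
    return round(width_px / factor), round(height_px / factor)

for scale in (100, 125, 150):
    w, h = effective_workspace(3840, 2160, scale)
    print(f"4K at {scale}% scale -> {w} x {h} effective")

# 100% -> 3840 x 2160, 125% -> 3072 x 1728,
# 150% -> 2560 x 1440: the same usable area as a native 1440p desktop.
```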
 