
Geforce GTX1180/2080 Speculation thread

1080Ti can do it already if you are happy to tweak settings a little in a few really demanding games. 1180 should be able to do it just fine. But obviously not at max settings, e.g. 8x MSAA. If people want that then they will have to wait another 2-3 gens of cards. Lol.

I have a 1080Ti (since launch), had a UHD screen for about a week, got rid of it. I couldn't run anything at the time at an acceptable framerate without lowering IQ settings (not talking AA here). Also, it was 60Hz.

Hopefully the next generation **80Ti will change matters and we see more 120Hz+ UHD screens.
 
Indeed. As a 4K gamer I would be more than happy with 2X MSAA.

But many people here must play games maxed out (otherwise what’s the point, right?) so they would end up with low fps and then complain. Lol. What they don’t understand is that after a certain point there are diminishing returns at 4K. You don’t necessarily need to max everything out at 4K to make it look great. As I have explained a few times here now, things look better with less AA on “Very High” settings at 4K compared to 1440p with more AA applied at “Ultra” settings.

Not referring to you btw Kaaps :)

One of the reasons I tend to use 8xMSAA is that it is very good at flattening the min/avg/max fps, as swings between very high and very low fps can be quite annoying.

You can also get very high/very low fps, and even a dip in performance, if people overclock the CPU too high, as it is not running at its most efficient. :)

I am too poor to buy a G-Sync or FreeSync monitor. :)
 
I often talk about "max detail" or max settings in games; however, I must admit I personally do not include AA in this.

I am probably not going to word this correctly, but to me anti-aliasing is almost something separate from the game. I think of game detail in terms of draw distance, object detail and lighting effects.

So whilst I am one of those looking for max detail at 4K from my next GPU, I would be happy to tone down the AA and not worry about it.
 
I have a 1080Ti (since launch), had a UHD screen for about a week, got rid of it. I couldn't run anything at the time at an acceptable framerate without lowering IQ settings (not talking AA here). Also, it was 60Hz.

Hopefully the next generation **80Ti will change matters and we see more 120Hz+ UHD screens.
Thing is, even with IQ settings one notch down, 4K still looks better than 1440p.

The 1180Ti should indeed do the job I would imagine, until games like Cyberpunk 2077 come along anyway :p

If tweaking settings is a no-no to people, then I would suggest staying at 1440p until a 7nm Ti comes out, at the very least.
 
Depending on the game and settings used, I will generally disagree.
Could be, I don’t know what games you play. For the most part I don’t play first-person shooters, which are very popular for example, so I cannot speak for those games. The only one I have played recently, purely for the campaign, is Doom, and for sure I would stick with my comment for that game.

In most games you need to take a screenshot, even at 1440p, to see the difference between Ultra and Very High, let alone at 4K, which is so much sharper and better. There are many videos on YouTube documenting this, by the way.
 
Thing is, there is no correct monitor size that covers all games.

Some like high fps at a lower resolution like 1440p, and some prefer lower fps at 2160p.

If you also take the aspect ratio of monitors into account, there is no way one size can fit all.
 
Thing is, there is no correct monitor size that covers all games.

Some like high fps at a lower resolution like 1440p, and some prefer lower fps at 2160p.

If you also take the aspect ratio of monitors into account, there is no way one size can fit all.

Pretty much. I can't imagine playing BF1, for instance, on a 40" 4K panel. 28" 1440p 144Hz is perfect, thanks. Whereas Civ5 or Witcher 3 are brilliant on larger displays. Definitely no magic bullet here.
 
True

To be honest I prefer playing all games on my 65" OLED TV. It may max out at 60Hz, but there's no PC monitor on earth that can match the picture quality.

Yeah it's nice to just kick back sometimes. Spiderman on PS4 Pro will probably look ace on that. But when you're getting flanked and shot from 150-200m away and you can't see them because you're the same distance away from your TV, that's just not fun.
 
Depends what the settings are, to be fair. But I'd argue lower IQ/features at high resolution is equivalent to polishing a turd.
But we said what the settings were: Ultra and Very High. Basically the highest and one lower. In most games you already need to take a still screenshot to actually notice a difference between the two, and that is at 1440p. Therefore I don’t see how you are concluding this.

I mean I don’t enjoy having to waste money on more expensive graphics cards for the sake of it. If there was not a big difference between 1440p and 2160p, I would have just stayed at 1440p and kept my GTX 1070 and not needed to upgrade for years. But once I saw the difference I could not go back personally.


I'd argue 4K/High looks considerably better than 1440p Very High.
In some games this is indeed the case. That sharpness you get from 4K just makes most games look so much better.
 
Yeah it's nice to just kick back sometimes. Spiderman on PS4 Pro will probably look ace on that. But when you're getting flanked and shot from 150-200m away and you can't see them because you're the same distance away from your TV, that's just not fun.

I don't really play PC multiplayer games now, but yes I can see why a proper monitor is better for that.
 
But we said what the settings were: Ultra and Very High. Basically the highest and one lower. In most games you already need to take a still screenshot to actually notice a difference between the two, and that is at 1440p. Therefore I don’t see how you are concluding this.

Haha! How brilliantly vague. I think this conversation has run its course now :D
 
Haha! How brilliantly vague. I think this conversation has run its course now :D

I think the point was that in motion, the difference between ultra, very high and high is often minimal, sometimes to the point where you have to pixel peep screenshots to see the difference.
 
I think the point was that in motion, the difference between ultra, very high and high is often minimal, sometimes to the point where you have to pixel peep screenshots to see the difference.
Let’s be honest, he understood what I was saying; it's not like it was cryptic. He highlighted something and called it vague, but ignored the next sentence, which left zero vagueness about what was being said.

I guess he had nothing to come back with but wanted to have the last word, can’t think of anything else. Lol.
 