This is why Titans and Tis are not 'overkill' for 1080p

This is what I just got on a stock EVGA SC TX
The Witcher 3 maxed
1080p

Frames, Time (ms), Min, Max, Avg
14274, 220094, 53, 80, 64.854

I don't think I would like to play the game maxed @1080p as it looks very rough compared to 2160p maxed.
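
For anyone reading the Fraps-style numbers above: the average is just total frames divided by elapsed time. A minimal sanity check in Python, with the figures copied from the run above:

```python
# Fraps-style benchmark summary: average FPS = total frames / elapsed seconds.
frames = 14274
time_ms = 220094

avg_fps = frames / (time_ms / 1000)
print(f"average: {avg_fps:.3f} fps")  # 64.854, matching the reported Avg
```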

The game was created for 1080p; anything higher and you're just upsampling, especially if you're on a monitor smaller than 40". It's the same method as what is found in the Nvidia Control Panel.
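
To put rough numbers on the monitor-size point, pixel density is just the diagonal pixel count divided by the diagonal size in inches. A quick sketch (the panel sizes here are picked purely for illustration):

```python
import math

# Pixel density (PPI) = diagonal length in pixels / diagonal size in inches.
def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    return math.hypot(width_px, height_px) / diagonal_in

# Illustrative sizes only: a 24" 1080p panel vs a 32" 2160p panel.
print(f'24" 1080p: {ppi(1920, 1080, 24):.0f} PPI')  # ~92 PPI
print(f'32" 2160p: {ppi(3840, 2160, 32):.0f} PPI')  # ~138 PPI
```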
 
This is what I just got on a stock EVGA SC TX
The Witcher 3 maxed
1080p

Frames, Time (ms), Min, Max, Avg
14274, 220094, 53, 80, 64.854

I don't think I would like to play the game maxed @1080p as it looks very rough compared to 2160p maxed.

53 min FPS?

Best to get a second card so!

You also need at least two 980 Tis at 1440p to play the Witcher 3 at Ultra for a minimum of 60 fps. Even then, in some scenes it can dip below that.
 

Just speaking in terms of assets, not every game is designed to look good at higher resolutions. Typically the higher you go, the more jaggies you are likely to see. The more jaggies, the more AA is needed, and more AA... well, you get the picture.

Works the other way too. Games look good at their native resolution, which is really the way they were created.
 
The game was created for 1080p; anything higher and you're just upsampling, especially if you're on a monitor smaller than 40". It's the same method as what is found in the Nvidia Control Panel.

Err no

I am quite happy maxing it out @2160p on my 32" monitor and it looks a lot better than 1080p; it also runs smoother @2160p. :)
 
Just speaking in terms of assets, not every game is designed to look good at higher resolutions. Typically the higher you go, the more jaggies you are likely to see. The more jaggies, the more AA is needed, and more AA... well, you get the picture.

Yeah but that's not what I lol'd at.

You basically said anyone playing above 1080p is just using upscaling the same as DSR. That's not the case: native-resolution 1440p or 4K monitors look far better than upscaling to the same resolution on a monitor that is natively 1080p.
 
Err no

I am quite happy maxing it out @2160p on my 32" monitor and it looks a lot better than 1080p; it also runs smoother @2160p. :)

Yeah, running the Witcher 3 at 1080p on a 1440p monitor already looks like hell.
Can't imagine how terrible 1080p upscaled (stretched) to 2160p would look.
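
Part of why 1080p on a 1440p panel looks so rough is the scale factor: 1080p-to-1440p is a non-integer 1.33x, so source pixels get smeared across panel pixel boundaries, while 1080p-to-2160p is an exact 2x. A quick check (a sketch only; in practice most scalers use bilinear filtering rather than integer scaling, so even the 2x case can look soft):

```python
# Scale factor when displaying a 1080p image on higher-resolution panels.
# Non-integer factors force interpolation across pixel boundaries; an exact
# integer factor could in principle map each source pixel to an n-by-n block.
for name, native_height in (("1440p", 1440), ("2160p", 2160)):
    factor = native_height / 1080
    kind = "integer" if factor.is_integer() else "non-integer"
    print(f"1080p -> {name}: {factor:.2f}x ({kind})")
```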
 
53 min FPS?

Best to get a second card so!

You also need at least two 980 Tis at 1440p to play the Witcher 3 at Ultra for a minimum of 60 fps. Even then, in some scenes it can dip below that.

I like high fps too but I'm not THAT fussy; I think minimums of 40+ fps are more than acceptable. Tbh, with the price of GPUs these days they should be handling 1440p and 1080p better than they do now. Though 1440p is getting there, single-card wise.
 
I like high fps too but I'm not THAT fussy; I think minimums of 40+ fps are more than acceptable. Tbh, with the price of GPUs these days they should be handling 1440p and 1080p better than they do now. Though 1440p is getting there, single-card wise.

At the moment it seems software is ahead of hardware again, especially for advanced effects and physics.
G-Sync and FreeSync do help a lot with those minimums, but if you're getting an average of 70-80 fps and suddenly have drops to 40, you still notice them, especially if you're at that minimum for a while.

It's still a bit terrible that a Titan X can't get a steady 60 fps at 1080p in the Witcher 3. Whether it's bad code (GameWorks optimisation) or just optimisation in general, it doesn't bode well for longevity.

I guess we'll see with the new Batman game as it also uses Gameworks.
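
On why those dips stand out even with adaptive sync: frame time is the reciprocal of frame rate, so a drop from ~75 fps to 40 fps nearly doubles the per-frame delay. A quick conversion, just to illustrate:

```python
# Frame time in milliseconds for a given frame rate: 1000 / fps.
for fps in (40, 60, 75, 144):
    print(f"{fps:>3} fps = {1000 / fps:.1f} ms per frame")
# A dip from 75 fps (13.3 ms) to 40 fps (25.0 ms) is a jump the eye
# can pick out even when G-Sync/FreeSync removes the stutter.
```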
 
Yeah, running the Witcher 3 at 1080p on a 1440p monitor already looks like hell.
Can't imagine how terrible 1080p upscaled (stretched) to 2160p would look.

This is why I'm not interested in 4K; anything other than native res will look pretty much like ass. Too bad monitor tech took a step back when moving from CRT.
 
Yeah but that's not what I lol'd at.

You basically said anyone playing above 1080p is just using upscaling the same as DSR. That's not the case: native-resolution 1440p or 4K monitors look far better than upscaling to the same resolution on a monitor that is natively 1080p.

I admit that I may be wrong there; I am not 100% sure how it works on higher-resolution monitors. I just know how it works from an asset point of view. I think calling 1080p poor just because you have experienced higher resolutions isn't a fair comment, it's just a selfish one.

Many people - like myself - could go out and buy a 2K or 4K monitor (I have an ACD sitting in the closet that's 1440p). We just choose performance over higher res, but that isn't to say 1080p is poor by any stretch. And in my opinion, native resolution is the way to go.
 
I admit that I may be wrong there; I am not 100% sure how it works on higher-resolution monitors. I just know how it works from an asset point of view. I think calling 1080p poor just because you have experienced higher resolutions isn't a fair comment, it's just a selfish one.

Many people - like myself - could go out and buy a 2K or 4K monitor (I have an ACD sitting in the closet that's 1440p). We just choose performance over higher res, but that isn't to say 1080p is poor by any stretch. And in my opinion, native resolution is the way to go.

1080p isn't poor. It's just that in comparison to 1440p, and especially 4K, there's obviously a large difference. You have to experience it to really understand.

As you said, performance over higher resolution is a perfectly good decision.
 
Witcher 3
2160p maxed
using all TXs @stock

Frames, Time (ms), Min, Max, Avg
18904, 240313, 67, 88, 78.664

Better figures than I got @1080p on a single.
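
Putting the two runs side by side (figures copied from the posts above; note the 2160p run used multiple cards, so this reflects SLI scaling as much as resolution):

```python
# Averages from the two Fraps-style runs quoted in this thread.
single_tx_1080p = 14274 / (220094 / 1000)  # ~64.9 fps, one TX at 1080p
sli_tx_2160p = 18904 / (240313 / 1000)     # ~78.7 fps, all TXs at 2160p

print(f"uplift: {sli_tx_2160p / single_tx_1080p:.2f}x")  # ~1.21x
```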

Many years ago when I got my X800 XT Platinum I suffered the same thing... the higher the res, the more FPS. I'm still confused to this day :)
 
Forgetting the TX, the 980 Ti is perfectly suited for 1080p, particularly when using a 120/144Hz monitor. It'd allow you to maintain the FPS sweet spot in slightly older or less demanding titles while still letting you play at high settings in new games for some time to come. After shifting to 144Hz and G-Sync I couldn't go back to 60Hz, even on a nice UHD monitor. Obviously the ultimate monitor would be a combination of the three, but then the expense of buying hardware that could run games maxed out at that resolution would be too high for many (myself included). Long live 1080p!

One day I'll move up the food chain but affordability is some way away. Here's hoping AMD take a step in the right direction in the coming weeks!
 
You know, I have seen many posts saying that the GTX 980 Ti is overkill for 1080p, so I'm gonna stick my neck out a little here and say that the EVGA reference card I ordered will be used at a 1920x1080 resolution. The reason being that I am hoping, with that extra graphics processing power, I should not have to worry about tweaking in-game settings to find the sweet spot of 60 fps for each game and having to compromise quality to achieve it; add in screen capture, which also has a performance hit, and that should no longer be a concern either. I am not naive in thinking that I will be able to crank everything to the maximum though; well, really I should be able to, but anti-aliasing options etc. usually need a tweak here and there anyway, and I can live with that.

I was ready to make the jump back to Nvidia after a few (not so great) years with ATI, and the fastest single card for my budget at the moment was the GTX 980 Ti, so overkill or not, to me this made sense. Maybe down the line, possibly in the next year or so, I will make the move to a higher resolution, but right now I am quite content at 1080p. By buying this so-called "overkill for 1080p" card I should be able to run everything at max and not worry about compromise, and if monitor/resolution circumstances change I am virtually set up already, with probably the addition of another 980 Ti in SLI.
 
That's just one or two badly optimised games rather than the hardware being stretched.

A lot of games will be restricted at the core by Xbox One and PS4 hardware.
 