
This is why Titans and Ti's are not 'overkill' for 1080p

One reason 4K or 8K gaming is a decade away.
But by then we'll be running Minority Report screens.

Decade? Might not be far off there; didn't Naughty Dog recently say 4K won't be doable on consoles for another two generations?

Now, I know PC tech is far ahead of consoles, but looking at it that way it could easily be 5+ years before single cards come along that can run 4K well enough at a solid 60+ fps.

When 4K starts running really well on single high-end cards we will probably just have 8K screens come out, and then we're back to square one with performance :p
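For what it's worth, a back-of-the-envelope sketch supports both the "5+ years" and "decade" guesses. 4K pushes 4x the pixels of 1080p, so at an assumed 30-40% gain per flagship generation (my assumption, not anyone's roadmap) the wait looks like this:

```python
import math

# Back-of-the-envelope: 4K (3840x2160) pushes 4x the pixels of 1080p (1920x1080).
pixel_ratio = (3840 * 2160) / (1920 * 1080)  # 4.0

# Assumed per-generation flagship performance gain -- 30% and 40% are
# illustrative guesses, not figures from this thread.
for gain in (0.30, 0.40):
    generations = math.log(pixel_ratio) / math.log(1 + gain)
    years = 2 * generations  # assuming roughly two years per GPU generation
    print(f"{gain:.0%}/gen -> {generations:.1f} generations (~{years:.0f} years)")
```

That lands at roughly 4-5 generations, i.e. about 8-11 years, so "5+ years" is the optimistic end.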
 
When 4K starts running really well on single high-end cards we will probably just have 8K screens come out, and then we're back to square one with performance :p

+1

That's the way the game works.

When 4K is doable on a single card, the hardware manufacturers will start plugging 8K like mad. :D
 
1080p isn't poor. It's just that in comparison to 1440p, and especially 4K, there's obviously a large difference. You have to experience it to really understand.

As you said, performance over higher resolution is a perfectly good decision.

I found the jump from 1050p/1080p to 1440p to be forgettable. The jump to 4K, yeah, definitely more noticeable, but 1440p? Nah. And the sad thing is I'm saying that about games that can benefit from the increased resolution; most won't really, because they're still catering to low-spec hardware (e.g. the most popular MP titles).

I think for now 4K is still far away unless you have thousands to blow on PC hardware for the games that can actually show it off. 1080p still makes a lot of sense, and it would make even more sense if there were a FreeSync 144 Hz IPS/VA monitor for 300. :D
 
Decade? Might not be far off there; didn't Naughty Dog recently say 4K won't be doable on consoles for another two generations?

Now, I know PC tech is far ahead of consoles, but looking at it that way it could easily be 5+ years before single cards come along that can run 4K well enough at a solid 60+ fps.

When 4K starts running really well on single high-end cards we will probably just have 8K screens come out, and then we're back to square one with performance :p

Single-card gaming: while you can play at 4K with a card like the Fury X, which is made with the bandwidth for it and advanced AMD technology behind it, even as the fastest card in the world it still won't allow maxed-out settings at the fps I would want, and I am an fps nerd. DX12 will help elevate PC gaming as the Intel roadblock is removed, and we will have a better gaming experience on single cards.

But the tech isn't moving any faster: we will be stuck on 14-16nm for a few years, and unless major breakthroughs are made we won't be accelerating fps gaming anytime soon.
 
Given that the main distinguishing feature of the Titan X over the 980 Ti is a ridiculous amount of VRAM, I'd say it certainly is overkill for 1080p. You are never going to need to use the extra 6 GB at that res. The 980 Ti is more reasonable.
 
Given that the main distinguishing feature of the Titan X over the 980 Ti is a ridiculous amount of VRAM, I'd say it certainly is overkill for 1080p. You are never going to need to use the extra 6 GB at that res. The 980 Ti is more reasonable.

I do actually agree here, even though my title says differently. Had the 980 Ti been much less powerful, the Titan would still have been the way to go regardless of the VRAM; however, since the difference between the two is negligible at best, the Titan isn't worth it for that resolution.
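As a rough illustration of why resolution alone doesn't eat VRAM: render targets scale with pixel count, but even at 4K they're tiny next to 6 GB. The sketch below deliberately ignores textures and assets, which dominate real usage:

```python
# Rough per-render-target memory cost; real VRAM usage is dominated by
# textures, geometry and driver overhead, which this deliberately ignores.
def render_target_mib(width, height, bytes_per_pixel=4, msaa=1):
    return width * height * bytes_per_pixel * msaa / 2**20

for name, (w, h) in {"1080p": (1920, 1080), "4K": (3840, 2160)}.items():
    print(f"{name}: {render_target_mib(w, h):6.1f} MiB plain, "
          f"{render_target_mib(w, h, msaa=4):6.1f} MiB with 4x MSAA")
```

Even a 4K target with 4x MSAA is on the order of 127 MiB, so the extra 6 GB only pays off when games start shipping assets to fill it.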
 
Witcher 3
2160p maxed
using all Titan Xs @ stock

Frames, Time (ms), Min, Max, Avg
18904, 240313, 67, 88, 78.664

Better figures than I got @1080p on a single.
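Those figures are internally consistent, by the way: FRAPS reports total frames and elapsed time, and the average follows directly:

```python
# Sanity check on the FRAPS summary above: Frames=18904, Time=240313 ms.
frames, time_ms = 18904, 240313
print(f"{frames / (time_ms / 1000):.3f} fps")  # 78.664, matching the reported Avg
```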

I'm running Witcher 3 with HBAO off, HairWorks off, and a slightly tweaked .ini giving me High foliage with Ultra foliage grass distance to prevent pop-in. The rest is set to maximum, and on a single 980 Ti @ 3440x1440 I'm locked at 59-60 fps. I haven't had a dip yet, and I just finished the White Orchard area. I'm okay with that. Not saying dips aren't coming, but by the looks of it they're going to be so rare that I don't care.
 
To be fair, GTA V does look very rough at 1080p without 4x MSAA. At 4K, FXAA does me fine on GTA V. It's strange, because GTA IV had no AA but V looks a lot rougher without it.

I run with 4x MSAA for that reason.

In order to achieve 60 fps I have a few things turned down; however, I don't miss grass or post FX. As a result the game looks amazing.
 
I use a 295X2 for 1080p; I don't know what you're on about.

The more fps the better; I won't be changing to 1440p or 4K till I can at least achieve a consistent 100 fps at high/highest settings (I always compromise a bit for smoothness) in the latest games.
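A consistent 100 fps is a tight ask; the per-frame time budget makes it clear:

```python
# Everything (game logic, draw calls, GPU work) has to fit in this budget
# every single frame to hold the target.
for fps in (30, 60, 100, 144):
    print(f"{fps:>3} fps -> {1000 / fps:5.2f} ms per frame")
```

100 fps leaves only 10 ms per frame, versus 16.67 ms at 60 and 33.33 ms at 30.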
 
I use a 295X2 for 1080p; I don't know what you're on about.

The more fps the better; I won't be changing to 1440p or 4K till I can at least achieve a consistent 100 fps at high/highest settings (I always compromise a bit for smoothness) in the latest games.

This. I can never understand why people go out and spend nearly £1000+ on a graphics card for 4K and then only run games at 30-40 fps. I suppose to a lot of people, pretty graphics > smoothness.

Case in point: I have just spoken to somebody who plays TW3 (The Witcher 3) at 30 fps capped, just so he can play with everything on Ultra including HairWorks.
 
They probably can't understand your viewpoint either, but that's life I suppose :P

In terms of certain cards being overkill, I'd see it more as an investment than anything else. But it depends on your financial situation as well. For me, I like to get the best I can at the time of upgrade so that I know I won't have to worry about my GPU for a while.
 
This. I can never understand why people go out and spend nearly £1000+ on a graphics card for 4K and then only run games at 30-40 fps. I suppose to a lot of people, pretty graphics > smoothness.

Case in point: I have just spoken to somebody who plays TW3 (The Witcher 3) at 30 fps capped, just so he can play with everything on Ultra including HairWorks.

4K doesn't need to be maxed to look better than 1080p-1440p. You can get a solid 60 fps with no dips on medium-to-high settings.

BTW, I wonder why the min fps on PC Gamer was the same for both games?

1080p is severely CPU bottlenecked.

1080p and 1440p on a single Titan X is a no-go for me on a stock 3930K (OCed it's OK-ish).

4K is where it's at ;)
 
I don't get the '980 Ti is overkill for 1080p' argument at all. Developers are clearly aiming for 1080p60 at (meaningful) Ultra settings on a single flagship GPU right now. 1080p has the advantage of giving you enough overhead to brute-force away *most* of the need for G-Sync. This then allows you to utilise LCD strobing (not currently possible in conjunction with G-Sync) and get the absolute best image quality in motion (strobed 1080p has a considerably higher effective resolution in motion than a typical LCD at 1440p).

1080p won't be CPU bottlenecked if you make use of the various IQ-enhancing features. pCars, for example, runs with everything maxed (except reflections) at 1080p60 (45 fps showing briefly in the absolute worst-case scenario: a stormy/dark starting grid with lots of cars). Max CPU usage is ~80% of a 4790K core.
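To put rough numbers on the motion-resolution point, there's the usual persistence rule of thumb: perceived blur ≈ persistence × on-screen speed. The speed and pulse width below are just illustrative assumptions:

```python
# Rule of thumb: perceived motion blur (px) ~= persistence (s) * speed (px/s).
def blur_px(persistence_ms, speed_px_per_s):
    return persistence_ms / 1000 * speed_px_per_s

speed = 1920                  # object crossing a 1080p screen in one second
sample_and_hold = 1000 / 60   # ~16.7 ms persistence at 60 Hz
strobed = 2.0                 # assumed ~2 ms backlight pulse

print(f"sample-and-hold 60 Hz: {blur_px(sample_and_hold, speed):.1f} px of smear")
print(f"2 ms strobe:           {blur_px(strobed, speed):.1f} px of smear")
```

That's roughly 32 px of smear versus 4 px, which is why strobed 1080p can out-resolve non-strobed 1440p once things are moving.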
 
I'm reading this thread and it appears some of the most advanced and expensive GPUs in 2015 are barely good enough...

1080p is just about fine, 4K isn't, so why bother.
 
I find these results somewhat biased. Witcher 3 doesn't require AA at 4K, and HairWorks just kills fps. I can get a pretty much constant 60 fps with High settings at 4K; the lowest I've seen it drop to is 50 fps with two 980s. So two 980 Tis should hold 60 fps no problem with Ultra settings minus AA and HairWorks.

AA is unnoticeable and HairWorks is a performance killer. To be honest it's not that good anyway; I run with it off just because of the performance hit it takes in demanding situations.

The Witcher 3 runs at about 45-50 fps at 4K with every setting maxed out, including AA and HairWorks, with two 980 Tis in SLI.

Plus, as stated above, 4K with the lowest settings possible still looks a hell of a lot better than 1080p maxed out.
 
I find these results somewhat biased. Witcher 3 doesn't require AA at 4K, and HairWorks just kills fps. I can get a pretty much constant 60 fps with High settings at 4K; the lowest I've seen it drop to is 50 fps with two 980s. So two 980 Tis should hold 60 fps no problem with Ultra settings minus AA and HairWorks.

AA is unnoticeable and HairWorks is a performance killer.

The Witcher 3 runs at about 45-50 fps at 4K with every setting maxed out, including AA and HairWorks, with two 980 Tis in SLI.

Plus, as stated above, 4K with the lowest settings possible still looks a hell of a lot better than 1080p.

Jaggies are quite visible on buildings and those small wooden bridges at 4K without AA (156 PPI, 28-inch monitor). You have to be looking at those specific areas though; if you're speeding around on Roach, not so much. Your estimates are roughly correct, but you're going to need to OC those GTX 980 Tis to keep it from dipping below ~42-43 fps.
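For reference, the quoted pixel density checks out for a 28-inch UHD panel:

```python
import math

# Pixel density of a 28-inch 3840x2160 monitor.
ppi = math.hypot(3840, 2160) / 28
print(f"{ppi:.0f} PPI")  # ~157, in line with the ~156 PPI quoted above
```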
 
Jaggies are quite visible on buildings and those small wooden bridges at 4K without AA (156 PPI, 28-inch monitor). You have to be looking at those specific areas though; if you're speeding around on Roach, not so much. Your estimates are roughly correct, but you're going to need to OC those GTX 980 Tis to keep it from dipping below ~42-43 fps.

I suppose you have to weigh what's worse: 40 fps, or a tiny bit of jaggies (which to be honest are nowhere near as bad as at 1080p) at 60 fps.

It shouldn't be a problem for me at 60 fps minus HairWorks and AA. I rarely use AA in games unless there is a significant visual improvement; otherwise I find it's not worth the extra power usage. Not for me anyway :).
 