
Titan X in 4K benchmarks

In every benchmark I've seen for 4K and this card, they all use anti-aliasing. Why? Last I checked, 4K didn't even need any kind of anti-aliasing at that resolution.

It just uses unnecessary processing power, which slows the FPS down even further. Am I off base here? Am I missing something? Why continue to run 4K benchmarks with any AA?
 
In every benchmark I've seen for 4K and this card, they all use anti-aliasing. Why? Last I checked, 4K didn't even need any kind of anti-aliasing at that resolution.

It just uses unnecessary processing power, which slows the FPS down even further. Am I off base here? Am I missing something? Why continue to run 4K benchmarks with any AA?

You can see 8xMSAA at 2160p. :)
 
In every benchmark I've seen for 4K and this card, they all use anti-aliasing. Why? Last I checked, 4K didn't even need any kind of anti-aliasing at that resolution.

It just uses unnecessary processing power, which slows the FPS down even further. Am I off base here? Am I missing something? Why continue to run 4K benchmarks with any AA?

"Need" is a subjective word; some users need all the eye candy, others don't.
 
Whilst fast-paced games don't really require AA at 4K, you can tell (or I could) when AA was being used and when it wasn't. 4K does a good job of squashing the jaggies, and the AA just finishes it off.
 
In every benchmark I've seen for 4K and this card, they all use anti-aliasing. Why? Last I checked, 4K didn't even need any kind of anti-aliasing at that resolution.

It just uses unnecessary processing power, which slows the FPS down even further. Am I off base here? Am I missing something? Why continue to run 4K benchmarks with any AA?

In some games, jaggies (what AA helps with) tend to be overly noticeable, especially ones built on older graphics engines.

I prefer to play at medium-high settings with a little AA rather than going ultra without it.

In competitive first-person multiplayer games (COD, BF) I prefer not to use any AA, so I can keep the FPS as high as possible.

If you can wait roughly a week, I will be doing a 1440p and 4K review of most of the games I have, with no AA.
 
I don't notice jaggies at 4K; I tend to have MSAA off, mainly because it eats into precious VRAM.

If I've got enough grunt left in the system to apply some level of MSAA then I will (if it's an older game, for instance).

I've been playing the Battlefield Hardline campaign at 'Ultra' but with MSAA disabled, and didn't see a difference. Same with Far Cry 4.

Maybe it depends on distance from the screen and pixel density (PPI)? I've got a 58" screen, but I sit about 10 ft from it.
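For what it's worth, the geometry backs that up. A quick back-of-envelope sketch (assuming a flat 16:9 3840x2160 panel viewed head-on, and taking roughly 1 arcminute as normal visual acuity) of how big one pixel looks from the sofa:

```python
import math

def pixel_arcmin(diagonal_in, distance_in, width_px=3840, height_px=2160):
    """Angular size of one pixel in arcminutes for a flat 16:9 panel
    viewed head-on. Normal visual acuity is roughly 1 arcminute."""
    diagonal_px = math.hypot(width_px, height_px)
    pixel_pitch_in = diagonal_in / diagonal_px        # inches per pixel
    angle_rad = math.atan(pixel_pitch_in / distance_in)
    return math.degrees(angle_rad) * 60

# 58" 4K TV from 10 ft (120 in): each pixel subtends well under 1 arcmin,
# so individual jaggies are near the edge of what the eye can resolve.
print(round(pixel_arcmin(58, 120), 2))   # ~0.38 arcmin

# 28" 4K monitor from 2.5 ft (30 in): pixels look nearly twice as large,
# so aliasing is easier to spot.
print(round(pixel_arcmin(28, 30), 2))    # ~0.73 arcmin
```

So sitting 10 ft from a 58" screen really should hide jaggies better than a desktop monitor at arm's length.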
 
I don't notice jaggies at 4K; I tend to have MSAA off, mainly because it eats into precious VRAM.

If I've got enough grunt left in the system to apply some level of MSAA then I will (if it's an older game, for instance).

I've been playing the Battlefield Hardline campaign at 'Ultra' but with MSAA disabled, and didn't see a difference. Same with Far Cry 4.

Maybe it depends on distance from the screen and pixel density (PPI)? I've got a 58" screen, but I sit about 10 ft from it.

I've got a 28-inch Asus 4K (157 PPI) and sit 2-3 ft away. 10 ft is quite far away (a 58-inch screen, though, is baller). Most of the time you'll be running around in BF Hardline anyway, so jaggies aren't as apparent as they would be if you're looking for them.

"If I've got enough grunt left in the system to apply some level of MSAA then I will (if it's an older game for instance)" — same here.
 
I've only played the single-player campaign so far, which isn't just lots of running around.

They seem to have ramped up the graphics compared to multiplayer as well. I guess they've got more GPU/CPU cycles to spend on the single-player side, because they don't have to account for 63 other players zipping around.
 
I've only played the single-player campaign so far, which isn't just lots of running around.

They seem to have ramped up the graphics compared to multiplayer as well. I guess they've got more GPU/CPU cycles to spend on the single-player side, because they don't have to account for 63 other players zipping around.

Battlefield has a single player? :D
 
I never used AA at 4K before, for two reasons: firstly, I could barely tell the difference unless I took screenshots and compared them side by side, and only then could I see a couple of jagged edges; secondly, I only had 4GB of VRAM.
Now that I'm getting a card with 12GB, depending on the game I may use 2xAA if there's a difference.

In benchmarks they use it to add extra strain, and to keep the settings identical across resolutions so the testing is fair.
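As a rough illustration of why MSAA eats into a 4GB card at 4K, here's a back-of-envelope sketch. It assumes plain uncompressed 32-bit colour and 32-bit depth buffers with one of each value stored per sample; real drivers compress and optimise heavily, so treat these as ballpark upper bounds:

```python
def msaa_buffers_mib(width=3840, height=2160, samples=4,
                     bytes_color=4, bytes_depth=4):
    """Approximate size of multisampled colour + depth render targets.
    With MSAA, each sample stores its own colour and depth value."""
    bytes_total = width * height * samples * (bytes_color + bytes_depth)
    return bytes_total / (1024 ** 2)

print(round(msaa_buffers_mib(samples=4)))  # ~253 MiB for one 4x target at 4K
print(round(msaa_buffers_mib(samples=8)))  # ~506 MiB at 8x
```

A few hundred MiB per multisampled render target, on top of textures and everything else, adds up fast on a 4GB card, which fits the "eats into precious VRAM" experience mentioned above.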
 
In every benchmark I've seen for 4K and this card, they all use anti-aliasing. Why? Last I checked, 4K didn't even need any kind of anti-aliasing at that resolution.

It just uses unnecessary processing power, which slows the FPS down even further. Am I off base here? Am I missing something? Why continue to run 4K benchmarks with any AA?

Some of us run 4K on larger screens, e.g. 40 inches.

Aliasing can be quite noticeable, especially in games like Elite Dangerous.

It's not unnecessary at all.
 
Are any of these benchmarks hitting the "needed" 60 FPS minimum? From the benches I have seen they are not, and tbh everybody apart from the people actually playing on it says that "below a 60 FPS minimum is unplayable".

I only play on a 60Hz monitor, so that would be good. But still, I get exactly what I need.
 
Are any of these benchmarks hitting the "needed" 60 FPS minimum? From the benches I have seen they are not, and tbh everybody apart from the people actually playing on it says that "below a 60 FPS minimum is unplayable".

I only play on a 60Hz monitor, so that would be good. But still, I get exactly what I need.

Most sites quote a 60 FPS average for a good experience, not minimums with no major dips.

45 FPS on a single card is better most of the time than 60 FPS in SLI/CFX. I don't think any sites have added frame times to their benches?

60 FPS minimums would need two Titan Xs at 1440p-4K, a well-overclocked CPU and a pretty well-optimised game.
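That average-vs-minimum gap is easy to see once you look at frame times. A small sketch with made-up numbers: 95 smooth frames plus 5 stutter spikes still average out to a healthy-looking FPS figure, even though the spikes are what you actually feel:

```python
def avg_fps(frametimes_ms):
    """Average FPS over a run = frames rendered / total time taken."""
    return 1000 * len(frametimes_ms) / sum(frametimes_ms)

def p99_frametime(frametimes_ms):
    """99th-percentile frame time: the slow frames you notice as stutter."""
    ordered = sorted(frametimes_ms)
    return ordered[int(0.99 * len(ordered))]

# Hypothetical capture: 95 frames at 16.7 ms (60 FPS pace) plus 5 spikes at 50 ms.
frames = [16.7] * 95 + [50.0] * 5

print(round(avg_fps(frames)))   # ~54 FPS average — looks almost fine
print(p99_frametime(frames))    # 50.0 ms spikes — momentarily feels like 20 FPS
```

Which is why a steady 45 FPS on one card can feel better than a stuttery 60 FPS average from SLI/CFX.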
 
Are any of these benchmarks hitting the "needed" 60 FPS minimum? From the benches I have seen they are not, and tbh everybody apart from the people actually playing on it says that "below a 60 FPS minimum is unplayable".

I only play on a 60Hz monitor, so that would be good. But still, I get exactly what I need.

It is not all about FPS or frame times; sometimes it is about finding out what looks and works best.

Using 4 cards and max settings at 2160p has some odd benefits, like the max settings smoothing out the min/avg/max FPS.

Another game like Civ5, on a huge map with a huge empire, can slaughter any GPU setup regardless of the resolution used. Oddly, though, it is better to use a single card for maximum performance there: minimums don't really matter in that game, and a single card works more efficiently than 2, 3 or 4.

There are no hard and fast rules with GPUs; it is more a case of finding out what works best.
 