I'm not sure why you'd go 4K and then turn the settings down?
If you're not after good visuals, why go 4K?
If you're not after good visuals or 60fps, have you considered console gaming instead of PC gaming? It'd be cheaper.

I would prefer lower resolution and better effects than the other way round.


A lot of the time, though, people wouldn't be able to tell the settings apart without looking in the graphics menu; it's just running maximum graphics for the sake of it.
I bet developers could add an "ultra" setting that is identical to "high" but uses massively inefficient code to reduce the frame rate, and people would still run ultra simply because it's the highest setting and the performance hit makes it feel like it must be better. That would be a great way to keep the GPU market ticking over, in all honesty.
Same. When I'm playing a game at 4K and can't get acceptable performance, I just drop to 1440p, or 1080p in some cases.
This is actually a very valid point: because 4K is exactly twice the height and width of 1080p, you can set a 4K monitor to 1080p and you won't get the usual blur from running a panel outside its native resolution, since the panel maps four screen pixels to each resolution pixel (keeping them square). People used to do the same thing on 1440p screens by setting them to 720p when they couldn't get decent FPS.
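A quick sketch of the scaling arithmetic behind this (resolutions taken from the posts above): an integer scale factor means each source pixel covers an exact square block of panel pixels, so no interpolation (and no blur) is needed.

```python
# Why 1080p maps cleanly onto a 4K panel but 1440p doesn't:
# an integer, equal scale factor on both axes means each source pixel
# covers an exact square block of panel pixels.

def scale(panel, source):
    """Return (x factor, y factor, whether the mapping is exact/integer)."""
    fx = panel[0] / source[0]
    fy = panel[1] / source[1]
    exact = fx == fy and fx.is_integer()
    return fx, fy, exact

uhd = (3840, 2160)  # 4K UHD panel
print(scale(uhd, (1920, 1080)))          # (2.0, 2.0, True)  -> 4 panel pixels per source pixel
print(scale(uhd, (2560, 1440)))          # (1.5, 1.5, False) -> must interpolate, looks soft
print(scale((2560, 1440), (1280, 720)))  # (2.0, 2.0, True)  -> the old 720p-on-1440p trick
```

Whether a given monitor actually does this crisp 4:1 duplication depends on its scaler; some panels interpolate even at integer factors.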
Can you not read? Are you one of those elitists who can only play a game with everything maxed out, and anything less isn't worthy? Not everyone is like that, you know. Calm down, buddy.
Seems like people are 50/50 on this. The reason I'm asking is that I have a new build coming this week from OCUK, minus a GPU, because the new AMD cards should be coming out in June.
I need to get a new monitor too, and I'm set on the Philips 40-inch 4K monitor. So obviously I'd want to try 4K gaming on it, plus I can game at 1080p and 1440p if needed.
But after seeing the benchmarks of this GPU, I was considering just buying it, keeping it for a year or so, and then upgrading to the new 390x once its price has dropped a lot.
So if this card can handle most games at 4K on decent settings for the next year or more, it might be a better option than splashing out £500+ on the 390x when it comes out.
I've tried that, and both 1440p and 1080p look nasty on a 4K display. 1440p is probably blurry for scaling reasons; with 1080p my only guess is that each logical pixel is being drawn with four physical pixels (including the gaps between them) instead of one.
1440p will look nasty at 4K; however, 1080p should look pixel-perfect due to the 4:1 mapping :S