Why do people say the 290X isn't good enough for 4K but benchmarks say otherwise?

A lot will say the 290X is no good for 4K gaming, but then again you do get a lot of numpties who turn on AA at 4K. You kinda gotta try it for yourself.
 
I'm not sure why you'd go 4K and then turn the settings down?
If you're not after good visuals, why go 4K?

If you're not after good visuals or 60fps, have you considered console gaming instead of PC gaming? It'd be cheaper.
 
I would prefer a lower resolution and better effects rather than the other way round.

Same here.

Quite frankly, people who say running at 4K with the graphical effects all on low/medium looks better than 1080p at ultra settings (on appropriate native-res monitors) are talking out of their arse :)
 
In a nutshell, it all depends on the individual whether a 290X is enough or not. I remember playing games off cassette tape when I was a kid, and on a NES and SMS, so I'm not going to be bothered too much about the occasional slowdown today.

I would prefer a lower resolution and better effects rather than the other way round.

Same here; when I'm playing a game at 4K, if I can't get acceptable performance I just drop to 1440p, or 1080p in some cases.

That said, there are some effects in games that are very taxing on the GPU and make little to no visible difference to me, so I just turn those off.
 
I'm not sure why you'd go 4K and then turn the settings down?
If you're not after good visuals, why go 4K?

If you're not after good visuals or 60fps, have you considered console gaming instead of PC gaming? It'd be cheaper.

A lot of the time, though, people wouldn't be able to tell the settings apart without looking in the graphics menu; it's just running maximum graphics for the sake of it.

I bet developers could add an "ultra" setting which is the same as "high" but uses massively inefficient code to reduce the frame rate, and people would still run ultra simply because it's the highest setting and it must be better because of the performance hit. That would be a great way to keep the GPU market ticking over, in all honesty. :p
 
A lot of the time, though, people wouldn't be able to tell the settings apart without looking in the graphics menu; it's just running maximum graphics for the sake of it.

I bet developers could add an "ultra" setting which is the same as "high" but uses massively inefficient code to reduce the frame rate, and people would still run ultra simply because it's the highest setting and it must be better because of the performance hit. That would be a great way to keep the GPU market ticking over, in all honesty. :p

*Cough* Ubisoft...
 
Same here; when I'm playing a game at 4K, if I can't get acceptable performance I just drop to 1440p, or 1080p in some cases.

This is actually a very valid point: because 4K is exactly 2x the height and width of 1080p, you can just set a 4K monitor to 1080p and you won't get the usual blur from running a panel outside its native resolution, as it maps four screen pixels to each rendered pixel (keeping them square). People used to do the same thing on 1440p screens by setting them to 720p when they couldn't get decent FPS.
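As a rough illustration of that arithmetic (just a sketch, not anything from the thread; the resolution figures are the standard ones), a few lines of Python show why 1080p divides evenly into a UHD panel while 1440p doesn't:

    # Minimal sketch: how much a panel has to stretch a lower rendered resolution.
    def scale_factor(native, rendered):
        return native[0] / rendered[0], native[1] / rendered[1]

    uhd = (3840, 2160)   # "4K" UHD panel
    qhd = (2560, 1440)
    fhd = (1920, 1080)
    hd  = (1280, 720)

    print(scale_factor(uhd, fhd))  # (2.0, 2.0) -> each rendered pixel maps to a clean 2x2 block
    print(scale_factor(uhd, qhd))  # (1.5, 1.5) -> non-integer, so the scaler has to interpolate
    print(scale_factor(qhd, hd))   # (2.0, 2.0) -> the old 1440p-panel-at-720p trick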
 
I'm not sure why you'd go 4K and then turn the settings down?
If you're not after good visuals, why go 4K?

If you're not after good visuals or 60fps, have you considered console gaming instead of PC gaming? It'd be cheaper.

Can you not read? Are you one of those elitists who can only play a game with everything maxed out, and anything less isn't worthy? Not everyone is like that, you know :eek:. Calm down, buddy :confused:

Seems like people are 50/50 about this. The reason I'm asking is that I have a new build coming this week from OCUK, minus a GPU, because the new AMD cards should be coming out in June.

I need to get a new monitor too, and I'm set on getting the Philips 40-inch 4K monitor. So obviously I'd want to try 4K gaming on it, plus I can game at 1080p and 1440p if needed.

But after seeing the benchmarks for this GPU, I was considering just buying it and keeping it for a year or so, then upgrading to the new 390X once it's dropped in price a lot.

So if this card can handle most games at 4K on decent settings for the next year or more, it might be a better option than splashing out £500+ on the 390X when it comes out.
 
A lot of the time, though, people wouldn't be able to tell the settings apart without looking in the graphics menu; it's just running maximum graphics for the sake of it.

I bet developers could add an "ultra" setting which is the same as "high" but uses massively inefficient code to reduce the frame rate, and people would still run ultra simply because it's the highest setting and it must be better because of the performance hit. That would be a great way to keep the GPU market ticking over, in all honesty. :p

In fairness, without some indication such as the GUI (or seeing the settings), there are probably a lot of people who can't tell the difference between 4K, 1440p and 1080p in a number of scenarios.

This is actually a very valid point: because 4K is exactly 2x the height and width of 1080p, you can just set a 4K monitor to 1080p and you won't get the usual blur from running a panel outside its native resolution, as it maps four screen pixels to each rendered pixel (keeping them square). People used to do the same thing on 1440p screens by setting them to 720p when they couldn't get decent FPS.

I've tried that, and both 1440p and 1080p look nasty on a 4K display. 1440p probably for scaling reasons; 1080p I can only guess is because it's using four pixels (including the gaps between them) instead of one.

Can you not read? Are you one of those elitists who can only play a game with everything maxed out, and anything less isn't worthy? Not everyone is like that, you know :eek:. Calm down, buddy :confused:

Seems like people are 50/50 about this. The reason I'm asking is that I have a new build coming this week from OCUK, minus a GPU, because the new AMD cards should be coming out in June.

I need to get a new monitor too, and I'm set on getting the Philips 40-inch 4K monitor. So obviously I'd want to try 4K gaming on it, plus I can game at 1080p and 1440p if needed.

But after seeing the benchmarks for this GPU, I was considering just buying it and keeping it for a year or so, then upgrading to the new 390X once it's dropped in price a lot.

So if this card can handle most games at 4K on decent settings for the next year or more, it might be a better option than splashing out £500+ on the 390X when it comes out.

I'm just not sure of the point of going 4K and then turning the settings down or playing at 1440p/1080p. If you're going to play at those resolutions, it might be better to buy a monitor with that resolution.
Personally, I'd buy a 4K monitor only when you actually intend to play at 4K; the monitors may be better and possibly cheaper by then too.

But if you're prepared to go down to 1080p and drop your settings to the lowest, then you could probably get away with a lesser card than the 290X.
 
I've tried that, and both 1440p and 1080p look nasty on a 4K display. 1440p probably for scaling reasons; 1080p I can only guess is because it's using four pixels (including the gaps between them) instead of one.

1440p will look nasty at 4K; however, 1080p should look pixel-perfect due to the 4:1 mapping :S
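For what it's worth, the arithmetic backs that distinction: 3840/2560 = 2160/1440 = 1.5, so 1440p can't be mapped onto a UHD panel without interpolation, whereas 3840/1920 = 2160/1080 = 2 exactly, so 1080p can in principle be shown as clean 2x2 pixel blocks. Whether it actually looks pixel-perfect will still depend on the monitor's scaler doing a straight pixel doubling rather than filtered upscaling, which would square with the "looks nasty" experience above.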
 
I will agree with the OP.
Everything is relative and there are no absolutes.

My current setup is going to last me many years, especially on the games I play most (World of Tanks/Warships, Elder Scrolls Online); it's more than sufficient given that they are CPU-limited (running on one core).
 