2080 vs 2080Ti - equalising for FPS = graphical compromise @ 4K?

Curious on this one... a 2080 vs a 2080Ti: if you equalise for FPS, i.e. run the 2080Ti at ULTRA settings and drop settings on the 2080 such that you are roughly equal on FPS, what kind of visual compromise are you making at 4K? Is it significant? All questions of value and the 1080Ti aside, I am just wondering on a purely VISUAL level, at 4K only.
 
The 2070 is pointless compared to the 1070.
The 2080 is pointless compared to the 1080Ti.
The 2080Ti is out of reach for most, just another Titan basically.

Nvidia might as well not have bothered.
 
That's not the question I asked. Not that I completely disagree (only time will tell though), but it's clearly not what I wanted to know.
 
Haha, sorry.

It's going to depend on the game and how demanding the individual in-game settings are to performance.

I don't see how anyone can give you an answer that's going to fit every situation.
 
No, I know it's not a one size fits all situation... heck, a 1060 can play some games at 4K just fine lol! I guess I'm more curious on anyone's actual real-world experiences with the card, that's the only real test.
 
Why have you asked this question? Surely you can just adjust the settings and ask your eyes for an opinion.
 
I don't have the GPU... so I am curious, for those people who have a 2080 and are gaming at 4K, what their experiences are.

If you have a 4K screen and you're interested in the visual difference, then you can test with literally any GPU. Set everything to Ultra, take a screenshot. Set it to Very High, take a screenshot. You can then compare the images.

This would be a much better route, as you can test the titles that interest you and use your own judgement. IQ is completely subjective.

Make sure you label the image file names so you can tell which is which. ;)
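If you want something more objective than eyeballing the two screenshots, a few lines of Python can report how much of the image actually changes between settings. This is just a sketch: it assumes the Pillow library is installed, and the file names are placeholders for your own screenshots.

```python
# Rough, objective comparison of two same-resolution screenshots.
# Requires Pillow (pip install Pillow); file names below are placeholders.
from PIL import Image, ImageChops

def percent_pixels_changed(path_a, path_b):
    """Return the percentage of pixels that differ between two same-size images."""
    a = Image.open(path_a).convert("RGB")
    b = Image.open(path_b).convert("RGB")
    diff = ImageChops.difference(a, b)
    if diff.getbbox() is None:  # None means the images are pixel-identical
        return 0.0
    changed = sum(1 for px in diff.getdata() if px != (0, 0, 0))
    return 100.0 * changed / (diff.width * diff.height)

# Example (hypothetical file names):
# percent_pixels_changed("ultra_4k.png", "very_high_4k.png")
```

Bear in mind a raw pixel diff overstates things: a tiny brightness shift across the frame counts every pixel as "changed", so it's a starting point for your own judgement, not a verdict.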
 
Yeah that'll work, good idea.
 
It's going to vary game by game, depending on how many stupidly expensive settings they have. Clouds in AC:OD are a major one, for example, but other games aren't as severe, in which case the room to tweak isn't as large. Here are a few videos to get an idea: SotTR, AC:OD, FH4.

I would also say it's great to have the extra performance, because sometimes upping the render scale 20-40% is a great joy to behold, more so if you have a larger screen.
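For a sense of what that render-scale bump actually costs: the scale applies per axis, so the pixel workload grows with its square. A quick back-of-envelope sketch (the numbers are just illustrative):

```python
# Rough arithmetic: pixels actually rendered at 4K for a given
# render-scale percentage (the scale applies to each axis).
def rendered_pixels(width, height, scale_pct):
    return round(width * scale_pct / 100) * round(height * scale_pct / 100)

native = rendered_pixels(3840, 2160, 100)   # 8,294,400 pixels
boosted = rendered_pixels(3840, 2160, 140)  # 140% scale on each axis
print(boosted / native)                     # ~1.96x the pixel workload
```

So a "40% bump" nearly doubles the rendering load, which is why only the fastest cards can afford it at 4K.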
 
I get what you're saying. I think MOST games that run at 60fps+ on the 2080 can do so with graphical compromises, maybe down to High or Very High settings. It's basically whatever the 1080Ti can do, and that can manage 4K 60fps in maybe 95% of games with graphical compromises.

You're asking whether the £400 premium is worth the graphics... the answer, IMO, is no.

What steers me away from the 2080Ti is the fact that most of the time it will really be under-utilised in my rig. I'd rather get a 2080 so I can squeeze all the performance out of it than a 2080Ti, especially at 4K, when most of us are capped at 60fps... what is the point in the 2080Ti hitting 80fps+? It's literally a waste.

Also, I have a fear that Nvidia will actually release something decent, since AMD really have to turn up at some point to upset this RTX cash grab.

I think by the time a plethora of games are out which the 2080/1080Ti struggle on, we will hopefully have moved on GPU-wise to a new generation. Also, if you have a 2080, you have DLSS, which will hopefully help you out a bit with newer titles that adopt it.

What turned me off the 2080Ti is seeing benches in a few games where even the 2080Ti couldn't hit 4K! It was like, WTF. At that point you'd probably set a custom ultrawide resolution to hit the FPS... in which case I think a 2080 would probably manage 60fps too.

Don't get me wrong: if I could justify a 2080Ti, I would, but it just seems like the PC component that would get replaced soonest in my new build.
 
Yes, you've summed it up well, and that was basically the question I was asking myself. I certainly see where you're coming from. There's obviously no question that the 2080Ti is more powerful than the 2080... but the value isn't there. That doesn't seem to bother as many people as the 2080 does, though, since the 2080 isn't an improvement over the 1080Ti (outside of any potential future RTX features it might be able to utilise, but that's a big unknown). That fact has caused all sorts of uproar, and there is undoubtedly far more anger aimed at the 2080 than at the 2080Ti... which, because it offers more power, seems to get a pass in many circles, despite actually being worse value than the 2080. It's all a bit odd really, but it speaks to human nature more than anything else. You're getting rinsed more with the 2080Ti, but they give you that little bit of sugar and suddenly we're more amenable.

I'd perhaps argue that when more affordable 144Hz 4K monitors become available, the case for the 2080Ti becomes stronger, but until then it's certainly going to see a lot of wasted frames at 4K 60Hz... though that guaranteed smoothness is all most people care about for now, I'm sure. It is the only card which pretty much guarantees a solid 60FPS across every game at Ultra settings (with a couple of exceptions).

:cool:
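The "wasted frames" point is easy to put numbers on. With v-sync off on a fixed 60Hz panel, any frames rendered beyond the refresh rate are never fully displayed. A rough sketch, using illustrative numbers and ignoring tearing and frame pacing:

```python
# Back-of-envelope: share of rendered frames a fixed-refresh panel
# can never display when the GPU outruns it (v-sync off, no adaptive sync).
def wasted_frame_pct(gpu_fps, refresh_hz):
    if gpu_fps <= refresh_hz:
        return 0.0
    return 100.0 * (gpu_fps - refresh_hz) / gpu_fps

print(wasted_frame_pct(80, 60))   # 25.0 - a quarter of the GPU's work unseen
print(wasted_frame_pct(55, 60))   # 0.0  - below refresh, nothing is wasted
```

With v-sync on the card just idles instead, but either way that headroom buys you nothing on a 60Hz screen.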
 
Yup, you are right.

I do think we will need a next-generation card for 4K/144Hz though, as the 2080Ti can realistically still dip below 60fps at times... that's quite a waste of a very expensive monitor.
 
It's so game dependent.

For instance, I'm playing Shadow of the Tomb Raider at 4K/Ultra with 2xSMAA on a 2080Ti and am having to use adaptive V-sync, as the FPS generally hovers around the 50-55fps mark during gameplay. Weirdly, turning off anti-aliasing doesn't seem to make any difference to the framerate.

Forza Horizon 4, though: absolutely no issues getting 4K/60 locked at Ultra/Extreme settings.
 
Forza was always easy to run maxed out on PC. I don't have FH4 yet, but FH3 maxed out on my 1080 FE runs perfectly at 4K, never dropping below 60fps and still with some power to spare.
 
Agreed.

I did this recently with Final Fantasy 15. I have everything maxed out at 4K except for one GameWorks feature, VXAO. That really is an FPS hog. Guess what: I could not tell the difference outdoors, in game or in screenshots, with it on or off. But the FPS drop is still huge, even outdoors where I couldn't see the difference. The only time I can see a difference is indoors, say in shops or cafés, but for the majority of the game you are outdoors... That made it a very easy decision to leave it off.

Also, for those getting 11GB cards: this game at times uses nearly all of my 12GB of VRAM. I am sure it is fine with 11GB, but it is the only game I have tested that has come close to filling up all 12GB on my Titan.
 