
1080ti vs 2080

Hi Guys,

I'm guessing this one comes up a lot atm. I'm a 4K gamer currently using an MSI 980 Ti.

Is it worth upgrading to the 1080ti or holding off for the 2080?
 
I have ordered the 2080.

No real difference between the two, except the 2080 is likely to be better supported and pull ahead a little in the future. That little can matter, as both cards are on the cusp of 4K/60, but ultimately you can just turn shadow detail down a bit if you're near that region.

I think either is good; I just felt uncomfortable paying RRP for an 18-or-so-month-old card, but then again that's just what PC gaming is.

Also, the VirtualLink VR connector might be interesting at some point. DLSS will likely be a useful way of maintaining fps at 4K in a few years' time too, if the adoption rate is OK. The 2080 also smashes VR benchmarks, which might signify it pulling ahead in the future.

All of that is speculative though, and I doubt it's going to be more than 10 or 20 percent faster in most games in a year.
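
For what it's worth, here's a rough back-of-envelope on why DLSS could help hold frame rate at 4K. It's only a sketch: the 1440p internal resolution is an assumption for illustration, not a confirmed DLSS setting.

Code:
#include <cstdio>

int main() {
    // DLSS-style upscaling: shade fewer pixels internally, then reconstruct to the output resolution.
    const double outputPixels   = 3840.0 * 2160.0; // native 4K target
    const double internalPixels = 2560.0 * 1440.0; // assumed internal render resolution (illustrative)

    // Fraction of the native 4K pixel count actually shaded each frame.
    std::printf("Internal/output pixel ratio: %.0f%%\n",
                100.0 * internalPixels / outputPixels);
    return 0;
}

So the per-frame shading work drops to roughly 44% of native 4K before the reconstruction cost, which is where the fps headroom would come from.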



I think if you can find something better to do with the 120 quid difference, get the 1080ti. Otherwise get the 2080.

If you have a 980 Ti though, I personally wouldn't upgrade unless you are trying to do 4K/60.

I only upgraded as I have a 670 which was showing its age severely. It must be the worst time ever to buy a card.
 
2080 all the way. When Nvidia wants to push the PC gaming market in a certain direction, everyone must follow. Game devs totally understand RT and Tensor and really want it. Unfortunately the review pundits don't seem to get it at all and are narrow-mindedly focused on traditional raster benchmarks without any proprietary features enabled. Don't forget how many ex-mining cards are out there too after the crypto bust, with owners looking to sell them off. It really plays to those sellers' advantage to have people convinced that they should save money and buy a used 1080Ti.

You won't be able to buy a GPU without HW ray tracing capability in the near future. Once devs start taking advantage of it non-RT cards will either make new games look like crap or they won't even be able to run them at all.
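
For context on how that fallback typically works, here's a minimal sketch of the capability check a D3D12 game can make. The helper function name is made up, but CheckFeatureSupport with the OPTIONS5 raytracing tier query is the actual DXR mechanism; cards without hardware RT report the "not supported" tier and get the raster path rather than failing outright.

Code:
#include <windows.h>
#include <d3d12.h>

// Hypothetical helper: true if the device exposes DXR tier 1.0 or later.
bool SupportsHardwareRT(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &opts5, sizeof(opts5))))
        return false;

    // Non-RT hardware reports D3D12_RAYTRACING_TIER_NOT_SUPPORTED here,
    // so the engine can keep its traditional raster lighting path instead of refusing to start.
    return opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}

Whether non-RT cards can still run a given game then comes down to whether the devs keep that raster path around.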
 

Buy it based on the hope devs start taking advantage of ray tracing and tensor cores? Yeah sure mate :) Look how long it's taking them to take advantage of the low-level APIs they've been asking for. DX12 is still taking its sweet time to gain traction.
It's a fair premium to pay to encourage the industry to move on. Basically footing the bill just to help the gaming industry adopt ray tracing.
1080ti all the way, more like.
 

Not dissimilar to buying pretty much any AMD card of late then: you get this much performance now and hope for the 'fine wine' treatment a bit further down the line. ;):p
 

"Hopes of support" right... do you think the leader in GPU technology would build specialized HW without legal obligations in place with devs to make games that take advantage of it? Yeah sure, Nvidia has no idea what they are doing. It's not like 11 upcoming games have already committed to RT, and some more for DLSS, and a few more for programmable shaders.

Nice false equivalence fallacy to DirectX. Nothing about DX12 is comparable to RT and DLSS. CPU draw call optimization? Really can't tell if you're just trolling at this point.

I paid $90 more for a 2080FE over a 1080Ti. Wow what a stretch!

"Help the industry" lol now I know you're trolling. The industry has absolutely already moved on. 1080Ti no longer in production. It's over.
 
"Hopes of support" right... do you think the leader in GPU technology would build specialized HW without legal obligations in place with devs to make games that take advantage of it? Yeah sure, Nvidia has no idea what they are doing. It's not like 11 upcoming games have already committed to RT, and some more for DLSS, and a few more for programmable shaders.

Nice false equivalence fallacy to DirectX. Nothing about DX12 is comparable to RT and DLSS. CPU draw call optimization? Really can't tell if you're just trolling at this point.

I paid $90 more for a 2080FE over a 1080Ti. Wow what a stretch!

"Help the industry" lol now I know you're trolling. The industry has absolutely already moved on. 1080Ti no longer in production. It's over.
it just works! lol
 
lol are Nvidia paying people to post on forums again?

Let me know when a low/mid range mainstream GPU (i.e. *50/*60 class) and consoles can do RT stuff without breaking a sweat, then I'll be interested. Until then it'll be nothing more than a glorified option in games for people willing to spend obscene amounts on a GPU.
 
The industry has absolutely already moved on. 1080Ti no longer in production. It's over.
Definitely. That's why no one uses 970s anymore... oh no, wait. Ray tracing is not ready and the 1080 series is going to remain a good option for some time, plus the prices are ridiculous. I think it's time to skip a generation, TBH.
 
"Hopes of support" right... do you think the leader in GPU technology would build specialized HW without legal obligations in place with devs to make games that take advantage of it? Yeah sure, Nvidia has no idea what they are doing. It's not like 11 upcoming games have already committed to RT, and some more for DLSS, and a few more for programmable shaders.

Nice false equivalence fallacy to DirectX. Nothing about DX12 is comparable to RT and DLSS. CPU draw call optimization? Really can't tell if you're just trolling at this point.

I paid $90 more for a 2080FE over a 1080Ti. Wow what a stretch!

"Help the industry" lol now I know you're trolling. The industry has absolutely already moved on. 1080Ti no longer in production. It's over.

What the ….. ??? It will be 4 or 5 years at least before these new technologies are mainstream. There is no way game developers are going to cut off most of their market for the small number of users that will have ray tracing or DLSS.

The industry hasn't moved on. Not even close.
 
You won't be able to buy a GPU without HW ray tracing capability in the near future. Once devs start taking advantage of it non-RT cards will either make new games look like crap or they won't even be able to run them at all.

Right now it's a gimmick. In 2-3 generations ray tracing might be mainstream; better to wait until then. Less than 1% of gamers are likely to have an RTX card if Steam surveys are anything to go by, so devs aren't going to bother all that much with it yet. They aren't going to invest hundreds of man-hours implementing and testing something 0.5% of customers will actually use.

The cost just isn't worth it at all if you already have a top end GPU.
 
Not dissimilar to buying pretty much any AMD card of late then: you get this much performance now and hope for the 'fine wine' treatment a bit further down the line. ;):p
It's about the implementation of ray tracing, not the performance. Anyone who thinks the 2080's ray tracing performance will get better over time must be out of their mind if you ask me. It can handle 1 ray per pixel! 1... That's not even enough to do anything meaningful to the graphics.
I say get the 1080ti and wait until better ray tracing cards are out. Maybe when the ray tracing performance is around 100-1000x better we can think about buying one.
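
To put the "1 ray per pixel" point in perspective, here's a quick back-of-envelope of what a 4K/60 target demands in raw ray counts. It's pure arithmetic and makes no claim about the 2080's real-world throughput, which depends on scene complexity and shading rather than any headline figure.

Code:
#include <cstdio>

int main() {
    // Rays per second required at 4K/60 for a given number of rays per pixel.
    const double pixels = 3840.0 * 2160.0; // ~8.3 million pixels per frame
    const double fps    = 60.0;

    for (int raysPerPixel = 1; raysPerPixel <= 4; ++raysPerPixel) {
        const double raysPerSecond = pixels * fps * raysPerPixel;
        std::printf("%d ray(s)/pixel at 4K/60 -> %.1f gigarays/s\n",
                    raysPerPixel, raysPerSecond / 1e9);
    }
    return 0;
}

Even one ray per pixel at 4K/60 is about half a billion rays every second, before any bounces, shadow rays or denoising, so full multi-bounce path tracing at that resolution is well beyond either card here.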
 
Let me know when a low/mid range mainstream GPU (i.e. *50/*60 class) and consoles can do RT stuff without breaking a sweat, then I'll be interested.

2080 vs. 1080Ti is an enthusiast-level discussion. Shoo away.
 
It will be 4 or 5 years at least before these new technologies are mainstream.

It will literally be later this year when AAA games implement RT. Devs have signed agreements to do this. Nvidia is not going about this alone. I can assure you.
 