
8GB vs 11GB: another 2080 vs 1080Ti question

The performance is nearly like for like and should, over the long term, favour the 2080 thanks to RT and DLSS. If buying new, and coming from a GPU where the upgrade is actually worth it, I would still buy a 2080 over the 1080Ti.

For anyone with a 1070Ti, 1080 or 1080Ti, it is not worth upgrading to the 2000 series.

With a 1070, maybe*

At 4K resolution, maybe*

Anything less, definitely.

*(But not at these crazy prices.)

At the end of the day we are discussing the difference between a 1080Ti and a 2080. I personally feel it's worth spending the extra £130, but that's just me.

I never said it wasn't like for like, but the 1080Ti is significantly cheaper, so you get more bang for your buck.

If the 2080 were actually a 2070 and priced at the 1070's launch price, the tables would be turned, but that's not what we have. What we have is Pascal, at current pricing, being the better buy if all you care about is raw performance. The 2080Ti has a purpose if a 1080Ti is not fast enough, but the 2080, I feel, is a lame duck right now.
 
Thanks. So the general advice is:

The 2080 is better than the 1080Ti, but not by a lot. Therefore the 1080Ti is the best bang for buck, while the 2080 is the safer bet for future performance, especially if you're keeping the card long term, with 11GB vs 8GB of VRAM unlikely to make much of an impact at all in the next couple of years?


I'm basically getting either:

Gigabyte GeForce GTX 1080Ti Gaming OC BLACK 11264MB GDDR5X PCI-Express Graphics Card @ £579.95 inc VAT
Or
RTX 2080 FE for £750 I think.
 
We won't know the VRAM impact until the next couple of years.

1. 1080Ti: it'll keep up but might lag behind, although will you notice the difference when the 1080Ti is at ~70fps while titles that favour the 2080 are hitting ~85fps? Sell it on next year when 7nm releases and lose ~£279 from the purchase price. Note I'm not including DLSS figures, as IMO that's boosted fps at reduced quality.
2. 2080 FE: going off the performance lead above, sell it on next year when 7nm releases and lose ~£400 from the purchase price (rough sums sketched below).
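
To put rough numbers on those two options, here's a quick sketch using the prices quoted earlier in the thread; the loss figures are the estimates from the list above, not market data:

```python
# Rough cost-of-ownership sums for the two options above.
# Prices are the ones quoted in this thread; the loss figures are
# my estimates, not market data.
options = {
    "1080Ti (Gigabyte Gaming OC)": {"price": 579.95, "loss": 279},
    "2080 FE": {"price": 750.00, "loss": 400},
}

for name, o in options.items():
    resale = o["price"] - o["loss"]
    print(f"{name}: buy at £{o['price']:.2f}, "
          f"resell at ~£{resale:.2f}, net cost ~£{o['loss']}")
```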

IMO the 2080 will only be a valid proposition when premium AIB cards drop to ~£600, because Nvidia not releasing true performance data on RTX/DLSS speaks volumes. Before anyone gives the excuse about needing new W10 builds: Nvidia and the devs are using those builds, so they have the data.
 
Think of the 2080 as really being a rebadged 2070. Would you buy a 2070 over a 1080Ti for £120 more?

The only way that card gets faster than a 1080Ti is if Nvidia nerf the drivers, which they have already started doing with 400.xx for the 10-series cards: quite a drop in fps in the first driver, which has yet to be fully rectified.
 
I think we will see a drop in MSRP; I just can't tell you where it will settle yet. Cards that aren't selling will drop in price, and it appears no one is buying the 2080 while 1080Tis are still available.
 
I've had around 180 different cards and brands through my hands since 1997, and I have only positive things to say about Inno3D; I've had a dozen models from them with no issues.
 
Damn, I was under the impression that the 2080 has faster RAM, and that the smaller amount of memory therefore doesn't matter.

Doesn't matter how fast it is: if the game needs to store 7GB of textures and you've only got 6GB of VRAM, you're going to encounter stuttering when the card has to temporarily move textures out of the buffer to make space for new ones.
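
A toy sketch of that eviction behaviour, assuming a simple least-recently-used policy (real driver and engine residency logic is far more sophisticated, and the texture names and sizes here are made up):

```python
from collections import OrderedDict

class VramPool:
    """Toy LRU model of a fixed-size VRAM texture pool (sizes in GB)."""
    def __init__(self, capacity_gb):
        self.capacity = capacity_gb
        self.resident = OrderedDict()  # texture name -> size in GB

    def request(self, name, size_gb):
        if name in self.resident:               # already in VRAM: cheap
            self.resident.move_to_end(name)
            return "hit"
        # Evict least-recently-used textures until the new one fits.
        while sum(self.resident.values()) + size_gb > self.capacity:
            evicted, _ = self.resident.popitem(last=False)
            print(f"  evicting {evicted} to make space (stutter risk)")
        self.resident[name] = size_gb
        return "miss"

pool = VramPool(capacity_gb=6)
for tex, size in [("area_a_textures", 4), ("area_b_textures", 3),
                  ("area_a_textures", 4)]:
    print(tex, "->", pool.request(tex, size))
```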
 

Indeed, and that's one of the reasons I am going for the 1080Ti. I already get stuttering in BF1 at 1440p and in other games, so I want to avoid that going forwards; the 1080Ti should do that.
 
I want to make it very clear, however, that you can often massively reduce VRAM usage by dialling down a few settings you probably can't even tell apart.

There's a great video on YouTube that covers it :)
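
As a rough illustration of why one texture-quality notch matters so much, here is a quick sketch assuming uncompressed RGBA8 textures with full mip chains (real games vary widely):

```python
def texture_mib(side_px, bytes_per_px=4, mips=True):
    """Approximate size of one square texture; a full mip chain adds ~1/3."""
    base = side_px * side_px * bytes_per_px
    return base * (4 / 3 if mips else 1) / 2**20

for side in (4096, 2048, 1024):
    print(f"{side} x {side}: ~{texture_mib(side):.0f} MiB each")
# Every texture-quality notch that halves resolution cuts VRAM use ~4x.
```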
 
Agreed. Often those Ultra/Maximum/Epic settings are only worth using for screenshots to "show off". When gaming, most won't spot the difference between High and Highest, or Epic/Ultra/Maximum.

It's also a matter of compression technology. If the card can compress and decompress textures fast enough, it can get away with less VRAM.
Also, what a game can store in VRAM and what it actually needs to run fluently are very different things. Some game engines will simply fill the VRAM buffer with everything they can, yet the game can run just as fluently on a card with less VRAM. VRAM usage is not an indication of what is actually needed to run properly.
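
To put rough numbers on the compression point: GPUs sample block-compressed formats natively, with BC1 storing each 4x4 pixel block in 8 bytes and BC7 in 16. A quick sketch of what that does to a single 4096x4096 texture:

```python
def bc_mib(w, h, bytes_per_block):
    """Size of a w x h texture in a block-compressed format (4x4 blocks)."""
    return (w // 4) * (h // 4) * bytes_per_block / 2**20

w = h = 4096
print(f"RGBA8 uncompressed: {w * h * 4 / 2**20:.0f} MiB")  # 64 MiB
print(f"BC7 (16 B/block):   {bc_mib(w, h, 16):.0f} MiB")   # 16 MiB, 4:1
print(f"BC1 (8 B/block):    {bc_mib(w, h, 8):.0f} MiB")    # 8 MiB, 8:1
```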
 
Agreed :)
 
I don't get why people keep mentioning DLSS. It really isn't something that is or will be "game changing". It's not bad, but people have already shown how you can get the same kind of quality and fps just by upscaling with TAA.

Now, ray tracing really is a great new feature with a great future, but at the moment it is clearly in the position PhysX was back in the day: it isn't out yet, and it isn't really viable for most gamers because of its clear fps issues. I imagine next generation we shall start to see a huge leap to playable fps, because 30fps gaming does not interest me.

Not to mention that if you are a 4K gamer, for example, I see no reason to buy a 2080 over the 1080Ti, as you will actually have a worse experience; even at 1080p it's better/equal/worse depending on the game. The price difference is also an issue at £120 more (at its cheapest; more likely £170, as most people buy higher-quality cards, or £360 if you buy the Asus Strix, haha). That money could go anywhere else in the build and give you a much better system. It's the difference between an 8600K and an 8700K, for example, and I cannot think of any application or game where the lower one would be better.
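
The TAA-upscaling comparison largely comes down to pixel counts. Here's a quick sketch of how much shading work common internal resolutions represent relative to native 4K (the internal resolutions are illustrative; Nvidia hadn't published DLSS's exact input resolution at this point):

```python
def shaded_fraction(internal, target=(3840, 2160)):
    """Fraction of the target resolution's pixels actually shaded per frame."""
    return (internal[0] * internal[1]) / (target[0] * target[1])

for res in [(3840, 2160), (2560, 1440), (1920, 1080)]:
    print(f"{res[0]} x {res[1]} internal -> "
          f"{shaded_fraction(res):.0%} of 4K pixel work")
```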
 

Which is why I am beginning to have doubts about DLSS on the Turing cards. At the moment you can disable a few settings, lower the detail levels, and still get the same image quality with lots of extra performance; you would need to analyse the frame in detail to see the differences. Now DLSS comes along and it's going to offer a 30% performance increase (supposedly), but it doesn't look as good as real 4K according to the video by Hardware Unboxed. So if it won't look as good, isn't that the same as turning down the detail levels on the 1080Ti at 4K so that it looks almost as good during gameplay?

Still waiting for a real game using DLSS to see exactly what's what.

But it's looking like ray tracing is going to be the only real difference between the 1080Ti and the 2080, and it's up to the buyer whether the £100+ extra is worth it for that.
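
For reference, taking that claimed ~30% uplift at face value, it translates to frame rate and frame time like this (just arithmetic, not measured data):

```python
def with_uplift(fps, uplift=0.30):
    """Frame rate and frame time before/after a claimed fractional uplift."""
    new_fps = fps * (1 + uplift)
    return new_fps, 1000 / fps, 1000 / new_fps

for fps in (40, 60):
    new_fps, old_ms, new_ms = with_uplift(fps)
    print(f"{fps} fps ({old_ms:.1f} ms) -> "
          f"{new_fps:.0f} fps ({new_ms:.1f} ms)")
```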
 
I don't get why people keep mentioning DLSS. It really isn't something that is or will be "game changing".

:D I've just finished writing my post saying nearly the same thing!!
 
The 2080 is better than the 1080Ti, but not by a lot... the 2080 is the safer bet for future performance... with 11GB vs 8GB of VRAM unlikely to make much of an impact at all in the next couple of years?

Depends what you mean by "performance". The 2080Ti can barely keep up at 1080p 60fps with ray tracing, so what do you think the 2080 will be capable of? As for DLSS, which games support it currently or plan to use it in the future? On the other hand, VRAM is something that matters right now and will affect you, especially at higher resolutions; 8GB isn't a lot when even my old R9 390 had that.

In times past you had a choice with graphics cards: older but faster cards that were a generation behind on DirectX, or newer but slower cards with the latest DirectX version. That hasn't been a thing for years now, so Nvidia are having to resort to inventing a reason to upgrade... with literally nothing that supports it either now or on the horizon.
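
On the resolution point: render targets alone scale linearly with pixel count, before you even count textures. A rough sketch with an illustrative buffer layout (real engines differ):

```python
def target_mib(w, h, bytes_per_px):
    """Memory for a single w x h render target."""
    return w * h * bytes_per_px / 2**20

# Illustrative per-frame buffers: HDR colour (8 B/px), depth (4 B/px),
# and two G-buffer layers (4 B/px each). Real engines differ.
buffer_bpp = [8, 4, 4, 4]
for w, h in [(1920, 1080), (2560, 1440), (3840, 2160)]:
    total = sum(target_mib(w, h, bpp) for bpp in buffer_bpp)
    print(f"{w} x {h}: ~{total:.0f} MiB in render targets alone")
```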
 
In my opinion, extra VRAM is more likely to be useful in the future than RT cores.

There are a few games already with ultra texture packs and whatnot that fill up the memory buffer on 11GB GPUs.

As has been said, you can already get DLSS-ish quality using other techniques; the advantage of DLSS is that it does it at virtually no cost to the normal GPU cores, so its purpose is basically to gain performance. I suppose from Nvidia's point of view they compared the performance gain from the AI cores running DLSS against adding extra standard cores, and the maths probably favoured DLSS, albeit only in supported games; but the clincher was probably that DLSS was judged to have more marketing value.

Also, someone on here mentioned in another post using DLSS to speed up RT on the 2070. I am going to go out on a limb and speculate that they are already doing this with RT on the 2080Ti, using DLSS to speed up the RT they have now, and that the DLSS cores are therefore essential to the RT they added.
 
That's a really interesting point. Maybe the RT and DLSS capabilities are connected, because if these were two excellent standalone features, Nvidia might have used them to hype up separate product releases and sell more cards. Launching them together implies either that they're somehow co-dependent, or that Nvidia needed both selling points to convince people to buy new cards. I don't think the raw performance is improved much over Pascal, certainly not enough to justify the price hike.

Maybe I'm reaching.
 
There are a few games already with ultra texture packs and whatnot that fill up the memory buffer on 11GB GPUs.

Which games?
 