Poll: Will you be buying a 2080Ti/2080/2070?

Which card will you be buying?


  • Total voters: 1,201
  • Poll closed.
If you have £300k in savings but no monthly disposable income, how many 2080 Tis can you afford?

Also, do you think someone who has £300k of disposable income a year would think twice about a 2080 Ti, or wouldn't already have a Titan?
I was just pointing out that the other poster was probably talking about savings, not disposable income, as your post seemed to suggest. There's a big difference.

If someone has £300k in savings but no disposable income, it might be that they could afford lots of GPUs on the face of it but could really justify few or none, at least if they're any good with money, and they probably are, hence why they have £300k in the first place :D. They're going to be careful with how they spend that money, with zero extra disposable income. And of course someone with £300k a year of disposable income wouldn't think twice.
 
I wonder, does NVLink require the game to be aware like SLI, or does adding cards just double the available performance?

The latter would be awesome :)

If NV Fabric became fully supported in consumer hardware, the cards could be viewed as one unit, but as yet it's not clear what will be supported or how...
 
If NV Fabric became fully supported in consumer hardware, the cards could be viewed as one unit, but as yet it's not clear what will be supported or how...

Only for compute-like loads with current architectures. For gaming you still have to use some form of the technologies that underpin SLI/CF, with their disadvantages.
 
If NV Fabric became fully supported in consumer hardware, the cards could be viewed as one unit, but as yet it's not clear what will be supported or how...

Interesting, so does it require specific motherboard and BIOS support to be fully supported?

Obviously an ideal-world solution would just require you to add more cards, NVLink/bridge them, and maybe tick an option in the driver to treat them as one unit instead of separate cards.
 
Waiting for benchmarks, but if the 2080 Ti does turn out to be circa 40% better than the 1080 Ti then I'll be buying one and finally making the move to 4K gaming.
 
Only for compute-like loads with current architectures. For gaming you still have to use some form of the technologies that underpin SLI/CF, with their disadvantages.

Supposedly everything is addressable through CUDA, but it remains to be clarified how it works. It'll be interesting to see how it pans out for RTX cards.

Interesting, so does it require specific motherboard and BIOS support to be fully supported?

Obviously an ideal-world solution would just require you to add more cards, NVLink/bridge them, and maybe tick an option in the driver to treat them as one unit instead of separate cards.

Only with full support bypassing the PCIe lanes, the current example being the big boards with IBM CPUs.
 
Supposedly everything is addressable through CUDA, but it remains to be clarified how it works. It'll be interesting to see how it pans out for RTX cards.

It isn't just about having addressable memory or expanded CUDA support. With current architectures, when processing game data each GPU either has to wait until the other GPU has finished whatever it is doing, or they have to do some semaphoring that is costly in performance terms (when you are processing things in milliseconds or less). The only way around that with current architectures is if a developer implements their game using explicit multi-adapter: with intimate knowledge of how their game engine operates and what the rendering workload is comprised of, they can avoid anything costly and farm out the work in the most efficient manner.

Being able to access memory much more directly and quickly might make it possible to get some kind of SLI scaling (somewhere around average percentage gains) out of some games that traditionally haven't worked at all, because the memory operations destroyed any performance boost. The ability to shortcut transactions between the GPUs does add a small performance bonus, but that would be in the region of single-digit percentages.
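To put rough numbers on the synchronisation point above, here's a back-of-the-envelope sketch. All figures are illustrative assumptions, not measurements from any real GPU: it just models how a fixed per-frame sync cost eats into the scaling you'd otherwise get from splitting render work across two cards.

```python
# Illustrative model only: a fixed per-frame synchronisation cost
# limits two-GPU scaling. All numbers are assumptions, not measurements.

def two_gpu_fps(single_gpu_frame_ms: float, sync_overhead_ms: float) -> float:
    """An idealised split halves the render work per GPU,
    but every frame still pays the synchronisation cost."""
    frame_ms = single_gpu_frame_ms / 2 + sync_overhead_ms
    return 1000.0 / frame_ms

single = 1000.0 / 60  # one GPU managing 60 fps (~16.7 ms/frame)

print(two_gpu_fps(single, 0.0))  # no sync cost: perfect scaling, 120 fps
print(two_gpu_fps(single, 2.0))  # 2 ms of semaphoring: ~97 fps, not 120
```

The gap between the two numbers is why explicit multi-adapter matters: a developer who knows the workload can schedule it to avoid most of that fixed cost, while a generic driver-level split cannot.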
 
I'm replacing my 980 with a 2080 this side of Christmas. I had already decided after the release of the 1080 that I'd skip that gen. I don't see the need for the Ti, but the 2080 should see me skip through a couple of generations at least, or to the POF.

I'd like my system to be ready for a 32"+, 4K, 120Hz+, 100% RGB monitor.
 
Oh really? Bugger... tbh, I've not found a monitor of that spec anyway. I need 100% RGB for my photography; I'd be happy to drop the res for games.
Well, to get 4K 120Hz you're going to need to match that with 120fps+, and at the moment the 2080 Ti is expected to get 60-120fps at 4K. The 2080, if on par with the 1080 Ti, will be around 40-70fps at best. These figures are usually quoted at higher in-game settings, though, so I'd imagine that by turning settings down it could be done easily on the 2080? Really depends.
 
I had my money set aside for a while for this upgrade, coming from a 970 to a 2080 (didn't have enough saved for the Ti). Should do me well for a few years!
 