Should Nvidia drop SLI? Has it had its day?

Have you seen my specs? What idiot buys a pair of Fury X's unless they're running 4k? lol

Haha. Nope did not look :p

But Elite Dangerous could use the extra grunt in my case when I play it at 4K. That is with highest settings except for AA though :D

Then there is Fallout 4, which did not work with CrossFire, and the latest StarCraft 2 didn't work with CrossFire either. In-game it was fine with a single GPU, but the cut scenes (which are rendered in-engine, not videos) needed more grunt. The funny thing was that when CrossFire was enabled, the opposite was true :)
 
Yes, they should drop SLI; games are woefully lacking with regards to SLI support. I realised this last year, sold my SLI cards and bought a Titan X. Nvidia like to give the impression they are all for SLI, but this is wrong; they just want us to buy more cards.
 
I've not used SLI so I can't judge it. I can say, however, that multi-GPU gaming really needs to be supported if they are going to start pushing VR, where framerate is going to be so important. VR is going to take a lot of grunt to power, and they are going to need multi-GPU if they can't pull something amazing out of the bag with single cards. I'm optimistic about DX12, but realistically we might not see tangible benefits for a few years as engines get adapted to make better use of it.
Well, dual GPU support in VR is not the same as how SLI works. So, on the plus side, it should actually be easier to do than normal SLI. On the negative side, it means it is extra work that has to be done *on top of* traditional SLI support (assuming a game that can be played with or without VR). There's a rough sketch of the difference at the end of this post.

But in general yes, supporting and marketing dual GPU solutions should still remain a priority for both Nvidia and AMD, especially because of DX12 and VR.

I also think people just need to understand what they're getting into. People buy dual (or more) GPUs and then complain when it isn't widely supported or doesn't work perfectly. But it's a niche setup and it is NOT easy to get working properly. It typically involves a lot of digging into the drivers on Nvidia's part and a lot of cooperation with developers. Sometimes it's easier than others, where an existing SLI profile happens to work well with another game, but this won't always be the case by any means.
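Here's a rough sketch of the difference (plain C++, made-up names like GpuQueue and submit_frame_afr, not a real graphics API): traditional AFR SLI alternates whole frames between the two cards, while a VR split renders both eyes of the *same* frame in parallel, one per GPU.

```cpp
// Rough conceptual sketch only; GpuQueue and the submit_* functions are
// made-up names for illustration, not a real graphics API.
#include <cstdio>

struct GpuQueue { int id; };

// Traditional SLI (alternate-frame rendering): whole frames alternate between
// the GPUs. Anything that reads last frame's result now has to fetch it from
// the *other* card, which is where a lot of SLI pain comes from.
void submit_frame_afr(GpuQueue gpus[2], int frame) {
    const GpuQueue& gpu = gpus[frame % 2];
    std::printf("AFR: frame %d -> GPU %d (whole frame)\n", frame, gpu.id);
}

// VR-style split: both eye views of the same frame are rendered in parallel,
// one per GPU, then composited. There is no frame-to-frame hand-off, which is
// why it can be simpler than AFR, but it is still separate work from a game's
// normal SLI path.
void submit_frame_vr(GpuQueue gpus[2], int frame) {
    std::printf("VR:  frame %d, left eye  -> GPU %d\n", frame, gpus[0].id);
    std::printf("VR:  frame %d, right eye -> GPU %d\n", frame, gpus[1].id);
}

int main() {
    GpuQueue gpus[2] = {{0}, {1}};
    for (int frame = 0; frame < 3; ++frame) submit_frame_afr(gpus, frame);
    for (int frame = 0; frame < 3; ++frame) submit_frame_vr(gpus, frame);
    return 0;
}
```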
 
I'd rather see well-built engines and games that can use multi-GPU well, as many games can. This is mostly a bad software issue.

Hopefully DX12 makes it easier for developers to implement good support.
 
This is mostly a bad software issue.
It's really not. Devs only have so much access to the workings of the graphics drivers. If they run into issues, there's often little they can do except email Nvidia and say, "Yo, this ain't working, help me out", and then hope Nvidia can actually find a way to help. It's difficult because this sort of communication is extremely inefficient compared to solving an issue in-house, or even elsewhere within the same company.

DX12 *could* be better because drivers aren't nearly as important. Devs will have a lot more access to how the hardware communicates and how the GPU tasks are performed. The downside is that it's going to be extra work, so I still wouldn't expect any kind of universal support, but hopefully we'll see fewer 'problematic' implementations, as the devs can track down issues more easily and actually do something about them without being at the mercy of Nvidia/AMD.
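As a very rough illustration of what "more access" means, here's a minimal DXGI sketch (my own example, Windows-only, built against dxgi.lib, not anyone's actual engine code). The app enumerates every adapter itself; under D3D12's explicit multi-adapter model it would then create a device per adapter and schedule the work across them directly, instead of relying on a driver SLI profile.

```cpp
// Minimal adapter-enumeration sketch. Everything past the printout (creating a
// D3D12 device per adapter, splitting the work) is left as a comment, because
// the point is only who gets to make those decisions under DX12: the app, not
// the driver.
#include <windows.h>
#include <dxgi.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory1> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        // From here the app could call D3D12CreateDevice() once per adapter and
        // copy/split work between the resulting devices itself.
        std::wprintf(L"adapter %u: %s (%llu MB dedicated VRAM)\n", i, desc.Description,
                     (unsigned long long)(desc.DedicatedVideoMemory >> 20));
    }
    return 0;
}
```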
 
It's really not. Devs only have so much access to the workings of the graphics drivers.

Please explain how Nvidia are supposed to fix bad-practice rendering techniques (employed by the developer) which, for example, calculate effects temporally rather than for each individual frame, such as the lighting in Arkham Knight? There's a sketch at the end of this post of why that sort of thing breaks AFR.

Developers need to be more careful when squeezing every last frame out of underpowered consoles, and consider that these optimizations can have negative repercussions in the PC release.
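Here's that sketch of why a temporally accumulated effect fights AFR (plain C++, made-up names, not a real engine or API): each frame's lighting reuses the previous frame's buffer, and under AFR that buffer was produced on the other GPU, so the driver has to copy it across or serialise the cards.

```cpp
// Conceptual sketch only: LightingBuffer and render_lighting are made-up
// names used to show the frame-to-frame dependency, not real API calls.
#include <cstdio>

struct LightingBuffer { int produced_on_gpu; };

LightingBuffer render_lighting(int gpu, const LightingBuffer* previous) {
    // A temporal effect reads last frame's result. Under AFR that result
    // usually lives on the other GPU, forcing a copy or a stall.
    if (previous && previous->produced_on_gpu != gpu)
        std::printf("  GPU %d needs GPU %d's buffer -> inter-GPU copy/stall\n",
                    gpu, previous->produced_on_gpu);
    return LightingBuffer{gpu};
}

int main() {
    LightingBuffer prev{};
    const LightingBuffer* prev_ptr = nullptr;
    for (int frame = 0; frame < 4; ++frame) {
        int gpu = frame % 2;                    // AFR: alternate GPUs per frame
        std::printf("frame %d on GPU %d\n", frame, gpu);
        prev = render_lighting(gpu, prev_ptr);  // reuses last frame's lighting
        prev_ptr = &prev;
    }
    return 0;
}
```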
 
Unless Nvidia somehow made single cards perform about 80% better to compensate, it would be a massive blow to the PC gaming industry.
 
SLI support has already been dropped, by me. And also by plenty of other people who got sick of buying 2 cards and getting the performance of one card in that latest game.

In actual fact, I had 2 copies of that latest game because the "SLI ready" card manufacturer gifted me a game that wasn't SLI ready with each card.

Nvidia can continue to "support" SLI for as long as they like, but I won't. If a game can't run at 60fps @ my current resolution on the fastest single card available then there can only be two solutions:

1. Drop down a resolution.
2. Don't buy the game.

As I game on a ROG Swift paired with a 980Ti... my choice will be option 2. I certainly won't be buying more 980Tis to achieve the same frame rate, or even worse... ghosting, flickering, juddering and whatnot.
 
Keep EET!!

As mentioned above, until we can get 60fps at 4K on a single card, it's a necessity.

What about when 120Hz 4K turns up? (Although it will need new DisplayPort bandwidth, so newer cards.)
 
Keep EET!!

As mentioned above, until we can get 60fps at 4K on a single card, it's a necessity.

What about when 120Hz 4K turns up? (Although it will need new DisplayPort bandwidth, so newer cards.)

I'm no monitor boffin, but I'm pretty sure they could do 120Hz monitors now if they wanted to. If you recall, early 4K displays couldn't be fed enough bandwidth by a single DisplayPort stream to run at 60Hz (otherwise it was 30Hz at 4K), so they had to be driven as two tiles. As a result Windows detected the screen as 2x 1920x2160 panels side by side (multi-stream transport/MST). I can't see why they couldn't use MST to drive a 4K panel at 120Hz with two streams at 60Hz each.
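Back-of-envelope numbers only (approximate, ignoring blanking and protocol overhead): 4K at 60Hz fits comfortably within one DisplayPort 1.2 link, but 4K at 120Hz does not, which is why it would need two streams (MST) or a faster link.

```cpp
// Rough bandwidth arithmetic; real timings add blanking and overhead, so
// treat these as ballpark figures rather than spec-exact values.
#include <cstdio>

int main() {
    // DP 1.2: 4 lanes x 5.4 Gbit/s raw, 8b/10b encoding -> ~80% usable.
    const double dp12_effective_gbps = 4 * 5.4 * 0.8;   // ~17.28 Gbit/s
    const double bits_per_pixel = 24.0;                  // 8-bit RGB

    auto rate_gbps = [&](double hz) {
        return 3840.0 * 2160.0 * bits_per_pixel * hz / 1e9;  // pixel data only
    };

    std::printf("DP 1.2 effective link: %.2f Gbit/s\n", dp12_effective_gbps);
    std::printf("4K @  60 Hz: %.2f Gbit/s -> fits on a single stream\n", rate_gbps(60));
    std::printf("4K @ 120 Hz: %.2f Gbit/s -> needs two streams or a faster link\n", rate_gbps(120));
    return 0;
}
```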
 
It's really not. Devs only have so much access to the workings of the graphics drivers. If they run into issues, there's often little they can do except email Nvidia and say, "Yo, this ain't working, help me out", and then hope Nvidia can actually find a way to help. It's difficult because this sort of communication is extremely inefficient compared to solving an issue in-house, or even elsewhere within the same company.

DX12 *could* be better because drivers aren't nearly as important. Devs will have a lot more access to how the hardware communicates and how the GPU tasks are performed. The downside is that it's going to be extra work, so I still wouldn't expect any kind of universal support, but hopefully we'll see fewer 'problematic' implementations, as the devs can track down issues more easily and actually do something about them without being at the mercy of Nvidia/AMD.

The reason they have to use hacky driver-level workarounds to get multi-GPU working is also a software issue: DX11 and its predecessors don't really support multi-GPU either, despite DX10 and DX11 both being released well into the modern multi-GPU era.

That said, drivers are also software, and of late it seems Nvidia's are not as up to scratch as they could be either. I think that boils down to the driver trying to do too much; it's become a huge piece of software. Again (hopefully) this is where things like DX12 come in, leaving the driver as a small, slim piece of software that just lets you access the GPU.

Either way we're going off topic. Multi-GPU is here to stay and should be. You don't have to buy it; it's there as a high-end optional extra for those who want to do something crazy, like play at a ridiculous resolution/FPS without sacrificing image quality. Otherwise, if you want to stick to a single GPU, you have to make a sacrifice somewhere: res, IQ or FPS.
 