
Is 4GB of VRAM enough? ExtremeTech article

I can second this! Though I do believe the FX has an appropriate amount of VRAM for its grunt as a single card. You're not wrong for having concerns, though! How's Mad Max, btw? Is that a recommended investment? :p Sorry for the off-topic, lol.

Very impressed with MM at the mo, and it only cost me £13 from a cheap key place.

Except when you sold one as it wasn't being utilised in games??? :p

:D

Spot on. Far too many games had ****ing awful scaling for me, some **** poor games would scale negatively with 3 cards over 2, and some games just stank horribly with multi-GPU full stop. Too many, in fact... ***** :D
 
I rest my (yellow) case :p :D
 
From a business side of things, yes! AMD and nVidia don't make multi-GPU users a priority, either in their driver focus or in how they design their GPUs. They focus on the people who will be their main customers.
They make them a higher priority than their likely share of the market would indicate. Nvidia, at least. I'm not too familiar with how well AMD supports their Xfire implementation, but I haven't heard great things.

Nvidia definitely advertises their SLI capabilities quite a bit, and while they probably don't have a massive 100-person team ensuring SLI support for every game out there, they provide enough support that you can't go around calling the users who take advantage of it 'irrelevant'. They are a small minority, but by no means irrelevant.
 
So where did you hear not-great things about Xfire?
Last I checked, AMD Xfire scaling is miles ahead of Nvidia SLI scaling. And if a game doesn't support Xfire, it doesn't support SLI either, since the game doesn't support multi-GPU at all.

It is absolutely funny how one day high-end cards are irrelevant to these companies' financials, but the next day SLI is normal and apparently most of us run it. What gets forgotten is that the majority of people who do use multi-GPU run mid-range cards, or higher-end cards once they are past their prime and cheap to buy. No one is ****ing money.
Look, you have people moaning about Titan X/980 Ti prices, or the 980 being high-end and out of someone's league, so I am sure pairs of those cards are even more niche than high-end cards already are.
SLI/Xfire makes sense on mid-range cards, where people who cannot afford cutting-edge cards extend their system's life by adding a second cheap GPU.

And neither Nvidia nor AMD advertise SLI/Xfire specifically. They mention that the feature exists, but they don't push it as a main selling point.
Anyone who actually has enough money to buy a high-end system will know what SLI/Xfire is without advertisements, and will decide for himself without Nvidia or AMD PR.
By the way, AMD talks much more about their Xfire scaling than Nvidia does, to be honest ;)

If Xfire/SLI setups were so common, we would see a lot more games with at least minimal SLI/Xfire support. A lot more.
 
Arkham Knight Beta 2 patch

New VRAM bar at 4K
[attached screenshot: VincentHannah_ArkhamKnightSettings_beta2.jpg]

To be fair it would probably have lower VRAM requirements without all the PhysX stuff enabled, so that might not affect 4GB AMD cards anyway.
 
To be fair, AMD drivers are not equal to Nvidia drivers. AMD were very vocal about this thing called magic, where they do a little something-something in their drivers and VRAM usage in games drops.
So if an Nvidia card is using a certain amount of VRAM, it does not necessarily mean that AMD cards will use the same amount at the exact same settings ;)

Also LOLed at the beta tag for Arkham Knight :D
 
I've run dual-card setups many a time and they're popular; 3 cards and above is when it becomes niche. Now that cards are becoming a lot more powerful, I'd run no less than 6GB cards as the bare minimum.
It all depends on what resolution you run.

I currently have a 1080p monitor, I run 1440p via VSR, and my Fury Tri-X is perfect for that.

I plan on moving to 3440x1440 at some point, and then it wouldn't be enough, but that doesn't matter, because when I do move to 3440x1440 I plan on moving to a 16nm HBM2 card as well.

So for me, for now, the 4GB of HBM on the Fury is ample. It would be for true 1440p as well.
 
If Nvidia had come out and run tests and said that 4GB was enough (back during the 980 and 290X 8GB days) I'm sure everyone would have dismissed it because it was Nvidia saying it and they're biased.
But when an AMD employee does the same thing it's suddenly taken as gospel? (No offence Matt)

These days I have to take everything Matt says (in public at least) with as much salt as I take anything Nvidia say (in public).

I also believe that at the time (4GB 980 and 8GB 290X) the argument wasn't just whether any current games used more than 4GB at 4K, but whether any future games would. That's what I think makes this question hard to answer.

I seem to recall that BF4 used more VRAM in Mantle than in DirectX 11. I'm not sure if any other games showed this behaviour. It's possible that with DX12 and Vulkan we could find VRAM usage going up just as easily as going down. If memory optimisation is now being left up to people like those that did the recent Batman console-to-PC port then we probably shouldn't assume they'll do a great job.
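
For anyone who wants to sanity-check the 4GB question, here's a rough back-of-the-envelope sketch in Python. The buffer counts and formats are made-up but plausible assumptions on my part, not any real engine's layout:

# Rough estimate of render-target memory at a given resolution.
# Buffer counts/formats below are illustrative assumptions only.
def render_target_mb(width, height, bytes_per_pixel, count=1):
    return width * height * bytes_per_pixel * count / (1024 ** 2)

for name, (w, h) in {"1080p": (1920, 1080), "1440p": (2560, 1440),
                     "3440x1440": (3440, 1440), "2160p": (3840, 2160)}.items():
    gbuffer = render_target_mb(w, h, 8, count=4)  # 4 fat G-buffer targets
    hdr = render_target_mb(w, h, 8)               # HDR colour buffer
    depth = render_target_mb(w, h, 4)             # depth/stencil
    print(f"{name:>9}: ~{gbuffer + hdr + depth:.0f} MB of screen-sized buffers")

Even at 4K the screen-sized buffers only come to a few hundred MB; textures, shadow maps and streaming pools eat the rest, and that's exactly the part that's down to the developer, which is why the Mantle/DX12 point above matters.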
 
I wish people would stop saying C/F scales better than SLI, because the honest truth is they are both as good, or as bad, as each other.

What is interesting is that both brands have a number of multi-GPU failures, but what stands out is that when AMD 4-way profiles fail, they really fail. A good example is the bench I posted earlier in the thread for Total War: Attila. This game is very poor on both brands, but the difference is this:

TitanXs @2160p maxed will give playable fps

Fury Xs @2160p maxed will give 1.8 fps

A single Fury X @2160p maxed will give 4.5 fps

The above figures show two things:

1. The Fury Xs run out of VRAM

2. Although performance is dreadful for the Fury X in both single and 4-way, there is also massive negative scaling at work.

A weak point for AMD has always been very bad 4-way scaling in most of the Total War games. Another example, where the AMD cards don't run out of VRAM, is RTW2: in that bench, maxed @2160p, 4 TXs can pass 100fps while 4 AMD cards struggle to reach 20fps.

Yet another new game where AMD has dreadful 4-way scaling is The Witcher 3, where you get:

2160p maxed 4 Fury Xs 26fps

2160p maxed 4 TitanXs 80fps

Having said all that, before the green team start relaxing, I can also find plenty of games where SLI is not up to the mark.

The bottom line is that C/F is no better than SLI for 4-way support and scaling, and both brands have their share of epic failures; this is one of the reasons I use both.
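
Just to put a number on that negative scaling, a quick bit of arithmetic (Python; the fps figures are the Attila ones quoted above, the efficiency definition is the usual fps-per-card one):

# Scaling efficiency: fps with N GPUs vs. N times the single-GPU fps.
# 1.0 = perfect scaling; anything below 0.25 with 4 cards means
# you'd literally have been faster on one card.
def scaling_efficiency(fps_multi, fps_single, n_gpus):
    return fps_multi / (n_gpus * fps_single)

print(scaling_efficiency(1.8, 4.5, 4))   # 0.10 for 4x Fury X in Attila
print(1.8 / 4.5)                         # 0.40x "speed-up" vs one card

An efficiency of 0.10, and a 0.4x 'speed-up' over a single card, is what massive negative scaling looks like in numbers.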
 
Some exceptions aside, multi-GPU support isn't "programmed" into a game in the way many people assume. Some games use features that don't play well with the way SLI and/or CF work (usually post-processing effects that need data from the current or previous screen buffer, etc.), and some happen to have a feature set that is fully compatible. The two technologies also have different levels of compatibility, so a feature that throws SLI out might work fine on CF without any tweaks, and vice versa. Some game developers will go back and tweak features to maximise compatibility, and/or nVidia or AMD will create profiles with workarounds where possible. So a game could work straight out of the box with SLI but not at all with CF, or vice versa, in some cases.
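
To make the previous-screen-buffer point concrete, here's a toy Python sketch of why alternate frame rendering (AFR) hates temporal effects. Purely conceptual; real drivers deal in memory heaps and copy engines, not little functions like these:

# Toy AFR model: even frames render on GPU 0, odd frames on GPU 1.
# A temporal effect needs last frame's buffer, which lives on the
# *other* GPU, forcing a copy/sync that eats the scaling benefit.
def afr_gpu_for(frame):
    return frame % 2

def needs_cross_gpu_copy(frame, uses_previous_frame):
    if not uses_previous_frame or frame == 0:
        return False
    return afr_gpu_for(frame) != afr_gpu_for(frame - 1)

for frame in range(4):
    print(f"frame {frame}: GPU {afr_gpu_for(frame)}, "
          f"needs cross-GPU copy: {needs_cross_gpu_copy(frame, True)}")

With two GPUs, every frame after the first needs the previous frame's buffer from the other card, so the synchronisation overhead eats the scaling you were supposed to gain; that's the sort of feature a developer tweak or driver profile has to work around.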

Oh and how is windowed mode CF working for people these days?

As someone with a very long history of multi-GPU experience on both sides, the amount of BS talked about multi-GPU is quite amusing sometimes; it's very obvious some very vocal people have no actual experience. (Still makes me laugh how much **** I got for calling out the frametime issues on 69xx CF setups, and then, when PCPer etc. got in the game, being proved completely right.)
 
I've heard it from speaking to lots of people about PC specs and hardware. The gist was that frametimes with AMD cards *tend* to be worse than with Nvidia cards in general, and that this is compounded in many dual-GPU setups, where the extra latency introduced by alternate frame rendering can result in more judder even if the framerate figures themselves look alright. Just what I've heard; I haven't tested it myself.

Anyways, Nvidia definitely does advertise SLI capability. They talk about it in their press conferences, it's on their main site, it's a selling point of certain motherboards, etc. They don't make it front and centre of their marketing, but it's definitely not an afterthought. You contradict yourself by saying that neither Nvidia nor AMD advertise their multi-GPU technology, then saying that AMD talks about it more than Nvidia; so it sounds like AMD advertise it as well.

I've also not said these users are super common, just that there are enough people taking advantage of it that they shouldn't be called irrelevant.

Lastly, supporting multiple GPUs is not some trivial task. There's work going on to provide this support at the engine level in the future, but for now it typically requires effort from both the developer and the GPU manufacturer for any given title, and there's no guarantee everything will work without hiccups.
 
Not so much the case these days, but originally it was a deliberate decision: ATI/AMD considered outright performance of "more value" to their customers. That made for an interesting situation where, say, a CF setup was getting 70fps and SLI only 58fps, but due to the frametime consistency differences the CF setup was effectively producing something comparable to low-50s fps, whereas the SLI setup was close to its displayed fps.
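
For anyone wondering how 70fps can "feel" like low 50s, a quick Python sketch of the frametime argument. The frame times here are invented to illustrate the stutter pattern, not measured from anything:

# Average fps hides stutter: two runs with the same mean frametime,
# one evenly paced, one alternating short/long frames (classic AFR
# runt-frame behaviour).
smooth = [14.3] * 10        # ~70 fps, evenly paced
juddery = [6.0, 22.6] * 5   # also ~70 fps on average

for name, times in [("smooth", smooth), ("juddery", juddery)]:
    avg_fps = 1000 / (sum(times) / len(times))
    worst = 1000 / max(times)   # fps implied by the slowest frames
    print(f"{name:>7}: avg {avg_fps:.0f} fps, paced like {worst:.0f} fps")

Both runs average ~70fps, but the juddery one paces like ~44fps; that gap is what the frametime analyses from PCPer and friends eventually measured.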
 
Fair enough, if that's the case and things are different now. Like I said, that comment wasn't based on personal experience, just anecdote, and it wasn't really that important to what I was trying to say anyway.
 
When people say scaling is better, I think they're talking about 2-way, as that's what we most often see in results; it's only the Captain Sensible types that go 4-way. The number of 4-way users is probably under 1,000 in the country, and I'd be surprised if it was even that high. Of course, I'm pulling that number out of my butt, so I could be very wrong, but I doubt it.
 
There are rumours circulating of NVIDIA dropping support for certain configurations anyway...

There is a lot to this that isn't as transparent as some would have you believe. Three- and four-way configurations just aren't supported to a standard that makes them worth recommending, and a lot of that doesn't show up in a three-day article with a single testing methodology. It takes results compared over a much wider timeline to get a sensible picture of why these configurations aren't worth the investment.

I am talking about a heavy usage period going back 4 to 5 years. When SLI was worth upselling by vendors, these things got the devotion they required; today is a different business. Talking about which vendor scales better than the other and listing the failings of both serves no purpose. The way of the dinosaur is edging closer. We all like to think there is a market for such configurations, but NVIDIA at least does not, and it won't be long before that becomes a reality.
 
With DX12 games coming along soon, multi-GPU support is more in the hands of the developers. It's going to be interesting to see how the big developers handle this, as it could make or break 3-4 way setups. I think 2-way will be well served, but we shall see.
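
Worth spelling out what "in the hands of the developers" means: with DX12's explicit multi-adapter, the engine itself enumerates the GPUs and decides how to split the work, rather than the driver doing AFR behind the scenes. A toy Python sketch of the idea (conceptual only; a real implementation would be D3D12 command queues, not dictionaries):

# Conceptual explicit multi-adapter: the *application* enumerates
# adapters and assigns work; no driver profile does it for you.
adapters = ["GPU0", "GPU1"]

def assign_passes(adapters, passes):
    # Trivial developer-chosen split: alternate render passes across
    # adapters. A real engine would balance by measured pass cost.
    schedule = {a: [] for a in adapters}
    for i, p in enumerate(passes):
        schedule[adapters[i % len(adapters)]].append(p)
    return schedule

print(assign_passes(adapters, ["shadows", "g-buffer", "lighting", "post"]))

If the developer never writes (and tests) that scheduling, the second card simply does nothing, which is why DX12 could make or break 3-4 way setups.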
 