
** DUAL GPU GREATNESS & 16GB RAM!! **

The 16GB memory part is misleading. That card can never use more than 8GB.

Might as well sellotape an 8GB stick of RAM on it and call it a total of 24GB memory. :D

But doesn't DX12 have a way that game developers can make the memory on dual GPU systems be accessed independently, so in this case 16GB would be available, rather than 8GB in DX11 and lower?
 
The power usage should be about 800-850W if they're recommending a 1000W PSU. Titan X SLI takes about 750-800W for comparison.

The price is pretty good considering you would get very good 4K performance for less than a Titan X and comparable to an overclocked 980 Ti.

edit: didn't see that you get a Razer Ouroboros mouse worth about £100 too. Great deal, which makes the card about the same price as a 980 Ti.
 
But doesn't DX12 have a way that game developers can make the memory on dual GPU systems be accessed independently, so in this case 16GB would be available, rather than 8GB in DX11 and lower?

I think so, though ultimately I believe it's down to the developer.
 
The 16GB memory part is misleading. That card can never use more than 8GB.

Might as well sellotape an 8GB stick of RAM on it and call it a total of 24GB memory. :D

It really isn't misleading at all; physically the card DOES have 16 gigs of VRAM on it. It's up to the end user to know that an SLI or CrossFire single-card implementation ends up with the RAM being mirrored, so it's essentially half what the box states. And that's what reviews etc. are for :)
 
But doesn't DX12 have a way that game developers can make the memory on dual GPU systems be accessed independently, so in this case 16GB would be available, rather than 8GB in DX11 and lower?

Yes to the first part but no to the second part. Two GPUs rendering more or less the same scene, or parts thereof, will still need access to mostly the same textures. There might be some savings, but it will be nowhere near an effective doubling of usable VRAM.
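As a rough back-of-the-envelope sketch (not based on any real driver or engine behaviour), here's the arithmetic behind that point: if a fraction of assets must be duplicated on every GPU, usable memory only approaches the combined total when that fraction is near zero. The function and numbers are hypothetical illustrations.

```python
def usable_vram_gb(per_gpu_gb, shared_fraction, n_gpus=2):
    """Estimate usable VRAM when a fraction of assets (textures,
    geometry) must be duplicated on every GPU.

    shared_fraction = 1.0 -> full mirroring (classic SLI/CrossFire):
    usable memory equals a single GPU's pool.
    shared_fraction = 0.0 -> fully independent allocations: pools add up.
    """
    # Duplicated assets occupy the same space on every GPU, so they
    # only count once; unique allocations can fill the rest of each pool.
    shared = shared_fraction * per_gpu_gb
    unique = (1.0 - shared_fraction) * per_gpu_gb * n_gpus
    return shared + unique

# Classic mirrored setup: the "16GB" card behaves like 8GB.
print(usable_vram_gb(8, 1.0))   # 8.0
# Hypothetical DX12 title still duplicating 75% of its assets.
print(usable_vram_gb(8, 0.75))  # 10.0
# Only fully independent allocations would reach the full 16GB.
print(usable_vram_gb(8, 0.0))   # 16.0
```

So even a game that splits a quarter of its allocations between the GPUs only gets from 8GB to roughly 10GB usable, which matches the point above about savings being well short of a doubling.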
 
But doesn't DX12 have a way that game developers can make the memory on dual GPU systems be accessed independently, so in this case 16GB would be available, rather than 8GB in DX11 and lower?

They can, but if we're being realistic about it, by the time such an approach becomes commonplace this card will be long obsolete.
 
A bit surprised they have only used the 390 and not the 390X.
Maybe AMD didn't allow them to because it might have shown up the Fury X2 in certain situations lol.
 
It really isn't misleading at all; physically the card DOES have 16 gigs of VRAM on it. It's up to the end user to know that an SLI or CrossFire single-card implementation ends up with the RAM being mirrored, so it's essentially half what the box states. And that's what reviews etc. are for :)

It's misleading for sure. Like I say, if you sellotape an 8GB stick of RAM onto it, it would "physically have 24 gigs of RAM"; it would just be up to the end user to see the sellotape.
 
Anyone on this board, and whoever is going to spend that much, should know what this card is. It's only misleading to non-enthusiasts.

You have no idea how often people come on and ask that very question or say that they bought an X2 card "because it has double the ram".
They should know better, but they don't always.
 
It's misleading for sure. Like I say, if you sellotape an 8GB stick of RAM onto it, it would "physically have 24 gigs of RAM"; it would just be up to the end user to see the sellotape.

As said, anyone that's going to buy this card would already know about the mirroring done in single-card CrossFire configs. The card does have 16 gigs onboard, so it's advertised correctly.
 
More or less every dual GPU card has been marketed this way, just be thankful that they don't add the clock speeds together like some not so reputable CPU resellers do. :D
 
Every dual GPU card from either company has been marketed this way, so this is nothing new. People on here will moan about anything these days.

More or less every dual GPU card has been marketed this way, just be thankful that they don't add the clock speeds together like some not so reputable CPU resellers do. :D

Beaten to it.

You have no idea how often people come on and ask that very question or say that they bought an X2 card "because it has double the ram".
They should know better, but they don't always.

Yeah, I had to tell a guy rocking two GTX 690s a few months back that he never had 8GB of usable VRAM. This guy has been gaming on PCs for a good 15-20 years, so he should know better. It's up to the end user to do their research before buying, and to me, if they get burned it's their own fault. How hard is it to do research these days, or come to a place like this and ask people some questions?
 