****Official OcUK Fury X Review Thread****

Smaller devs depend on engines like Unity, so there's no reason why, when that and other engines get DX12, they can't port ongoing projects over.

It's not about having to start a project from fresh in DX12 when it's here; it's simply a matter of porting what you are already working on. Edit: well, it's not quite that simple.

The mechanics of it are very different from back when DX10 became DX11.

True, but how easy is it for a small developer who started their project on an older version of Unity that doesn't support DX12 to upgrade to the new version and continue working from there?

Gotta admit, though, that most games from smaller developers aren't that demanding on hardware, so they'll work out well enough on DX11 anyway. But there are always those few that are demanding.
 
DICE's Johan Andersson wanted DX12 for Star Wars Battlefront.
It's coming faster than people think.

Depends how quickly we see proper DX12 support, or developers just adding a basic framework for it so as to be able to tick the box. Adoption will likely be quicker overall with major engines moving to it fairly quickly, but IMO we are still at least a year away from seeing DX12 used broadly enough to show a clear split between the GPUs that are good at DX12 and those that aren't.
 
True, but how easy is it for a small developer who started their project on an older version of Unity that doesn't support DX12 to upgrade to the new version and continue working from there?

Gotta admit, though, that most games from smaller developers aren't that demanding on hardware, so they'll work out well enough on DX11 anyway. But there are always those few that are demanding.

It's probably not going to be that easy and it will take a bit of time, but crucially it can be done. Engine developers are even designing for it: they are already starting to prepare their users through incremental updates that move the engine towards how DX12 is architected.

These updates break your build so you have to fix it, then another comes along which breaks it again... slowly but surely your build is creeping towards DX12.

It's already been going on for some months :)
 
Hmmmm, he's pretty much claiming there that the Fury uses aggressive swapping in and out of system RAM when 4GB isn't enough - which means the limit is the 16GB/s PCI-e bus (or worse if you're not on a fully functioning PCI-e 3.0 pipeline) and the system memory bandwidth* - while trying to put the emphasis on how fast the HBM is, which makes no odds when it's not the weakest link in the chain.


* Just because you have, say, DDR3-2400 in dual channel rated at ~38GB/s doesn't mean that at any one time the system can pump 38GB/s at the GPU - so having DDR4/quad-channel memory setups (i.e. X99) could possibly be an advantage, as they'd more often have more bandwidth available to the GPU.
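
For rough numbers, here's a quick back-of-the-envelope sketch in Python of the theoretical peaks in play (the quad-channel DDR4-2133 line is just an illustrative X99 example, not anyone's actual setup):

```python
# Back-of-the-envelope peak-bandwidth figures (theoretical maxima; real
# transfers lose some of this to protocol overhead and contention).

# PCIe 3.0 runs at 8 GT/s per lane with 128b/130b encoding.
pcie3_lane = 8e9 * (128 / 130) / 8     # bytes/s per lane
pcie3_x16 = pcie3_lane * 16 / 1e9      # ~15.75 GB/s each direction

# DDR bandwidth = transfer rate (MT/s) x 8 bytes per channel x channels.
ddr3_2400_dual = 2400e6 * 8 * 2 / 1e9  # ~38.4 GB/s
ddr4_2133_quad = 2133e6 * 8 * 4 / 1e9  # ~68.3 GB/s (illustrative X99 build)

print(f"PCIe 3.0 x16:           {pcie3_x16:5.2f} GB/s")
print(f"Dual-channel DDR3-2400: {ddr3_2400_dual:5.1f} GB/s")
print(f"Quad-channel DDR4-2133: {ddr4_2133_quad:5.1f} GB/s")
```

Either way the PCI-e link is the narrow end of the funnel once the card spills out of its 4GB.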

Thanks for the explanation. :)
 
Depends how quickly we see proper DX12 support, or developers just adding a basic framework for it so as to be able to tick the box. Adoption will likely be quicker overall with major engines moving to it fairly quickly, but IMO we are still at least a year away from seeing DX12 used broadly enough to show a clear split between the GPUs that are good at DX12 and those that aren't.

Plausible

Mantle test
http://semiaccurate.com/2015/06/24/amds-radeon-goes-premium-with-the-fury-x/
 
Depends how quickly we see proper DX12 support, or developers just adding a basic framework for it so as to be able to tick the box. Adoption will likely be quicker overall with major engines moving to it fairly quickly, but IMO we are still at least a year away from seeing DX12 used broadly enough to show a clear split between the GPUs that are good at DX12 and those that aren't.

I'm not sure it will be that long - you're forgetting about Mantle, which is similar to Vulkan and DX12, and engine architects have stated they were using it as a springboard to get their DX12/Vulkan paths working well. Since they have that experience now, the transition might be quicker. Also, the free upgrade to Windows 10 makes it much more likely that there will be a significantly larger user base able to take advantage of DX12 than was the case with previous OSes and DX versions.
 
This is certainly quite impressive, particularly Civilization: Beyond Earth with a 43% improvement.

Mantle is superb and just shows what's in store for Windows 10 and DX12 gamers.
It seems AMD, with a highly programmable technology in Fury, is coming prepared for Windows 10.
 
1080p performance is poor to say the least.

[embedded benchmark videos]
LOL, so you posted three videos of The Witcher 3, an Nvidia GameWorks game??
Compared to the R9 390X it has a performance lead.

You do realise in the Crysis 3 video it is neck and neck with the Nvidia cards, right??

Edit!!

With HairWorks off it's not too far off a GTX 980 Ti either.
 
Waiting for DX12, a short while for them to sort out drivers and for prices to drop (on both the G1 and the Fury X), and for MSI to unlock the voltage, would be a smart move to make an informed choice.

Some Mantle benchmarks have shown a 5-20% performance increase over DX11; that, combined with potential overclocking of 10%, could see the Fury X blow the 980 Ti out of the water - but then again it may not.

While it's no fun waiting when I've got the itch to upgrade now, I want to make the smart choice... (until Pascal next year, when I'll end up doing the same thing again :p).
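
As a rough sketch of how those assumed percentages stack - the 5-20% and 10% figures above are the assumptions, and the gains compound multiplicatively rather than adding:

```python
# Rough compounding of the assumed uplifts: a 5-20% API gain over DX11
# and ~10% from overclocking multiply together rather than add.
api_gain_range = (1.05, 1.20)  # assumed Mantle/DX12-style uplift over DX11
oc_gain = 1.10                 # assumed overclocking headroom

for api_gain in api_gain_range:
    combined = api_gain * oc_gain
    print(f"{api_gain - 1:4.0%} API + 10% OC -> {combined - 1:5.1%} combined")

# Prints roughly:  5% API + 10% OC -> 15.5% combined
#                 20% API + 10% OC -> 32.0% combined
```

So best case is about a third more performance, worst case about 15% - hence the "but then again it may not".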
 
lol, there are more games tested - try watching it.

And yeah, 1080p performance sucks; if you can't see that you must be blind.

Wow, such EMO DORK RAGE!

But seriously, why is the 4K and 2560 performance within a whisker of the GTX 980 Ti with its 6GB of VRAM, in two Nvidia-sponsored games??

What a horrible performance collapse.

Is Nvidia working on better drivers to improve performance in those videos??
 
It's not me raging and it's not me that's posting BS. Deal with it.

You are RAGING - you posted 4 videos, and when you realised three did not show what you wanted them to, you quickly edited.

So you are admitting you posted "BS", as you described it?? 75% of it, going by your own description.

The fact that you are insulting people and swearing means you are annoyed and it's psychologically affecting you.

You're the one getting worked up, not me.

Do some more RAGING, it's funny!!

So sweet.
 
Waiting for DX12, a short while for them to sort out drivers and for prices to drop (on both the G1 and the Fury X), and for MSI to unlock the voltage, would be a smart move to make an informed choice.

Some Mantle benchmarks have shown a 5-20% performance increase over DX11; that, combined with potential overclocking of 10%, could see the Fury X blow the 980 Ti out of the water - but then again it may not.

Fury will own the 980 Ti down the line. :)
 
You are RAGING - you posted 4 videos, and when you realised three did not show what you wanted them to, you quickly edited.

So you are admitting you posted "BS", as you described it??

The fact that you are insulting people and swearing means you are annoyed and it's psychologically affecting you.

You're the one getting worked up, not me.

Do some more RAGING, it's funny!!

So sweet.

I edited nothing, and

Wow, such EMO DORK RAGE!

I don't insult people, so who's the one worked up?

Now I suggest you drop the BS and keep on topic.
 
I edited nothing, and

I don't insult people, so who's the one worked up?

Now I suggest you drop the BS and keep on topic.

MORE RAGING and swearing.

Such tetchiness. You really get irritable fast, don't you? Bless.

It's really psychologically affecting you.

Want to have the last word then??

Get it in, then you can calm down.

Plus, I am the one updating the OP too.

BAHAHAHHAHAHAHHAHAHHAHAHHAHAHHA!
 
Wow, such EMO DORK RAGE!

But seriously, why is the 4K and 2560 performance within a whisker of the GTX 980 Ti with its 6GB of VRAM, in two Nvidia-sponsored games??

What a horrible performance collapse.

Is Nvidia working on better drivers to improve performance in those videos??

:confused: Let's focus on 1080p when the majority of users will be running 1440p and upwards on their £500+ cards. Also, these judgements Raven is pushing are based on immature drivers with no voltage control, and on Nvidia-promoted games. :D

Let him do what he's doing; logic will prevail.
 