The Fury(X) Fiji Owners Thread

Yeah, but remember AMD's lower clock speeds overall. If you take that into account, the percentage difference isn't much. There is a difference, Maxwell does overclock better, that's pretty much a given, but it's hardly as big a deal as some people like to make out.

But you can't really compare clock for clock, and if you did, Maxwell would look insane lol. Don't get me wrong, the Fury X is a great card; it runs cool and quiet and saved me from putting a block on it. It has actually turned me into a fan of AIO coolers, and I wasn't expecting that (I always watercool). I don't really consider 78MHz to be an overclocker's dream either, and that was something said at E3 by Macri, I believe.
 
I don't really consider 78MHz to be an overclocker's dream either, and that was something said at E3 by Macri, I believe.

I agree, at stock volts it's pretty pathetic; I was expecting more. I mean, I can get 1120 out of my 290s at stock volts... most Fury X reviews I see are around 1120-1150 on stock volts... not much of an improvement at all. I'm hoping unlocked voltage can help. If they over-engineered the power delivery etc. like they said, hopefully it can help a lot, but right now it's pretty meh from AMD... What I'm more interested in, though, is memory overclocking. If what we have seen is correct, 600MHz on the memory is a huge performance increase... kinda funny, since overclocking memory generally doesn't help much.
 
No doubt with driver updates, custom BIOSes and voltage control, the Fury X will become more exciting for overclocking/benching.

But right now Titan X is king for overclocking and benchmarking.
 
Agreed with both chaps. Better drivers or even Windows 10 will help the Fury X shine more. New lead coming tomorrow, so fingers crossed I can get stuck into some 1440p testing. 1080p is so meh lol :D
 
Win 10 + new drivers have way less overhead; there are many videos showing this. Project CARS is a good example I saw: 90 fps in Windows 8 compared to 140 in Windows 10.

Project CARS has a patch for DX12 stuff.
30-40% boost.
Can't have that with Witcher 3 though, sadly.

edit: Win 10 overall will be better for us PC gamers tho
 
I think people expecting huge gains from DX12 are going to be disappointed unless they are running an older/slower CPU. As seen in my BF4 video, frame rates are pretty much the same with Mantle and DX11.
 
I think people expecting huge gains from DX12 are going to be disappointed unless they are running an older/slower CPU. As seen in my BF4 video, frame rates are pretty much the same with Mantle and DX11.

I think we'll start seeing the big difference once games are built from the get-go with DX12 in mind. So far, Mantle has just been patched in.

But yeah, I do* understand where you're coming from.

Edit
Sorry Greg, that wasn't meant to say "don't"!
 
I think we'll start seeing the big difference once games are built from the get-go with DX12 in mind. So far, Mantle has just been patched in.

Maybe, but yeah, I don't understand where you're coming from.

I tend to think this as well. Once games are written from the ground up for DX12, we may see the real improvements.
 
Async compute will make quite a big difference. But the real benefit we'll see in the first games is simply that they will almost certainly be GPU-bound 100% of the time.

Games with better effects will follow after a period of time. I really can't wait to see how scenes with 100,000+ draw calls look compared to the current 5-20k, for starters. Lighting and explosions especially are going to get tons more detail.

A game like Project CARS would benefit massively from DX12 on AMD; in its current state, it's hard to get over 50% GPU utilisation on DX11.

Greg, you got Pcars?
 
I think we'll start seeing the big difference once games are built from the get-go with DX12 in mind. So far, Mantle has just been patched in.

But yeah, I do* understand where you're coming from.

Edit
Sorry Greg, that wasn't meant to say "don't"!

I tend to think this as well. Once games are written from the ground up for DX12, we may see the real improvements.

Yeah, could be right about games being patched in. I did manage a run of Sniper 3 with DX12 and Mantle, but I had flickering with Mantle (which is lead related), though there was an improvement.

Edit:

Dygaza, I do have PCARS and it's one of my most played games, but I haven't been on it with the FX yet.
 
I tend to think this as well. Once games are written from the ground up for DX12, we may see the real improvements.

Once they start using better multi-GPU methods than AFR, plus asynchronous shaders, DX12 etc. will gain far better performance.

Async compute will make quite a big difference. But the real benefit we'll see in the first games is simply that they will almost certainly be GPU-bound 100% of the time.

Witcher 3 and Batman (whenever they get it working, that is) will be patched for Win 10 / DX12 as well. I wouldn't expect huge gains, but they'll be there.

All hail Windows 10 and DX12, as Fury will be great there.
 
Well, so far my flagship NVIDIA card clocks 200MHz over boost clocks and my flagship AMD cards clock 78MHz over base clocks, both on stock volts....

Unless you have that NVIDIA card under a decent watercooling loop, its frequency will drop significantly once it gets hot in graphically demanding games. The Fury X will stay at max clock the entire time, provided it's being loaded.
 
GPU Boost 2.0 is predetermined by ASIC quality, so there's no way he could answer that correctly, as boost clocks vary from card to card. Not to mention, what he's saying isn't true anyway, as it's quite easy to set up a fan curve to stop the card dropping off boost. Maxwell rocks.
 
GPU Boost 2.0 is predetermined by ASIC quality, so there's no way he could answer that correctly, as boost clocks vary from card to card. Not to mention, what he's saying isn't true anyway, as it's quite easy to set up a fan curve to stop the card dropping off boost. Maxwell rocks.

Absolutely, and even with mine still on the stock cooler, it doesn't drop MHz when I use a 1:1 fan profile.
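For anyone wondering what a 1:1 fan profile actually means: it just maps GPU temperature in °C straight onto fan speed percent, clamped to the fan's usable range. A minimal sketch in Python (the 20% idle floor is my own assumption for illustration, not any vendor default):

```python
def fan_percent(temp_c, floor=20, ceiling=100):
    """1:1 fan profile: fan speed % equals GPU temperature in C,
    clamped between a minimum idle speed and 100%."""
    return max(floor, min(ceiling, temp_c))

# A few points along the curve: at 65C the fan runs at 65%, etc.
for t in (30, 65, 80):
    print(t, "->", fan_percent(t))
```

The trade-off is noise: the fan ramps as fast as the temperature does, which is exactly why it keeps the card from dropping off boost under sustained load.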
 