MSI Afterburner v4.2.0 Released

My Fury Pro seems to run 1440p via DSR fine, and that's without FreeSync. Sure, some games need a few tweaks, but they look as good as they can.

The settings that cost the most performance seem to have only a small visual effect anyway, so turning them down doesn't make much of a difference here. A single Fury X should fly at native 1440p.
 
I'm just thankful that now we can get 980 Ti OC results vs Fury X OC results rather than having to constantly point out that the Fury X can't be properly overclocked yet.

I've been running dual cards for ages now (570 SLI, 670 SLI, 7950 CF, 290 CF, 980 SLI and Fury X CF) and there is definitely a part of me that would like to go back to one card. The problem for me is I don't want to feel like I'm going backwards either, so a single card needs to be an improvement over my existing mGPU setup or it feels like I'm spending money to get less.
 
With all this voltage control talk, it makes me wonder if Fury users could actually undervolt their cards and see lower temps and lower energy bills.
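On the undervolting point, a rough back-of-the-envelope sketch: dynamic power in CMOS scales roughly with V² × f, so even a modest voltage drop at the same clock cuts power noticeably. This is a first-order model only (it ignores static leakage), and the voltage figures below are purely illustrative, not measured Fury values:

```python
# First-order model: dynamic power scales with V^2 * f.
# Ignores static leakage, so real savings will differ;
# the voltage figures below are illustrative, not measured.

def dynamic_power_ratio(v_new, v_old, f_new, f_old):
    """Ratio of new dynamic power to old under the V^2 * f model."""
    return (v_new / v_old) ** 2 * (f_new / f_old)

stock_v, undervolt_v = 1.20, 1.10   # volts (hypothetical values)
clock = 1050                        # MHz, unchanged: undervolt only

ratio = dynamic_power_ratio(undervolt_v, stock_v, clock, clock)
print(f"Dynamic power at {undervolt_v:.2f} V: about {ratio:.0%} of stock")
```

With these hypothetical numbers, a 0.1 V undervolt at the same clock already predicts roughly 16% less dynamic power, which is where the lower temperatures and energy bills would come from.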
 
I'm just thankful that now we can get 980 Ti OC results vs Fury X OC results rather than having to constantly point out that the Fury X can't be properly overclocked yet.

Those results were obvious on day one. If AMD thought you could overclock it, they would have enabled it and used it as a selling point.

They've been hiding it for a reason.
 
It's not really that big a deal; they are getting reasonable improvements from drivers anyway. At least nobody with a Fury is going to fry their card with overclocking lol.

Tbh, for the first time in a while I have just been enjoying playing games with what I have, without obsessing over the frame rate.
 
Those results were obvious on day one. If AMD thought you could overclock it, they would have enabled it and used it as a selling point.

They've been hiding it for a reason.

Sorry, hiding what? I do hope you don't mean allowing voltage control, because neither AMD nor Nvidia allow voltage control.

These things have always needed to be hacked to allow control over voltage. It just so happens that Fury/X uses a new design, and this needed more time and work to crack.
 

These things have always needed to be hacked to allow control over voltage. It just so happens that Fury/X uses a new design, and this needed more time and work to crack.

Not for much in the way of results on the Fury side, though. Would have been nice to see it get up to around 1300+ MHz to go toe to toe with clocked Tis, but unless you happen to be running liquid nitrogen that's not going to happen =/
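For scale, taking the Fury X's 1050 MHz reference clock as the baseline, the 1300 MHz target mentioned here works out to roughly a 24% overclock:

```python
# Overclock headroom needed to reach a 1300 MHz target
# from the Fury X's 1050 MHz reference clock.

stock_mhz, target_mhz = 1050, 1300
overclock_pct = (target_mhz / stock_mhz - 1) * 100
print(f"{target_mhz} MHz = +{overclock_pct:.0f}% over the {stock_mhz} MHz stock clock")
```

That kind of headroom on the core is well beyond what air or water cooling was delivering on Fiji, hence the liquid nitrogen remark.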
 
I'm just thankful that now we can get 980 Ti OC results vs Fury X OC results rather than having to constantly point out that the Fury X can't be properly overclocked yet.

I've been running dual cards for ages now (570 SLI, 670 SLI, 7950 CF, 290 CF, 980 SLI and Fury X CF) and there is definitely a part of me that would like to go back to one card. The problem for me is I don't want to feel like I'm going backwards either, so a single card needs to be an improvement over my existing mGPU setup or it feels like I'm spending money to get less.

True that, and no more excuses when comparing max OC vs max OC.
 
Closed the thread, thanks guys.

Feeling **** off at the moment with other issues, so I'm not in the right frame of mind to make a good decision.

Will chill with a brew and have a think.

If you're that annoyed mate, just use the one Fury X and the FreeSync screen; that combo alone is awesome. FreeSync is utterly brilliant at what it does anyway, and 50 fps will feel like double that with FreeSync on. Turn off those OSDs, just enjoy a smooth gaming experience, and don't get caught up in numbers. There's no need to get really upset and make snap decisions; you've got two top-end cards and a fantastic monitor. If you need funds, sell one of them and put the money towards a next-gen card next year.
 
Sorry, hiding what? I do hope you don't mean allowing voltage control, because neither AMD nor Nvidia allow voltage control.

These things have always needed to be hacked to allow control over voltage. It just so happens that Fury/X uses a new design, and this needed more time and work to crack.

Then maybe they should have listened to the "it's incredibly difficult given the HBM memory is tied to the GPU core voltage" part.

Seriously, how anyone can act in any way surprised is pretty laughable.

As for AMD not allowing voltage control? They allow power limit adjustment (well, normally, though it doesn't do much of anything on Fury), same as Nvidia allows the same sort of thing.

If Fury was an overclocker's dream then they would have clocked it far higher out of the box. They had a card to beat, remember? It's not like they were competing with themselves.

Sorry if what I say offends you, but that's how I feel. With no great expectations, I'm not at all disappointed.
 
It's not really that big a deal; they are getting reasonable improvements from drivers anyway. At least nobody with a Fury is going to fry their card with overclocking lol.

Kind of how I feel too. I'm not a big overclocker anyway, and my Fury currently runs all the new titles really well with only minor adjustments to the in-game settings.

As an example, turning God rays in Fallout 4 from high to low does not make much of a difference visually.
 
Sorry, hiding what? I do hope you don't mean allowing voltage control, because neither AMD nor Nvidia allow voltage control.

These things have always needed to be hacked to allow control over voltage. It just so happens that Fury/X uses a new design, and this needed more time and work to crack.

:confused: Precision X allows voltage control, and it's bundled with Nvidia EVGA cards. What hacking is involved there? :confused:
 
Doesn't Precision X, like 90% of all that stuff, use RivaTuner under the hood?

You can change power limits in CCC. You can also heavily overclock and overvolt an FX 8-core CPU in there too.

It was obvious from day one that this wouldn't overclock. Even more obvious when Joshy from OCUK got his hands on the tools and was showing pretty lame overclocks.

And if that wasn't enough Trixx was just as bad.

I've had my Fury X since launch day and I never had any expectations at all. They were all pretty much squashed at launch.

But as others have said, with better drivers they are still very, very good cards. Mine has powered through over 60 hours of Fallout 4 at 4K maxed, so it's serving its purpose perfectly :)
 
Aside from poor core overclocks I thought overclocking the HBM gave decent speed increases. I haven't seen anyone post any results in here yet.

Although HBM has more bandwidth, the clock speed is still only 500 MHz, which is slow for any games that don't actually need a lot of bandwidth.
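The trade-off is easy to see in the numbers: HBM1 runs the clock low but the bus extremely wide. A quick peak-bandwidth comparison between the Fury X's 4096-bit HBM (500 MHz, double data rate, so 1 Gbps effective per pin) and the 980 Ti's 384-bit GDDR5 at 7 Gbps:

```python
# Peak memory bandwidth = (bus width in bytes) * (effective rate per pin).
# Fury X: 4096-bit HBM1 at 500 MHz DDR -> 1 Gbps effective per pin.
# GTX 980 Ti: 384-bit GDDR5 at 7 Gbps effective per pin.

def bandwidth_gb_s(bus_width_bits, gbps_per_pin):
    """Peak bandwidth in GB/s."""
    return bus_width_bits / 8 * gbps_per_pin

fury_x = bandwidth_gb_s(4096, 1.0)     # 512 GB/s
gtx_980_ti = bandwidth_gb_s(384, 7.0)  # 336 GB/s
print(f"Fury X: {fury_x:.0f} GB/s vs 980 Ti: {gtx_980_ti:.0f} GB/s")
```

So the Fury X has far more raw bandwidth despite the 500 MHz clock; when a game is bound by something other than bandwidth, the wide-but-slow design doesn't help it.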
 
:confused: Did I get misquoted or misunderstood?

I was asking if Precision X was the same under the hood as things like MSI Afterburner and Trixx. I know the limits of the Fury X from all the moaning, snide remarks and threads. TBH I would be a little disappointed considering overclocking was mentioned as a selling point, but you can technically overclock them, and I am of the opinion that you should never buy a GPU if stock clocks won't cut it for you.
 
Aside from poor core overclocks I thought overclocking the HBM gave decent speed increases. I haven't seen anyone post any results in here yet.

Although HBM has more bandwidth, the clock speed is still only 500 MHz, which is slow for any games that don't actually need a lot of bandwidth.

That's probably why performance at lower resolutions is poor compared to the 980 Ti.
 
^
A good point. This could be AMD pulling another Bulldozer, releasing a product that caters to newer tech at the expense of older tech. Bulldozer got a lot of stick on release for poor clock-for-clock performance, and because very few games made use of all the cores, scores were as bad as or lower than i3s in many cases. Opinion on FX chips somewhat changed as more cores came into use.

I like to think this is the GPU equivalent, but with memory speed being sacrificed for bandwidth, catering for higher resolutions at the expense of lower ones.
 