
The Fury(X) Fiji Owners Thread

Still, they sell every GPU they ship. And once stock stabilises, prices should normalise as well, so more people will see it as better value. Even now, with somewhat inflated prices, they are still selling out.

Took them 12 days to sell 8 XFX Fury X's, dude. Everyone's **** scared of getting a whiner!!!

I guess this is what they mean by Nvidia owners jumping into AMD threads to spread **** and take the ****. Then they deny that they do it, and it all starts over again.

JediFragger, why don't you go read up on some Asynchronous Shading and troll in an Nvidia thread? Tiz getting tedious now.... :rolleyes:

+1

Cue reply crying about the fanboy accusation, "I buy both", listing gfx card history, etc etc... yawn.
 
So in the past couple of days I was toying around with LuxMark 3.1.
And since I had some time and was brainfarting, I thought, hey, let's push one of the cards to the limit.
Final GPU clock (through CCC): 1197MHz, obviously with stock volts and +50% power limit (a +35% power limit crashed the drivers during one of the runs). End result: a couple of the best results on the LuxMark leaderboards, coming quite close to a superclocked 980 Ti (run in Linux). It has been reported that Nvidia's newly suggested compiler options and optimisations cause some artifacting in real-world rendering, but they pass the three benchmark scenes, so no problem there.
I was experiencing quite a bit of throttling; I suspect the VRM, since the GPU itself was under 60C the whole time. I remember someone theorising that the VRM cooling might not be perfect (bad contact or something). I kinda agree with that, but time will tell, because clock scaling was not the best. Also, clocking the Fury X is now a bit easier; I mean it clocks a bit higher. During the first weeks of testing I could stably run only 1113MHz; now I see myself clocking the GPU into the 1129-1130MHz range. Not a lot, but still. Maybe it is the colder weather, drivers, or something else.
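For anyone curious how much headroom those numbers actually represent: here is a minimal Python sketch that converts the clocks quoted in the post above into percentages over stock, assuming the Fury X's reference core clock of 1050MHz (the stock spec; the 1050 figure is my assumption, not something stated in the post).

```python
# Sanity-check the overclock figures quoted above.
# Assumes the Fury X reference (stock) core clock of 1050 MHz.

STOCK_MHZ = 1050

def oc_percent(clock_mhz: float, stock_mhz: float = STOCK_MHZ) -> float:
    """Return the overclock over stock as a percentage."""
    return (clock_mhz / stock_mhz - 1) * 100

# LuxMark run at +50% power limit:
print(f"1197 MHz = +{oc_percent(1197):.1f}% over stock")  # +14.0%
# Early stable clock vs. the later 1129-1130 MHz range:
print(f"1113 MHz = +{oc_percent(1113):.1f}% over stock")  # +6.0%
print(f"1130 MHz = +{oc_percent(1130):.1f}% over stock")  # +7.6%
```

So the gain from "colder weather, drivers, or something else" is roughly 1.5 percentage points of extra stable headroom, while the 1197MHz benchmark run sits about 14% over stock.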

But I guess, TL;DR: let's get back to the usual 10-year-old stand-up comedy central ;)
 
Lost me? And I was only trying to be helpful by the way.

No, to LuxMark being a good OpenCL benchmark.
And kinda no to Valley/Heaven/Firestrike being good benchmarks. Though you probably meant in regards to GPU clock stability; then yes, Firestrike would open a black hole in my system if I ran it with the Fury X at 1197MHz ;)
Regarding LuxMark not being a good OpenCL benchmark: when Nvidia's suggested optimisations start rendering artifacts, and the AMD Fury X can be clocked very high during its run, it shows that it is far from the best OpenCL benchmark around (though the support and enthusiasm from the devs over there is amazing) ;)
But at least we are not using LuxMark 2.0 like some review sites, or ratGPU :D
 
Same for you, man. If you look at all of your posts in this thread, you will see that 90% of them are: "980 Ti sold 15k while Fury sold zero", "980 Ti is a better buy than Fury", "whining Fury"...

All true facts I'm afraid ;)

(except I never said Zero, just quoting facts given by OcUK)

Gaming sites confirm that the Ti IS the better buy, and it has more RAM. It's a no-brainer in my eyes, but thankfully people get a choice!!
 