The Fury(X) Fiji Owners Thread

It started because Lokken said that a FuryX was faster than a 980ti at 4K and that they scaled better, meaning that FuryX Xfire was consistently faster than 980ti SLI... he didn't say "only using reference cards" or "only using cherry-picked AMD-supporting games".

OC FuryX cards are not going to be available, so comparing a non-existent product to ones that actually exist is going to be difficult. That review was "apples for apples" because it compared two real products that you can buy. The 980ti's in that review were not manually overclocked, so saying "it's not fair because you can overclock the FuryX" is disingenuous, because you can also overclock the 980ti's that were used.

Yes, I would love to find out how far the FuryX will overclock with voltage, which is why I've ordered one. I would also love to know how a 1535MHz 980ti will compare to whatever the FuryX will manually overclock to. However, possible future outcomes aren't directly relevant to an absolute statement like "FuryX is faster than 980ti at 4K" or "980ti SLI cannot beat FuryX crossfire", both of which are objectively not true in absolute terms.

Actually, I think I'm the one who said the FuryX was better at 4K and had better scaling. The second statement is true in general; I'll concede it is game dependent, but the overall trend supports it. The first statement is true when comparing stock for stock, and could be true or false comparing max OC to max OC, but that comparison isn't available to make and I'm not one to simply guess. I don't see OC vs non-OC as a good means of comparison, but like I've said, you can make an argument for it.

So yeah, neither statement is true in the absolute, as there are always outliers - yet as general statements they are both undeniably true.
 
Don't quote me then,

I didn't quote you; I didn't quote anyone.

Where in this post did I quote you or anyone?
http://forums.overclockers.co.uk/showpost.php?p=28428370&postcount=5947

And we can all read the current whining about cards; we don't need to keep being reminded about old ones. As long as the issue persists there will be new posts by users who have just bought one.
And the HDMI spec is on the box; I don't need you to tell me what's what about it, and I'm done with the worn-out topic.
 
Actually, I think I'm the one who said the FuryX was better at 4K and had better scaling. The second statement is true in general; I'll concede it is game dependent, but the overall trend supports it. The first statement is true when comparing stock for stock, and could be true or false comparing max OC to max OC, but that comparison isn't available to make and I'm not one to simply guess. I don't see OC vs non-OC as a good means of comparison, but like I've said, you can make an argument for it.

So yeah, neither statement is true in the absolute, as there are always outliers - yet as general statements they are both undeniably true.

The review I posted includes stock settings as well as overclocked settings for BOTH the 980ti AND the FuryX.
If the review gets re-done when voltage control is available then it might change things slightly, but as of right now it is showing apples for apples, both at stock settings and overclocked.

It isn't OC vs non-OC; it's stock vs stock and OC vs OC.
The 980ti doesn't really have voltage control either (I get my best OC at stock voltage on mine, and I see that quite a lot).

It's not a case of outliers: the factory cards are real cards that run at ~1400MHz out of the box, and are around the same price as current in-stock pricing for the FuryX.
 
Really does amaze me how everyone and their granny suddenly had this need for HDMI 2.0 when Fury didn't support it. Funny how that works. Must have been a mass run of sales on HDMI 2.0 monitors or something I missed...
 
It was already stated at some point by AMD that they couldn't add HDMI 2.0 because they would have to re-design the board. And considering they were helping create an HDMI 1.4 - DP 1.2 connector, they didn't see a point. At least, that is how I remember it.
 
Really does amaze me how everyone and their granny suddenly had this need for HDMI 2.0 when Fury didn't support it. Funny how that works. Must have been a mass run of sales on HDMI 2.0 monitors or something I missed...

Same thing every time AMD release a new card. The trolls pore over the specs trying to find something they can attack (no matter how insignificant) and then hammer it endlessly.

They go on the nvidia forums when they want to complain about 3.5GB, gimped chroma, HDCP, drivers etc. etc., so it looks like NV is squeaky clean here.

Not to mention the rich ones who buy the enemy's card just so they have a defence against being called a troll.
 
Same thing every time AMD release a new card. The trolls pore over the specs trying to find something they can attack (no matter how insignificant) and then hammer it endlessly.

They go on the nvidia forums when they want to complain about 3.5GB, gimped chroma, HDCP, drivers etc. etc., so it looks like NV is squeaky clean here.

Not to mention the rich ones who buy the enemy's card just so they have a defence against being called a troll.

You hit the nail on the head dude.

I wish it would stop it's toxic to a forum.
 
Erm, firefox did buy a FuryX; he also had a 295X2, he does have an HDMI 2.0 TV, and he kept asking about how to get the 295X2 working... how is he trolling if he's a little bit upset to hear the same thing being bandied about now as when he bought the 295X2?
 
Really does amaze me how everyone and their granny suddenly had this need for HDMI 2.0 when Fury didn't support it. Funny how that works. Must have been a mass run of sales on HDMI 2.0 monitors or something I missed...

As mentioned before - 4K TVs used with a PC have roughly the same uptake as 4K monitors, as a good few people prefer a larger physical screen size to make the most of that resolution, and as things stand, to get 60Hz at 4K on those TVs, 99% of them can only achieve it over HDMI 2.0 - the selection with DisplayPort is extremely limited, though starting to grow a little now.
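
To put rough numbers on that (a back-of-the-envelope sketch, ignoring blanking intervals and assuming standard 8-bit 4:4:4 RGB output):

3840 x 2160 pixels x 60 Hz x 24 bit ≈ 11.9 Gbit/s of pixel data

HDMI 1.4 tops out at a 340MHz TMDS clock, which after 8b/10b encoding leaves roughly 3 x 340MHz x 8 bit ≈ 8.16 Gbit/s of usable bandwidth - not enough, which is why 1.4 is limited to 30Hz at 4K. HDMI 2.0 raises the clock to 600MHz for roughly 14.4 Gbit/s, so 4K at 60Hz fits.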

So yeah, basically you have missed something - if you look in the monitor section, I (and I think andy beat me to a few) made about a dozen replies to threads on the subject before GPUs supporting HDMI 2.0 came out at all - there were quite a few disappointed people before even the Fury.

For 4K, people would be looking at top-of-the-line GPUs, so HDMI 2.0 support would be something they'd expect to find.
 
As mentioned before - 4K TVs used with a PC have roughly the same uptake as 4K monitors, as a good few people prefer a larger physical screen size to make the most of that resolution, and as things stand, to get 60Hz at 4K on those TVs, 99% of them can only achieve it over HDMI 2.0 - the selection with DisplayPort is extremely limited, though starting to grow a little now.

For 4K, people would be looking at top-of-the-line GPUs, so HDMI 2.0 support would be something they'd expect to find.

Then they need to buy the card that supports it natively. AMD are generally slow as hell to get things out to market - anyone remember the MST hubs that were bandied around with the 69 series launch and, years later, still weren't available?

As this is an enthusiast forum, I'd guess that MOST would be using a monitor regardless.
 
Same thing every time AMD release a new card. The trolls pore over the specs trying to find something they can attack (no matter how insignificant) and then hammer it endlessly.

They go on the nvidia forums when they want to complain about 3.5GB, gimped chroma, HDCP, drivers etc. etc., so it looks like NV is squeaky clean here.

Not to mention the rich ones who buy the enemy's card just so they have a defence against being called a troll.

Is this similar to you having an Nvidia card? ;)
 