
"Overclocks Dream" The Fury X

It's not aimed at higher resolutions; it's just a weaker GPU that happens to do well at 4K, probably thanks to the extra memory bandwidth HBM offers. If it were aimed at 4K it would have had more than 4GB for a start.

Slow manufacturing process! 8GB will come; AMD even list 4K on the box!! The HBM needs to be pushed harder for it to perform better. 1080p just isn't feeding the GPU enough for it to make any difference.

For 1080p gaming, stay with last gen!
 
Although you are quite right that this is not the right card for people playing at 1080p, you do have to wonder at the business decision to aim the card at such a small niche market.
I mean, with AMD having a small and slowly dwindling market share, they aim their new flagship product at a tiny proportion of that already small market, and then, to confound things, they miss out on capturing those using 4K TVs instead of monitors.
It is a good card for the target audience, but I'm not sure AMD really thought things through.

Maybe the 1080p performance is beyond AMD's control and, as has been suggested, it is a trade-off from using HBM. We won't know until there is another chip with HBM, which of course will probably be next year with HBM 2.

I feel they hope the 300 series GPUs will fill in that part of the market.
 
All testing so far shows it's pretty decent at 4K, and CrossFire scaling is also very, very good at 4K. The 4GB of memory has shown no negative impact so far (that I can see).

Still a bit of a "meh" for me at the moment due to no overclocking, far too many reports of whinegate, and no stock.
 

Indeed.

AMD really needed to release a good alternative to the 980Ti be it in performance or price or both.

At the moment, I don't know why you would buy (if you even could!) a Fury X when 980Ti's, even 3rd party cooled ones, cost the same money.
 
Can we please steer away from 1080p performance discussion :)

Orangey> Perhaps we are reading too much into it. Maybe it's a case of an excited engineer giving more information than the business expected?

I think he was nervous and struggling to ad-lib some verbal fluff. Even Lisa Su flubbed her presentation a bit. Bottom line: don't put engineers in front of a mic! This is why Jensen does his presentations in that slow, measured manner.

AFAIK "overclockers dream" isn't on any of the marketing materials.
 

While I do agree with you, the thing is that even when they put up the guy who is supposed to be able to talk about these things, Mr Huddy, he sometimes still gets it wrong. It is a bit of a no-win situation for AMD there. I do really like the fact that they are willing to talk and answer questions on the many various subjects that get brought up, but then they really should make sure they know what the answers should be beforehand. It is still better than Nvidia's "don't talk about anything" policy, though. I know Rev Lebaredian has been on a few video pieces, but it's normally in a reactive manner rather than a proactive one.
 
This is a draw-call test on the latest drivers on my system:
http://www.3dmark.com/aot/42647
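For anyone unfamiliar with the linked test: the 3DMark API Overhead test reports draw calls per second, and a handy way to read that number is as a per-frame draw-call budget at a target frame rate. A minimal sketch (the score below is a made-up placeholder, not the result in the link):

```python
# Convert a draw-calls-per-second score into a per-frame budget.
# The score value here is a hypothetical placeholder for illustration.

def draw_calls_per_frame(calls_per_second: float, target_fps: float) -> float:
    """How many draw calls per frame the driver could sustain at target_fps."""
    return calls_per_second / target_fps

score = 15_000_000  # placeholder score, calls/sec

print(round(draw_calls_per_frame(score, 60)))   # budget per frame at 60 fps
print(round(draw_calls_per_frame(score, 144)))  # budget per frame at 144 fps
```

A higher score simply means the engine can issue more individual draw calls per frame before the CPU/driver side becomes the bottleneck.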

People need to forget about the Fury X and how it runs at 1080p; this GPU wasn't aimed at being a 1080p card. It's designed for higher resolutions. If you buy this for 1080p, you're doing it wrong.

There is a good reason why it runs better as the resolution goes up: its high-bandwidth memory performs better the more demanding the resolution is.

It's a GPU architecture for the future; stop looking back at the old.
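To put rough numbers on the resolution point: the pixel count per frame, and with it the pressure on memory bandwidth, grows fast with resolution, which is why a bandwidth-rich card has more room to stretch its legs at 4K while 1080p rarely stresses it. A back-of-envelope sketch:

```python
# Pixels per frame at common resolutions, relative to 1080p.
# A GPU with spare memory bandwidth (e.g. HBM) gains headroom as this
# multiplier grows; at 1080p bandwidth is rarely the limiting factor.

RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

base = 1920 * 1080  # 1080p pixel count

for name, (w, h) in RESOLUTIONS.items():
    px = w * h
    print(f"{name}: {px:,} px/frame, {px / base:.2f}x 1080p")
```

So 1440p pushes roughly 1.78x the pixels of 1080p per frame, and 4K pushes 4x; actual bandwidth demand also depends on bit depth, AA, and overdraw, so treat this as a lower bound on the scaling.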

Now what I am really disappointed about is the fact that there just isn't any stock... What was the point in giving out early stock with a manufacturing issue and then getting that reviewed? AMD didn't do themselves any favours here.

Even if the GPU gets completely fixed, there will always be people put off by the early review results.
Tbh I would have liked it if they had just waited until it was ready.

The overclocking isn't just down to AMD either; we have always had to wait for other software to enable this.

Yet it comes with HDMI 1.4a. *FAIL*

Yes, it may well perform at its optimum at higher resolutions. Unfortunately, users of UHD displays are still very much in the minority. Even in the enthusiast market, UHD is still in its infancy.

I like the look of the Fury X. But it fails on a number of levels. Aiming almost totally at UHD whilst not providing an output dedicated to a large section of users that game at that res (4K TVs) is the most criminal.
 

Even people with 4K HDMI 2.0 TVs are a small minority. At the end of the day it's a PC gaming graphics card, aimed at 4K monitors.

I'm happy to see 1080p performance start dropping off; that resolution is long overdue for retirement! We all need to move on from it. 1440p is the new 1080p, and more and more decent gaming monitors now support it.
 

Except we are nowhere near having the GPU grunt yet to abandon 1080p. Look at The Witcher 3 maxed out with HairWorks: even a 980Ti can only just hold a constant 60fps+.

Games will only get more demanding. A single 980Ti or Fury X won't be able to max out all the new games coming out in the next few years at 60fps at 1080p, let alone 1440p.

You are also completely discounting those with 120Hz monitors... a minority, yes, but so are 4K monitor owners.


Am happy to see 1080p performance start dropping off

This is taking justification for AMD's poor 1080p performance to a whole new level :p
 
Because it's aimed at people running DisplayPort and gaming at 1440p and above.

For 1080p there are other GPUs out there in AMD's lineup that will do the job just fine and not cost as much. If you want better than 1080p gaming, then you want the Fury X.
People need to stop looking at this GPU for 1080p!

Then AMD have released a new pair of cards aimed at selling next to no actual product... Well done to them again, then!
 
..Except we nowhere near have the GPU grunt yet to abandon 1080p.

I thought the worst when changing to 1440p, but I was very surprised to note that the difference in frame rate isn't that big; I'm talking around 10-20fps depending on the game. Some games don't lose even a single frame; GTA5, for example, shows zero change going from 1080p to 1440p.
A single 290 is handling all the games fine so far. I also think FreeSync is helping a lot here, smoothing out the frame rate.

So I do disagree. If I can get by on a single 290, then a Fury X or 980Ti is going to be some upgrade for me. I guess this is a personal thing, though.

This is taking justification for AMD's poor 1080p performance to a whole new level :p

No, this was me wanting for a long time to get away from 1080p. I had to hold out quite a long time for 1440p 144Hz before I made the switch.

Let's make 1080p die! It's long overdue! And to anyone questioning whether the jump to 1440p is worth it: "JUST DO IT" :D
 
Even people with 4k HDMI 2.0 TVs is a small minority.

I'm all for the higher-resolution monitors. The lack of HDMI 2.0 is fine if you just want to connect the Fury to a DisplayPort monitor and only game at 1440p/4K.
I connect my 7950 to my AV amp, and the AV amp outputs to either my 1080p monitor or my 42" 1080p TV for a dual-purpose gaming/HTPC rig, whilst bitstreaming audio in games/movies at 4.1. I have been playing with VSR this weekend and the fps hit isn't as bad as I thought it would be.

If I'm going to invest in a TV, a new PC screen and an AV amp again, then they'll be 4K, but I need to do my research on whether the AV amps will adopt the latest DisplayPort/HDMI 2.0 and have enough throughput. Currently a single graphics card isn't quite powerful enough for 4K, so I'll hold out at 1080p until I feel confident about where I need to spend my money.

However, that still doesn't excuse the lack of HDMI 2.0 across the whole range of the 300-series rebrands and the Furys.
"You can buy a DisplayPort adapter, blah blah": no, that's not good enough. AMD always used to hold an advantage over Nvidia in screen support and UVD encoding, but they've been slightly behind since Maxwell
(the GTX 960 is the best HTPC card for H.265 support at the moment).

The question is: how much of the HDMI 2.0 spec does Nvidia actually support fully?
 
Nvidia fully support HDMI 2.0; however, HDMI 2.0a has now been released as well, and I've not seen anything definitive on whether this will be available via a firmware or driver update or not.
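For context on why HDMI 1.4 vs 2.0 matters for 4K TVs: the commonly quoted usable data rates are about 8.16 Gbps for HDMI 1.4 (10.2 Gbps link after 8b/10b encoding) and about 14.4 Gbps for HDMI 2.0 (of an 18 Gbps link). A rough check of whether 4K fits, counting pixel payload only:

```python
# Why HDMI 1.4 can't do 4K at 60 Hz: a rough data-rate check.
# Real links also carry blanking intervals, so these raw-pixel numbers
# understate the true requirement; they're enough to show the gap.

def payload_gbps(w: int, h: int, hz: int, bits_per_pixel: int = 24) -> float:
    """Raw pixel data rate in Gbps for w*h at hz, 8-bit RGB by default."""
    return w * h * hz * bits_per_pixel / 1e9

HDMI_1_4_DATA = 8.16   # Gbps usable after 8b/10b encoding
HDMI_2_0_DATA = 14.4   # Gbps usable

need_4k60 = payload_gbps(3840, 2160, 60)   # ~11.9 Gbps in pixels alone
need_4k30 = payload_gbps(3840, 2160, 30)   # ~6.0 Gbps

print(f"4K60 needs ~{need_4k60:.1f} Gbps: "
      f"fits 1.4? {need_4k60 <= HDMI_1_4_DATA}, fits 2.0? {need_4k60 <= HDMI_2_0_DATA}")
print(f"4K30 needs ~{need_4k30:.1f} Gbps: fits 1.4? {need_4k30 <= HDMI_1_4_DATA}")
```

This is why HDMI 1.4 tops out at 4K30 (or 4K60 only with chroma subsampling), and why the Fury X's 1.4a port locks 4K-TV gamers out of 60Hz.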
 
I thought the worst when changing to 1440p, but I was very surprised to note that the difference in frame rate isn't that much different.
I was the same, I expected most games to struggle on higher settings when I upgraded to 1440p from 1080p on my single 290X. However I was pleasantly surprised that I didn't notice much difference at all!
 

To be fair, in your case you would have had a CPU bottleneck in some games, so your card had more to give.

And it always makes me laugh how people say something is overkill for 1080p.

A 5870 wouldn't be coping too well at all any more, and that was "overkill"...

I personally don't think I'd be happy with a single 290X at 2560x1440. I want more out of my GPU performance at 2560x1080 as it is.
 