
**AMD Fiji Thread**

I am reposting this as it seems to have been missed the first time round.

Yes, I know the first chart was discussed and that Whycry made a note of the error in the videocardz article, but the same error is on the main benchmark chart, so something isn't quite right, although I don't know what it is.


Unfortunately there are errors in those charts.

Whycry has said there are errors, and he mentions it in the article.



[Attached image: furybench2.jpg]


I also noticed another error in the main bench chart. I've circled both in red.
Obviously something is slightly amiss, as the 980 Ti clearly does not support Mantle, and Whycry reckons it isn't even supposed to say 980 Ti in the right-hand column of the overclock performance chart. Quite what is up remains to be seen.

[Attached image: furybench1.jpg]

From what I remember, Mantle didn't always give a single AMD card a boost at 4K, and sometimes it gave slightly fewer fps. Maybe in Sniper Elite Mantle was showing fps gains, so they used that API on the Fury, unlike in other games where it showed no benefit. Some review sites use Mantle because all the game settings are the same bar the API, so it's a valid comparison.
 
I have to say I find the price comparison between the Fury and 980 Ti a little silly; they will be very, very close to each other. I think any choice will come down to the actual card itself and the buyer's perception of brand value (drivers, support, proprietary tech).

I would say a nice pros and cons list would be great; hell, I'll probably do one myself. But posting it on here would result in the usual nonsense, and I have lost enough posts over the last few days due to tidying up :)

If a hybrid 980 Ti were available right now, I'd be tempted. Given that they look like they won't be available until next week, for probably the same price as the Fury X, I'll have a choice to make. I think I'll probably go with a Fury X; I'm excited about the new tech, but I must admit I also see it as a bit of a gamble due to my perception of AMD.
 

I think that with the new drivers AMD have introduced, including the noticeable gains for the 390 and 390X in the 15.20 drivers in games with heavy tessellation, it would be very interesting to see what those drivers do for the Fury X's performance versus the 980 Ti, especially in games where NVIDIA perform strongly.
 
Max settings being max settings; sorry if I didn't make that clear. As I have a 60 Hz monitor I cap the frames at 60 fps, since tearing is bad above that, and 4K monitors are 60 Hz so I would do the same again, assuming it would tear above that. This is confusing, I know, but if I can get 60 fps and roughly hold that without too big a drop I will be happy.
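For anyone curious, that kind of frame cap is essentially just a timing loop. A minimal sketch of the idea in Python (purely illustrative, not how any particular game does it; in practice you'd use the engine's own limiter or driver-level vsync/frame limiting):

```python
import time

TARGET_FPS = 60
FRAME_TIME = 1.0 / TARGET_FPS  # roughly 16.7 ms per frame

def render_frame():
    pass  # stand-in for the actual per-frame work

next_deadline = time.perf_counter()
for _ in range(600):  # ~10 seconds at 60 fps
    render_frame()
    next_deadline += FRAME_TIME
    # Sleep off whatever is left of the 16.7 ms budget so frames are never
    # presented faster than a 60 Hz monitor can display them.
    remaining = next_deadline - time.perf_counter()
    if remaining > 0:
        time.sleep(remaining)
    else:
        # Missed the budget: resync rather than trying to "catch up".
        next_deadline = time.perf_counter()
```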

Fury seems made for a min-fps increase, but running every game at 4K maxed out is like adding 8x MSAA etc. and watching the fps go down the drain, and why even play then?
I think people expect a bit too much from cards at 4K and similar resolutions.
Those I've seen playing at 4K are all going with dual or more cards.

I don't, as I don't want the multi-card issues SLI and CrossFire have.
With Mantle I could run BF4 at 115 fps at 5040x1050.
I don't kid myself that having one card means I can suddenly go maxed out, and I only use 5 million pixels where 4K uses 8 million.
That's not playable for me even with a Fury X maxed out.
I adjust settings for the fps I want.
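For reference, the pixel-count comparison above works out roughly like this (back-of-the-envelope numbers only):

```python
# Pixels per frame at each resolution
eyefinity = 5040 * 1050   # 5,292,000 px, ~5.3 million
uhd_4k    = 3840 * 2160   # 8,294,400 px, ~8.3 million
print(f"{uhd_4k / eyefinity:.2f}x more pixels to shade at 4K")  # ~1.57x
```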
 
I'm concerned the 4GB limit will hurt more than people in here think. Look at these screenshots; they're from current games. They all use over 4GB of memory... at 1440p(!):
http://imgur.com/a/RRjWd#dOBIkQL

And that's a problem that doesn't get solved by adding one, two or three Fury Xs in CrossFire...

:(

That's because when you have more VRAM, games seem to use more of it but don't actually need it. I think this has already been tested.
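If anyone wants to sanity-check this on their own system, the figure the driver reports is easy to log, e.g. with NVIDIA's NVML bindings (pynvml) on a GeForce card. Bear in mind it only shows how much VRAM is currently allocated, not how much the game would actually miss if it were evicted, which is exactly the distinction being made above. A rough sketch, assuming an NVIDIA GPU and the nvidia-ml-py package:

```python
import pynvml  # pip install nvidia-ml-py

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU in the system
info = pynvml.nvmlDeviceGetMemoryInfo(handle)

# NVML reports VRAM that is currently allocated on the card as a whole,
# not how much of it is actually needed for the frames being rendered.
print(f"used:  {info.used  / 1024**2:.0f} MiB")
print(f"total: {info.total / 1024**2:.0f} MiB")

pynvml.nvmlShutdown()
```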
 

That means nothing, even if AMD were using GDDR5.

As far as AMD are concerned, it depends on how the HBM memory is accessed and how efficiently it is used. They explained it in their presentation.
 
A very interesting and detailed article on how hard it is to manufacture something like the Fiji GPU, the interposer and the HBM memory: http://www.hardwareluxx.de/index.php/artikel/hardware/grafikkarten/35776-amd-fiji-wie-der-interposer-hbm-und-gpu-miteinander-verbindet.html

Yup, I think I said in the other thread that people are underestimating what this means for the industry at large, let alone AMD and graphics cards.

It's not just TSMC, AMD and Hynix involved here. There is now an established high-volume production chain where chips get made in various factories, stuck together (that alone is exceptionally difficult and nothing like attaching chips to a PCB; it's orders of magnitude more complex), then shipped to other facilities to be mounted on interposers. The individual dies in the HBM stack are produced and then bonded together, the interposer and the GPU are made elsewhere and then brought together to be joined, and testing is done along the way to ensure the connections are good.

It's a big step up from a single GPU being attached easily to a package and then easily to a PCB. Now that this has been done, and multiple companies have proved they can do it, it's ready for more high-volume products.

This will absolutely lead to HBM being stuck onto APUs, CPUs, networking devices, consoles, everywhere really, as it's gone from theoretical to an available, working, proven process. After the 'simple' chips which just add some HBM next to the die (or other simpler chips), we will move on to APUs where the CPU, GPU, memory and maybe some DSPs are all made independently and stuck together. That means much lower power usage for the same performance, much faster chips for the same power usage, or simply epically fast chips in the future.

Next-gen consoles are going to be some really interesting chips. No more screwing around with eDRAM/ESRAM as MS have been doing; both companies will probably have 16GB of memory with 1TB/s of bandwidth or more.
 
That's because when you have more VRAM games seem to use more of it but don't actually need it. I think it's already been tested.

Yes

I really hope you guys are right. We'll know more next week I suppose...

It's hard to judge old tech against something different and new that hasn't been tested out in the wild yet. Game developers haven't had any reason to optimise their VRAM usage yet, as adding more GB to the cards was the easier fix.
DX12 will help games get optimised better along the way, not just for 4GB but later for HBM2, which won't have such a limitation on size. It won't mean that mid-range cards suddenly receive 12GB and developers load up more, as that adds cost.
 

Used memory and needed memory are different. To take a simple example, a main menu might require certain resources that you don't need once you're in-game. If there is tons of free memory then garbage collection may not clean that up, but if space were tight then it could do so. The same is true of many in-game textures etc. too.

I'm not saying 4GB won't be a problem; I've no way of knowing yet, as it's a new card (all cards handle resources differently, so it's not even directly comparable with another 4GB card - we need reviews and users to get their hands dirty).
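To make the garbage-collection point concrete, here is a toy sketch of the kind of policy a resource manager might use: keep everything cached while there is headroom, and only evict least-recently-used assets once the budget is actually under pressure. The class, names and sizes below are made up purely for illustration; no engine is claimed to work exactly this way.

```python
from collections import OrderedDict

class TextureCache:
    """Toy resource cache: nothing is evicted until the budget is under pressure."""

    def __init__(self, budget_bytes):
        self.budget = budget_bytes
        self.used = 0
        self.entries = OrderedDict()  # name -> size, kept in LRU order

    def load(self, name, size):
        if name in self.entries:
            self.entries.move_to_end(name)   # mark as recently used
            return
        # Only now, when the new asset wouldn't fit, do we throw things out.
        while self.used + size > self.budget and self.entries:
            _, evicted_size = self.entries.popitem(last=False)  # drop least-recently-used
            self.used -= evicted_size
        self.entries[name] = size
        self.used += size

# With a roomy budget the menu textures just sit there "using" memory;
# with a tight budget they get evicted the moment in-game assets need room.
cache = TextureCache(budget_bytes=4 * 1024**3)          # pretend 4GB card
cache.load("menu_background", 512 * 1024**2)
cache.load("level1_textures", 3 * 1024**3)
cache.load("level2_textures", 2 * 1024**3)               # forces eviction of older entries
print(cache.used / 1024**3, "GiB resident")
```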
 

Ha, that's the one I posted earlier.

The tone of his voice from the off and his opening line, 'Speculation and guesswork', are all people need to know, lol.

An awful video, totally unprofessional from KitGuru imho. The bit where he gets upset at AMD showing their unannounced products to customers, rather than allowing them to review it before AMD had even been able to announce it at E3. Cringeworthy stuff. Elsewhere today, anybody posting this video or defending AMD's stance is having their posts deleted. So you have these guys whining and slating AMD publicly via social media, followed by doing the same thing over not receiving samples, after AMD actually had a chance to announce their product. You can see the websites are trying to steer the narrative, totally ignorant that they were in the wrong.

AMD are being the bigger man.

 

On the 4GB screenshots: there is no need to remove data from memory until you need that memory. Like data cached in RAM or in the page file, just because it's there doesn't mean it's required or will ever be used.

None of the games you listed show anything beyond a 1-2% performance difference at 4K/ultra settings between 4GB and 8GB cards.

You can't look at the amount of memory being used and assume it's all required. Computers have for decades kept everything in memory as a 'just in case' scenario, up to the memory limit (wherever that memory is). 4/6/8/12GB cards will simply evict data that isn't needed when they approach the limit, not the moment it stops being required.

There are absolutely no games that need 4GB at 1440p. There are only two games I can see that suggest they need more than 4GB at 4K/ultra settings, and neither of them shows >4GB cards being faster than 4GB cards.
 

I watched him say he was about to waffle on with no actual facts and closed the video.

I'll take people's word for it about the rest.
 

Didn't NVIDIA send the 980 Ti out to reviewers before they announced it? The reviews went live half a day before the product was announced at the event as well.

I think that's why some reviewers are taken aback by it.

Then again, with the Titan X they revealed it first and then sent out review samples, because it was the top halo product, similar to how the Fury X is for AMD.

Would I want reviews already? Sure. I can understand why they haven't sent them out yet, though: the launch is next week, and they probably started sending cards out after E3 so reviewers had time before launch day.
 


Wow, I've always liked KitGuru. I've never really watched many of their videos, but the guy is quite annoying in this one. Talk about an AMD slate-fest; he's proper got issues with AMD, eh. No surprise at the news that they've had their Fury review card cancelled by AMD! I actually stopped watching halfway through, as his whingey voice is annoying as hell!
 