** AMD BACK IN THE GAME: PRICE DROP EXCLUSIVE TO OcUK!! **

There are a lot of games that require 6GB+ of VRAM on the max texture setting, even at 1080p.
Rise of the Tomb Raider, Doom 2016, Lords of the Fallen and Shadow of Mordor, to name a few, and some games won't even let you enable the max texture setting unless you actually have 6GB of VRAM.
Benchmarks don't tell the whole story; it's when you start playing the game that there are moments where you get terrible stuttering, though it's not all the time.
Now maybe some people can live with stuff like that, but I can't.
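
To put some rough numbers behind that, here's a minimal sketch (Python, with made-up frame times rather than real measurements) of how an average-FPS benchmark figure can look fine while the minimums you feel as stutter are far worse:

```python
# Minimal sketch: average FPS vs 1% low FPS from a list of frame times (ms).
# The frame times below are made-up illustrative numbers, not measurements.
frame_times_ms = [14.0] * 990 + [120.0] * 10  # mostly ~71fps, plus a few long stalls

def avg_fps(times_ms):
    return 1000.0 / (sum(times_ms) / len(times_ms))

def one_percent_low_fps(times_ms):
    worst = sorted(times_ms)[-max(1, len(times_ms) // 100):]  # slowest 1% of frames
    return 1000.0 / (sum(worst) / len(worst))

print(f"average: {avg_fps(frame_times_ms):.0f} fps")              # ~66 fps, looks smooth
print(f"1% low:  {one_percent_low_fps(frame_times_ms):.0f} fps")  # ~8 fps, felt as stutter
```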

All I can hear here is overkill this, overkill that, to which I say ********.
For me, I need a large performance buffer so that even in the few most demanding moments in games everything stays above 60fps, and that's what I call smooth gameplay.

Then the 1060 is not the card for you either.
 
I am really tempted by the Fury at £299. I've got a Nitro 480 on preorder to replace my ageing 7970 1GHz 3GB card.

I'm thinking of sticking to my original choice and keeping the £50 to spend on games; it's just that the delay sucks a bit. I've got an Oculus, and my poor 7970 can only just about manage VR. I do wish there were more VR benchmarks out there; I've been surprised by how few there are.
 
I was thinking of the Fury Nano myself, potentially as a project to make my first watercooled build, but it's extra money and feels like it would be wasted when I only game at 1080p.
 
What games?
There are plenty of games, unfortunately. The way RAM is used up seems to be changing: ROTTR did it with texture file sizes and made the Very High setting unplayable even though it could still hit 60fps plus, Forza will not even allow the Very High texture setting to be chosen unless your GPU has more than 4GB of onboard memory, and another is XCOM 2. I'm talking 1080p here, not 4K.

Personally, the way ROTTR did it was, IMO, done on purpose through Nvidia's involvement. Doing it allowed Nvidia to force a settings-level difference between their top-tier 980 Ti and AMD's top-tier Fury cards, making the Fury cards appear inferior. It was a smart move from a business point of view, so you can't knock them for it. But it has set the bar higher than it should really be.

For example, putting the textures in ROTTR to Very High with my Fury turns the game into a 70fps stutter fest, while turning the setting from Very High to High gives me a 70fps smooth experience with next to zero visual quality difference. That one setting does not change how demanding the game is, or how it looks; it only changes how much video memory the textures need. Same with Forza and others.
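
To illustrate why one texture-quality step can swallow so much VRAM without making the game any more demanding to render, here's a rough back-of-an-envelope sketch (Python; the texture counts, sizes and compression figure are my own assumptions, not ROTTR's real asset data):

```python
# Rough sketch: estimated VRAM for a pool of textures at two quality levels.
# All numbers are illustrative assumptions, not taken from any particular game.
def texture_mb(width, height, bytes_per_texel=1.0, mipmaps=True):
    base = width * height * bytes_per_texel        # ~1 byte/texel for block-compressed formats
    total = base * 4 / 3 if mipmaps else base      # a full mip chain adds roughly one third
    return total / (1024 * 1024)

num_textures = 500  # assumed number of unique textures resident at once

high      = num_textures * texture_mb(2048, 2048)  # "High": 2K textures
very_high = num_textures * texture_mb(4096, 4096)  # "Very High": 4K textures

print(f"High:      ~{high / 1024:.1f} GB")       # ~2.6 GB, fits in a 4GB card
print(f"Very High: ~{very_high / 1024:.1f} GB")  # ~10.4 GB, spills over and stutters
```

Doubling the texture edge length quadruples the footprint, which is why a single preset step can blow past 4GB while leaving the shader workload essentially unchanged.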
 
People do like to say it's not enough, but they always go quiet when you ask them for actual examples of games which are causing the Fury range to buckle. Rise of the Tomb Raider is a game which has been held up as an example of one that uses an obscene amount of VRAM. It even comes with its own warning message when you enable Very High textures, saying you absolutely 100% need a >4GB card for them. So it's perfect for showing off the Fury X's crippling minimums as it stutters its way through the game, right?
****! Well, screw all this so-called "evidence" anyway. I know for a fact that 4GB of HBM isn't enough and the Fury X is a stutter fest. Take my word for it, guys.

I own a Fury and a 1080p screen, and yes, the RAM has proven to be an issue with several titles.

On release, the Very High texture setting made ROTTR unplayable. I haven't tested it for months, but what matters is how it behaves when you play it.

It appears to be a lot better now DX12 is available, but I'd have to redo the bench to see how it does. Actually, I'll go and see now and edit in a minute. (I hope it's fixed now for those who've yet to play the game, but we'll see.)

Forza won't even let me turn the Very High texture setting on because my Fury only has 4GB.

EDIT: https://www.dropbox.com/s/qkkusvtg50od0gq/ROTTR 2016-07-19 10-16-29-69.bmp?dl=0

As you can see, single-figure minimums. That was DX11; DX12 gave a minimum of 0.30 in the Geothermal Valley, but I could not get a screenshot.
This is today. When the game released and many of us actually played the campaign, we were unable to use the Very High texture setting with the Fury cards because of how it juddered.
Today I lowered the AA to FXAA to see if it made a difference, but no. As you can see, perfectly playable averages but dire minimums.
So you can claim people don't back up what they are saying all you want; I own a Fury and it's a great card, but I'm not going to sugarcoat the issues.
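
Purely to illustrate the kind of gate Forza applies there, here's a hypothetical sketch (Python, not the game's actual code; the preset list and check are mine, based only on the "more than 4GB" requirement mentioned above):

```python
# Hypothetical sketch of a VRAM-gated texture preset, as described above.
VERY_HIGH_NEEDS_MORE_THAN_GB = 4  # per the posts: needs more than 4GB of onboard memory

def selectable_texture_presets(detected_vram_gb):
    presets = ["low", "medium", "high"]
    if detected_vram_gb > VERY_HIGH_NEEDS_MORE_THAN_GB:
        presets.append("very high")
    return presets

print(selectable_texture_presets(4))  # Fury (4GB): 'very high' stays locked out
print(selectable_texture_presets(8))  # 8GB card: every preset selectable
```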
 
I'm on a 3440x1440 ultrawide and am upgrading from an 8GB 390 to a Fury Nitro. I'm doing this in complete confidence that the 4GB of HBM won't be a limiting issue. I've taken the time to research and read up, and it's a complete non-issue and likely to remain so for quite some time.

Even if one or two games pop up that do have a problem, just turn texture quality down one notch and things will be fine; you likely won't even notice a difference visually.

So much misinformation. 4GB of HBM should not be directly compared to non-HBM memory; it performs so much better and can swap data in and out at super-fast speeds thanks to the huge bandwidth, meaning the memory just doesn't fill up as much as normal memory does.

Even minimal research around the web reveals this, so I don't really understand why some folk are stubbornly saying otherwise.
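
For reference, here's the raw arithmetic behind that bandwidth argument, as a rough sketch (Python; the rates are the commonly quoted peak figures for Fury-class HBM, 390-class GDDR5 and PCIe 3.0 x16, and anything swapped in from system RAM still has to cross the much slower PCIe link):

```python
# Sketch: time to move 1 GB of texture data at various *peak* rates.
# Approximate published peak figures; real-world transfers sit well below peak.
rates_gb_per_s = {
    "HBM on-card (Fury X class)": 512,
    "GDDR5 on-card (R9 390 class)": 384,
    "PCIe 3.0 x16 (card <-> system RAM)": 16,  # swapping from system RAM crosses this
}

payload_gb = 1.0
for name, rate in rates_gb_per_s.items():
    print(f"{name:36s} ~{payload_gb / rate * 1000:5.1f} ms per GB")
```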
 
I'm on a 3440x1440 ultrawide and am upgrading from an 8GB 390 to a Fury Nitro. I'm doing this in complete confidence that the 4GB of HBM won't be a limiting issue. I've taken the time to research and read up, and it's a complete non-issue and likely to remain so for quite some time.

Even if one or two games pop up that do have a problem, just turn texture quality down one notch and things will be fine; you likely won't even notice a difference visually.

So much misinformation. 4GB of HBM should not be directly compared to non-HBM memory; it performs so much better and can swap data in and out at super-fast speeds thanks to the huge bandwidth, meaning the memory just doesn't fill up as much as normal memory does.

Even minimal research around the web reveals this, so I don't really understand why some folk are stubbornly saying otherwise.

You won't have a problem as long as you don't have a problem with lowering some settings. I wouldn't have done that upgrade, though; I'd have waited for Vega, as that's not much more than a sidegrade. When I got my Fury I did a series of benches comparing it with the 290X it was replacing, and there were older games where the 290X matched it (I posted a review of the tests in the Fury pro owners thread last August).
 
You won't have a problem as long as you don't have a problem with lowering some settings. I wouldn't have done that upgrade, though; I'd have waited for Vega, as that's not much more than a sidegrade. When I got my Fury I did a series of benches comparing it with the 290X it was replacing, and there were older games where the 290X matched it (I posted a review of the tests in the Fury pro owners thread last August).

I'll probably have to lower some settings, as I want to be as close to 75Hz at 3440x1440 as I can anyway.

Everything I've looked at shows the Fury about 20% faster than a 390 at 3440 res in most games, so I'll be genuinely disappointed if that turns out not to be the case (in general, it doesn't have to be in every game) :/
 
Do you genuinely believe that the 1060 will outperform the Fury because it has 2 extra GB of VRAM?

Not a chance. There will be occasions where settings may need to be different due to memory hogging by the texture files, as I'm expecting Nvidia to push more games to do that when they're involved with development, both to persuade Maxwell owners to upgrade and to make their Pascal cards look better than AMD's current flagships. But, as with ROTTR, reducing the usage will probably come with next to no visual loss. I certainly wouldn't opt for a 1060 over a Fury.
 
I'll probably have to lower some settings, as I want to be as close to 75Hz at 3440x1440 as I can anyway.

Everything I've looked at shows the Fury about 20% faster than a 390 at 3440 res in most games, so I'll be genuinely disappointed if that turns out not to be the case (in general, it doesn't have to be in every game) :/

I've been planning on getting a 21:9 panel too, so I'll be interested to see how your card does if you can do a series of tests at some point, because I won't be able to afford to upgrade my monitor and GPU at the same time. Plus, like you, 75Hz seems to be the one to get. There are also some 144Hz VA 2560x1080 panels, but I think I want to go 1440 to maximise the upgrade.
 
Whilst having more VRAM is nice, the power of the GPU core is still king over it unless the difference is huge.

If I had a choice, I would pick a 4GB Fury X over, say, a 6GB 1060.

Usually, to save VRAM one would only need to turn down texture quality, and that is sufficient.
 
I've been planning on getting a 21:9 panel too, so I'll be interested to see how your card does if you can do a series of tests at some point, because I won't be able to afford to upgrade my monitor and GPU at the same time. Plus, like you, 75Hz seems to be the one to get. There are also some 144Hz VA 2560x1080 panels, but I think I want to go 1440 to maximise the upgrade.

When I get home tonight I'll take a few FPS snapshots of a few games using my 390. They won't be benchmark runs, but I can easily take a few readings from Doom, Evolve and The Witcher 3, probably. These are three games where I find it difficult to keep minimum frame rates as high as I would like with the 390 in ultrawide, so they're good candidates.

Then, when I put the Fury in tomorrow, I'll take another FPS snapshot from the same spots with the same settings and will report back.
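
If it's useful, here's a minimal sketch of how those before/after readings could be compared once the frame times are logged to a file (Python; the file names and the one-frame-time-per-line format are assumptions, so adapt them to whatever your capture tool writes out):

```python
# Sketch: compare average and minimum FPS between two runs from logged frame times.
# Assumes one frame time in milliseconds per line; file names are placeholders.
def run_stats(path):
    with open(path) as f:
        times_ms = [float(line) for line in f if line.strip()]
    avg = 1000.0 * len(times_ms) / sum(times_ms)   # average fps over the run
    minimum = 1000.0 / max(times_ms)               # slowest single frame as "min fps"
    return avg, minimum

for card in ("390", "fury"):
    avg, minimum = run_stats(f"doom_{card}_frametimes.txt")  # hypothetical log files
    print(f"{card}: avg {avg:.1f} fps, min {minimum:.1f} fps")
```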
 
I own a Fury and a 1080p screen, and yes, the RAM has proven to be an issue with several titles.

On release, the Very High texture setting made ROTTR unplayable. I haven't tested it for months, but what matters is how it behaves when you play it.

It appears to be a lot better now DX12 is available, but I'd have to redo the bench to see how it does. Actually, I'll go and see now and edit in a minute. (I hope it's fixed now for those who've yet to play the game, but we'll see.)

Forza won't even let me turn the Very High texture setting on because my Fury only has 4GB.

EDIT: https://www.dropbox.com/s/qkkusvtg50od0gq/ROTTR 2016-07-19 10-16-29-69.bmp?dl=0

As you can see, single-figure minimums. That was DX11; DX12 gave a minimum of 0.30 in the Geothermal Valley, but I could not get a screenshot.
This is today. When the game released and many of us actually played the campaign, we were unable to use the Very High texture setting with the Fury cards because of how it juddered.
Today I lowered the AA to FXAA to see if it made a difference, but no. As you can see, perfectly playable averages but dire minimums.
So you can claim people don't back up what they are saying all you want; I own a Fury and it's a great card, but I'm not going to sugarcoat the issues.

This is all disconcerting to read, especially since I've bought the card. On one hand there are slides showing the 4GB not to be a problem; on the other hand there are user experiences showing it to be a problem.
I was hoping that this card could easily power a 1440p FreeSync monitor :confused:
 
This is all disconcerting to read, especially since I've bought the card. On one hand there are slides showing the 4GB not to be a problem; on the other hand there are user experiences showing it to be a problem.
I was hoping that this card could easily power a 1440p FreeSync monitor :confused:

I'm using 2160p and have not experienced any problems, so I'm sure you'll be fine at 1440p. :)

There are one or two games that impose limitations, but you can work around those, and when you do, performance is fine.
 
This is all disconcerting to read, especially since I've bought the card. On one hand there are slides showing the 4GB not to be a problem; on the other hand there are user experiences showing it to be a problem.
I was hoping that this card could easily power a 1440p FreeSync monitor :confused:

It will easily power a 1440p FreeSync monitor.

Don't be concerned at all.
 
Not a chance. There will be occasions where settings may need to be different due to memory hogging by the texture files, as I'm expecting Nvidia to push more games to do that when they're involved with development, both to persuade Maxwell owners to upgrade and to make their Pascal cards look better than AMD's current flagships. But, as with ROTTR, reducing the usage will probably come with next to no visual loss. I certainly wouldn't opt for a 1060 over a Fury.

That's the point I was making to Dave2150, who claimed he would pick a 1060 all day, every day, over the Fury Nitro because the 1060 has 6GB of VRAM.

And it won't matter if Nvidia push developers to use huge amounts of RAM in games. The settings needed to make running out of VRAM a problem for the Fury will be much higher than the 1060 will be able to drive anyway.

Come back in two years; the Fury will still be the faster card.
 