
Possible Radeon 390X / 390 and 380X Spec / Benchmark (do not hotlink images!!!!!!)

Status
Not open for further replies.
It's definitely a weak argument, I know that. But AMD are purported to release 8 GB HBM variants in August. If 4 GB HBM is enough for 4K+, then why do this? I realise this move doesn't mean HBM = GDDR5 in terms of capacity, but then it comes back to what I was initially saying about AMD's marketing.

Final8y could be right, but I wish AMD was much more vocal about the wonders of HBM. They could at least sway a lot of people ordering 980 Tis. The fact that they aren't is why I assume the 'worst'.

Only a few days to go now though. :)

All is fair in the spirit of OVER 9000!!1"!

I really do hope the 8GB cards don't turn out to be dual GPU cards.
 

There won't be 8GB variants in August, or at all. There will be an 8GB Fury X2, without doubt - I assume on dual linked interposers rather than one big interposer, as it's likely to be cheaper doing it that way.

It simply wouldn't be worth it. Improvements would be marginal (and in a tiny minority of titles) if we assume bandwidth parity, and it'd cost an arm and a leg both to purchase and to produce.

By the time the titles people think will need more than 4GB appear, or become anything more than a ridiculously tiny minority, Vulkan and DX12 will be the standard, which makes it largely moot thanks to tiled resources and numerous other differences from DX11. But say it's greater than 4K, or a new wave of titles with huge texture resolutions at 4K... by the time those come around, a single Fury X isn't going to be anywhere near enough to run said games at more than a slideshow.

4GB is 1) more than enough and 2) absolutely the right choice; 3) HBM2 is far less of a big deal for the 4xx series (Arctic Islands) than the shrink to 14nm FinFET, which should bring gigantic gains; 4) HBM2's bandwidth increase will offer far more than the capacity increase; and 5) if you intend to go CrossFire, all DX12 / Vulkan titles should be able to pool the memory to 8/12/16GB etc.
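For anyone wanting to sanity-check the bandwidth point above, here's a rough sketch of the bus arithmetic. The figures are the public launch specs for the two cards, not numbers from this thread.

```python
# Rough bus-bandwidth arithmetic behind the HBM vs GDDR5 point.
# Peak bandwidth = bus width (bits) x effective per-pin rate (Gbit/s) / 8.

def bandwidth_gbs(bus_width_bits, effective_rate_gbps):
    """Peak memory bandwidth in GB/s."""
    return bus_width_bits * effective_rate_gbps / 8

# Fury X (HBM1): 4096-bit bus, 1 Gbit/s effective per pin (500 MHz DDR)
fury_x = bandwidth_gbs(4096, 1.0)
# R9 390X (GDDR5): 512-bit bus, 6 Gbit/s effective per pin
r9_390x = bandwidth_gbs(512, 6.0)

print(f"Fury X (HBM1):   {fury_x:.0f} GB/s")   # 512 GB/s
print(f"R9 390X (GDDR5): {r9_390x:.0f} GB/s")  # 384 GB/s
```

So even against a very wide GDDR5 setup, HBM1 has a third more headline bandwidth on a card with far more shaders to feed.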
 
All the evidence points to 4GB of VRAM being a recipe for disaster.
 

Then why increase cost and put 8GB on a card far slower than the Fury?
 
All the evidence points to 4GB of VRAM being a recipe for disaster.

Only since NV put more memory on their cards... since then, 4GB hasn't been enough even for Super Mario.

TBH it seems the verdict was reached before the card was even out... people just needed something to justify it (loud, hot, only 4GB, not beating NV by enough, etc.).

Reminds me of the old film "The Witness", set in Soviet-era Hungary: a poor man is summoned to a political trial as a witness, and before the trial a secret agent asks if he wants to read the prepared statement. He looks at it and says, "Excuse me, comrade, but this is the verdict..." :)
 
Then why increase cost and put 8GB on a card far slower than the Fury?

Marketing, mainly. Their enthusiast and mid-range cards then all have more memory than NVIDIA's, and their halo product has new memory with massively more bandwidth in a smaller form factor - that gives them two marketing points where NVIDIA absolutely cannot compete. Secondly, bandwidth.

All the evidence points to 4GB of VRAM being a recipe for disaster.

All evidence points to this 'evidence' being trolling.
 
I'm guessing you don't play any AAA titles, put all games on the lowest possible settings, and are also on a <1080p monitor?

No games require more than 4GB. None. Zero.

Some will use more if it's available, but don't seem to suffer at all above 3GB.

The only time it's "needed" is cranking up MSAA to stupid levels (mainly on NVIDIA cards) at resolutions where it's not needed anyway.

AAA is usually code for crappy console ports which are barely optimised and will either use or report >4GB without needing it at all. Witcher 3 is a perfect example of this ... but even it, a GW title, tends to run at 3-3.5GB at 4K with 4xMSAA, Hairworks on, and on a Titan X.
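The MSAA point above is easy to put numbers on. This is back-of-the-envelope arithmetic for render-target sizes only, assuming standard RGBA8 colour and 32-bit depth/stencil formats; real engines add G-buffers, intermediate targets and so on, so treat it as a floor, not a measurement.

```python
# Estimate render-target memory at a given resolution and MSAA level.
# MSAA multiplies the per-pixel storage of every multisampled target,
# which is why cranking it at 4K inflates VRAM use so quickly.

def render_target_mib(width, height, bytes_per_pixel, msaa_samples=1):
    """Size of one render target in MiB."""
    return width * height * bytes_per_pixel * msaa_samples / 2**20

for msaa in (1, 4, 8):
    colour = render_target_mib(3840, 2160, 4, msaa)  # RGBA8 colour buffer at 4K
    depth = render_target_mib(3840, 2160, 4, msaa)   # D24S8 depth/stencil at 4K
    print(f"4K, {msaa}x MSAA: ~{colour + depth:.0f} MiB for colour + depth alone")
```

At 8x MSAA the two basic targets alone pass half a gigabyte, before a single texture is loaded - which is the "stupid levels at resolutions where it's not needed" scenario.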
 
It's the chrome finish; covered in greasy fingerprints it doesn't look very nice.

I think a matt silver or brushed alloy finish would be better.

I once made the point about fingerprints on chrome and got jumped on for saying it!


I'm actually wondering if DX12 will help. I played BF4 with Mantle and it increased VRAM requirements so much that I ran out of VRAM on 7950 CF @ 1080p, whereas DX11 was fine.
My worry is that with Vulkan and DX12, developers will have more control over this stuff - the same developers that do lazy console ports...


I seem to recall the first time 4GB was questioned as not being enough was by AMDMatt, just after Nvidia released the 900 series cards with 4GB and AMD released some 290Xs with 8GB. At that point AMDMatt produced a list of at least 10 games that hit VRAM limits on 4GB.
Now that AMD's new cards have 4GB and Nvidia's have more, are we saying Matt was in fact completely mistaken?
I just find it so odd that when the 8GB 290Xs were released 4GB was already an issue, yet now that the new cards look to have 4GB it isn't an issue, and won't be until AMD release cards with more than 4GB.
I've no doubt that once the 8GB AMD cards are out, it'll be discovered that 6GB isn't enough VRAM...
 
They know they won't sell many Furys, so it doesn't matter how much memory it has; the 390X has 8GB because AMD have ambitions to sell lots of them.

More than 4GB might not be absolutely necessary now, but the computer industry moves fast and you can bet 4GB will be less than ideal in 12-24 months. I'm guessing some people can afford to blow £600+ on short-term performance, but most people will look for longevity at that price.
 

I know it depends on your total amount of VRAM and the "extra" frame buffer if you have it available (I have used a vast array of GPUs), but let's not forget that most people wanting to drive 1440p or 4K at good FPS (60 FPS) are looking at this card, along with the (small number of) 1080p users looking for 120 FPS+ and/or futureproofing for newer games.

Adding a second card is going to get you that FPS (60+) on the highest settings with MSAA, so why should you need to drop something due to a VRAM bottleneck? (Most SLI/CFX rigs are medium to high-end tier, not low end.)

When you shell out cash on a new toy ($XXX if not more) at the highest end, I doubt you want to turn down settings.

3GB is borderline up to 1440p; for newer titles you are going to have to lower quite a few details.

4GB is OK for 1440p (for most games); in some you are going to have to sacrifice a detail or two.

4GB is OK in some scenarios at 4K, but don't be surprised if you hit that limit when going over medium-high settings (even without AA) in AAA titles.

I also want to add something: the difference moving from SLI GTX 970s and CFX R9 290s (4GB) to a 12GB Titan X (single and SLI) has a lot to do with the frame buffer.

More = fewer swaps = a smoother gaming experience
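The "more = fewer swaps" point can be sketched with a toy model. Anything that doesn't fit in VRAM has to stream back over PCIe when it's touched, and PCIe is an order of magnitude slower than on-card memory; the ~15.75 GB/s figure is the theoretical PCIe 3.0 x16 peak, and the frame budget assumes a 60 FPS target.

```python
# Toy model: best-case time to pull evicted data back over the PCIe bus,
# compared against a 60 FPS frame budget. Real stutter is worse, since
# transfers contend with normal rendering traffic.

PCIE3_X16_GBS = 15.75        # theoretical PCIe 3.0 x16 bandwidth, GB/s
FRAME_BUDGET_MS = 1000 / 60  # ~16.7 ms per frame at 60 FPS

def swap_time_ms(overflow_mib, link_gbs=PCIE3_X16_GBS):
    """Milliseconds to transfer `overflow_mib` MiB across the bus."""
    return overflow_mib / 1024 / link_gbs * 1000

for overflow in (64, 256, 512):
    t = swap_time_ms(overflow)
    print(f"{overflow} MiB over budget: ~{t:.1f} ms transfer "
          f"vs {FRAME_BUDGET_MS:.1f} ms frame budget")
```

Even a few hundred MiB over budget costs more than a whole frame's worth of time, which is exactly the hitching people describe when a card runs out of VRAM.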
 
If you don't want 4GB, don't buy a 4GB card, simple as that. There's an 8GB version en route, we all know that. At least the first one will let us know what kind of raw performance the card has, and whether it's worth waiting for the 8GB version or just buying something else.
 

I'm interested (even if it's 4GB) for that exact reason ^.

Sadly the CrossFire 4K benchmarks are going to be limited (IMO) in their use of AA.
 
Reviews will show if that's true :cool: You can be sure reviewers will test its limits.

All the initial reviews will probably only be allowed to run certain tests with certain criteria; that is fairly standard practice with launch reviews.
 