Possible Radeon 390X / 390 and 380X Spec / Benchmark (do not hotlink images!!!!!!)

About the 4GB variants:

Everyone is moaning about 4K and multi-monitor, and how you need more VRAM for the more powerful GPU...
What about the people with 1080p/1440p 120/144Hz monitors? They don't really need that much VRAM, but they definitely need the GPU grunt. (I know 1440p can eat VRAM as well, but it's still a lot less demanding than 2160p.)
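To put rough numbers on that resolution gap: the render targets themselves scale with pixel count, and a quick back-of-the-envelope sketch shows they're a small slice of VRAM either way (the buffer count and bytes-per-pixel here are illustrative assumptions, not figures from any particular game):

```python
# Rough render-target cost: width * height * bytes_per_pixel * buffer_count.
# Assumes 4 bytes/pixel and 4 buffers (colour front/back + depth + one extra
# target) purely for illustration; real engines vary a lot.

def render_targets_mb(width, height, buffers=4, bytes_per_pixel=4):
    return width * height * bytes_per_pixel * buffers / (1024 ** 2)

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "2160p": (3840, 2160)}.items():
    print(f"{name}: ~{render_targets_mb(w, h):.0f} MB of render targets")
# -> roughly 32 MB, 56 MB and 127 MB respectively
```

Even 2160p only costs a hundred-odd MB of render targets under those assumptions; the bulk of VRAM goes on textures, which is why resolution alone doesn't decide how much you need.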
 
It's funny seeing the massive shift in opinion from some folks compared to this time last year, or even the year before that.

AMD offer more VRAM than Nvidia (290 vs 780) = VRAM doesn't matter.
Nvidia offer more VRAM than AMD = VRAM is the most important thing ever.

Now the roles are reversed and we have AMD guys saying 4GB is fine at 4K. It would be nice to see some consistency from the fanboys in both camps.

I have a 4GB 980 and the VRAM is plenty for 4K with the current games I have played; it's the grunt that's lacking at 4K, not the VRAM. But I have always felt you should get as much VRAM as you can afford to future-proof. If the AMD Fury ends up with 4GB I would be reluctant to get one unless it was quite a bit faster than the 980Ti.

I'm prepared to wait and see how they stack up before I commit. I would hate to pay £530+ for a 980Ti to find it was marginalised by the AMD Fury X in two weeks' time.
 
It's funny seeing the massive shift in opinion from some folks compared to this time last year, or even the year before that.

AMD offer more VRAM than Nvidia (290 vs 780) = VRAM doesn't matter.
Nvidia offer more VRAM than AMD = VRAM is the most important thing ever.

Now the roles are reversed and we have AMD guys saying 4GB is fine at 4K. It would be nice to see some consistency from the fanboys in both camps.

I have a 4GB 980 and the VRAM is plenty for 4K with the current games I have played; it's the grunt that's lacking at 4K, not the VRAM. But I have always felt you should get as much VRAM as you can afford to future-proof. If the AMD Fury ends up with 4GB I would be reluctant to get one unless it was quite a bit faster than the 980Ti.

I'm prepared to wait and see how they stack up before I commit. I would hate to pay £530+ for a 980Ti to find it was marginalised by the AMD Fury X in two weeks' time.

That's a good opinion.
I'm just about thinking of changing my 1080p to 1440p, and not even bothering with 4K for years yet, so I'm not worried much about VRAM.
 
8GB would be the sweet spot, but we still don't know how 4GB HBM would actually perform. Those same guys that said the 3.5GB + 0.5GB GTX 970 is fine, don't say the same thing about AMD's 4GB HBM. Funny that :D

Keeping an open mind on these new cards until we see performance / legit benchmarks and reviews.

That chip looks awesome, looks like all of it will be cooled by the custom AIO. Liking the design, now just waiting for results..
 
From what I gather, Shadow of Mordor is one of the only games that really pushes such high VRAM usage; if that's the case then I certainly won't use it as a benchmark for what I need in my gaming life, because that game is shallow and boring as hell.

Beyond that, I believe the next 5 years of games will pretty much settle around the level of fidelity they're at now. The consoles control the market and I'll be very surprised if anything looks much better than The Witcher 3 within the next few years.

I do hope I'm proven wrong.

I'm just holding out for some real benchmarks to see how AMD's new cards stand up, then I will buy a card from whichever camp provides the best bang for buck.
 
The Witcher 3 doesn't even look that good, some of the textures are absolutely awful. If nothing looks better than that in the next few years then I'll just buy a console :p

Not sure why people think VRAM requirements are limited by the consoles either; back when the 360/PS3 were around, PC games used way more VRAM than those consoles had.

4GB is fine as long as HBM does something to stop it being a limiting factor.
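On the bandwidth side at least, the maths is straightforward: bandwidth is bus width times per-pin data rate. A quick sketch using the publicly quoted figures for first-gen HBM (1024-bit per stack at 1 Gbps) against the 290X's 512-bit GDDR5 at 5 Gbps:

```python
# Peak memory bandwidth in GB/s = (bus width in bits / 8) * data rate in Gbps.
def bandwidth_gbs(bus_width_bits, gbps_per_pin):
    return bus_width_bits / 8 * gbps_per_pin

gddr5_290x = bandwidth_gbs(512, 5.0)       # 290X: 512-bit GDDR5 at 5 Gbps
hbm1_fiji = bandwidth_gbs(4 * 1024, 1.0)   # 4 HBM1 stacks, 1024-bit each at 1 Gbps
print(f"290X GDDR5: {gddr5_290x:.0f} GB/s vs Fiji HBM: {hbm1_fiji:.0f} GB/s")
# -> 320 GB/s vs 512 GB/s
```

Extra bandwidth doesn't add capacity, mind; it just makes shuffling data in and out of that 4GB much cheaper, which is presumably the bet AMD are making.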
 
An 8GB 390X refresh card for around £300-350 could end up being a real winner for AMD, I think. This is the upper price point most are willing to spend, and it would definitely have a lot of performance.

I agree, especially for lower-resolution gamers. If it's no noisier, people would be prepared to sacrifice a little green goodness for the peace of mind of a card that will last until the end of this console generation, knowing full well the loss in value with the next process shrink.

I don't know why AMD supposedly don't want to be the value brand; the products that sell millions make more money than the niche ones.
 
8GB would be the sweet spot, but we still don't know how 4GB HBM would actually perform. Those same guys that said the 3.5GB + 0.5GB GTX 970 is fine, don't say the same thing about AMD's 4GB HBM. Funny that :D

Keeping an open mind on these new cards until we see performance / legit benchmarks and reviews.

Same here, I see a few people who flip-flop without even a remote sense of irony. Like I said, my 4GB 980 is fine for VRAM at 4K; it just doesn't have the grunt. Though my preference, all things being similar/equal in performance, is to get whichever card has more VRAM. For me to consider the AMD Fury X at 4GB it would need to be ~10%+ faster. That might not sound like much but it is the difference between 35 and 40 FPS, and it is noticeable. To be honest even a single 980Ti does not give enough grunt to max 4K regardless of having 6GB.

I was largely underwhelmed when I looked at 4K performance for the 980Ti. A definite improvement over the 980 of course but I was hoping for slightly more, hence my wait to see what AMD deliver with the Fury X.
 
The Witcher 3 doesn't even look that good, some of the textures are absolutely awful. If nothing looks better than that in the next few years then I'll just buy a console :p

Not sure why people think VRAM requirements are limited by the consoles either; back when the 360/PS3 were around, PC games used way more VRAM than those consoles had.

4GB is fine as long as HBM does something to stop it being a limiting factor.

I do agree with the comment on the texture quality in The Witcher.

The thing is, though, CDProjekt have pretty much been the only developer that *seemed* to be flying the flag for PC gaming and pushing it to the limits. Since they haven't really delivered on what they promised, the chances of any other developer pushing that envelope are very slim.

We will be looking at Batman for the next big good-looking game, then I figure down the line it will be Star Wars Battlefront and then some games built on UE4 into 2016.

Obviously 60fps 4K is gonna be the perfect experience, and things are exciting in the GFX card market right now. However, I think that beyond benchmarks, gaming experiences are going to be pretty uninspiring.
 
Advanced Micro Devices has officially demonstrated its first graphics processing unit with high bandwidth memory (HBM) code-named “Fiji”, which is set to power AMD’s next-generation flagship graphics card known unofficially as the Radeon Fury X. But while the company did show its “Fiji” chip, it did not want to show the “Fury” card, reports Leo Waldock, a KitGuru editor, from Taipei.

[image: AMD's “Fiji” chip]
 
8GB would be the sweet spot, but we still don't know how 4GB HBM would actually perform. Those same guys that said the 3.5GB + 0.5GB GTX 970 is fine, don't say the same thing about AMD's 4GB HBM. Funny that :D

Keeping an open mind on these new cards until we see performance / legit benchmarks and reviews.

That chip looks awesome, looks like all of it will be cooled by the custom AIO. Liking the design, now just waiting for results..

Games are designed for 2-4GB use.
They design for consoles, but Witcher 3 shows that 4GB is plenty.
Why buy a 12GB card when it's history in around a year anyhow?
 
I do agree with the comment on the texture quality in The Witcher.

The thing is, though, CDProjekt have pretty much been the only developer that *seemed* to be flying the flag for PC gaming and pushing it to the limits. Since they haven't really delivered on what they promised, the chances of any other developer pushing that envelope are very slim.

We will be looking at Batman for the next big good-looking game, then I figure down the line it will be Star Wars Battlefront and then some games built on UE4 into 2016.

Obviously 60fps 4K is gonna be the perfect experience, and things are exciting in the GFX card market right now. However, I think that beyond benchmarks, gaming experiences are going to be pretty uninspiring.

I'd say when both are maxed out GTA V looks better than The Witcher 3 and also requires much more VRAM due to the much better textures. Basically it feels like it's using your GPU properly, whereas The Witcher 3 doesn't.

GTA V reports around 5GB usage for me at 1440p maxed out with 8xAA; whether it's actually using all that I don't know.
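Worth noting that most overlays report what the driver has allocated, not what the game actually touches each frame. A minimal sketch for polling the driver-side figure on an NVIDIA card, assuming the pynvml NVML bindings are installed (the GPU index and polling interval are arbitrary):

```python
# Polls the driver-reported VRAM figure for GPU 0 via NVML.
# Note: this is memory the driver has *allocated*, not memory the game
# actively touches each frame, so a 5GB reading on a bigger card doesn't
# prove a 4GB card would actually stutter.
import time

import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"VRAM used: {mem.used / 1024**2:.0f} / {mem.total / 1024**2:.0f} MB")
        time.sleep(1.0)
finally:
    pynvml.nvmlShutdown()
```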

They design for consoles, but Witcher 3 shows that 4GB is plenty.

The reason The Witcher 3 makes 4GB seem plenty is the awful textures, not good optimisation. You will see this if they ever release an Ultra/HD texture pack or an enhanced edition like they did with The Witcher 2.
 
I'd say when both are maxed out GTA V looks better than The Witcher 3 and also requires much more VRAM due to the much better textures. Basically it feels like it's using your GPU properly, whereas The Witcher 3 doesn't.

GTA V reports around 5GB usage for me at 1440p maxed out with 8xAA; whether it's actually using all that I don't know.

1440p Ultra on The Witcher 3 is awful; 4K medium looks a lot better, let alone Ultra.

GTA V is meh....
 
Same here, I see a few people who flip-flop without even a remote sense of irony. Like I said, my 4GB 980 is fine for VRAM at 4K; it just doesn't have the grunt. Though my preference, all things being similar/equal in performance, is to get whichever card has more VRAM. For me to consider the AMD Fury X at 4GB it would need to be ~10%+ faster. That might not sound like much but it is the difference between 35 and 40 FPS, and it is noticeable. To be honest even a single 980Ti does not give enough grunt to max 4K regardless of having 6GB.

I was largely underwhelmed when I looked at 4K performance for the 980Ti. A definite improvement over the 980 of course but I was hoping for slightly more, hence my wait to see what AMD deliver with the Fury X.

How do you know if 4 GB is enough VRAM for 4K when your card is massively lacking in grunt at that resolution?
 
Games are designed for 2-4GB use.
They design for consoles, but Witcher 3 shows that 4GB is plenty.
Why buy a 12GB card when it's history in around a year anyhow?

I think people buying the Titan X are less about the exact 12GB of VRAM and more about wanting the absolute best single GPU; that's why I bought it, for example. I would have been happy with 8GB and a cheaper price tag.

An 8GB Fiji card would be phenomenal, but the 4GB card might surprise us. HBM is new territory and I think we should all wait for legit reviews / benchmarks and actual user experience before people condemn it.

I'm really looking forward to seeing these at E3, and now that I know AMD didn't show the Fury I am 100% convinced it beats the Titan X. I will be going back to the red team if this is the case; my relentless pursuit of the best single GPU continues through the years :D
 
How do you know if 4 GB is enough VRAM for 4K when your card is massively lacking in grunt at that resolution?

A GTX 980 would be enough for most games at 4K with low-medium settings (no AA). I could run 4K with GTX 680s (2GB) on low settings, and older games you can probably max out even with AA.

Lacking grunt would mean anything from 15-35 FPS, I suppose? With a VRAM bottleneck you are going to be playing at a "smooth" 25 FPS, for example, but then get single-digit frame drops with GPU usage falling to 0%.
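That symptom shows up much more clearly in frame times than in an FPS average. A small sketch for flagging those dips in a frame-time log; the file name and format are hypothetical stand-ins for whatever capture tool you use:

```python
# Flags the frame-time spikes that an average-FPS number hides.
# "frametimes.csv" (one frame time in milliseconds per line) is a
# hypothetical stand-in for a FRAPS/PresentMon-style log.
import statistics

with open("frametimes.csv") as f:
    times_ms = [float(line) for line in f if line.strip()]

median = statistics.median(times_ms)
spikes = [t for t in times_ms if t > 3 * median]  # e.g. 25 FPS pace -> sub-15 FPS dips

print(f"average FPS: {1000 / statistics.mean(times_ms):.1f}")
print(f"median frame time: {median:.1f} ms; spikes over 3x median: {len(spikes)}; "
      f"worst frame: {max(times_ms):.0f} ms ({1000 / max(times_ms):.1f} FPS)")
```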

Parts of it are, but the textures are streets ahead of The Witcher 3, and things like view distance and grass detail are as well. The foliage on Ultra in GTA V is something else.

[image: GTA V foliage on Ultra]

The W3 foliage looks like some noob scribbled it in Paint; that I agree with.
 
Parts of it are, but the textures are streets ahead of The Witcher 3, and things like view distance and grass detail are as well. The foliage on Ultra in GTA V is something else.

[image: GTA V foliage on Ultra]

I've not seen a bush as nice as that in GTA5, if ever. Fnar fnar ;)

Seriously though, not seen anything like that in GTA5, is there only one!?
 