
AMD: Future GPUs will significantly boost performance in 4K resolutions

Did anyone hear that news about how, if a game is made to take advantage of DX12/Mantle, VRAM stacks? So we would finally get 8GB with two cards. If true, it only leaves micro-stutter to be fixed and all would be sweet.

In theory it can be done with Mantle; in practice it would be very difficult to do.
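For context, the mechanism being hinted at is what DX12 eventually exposed as explicit multi-adapter: each GPU's memory becomes its own addressable pool, and the application decides what lives where instead of the driver mirroring everything. A minimal sketch of the idea, assuming a linked dual-GPU setup that shows up as two nodes on one D3D12 device (error handling omitted):

```cpp
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

int main()
{
    // Assumed setup: two GPUs linked under one D3D12 device, so it reports two "nodes".
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    ComPtr<IDXGIAdapter1> adapter;
    factory->EnumAdapters1(0, &adapter);

    ComPtr<ID3D12Device> device;
    D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device));

    UINT nodes = device->GetNodeCount();   // 2 on the assumed linked dual-GPU setup
    (void)nodes;

    // Put this texture physically in GPU 1's memory only, instead of mirroring it.
    D3D12_HEAP_PROPERTIES heap = {};
    heap.Type             = D3D12_HEAP_TYPE_DEFAULT;
    heap.CreationNodeMask = 0x2;   // memory lives on node 1 (the second GPU)
    heap.VisibleNodeMask  = 0x2;   // only node 1 needs to address it

    D3D12_RESOURCE_DESC desc = {};
    desc.Dimension        = D3D12_RESOURCE_DIMENSION_TEXTURE2D;
    desc.Width            = 4096;
    desc.Height           = 4096;
    desc.DepthOrArraySize = 1;
    desc.MipLevels        = 1;
    desc.Format           = DXGI_FORMAT_R8G8B8A8_UNORM;
    desc.SampleDesc.Count = 1;
    desc.Layout           = D3D12_TEXTURE_LAYOUT_UNKNOWN;

    ComPtr<ID3D12Resource> texture;
    device->CreateCommittedResource(&heap, D3D12_HEAP_FLAG_NONE, &desc,
                                    D3D12_RESOURCE_STATE_COPY_DEST, nullptr,
                                    IID_PPV_ARGS(&texture));

    // Because nothing forces GPU 0 to hold a copy, 2 x 4GB can behave as separate
    // pools; the engine has to manage the split explicitly though.
    return 0;
}
```

Nothing in that makes the stacking automatic, which is exactly the "difficult in practice" part: the engine still has to split its resources sensibly and copy data across the link itself, otherwise you are back to mirroring and micro-stutter.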
 
The 380X isn't the top-end card, that's the 390X. The 380X is AMD's answer to the GTX 980, which is also a 4GB card.

For those that want something higher end, that will be the 390X and GM200.

If the 380X comes out and beats the 290X it will be the current top-end card, just like the 980 is currently Nvidia's top card.
 

Try to forget the release dates and think about the product stacks.

We know these aren't the full parts. GM204 isn't fully enabled Maxwell, and neither will the 380X be.

I.e. GM200 will likely have more than 4GB. Likewise the 390X, the '290X' replacement, is likely to have more than 4GB, or at least various VRAM options just like the 290X.

So those that want something more beefy just have to wait a bit longer for GM200 and 390X..

I suspect that the 380X will be the top card for this year and we won't see the 390X until the middle of next year.

There is only so much you can do with 28nm and I think the 380X will be pushing it to the limit. :)

The 390X will likely be a fully enabled part on a die shrink. The 380X will be an interim card to take on the GTX 980, probably re-spun next year on a die shrink. Hopefully the 390X will be released in Q4 in time for Xmas. Nice Xmas present to myself :p
 

Full part or not, it will most likely be, as I said, the current top card.

There is nothing wrong with users wanting a card below the full-fat premium edition to have more than 4GB of VRAM, especially when that card is the current flagship/top model for a vendor.

AMD launch a 4GB 380X and two weeks later Nvidia release an 8GB 980 as a knee-jerk response. What I'd prefer is for AMD to come out of the gate running with an 8GB 380X if possible, so it's already ahead. AMD make some great cards but they are often trailing/responding to the other side, as you yourself pointed out with the 380X to combat the 980.

Even Gibbo has said he's been asking for an 8GB version of the 980 for months.
 
I would be pretty surprised if the full-blooded 390X is not released before Xmas. As soon as AMD release their 980 slayer (most likely the 380X), you can bet your bottom dollar NVIDIA will be ready to release a Maxwell card with some actual high-end specs. After that it's the natural 'mine is bigger than yours' war, like when the 290X came out and a 780 Ti followed soon after.

NVIDIA naturally have the upper hand recently because Maxwell is so ridiculously efficient. Just look at the specs of the 980 compared to a GK110 card. There is a lot of headroom for Maxwell. I just hope AMD bring their A game so we can see some real high end cards.
 
It's interesting for sure. I would expect it will mean less chance of bigger-memory versions at a later date as well; as you say, it's not just a case of sticking higher-density or more chips on the card's PCB (like we get with the current 8GB cards). Bigger versions of HBM cards would need to be designed that way from the start.

That really isn't what I was saying. I suspect it will be similar to how it is now: memory split into multiple channels, with each channel connecting to a memory chip, so if you cut the bus you cut the memory, because you can't just add chips independently. To go from 4GB to 8GB now you just use double-density memory chips; the same will be true of HBM, using a 2GB stack instead of a 1GB stack. I would presume a card needs to be designed for a specific number of stacks in the same way a 290X needs a specific number of memory chips. Capacity isn't hugely relevant, just how they talk to each other; software-wise it doesn't really matter which part of the chip you access, just where you send that message.
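Rough numbers for the above: the GDDR5 side matches a 290X-style card (512-bit bus, sixteen 32-bit chips), while the HBM stack count and per-stack capacities are assumptions for illustration rather than confirmed specs.

```cpp
// Toy arithmetic: capacity scales with chip/stack density, while the bus width
// fixes how many devices the memory controller talks to.
#include <cstdio>

int main()
{
    // GDDR5 on a 290X-style card: 512-bit bus, 32 bits per chip -> 16 chips.
    const int busWidthBits = 512;
    const int bitsPerChip  = 32;
    const int chips        = busWidthBits / bitsPerChip;            // 16
    printf("GDDR5: %d chips -> %d GB with 2Gb chips, %d GB with 4Gb chips\n",
           chips, chips * 2 / 8, chips * 4 / 8);                    // 4 GB vs 8 GB

    // First-gen HBM (assumed figures): 4 stacks, 1024 bits per stack.
    const int stacks       = 4;
    const int bitsPerStack = 1024;
    printf("HBM:   %d-bit bus, %d GB with 1GB stacks, %d GB with 2GB stacks\n",
           stacks * bitsPerStack, stacks * 1, stacks * 2);          // 4 GB vs 8 GB
    return 0;
}
```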

It shouldn't make any difference in that sense to current cards.

What will be interesting more than anything is the packaging. Remember Nvidia's fake Pascal mock-up. With far fewer chips on the PCB (the 16 memory chips are pretty much gone), power delivery all goes to the same point and thousands of traces disappear, so routing everything left over (which is mostly much simpler, lower trace-count power components) becomes trivial and the PCB can become relatively tiny.

HBM should enable really small and efficient PCB designs, leaving more room for cooling and more chance of generation-to-generation compatibility between coolers. It's only just occurred to me while writing this how damn small a dual-GPU card could become... absolutely tiny in comparison to current dual-GPU cards. What I really wonder is: can they stick two GPUs and two sets of memory all on one interposer and have one package on the card, with significantly reduced core-to-core latency and a huge-bandwidth connection... could that lead to the end of micro-stutter... maybe.
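On the bandwidth side, a quick back-of-envelope comparison; the 290X figures are the shipping spec, while the HBM line assumes the announced first-generation numbers (1 Gbps per pin, four 1024-bit stacks), which may not be what the 390X actually uses.

```cpp
// Back-of-envelope bandwidth comparison for the "huge bandwidth" point above.
#include <cstdio>

int main()
{
    // 290X: GDDR5 at 5 Gbps effective per pin on a 512-bit bus.
    double gddr5 = 512 /*bits*/ * 5.0 /*Gbps*/ / 8.0;          // = 320 GB/s
    // HBM gen 1 (assumed): 1 Gbps per pin, 1024-bit per stack, 4 stacks on the interposer.
    double hbm   = 4 * 1024 /*bits*/ * 1.0 /*Gbps*/ / 8.0;     // = 512 GB/s

    printf("GDDR5 (290X-style): %.0f GB/s\n", gddr5);
    printf("HBM, 4 stacks:      %.0f GB/s\n", hbm);
    return 0;
}
```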
 
Seems like DX12 is going to be a Mantle-type approach which will also benefit from more cores/threads etc., so we could finally see more adoption of multithreaded games? Hope so, as it would also mean GPUs are freed up to do more bits and bobs, like Mantle does.

If I'm correct, DX12 is going to be Win8 and above? Those on 7 shouldn't worry, as you can probably do the free upgrade to Win10 anyhow, and I wouldn't put it past Microsoft to keep DX12 off Win7 for that exact reason: it's no secret that take-up of Win8 was poor, so it's a way to kind of push people towards Win10.

Also, if I'm correct, DX12 will be compatible with Nvidia Maxwell cards and AMD GCN cards, but you may need proper DX12 cards to get full use of all the features, as older cards won't be able to access everything available, or something like that?
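The way I read it, "supports DX12" splits into two things: the card can create a DX12 device at some feature level, and it then reports which optional feature tiers it actually has. A sketch against the D3D12 API as it eventually shipped; I'm not claiming which tiers any specific card ends up with.

```cpp
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>
using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<ID3D12Device> device;
    // Ask only for the lowest feature level; existing GCN/Maxwell cards can still
    // get a device, they just won't expose every optional feature.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
    {
        printf("No D3D12 runtime / driver available\n");
        return 1;
    }

    // Then query which optional feature tiers this particular GPU reports.
    D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS, &opts, sizeof(opts));
    printf("Resource binding tier: %d\n", (int)opts.ResourceBindingTier);
    printf("Tiled resources tier:  %d\n", (int)opts.TiledResourcesTier);
    printf("Conservative raster:   %d\n", (int)opts.ConservativeRasterizationTier);
    return 0;
}
```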

That's pretty much how I understand it, I'm probably very wrong on a lot of it though.

My hope is that DX12 is like Mantle in design, enabling devs to use more threads easily and allowing cards to stack RAM etc. The end result would mean better 4K gaming all round, and hopefully better porting of console games.
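To make the "more threads" part concrete, here's a minimal sketch of the submission model DX12 ended up with (same basic idea as Mantle): each worker thread records into its own command list with no locking, and everything is handed to the queue in one go. The worker count and the empty recording lambda are just placeholders.

```cpp
#include <d3d12.h>
#include <wrl/client.h>
#include <thread>
#include <vector>
using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<ID3D12Device> device;
    D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device));

    D3D12_COMMAND_QUEUE_DESC qdesc = {};
    qdesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    ComPtr<ID3D12CommandQueue> queue;
    device->CreateCommandQueue(&qdesc, IID_PPV_ARGS(&queue));

    const int workers = 4;
    std::vector<ComPtr<ID3D12CommandAllocator>>    allocators(workers);
    std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(workers);
    std::vector<std::thread> threads;

    for (int i = 0; i < workers; ++i)
    {
        // One allocator + command list per thread: no locking while recording.
        device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                       IID_PPV_ARGS(&allocators[i]));
        device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                  allocators[i].Get(), nullptr,
                                  IID_PPV_ARGS(&lists[i]));
        threads.emplace_back([&, i] {
            // ... this thread would record its share of the frame's draw calls ...
            lists[i]->Close();
        });
    }
    for (auto& t : threads) t.join();

    // Hand everything to the GPU in a single submission.
    std::vector<ID3D12CommandList*> raw;
    for (auto& l : lists) raw.push_back(l.Get());
    queue->ExecuteCommandLists((UINT)raw.size(), raw.data());
    return 0;
}
```

The difference from DX11 is that the driver is no longer trying to parallelise a single immediate context behind the app's back; the threading is in the app's hands.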
 
4GB
4GB
4GB
4GB
4GB
4GB
4GB

AMD have a great opportunity to exploit nVidia's weakness at 4K, and if it is indeed 4GB... well, what a let-down hahahahahaHAHAHAHA :D

 
I agree Greg, 4GB would be a massive disappointment. As people have pointed out, AMD should push out something truly mind-blowing for the 380X, with a minimum of 6GB of VRAM on it, preferably 8GB. Make 4K viable on a single GPU, then it's down to Nvidia to bring out their big guns, all the while AMD hold the 390X bomb up their sleeve.

Of course it will never happen; we will get a card that's 20% better than a 290X with 4GB of RAM, a lame AIO cooler and power consumption that will make us consider having a wind turbine in our gardens.

Oh well, I will still buy it and cry into my pillow at night.
 
AMD have a great opportunity to exploit nVidia's weakness at 4K, and if it is indeed 4GB... well, what a let-down hahahahahaHAHAHAHA :D

This from the same person who cried troll in the other G-Sync thread. Do yourself a favour and grow up. It's tiresome seeing you constantly drag this forum down.
 
I agree Greg, 4GB would be a massive disappointment. As people have pointed out, AMD should push out something truly mind-blowing for the 380X, with a minimum of 6GB of VRAM on it, preferably 8GB. Make 4K viable on a single GPU, then it's down to Nvidia to bring out their big guns, all the while AMD hold the 390X bomb up their sleeve.

Honestly, if this is true, AMD have seriously missed a fantastic opportunity. nVidia "had" the 6GB 780 and nothing else within most people's reach, so popping 4GB (even if it is 512-bit) on a refresh is somewhat lame, and 4K needs more VRAM.

I hope and pray the OP is a joke made for hits, because 4K on a single GPU with all the bells and whistles is a cool notion, but it will be trouncing the 4GB in no time.

Edit:

Did I run over someone's cat on the way home from work? I can feel the daggers going in :D
 
Of course it will never happen; we will get a card that's 20% better than a 290X with 4GB of RAM, a lame AIO cooler and power consumption that will make us consider having a wind turbine in our gardens.

Oh well, I will still buy it and cry into my pillow at night.

you will at least be warm when crying into your pillow
stop complaining!
:o
 
Honestly, if this is true, AMD have seriously missed a fantastic opportunity. nVidia "had" the 6GB 780 and nothing else within most people's reach, so popping 4GB (even if it is 512-bit) on a refresh is somewhat lame, and 4K needs more VRAM.

I hope and pray the OP is a joke made for hits, because 4K on a single GPU with all the bells and whistles is a cool notion, but it will be trouncing the 4GB in no time.

I agree, and we have all seen the shoddy console ports hammering our GPUs. If I'm designing a card right now, I'm saying: look at the uptake of our 8GB models that people are picking up for 4K; if we want to sell tons of units we need to adopt that as a minimum. If they are pinning their hopes on DX12 making 4K gaming easier to achieve on 4GB cards, that's a bit narrow-sighted IMO.
 
you will at least be warm when crying into your pillow
stop complaining!
:o

I can't :( I've been holding out on buying a new GPU and screen as I want a FreeSync screen and either CrossFire 380Xs or a 390X. Would love to go 4K but happy to go 1440p @ 144Hz. I want something I can slot in and then not worry about for a couple of years :)
 
I agree, and we have all seen the shoddy console ports hammering our GPUs. If I'm designing a card right now, I'm saying: look at the uptake of our 8GB models that people are picking up for 4K; if we want to sell tons of units we need to adopt that as a minimum. If they are pinning their hopes on DX12 making 4K gaming easier to achieve on 4GB cards, that's a bit narrow-sighted IMO.

Spot on.

I have gamed at 4K and 4GB isn't enough and that's what makes me think the OP is wrong. AMD couldn't be that daft...
 

Problem is, even cards with 8GB of VRAM won't have the grunt to run 4K at a solid frame rate anyway. 1440p maybe, but even then I'd be surprised if they could hold 60+ fps in most games without dipping ridiculously low.
 
Exactly my worry after Thracks' latest comments about DX12 and memory pools etc. It sounds like the foundations are being laid to justify 4GB cards. Of course they might just do 4GB for the reference cards and then push out 6GB or 8GB versions later, or hold off and release the 390X as a higher-clocked 380X with double the 380X's VRAM.
 