AMD: Future GPUs will significantly boost performance in 4K resolutions

Problem is, even cards with 8GB of VRAM won't have the grunt to run 4K at a solid frame rate anyway. 1440p maybe, and even then I'd be surprised if they could hold 60+ FPS in most games without dipping ridiculously low.

Not single cards, no; 2 or 3, yes. But you would expect the next gen of cards to bring us closer to 4K gaming on a single GPU at reasonable FPS, especially with the benefits of a low-level API in DX12. If coded correctly, I mean.
 
Yes, that is true, but I'd rather steer clear as micro-stuttering still exists, even though I asked a guy working for AMD about it and he said it was fixed in a newer driver.. D'oh.. simply not true.

Higher frame rates seem useless to me if the play experience has to suffer by not being smooth :(
 
I would imagine DX12 might solve the micro-stutter if it does indeed treat all available VRAM as a single entity to render to. That's my theory anyhow, especially as it sounds like that's what AMD are pinning their hopes on hahaha.

This time round I think I won't buy the GPU on release, and will wait for the AIB reviews to get a good picture of whether I am going to once again buy AMD or swap to the Nvidia flagship cards and monitor.
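The "single pool of VRAM" idea can be sketched with some back-of-envelope numbers. This is plain Python for illustration only, not actual D3D12 API code, and the assumption that an engine could spread unique resources evenly across GPUs is the idealised best case:

```python
# Illustrative only: how addressable VRAM differs between classic AFR
# (alternate-frame rendering, where each GPU mirrors the full set of
# resources) and DX12-style explicit multi-adapter (where the engine
# can, in principle, place different resources on each GPU).

def usable_vram_afr(vram_per_gpu_gb, num_gpus):
    # AFR duplicates resources on every GPU, so the effective pool
    # is just one card's worth no matter how many cards you add.
    return vram_per_gpu_gb

def usable_vram_explicit(vram_per_gpu_gb, num_gpus):
    # Explicit multi-adapter lets the engine address each GPU's memory
    # separately -- closer to one big pool, though in practice shared
    # data would still need duplicating, so this is a ceiling.
    return vram_per_gpu_gb * num_gpus

print(usable_vram_afr(4, 2))       # 4 GB: two 4GB cards in CrossFire today
print(usable_vram_explicit(4, 2))  # 8 GB: the theoretical DX12 ceiling
```

Whether real engines get anywhere near that ceiling depends entirely on developers coding for it, which loops back to the "if coded correctly" caveat above.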
 
How much would 8GB HBM cost anyway?

People have been posting on these forums how the 380/390 not only have to beat the 980 by a sizeable margin, but also have to be £100 cheaper. An 8GB 380 would be prohibitively expensive for most. As an option for enthusiasts it would be great, though.
 
Not sure if it will be fixed; at this rate, not for a very long time. Until then, though, I'd rather stay in the 1080p (possibly 1440p in future) range with a single GPU and just upgrade that every 1-2 years.
 
Ideally I'd love the 380X to be a viable option for 1440p gaming on a 120Hz or so monitor, especially when you factor in FreeSync and then DX12.

If they can pull that off, then not only will a lot of GPUs get sold (both AMD and Nvidia benefit from DX12 and sync tech), but I would imagine tons of people would make the move from 1080p gaming to 1440p etc., as currently most gamers are at 1080p.
 
That really isn't what I was saying. I would suspect that it will be similar to how it is now: memory split into multiple channels, with each channel connecting to a memory chip. If you cut the bus, you cut the memory, because you can't just add chips independently. To go from 4GB to 8GB now, you just use double-density memory chips; the same will be true of HBM, using a 2GB instead of a 1GB stack.

I would presume the card needs to be designed for a specific number of stacks, in the same way a 290X needs a specific number of memory chips. Capacity isn't hugely relevant, just how they talk to each other; software-wise it doesn't really matter which part of the chip you access, just where you send that message.

It shouldn't make any difference in that sense to current cards.

What will be interesting more than anything is the packaging. Remember Nvidia's fake Pascal mock-up: with far fewer chips on the PCB (the 16 memory chips pretty much gone), power delivery all goes to the same point, thousands of traces are gone, so routing of everything left over (which is mostly much simpler, lower-trace-count power components) becomes trivial and the PCB can become relatively tiny.

HBM should enable really small and efficient PCB designs, leaving more room for cooling and more chance of generation-to-generation compatibility between coolers. It's only just occurred to me while writing this how damn small a dual-GPU card could become... absolutely tiny in comparison to current dual-GPU cards. What I really wonder is: can they stick two GPUs and two sets of memory all on one interposer, and have one package on the card with significantly reduced core-to-core latency and a huge-bandwidth connection... could that lead to the end of microstutter... maybe.
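The stack arithmetic behind all this is simple enough to sketch. A quick Python illustration, where the 1GB-per-stack capacity and 1024-bit-per-stack interface are first-generation HBM figures and the clock speed is just an illustrative assumption:

```python
# Rough HBM capacity/bandwidth arithmetic. Assumptions: first-gen HBM
# stacks of 1GB each with a 1024-bit interface per stack; the effective
# clock figure below is illustrative, not a product spec.

def hbm_capacity_gb(num_stacks, gb_per_stack):
    # Total capacity scales with stack count and stack density,
    # the same way GDDR5 capacity scales with chip density.
    return num_stacks * gb_per_stack

def hbm_bandwidth_gbs(num_stacks, bus_bits_per_stack, effective_mhz):
    # Total bits per second across all stacks, converted to GB/s.
    return num_stacks * bus_bits_per_stack * effective_mhz * 1e6 / 8 / 1e9

print(hbm_capacity_gb(4, 1))             # 4 GB with 1GB stacks
print(hbm_capacity_gb(4, 2))             # 8 GB once 2GB stacks arrive
print(hbm_bandwidth_gbs(4, 1024, 1000))  # 512 GB/s at 1Gbps per pin
```

Which is why moving from 4GB to 8GB waits on denser stacks rather than a wider bus: the stack count is baked into the interposer design.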

Yeah, I get what you're saying. I was more meaning that at the moment, if MSI want to build a card, they get the GPUs from AMD and then add in the amount and density of memory chips they want: 4GB, 8GB etc. Whereas with HBM-type GPUs, if MSI want to build the same cards, they would need to order 4GB and 8GB GPUs from AMD.

I do like the idea of proper dual-core GPUs, just like the old Intel Pentium Ds: two cores on the same package. :) That could really open up the possibilities of really small dual cards. Even with the existing separate-core type cards, as you say, the lack of need for separate memory chips could make the cards much smaller than current dual-GPU cards.

Imagine an existing dual-GPU type card but using two dual-core GPUs: quadfire on a single card.

Who knows what they can do in the future.
 
It is nice to see some of the usual suspects admitting that AMD currently offer the only truly capable 4K GPU. As they keep pointing out, 3.5GB doesn't cut it and neither does 4GB.. so 6GB is barely enough? Then the only option is the 290X 8GB. Good spot, guys.
 
4GB certainly doesn't cut it, and 6GB is fine for now, but 8GB is needed for a touch of future-proofing (if there is such a thing). That is assuming you want all the bells and whistles, of course.

The Usual Suspects was a top notch film, so I am happy to be placed in that bracket :D
 
You did say you were interested in buying it only about a week ago, after I said "you'd only buy the 960 if you were skint" ;)

Yep and still might. I am not interested in having settings maxed out on a 46" TV and something for light family gaming will do nicely.
 
As DM said, they come in 1GB stacks, so unless they can put 8 on a board we are stuck with 4GB, unless they can add GDDR5 as well; I am not sure. 2GB stacks come later in the year, I thought around September; that is what they hoped, unless they've had a breakthrough.
 
You certainly wouldn't have max settings on a 960 so that would be a great choice :p

For light family gaming a Wii U might be a better bet though, as long as you don't mind AMD Radeon powering the graphics :D

I guess you could get an Xbox One or PS4 instead.. Oh wait..

:p
 
Hehe, they could show Nvidia how to do dual-speed RAM properly...:D
 
The 960 is no good; I'd get a 970, "B Grade stock", one of the ones that these guys here have RMA'd, because for gaming with the kids it'll last you forever.
 
I think what we all really want and need is something official from AMD, as it will decide most people's next card choice. If it's going to be a letdown, then myself and many others will just go the Nvidia route, but if it is as good a leap as the 290 was over the previous series, then it's AMD and FreeSync time.

Just need AMD to release some actual real info.
 
There's always the good old PS3...and the original Xbox as an option for Nvidia fans :D
 