AMD Radeon GFX & DirectX 12

The way I'm reading it, with developers being in control of how multi-GPU performs, there won't be any need (or possibly even ability?) to write Crossfire profiles on AMD/Nvidia's side. It's all going to be in the hands of the developers.
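
To make that concrete, here's a minimal sketch of the "explicit multi-adapter" idea as I understand it: under DX12 the application itself enumerates the GPUs and creates a device per adapter, so how work gets split between them is entirely the engine's decision rather than a driver profile. The function name is mine and error handling is omitted, so treat it as illustrative only.

```cpp
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <vector>

using Microsoft::WRL::ComPtr;

// Enumerate every hardware adapter and create a D3D12 device for each one.
// From here on, distributing rendering work across the devices is up to the
// application, which is the big change from driver-managed Crossfire/SLI.
std::vector<ComPtr<ID3D12Device>> CreateDevicesForAllAdapters()
{
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue; // skip WARP / software adapters

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
            devices.push_back(device);
    }
    return devices;
}
```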



As above, it sounds like DX12 should bring an end to us moaning to AMD/Nvidia about multi-GPU not working on release day; instead we'll need to moan to Ubisoft/DICE/EA or the development studios themselves. That genuinely worries me and I really hope I'm wrong on this.



Yup. Aside from the GPUs shown (it is an AMD presentation, so fair enough), all the actual info is vendor-neutral and doesn't say it only applies to one side or the other.


Yes, it sounds like the era of moaning to Matt for CF profiles is going to end... :D
 
Meh, split frame rendering has been around since the dawn of multi-GPU gaming; it didn't work well because the workload in different parts of the screen can be vastly different. Nothing has changed; it's just a fundamental limitation. Sure, with DX12 developers probably have better control rather than relying on the drivers to split the screen, but that doesn't resolve the situation.
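
A toy sketch of that limitation (the numbers and the balancing heuristic are completely made up, this isn't how any real driver does it): the frame only finishes when the slower half does, and any rebalancing of the split line can only react to the previous frame's timings.

```cpp
#include <algorithm>
#include <cstdio>

// Toy model of split-frame rendering: two GPUs each render one band of the
// screen and the frame is gated by whichever band is slower. A fixed 50/50
// split only balances if both halves cost the same, so the split line gets
// nudged using last frame's timings, which always lags the actual workload.
int main()
{
    const int height = 1080;
    // Made-up cost per row: the bottom of the screen (geometry, particles)
    // is four times more expensive than the top (sky).
    auto rowCostUs = [&](int row) { return row < height / 2 ? 5.0 : 20.0; };

    int splitY = height / 2; // GPU0 renders rows [0, splitY), GPU1 the rest

    for (int frame = 0; frame < 5; ++frame)
    {
        double gpu0 = 0.0, gpu1 = 0.0;
        for (int row = 0; row < height; ++row)
            (row < splitY ? gpu0 : gpu1) += rowCostUs(row);

        // Frame time is the slower GPU, not the average of the two.
        std::printf("frame %d: split=%4d gpu0=%.0fus gpu1=%.0fus frame=%.0fus\n",
                    frame, splitY, gpu0, gpu1, std::max(gpu0, gpu1));

        // Rebalance for the next frame: hand the slower GPU a smaller band.
        double imbalance = (gpu1 - gpu0) / (gpu0 + gpu1); // > 0 means GPU1 is slower
        splitY = std::clamp(splitY + int(imbalance * 100), 1, height - 1);
    }
    return 0;
}
```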

And the shared memory is nice and all, except its only advantage is preventing duplication of a resource that will be used by just one of the GPUs, which basically never happens. Textures, geometry etc. are going to be needed on both GPUs because it would be far too slow to exchange resources between GPUs (rough numbers sketched below).
This works much better for compute applications.
Sweet sweet common sense.
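
To put rough, made-up numbers on the memory point above: pooling the VRAM of two cards only gains you the resources that can live on just one GPU, so if almost everything has to be duplicated, the "combined" figure is mostly marketing.

```cpp
#include <cstdio>

// Back-of-envelope illustration with invented numbers: two 4 GB cards where
// most assets (textures, geometry) must exist on both GPUs because copying
// them across PCIe mid-frame would be far too slow.
int main()
{
    const double perGpuVram   = 4.0;  // GB per card
    const double duplicated   = 3.5;  // assets needed by both GPUs
    const double perGpuUnique = perGpuVram - duplicated; // render targets, per-GPU scratch

    double naivePool  = 2 * perGpuVram;                // the headline "8 GB"
    double usablePool = duplicated + 2 * perGpuUnique; // what is actually addressable

    std::printf("naive pool:  %.1f GB\n", naivePool);   // 8.0 GB
    std::printf("usable pool: %.1f GB\n", usablePool);  // 4.5 GB
    return 0;
}
```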
 
The split frame rendering will be primarily a VR thing where the image is split across 2 screens, but otherwise the content of each image is the same.
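
If it helps, a tiny sketch of why the VR case is the friendly one (the values are just typical illustrations, not from any SDK): each GPU gets one eye, and since both eyes render the same scene from viewpoints a few centimetres apart, the two workloads are nearly identical and the split balances itself.

```cpp
#include <cstdio>

// One GPU per eye: the same scene rendered from two camera positions offset
// by half the interpupillary distance. Unlike an arbitrary screen split, the
// two halves cost roughly the same every frame.
struct EyeView { int gpuIndex; float cameraOffsetX; };

int main()
{
    const float ipd = 0.064f; // typical interpupillary distance in metres

    const EyeView eyes[2] = {
        { 0, -ipd / 2.0f }, // left eye  -> GPU 0
        { 1, +ipd / 2.0f }, // right eye -> GPU 1
    };

    for (const EyeView& eye : eyes)
        std::printf("GPU %d renders the eye offset %+.3f m from the head position\n",
                    eye.gpuIndex, eye.cameraOffsetX);
    return 0;
}
```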

I could swear that SFR was first implemented by 3DFX many years ago when they introduced SLI. I remember people saying they had issues whereby the cards would display half the frame at different times!
 
I could swear that SFR was first implemented by 3DFX many years ago when they introduced SLI. I remember people saying they had issues whereby the cards would display half the frame at different times!

It was; the very first SLI implementations all used SFR, including Nvidia's. As I said above, it didn't work then and it won't work well now.
 
Just been on an AMD webinar; when questioned about the lack of new GPU products in the recent past, they suggested that with such strong and regular driver updates they haven't needed to keep refreshing products.
 
How tight is the relationship between DirectX version and GPU hardware design?

Is the layman to expect that relative DX12 performance on the latest-gen cards will change significantly from DX11, or be much the same?
 
Yeah, that's true. They seemed to really want to back VR, so it could be interesting to see if/how the new cards will take advantage of VR. They confirmed new cards in the next few weeks, but then we knew that.
 
Just been on an AMD webinar; when questioned about the lack of new GPU products in the recent past, they suggested that with such strong and regular driver updates they haven't needed to keep refreshing products.

In its own sad way, they were right. The 290X came to fight the 780 Ti for starters. Over time it has surpassed 780 Ti performance in most games, and it even beats the 980 in a few games, which is outstanding for such an old card.

Games tend to run very well on AMD; their real problem is Crossfire profiles and games that are more CPU intensive.

But we see things differently; for us the 290X isn't a product to be desired, even though it's very good value for money. It's old, and old ain't sexy and shiny.
 
But we see things differently; for us the 290X isn't a product to be desired, even though it's very good value for money. It's old, and old ain't sexy and shiny.

If that is the way you feel about the 290/X then you won't like the 390/X.
 
Just been on an AMD webinar; when questioned about the lack of new GPU products in the recent past, they suggested that with such strong and regular driver updates they haven't needed to keep refreshing products.

At least they have a sense of humour:D
Nah, they were serious. God help them. :(
 