
PCGH asks Nvidia on Gameworks

You claimed:



Which implies that you think any problems with the implementation are down to DICE (or any other developer of a game using Mantle). However, given that it is highly likely AMD must assist with the integration process, one cannot exclude AMD from blame in this regard.

I'm not interested in speculating about whether AMD helped, what parts they were responsible for, or whether the parts they helped with are the cause of the issues.

There is nothing to go on, which means it begins and ends purely in one's imagination.
 
Fine. In that case, going back to the original point in humbug's post: it's fine for AMD to take credit for the third party's work right up until there is a problem with it, at which point it's the third party's fault.

Someone wants to have his cake and eat it too.

AMD can take credit that the feature exists for customers; the third party takes credit for how well they implemented that feature.
 
I'm not interested in speculating about whether AMD helped, what parts they were responsible for, or whether the parts they helped with are the cause of the issues.

There is nothing to go on, which means it begins and ends purely in one's imagination.

Indeed, but that also means that one cannot immediately absolve AMD of any blame and claim that the implementation of Mantle into a game is entirely down to the game developers.
 
I very much doubt that AMD had no hand in helping to implement Mantle into games; DICE (or, rather, the Frostbite team) would have had help from AMD in implementing the API into their engine.

“Mantle is a new low-level graphics API that we’ve been working very closely with AMD on over the last 2 years”

http://www.vg247.com/2014/01/30/battlefield-4-amd-mantle-update-now-live-will-evolve-over-time/

I don't feel it matters, but AMD did spend 2 years with Dice/Johan on Mantle's implementation in BF4, and AMD should take credit for it, and also criticism where it is warranted.
 
Indeed, but that also means that one cannot immediately absolve AMD of any blame and claim that the implementation of Mantle into a game is entirely down to the game developers.

Implying blame requires something to go on in the first place; absolving comes after the fact.
You can't imply blame just because a lack of involvement can't be proved, which is why absolving comes after the fact.

The fact is that Dice made the game and AMD made the API, and Dice is responsible for the game and for the implementation of the API into the game.

You can "what if" till the cows come home.
 
http://www.vg247.com/2014/01/30/battlefield-4-amd-mantle-update-now-live-will-evolve-over-time/

I don't feel it matters, but AMD did spend 2 years with Dice/Johan on Mantle's implementation in BF4, and AMD should take credit for it, and also criticism where it is warranted.

You're getting confused here Greg. AMD/Johan designed Mantle, but AMD had nothing to do with how Dice coded the Mantle renderer into Battlefield 4. It just so happens their implementation ideally needs a 3GB card or better. Johan confirmed this to me personally on Twitter, not that I'm particularly happy about that.

Liken it to Nvidia having microstutter issues with Battlefield 4 under SLI setups on DirectX: is that Microsoft's fault? Of course not; it would be Nvidia's fault, as they write the driver. Now let's change that up to Mantle: whose fault would it be then? It would be the developer's, because Mantle uses a thin abstraction layer, which means the driver is of minimal importance. This is why Mantle games on Frostbite just work when they come out, regardless of the driver used. 95% of the work is done by the developer when they code for Mantle; it gives them full control and takes the handbrakes off. Certainly the driver is not responsible for handling the Frostbite memory-management implementation.
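To put the "thin abstraction layer" point in concrete terms, here is a minimal hypothetical sketch (not actual Mantle or Frostbite code; the names and numbers are made up for illustration). In a thin, low-level API the driver-side allocator does nothing clever, so residency decisions, and therefore the out-of-memory behaviour on a smaller card, sit entirely with the engine's own code:

```cpp
#include <cstddef>
#include <vector>

// Hypothetical sketch of a thin, Mantle-style API: the "driver" just hands
// out raw memory from a fixed budget; it does no paging or eviction.
struct Device {
    std::size_t budget;    // total "VRAM" in bytes
    std::size_t used = 0;

    // Deliberately dumb driver-side call: it only checks the budget.
    bool allocate(std::size_t bytes) {
        if (used + bytes > budget) return false;  // out of memory, no magic
        used += bytes;
        return true;
    }
    void release(std::size_t bytes) { used -= bytes; }
};

// Engine-side policy: *this* is where a poor implementation hurts. A smaller
// card fails here if the engine assumes all textures can stay resident at
// once, because the driver will not evict anything on the engine's behalf.
bool load_textures(Device& dev, const std::vector<std::size_t>& sizes) {
    for (std::size_t s : sizes)
        if (!dev.allocate(s)) return false;  // engine must handle eviction itself
    return true;
}
```

Under this (assumed) model, the same driver succeeds or fails depending purely on the engine's allocation strategy, which is the sense in which the memory-management implementation belongs to the developer rather than the driver.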
 
There are only, what, two G-Sync monitors in a year of that? ^^^



I'm just trying to get away from calling it Free-Sync; that name was dreamed up by Anand and it's sort of stuck. That's not what it will be called.



If you read what I said, you will see that I did not say it was better. I actually agree with you in that I think having recording and streaming would have been better in AMD's own app.

However, I do understand that if they can get a better result by farming it out instead of doing it in house, then I would prefer they farm it out. DVR is still in beta, but having tested it, it's bloody brilliant: game-like recording quality with zero performance loss and a reasonable file size. It could not be any better, and it is better than the best paid-for stuff out there.

Love the way you fail to answer any of my points head on. Loads of gsync monitors on the way and where is adaptive sync? Still on paper and in people's dreams...
 
Liken it to if Nvidia have issues with Battlefield 4 microstutter under SLI setups on DirectX, is that microsofts fault?

That's not the same situation as here. Microsoft no longer assists in implementing DirectX <11 into games/engines (at least not as far as I'm aware), as it is already an established API that has been built into most of the fundamental engines which power games. Mantle, on the other hand, is a new API, and developers will therefore require AMD's help in implementing it into their games/engines.
 
Love the way you fail to answer any of my points head on. Loads of gsync monitors on the way and where is adaptive sync? Still on paper and in people's dreams...

And at the start gsync was all talk and "only on paper and in people's dreams". It takes a while to get things out on the market; it's not like someone can snap their fingers and boom, shop shelves full of adaptive sync compatible monitors :) It was announced in October of last year, and almost 9 months later we're only starting to see monitors trickle onto the market.
 
And at the start gsync was all talk and "only on paper and in people's dreams". It takes a while to get things out on the market; it's not like someone can snap their fingers and boom, shop shelves full of adaptive sync compatible monitors :)

G-Sync is here right now, however; we don't know how long it will take for adaptive sync compatible monitors to reach the market. Early adopters will always pay a premium, but at least they get to use and enjoy the technology in advance.
 
G-Sync is here right now, however; we don't know how long it will take for adaptive sync compatible monitors to reach the market. Early adopters will always pay a premium, but at least they get to use and enjoy the technology in advance.

After an almost 9-month wait, adaptive sync is supposedly slated for reviews late this year and availability in January. Whether that happens or not, we'll have to wait and see.
 
And at the start gsync was all talk and "only on paper and in people's dreams". It takes a while to get things out on the market; it's not like someone can snap their fingers and boom, shop shelves full of adaptive sync compatible monitors :) It was announced in October of last year, and almost 9 months later we're only starting to see monitors trickle onto the market.

Gsync has been on the market to buy since December 2013 :confused:

Granted, we're only just seeing retail non-DIY units now, but the technology and displays have been available for public consumption for over 6 months.
 
Gsync has been on the market to buy since December 2013 :confused:

Granted, we're only just seeing retail non-DIY units now, but the technology and displays have been available for public consumption for over 6 months.

Maybe for DIY units, which is hardly the same thing as buying a G-Sync capable monitor, something that is only now starting to happen. According to the Nvidia website, the DIY kit was only for one specific type of Asus monitor and only for a limited time.

There's a video of the upgrade on this page.

http://www.geforce.com/hardware/technology/g-sync/diy


I'm really surprised that so much inside monitor panels seems to be held in place by tape. :confused:
 
You're getting confused here Greg. AMD/Johan designed Mantle, but AMD had nothing to do with how Dice coded the Mantle renderer into Battlefield 4. It just so happens their implementation ideally needs a 3GB card or better. Johan confirmed this to me personally on Twitter, not that I'm particularly happy about that.

Liken it to Nvidia having microstutter issues with Battlefield 4 under SLI setups on DirectX: is that Microsoft's fault? Of course not; it would be Nvidia's fault, as they write the driver. Now let's change that up to Mantle: whose fault would it be then? It would be the developer's, because Mantle uses a thin abstraction layer, which means the driver is of minimal importance. This is why Mantle games on Frostbite just work when they come out, regardless of the driver used. 95% of the work is done by the developer when they code for Mantle; it gives them full control and takes the handbrakes off. Certainly the driver is not responsible for handling the Frostbite memory-management implementation.

No, I am not getting confused at all thanks.
 
No, I am not getting confused at all thanks.

You definitely are. It says two years working with AMD to create Mantle, not two years working between AMD and Dice for Dice to code the Mantle renderer into the Frostbite engine. AMD have no control over the Frostbite engine, or over how memory management is handled by that engine. They can make suggestions to rewrite it, which they have, but it is up to Dice to do it. If it makes you feel better to blame AMD then go ahead; I'm just pointing out what the score is.
 
Love the way you fail to answer any of my points head on. Loads of gsync monitors on the way and where is adaptive sync? Still on paper and in people's dreams...

So your "loads of gsync monitors on the way" is on paper and in people's dreams. :rolleyes:

The fact is it's had more than enough time to get established, and all we have is two.

Free-Sync has every chance to make it, IMO, especially next to G-Sync.

G-Sync adds £100 to £150 to the cost of the monitor, and that is not £150 going to the monitor vendor; no, it's going to Nvidia.

So you know what that is: it's £150 that the customer no longer has to spend on other things, like a new motherboard, while they are browsing the Asus G-Sync screen.

Free-Sync is of minimal cost to the vendor, and that minimal cost may be reflected in the price. What Free-Sync also does is give vendors a marketing tool to encourage people to upgrade their existing screens.
What it does not do is send a huge chunk of a customer's finances to Nvidia that they could have spent with the vendor.
 
You definitely are. It says two years working with AMD to create Mantle, not two years working between AMD and Dice for Dice to code the Mantle renderer into the Frostbite engine. AMD have no control over the Frostbite engine, or over how memory management is handled by that engine. They can make suggestions to rewrite it, which they have, but it is up to Dice to do it. If it makes you feel better to blame AMD then go ahead; I'm just pointing out what the score is.

No, honestly I am not. Did AMD not work with Dice for 2 years then or is that bit I quoted completely wrong?
 
No, honestly I am not. Did AMD not work with Dice for 2 years then or is that bit I quoted completely wrong?

The issue is Mantle's implementation in the Frostbite engine, which would be down to Johan/Dice and the team there that wrote the Mantle backend. Johan has admitted this on Twitter. Or are you telling me that you know different?
 
Johan's said that they had no help from AMD when implementing mantle?

That'd be stupid if true, given it's the début game for their new API.
 