
AMD's Revolutionary Mantle Graphics API Adopted by Industry Leading Game Developers Cloud Imperium,

Status
Not open for further replies.
Anyone else bored of all the talk and want to see some "Before and After" results?

Nope. Me personally, I'm loving the talk and dreading the day the results come, as that will mean no more talk. I only come here for the talk.

EDIT

I suppose you could talk about the results, but that's not quite the same.
 
It'll depend on the CPU, the graphics card and the game. There won't be any simple figures to look at, really, as Mantle isn't speeding up DirectX but replacing it, and using the CPU in a different way on top.
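The "using the CPU in a different way" point is why benchmarks won't reduce to one number. A minimal toy model (this is not the real DirectX or Mantle API, just an illustration of the overhead argument): a "thick" API re-validates state on every draw call, while a "thin" API validates a command buffer once up front and then submits cheap prebuilt commands.

```python
import timeit

VALIDATION_WORK = 50  # arbitrary units of per-call driver bookkeeping

def draw_thick_api(n_calls):
    """Every call re-validates state, like a high-overhead API."""
    total = 0
    for _ in range(n_calls):
        for _ in range(VALIDATION_WORK):  # per-call overhead
            total += 1
        total += 1  # the actual draw
    return total

def draw_thin_api(n_calls):
    """Validate once up front, then replay cheap prebuilt commands."""
    total = 0
    for _ in range(VALIDATION_WORK):  # one-off validation
        total += 1
    for _ in range(n_calls):
        total += 1  # the actual draw
    return total

# Same number of draws, very different CPU cost per draw.
thick = timeit.timeit(lambda: draw_thick_api(10_000), number=20)
thin = timeit.timeit(lambda: draw_thin_api(10_000), number=20)
print(f"thick-API time: {thick:.3f}s, thin-API time: {thin:.3f}s")
```

In this sketch the gain depends entirely on how CPU-bound the caller is, which is the forum point in miniature: a game that is GPU-limited sees little change, while one drowning in draw calls sees a lot.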
 
Nope. Me personally, I'm loving the talk and dreading the day the results come, as that will mean no more talk. I only come here for the talk.

EDIT

I suppose you could talk about the results, but that's not quite the same.

There will be plenty more talk when Nvidia panic and start shouting about their competing solution. :D

Going against Carmack's wishes. ;)
 
There will be plenty more talk when Nvidia panic and start shouting about their competing solution. :D

Going against Carmack's wishes. ;)

Joking aside, I can see this happening. Nvidia are not ones to take things lying down, and I am sure something will be coming out of Team Green.
 
Does anyone think AMD and Nvidia are risking splitting PC gaming in two with all their proprietary technologies? I.e. we could end up with Radeon PCs for Mantle and TrueAudio, and GeForce machines for G-Sync and ShadowPlay/PhysX.
 
Does anyone think AMD and Nvidia are risking splitting PC gaming in two with all their proprietary technologies? I.e. we could end up with Radeon PCs for Mantle and TrueAudio, and GeForce machines for G-Sync and ShadowPlay/PhysX.

AMD have said that Mantle will be open to all GPUs eventually. We should give them time to fulfil that, so it probably won't stay proprietary. But if it proves overwhelmingly successful, who could really blame them for keeping it in house? I wouldn't, but it would go against everything they've said.

Their best move IMO is to open it up. There's no way nVidia will adopt it - they're too stubborn. I don't say that as a criticism of them, but I can't see them adopting AMD tech in a million years. The more likely outcome is that they push something similar themselves, and that probably would be proprietary.

Edit: in case my position is misconstrued - I have nothing against proprietary tech. In truth I have no interest in the ethics of it. I'm more of a selfish buyer: what's the best I can get, and what offers me the best overall solution? Proprietary tech falls into that consideration, but the fact that it's proprietary doesn't in itself. I couldn't care less :).
 
Does anyone think AMD and Nvidia are risking splitting PC gaming in two with all their proprietary technologies? I.e. we could end up with Radeon PCs for Mantle and TrueAudio, and GeForce machines for G-Sync and ShadowPlay/PhysX.

No.

If either AMD or nVidia made something that locked out the other and was a CRITICAL part of being able to play the game, developers would simply not design engines to make use of it.

Developers want £££, locking out a large segment of their market does not £££ make.
 
I thought Nvidia did offer PhysX to AMD but they turned it down?
Also, has it been said that G-Sync is exclusive to Nvidia?

nVidia have stated in press interviews that they have locked G-Sync down to their own hardware, and they STATED openly that they patented certain elements to block AMD from developing a similar solution. They also stated clearly that they had no intention of licensing it to AMD.

But I stick by my opinion that either one of the vendors controlling this basically makes it a bad idea for the other one to adopt it.

I'm wondering how many people here would have the same opinions about PhysX, G-Sync and Mantle if things were reversed. Say AMD had PhysX and G-Sync and Nvidia had Mantle, and everything was as open or locked down as it currently is. Would PhysX still be despised? Would G-Sync still be pointless, and would Mantle still be the answer to all the world's problems?

If the situation were exactly reversed then yes, AMD would be the ones being hated and nVidia the ones being congratulated. However, that would never happen. AMD has a history of making open technologies, and nVidia has a demonstrable history of locking stuff down and then taking underhand measures to limit AMD's performance.
 
^^ Just out of curiosity, what has AMD made 'open' historically? I can only think of TressFX.

AMD have typically thrown themselves behind open solutions, e.g. support for the Havok and Bullet physics engines, TriDef and HD3D support, and OpenCL. I'm sure I'm forgetting something else here, but AMD designs its hardware so it can reap the benefits of these technologies, whereas although Nvidia can support things like OpenCL, its video cards are set up for its own CUDA language, which runs best on a GeForce/Quadro card.
 
^^ Just out of curiosity, what has AMD made 'open' historically? I can only think of TressFX.

Off the top of my head, GDDR3/4/5 were developed by ATi/AMD before being handed off to JEDEC, and they've recently come up with a standard for 4K and given that up to VESA, again to push the industry forward.

Sure, AMD would have had a tough time persuading any memory company to produce GDDR3/4/5 en masse without releasing the spec, but they still did it. That was fundamental work that has benefited basically everyone in the entire industry, certainly all gamers.

HSA is about the single biggest industry standard/direction to come along, maybe ever, with only two steadfast non-joiners (Intel and Nvidia).

There are more that I can't remember. They often do the work of pushing new features into DX, and have done for years with DX9/10/11 features before Nvidia - it's about shaping and pushing for those features in the 2-3 years leading up to a new DX release. Nvidia are quite the status quo company, exceedingly rarely first to new tech, and this generally leaves AMD/ATi at a disadvantage.

Tessellation was an AMD push: they put it in hardware, and had they not done so first it would never have become as widely used as it can be today. Yes, the initial tessellation engines were tiny and very underpowered - so what? As Nvidia shows today, a hugely overpowered engine makes no IQ difference. Nvidia went out of their way to complain to MS and push for tessellation to be dropped from the DX spec, meaning AMD had it in hardware taking up die space that couldn't be used. It was tiny back then, but it probably had enough power to smooth out player character edges, which would have been a better use for it - and that's still the most obvious improvement from tessellation now, since not surprisingly the player character is something always in view. Or in an FPS, using it on a gun: anything big and up close looks worse untessellated than things far away do.
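For anyone unfamiliar with what a tessellation engine actually does, the core idea is just recursive subdivision: split each triangle into smaller ones so curved surfaces and silhouettes (a player model, a gun up close) render smoother. A minimal sketch of one scheme, midpoint subdivision - purely illustrative, not how any particular GPU tessellator is implemented:

```python
# Each subdivision level splits every triangle into 4 via its edge
# midpoints, so the triangle count grows by a factor of 4 per level.

def midpoint(a, b):
    """Midpoint of two vertices given as coordinate tuples."""
    return tuple((pa + pb) / 2 for pa, pb in zip(a, b))

def subdivide(tri):
    """Split one triangle (3 vertices) into 4 smaller triangles."""
    a, b, c = tri
    ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
    return [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]

def tessellate(tris, levels):
    """Apply `levels` rounds of subdivision to a whole mesh."""
    for _ in range(levels):
        tris = [t for tri in tris for t in subdivide(tri)]
    return tris

mesh = [((0.0, 0.0), (1.0, 0.0), (0.0, 1.0))]  # one triangle
print(len(tessellate(mesh, 3)))  # 4**3 = 64 triangles from one
```

The 4x-per-level growth is also why the "tiny, underpowered" early engines weren't useless: even a couple of levels on a nearby character adds a lot of silhouette detail for very little die area.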

But these features often show up in AMD hardware first, then get adopted, then get pushed forward.

The big thing is to look at the places where AMD (or ATi before them) DIDN'T screw Nvidia. They'd have had to pay more, but they could have locked Nvidia out of GDDR3/4/5 and really hurt them - they didn't. Every single Gaming Evolved title could have AMD-only features; they could pay to remove AA, or just about anything, for Nvidia users, but they don't. They could have locked in TressFX, but they didn't. They could push to hurt performance on Nvidia cards, but they don't.

It's the attitude displayed time and time and time again by AMD, and the reverse attitude is displayed over and over by Nvidia.

But it's not just Nvidia. Lightning is NOT better than other standards, but Intel went out of their way to patent something and push a tech that doesn't hurt AMD users - it only makes them more money for something that is no better. Thankfully Lightning seems not to have made real headway, though Apple (another company with the same general attitude) has been charging its users through the nose for Lightning cables. I don't think any other OEM is willing to do so.

I honestly just don't get the users who will say "buy Apple" when Apple are purposefully refusing to use free, no-overhead USB 3.0 or DisplayPort and actively making you pay more for a pointless cable. Or why nVidia users don't mind being charged more for a 3D Vision compatible screen, when all "3D Vision compatible" means is not blocking the screen in the driver - or G-Sync, something so completely basic that they are going to charge you $100 more for it.

It's not that Intel are anti-AMD, Apple are anti-Samsung, or Nvidia are anti-AMD (again) that pees me off about these companies. It's that Intel are anti Intel user, Apple are anti Apple user and Nvidia are anti Nvidia user.

The attitude is just horrendous. Lock AMD out of 3D Vision, fine, but why are you increasing the cost for your own users? They decided to buy a card that costs more and has 3D Vision as a feature, but then you increase the cost of the screen so the user has to pay again to use that feature? Back in their mobo days, SLI was a feature of the card, but unless you paid an extra £10 for the mobo, the driver (and nothing else) locked you out of SLI - a feature you had already paid for in buying that graphics card in the first place.

If there is functional hardware, an actual chip that makes a feature work (and is actually needed), and it costs more, that is one thing. But asking someone who already bought your product to pay again or be locked out by drivers alone? It's disgusting. Lightning is purely an attempt to charge ten times as much for cables and make the cables yourself; there is no benefit to the end user, it's purely a play to milk more cash from them.
 
You keep saying nVidia are increasing the cost for nVidia users as well, but I have failed to see this in reality. Other than the cost of the 3D kit itself (which obviously isn't free to develop and produce), the monitors were priced appropriately for a 120 Hz panel.
 
After reading DM's post I feel quite guilty buying into Nvidia cards; I did not know they were as bad as this :( Very sad, but then there is a fine line between two companies making money and looking like they are ahead of the tech and in the lead.

But adding software to make another make of GPU perform worse is pretty disgusting.

DM, you explain things so very well and I thank you for that :)
 
I don't think it's hate, just a slight disdain for nVidia's actions over the years (which IMO is highly deserved).

Maybe, but after reading the same thing again and again from the same person, it looks like more than basic disdain.

I don't like the way Tesco work but I still use them. Likewise, I don't like the way Nvidia do business, but that doesn't mean the cards are bad as well.
 
I don't understand why it even matters what their practices are. It's not like they're dumping waste in an African country or pillaging workers. :D

Business is competitive, and capitalism ensures that the fight to be competitive isn't always clean, but this is replicated in different ways in other markets and there aren't really any issues with it. I think it's largely a way of justifying one's preference, which is fair enough, but it's important not to extrapolate from that along with misleading info. :)

Please don't read this (or the fact I have an nVidia GPU) as me being some kind of nVidia fanboy. I'd hope we could avoid the usual toilet-seat debate for once. :D
 