
All Frostbite 3 titles will be optimised exclusively for AMD

None of which is relevant to my posts, which are about AMD being hypocritical: they have been very vocal about certain ideals in the past, and their current actions are very much in breach of them.



I'm not complaining about the fact they have AMD-optimised games; I'm highlighting that it's a bit rich in the face of past comments they have made.


Again, you are putting many things into one basket and claiming it's all the same, which it is not.
 
It's about AMD saying one thing and doing another (or doing nothing at all), something they have a long and documented history of doing.

Kind of funny, in regard to your comments, that you've completely missed or ignored that I was extremely critical of nVidia over many of those things where it was factually nVidia at fault - e.g. locking out PhysX, which I have repeatedly torn them to shreds for across multiple threads.
 

Yes, I remember you eventually becoming critical of the PhysX lockout, once it was clear NV's attitude had not delivered full physics as a result of it - but not of the other lockouts. That wasn't my point, though; you brought that into this thread. You have not been critical of NV-optimised titles, and optimisation is what the article is about.

AMD's statement:

"It makes sense that game developers would focus on AMD hardware with AMD hardware being the backbone of the next console generation. At this time, though, our relationship with DICE and EA is exclusively focused on Battlefield 4 and its performance optimizations for AMD CPUs, GPUs and APUs," AMD representatives said. "Additionally, the AMD Gaming Evolved program undertakes no efforts to prevent our competition from optimizing for games before their release.”

It's not AMD saying that NV can't have the code before release; it's EA saying it.
Amidst the fray of E3 reveals and gameplay demos, EA announced a new partnership with AMD that could tip the scales for the chip maker's Radeon graphics cards. Starting with the release of Battlefield 4, all current and future titles using the Frostbite 3 engine — Need for Speed Rivals, Mirror's Edge 2, etc. — will ship optimized exclusively for AMD GPUs and CPUs. While Nvidia-based systems will be supported, the company won't be able to develop and distribute updated drivers until after each game is released.

And the updates say there is nothing to worry about.
UPDATE: EA and AMD have issued a statement clarifying that while the two companies are collaborating on day-one support for Radeon products for Battlefield 4, the partnership is non-exclusive and gamers using other components will be supported.

“DICE has a partnership with AMD specifically for Battlefield 4 on PC to showcase and optimize the game for AMD hardware," an EA spokesperson said. "This does not exclude DICE from working with other partners to ensure players have a great experience across a wide set of PCs for all their titles.”
http://uk.ign.com/articles/2013/06/18/all-frostbite-3-titles-to-ship-optimized-exclusively-for-amd
 
Rroff, let's make this simple:

-nVIDIA has specific features locked for use only by themselves;
-they had and still have their own TWIMTBP program - and if you want an example of how not to do it, take a look at its latest addition, Metro: Last Light, which made an engine that ran much better on AMD products run much worse for no apparent reason;
-they CHOSE not to implement DirectCompute beyond a basic level in order to force users to buy more expensive professional parts and to create a cooler, smaller chip. The fact that it backfired on them and their users is a different thing altogether, and it's NOT AMD's fault;
-nVIDIA CHOSE a different path and considered that consoles were not worth the effort.

-AMD did not force nVIDIA to have a "light" DC series;
-AMD put the pedal to the metal regarding their own partnership with game devs;
-AMD does NOT have any GPU-specific features locked - like TressFX, PecksFX or any other DirectCompute feature.

Trying to give it a little "straw man"-style spin won't help your cause.
 
The proof is in the pudding.

More interested in the AMD CPU optimisations than GPU, as this surely implies better scaling on more cores.
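As a rough back-of-envelope for why core scaling matters (my own sketch, nothing AMD have claimed), Amdahl's law puts numbers on it: speedup on n cores is 1 / ((1 - p) + p/n), where p is the fraction of frame work that parallelises.

```cpp
#include <cassert>
#include <cmath>

// Amdahl's law: ideal speedup on n cores when a fraction p of the
// work is parallelisable (assumes perfect load balancing, so this is
// an upper bound, not a measurement).
double amdahl_speedup(double p, int n) {
    return 1.0 / ((1.0 - p) + p / n);
}
```

On 8 cores, amdahl_speedup(0.9, 8) ≈ 4.7 while amdahl_speedup(0.6, 8) ≈ 2.1 - so the more of an engine the console-led optimisation pushes into parallel code, the more an eight-core FX chip stands to gain.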
 
Pcper have done an article on the news, but the bit that caught my eye and had me thinking "yeah, spot on" was...

AMD said:
AMD called me to reiterate their stance that competition does not automatically mean cutting out the other guy. In the Tomb Raider story linked above, Neil Robison, AMD's Senior Director of Consumer and Graphics Alliances, states quite plainly: "The thing that angers me the most is when I see a request to debilitate a game. I understand winning, I get that, and I understand aggressive companies, I get that. Why would you ever want to introduce a feature on purpose that would make a game not good for half the gaming audience?"

pcper said:
The irony in all of this is that AMD has been accusing NVIDIA of doing this exact thing for years - though without any public statements from developers, publishers or NVIDIA. When Batman: Arkham Asylum was launched, AMD basically said that NVIDIA had locked them out of supporting antialiasing. In 2008, Assassin's Creed dropped DX 10.1 support supposedly because NVIDIA, which didn't have support for it at the time in GeForce cards, asked them to. Or even that NVIDIA was disabling cores for PhysX CPU support to help prop up GeForce sales. At the time, AMD PR spun this as the worst possible thing for a company to do in the name of gamers, that it was bad for the industry, etc. But times change as opportunity changes.

The cold truth is that this is why AMD decided to take the chance that NVIDIA was allegedly unwilling to and take the console design wins that are often noted as being "bad business." If settling for razor thin margins on the consoles is a risk, the reward that AMD is hoping to get is exactly this: benefits in other markets thanks to better relationships with game developers.

Source
http://www.pcper.com/news/Graphics-Cards/Rumor-AMD-Gets-Exclusive-Optimization-all-Frostbite-3-Games


Discuss! :D

Soooo, not only is Roy a spammer (from E3), he is a liar as well :(

Roy has reached celebrity status on ocuk and is twice the man we are, Greg.
 

It's normal practice: AMD get their drivers sorted from the off, so the game works out of the box, while Nvidia now need 3+ months to fix theirs.
This will reflect on new cards also, so I expect the AMD Radeon cards to be fast in BF4 at release, compared with Titan for example.
 
Rroff, let's make this simple:

-nVIDIA has specific features locked for use only by themselves;
-they had and still have their own TWIMTBP program - and if you want an example of how not to do it, take a look at its latest addition, Metro: Last Light, which made an engine that ran much better on AMD products run much worse for no apparent reason;
-they CHOSE not to implement DirectCompute beyond a basic level in order to force users to buy more expensive professional parts and to create a cooler, smaller chip. The fact that it backfired on them and their users is a different thing altogether, and it's NOT AMD's fault;
-nVIDIA CHOSE a different path and considered that consoles were not worth the effort.

-AMD did not force nVIDIA to have a "light" DC series;
-AMD put the pedal to the metal regarding their own partnership with game devs;
-AMD does NOT have any GPU-specific features locked - like TressFX, PecksFX or any other DirectCompute feature.

Trying to give it a little "straw man"-style spin won't help your cause.

All of which is completely irrelevant to the point I'm making.
 
So all that talk of embracing open standards, and of how nVidia locking in titles is bad for the health of gaming, yada yada, stands for nothing when the boot is on the other foot... surprise.

+1

It's the same for TWIMTBP titles, Rroff. At least Nvidia users can use OpenCL and TressFX etc., as they're open standards and benefit everyone. Hopefully we see more of that and less of PhysX, seeing as only one side can use that effectively without suffering a major performance hit. It really won't encourage too many developers to put PhysX in their games when you lock out half the user base, imo. Anything PhysX can do, OpenCL and your average computer can do without the massive performance hit Nvidia inflict.
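To make the open-standard point concrete, here is a hedged sketch (my own illustration - the struct, function name and Verlet scheme are assumptions, not TressFX's actual code): a TressFX-style hair update is just per-vertex arithmetic, which is why it can be expressed as a DirectCompute/OpenCL kernel that any vendor's GPU can run and optimise.

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Illustrative only: a TressFX-like per-vertex Verlet step. In the real
// effect this loop body is a DirectCompute/OpenCL kernel; because it is
// plain data-parallel maths with no vendor-specific instructions, any
// GPU can run it - the contrast being drawn with PhysX.
struct Vec3 { float x, y, z; };

void verlet_step(std::vector<Vec3>& pos, std::vector<Vec3>& prev,
                 float dt, Vec3 gravity) {
    for (std::size_t i = 0; i < pos.size(); ++i) {
        Vec3 cur = pos[i];
        // x(t+dt) = 2x(t) - x(t-dt) + a*dt^2
        pos[i].x = 2.0f * cur.x - prev[i].x + gravity.x * dt * dt;
        pos[i].y = 2.0f * cur.y - prev[i].y + gravity.y * dt * dt;
        pos[i].z = 2.0f * cur.z - prev[i].z + gravity.z * dt * dt;
        prev[i] = cur;
    }
}
```

The GPU version is the same arithmetic with the loop replaced by the thread index - nothing in it is tied to one architecture.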

I'm trying to think of many games that PhysX doesn't cripple. A single 670 cannot process PhysX in Borderlands 2 on anything but low (a joke tbh), but it works well in the Batman games and doesn't hit FPS. Then again, the severe FPS drops in some games are far more justifiable with PhysX than with TressFX, as PhysX gives a huge increase in graphical fidelity compared to, say, losing 50fps to make hair a little swishy.

Is it the developers of the games who have to optimize PhysX and TressFX, or Nvidia and AMD themselves who code it into the game?
 
All of which is completely irrelevant to the point I'm making.

You are trying to point out that AMD is doing "an nVIDIA", which is not correct, for the reasons stated above.

I'm trying to think of many games that PhysX doesn't cripple. A single 670 cannot process PhysX in Borderlands 2 on anything but low (a joke tbh), but it works well in the Batman games and doesn't hit FPS. Then again, the severe FPS drops in some games are far more justifiable with PhysX than with TressFX, as PhysX gives a huge increase in graphical fidelity compared to, say, losing 50fps to make hair a little swishy.

Is it the developers of the games who have to optimize PhysX and TressFX, or Nvidia and AMD themselves who code it into the game?

PhysX works well in games where there is only some small debris flying around, but at that point a multi-core CPU should handle it pretty well too (Mafia 2 works great). Complex physics calculations plus 3D graphics are too much for a single GPU, so a dedicated physics processing unit is required - another CPU, another GPU or an Ageia card.

PhysX and things like TressFX are technologies from nVIDIA and AMD; the developers take them "as they are" and improve them where possible, but given that only a handful of people actually enjoy them, not much effort, I reckon, is put into them.

PS: TressFX on vs. off - 34/44 FPS at 3x 1050p on an overclocked 7950. It's definitely not a 50FPS hit.
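On the "a multi-core CPU should handle small debris fine" point, a minimal sketch (my own, assuming independent particles - nothing from PhysX's actual API): debris with no constraints between pieces parallelises trivially across cores, which is why Mafia 2-style debris is cheap on the CPU.

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>
#include <cstddef>
#include <thread>
#include <vector>

// Illustration only: each debris particle is independent, so every core
// can integrate its own slice with no synchronisation inside the step.
struct Particle { float y, vy; };

void integrate_slice(std::vector<Particle>& p, std::size_t begin,
                     std::size_t end, float dt) {
    for (std::size_t i = begin; i < end; ++i) {
        p[i].vy -= 9.8f * dt;        // gravity
        p[i].y  += p[i].vy * dt;
        if (p[i].y < 0.0f) {         // bounce off the floor, losing energy
            p[i].y  = 0.0f;
            p[i].vy = -0.5f * p[i].vy;
        }
    }
}

// Split the debris across however many cores the CPU reports.
void step_debris(std::vector<Particle>& p, float dt) {
    unsigned n = std::max(1u, std::thread::hardware_concurrency());
    std::size_t chunk = (p.size() + n - 1) / n;
    std::vector<std::thread> workers;
    for (unsigned t = 0; t < n; ++t) {
        std::size_t b = t * chunk, e = std::min(p.size(), b + chunk);
        if (b >= e) break;
        workers.emplace_back(integrate_slice, std::ref(p), b, e, dt);
    }
    for (auto& w : workers) w.join();
}
```

Constraint solving between touching bodies is what breaks this easy parallelism, and that is where a GPU (or a dedicated PPU) earns its keep.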
 
You are trying to point out that AMD is doing "an nVIDIA", which is not correct, for the reasons stated above.

No I'm not.

My comments are about the fact that AMD have traditionally been very vocal against games being optimised specifically for one vendor or another - regardless of whether that's locking out other vendors or other vendors also having full access.

My point is to highlight the often glossed-over fact of how often AMD say one thing and then either do nothing or the complete opposite of what they are saying.
 
No I'm not.

My comments are about the fact that AMD have traditionally been very vocal against games being optimised specifically for one vendor or another - regardless of whether that's locking out other vendors or other vendors also having full access.

My point is to highlight the often glossed-over fact of how often AMD say one thing and then either do nothing or the complete opposite of what they are saying.

You keep saying that but then fail to explain yourself.
 
No I'm not.

My comments are about the fact that AMD have traditionally been very vocal against games being optimised specifically for one vendor or another - regardless of whether that's locking out other vendors or other vendors also having full access.

My point is to highlight the often glossed-over fact of how often AMD say one thing and then either do nothing or the complete opposite of what they are saying.

It is one thing to optimize Metro-style, where the other company went from way ahead to way behind; it is another to optimize in order to have a product that runs well from day one and uses your hardware as fully and as well as possible. That is probably the case for the FX CPUs, which run very well in the CPU-intensive areas of Crysis 3 yet in BF3 lag waaay behind Intel.

PS: Do you know of any Metro-style optimization done on AMD's part, or any locking down of features?
 
My comments are about the fact that AMD have traditionally been very vocal against games being optimised specifically for one vendor or another - regardless of whether that's locking out other vendors or other vendors also having full access.

No they haven't; what they've been very vocal about is summed up at Pcper.

AMD said:
AMD called me to reiterate their stance that competition does not automatically mean cutting out the other guy. In the Tomb Raider story linked above, Neil Robison, AMD's Senior Director of Consumer and Graphics Alliances, states quite plainly: "The thing that angers me the most is when I see a request to debilitate a game. I understand winning, I get that, and I understand aggressive companies, I get that. Why would you ever want to introduce a feature on purpose that would make a game not good for half the gaming audience?"

Source
http://www.pcper.com/news/Graphics-Cards/Rumor-AMD-Gets-Exclusive-Optimization-all-Frostbite-3-Games

When have AMD ever used any proprietary technologies to deliberately hamper performance on the other side's GPUs? Are you forgetting Batman: Arkham Asylum, which locked AMD out of supporting anti-aliasing? Assassin's Creed dropped DX 10.1 support because NVIDIA cards didn't support it. NVIDIA was disabling cores for PhysX CPU support to harm performance. Crysis 2 used extreme tessellation that added nothing to image quality other than hurting AMD cards' performance more than Nvidia's. I don't ever recall AMD doing anything like that, other than improving their Gaming Evolved program.

The good thing about Gaming Evolved is that Nvidia users get to use it and any of the open-standard features it employs without being locked out. Every Nvidia user can use TressFX for example, or compute-based AA, or compute-based lighting effects as seen in Far Cry 3 or DiRT Showdown.
 