Codemasters builds DX12 support into its EGO Engine 4.0

So it's entirely possible that some/all games by this developer, such as this one, will also favour AMD.

AMD fans get worked up over Gameworks, same thing I guess, except at least AMD users can choose whether they use GameWorks or not in most cases.

Compared to GameWorks games, how many AMD sponsored games have caused severe performance issues for everyone? I know there was a problem with TressFX in Tomb Raider, but that was sorted in the end and everyone was on equal footing after.
 
If aimed at Googaly, he isn't an Nvidia fan.



It clearly does, as the dev refused Nvidia's request to disable Async for all; why should they make their partner slower by not using DX12 features?

It won't be a one-off with Oxide either; I think we are going to find some partnered DX12 titles are going to be finely tuned for AMD's/Nvidia's strong areas, giving an even bigger advantage.

DX12 changes the landscape: with a DX11 dev-made title, AMD/Nvidia optimise at the driver level; with DX12 the onus is on the devs who tune the titles, so AMD/Nvidia have less control.

If we are going off today's tie-ins, Nvidia are running riot and AMD titles only release on a full moon.

AMD will be ******* themselves.:p

I guess that smiley at the end shows your happiness if AMD dies. :rolleyes:

As for Oxide refusing to disable Async...why should any dev disable an integral DX12 feature anyway? Would any dev disable Gameworks if AMD asked?

What Oxide did is use an Nvidia-supplied codepath for Nvidia cards and an AMD codepath for AMD cards. This helps both GPU companies rather than favour one, as you put it. You are intentionally posting misleading information here.
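
For anyone wondering how a renderer can even route different cards down different codepaths (a generic sketch, not a claim about what Oxide actually shipped), one common approach is to branch on the DXGI adapter's PCI vendor ID. The RenderPath enum and choosePath() below are made-up names for illustration.

```cpp
// Generic sketch: pick a vendor-specific render path by checking the DXGI
// adapter's PCI vendor ID. RenderPath/choosePath are hypothetical names.
#include <dxgi.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

enum class RenderPath { Generic, NvidiaTuned, AmdTuned };  // hypothetical

RenderPath choosePath(const DXGI_ADAPTER_DESC1& desc)
{
    switch (desc.VendorId) {
        case 0x10DE: return RenderPath::NvidiaTuned;  // NVIDIA PCI vendor ID
        case 0x1002: return RenderPath::AmdTuned;     // AMD PCI vendor ID
        default:     return RenderPath::Generic;
    }
}

int main()
{
    ComPtr<IDXGIFactory1> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE) continue;  // skip WARP
        std::printf("Adapter %u: vendor 0x%04X -> path %d\n",
                    i, desc.VendorId, static_cast<int>(choosePath(desc)));
    }
    return 0;
}
```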
 
Compared to GameWorks games, how many AMD sponsored games have caused severe performance issues for everyone? I know there was a problem with TressFX in Tomb Raider, but that was sorted in the end and everyone was on equal footing after.

Why compare GameWorks games to AMD sponsored games? Surely Nvidia sponsored games are comparable to AMD sponsored games, and GameWorks is equivalent to TressFX?
Most AMD sponsored games don't seem to have any AMD tech in them, so it's an unfair question. Of the 2 games that had TressFX in, 50% were broken on release and the other 50% didn't allow Nvidia to enable the tech. So I guess 100% of TressFX games didn't work with TressFX effects on Nvidia at launch.
 
I guess that smiley at the end shows your happiness if AMD dies. :rolleyes:

As for Oxide refusing to disable Async...why should any dev disable an integral DX12 feature anyway? Would any dev disable Gameworks if AMD asked?

What Oxide did is use an Nvidia-supplied codepath for Nvidia cards and an AMD codepath for AMD cards. This helps both GPU companies rather than favour one, as you put it. You are intentionally posting misleading information here.

I agree that they shouldn't disable Async, but picking one game that uses a feature we know AMD does better at than Nvidia, and then declaring Nvidia's entire DX12 implementation a disaster, seems a little bit hasty.

What if a game uses a feature that Nvidia support and AMD don't? If the game is designed by a company in collaboration with Nvidia, do we still expect AMD to do better in that benchmark?
 
AOTS and Squad

Sorry, the other 5 due to get DX12 are, I believe: Ark, DayZ, Arma3, Just Cause 3, Star Citizen (not played yet).

Thanks, didn't know about Squad, might have a look, and wasn't aware of DayZ, Arma3 (both not my thing).

Just Cause 3, however: is DX12 confirmed?


I guess that smiley at the end shows your happiness if AMD dies. :rolleyes:

Sarcasm,

**** me 8 months in and I'm a fully fledged Nvidia fanboi.:D

(@Andy, what about that one eh?:p)



It was stated because I mentioned the DX12 onus is now on dev optimisation, due to GW titles running amok and the harsh reality that AMD have had hardly any exclusive GE titles, to the point that I don't recall any before SWBF, and I'm not entirely sure when the next one will be, as the new TR isn't exclusive.

At the rate Nvidia churn out GW titles, as it clearly gives Nv the edge now, you are in for a shock, as whoever gets their name on the box will have the performance advantage.

As for Oxide refusing to disable Async...why should any dev disable an integral DX12 feature anyway?

Would any dev disable Gameworks if AMD asked?

AMD wouldn't even ask; a set of libraries isn't an API, so I don't know why you're asking that. For the record, in case you haven't noticed, I'm not the biggest fan of GW's implementation, but most of the market don't give a **** as it doesn't affect them.

posting misleading information here.

If you think so.:cool:
 
I agree that they shouldn't disable Async, but picking one game that uses a feature we know AMD does better at than Nvidia, and then declaring Nvidia's entire DX12 implementation a disaster, seems a little bit hasty.

What if a game uses a feature that Nvidia support and AMD don't? If the game is designed by a company in collaboration with Nvidia, do we still expect AMD to do better in that benchmark?

Of course it's hasty, but personally Nvidia's reluctance to green-light DX12 ARK, along with their request to disable API calls, doesn't instill me with confidence.
 
Of course it's hasty, but personally Nvidia's reluctance to green-light DX12 ARK, along with their request to disable API calls, doesn't instill me with confidence.

I agree.

I think this is where AMD and Nvidia differ with their approach. When something affects Nvidia performance they try to put a stop to it. When something affects AMD performance they do nothing and then a few months later cry victim about how Nvidia did this and developer X did that.
 
I agree.

I think this is where AMD and Nvidia differ with their approach. When something affects Nvidia performance they try to put a stop to it. When something affects AMD performance they do nothing and then a few months later cry victim about how Nvidia did this and developer X did that.

Just about, yet it's OK for Nvidia to attempt to stop devs launching on DX12 anyway, whereas you state AMD cries GW victim, when there fundamentally IS an unfair Nv advantage on GW due to Nvidia-exclusive GW optimisation, considering devs/AMD are forbidden to optimise it for AMD?
 
Thanks, didn't know about Squad, might have a look, and wasn't aware of DayZ, Arma3 (both not my thing).

Just Cause 3, however: is DX12 confirmed?

ArmA3 was mentioned months back as "looking into" the possibility of DX12 entering the engine around 2017. The latest comment was that it wasn't showing any initial benefits. Not that there aren't any, just that for that game engine, DX11 is by no means the bottleneck for performance. As such I'd doubt DayZ will see it any time soon either.

Star Citizen have said they'll support DX12 but there is no option to use it yet, and that's a LONG way off release.

JC3: I remember spotting an article about the possibility of DX12, but I think I just skimmed the headline.

I think we'll see a few games get it this year, but it will probably be at least another 12 months before we see games make good use of DX12, because those will need to be designed from the ground up with engines that support the new direction DX12 is taking.
 
I might be wrong (this is beyond my scope of knowledge) but I wouldn't say that Mantle is "dead". Rather, DX12 IS Mantle; certainly it's written the same way. That's why AMD cards are doing so well on it. It's basically the same software architecture. Either AMD was on to something with Mantle and the folks at Microsoft thought...hmm, this is the way to go here. Or AMD just got extremely lucky and DX12 happened to be similar to Mantle. But from all the reading I have done, Mantle is not "dead"; its job is done. It has (for all intents and purposes) now become DX12. Apparently they are amazingly similar designs.

I personally (not a fanboy) don't think that "new drivers" from Nvidia will be enough to rein in the differences being shown between AMD and Nvidia on current cards in DX12. But yes, hopefully Pascal will be able to be tuned to be DX12 friendly; I'm pretty sure of that. So I think this year AMD GPUs will dominate on DX12 (a VERY good thing for us, no matter what card you have). And I think Nvidia will pull back level with Pascal. I hope anyway.
 
Good news at least, but still no reason for me to move from Win7 yet.

There's no real reason not to either; there's nothing you can't change to make it more private.

I loved Win 7 and hated Win 8.1 when I moved to it, but having now moved to Win 10 it's like being on 7 again. It's a stable OS that, with a few settings tweaks, does nothing I don't want it to.
 
Eh?? Two of my games are already fully fledged DX12, while another 6 of them are set to get DX12 patches in the near future. So that makes 80% of my Steam games DX12 or nearly DX12 already. You make it sound like it's years off? Not only is it not far away, it's already here. Hell, my Windows is DX12.

So two of your games are fully fledged DX12 (Squad and AoTS), both of which are still in Alpha or Beta, and the others you mention are only set to get DX12 support. I don't consider 2 games that are in Alpha or Beta to be "DX12 in full flow", and my point stands. You can actually call it 3 games if you like, as Caffeine is DX12 as well.

Here is a list of DX12 games and coming games.

https://en.wikipedia.org/wiki/List_of_games_with_DirectX_12_support

There will be more and some of them will drop DX12 but for now, DX12 isn't what I would upgrade for.
 
Humbug only had the very old build, which is why his results look so bad (as was already pointed out a couple of times in the thread), hence why I went back and looked for review sites that used the same Ashes build and the same CPU on 970 vs 390.

In his quote the member Streetlight completely trashes my 970 with a 290p on the same FX9590
 
When I get home next week. I no longer have that CPU tho
In his quote the member Streetlight completely trashes my 970 with a 290p on the same FX9590

I suspect it's the 8 ACE units AS in Hawaii that's making much better use of the FX8 CPU
 
When I get home next week. I no longer have that CPU tho


I suspect it's the 8 ACE units AS in Hawaii that's making much better use of the FX8 CPU

To quote myself again: he's also beaten beeneyboy, who has a 4790K.
That should tell you what AS is doing. The FX8 is as fast, but only with all 8 of its threads used, which is rare in DX11 and only happens when it's physics in a good engine like CryEngine.
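
For anyone wondering what the async part actually looks like on the API side (a generic D3D12 sketch, nothing specific to Ashes or these systems): compute work gets its own queue so it can overlap the graphics queue, which is the sort of work the ACE units service on GCN cards.

```cpp
// Generic D3D12 sketch: a graphics (direct) queue plus a separate compute
// queue, so compute work can run asynchronously alongside graphics.
// Error handling and teardown trimmed for brevity.
#include <d3d12.h>
#include <wrl/client.h>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
        return 1;

    D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
    gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;       // graphics + compute + copy

    D3D12_COMMAND_QUEUE_DESC computeDesc = {};
    computeDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;  // compute + copy only

    ComPtr<ID3D12CommandQueue> gfxQueue, computeQueue;
    device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(&gfxQueue));
    device->CreateCommandQueue(&computeDesc, IID_PPV_ARGS(&computeQueue));

    // Work submitted to computeQueue can overlap work on gfxQueue; an
    // ID3D12Fence is used to synchronise wherever one depends on the other.
    return 0;
}
```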
 
Compared to GameWorks games, how many AMD sponsored games have caused severe performance issues for everyone? I know there was a problem with TressFX in Tomb Raider, but that was sorted in the end and everyone was on equal footing after.

There is nothing in an AMD sponsored game to cause any issues.
 
Compared to GameWorks games, how many AMD sponsored games have caused severe performance issues for everyone? I know there was a problem with TressFX in Tomb Raider, but that was sorted in the end and everyone was on equal footing after.

There is nothing in an AMD sponsored game to cause any issues.

Let's not forget that the majority of these games are terrible performance-wise, and buggy even without the NVIDIA tech. Ubisoft have been terrible with performance, stability and even other graphical issues in their games, and Warner Bros just don't seem to care about PC anymore in general.

Both Arkham Knight and Mortal Kombat X were terrible games on the PC with or without NVIDIA tech running, and the latter has now even had PC support dropped entirely.

As much as I like GameWorks' stuff when it works, I'd still rather have a game that runs extremely well and isn't a buggy mess, even if it means no NV or AMD tech whatsoever.
 
I should also add that DX12 on its own is limited to 4 threads when it comes to draw calls. That's 4x better than DX11, but it still needs clever innovation to help when it comes to serious work.
Mantle, BTW, is at least 12.
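
To illustrate what multi-threaded draw call recording looks like in D3D12 (a generic sketch, not tied to any game above): each worker thread records into its own command allocator and command list, and the main thread submits them all in one go. The actual draws are omitted to keep it short.

```cpp
// Generic D3D12 sketch: one command allocator + command list per worker
// thread, recorded in parallel and submitted together on a single queue.
// Real code would bind pipeline state and issue draws inside each thread.
#include <d3d12.h>
#include <wrl/client.h>
#include <thread>
#include <vector>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
        return 1;

    D3D12_COMMAND_QUEUE_DESC qDesc = {};
    qDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    ComPtr<ID3D12CommandQueue> queue;
    device->CreateCommandQueue(&qDesc, IID_PPV_ARGS(&queue));

    const int workers = 4;  // one recording thread per command list
    std::vector<ComPtr<ID3D12CommandAllocator>> allocs(workers);
    std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(workers);
    std::vector<std::thread> threads;

    for (int i = 0; i < workers; ++i) {
        device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                       IID_PPV_ARGS(&allocs[i]));
        device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                  allocs[i].Get(), nullptr,
                                  IID_PPV_ARGS(&lists[i]));
        // Each thread records into its own list, so no locking is needed.
        threads.emplace_back([&, i] {
            // ... record this thread's share of the draw calls here ...
            lists[i]->Close();
        });
    }
    for (auto& t : threads) t.join();

    std::vector<ID3D12CommandList*> raw;
    for (auto& l : lists) raw.push_back(l.Get());
    queue->ExecuteCommandLists(static_cast<UINT>(raw.size()), raw.data());
    // A real app would wait on an ID3D12Fence before tearing anything down.
    return 0;
}
```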
 
Just about, yet it's OK for Nvidia to attempt to stop devs launching on DX12 anyway, whereas you state AMD cries GW victim, when there fundamentally IS an unfair Nv advantage on GW due to Nvidia-exclusive GW optimisation, considering devs/AMD are forbidden to optimise it for AMD?

I'm not saying Nvidia's solution is a good one, but they are doing something. Was it the AOTS benchmark where they asked the developers to use a different codepath for their cards? Not the ideal solution, but if they figured out a solution that works better for them and provided the code (or some sort of breakdown) to the developer, at least they're doing something.

I'm also not saying AMD are always unjustified in their complaints, but it just seems that rather than going out and getting their tech into games, they sit back and complain when Nvidia get their tech into games. Complaining seems like a much easier course of action than proactively trying to change something.
 