AMD AGAIN? No Xfire for Tomb Raider

So of course you have definitive proof for all of this? Links, please.

Good discussion, great points, argument well made.... wait, nope.

Did I say there was proof? Why would there be proof of this? Would you expect Nvidia to put up a blog post saying, "Hey, we threw millions at this game because we realised it would make our cards look really bad"? Most importantly, YOU asked how a comparison was made; I replied, and I asked why you think DX12 was removed from the game if Nvidia didn't lose ground with it enabled. So rather than making a realistic point, answering the question or discussing it sensibly, you demand proof when there is no realistic expectation of proof.

Once again I'll ask: if this isn't the case, why did a DX12 game have DX12 removed on the PC? Is there any viable reason that async compute and other DX12 features, like lower driver overhead and generally better performance, were removed from the game for the PC release?

These should improve performance on Nvidia too, for whichever parts of DX12 they support. The single logical reason for removing them is that while Nvidia gain some performance, AMD gain more, which reduces Nvidia's advantage or pushes the advantage to AMD in this game.

Is there another reason to remove a faster code path from a game?
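
For anyone wondering what "async compute" actually is at the API level, here's a minimal D3D12 sketch (my own illustration, nothing from the game's code): you create a second, compute-only command queue alongside the normal direct queue, and on hardware that can schedule them concurrently the compute work overlaps the graphics work, with fences keeping the two in step.

    // Minimal sketch of async compute in D3D12: two command queues.
    #include <d3d12.h>
    #include <wrl/client.h>
    #pragma comment(lib, "d3d12.lib")

    using Microsoft::WRL::ComPtr;

    int main()
    {
        ComPtr<ID3D12Device> device;
        // Create a device on the default adapter.
        if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                     IID_PPV_ARGS(&device))))
            return 1;

        // The usual "direct" queue: graphics, and compute/copy too if wanted.
        D3D12_COMMAND_QUEUE_DESC gfx = {};
        gfx.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
        ComPtr<ID3D12CommandQueue> gfxQueue;
        device->CreateCommandQueue(&gfx, IID_PPV_ARGS(&gfxQueue));

        // The "async" part: a separate compute-only queue. Work submitted
        // here can run alongside the direct queue's work on GPUs that
        // support concurrent execution; fences synchronise the two.
        D3D12_COMMAND_QUEUE_DESC comp = {};
        comp.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;
        ComPtr<ID3D12CommandQueue> compQueue;
        device->CreateCommandQueue(&comp, IID_PPV_ARGS(&compQueue));

        return 0;
    }

Note that dropping the second queue and feeding everything through the direct queue gives you a plain serial path again, which is part of why pulling async compute out of a port is not a huge engineering job.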
 
Yes, like I have said, I can understand if CD does not say anything. But what stops AMD from saying its TressFX technology is in Purehair, or from making some statement?

Plus, are AMD that weak sauce that they give tech to a dev and can't even say two lines on how it is based on their own tech? What sort of crap agreement have they signed with the devs, then??

FFS, do people realise Crystal Dynamics and Eidos are OWNED by Square Enix, which is publishing the next Deus Ex (made by Eidos), which also uses AMD technologies and is an AMD Gaming Evolved title?

So, does the next Deus Ex mean AMD is not going to be able to talk about the Purehair technology in that, too??

Because, if a modded/renamed TressFX 2.0/3.0 is being used by one dev owned by Square Enix, it wouldn't surprise me if other studios use it too.

I feel your pain. They do seem to be playing the good guy card too much atm. I would like to see some more fight, but being the good guy is never easy. What they are trying to do is the right thing, but it will only work with a little guile behind it.
 
They need to get their marketing/PR in order. Tomb Raider is being developed by one of the studios that is part of the same company that is also developing Deus Ex: Mankind Divided.

So, Square Enix accepted Nvidia sponsorship/help for the new TR game and AMD sponsorship/help for the next Deus Ex game.

So, you could argue, maybe they are hedging their bets a bit. But OTOH, considering AMD technology is being used in BOTH Rise of the Tomb Raider and Deus Ex: Mankind Divided, I can't see why AMD would be gagged from actually talking about its own tech, which Square Enix has been quite happy to use over Hairworks!!

I think this is more the case: AMD might see it as an "Nvidia" game and so decided they have no reason to bother doing anything with it, IMHO. But what they don't realise is that if they actually put out a puff piece on Purehair, or at least noted that it is AMD based, it would mean some good PR for AMD, especially at a time when they are not releasing much, and especially when reviewers are waxing lyrical about how small a performance hit the hair effects have cross platform. It would also serve as excellent PR against Hairworks, with its issues.

This would also drive some hype for Deus Ex: Mankind Divided, as it is meant to have the latest iteration of TressFX, which is meant to have a greater scope in animation effects. So if AMD can get a similar scope of effects to Hairworks at a lower performance hit, while making the code freely available to devs, it is actually a positive piece of PR that is backed up by results instead of just rhetoric.
 
Breaking news! Latest blockbuster game creates AMD/nVidia divide... Again!

Do we have to keep having the same threads with the same tired arguments all the time?
If it keeps happening only when Nvidia gets involved, then doesn't that suggest there could be a good reason the performance is always relatively poor / features are missing / things are tuned in a less than optimal way?

I'm not wading into the argument, but I'm just wondering how many times someone would have to get slapped to realise who's slapping them, lol. It's always been a waste of time to try to convince some of the most hard-headed and anti-logical crew that, no matter how many times the event repeats itself, there is some evidence it happens rather frequently only when Nvidia gets involved. Say what you will; I'm not going to argue, as I said, but it's only a repeating argument because so many people are stapling their eyes shut. I don't expect anyone to provide definitive proof, so I don't mind if some choose to be cautious in their suspicions, but there have been enough observations and examples that it seems like some people just don't want to turn the gears in their heads and engage with the idea. Short of Nvidia outright breaking into tears and confessing, I'd expect some to at least have the smarts to acknowledge it's a possibility.
 
Refreshingly rational. Kudos
 
Thumbs up.
 
+1

You pretty much sum up my views in a better way than I could express them.
 
I've had plenty of Nvidia cards in the past: a GF3 Ti, a GF4 Ti 200 (went back for free and was replaced with a 9700 Pro not long after), a couple of 6800 GTs, I think a 5900 at some point, and then I got a killer deal on an 8800 GTX.

At some point I got fed up with nonsense like Nvidia paying to have the DX10.1 patch removed from one game (and I think quite a few games rumoured to be getting it only came out with DX10 in the end), which reduced performance to make Nvidia look good.

This is my bone with Nvidia: I used to give them money, and rather than spending that money on making as good a card as possible and supporting the latest and best features, they started sabotaging games. They screwed up their DX10 support, and so screwed everyone by working with (paying) MS to remove multiple features. When MS had the temerity to add those features back in with DX10.1, that wasn't enough, so Nvidia started paying to remove DX10.1 support from games.

That is when I stopped supporting them. This is what I can't understand: we all know this game has DX12 features, and Nvidia say they support DX12 (though they are hazy on their idea of async shaders, one of the single most important features of DX12), yet somehow they pay to nick this DX12 game off AMD, and suddenly the PC game is missing features from the console version that would improve performance.

Screw that, and screw over-tessellating and taking performance away from your own users just to hurt AMD more. Hurting AMD alone is one thing, bad enough, but it is a very different thing to hurt your own users' performance to get at AMD. I have no idea how people think it's okay.

Let's say that with DX11 this game gives a Fury X 50fps and a 980 Ti 60fps, while in DX12 the Fury X gets 75fps and the 980 Ti gets 70fps, or maybe 80fps for the Fury X. They are paying for their own users to miss out on that 10 or 20fps just because DX12 shows a bigger gap. That should be completely and utterly unacceptable to every Nvidia user.
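
Working through those made-up numbers (they are hypothetical, as above) shows the incentive:

    DX11: 60 / 50 = 1.20  -> 980 Ti leads the Fury X by 20%
    DX12: 75 / 70 ~= 1.07 -> Fury X leads by about 7% (or 80 / 70 ~= 14%)
    Cost to the 980 Ti owner: 70 - 60 = 10fps left on the table

Both cards can gain under DX12 while the headline comparison still swings the other way; that is the whole point.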
 
The point you're missing is that while this is indeed quite possible, it's just as possible that Nvidia didn't pay to get DX12 removed from this game, yet you go on as if it's fact.
It's also possible the console maker didn't want the PC version to appear much better and faster and had a hand in it, but the thing is, none of us really know for sure.
 
Indeed, it's just coincidence that DX10.1 got removed from AC, and DX12 from Rise of the Tomb Raider and ARK: Survival Evolved.
 
I think it's funny how people expect games to work the week they release. Don't they ever learn? CF has some of the best scaling, but they take longer to bring out drivers. This has been the case for well over a year, so why do people have such high expectations? :P If you want a game to work on day 1, buy a PS4, although you might have to wait till day two to play it because you'll spend the first day downloading patches.
 
Problem is, it's not that way at all.

Just Cause 3 has been out for ages, and still no Crossfire support <- best example
 
Definitely need CF support for this to be playable on my system at decent settings.

How does one force the old TR profile in this new Crimson driver GUI? It's awful and I can't find anything on it. (It also broke my hotkeys, arrrrg.)
 