Keep on their case Gregster, you biblical beast.
lol
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
In a normal game, no, but we're talking a TWIMTBP game, so keeping up with a 7950 isn't beyond the realms of possibility.
If AMD for whatever reason are denied access to optimise then yes.
The only way a GPU like that would ever keep up is if its not running efficiently on the 7950.
Isn't that the point here?
No, I was just meaning the normal difference between TWIMTBP and GE titles, in that they're optimised better from the off on their respective cards. That, coupled with a scenario where the memory bandwidth on the 660 doesn't hamper it, would bring them close. I don't see the difference being purely down to GW, although I'm not ruling it out entirely.
Joel Hruska - Gregster • 3 hours ago
AMD did not "set me up" for this article. They made me aware of their concerns. I investigated the issue and believe the concerns are warranted. I also spoke to Nvidia and attempted to speak to WBM.
The benchmarks, data collection, and research for this story are entirely my own. Vendors regularly communicate with journalists regarding product performance of both their own hardware and that of their competition.
Careful Greg, people might think you're a biased CT nutjob or that you have an agenda.
Sorry, forgot the ''
It's just that you're taking what this guy says and making it mean what you want it to mean in order to support your point of view while the general Nvidia crowd claim the usual conspiracy theory and agenda accusations whenever something negative is said about Nvidia practices and demand absolute proof.
Take the Origin PC thing for example. How blatant was that? Yet the green defenders wanted nothing short of an official statement from Nvidia saying 'Yeah, we did it' or it must all be conspiracy theory BS from AMD fanboys...
All I'm saying is bias is on both sides of the argument here and people are ignoring anything that doesn't support their pre-defined POV, while trying their hardest to put down anyone that thinks otherwise.
Can you explain how a 770 is faster than a 290X with FXAA in BAO? Whatever the NV advantage is with FXAA, we're still talking about something that only costs a couple of fps with AMD cards.
My mistake, I'm getting confused with the Titan launch, prior to the Tomb Raider launch. Regardless, Nvidia were able to optimize for TressFX, the way it should be. That's something AMD are unable to do for GameWorks, so it changes nothing. If AMD wanted to hurt Nvidia performance, they would not use a DX11 API standard like DirectCompute, which is open and can be optimized for.
I will say it again. There is no way a 660 should be faster than a 7950 at any setting. See the post below.
So it is totally OK for AMD to develop extra IQ features knowing full well that it would hurt Nvidia users because "it's open", even when it makes Nvidia's top card look like an AMD mid-range card,
but it isn't OK for Nvidia to develop extra IQ features, even when multiple benches across a range of games and settings show that the 290X still beats a Titan?
Even if they did put him onto it, why is that such a bad thing? He reckons the concerns are valid, and if they are...?
Have Nvidia never put anyone onto anything? How about the FCAT frametimes thing for example?
With BAO, the only way that AMD are running level or even slightly ahead is due to them being better at MSAA (according to them).
Sure, these are just benchmark scores and most decent cards can run the game well enough because it really isn't that hugely demanding tbh.
What happens when this is being applied to a game that does require more GPU beef?
Again, a 770 shouldn't be beating a 290X.
As it stands, yes, I agree that it's a little overblown right now. However, that doesn't mean it definitely isn't an ominous sign of things to come.
I know you wouldn't want a situation like that, just as much as me or anyone that doesn't sport a GPU brand on their forehead.
And now that Nvidia's top cards can run TressFX without a massive performance hit, it hasn't been seen in any other game. Funny, that.
If you really can't see the bias in your own statements then there really is no point in having any type of discussion.
The hypocrisy is off the scale. I don't mind people having a brand preference, I prefer Nvidia, but to dress it up as "concern for gamers" is laughable.
Both sides work with developers, and a range of games end up with a slight bias towards one range of cards or the other; that is not surprising. The claim that GW totally prevents AMD from any optimisation, and that as a result all AMD cards are at a disadvantage, really isn't backed up by benches. You end up having to run very specific settings and only compare specific cards for the data to support the accusation; with a wider data set, no pattern occurs.
But no, you are totally right: Nvidia should be banned from doing any work that benefits their customers, whereas AMD should be given awards for developing features and tools that block out or hurt performance for Nvidia customers. That sounds completely fair.
Very well said, Petey, but the brand defenders are here in force and don't listen to reason or common sense. They're far too dug in, so it's a case of ignore, deflect, ignore, deflect, etc.
Andy, andy, andy. The extra IQ features, as you call them, are part of DirectX. I don't see how you can compare that to GameWorks, which is a closed library that AMD are unable to optimise for. It's pretty clear you don't understand. I've tried to explain it for you several times, but you just ignore what I'm saying and go off on a tangent about AMD only using DirectCompute because mid-range Kepler cards decided to gimp that feature at Nvidia's request. It's madness.
Even funnier that you now speculate that because Nvidia have optimised their drivers for TressFX, AMD have decided to stop using it. Actually, AMD are working on a new and improved version of TressFX which is less demanding. It will be appearing in Star Citizen. If AMD had used a closed library like GameWorks, and not an open API like DirectCompute, Nvidia would never have been able to optimise their drivers for it.
As for the claims about GameWorks, they're backed up perfectly via the article, via TechSpot and via user benches. It's only when AA is applied that AMD are able to overpower the GameWorks advantage in this particular title and pull ahead through brute force. It's not normal for a 660 to beat a 7950 Boost.
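For what it's worth, the "open" point is easy to illustrate. DirectCompute is just the DX11 compute-shader stage, so a TressFX-style simulation pass is ordinary HLSL compiled with the standard DX11 toolchain, which any vendor's driver team can profile and optimise against. A minimal sketch (purely illustrative, not actual TressFX code, and the buffer names are made up) looks like this:

```hlsl
// Illustrative only - not real TressFX code. A DirectCompute pass is
// plain HLSL dispatched through the standard D3D11 API, so there is
// nothing vendor-specific for a rival driver team to be locked out of.
RWStructuredBuffer<float3> positions  : register(u0); // hair vertex positions
StructuredBuffer<float3>   velocities : register(t0); // per-vertex velocities

cbuffer SimParams : register(b0)
{
    float deltaTime;
};

[numthreads(64, 1, 1)]
void IntegrateVertices(uint3 id : SV_DispatchThreadID)
{
    // Trivial Euler step per hair vertex. A real hair sim adds
    // constraints, wind and collisions, but the API surface is the same
    // open DX11 standard either way.
    positions[id.x] += velocities[id.x] * deltaTime;
}
```

A GameWorks library, by contrast, ships its equivalent passes precompiled inside a closed binary, which is the crux of the optimisation complaint in this thread.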
TressFX is not part of DX11; it is a specific feature that was added to a game by AMD. The article you linked to even goes into detail about how AMD hurt their own users' performance, and that the upshot of that was that it hurt Nvidia users more (sound familiar?).
Nvidia's performance on 780 vs. 680 has nothing to do with driver optimisation; the 680 lacked hardware (at developers' and gamers' request) that could run that code efficiently. The TressFX release was specifically targeted at making Nvidia's cards look poor value and AMD cards look better value to potential customers - a marketing tool.
Being able to optimise your drivers for a feature is irrelevant if your hardware lacks the capability to run it - same with TressFX at the time, and the same with Mantle for god knows how long.
Mantle being "open" is only a valid point once AMD add support for Intel and Nvidia GPUs.
As others have pointed out, there are lots of "closed libraries" in use by developers other than Nvidia, and yet both sides have no problems optimising drivers for those games.
This entire thread is purely a case of your rabid support for anything AMD and your inherent need to have a pop at Nvidia, but you can't have it both ways.
I bought Nvidia because I see the value in their extra features; you bought AMD because you didn't. You can't then complain when game devs leverage Nvidia's tools to make their games and then, on specific settings in specific circumstances, the cards that you own don't perform quite as well as they do in some other games.
AMD do the exact same thing, and it is only your rabid support of all things rose-tinted that prevents you from accepting that.
We’ve been working closely with NVIDIA to address the issues experienced by some Tomb Raider players. In conjunction with this patch, NVIDIA will be releasing updated drivers that help to improve stability and performance of Tomb Raider on NVIDIA GeForce GPUs. We are continuing to work together to resolve any remaining outstanding issues. We recommend that GeForce users update to the latest GeForce 314.21 drivers (posting today) for the best experience in Tomb Raider.
People are STILL comparing GameWorks to Mantle purely because of performance? How about actually comparing the approach of the features instead? This concerns me tbh, but no one makes a thread stating that Mantle could make Nvidia 780 Ti cards look like a 6950 in a lot of future games.
Quote: "Of course, here's the thing we are not sure about. Mantle was clearly designed with GCN in mind, so when AMD talks about other vendors being able to utilize Mantle, does that mean that Mantle will work on their current architecture? Or will the actual architecture of rival vendors (Nvidia) need to be modified to support Mantle?"
Read more: http://wccftech.com/amd-mantle-api-require-gcn-work-nvidia-graphic-cards/#ixzz2pERWrXZ8
If Nvidia architecture has to be 'modified', I don't think this will help the gaming community on Nvidia cards at all, as who knows at what cost to the consumer this will happen, if it is the case.