I'm allowed to get annoyed at things, man. I'm not losing sleep over it, don't worry. I get annoyed at somebody pushing past me in a crowd, too. Whether it's on the internet or not is irrelevant.
Well, at least you admit it's not evidence of them doing what they were accused of here. That's a start.
Has nothing to do with what we were talking about. Here I was thinking you might finally start understanding the point, but nope, you're still going to sit here and try and say there's some merit to the argument because they did something else.
I'm not saying Nvidia is clean or anything. Never have. I am merely dismissing the notion that they are paying developers to get rid of DX12, as was claimed. Nothing more. Yet here you are, desperately clinging to the notion by a thread, even though you admit there's absolutely no evidence to support it. You tell me to 'keep an open mind' even though there's absolutely nothing to support it. You continually bring up this one thing they did in the past and present it as some idea that maybe this accusation is true, even though there is no actual evidence to support it.
It's a lousy argument, man. That's all there is to it. Something that doesn't have any evidence whatsoever is a bad argument. Yet here you are, continually trying to defend it anyway, however loosely. There is NO REASON to entertain the idea whatsoever. And it wasn't phrased as speculation. For one, it was completely *baseless* speculation, but more to the point, it was stated as fact. As if it had happened. Though they of course were wise never to respond to this, as they knew damn well they had nothing to back it up with. So don't take up the charge that somebody else was smart enough to back down from, man. It's a losing battle.
You haven't at all explained why the analogy is 'stupid'. It's clearly just your instinct to state it is so as it's inconvenient to your argument.
We are ALSO talking about two different things: putting pressure on a developer to turn off certain options in a benchmarking test, and paying money to a developer to completely get rid of DX12 from their game. I think my analogy is absolutely, freakin' A+ spot-on here, man. Doing one does not mean it is OK to accuse them of the other based *solely* on the fact that they were guilty of the former.
It's nonsense.
"you continually bring up this one thing they did in the past and present it as some idea that maybe this accusation is true, even though there is no actual evidence to support it"
Nexus18: if I were to post up that AMD (who bought out ATI) were paying developers to add extra nonsense effects to games just to hurt NVidia, and here is my proof:
Proof, but not really
That is pretty much what you said earlier: I accuse Nvidia of doing X, and here is the proof, because they did Y (and in your case it wasn't even proof, just an accusation).
Yes as you can see the thing they were guilty of has no relevance to what they are now accused of.
Disclaimer: before anyone starts saying "AMD wouldn't do such a thing, bru, you're talking billyhooks" and the like, what I said above is just an example.
I have no idea, haven't read into anything much about ATI/AMD back then etc.
I am guessing that you haven't properly read my posts either, though... my point isn't just about Nvidia but about all companies and how dirty they can be, and about not rejecting opinions just because they don't sit well with certain people. I am just using Nvidia in this instance, considering the topic at hand.
Chances are, AMD are just as dirty with regard to getting DX12/async implemented in games, given Nvidia's predicament. For all we know, they too could be "pressuring" (take that whatever way you want) developers into doing this; as you have shown with that link, they also have a history of this behaviour.
Also, didn't one or two developers come forward and say that it is very time-consuming to work on DX12/async to get any noticeable difference? If that's true, then why bother, given how strict their deadlines and budget constraints are, especially when it will only really benefit a fraction of the GPU market?
I disagree that they "stopped optimising for the 780Ti".
My personal opinion is that AMD drivers are generally badly optimised. That's not to say AMD cards are not competitive - they are, but through brute force: hence the 7970 has much more compute power than the 680, the 390X has more compute power than the 980, etc.
This also explains nVidia's apparently poor showing in DX12 - it's not really a poor showing; rather, their incredibly efficient DX11 driver left less fat to trim, less room for improvement.
This is also why AMD need async compute - to use GPU resources that would otherwise sit idle - and why nVidia haven't bothered with it, as there are no idle resources for them.
The 780Ti got as far as it could go, and nVidia then started optimising drivers for the new Maxwell GPUs. AMD meanwhile carried on optimising for GCN, with much further to go, so over time the 290X/390X/whatever it is named today overtook the 780Ti.
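To make the "use resources that would otherwise sit idle" point a bit more concrete, here is a minimal sketch in CUDA terms rather than DX12 proper - it is only an analogy, and everything in it (the kernel names workA/workB, the sizes) is invented for illustration. Two independent workloads are submitted on separate streams so the GPU may overlap them if it has idle execution units; that is the same general idea DX12 async compute relies on with its graphics and compute queues.

#include <cuda_runtime.h>
#include <cstdio>

// Two toy kernels standing in for independent workloads: in DX12 terms these
// would roughly correspond to graphics work and compute work on separate queues.
__global__ void workA(float *a, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) a[i] = a[i] * 2.0f + 1.0f;
}

__global__ void workB(float *b, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) b[i] = b[i] * 0.5f - 1.0f;
}

int main() {
    const int n = 1 << 20;
    float *a, *b;
    cudaMalloc(&a, n * sizeof(float));
    cudaMalloc(&b, n * sizeof(float));
    cudaMemset(a, 0, n * sizeof(float));
    cudaMemset(b, 0, n * sizeof(float));

    // Two streams: work submitted to different streams may overlap on the GPU
    // if there are idle execution resources - the idea async compute exploits.
    cudaStream_t s1, s2;
    cudaStreamCreate(&s1);
    cudaStreamCreate(&s2);

    workA<<<(n + 255) / 256, 256, 0, s1>>>(a, n);
    workB<<<(n + 255) / 256, 256, 0, s2>>>(b, n);

    cudaDeviceSynchronize();   // wait for both streams to finish

    cudaStreamDestroy(s1);
    cudaStreamDestroy(s2);
    cudaFree(a);
    cudaFree(b);
    printf("done\n");
    return 0;
}

Whether the two kernels actually overlap depends on how full the GPU already is - which is exactly the argument above: a card with no idle units left has little to gain from this.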
I'm expecting 25% from the 980 to the Ti and another 25% from the Ti to the 1080, so roughly 56% from a 980 GTX to a 1080 GTX (the two steps compound: 1.25 × 1.25 ≈ 1.56).
This is blasphemy!!!11! Fun fact: async compute is present in the Maxwell arch; it just needs to be enabled by the driver. Funnier fact: the driver team has been working on this issue since last year's summer.
I really do reckon that NVidia will be pushing the FP16 capabilities of the new Pascal architecture, and for games that are coded to take advantage of it, it could give a very significant boost to performance.
This could be the next battleground between AMD and NVidia: who persuades developers to use their favoured tech, async or FP16.
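For what "FP16 capabilities" means in practice, here is a minimal CUDA sketch, assuming a card with native half-precision support (Pascal-class, compiled with something like -arch=sm_60); the kernel name and data are invented for illustration. The point is the packed __half2 type: one instruction operates on two 16-bit values, which is where the potential doubling of throughput over FP32 comes from. Games would get at this through their shaders rather than CUDA, so treat this purely as a sketch of the idea.

#include <cuda_fp16.h>
#include <cuda_runtime.h>
#include <cstdio>

// Packed FP16 math: each __half2 holds two 16-bit values, so one instruction
// works on two elements - the source of the potential 2x FP16 rate on
// hardware with native half support.
__global__ void scaleHalf2(__half2 *data, float scale, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    __half2 s = __float2half2_rn(scale);            // broadcast scale into both halves
    if (i < n) data[i] = __hmul2(data[i], s);       // two FP16 multiplies at once
}

int main() {
    const int n = 1 << 20;                          // number of half2 pairs
    __half2 *d;
    cudaMalloc(&d, n * sizeof(__half2));
    cudaMemset(d, 0, n * sizeof(__half2));          // all-zero bits = 0.0 halves

    scaleHalf2<<<(n + 255) / 256, 256>>>(d, 2.0f, n);
    cudaDeviceSynchronize();

    cudaFree(d);
    printf("done\n");
    return 0;
}

On hardware without the fast FP16 path, the same code still runs but the half maths is no quicker than FP32, which is why developer buy-in would matter for this to be a selling point.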
http://videocardz.com/59558/nvidia-geforce-gtx-1080-3dmark-benchmarks
hm..
If true, stock against stock makes it a decent bit faster than the 980Ti (20 - 30%)
If that is correct, 20-30% faster than a 980Ti is not great. Good upgrade for anyone on a 970 or lower. This could be false?
Negative: http://forums.overclockers.co.uk/showpost.php?p=27387417&postcount=1232
http://videocardz.com/59558/nvidia-geforce-gtx-1080-3dmark-benchmarks
3DMark score leaked for GTX1080 - it is faster than an overclocked GTX980TI and appears to boost to 1.86GHz!!
The GTX980TI is faster clock for clock but you need LN2 to hit similar clockspeeds.