** The Official Nvidia GeForce 'Pascal' Thread - for general gossip and discussions **

Caporegime
Joined
4 Jun 2009
Posts
31,576
I'm allowed to get annoyed at things man. I'm not losing sleep over it, don't worry. I get annoyed at somebody pushing past me in a crowd, too. Whether it's on the internet or not is irrelevant.

Well, at least you admit it's not evidence of them doing what they were accused of here. That's a start.


That has nothing to do with what we were talking about. Here I was thinking you might finally start understanding the point, but nope, you're still going to sit here and try to say there's some merit to the argument because they did something else.

I'm not saying Nvidia is clean or anything. Never have. I am merely dismissing the notion that they are paying developers to get rid of DX12, as was claimed. Nothing more. Yet here you are, desperately clinging to the notion by a thread, even though you admit there's absolutely no evidence to support it. You tell me to 'keep an open mind' even though there's absolutely nothing to support it. You continually bring up this one thing they did in the past and present it as if it somehow suggests this accusation is true, even though there is no actual evidence for it.

It's a lousy argument, man. That's all there is to it. Something that doesn't have any evidence whatsoever is a bad argument. Yet here you are, continually trying to defend it anyway, however loosely. There is NO REASON to entertain the idea whatsoever. And it wasn't phrased as speculation. For one, it was completely *baseless* speculation, but more to the point, it was stated as fact, as if it had happened. Though of course they were wise never to respond to this, as they knew damn well they had nothing to back it up with. So don't take over a charge that somebody else was smart enough to back down from, man. It's a losing battle.


You haven't at all explained why the analogy is 'stupid'. You're clearly just calling it that because it's inconvenient to your argument.

We are ALSO talking about two different things: putting pressure on a developer to turn off certain options in a benchmarking test, and paying money to a developer to completely remove DX12 from their game. I think my analogy is absolutely, freakin A+ spot-on here, man. Doing one does not mean it is OK to accuse them of the other based *solely* on the fact that they were guilty of the former.

It's nonsense.

Why is it so hard to grasp these 2 points...

The possibility (note - I never made any definitive statements as facts - this is just an internet forum after all, not a court of law where I need to source and substantiate everything to the requisite standard of proof) of pressure being applied.

Pressure can take various forms; we have no idea what exactly those Oxide developers mean when they say that, and I doubt it was just a simple email/phone call from nvidia saying "oh hey Oxide developers, can you disable this feature please, kthxbye".

Again, I am not saying that they did or didn't do anything, nor am I trying to get people to take what I or others think as fact. It is you who is saying that they "definitely" didn't do whatever, and so desperately trying to shove your beliefs down other people's throats; it really is almost like you are in damage control mode right now....

As for the Oxide thing:

you continually bring up this one thing they did in the past and present it as some idea that maybe this accusation is true, even though there is no actual evidence to support it

Are you now suggesting to me that oxide are lying? If they were, do you not think nvidia would have had something to say about that?

And if so, to apply the standard you have been lambasting me with, where is your evidence that they are lying? A developer working on the issue has raised credible evidence to make it a prima facie case, so the onus is on you to rebut it with something more credible.

Like I said, my point all along has been to keep an open mind and not be so gullible and blind as to what companies can and will do, as shown time and time again. Neither you, nor I, nor anyone else truly knows what goes on behind closed doors, and until the whistle is blown it is all just speculation from all parties.

There has been plenty of rational discussion on nvidia, AMD, DX12, async etc. for people to come to their own conclusions. You choose to ignore what doesn't sit well with you and your beliefs, but hey, that is cool; if you want to see companies through rose-tinted glasses, be my guest, we've all got to keep ourselves happy one way or another :)
 
Last edited:
Caporegime
Joined
4 Jun 2009
Posts
31,576
Nexus18, say I was to post up that AMD (who bought out ATI) were paying developers to add extra nonsense effects to games just to hurt NVidia, and here is my proof:


Proof, but not really


That is pretty much what you said earlier: I accuse Nvidia of doing X and here is the proof, because they did Y (and in your case it wasn't even proof, just an accusation).

Yes, as you can see, the thing they were guilty of has no relevance to what they are now accused of.



Disclaimer: before anyone starts saying "AMD wouldn't do such a thing, bru, you're talking billyhooks" and the like, what I said above is just an example. :)

I have no idea; I haven't read much into anything about ATI/AMD back then.

I am guessing that you haven't properly read my posts either, though... My point isn't just about nvidia but about all companies and just how dirty they can be, and about not rejecting opinions because they don't sit well with certain people; I am just using nvidia in this instance considering the topic at hand.

Chances are, AMD are just as dirty with regards to getting DX12/async implemented in games given nvidia's predicament. For all we know, they too could be "pressuring" (take that whatever way you want) developers into doing this, and as you have shown with that link, they also have a history of this behaviour.

Also, didn't one or two developers come forward and say something about it being very time-consuming to work on DX12/async to get any noticeable difference? If that is true, then why bother, given how strict their deadlines and budget constraints are, especially when it will only really benefit how much of the GPU market?
 
Associate
Joined
13 Oct 2011
Posts
1,419
Location
Suffolk
I disagree that they "stopped optimising for the 780Ti".

My personal opinion is that AMD drivers are generally badly optimised. Not to say AMD cards are not competitive - they are, but this is through brute force. Hence the 7970 has much more compute power than the 680, and the 390x has more compute power than the 980, etc.

This is reflected in nVidia's apparently poor showing in DX12 - it's not really a poor showing; rather, their incredibly efficient DX11 driver left less fat to trim, less room for improvement.

This is also why AMD need async compute - to use resources in the GPU that would otherwise sit idle - and why nVidia haven't bothered with it, as there are no idle resources for them.

The 780Ti got as far as it could go, and nVidia then started optimising drivers for the new Maxwell GPUs. AMD meanwhile carried on optimising for GCN, with much further to go, so over time the 290x/390x/whatever it is named today overtook the 780Ti.
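
To put rough numbers on the compute-power point above (back-of-the-envelope only, using reference-clock spec-sheet figures, so treat them as approximate): peak FP32 throughput is roughly shader count x 2 ops (FMA) x clock. A quick sketch:

[CODE]
// Back-of-envelope peak FP32 figures behind the "more compute power" point.
// Reference-clock spec-sheet numbers only -- approximate, boost clocks ignored.
#include <cstdio>

struct Gpu { const char* name; int shaders; double clock_ghz; };

int main() {
    const Gpu gpus[] = {
        {"HD 7970", 2048, 0.925},   // vs GTX 680
        {"GTX 680", 1536, 1.006},
        {"R9 390X", 2816, 1.050},   // vs GTX 980
        {"GTX 980", 2048, 1.126},   // base clock
    };
    for (const Gpu& g : gpus)
        // peak GFLOPS ~= shaders * 2 (FMA counts as 2 ops) * clock in GHz
        std::printf("%-8s ~%4.0f GFLOPS peak FP32\n",
                    g.name, g.shaders * 2.0 * g.clock_ghz);
    return 0;
}
[/CODE]

Which works out to roughly 3.8 vs 3.1 TFLOPS for the 7970/680 and 5.9 vs 4.6 TFLOPS for the 390X/980 on paper - that's the brute force I mean.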
 

bru

Soldato
Joined
21 Oct 2002
Posts
7,359
Location
kent
I have no idea; I haven't read much into anything about ATI/AMD back then.

I am guessing that you haven't properly read my posts either, though... My point isn't just about nvidia but about all companies and just how dirty they can be, and about not rejecting opinions because they don't sit well with certain people; I am just using nvidia in this instance considering the topic at hand.

Chances are, AMD are just as dirty with regards to getting DX12/async implemented in games given nvidia's predicament. For all we know, they too could be "pressuring" (take that whatever way you want) developers into doing this, and as you have shown with that link, they also have a history of this behaviour.

Also, didn't one or two developers come forward and say something about it being very time-consuming to work on DX12/async to get any noticeable difference? If that is true, then why bother, given how strict their deadlines and budget constraints are, especially when it will only really benefit how much of the GPU market?

Oh well, it was worth a try. I can see from your response that you have completely missed the point of my post, and therefore what Sean has been saying all along.

One last try.

There was a dead dog outside on the path the other day, and because my cat likes to bring me birds every now and again, it must have been him that killed the dog.
 
Associate
Joined
10 Jul 2009
Posts
1,559
Location
London
I disagree that they "stopped optimising for the 780Ti".

My personal opinion is that AMD drivers are generally badly optimised. Not to say AMD cards are not competitive - they are, but this is through brute force. Hence the 7970 has much more compute power than the 680, and the 390x has more compute power than the 980, etc.

This is reflected in nVidia's apparently poor showing in DX12 - it's not really a poor showing; rather, their incredibly efficient DX11 driver left less fat to trim, less room for improvement.

This is also why AMD need async compute - to use resources in the GPU that would otherwise sit idle - and why nVidia haven't bothered with it, as there are no idle resources for them.

The 780Ti got as far as it could go, and nVidia then started optimising drivers for the new Maxwell GPUs. AMD meanwhile carried on optimising for GCN, with much further to go, so over time the 290x/390x/whatever it is named today overtook the 780Ti.

This is blasphemy!!!11! Fun fact: async compute is present in the Maxwell arch; it just needs to be enabled by the driver. Funnier fact: the driver team has been working on this issue since last year's summer :D
 
Permabanned
Joined
5 Apr 2006
Posts
7,699
[gif]
 
Associate
Joined
13 Oct 2011
Posts
1,419
Location
Suffolk
This is blasphemy!!!11! Fun fact: async compute is present in the Maxwell arch; it just needs to be enabled by the driver. Funnier fact: the driver team has been working on this issue since last year's summer :D

I bet when we see it, it won't do much to benefit them!!

In a way it makes AMD cards a better investment, as they will get better with time.
 
Associate
Joined
8 May 2014
Posts
2,288
Location
france
This is blasphemy!!!11! Fun fact: async compute is present in the Maxwell arch; it just needs to be enabled by the driver. Funnier fact: the driver team has been working on this issue since last year's summer :D

Funnier fact: async compute doesn't exist on Maxwell and never will. Async compute cannot be added through the driver; it needs hardware, and nvidia have already commented on it. Async brings mainly two improvements: core efficiency and latency.
Core efficiency cannot be improved by much, since nvidia GPUs are already maxed out in this regard; they utilise all the cores the best they can, and tweaking some more doesn't add that much more performance.
Latency - well, there's just nothing they can do; it needs to be hardware.
 
Caporegime
Joined
18 Oct 2002
Posts
32,624
Even funnier fact: async compute is not part of DX12; only compute and graphics queues are, which all Nvidia cards back to Fermi support.

Best performance from Nvidia's compute queues comes from large work batches. GCN can handle smaller batch sizes better, but at the cost of increased overhead.

Don't buy into the AMD PR machine. Async compute is AMD marketing spin for solving a problem that Nvidia doesn't suffer from to the same extent. GCN has a terrible time exploiting all its CUs, while Nvidia's architecture is much better at getting close to its theoretical performance.
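
For anyone wondering what the "compute and graphics queues" bit actually looks like, here's a rough D3D12 sketch (Windows only, error handling stripped): you create a DIRECT queue for graphics and, optionally, a separate COMPUTE queue. Whether submissions to the two actually overlap on the GPU is down to the hardware and driver, which is where the whole async compute argument lives.

[CODE]
// Rough D3D12 sketch: one graphics (DIRECT) queue plus a separate COMPUTE queue.
// Whether work on the two queues actually runs concurrently is up to the
// hardware/driver -- the API only exposes the queues themselves.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    // Default adapter; all error handling omitted for brevity.
    D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device));

    D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
    gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;      // graphics + compute + copy
    ComPtr<ID3D12CommandQueue> graphicsQueue;
    device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(&graphicsQueue));

    D3D12_COMMAND_QUEUE_DESC computeDesc = {};
    computeDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE; // compute + copy only
    ComPtr<ID3D12CommandQueue> computeQueue;
    device->CreateCommandQueue(&computeDesc, IID_PPV_ARGS(&computeQueue));

    // Command lists submitted to computeQueue *may* execute alongside work on
    // graphicsQueue; on hardware/drivers that can't overlap them, the work is
    // simply serialised -- which is still valid DX12 behaviour.
    return 0;
}
[/CODE]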
 

bru

Soldato
Joined
21 Oct 2002
Posts
7,359
Location
kent
I really do reckon that NVidia will be pushing the FP16 capabilities of the new Pascal architecture, and for games that are coded to take advantage of it, it could give a very significant boost to performance.
This could be the next battleground between AMD and NVidia: who persuades developers to use their favoured tech, async or FP16.
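
For a feel of what FP16 actually is (and isn't): halves are just a 16-bit float format, so you trade precision for half the storage and bandwidth, and the speculation about Pascal is that maths at that precision could run at up to double rate by packing two halves per 32-bit register. Here's a quick CPU-side sketch using the x86 F16C conversion intrinsics, purely to show the format and the precision loss; it obviously doesn't demonstrate any GPU-side throughput gain.

[CODE]
// CPU-side FP16 illustration only: round-trip floats through 16-bit halves to
// show the storage saving and the precision loss. Any GPU-side speed-up from
// native FP16 maths (the Pascal speculation above) is a hardware question and
// is not demonstrated here. Build with -mf16c (GCC/Clang) or /arch:AVX2 (MSVC).
#include <immintrin.h>
#include <cstdio>

int main() {
    alignas(16) float in[4]  = {1.0f, 0.3333333f, 1000.5f, 65504.0f}; // 65504 = max half
    alignas(16) float out[4];

    __m128  f32  = _mm_load_ps(in);
    __m128i f16  = _mm_cvtps_ph(f32, _MM_FROUND_TO_NEAREST_INT);  // 4 floats -> 4 halves
    __m128  back = _mm_cvtph_ps(f16);                             // widen back to float
    _mm_store_ps(out, back);

    for (int i = 0; i < 4; ++i)
        std::printf("%.7f -> half -> %.7f\n", in[i], out[i]);
    return 0;
}
[/CODE]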
 
Caporegime
Joined
18 Oct 2002
Posts
32,624
I really do reckon that NVidia will be pushing the FP16 capabilities of the new Pascal architecture, and for games that are coded to take advantage of it, it could give a very significant boost to performance.
This could be the next battleground between AMD and NVidia: who persuades developers to use their favoured tech, async or FP16.

GCN 1.2 already supports FP16.
 
Associate
Joined
2 Nov 2012
Posts
959
Location
Sussex
http://videocardz.com/59558/nvidia-geforce-gtx-1080-3dmark-benchmarks

3DMark score leaked for the GTX1080 - it is faster than an overclocked GTX980Ti and appears to boost to 1.86GHz!! :eek:

The GTX980Ti is faster clock-for-clock, but you need LN2 to hit similar clock speeds.
Negative: http://forums.overclockers.co.uk/showpost.php?p=27387417&postcount=1232

http://forums.overclockers.co.uk/showthread.php?t=18428457&highlight=Official

GPU scores don't look that impressive to me, with the caveat that we don't know what the stock clocks are on the 1080 and how far it can be pushed.
 
Caporegime
Joined
18 Oct 2002
Posts
32,624
If that is correct, 20-30% faster than a 980Ti is not great. Good upgrade for anyone on a 970 or lower. Could this be false?

Those benchmarks won't be a good indication of how well the 1080 can do: "3DMark11 Performance preset is rendered at 1280×720 resolution".

Pascal will pull out a bigger lead in more modern games, at higher resolutions and in more demanding situations. Then there are the usual driver maturity differences.

It's looking like the 1080 is decently faster than a 980Ti, as expected.

By certain people's reckoning, AMD should have launched Polaris back in November if Nvidia is six months behind AMD!
 