** The Official Nvidia GeForce 'Pascal' Thread - for general gossip and discussions **

Soldato
Joined
7 Dec 2010
Posts
8,318
Location
Leeds
I wouldn't consider a GTX 780Ti for £300 bargain basement. Significantly cheaper than it was, sure, but next to a GTX 970 for £250+ it's not even close, and 290Xs were going for £250 and less at the time too.

Like the GTX 980 and up now, the 780Ti was one of those cards that carried a lot of e-peen tax.

The only real problem with the 780Ti was that Nvidia decided to stop optimizing it to its full capability, whereas AMD kept optimizing the 290X. The 780Ti was neck and neck with a 980, and slowly, over time, they gimped it (stopped optimizing it and "accidentally" crippled it in some drivers, later stating they had found the issue and fixed it). The reality is Nvidia was trying to save its rear and make consumers buy the next generation, which in most cases was cheaper to manufacture, and that equalled more profit for them. A 780Ti was not a cheap piece of silicon on the 28nm process; the 980/970 was cheaper to manufacture, and of course they pushed sales on that.


The 780Ti was a fantastic card and had nice compute capability for serious work as well as gaming. Maxwell is a gaming card, and they tried to string it along as a single-precision compute card too... the reality was it was terrible at compute for the market that actually required compute.

Why do you think the Titan Z was sold at a silly price at first? Because they knew how good a compute card it was, and it was nothing more than two 780Ti's on a board. Yes, they got laughed at over the price, but nothing on the market that requires CUDA still touches it for DP compute.


Seems Nvidia realized this and decided to slowly gimp Kepler so people who required compute would upgrade to the next generation of real compute cards. I just think Nvidia plays a little dirty sometimes, and we will see it again soon with Maxwell: Pascal will come out and not be that great a jump in performance, but they will spin it again by gimping Maxwell (not optimizing it) and optimizing Pascal to show a lead. One thing I like about AMD over the last few generations is that they keep their optimizations across all their generations without gimping previous cards. That's probably just how the technology in their GPUs works, carrying over from one generation to the next, whereas Nvidia seems to make sure any optimizations they make for one generation don't do anything for the previous ones, or worse, make them fall even further behind than on earlier drivers.


Well, we will see soon enough how they behave. I wasn't very impressed with what they did to Fermi and Kepler, so we will see if they do the same to Maxwell. If they do, I may consider AMD next time, as I don't like companies that behave in this manner when people are spending their hard-earned money on expensive bits of technology that in some cases last only two years, when really they should last longer. If a TV or a fridge or other consumer goods slowly disintegrated over time in this manner, someone would be kicking up a fuss in some court. But Nvidia are smart: they make sure to show improvements over time, and then for some reason or another the cards suddenly stop getting better, or some new feature like GameWorks will cripple them, and they will say "sorry, this feature is really aimed at our newer range; tough luck, you own outdated technology that is only one generation old. Want the new feature? Update to a new card or turn the feature off", which they know will make people update anyway to see what they are missing.


780Ti sadly was a good example of how they took a fantastic bit of technology and they found ways to cripple it.
 
Soldato
Joined
18 Oct 2002
Posts
19,389
Location
Somewhere in the middle.
Of course they will... the market dictates they MUST. They can't and won't release an 8GB GDDR5X GPU that is SLOWER than a 980Ti; they wouldn't sell any, LOL! I don't even know how they'd manage that, but it won't happen, and they'd NEVER get away with it if they tried (which they won't). The million-dollar question is how much more powerful it will be AND at what price. 10% more for the same price as a Ti? No thanks (especially with the inevitable Ti price drop). 20%+ more for less money? Now we're talking. We shall find out soon enough anyway...

I think they will want longevity from their new product line and will just trickle-feed gimped crap with marginal speed increases for the next 3 years :p
 
Soldato
Joined
31 Dec 2006
Posts
7,224
I think they will want longevity from their new product line and will just trickle-feed gimped crap with marginal speed increases for the next 3 years :p

Why would they do that when they have an HBM2 card coming next year? They NEED to make the 1070/1080 appealing, otherwise people won't buy them and will hold out for that instead, just keeping their 980/980Ti... or simply buying one of those on the cheap (when the prices inevitably drop in June). They simply cannot afford to gimp these new cards. That said, I do believe they will do JUST enough to make them appealing, no more.
 
Soldato
Joined
9 Nov 2009
Posts
24,985
Location
Planet Earth
I don't think anything is gonna turn up that makes the 980ti look crap.

It should beat it, otherwise it will be the first time since the 9800GTX that an Nvidia high-end card of a new generation has not beaten the fastest single-GPU card of the previous one. Even in that case, they priced the card relatively cheaply compared to the 8800GTX.
 
Soldato
Joined
18 Oct 2002
Posts
19,389
Location
Somewhere in the middle.
It should beat it, otherwise it will be the first time since the 9800GTX that an Nvidia high-end card of a new generation has not beaten the fastest single-GPU card of the previous one.

I hope that's true, but I can't foresee anything particularly mind-blowing. Hopefully it happens, though. I would like something to turn up that makes 4K look as easy to drive as 1080p.
 
Soldato
Joined
9 Dec 2006
Posts
9,287
Location
@ManCave
I'm also reconsidering getting Pascal.

Because Overwatch, my new staple game, has 100% SLI scaling.

So to get sane performance I'd need one 1080 to beat two 980s. Possible, but I doubt it.

I'm getting 120fps at 4K max settings,
or 65fps at almost 6K (5760x3840).
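
For a sense of scale, that "almost 6K" figure is pushing nearly 2.7x the pixels of 4K. A throwaway C++ check, using only the two resolutions quoted above:

```cpp
#include <cstdio>

int main() {
    // Resolutions quoted in the post above.
    const long px4k = 3840L * 2160L;   //  8,294,400 pixels
    const long px6k = 5760L * 3840L;   // 22,118,400 pixels

    // "Almost 6K" is roughly 2.67x the pixels of 4K, which is why
    // 65fps there is arguably the heavier workload of the two.
    printf("pixel ratio: %.2fx\n", (double)px6k / (double)px4k);
    return 0;
}
```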
 
Soldato
Joined
14 Jul 2004
Posts
2,549
Kind of hoping Pascal is less driver optimisation dependent than Kepler and Maxwell, but I gather not much has changed.

Also hoping Polaris is either faster than expected or cheap as chips. Nothing gets Nvidia into line like a bit of competition. Folks may remember the prices on the GTX 260 and 280 dropped through the floor when AMD caught them on the hop with the 48xx series. It was one of the best years ever for GPUs.

I'm kind of under the impression Nvidia are in a bit of a rush to get to market. Always telling.
 
Soldato
Joined
19 Oct 2008
Posts
5,958
Fermi is evidence to the contrary. It took a whole new series of cards to polish that particular lump of... coal. ;)

What was wrong with it? Despite some negativity at the time (heat, if memory serves me well), I recall the Fermi cards were actually very good, and many still see them that way. How did the Fermi cards stack up against the competition at the time? :) Having owned early-release cards from both sides, I stand by my claim that they were well-developed products, both the hardware and of course the software (drivers).
 
Soldato
Joined
7 Aug 2013
Posts
3,510
Chill bro, it is only an internet forum, no need to get annoyed over the "internet"
I'm allowed to get annoyed at things, man. I'm not losing sleep over it, don't worry. I get annoyed at somebody pushing past me in a crowd, too. Whether it's on the internet or not is irrelevant.

And no, it is not evidence of them paying developers off (your charge against me...), but it shows that Nvidia have tried to apply pressure, so who knows what they might do or offer to other developers, especially for games that they "sponsor". You said there was no rational discussion or merit to claims that Nvidia pay companies to do said things, but that shows they already put "pressure" on developers, which could take various forms. PS: I never specifically said Nvidia were 'paying off developers', which seems to be your central charge at me ;)
Well, at least you admit it's not evidence of them doing what they were accused of here. That's a start.

So, question back at you: do you deny that Nvidia have, in the past, applied pressure to game developers to exclude features that benefit their competition?
That has nothing to do with what we were talking about. Here I was thinking you might finally start understanding the point, but nope, you're still going to sit here and try to say there's some merit to the argument because they did something else.

I'm not saying Nvidia is clean or anything. Never have. I am merely dismissing the notion that they are paying developers to get rid of DX12, as was claimed. Nothing more. Yet here you are, desperately clinging to the notion by thin wires, even though you yourself admit there's absolutely no evidence to support it. You tell me to 'keep an open mind' even though there's absolutely no reason to. You continually bring up this one thing they did in the past and present it as if it makes the accusation true, even though there is no actual evidence for it.

It's a lousy argument, man. That's all there is to it. Something with no evidence whatsoever is a bad argument. Yet here you are, continually trying to defend it anyway, however loosely. There is NO REASON to entertain the idea whatsoever. And it wasn't phrased as speculation: for one, it was completely *baseless* speculation, but more so, it was stated as fact, as if it had happened. Though of course they were wise never to respond on it, as they knew damn well they had nothing to back it up with. So don't take up a charge that somebody else was smart enough to back down from, man. It's a losing battle.

And I'm afraid a stupid analogy is still stupid, because you are comparing two different things: murder and theft. I am only being sceptical and highlighting the "possibility" that Nvidia has put "pressure" on developers to do said stuff, as they did with Oxide.
You haven't at all explained why the analogy is 'stupid'. It's clearly just your instinct to say so because it's inconvenient to your argument.

We are ALSO talking about two different things: putting pressure on a developer to turn off certain options in a benchmark test, and paying money to a developer to completely remove DX12 from their game. I think my analogy is absolutely, freakin' A+ spot-on here, man. Being guilty of one does not make it OK to accuse them of the other based *solely* on the fact that they were guilty of the former.

It's nonsense.
 
Soldato
Joined
7 Aug 2013
Posts
3,510
Kind of hoping Pascal is less driver optimisation dependent than Kepler and Maxwell, but I gather not much has changed.
This has little to do with the hardware and more to do with the software stack. Under DX11, drivers are always going to play a significant role. Under DX12, they play a much smaller role, though that places a lot more emphasis on developers to implement and optimize all the things the drivers used to take care of for them automatically (more or less).

Nvidia and AMD have really been a huge crutch for PC development for a long time now. DX12 is taking away that crutch, and there's a whole lot of positives and negatives that come with that, which aren't necessarily always going to balance out in our (the customer's) favor.
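
To make the "crutch" point concrete, here's a minimal C++ sketch of the kind of explicit CPU/GPU synchronisation DX12 pushes onto the developer; this isn't from any particular engine, and the `device` and `queue` parameters are assumed to have been created already:

```cpp
#include <d3d12.h>
#include <windows.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

// Under DX11 the driver tracked when the GPU was finished with submitted
// work. Under DX12 the application must fence and wait explicitly.
void WaitForGpu(ID3D12Device* device, ID3D12CommandQueue* queue)
{
    ComPtr<ID3D12Fence> fence;
    device->CreateFence(0, D3D12_FENCE_FLAG_NONE, IID_PPV_ARGS(&fence));

    // Ask the queue to signal the fence once all submitted work completes.
    const UINT64 fenceValue = 1;
    queue->Signal(fence.Get(), fenceValue);

    if (fence->GetCompletedValue() < fenceValue)
    {
        // Block the CPU until the GPU reaches the signal.
        HANDLE event = CreateEvent(nullptr, FALSE, FALSE, nullptr);
        fence->SetEventOnCompletion(fenceValue, event);
        WaitForSingleObject(event, INFINITE);
        CloseHandle(event);
    }
}
```

Under DX11 the driver inserted this kind of wait (and the associated hazard tracking) on your behalf; that's exactly the crutch being taken away.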
 
Permabanned
Joined
28 Nov 2006
Posts
5,750
Location
N Ireland
It should beat it, otherwise it will be the first time since the 9800GTX that an Nvidia high-end card of a new generation has not beaten the fastest single-GPU card of the previous one. Even in that case, they priced the card relatively cheaply compared to the 8800GTX.

I'm expecting 25% from the 980 to the Ti and another 25% from the Ti to the 1080, so roughly 50% from a GTX 980 to a GTX 1080. Anything less would be really, really dangerous to the PC segment; I think it would be a first, as you said.
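
(Strictly, two successive 25% steps compound to a bit more than 50%; a quick C++ sanity check, with the percentages taken from the post above:)

```cpp
#include <cstdio>

int main() {
    // Two successive 25% uplifts (980 -> 980Ti -> 1080) compound
    // rather than add: 1.25 * 1.25 = 1.5625.
    const double combined = 1.25 * 1.25;
    printf("combined uplift: %.2f%%\n", (combined - 1.0) * 100.0);  // 56.25%
    return 0;
}
```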


Performance and VRAM normally double. Shave off a PCI-E power cable and brand it with the FP64 and DX12_1 PR. Well, I would still have to buy that, to be honest; I'm sure other 980 owners will nod in agreement too. We're really knocking off these cables too; with NVMe drives there will be barely any left, which is sweet, I guess.


Still mad about HBM though, HBM = Dragons = Soon = Maximum milkage
 

bru

Soldato
Joined
21 Oct 2002
Posts
7,359
Location
kent
Nexus18: if I were to post up that AMD (who bought out ATI) were paying developers to add extra nonsense effects to games just to hurt Nvidia, and here is my proof:


Proof, but not really


That is pretty much what you said earlier: I accuse Nvidia of doing X and here is the proof, because they did Y (and in your case it wasn't even proof, just an accusation).

Yes, as you can see, the thing they were guilty of has no relevance to what they are now accused of.



Disclaimer: before anyone starts saying "AMD wouldn't do such a thing, bru, you're talking billyhooks" and the like, what I said above is just an example. :)
 