• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

NVIDIA reports financial results for Q2 fiscal 2015.

Compared to the massive 4GB on the 290? Not trying to start an argument here, but you've not got one anyway :D

You can't have 4GB on a 384-bit bus.

Would 4.5GB on a 384-bit bus not be doable? Or 5GB? What about 3072MB? I thought memory just had to be a multiple of the bus?

It was always funny when the 9600 GSO had a 192-bit bus, so they came in 384MB, 768MB and 1536MB (while the flagship 9800 GTX+ only came in 512MB and 1024MB, lol).
 
How can Nvidia get away with some of those flat-out lies? I know they are trying to make themselves sound like major players in the mobile world when they have completely failed so far, but marketing lies don't belong in financial reports. It makes me wonder what else they are lying about.


“the first to bring GPU computing to mobile”
WTF is all I can say. They were the last, flat-out last; everyone else beat them by a generation or more, and in some cases by years.

Mobile, a massive growth area! How does a massive loss of market share (under 1% last time I looked) and a department running at a loss of hundreds of millions count as growth?

EDIT: I can see they lost 90% of their own market share, then gained some of it back, so they are technically up 200% over last year.
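A quick sanity check on that arithmetic (the baseline share here is purely illustrative, not a figure from the report): a 90% drop followed by "200% growth" still leaves you well below where you started.

```python
# Illustrative figures only: an arbitrary baseline market share of 10%.
baseline = 10.0                       # share before the decline (%)
after_drop = baseline * 0.10          # lose 90%, keep 10% -> 1.0%
after_rebound = after_drop * 3        # "up 200% over last year" means 3x
growth = (after_rebound - after_drop) / after_drop * 100

print(f"after drop: {after_drop:.1f}%")        # 1.0%
print(f"after rebound: {after_rebound:.1f}%")  # 3.0% -- still 70% below baseline
print(f"year-on-year growth: {growth:.0f}%")   # 200%
```

Which is the poster's point: a big percentage gain off a collapsed base isn't the same thing as recovery.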
 
How can Nvidia get away with some of those flat-out lies? I know they are trying to make themselves sound like major players in the mobile world when they have completely failed so far, but marketing lies don't belong in financial reports. It makes me wonder what else they are lying about.


“the first to bring GPU computing to mobile”
WTF is all I can say. They were the last, flat-out last; everyone else beat them by a generation or more, and in some cases by years.

Mobile, a massive growth area! How does a massive loss of market share (under 1% last time I looked) and a department running at a loss of hundreds of millions count as growth?

Is this rant based on the previous out-of-date information that you showed us?

We need Nvidia and AMD both making money to keep pushing each other. It would be nice to see another player in the desktop graphics market.
 
Would 4.5GB on a 384-bit bus not be doable? Or 5GB? What about 3072MB? I thought memory just had to be a multiple of the bus?

It was always funny when the 9600 GSO had a 192-bit bus, so they came in 384MB, 768MB and 1536MB (while the flagship 9800 GTX+ only came in 512MB and 1024MB, lol).


It is multiples of the bus, but 4.5GB would be a bit obscure. I'm sure there are reasons why it can't be done on GK110.


There were a few 192-bit cards with 1GB, though. This was done by using the three separate memory controllers with chips of varying sizes attached, instead of all the same size.
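The "multiple of the bus" rule can be sketched numerically. This is a rough illustration assuming the common layout of one 32-bit GDDR chip per 32 bits of bus; the chip densities listed are typical of the era, not figures from the thread:

```python
# Rough sketch: symmetric VRAM configurations for a given bus width,
# assuming one 32-bit memory chip per 32 bits of bus (common GDDR layout).
COMMON_CHIP_SIZES_MB = [128, 256, 512]  # typical per-chip densities (assumed)

def symmetric_configs(bus_width_bits):
    chips = bus_width_bits // 32             # e.g. 384-bit bus -> 12 chips
    return [chips * size for size in COMMON_CHIP_SIZES_MB]

print(symmetric_configs(384))  # [1536, 3072, 6144] -> 1.5GB, 3GB, 6GB
print(symmetric_configs(192))  # [768, 1536, 3072]
```

This is why a 384-bit card naturally lands on 1.5GB/3GB/6GB rather than 4GB or 4.5GB. The 1GB 192-bit cards mentioned above mixed chip sizes across the controllers, which breaks the neat multiple at the cost of non-uniform bandwidth across the buffer.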
 
Is this rant based on the previous out-of-date information that you showed us?

We need Nvidia and AMD both making money to keep pushing each other. It would be nice to see another player in the desktop graphics market.
What previous out-of-date information? The comments are taken from the link above, and it is not a rant. I just found it funny that they can be two years late to the mobile GPU compute market and say they are first. Nvidia only joined in halfway through 2014 with the TK1, while the other major players go as far back as 2012. Yet somehow that makes Nvidia first?

EDIT: It would be nice to see another player in the desktop gaming market, but I don't see that happening. The market is too small for a new player to enter given the cost and risk it would take to develop. We do have new player(s) in the non-desktop-gaming market, but that's a very different market.


Pottsey, do us all a favour and just vanish with your drivel
I don't understand what was stupid or careless about what I said.
 
Would like to know the ratio of profits in enthusiast GPU : tesla : mobile etc.
Not sure on profits, as there doesn't seem to be a breakdown of running costs for each department this time around. As an estimated guideline, I guess we could look at last year's department running costs and compare against this year's totals. That's not ideal, mind you.

The mobile market was $140 million, PC $850 million, and data and cloud $50 million.
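Taking those figures at face value, the split the earlier post asked about works out roughly as follows (these are revenue figures, not profits, so they only bound the ratio):

```python
# Revenue split from the figures quoted above (USD millions); revenue, not profit.
revenue = {"PC": 850, "mobile": 140, "data/cloud": 50}
total = sum(revenue.values())  # 1040

for segment, millions in revenue.items():
    print(f"{segment}: {millions / total:.1%} of reported revenue")
# PC dominates at roughly 82%, mobile ~13%, data/cloud ~5%.
```

Without per-department cost figures, as the poster notes, profit ratios can't actually be derived from this.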
 
Compared to the massive 4GB on the 290? Not trying to start an argument here, but you've not got one anyway :D
Had the GTX780 come with 6GB as default instead of 3GB, it would have given far more incentive for people (myself included) to consider buying it.

It's difficult to justify buying a GTX780 from a financial perspective when a 290 is at £260-£300, while the GTX780 3GB is £360+ on average and the recently launched 6GB version £420+.

If it was the GTX780 6GB at £360-ish (still nearly £100 dearer than the 290) rather than the 3GB version, it would at least let people feel like they are actually getting something extra for the premium they are paying.

I think there have been at least one or two occasions where I've seen members claiming they are running out of VRAM and performance suffering at 2560 res. Disagree as you may, but the corner Nvidia cut on VRAM DOES make the GTX780 age worse than it needed to, compared to the Titan with its 6GB of VRAM. One of the reasons the Titan aged so well is not its compute capability, but the extra VRAM it has over the newer GTX780 3GB...
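For context on the 2560-res claim, a quick back-of-envelope calculation shows the render targets themselves are tiny; it's texture and asset data that fills a 3GB buffer. The function below is a rough illustration, not a model of any particular game:

```python
# Back-of-envelope framebuffer sizes at 4 bytes per pixel (RGBA8).
# Actual VRAM use is dominated by textures and game assets, so treat
# these as small lower bounds, not estimates of total usage.
def framebuffer_mb(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / 1024**2

print(f"{framebuffer_mb(1920, 1080):.0f} MB")  # ~8 MB per 1080p target
print(f"{framebuffer_mb(2560, 1440):.0f} MB")  # ~14 MB per 1440p target
```

So resolution alone doesn't explain running out of 3GB; high-res textures and multi-monitor setups are what push it over the edge.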
 
PC gaming is dead, they say; PCs are obsolete with iPads and whatnot, they say. Don't get me wrong, I know Nvidia are involved with more than just desktop GPUs, but those are some pretty healthy-looking figures!
 
Had the GTX780 come with 6GB as default instead of 3GB, it would have given far more incentive for people (myself included) to consider buying it.

It's difficult to justify buying a GTX780 from a financial perspective when a 290 is at £260-£300, while the GTX780 3GB is £360+ on average and the recently launched 6GB version £420+.

You're forgetting, of course, that the 290 wasn't even in the pipeline when the GTX780 hit the market; at the time it was bridging the gap between the GTX680 and Titan, and matching the VRAM of AMD's flagship, the HD7970.
 
PC gaming is dead, they say; PCs are obsolete with iPads and whatnot, they say. Don't get me wrong, I know Nvidia are involved with more than just desktop GPUs, but those are some pretty healthy-looking figures!

Agreed. I was worried when the new consoles were released that the PC would suffer a big hit, but even AMD are slowly pulling themselves into the black, so the PC is looking good.
 
You're forgetting, of course, that the 290 wasn't even in the pipeline when the GTX780 hit the market; at the time it was bridging the gap between the GTX680 and Titan, and matching the VRAM of AMD's flagship, the HD7970.
Nope, I'm not forgetting anything, and your point doesn't apply to the GTX770, which launched nearly a year after the 3GB 7970 and still only offered 2GB of VRAM, despite being £30 more expensive at the time. It's not even about Nvidia vs AMD; I was just pointing out that Nvidia has always been TOO conservative with the amount of VRAM they offer, as if they want to ensure their product won't age well so people have to upgrade sooner (though from a business point of view I can totally understand it... had I been in their shoes, I'd do the same).

People have always argued that "the GPU will run out of grunt before it runs out of VRAM", but looking back at the 1.5GB GTX580, or people with 2GB GTX670/GTX680 SLI, I'm not sure I can really agree with that.
 
People have always argued that "the GPU will run out of grunt before it runs out of VRAM", but looking back at the 1.5GB GTX580, or people with 2GB GTX670/GTX680 SLI, I'm not sure I can really agree with that.

It was argued because it is true. Unless you're going multi-GPU, the VRAM on Nvidia cards is sufficient for the processing power the card holds. Unless games go the route of super-high-res textures (and there hasn't been a movement towards that yet), that pattern doesn't look like breaking. Exceptions exist, but they are the minority.

Again, it's worth reiterating that for multi-GPU the line blurs somewhat: more often than not the amount of VRAM is still sufficient for the processing power present, but it can become an issue in certain games at certain resolutions.
 
Nope, I'm not forgetting anything. It's not even about Nvidia vs AMD; I was just pointing out that Nvidia has always been TOO conservative with the amount of VRAM they offer

I never said it was about Nvidia or AMD, just that Nvidia were offering as much VRAM as their competitor's best product. It's not like they were being stingy or conservative with the VRAM; they just weren't being generous with it, keeping the normal amount instead of increasing it.


and your point doesn't apply to the GTX770, which launched nearly a year after the 3GB 7970 and still only offered 2GB of VRAM, despite being £30 more expensive at the time

Nice ninja edit, but you and I both know the 770 was not a new launch, just a re-release of a 14-month-old card with a clock bump and a new name. If you're going to complain about that, then you also have to complain about the R9 280X and R9 280 both launching with the same RAM as the GTX780, 5 and 10 months later respectively.
 
Apple is successful and makes hella lots of money, but that doesn't mean their products are value for money for consumers though :p

Value for money is relative; people pay high prices for Apple products because they are of high quality and they enjoy using them. I would rather pay £400 for a positive experience than £300 for a not-so-positive one.

AMD GPU buyers get too caught up in benchmarks and specs, I feel. Who buys a car purely based on lowest price or top speed?
 
Had the GTX780 come with 6GB as default instead of 3GB, it would have given far more incentive for people (myself included) to consider buying it.

It's difficult to justify buying a GTX780 from a financial perspective when a 290 is at £260-£300, while the GTX780 3GB is £360+ on average and the recently launched 6GB version £420+.

If it was the GTX780 6GB at £360-ish (still nearly £100 dearer than the 290) rather than the 3GB version, it would at least let people feel like they are actually getting something extra for the premium they are paying.

I think there have been at least one or two occasions where I've seen members claiming they are running out of VRAM and performance suffering at 2560 res. Disagree as you may, but the corner Nvidia cut on VRAM DOES make the GTX780 age worse than it needed to, compared to the Titan with its 6GB of VRAM. One of the reasons the Titan aged so well is not its compute capability, but the extra VRAM it has over the newer GTX780 3GB...

Firstly, the GTX780 isn't in the same tier as the Titan, so highlighting the fact it's newer is just confusing. Secondly, there aren't any corners cut. Why would you offer the same amount of VRAM as your top-tier product? If you need the 6GB frame buffer, you buy the Titan. I will disagree with you, as I ran 1440p with three 780s for 6 months and had no issues with the frame buffer. Going beyond that, you run into trouble on both the 780 and the 290X on occasion. As it stands, the only alternative AMD offer is produced by one of its AIBs in the form of an 8GB version. Nvidia have offered consumers 6GB since February 2013. A somewhat substantial time difference, so the prices aren't really comparable.

4GB isn't massively more than 3GB, so once again I don't think you've an argument there. It's been done to death.
 
Apple is successful and makes hella lots of money, but that doesn't mean their products are value for money for consumers though :p

Not quite sure what that has to do with anything I posted, or even this thread for that matter, but ho hum.

In complete honesty, as a gamer I care more about the products and their value for money than about how well Nvidia do for themselves (as I am no shareholder of theirs :D).

Very understandable and to be honest I feel the same way.


If they are going to price their card higher than their competitor's, the least I would expect is that they won't cheap out on the amount of VRAM. I'd rather see more VRAM on our graphics cards than more profits in Nvidia's accounts :p

Nvidia cheaping out on VRAM, that's a good one. Could you just list how many single-GPU gaming cards AMD have ever released with more than 4GB of RAM?

Er, that would be none then. As opposed to Nvidia... oh, that would be two then: the Titan and Titan Black.
 