AMD teasing new product, is it Radeon or FirePro?

Except Kepler was a refined Fermi and Maxwell is a refined Kepler.

The last big change for NV actually was the GT200 to Fermi jump. After that it was refinements.

Interestingly, TH did the most precise power measurements of Maxwell to date (they have the most advanced setup of all the English-language websites and measure dynamic power consumption), and it's worth reading their take on it all:

http://www.tomshardware.co.uk/nvidia-geforce-gtx-980-970-maxwell,review-33038-14.html

Yep, it's more of an evolution rather than a new architecture built from the ground up. I thought Maxwell was supposed to have an ARM co-processor at some stage? Probably just a rumor.

With GCN and Mantle though, it took time to add support for GCN 1.0 when it launched, and then the 285, even though it's a small refinement, doesn't work so well with current Mantle games. I just think it might be holding them back.
 
What did I say, eh? Here is the "leak" I told you all to expect ;)

http://wccftech.com/amd-embedded-roadmap-2014-2016-leaked-insight-gen-apus-gpus/

We can extrapolate from that the approximate timeframe for AMD's upcoming graphics products for the client segment based on the E10xxx series. This puts the timeframe for AMD's new cards between late Q4 of this year and early Q1 of 2015. Before then AMD will have to make do with game bundles and price cuts to compete with Nvidia's GTX 900 series. AMD is also poised to make an announcement on September 25th, so stay tuned for more info.

 
I believe when the GTX680 was released, some well-known review sites said AMD was doomed. Yet obviously they cannot be that doomed if Sony and MS just spent billions on new consoles using AMD tech.

Just like when the internets said Nv was screwed with the FX launch. People have conveniently short memories.

Intel was buggered with the P4 and Itanium and might go bankrupt eventually.

I believe someone at Merrill Lynch said AMD would soon be bankrupt back in 1989. It still hasn't happened, although especially on US forums the sky-is-falling talk seems much worse.

Apple will go bankrupt.

Pentax will go bankrupt.

And so on.

As usual the internet is full of doomsayers.

It's more likely the doomsayers will cause the doom through some kind of Osborne Effect (the company is going bankrupt, so don't buy their products).

...or like when AMD released Bulldozer and everybody said they were doomed, and they have pretty much given up on the CPU front ever since? People aren't saying that they're going to go out of business, but losing lots of market share is a very real possibility. AMD are as good as dead atm in the CPU market if you're after high-end performance.

I don't think their GPU division will do as badly as their CPU division, but AMD's response to "being doomed" before has always been to significantly undercut their competitor... NVidia haven't left much room for that this time, and if AMD only undercut them slightly the saving is unlikely to be worth it, as they're likely to be selling an all-round worse product (heat/noise/power usage/software support etc).

Perhaps this is NVidia responding to Mantle. They've probably been happy to accommodate AMD as a 'cheap alternative' before, but now they see AMD trying to corner the market with Mantle and they've decided to go for the throat and take away AMD's primary customer base (the bang-for-buck brigade).
 
The GM200/GM210, like all Nvidia top-end GPUs, won't scale as well in performance, as they are not single-use designs for gaming. They are designed for mixed gaming/GPGPU workloads with an emphasis on DP, just like Hawaii, so there is added die area and power consumption devoted to non-gaming functionality.

Just like with the GK104 and GK110, GF100 and GF104, GF110 and GF114, GT200 and G92, and so on. The top-end GPUs on the same node have lower performance/mm2 and generally worse performance/watt. Even look at Tahiti against the GK104. All doom and gloom when the GTX680 was launched, yet the former was not designed only for gaming. However, compare the GK106 and Pitcairn, and in terms of performance/watt and performance/mm2 they were similar.
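For anyone who wants to see what performance/mm2 and performance/watt actually boil down to, here is a minimal sketch. The die sizes, board powers and relative performance figures below are approximate placeholders rather than measured data, so treat it as an example of the arithmetic, not as a benchmark:

[CODE]
# Illustrative perf/mm2 and perf/W comparison (Python).
# The die sizes, board power and relative performance below are rough
# placeholder values, not measured results - swap in real benchmark data
# to do the comparison properly.

cards = {
    # name: (relative performance, die area in mm2, board power in W)
    "GK104 (GTX 680)":    (1.00, 294, 195),
    "GK110 (GTX 780 Ti)": (1.35, 561, 250),
}

for name, (perf, area_mm2, power_w) in cards.items():
    print(f"{name}: {perf / area_mm2:.5f} perf/mm2, {perf / power_w:.5f} perf/W")
[/CODE]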

I have been an enthusiast since 2003, and every year it's either AMD or Nvidia that is doomed. It really is that old.

Edit!!

That is assuming the GM200/GM210 is 28nm.

If Nvidia makes a small run at 20nm priced at £1000, and 99% of their range is still 28nm for another year, expect further cries that AMD is doomed since they are one node behind.
There is only one problem here: scaling from the GK104 up to the GK110 is near perfect. I have tested this myself as I own 4 of each. You can predict the likely performance of any Kepler card quite easily if you know how many SMX modules it has and what clockspeeds are used; that is how well they scale.

What is sometimes misleading when high-end cards are reviewed is the test setup used; often the cards are bottlenecked, which makes the faster cards look bad.

You can do the same with Hawaii and Tahiti too for scaling, as these cards are very similar. I even predicted to humbug, before I owned a 290X, what it would score on his Valley bench and what clockspeeds would be needed; I was within a couple of points when I benched it for real.
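If anyone wants to try that kind of estimate themselves, here is a minimal sketch of the idea (performance scaling roughly with SMX count x clock speed). The reference figures and the example GK110 configuration are illustrative assumptions, not the actual numbers from the post above:

[CODE]
# Rough "SMX count x clock" scaling estimate for Kepler cards (Python).
# The reference values are illustrative assumptions (an 8-SMX card at
# ~1006MHz scored as 100), not measured data.

def estimate_score(smx_count, clock_mhz,
                   ref_smx=8, ref_clock_mhz=1006, ref_score=100.0):
    """Scale a reference score by SMX count and core clock."""
    return ref_score * (smx_count * clock_mhz) / (ref_smx * ref_clock_mhz)

# Example: a hypothetical 15-SMX GK110 card running at 928MHz
print(round(estimate_score(15, 928), 1))   # ~173, i.e. ~73% above the reference score
[/CODE]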
 
Looking at the last few years of GPU releases.

2002-2003: ATI is doomed due to the cancellation of the 8500XT and Nvidia having the Ti4200 and Ti4600.

2003-2004: Nvidia is doomed since the R300 is first to DX9 and faster than the FX series.

2004-2005: ATI is doomed since the 6000 series Nvidia cards support DX9C and theirs don't, and Nvidia has lower power consumption.

2005-2006: Shifts between ATI is doomed and Nvidia is doomed.

2006-2007: ATI is doomed as the G80 and G92 have better performance and lower power consumption than the HD2000 and HD3000 series.

2008-2009: ATI is first doomed since they have nothing to compete with the GT200. Then AMD launches the R700, and now Nvidia is doomed as Nvidia has to price-cut huge chips against smaller AMD ones.

2009-2010: Nvidia is doomed since ATI/AMD is first to DX11 and has lower power consumption and smaller dies than the GTX400 series. Although soon, with the GTX460 series, AMD is doomed.

2010-2011: AMD is partially doomed since they don't have the fastest card anymore, and Nvidia has better tessellation.

Late 2011 to early 2012: AMD is doomed due to the HD7970 and even more doomed with the GTX680.

Late 2012 to mid-2013: AMD is doomed as they have nothing to compete with once the Geforce Titan and GTX780 are released.

Late 2013: AMD is still doomed with the R9 290 and R9 290X releases, since Nvidia quickly launched the GTX780TI and the AMD cards have black screens, throttle, run too hot and explode all the time.

Late 2012 to 2013: Nvidia is doomed due to AMD winning console contracts.

Early 2014: AMD is doomed due to the GTX750TI.

Late 2014: AMD is doomed due to the GM204.

Potential next doom point: Nvidia releases a 20nm GM200/GM210 in small quantities at £1000, and even if AMD has the fastest card with the R9 390X at £500 before then, AMD is still doomed.

...or like when AMD released Bulldozer and everybody said they were doomed, and they have pretty much given up on the CPU front ever since? People aren't saying that they're going to go out of business, but losing lots of market share is a very real possibility. AMD are as good as dead atm in the CPU market if you're after high-end performance.

I don't think their GPU division will do as badly as their CPU division, but AMD's response to "being doomed" before has always been to significantly undercut their competitor... NVidia haven't left much room for that this time, and if AMD only undercut them slightly the saving is unlikely to be worth it, as they're likely to be selling an all-round worse product (heat/noise/power usage/software support etc).

Perhaps this is NVidia responding to Mantle. They've probably been happy to accommodate AMD as a 'cheap alternative' before, but now they see AMD trying to corner the market with Mantle and they've decided to go for the throat and take away AMD's primary customer base (the bang-for-buck brigade).

It's ACTUALLY more the case that people deliberately ignore the last decade and think the doom will happen.

It does make me wonder at times, since all these doom predictions have been endlessly paraded before.
 
There is only one problem here: scaling from the GK104 up to the GK110 is near perfect. I have tested this myself as I own 4 of each. You can predict the likely performance of any Kepler card quite easily if you know how many SMX modules it has and what clockspeeds are used; that is how well they scale.

What is sometimes misleading when high-end cards are reviewed is the test setup used; often the cards are bottlenecked, which makes the faster cards look bad.

You can do the same with Hawaii and Tahiti too for scaling, as these cards are very similar. I even predicted to humbug, before I owned a 290X, what it would score on his Valley bench and what clockspeeds would be needed; I was within a couple of points when I benched it for real.

It isn't though, when you consider the increase in transistors for said performance and the decrease in performance/watt too. You will get reductions in performance/mm2 and performance/watt for a 28nm GM200 against a GM204 based on the same uarch.

The following is just a general statement BTW.

This is why the "sky is falling" stuff is hilarious when people deliberately ignore the GT200 vs G92, GF100 vs GF104, GF110 vs GF114 and GK110 vs GK104.

Hawaii and Tahiti are both compute cards since they have far higher DP performance. Try comparing something like Pitcairn and Cape Verde with Tahiti, for example. Improved performance/watt for one, but far less DP performance and much smaller chips.

GK104 vs GK110 - the GK104 has better performance/watt and performance/mm2. The GK110-based cards have to be clocked comparatively lower to maintain performance/watt.

Like I said, the same old arguments about AMD being doomed and Nvidia being doomed have happened for years.

Move back 10 years and you can see all this doom and gloom said about different GPUs.

Some of the people here forget the awful FX series, and how there were multiple "Nvidia is screwed" predictions too. Even Fermi, with its massive chips and large power consumption, was meant to doom Nvidia. ATI/AMD was doomed because of the G80 and G92, and their HD2000 and HD3000 series, which could not compete. They're all still here. I for one never predicted any doom since I thought it was silly, and more importantly I want both companies pushing each other! I also want a choice.

Apple is doomed, Android is doomed, and so on.

Plus, a broken record can always end up being right when it comes to doom predictions.

Just like the person on the side of the road saying the end is nigh for 30 years until it actually happens.

Is he an oracle or just a nutjob? Their word against yours.

Edit!!

But if you people really want to feel depressed, I can start my own doom and gloom scenario.

Discrete cards are decreasing in sales each year, and BOTH AMD and Nvidia are fighting over fewer and fewer sales while depending more and more on compute and pro cards.

Except Intel is now improving its graphics massively each generation, eating away at the low end, and with MIC is now poised to enter the compute market in a BIG way.

Unlike Nvidia and AMD, they have billions of dollars to buy market share (look at the new Atom) and could probably eff up AMD and Nvidia in a big way if they wanted to.

Those big monolithic GPUs you all love on this forum are primarily developed for the markets Intel is entering, with the runts offloaded to gamers.

Intel is maintaining margins through increased investment in services and commercial computing, and this is why they want to get a foothold in the compute market.

So people should enjoy the fact they have a choice now, as Intel is only making baby steps.

Might not be the case in 5 to 10 years, especially with process nodes being drawn out. Intel spends more than TSMC and GF combined ATM just on process node development.
 
Yeah, I don't think anyone who has been in this game longer than 5 minutes thinks "AMD are doomed".

GPU technologies develop closer between rivals than some realise.

Most recently Tonga with its texture compression technology. "Oh, that's clever" - one month later the GTX 970/980 do the same thing...

It's more a case of who is first out of the starting blocks rather than who does it and who doesn't.

For more than a decade GPUs have been pretty much the same between Nvidia and AMD; they only really differ around architectural upgrades. Like now. Each time, some n00b predicts one or the other is doomed because of it. ;)


Completely and totally agree.

These things take years to develop; there is no way of just saying "oh, we'll counter the opposition with this" unless it has been in the making for a good long while.


The same can be said for FreeSync and GSync, Mantle and DirectX12 etc.


The only one that I would even suspect might be a bit of a knee-jerk reaction would be FreeSync, but even then I doubt it; they have probably been mulling over the idea for a good while.
 
Yeah, I don't think anyone who has been in this game longer than 5 minutes thinks "AMD are doomed".

I don't think anybody in this game, be it consumer or otherwise, wants anyone to be doomed; the GPU scene as a whole is boring right now, and it would be even more so with less competition.

One thing's for sure: 2014 will go down as one of the most boring years for GPUs - a mid-range card that, erm, matches what's already available, and a top-tier card that marginally beats what was already on the market. Pricing and power aside, it's been dull on the outright performance front. We're in urgent need of a die shrink and stacked memory.
 
It isn't though, when you consider the increase in transistors for said performance and the decrease in performance/watt too. You will get reductions in performance/mm2 and performance/watt for a 28nm GM200 against a GM204 based on the same uarch.

The following is just a general statement BTW.

This is why the "sky is falling" stuff is hilarious when people deliberately ignore the GT200 vs G92, GF100 vs GF104, GF110 vs GF114 and GK110 vs GK104.

Hawaii and Tahiti are both compute cards since they have far higher DP performance. Try comparing something like Pitcairn and Cape Verde with Tahiti, for example. Improved performance/watt for one, but far less DP performance and much smaller chips.

GK104 vs GK110 - the GK104 has better performance/watt and performance/mm2. The GK110-based cards have to be clocked comparatively lower to maintain performance/watt.

Like I said, the same old arguments about AMD being doomed and Nvidia being doomed have happened for years.

Move back 10 years and you can see all this doom and gloom said about different GPUs.

Some of the people here forget the awful FX series, and how there were multiple "Nvidia is screwed" predictions too. Even Fermi, with its massive chips and large power consumption, was meant to doom Nvidia. ATI/AMD was doomed because of the G80 and G92, and their HD2000 and HD3000 series, which could not compete. They're all still here. I for one never predicted any doom since I thought it was silly, and more importantly I want both companies pushing each other! I also want a choice.

Apple is doomed, Android is doomed, and so on.

Plus, a broken record can always end up being right when it comes to doom predictions.

Just like the person on the side of the road saying the end is nigh for 30 years until it actually happens.

Is he an oracle or just a nutjob? Their word against yours.

Edit!!

But if you people really want to feel depressed, I can start my own doom and gloom scenario.

Discrete cards are decreasing in sales each year, and BOTH AMD and Nvidia are fighting over fewer and fewer sales while depending more and more on compute and pro cards.

Except Intel is now improving its graphics massively each generation, eating away at the low end, and with MIC is now poised to enter the compute market in a BIG way.

Unlike Nvidia and AMD, they have billions of dollars to buy market share (look at the new Atom) and could probably eff up AMD and Nvidia in a big way if they wanted to.

Those big monolithic GPUs you all love on this forum are primarily developed for the markets Intel is entering, with the runts offloaded to gamers.

Intel is maintaining margins through increased investment in services and commercial computing, and this is why they want to get a foothold in the compute market.

So people should enjoy the fact they have a choice now, as Intel is only making baby steps.

Might not be the case in 5 to 10 years, especially with process nodes being drawn out. Intel spends more than TSMC and GF combined ATM just on process node development.

Which games benefit from the improved DP performance?
 
Looking through the 970 owners' thread, I've not seen a single 970 beat my 290 by a decent amount of FPS, and most of them actually lose to my 290 despite being clocked at 1.5GHz or higher.

From a performance point of view these new Nvidia cards are no threat to AMD; the only thing they do have is low power consumption.

From what I've seen, the 970 often punches around the 290X's weight, so even though the 290X does win more often than not, after taking everything into account I'd be happier with a 970 rather than a 290X.
 
...that doesn't make sense :confused:

He's saying the GTX 970 has the ballpark performance of a 290X while using a fraction of the power and not running as hot as a thousand suns. So overall, even if the 290X is a lil faster in some things, the advantages of not needing to open all the windows in the house while gaming, plus saving on the power bills along with some nice Nvidia features, make the GTX 970 the better choice for him.
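As for what the power bill saving might actually amount to, here's a quick back-of-the-envelope calculation. The board power figures, gaming hours and electricity price are all assumptions purely for illustration:

[CODE]
# Rough annual running-cost difference between two cards (Python).
# All inputs are assumptions for illustration only.

power_290x_w  = 290     # assumed typical gaming board power for an R9 290X
power_970_w   = 160     # assumed typical gaming board power for a GTX 970
hours_per_day = 3       # assumed gaming time per day
price_per_kwh = 0.14    # assumed electricity price in GBP per kWh

saving_kwh = (power_290x_w - power_970_w) * hours_per_day * 365 / 1000
print(f"~{saving_kwh:.0f} kWh/year, roughly £{saving_kwh * price_per_kwh:.0f}/year")
[/CODE]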

You're welcome.
 