
AMD Confirms 20nm Products Moved to FinFET, Warns on Q2’15 Earnings

  • Thread starter: bru
What really killed 3dfx was mismanagement: too many stop-gap cards released just to get something out while Rampage was constantly delayed and pushed back. There was also a lot of spending on advertising, on buying other companies, on the STB plant in Juarez, Mexico, and on producing their own cards, basically giving the finger to AIB partners.

Shame really, good cards at the time.
 
I'm not talking about stock, I'm talking about coil whine.

Of course it does what it's designed for; my 290 also renders games, but it can't make me a cuppa :D

Hehe, I reckon a 290 would have a better chance of making a decent cuppa than a 970 would; it would certainly be faster at boiling the water. ;)
 
If a 970 made a cuppa, I reckon you wouldn't quite get the number of sugars you asked for. ;)
 
In 5 or 6 years this space will look very different. What you get for your money is diminishing: midrange cards costing £400, high-end cards now costing £600/£700+, midrange Intel CPUs costing £200 and £300.

Want a half-decent gaming rig these days? It will cost you £1000 and then some.

On the plus side, there is a higher cost, but components stay relevant for considerably longer. CPUs especially: in the late 90s and early 2000s I needed a CPU upgrade every 2 years minimum to play current games, whereas nowadays a £250 CPU will last 5+ years before an upgrade is needed (OK, a lot of people will upgrade before that, but there's no real need). GPU lifecycles are slightly shorter, but they still last longer than in the past. So it's swings and roundabouts really: more expensive each time you upgrade, but upgrade frequency is lower.
 
The gains in performance will plateau soon enough. I think everyone will be stuck on 10nm for a while. Intel's GPU performance comes (at the very high end) from exotic (read: expensive) on-package eDRAM and from packing in more compute units. Neither of those is the answer if you want to see cheap CPUs.

The 128MB eDRAM on the Iris Pro parts has been calculated to cost roughly $3.

For Skylake's Iris Pro part (due out in 2016), Intel is going to use 2x128MB eDRAM, since it's so effective.
 
Microsoft would probably have a hard time buying AMD, as it would mean every console sold by their direct competitors (Sony/Nintendo) would be putting money into Microsoft's pockets, since Microsoft would own the hardware going into them.

While on paper and in a dream world it would probably be brilliant, it's a dream nonetheless; it wouldn't happen, unfortunately.

I worked for MS for a few years; they are not the devil everyone makes them out to be. I've seen plenty of bedroom developers who made something that MS bought, and MS then employed those people to make other bits and bobs.

Also, yeah, they haven't really cared much about PC gaming; they never really had to, as they had a thriving console market. I think since the introduction of pay-monthly MMO-type games, and now the cash-shop-type games on PC, it's a revenue stream MS hasn't really tapped into, so it would make some sense for them to push in that direction as well.
 
I suspect the longer upgrade cycle is pushing prices up: they still need to increase their revenue, but consumers aren't buying as often. It could get worse if the pace of change slows even further. I can't see development between nodes speeding up as we get to 10nm and below. I expect the PC tech company landscape to change radically in the next 5-10 years.
 
They already said they don't expect to see a turnaround until 2016, so this isn't a big deal. In any case, the graphics division will always survive, so it's all rather inconsequential for the consumer (the CPU division might as well be dead at this point).
 
The point is that their new revenue forecast is now even lower; 2016 has nothing to do with this, which is why the share price plummeted.
 
Intel will kill NVIDIA eventually, in the GPU space at least.

iGPU performance from Intel is growing at an exponential rate; add to that their extremely advanced process technology, and in 5-6 years Intel's iGPU performance will be similar to high-end GPUs, IMO :)

Intel need more than performance to dominate the desktop GPU market; they just don't have the right "synergy" with their product approach (it does work at the lower end, which tends to favour simple, no-frills products). They could dominate by lending their fabs and resources to another entity that "gets" it, though.
 
The problem NVidia face is that mobile is such a big market now that more and more R&D is being pushed into mobile. On top of that, mobile players are buying up all the advanced fab capacity, leaving the comparatively tiny desktop market to run on outdated process nodes. If the desktop market keeps shrinking like it is, over time that will impact NVidia's R&D. We are already seeing technology advances appear on mobile first, and it's only going to get worse over time. The long-term trend isn't looking good for NVidia.

Good post, and so true! We are not the future.
 
I just think at times AMD needs to hire more competent marketing people, and I mean in terms of strategy. The Fury X launch could have been handled better, IMHO. If they had delayed it until stock was actually available, done another month or so of driver optimisations, and made sure that examples with the pump whine did not enter retail, the overall impression of the company would have been better.

It's not the first time they have launched decent products but bungled the details, all for the sake of maybe a month or so.

Let's hope they do a better job of the Fury launch! :p
 