AMD Bulldozer Finally!

Indeed, I think there are many gamers out there wanting these bad boys for Battlefield 3. I wish them all the luck; here's hoping AMD deliver!

My Phenom II X4 does everything I need it to, whether that's gaming or audio encoding at a decent clip. I honestly couldn't care less about the Bulldozer complaints/speculation/discussion, as I don't upgrade more often than a three-year cycle at most.

I do want good competition of course, like most people, so I would caution against slagging off AMD given they're the only reason Intel don't have complete market control. Like someone said in here earlier, spinning off their fabs was a mistake, and they also paid too much for ATI instead of, say, investing in production capacity.
 
Sure, I think the reason people on here are behaving like this over Bulldozer is that, official or not, it seems every month someone says "It should be coming out next month"... next month comes... "It should be coming out next month". It's been like that since this thread started.

Of course most of this is speculation, so it's their own fault for getting worked up. I think it's mostly down to the fact that "AMD people" haven't had a CPU update in a while, so a LOT of people are awaiting Bulldozer.

The very fact Intel have chosen to delay their entire CPU line-up as far as they have makes me think they already know what AMD has in store and aren't worried.

What? Almost everything you say is a contradiction. AMD gets all this hype and it's irksome, but it isn't AMD making the hype; it's Intel who constantly leak info and early benchmarks, and you like that?

Fusion and Bulldozer were mentioned years ago; that's how long real architecture change takes. Before Core 2 Duo hit the desktop market, the Core architecture had been known about for years and years via the mobile market. That's life: big change takes time, end of. As for Bulldozer, it's a name. Fusion was delayed for many reasons, the industry not being ready for it among them, and everyone knew from day one that Fusion was an APU. Everyone.

AMD don't release benchmarks because they'd suck; they are happy to be 10% slower but 50% cheaper. Yet you hope AMD make something that blows Intel's socks off so the consumer gets cheaper products?

Here's a clue: the 2500K is £145 purely because a hex-core CAN compete with it and is also £145. There is no reason the 2500K is so cheap other than competition from AMD. New chips will cost more; they won't change how much a wafer costs, how much each die costs, or how much final chips cost, nor drop Intel's prices. AMD have no interest in making something as fast as a 2600K and pricing it at £140. It's bad for them; they'd lose money, not on each chip, but R&D plus manufacturing costs won't be covered if they sell chips at cost. The "10% slower and much cheaper" approach is exactly why almost every one of Intel's prices sits where it does today, so more competitive new chips from AMD will cost more; they won't drop Intel's pricing.
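To put rough numbers on the "they'd lose money at cost" point, here's a back-of-the-envelope sketch. Every figure below is invented purely for illustration (real wafer costs, yields, R&D spend, and volumes aren't public):

```python
# Back-of-the-envelope CPU pricing economics.
# Every number here is made up for illustration -- NOT real Intel/AMD figures.

wafer_cost = 5000.0           # cost to process one wafer
dies_per_wafer = 200          # candidate dies printed per wafer
yield_rate = 0.70             # fraction of dies that work
rnd_cost = 500_000_000.0      # architecture R&D to amortise
expected_volume = 20_000_000  # chips expected to sell over the product's life

good_dies = dies_per_wafer * yield_rate
manufacturing_cost = wafer_cost / good_dies   # per good die
rnd_per_chip = rnd_cost / expected_volume     # amortised R&D per chip

break_even_price = manufacturing_cost + rnd_per_chip
print(f"manufacturing cost per chip: {manufacturing_cost:.2f}")
print(f"amortised R&D per chip:      {rnd_per_chip:.2f}")
print(f"break-even price:            {break_even_price:.2f}")
```

The shape of the argument is what matters, not the specific numbers: selling near the per-die manufacturing cost leaves the amortised R&D unrecovered, which is why "as fast as a 2600K for £140" makes no sense for AMD.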

AMD were quiet on 4870 performance, on the 5870, and on the 6970; that's the way they do business.

As for tweaking Sandy Bridge-E to be faster: when will people realise that a CPU tapes out anything between 12 and 18 months before launch? Clock speeds will be validated and tested likely up to 6 months before launch, and 99% of the time any clock speed changes will be downward due to process issues, not upward. They can't just bung in an extra core or eke a little more performance out of it if Bulldozer blew it out of the water. Likewise, it isn't late just because Intel don't care about releasing it.

Here are several reasons why. Firstly, Sandy Bridge-E has problems, known problems, and Intel's latest official info shows a distinct lack of features they'd previously said would be on it. Roadmap-wise they said Ivy Bridge would be out this year; it won't be. A 22nm chip that's all but identical to a 32nm chip will be almost half the size; sell that for the same price and you make hugely higher profits. 22nm Ivy Bridge is NOT about more performance for the consumer, it's about higher profits for Intel... hence there's literally no reason you'd ever delay it because AMD can't compete.
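The "almost half the size" claim is just area scaling: if features shrink linearly from 32nm to 22nm in both dimensions, die area scales with the square of the node ratio. A quick sketch (idealised scaling; real shrinks don't scale every structure perfectly, and the 216mm² quad-core Sandy Bridge die size used here is approximate):

```python
# Idealised die-shrink arithmetic. Real processes don't shrink every
# structure linearly, so treat the result as a best case.

old_node, new_node = 32.0, 22.0          # process nodes in nm
area_ratio = (new_node / old_node) ** 2  # area scales with the square
print(f"area ratio: {area_ratio:.2f}")   # ~0.47, i.e. almost half

# Nearly half the area means roughly twice the dies per wafer,
# so roughly half the manufacturing cost per die at equal yield.
old_die_area = 216.0                     # mm^2, approx. quad-core Sandy Bridge
new_die_area = old_die_area * area_ratio
print(f"{old_die_area:.0f} mm^2 -> {new_die_area:.0f} mm^2")
```

Same chip, same selling price, about half the silicon per unit: that's the "higher profits for Intel" point in one multiplication.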

In terms of going from the 980X to a Sandy Bridge-E variant: if it extends the performance lead over AMD from, say, 50% to 80%, they can simply charge more for it. Considering every 980X wafer takes up space that could be a Sandy Bridge-E wafer start, again the only thing this is hurting is Intel's wallet.

Intel aren't out to compete with AMD; they won, years ago. If AMD had a chip 5 times as fast as Intel's, they could still only produce 20% of what the market needs; the other 80% would buy Intel because there wasn't any other choice. Intel are out to improve their margins, and new, faster chips increase their margins. It's that simple.

Their delays are because of problems. Intel have said there are problems, let alone everyone else who knows anything about it saying there are problems.

Again, the fundamental reason I posted was someone saying they wouldn't buy from AMD because of delays, when EVERY company in this industry has delays, and Intel have exactly the same level of delays as AMD right now, possibly even worse.
 
Since when was the 2500K £145?
Intel charge practically whatever they like; you can get a hex-core, even AMD's flagship 1100T, cheaper than a 2500K. Intel aren't swayed by Thuban, hence why they charge so much for the 2600K.
OcUK's £155 pricing of the 2500K isn't the norm; on average a 2500K is £165.
 

Ok, ok, maybe I didn't word myself very well, here it goes again ;)

AMD gain hype by releasing very little info ("We have a new architecture called Bulldozer"), then years later they talk about the architecture, then another year later they demo Dirt 3 (I'm generalising here).

Intel build hype by first announcing their architecture and detailing the changes. Then a year later they release info about the chip's actual specs and what it is capable of. Also, a lot of info gets leaked in the meantime (inadvertently by Intel, most likely).

I prefer Intel's way of doing business, as they introduce their architecture and their line-up early on. Intel plan ahead and their roadmaps are generally quite thorough. The difference is that Intel talk about something to come in 1-2 years, while AMD talk about something to come in ~5 years. This is mostly due to the fact that, as you say, Intel is a lot bigger; they can afford to plan ahead and have everything worked out.

AMD are the opposite: they talk a bit about their architecture and then, very late down the line, actually publish details on it. Look at Bulldozer's specs; they have only just been confirmed by AMD. A perfect example is the comparison between the second-generation i7 and Bulldozer: the i7's specs were known almost a year before launch via official Intel slides, while Bulldozer's specs have only recently been confirmed (the previous specs were rumours/speculation).

As for AMD Fusion, this is precisely the waffle I am talking about from AMD:

http://www.youtube.com/watch?v=RBwiCaPuU4w&feature=results_video&playnext=1&list=PL26090B026BAB7DD0

What?...

"Fusion, it's the energy of innovation"... "Until this moment, you thought Fusion was the coming together of the CPU and graphics technology on a single chip... but it's MORE than that... MUCH MORE".

See what I mean? Corporate BS. Don't tell me AMD don't overhype things ;). The sad thing is, people fall for it, as I think has happened with Bulldozer. (Don't flame me! I'm not saying Bulldozer will be rubbish at all; I am simply pointing out that AMD's hype machine often builds upon little substance.)

"AMD were quiet on 4870 performance, on the 5870, and on the 6970; that's the way they do business." Yes... but that's exactly what I mean. On every occasion, NVIDIA release a GPU which is faster. That isn't me being an NVIDIA fanboy or anything; it's a simple fact, look at benchmarks. It is exactly what I mean when I said AMD don't have much of a point to make UNTIL they release the product and people compare bang for buck. AMD have always been about the value/performance ratio; Intel/NVIDIA have always been about the performance/sod-the-cost ratio. Their products are always that little bit faster, and as such they always charge a lot more for them.

When I said I want AMD to beat Intel for a change, yes I do... what's wrong with that? If AMD produced some competitive products, not on a value basis, but on a true performance basis, I honestly believe Intel would start to lower their prices.

“AMD don't release benchmarks because they'd suck, they are happy to be 10% slower but 50% cheaper, yet you hope AMD make something that blows Intel's socks off so the consumer gets cheaper products?

Here's a clue, the 2500k is £145 PURELY because a hexcore CAN compete with it and is also £145 , there is no other reason that the 2500k is so cheap other than competition by AMD.”

I never said AMD's benchmarks “suck”; I said they would disappoint people, as has been proven on this very thread. The second people see that Bulldozer is a bit slower than Sandy Bridge in a leaked benchmark it’s “Oh, that’s disappointing”, yet when it comes out and it’s 50 quid cheaper people go “Ah! No-brainer!”

What I meant is that if AMD DID release a CPU faster than the 2600K and it was say… 50 quid cheaper, Intel would have to lower their prices because most people wouldn’t buy a 2600K.

AMD compete with the 2500K, as you rightly point out; however, the 2500K is still generally a better CPU than the Phenom. The problem is that AMD never compete with Intel’s top-end CPU. If AMD released a Bulldozer that was a lot faster than even the Sandy Bridge-E series, do you think Intel would charge so much? Of course not; their high-end products would become cheaper.

However, if that were the case, I think AMD would have demonstrated some benchmarks, real benchmarks, showing Bulldozer being faster than a 2600K, but they haven’t. All they have shown is Dirt 3-type demos. That sends the message “Well, our CPU can play games, but we are too scared to show you how good it is at number crunching”. They are following suit, as they always do: they leave it until release day, consumers get their hands on it, realise it isn’t “as” fast as the competitor but is actually a fair bit cheaper, and therefore aren’t unhappy. For someone like me who likes to have the best components, though, I wish AMD would release a CPU which wasn’t just good value, but one which forces Intel to lower prices on their high-end products.

EDIT: Also, a CPU from AMD which competes with Intel's flagship means both AMD and Intel would work much harder to make their chips faster than each other, which is good for us consumers.

As for your main reason for posting, yes, I understand what you mean and I agree with you. It is completely daft to say you wouldn't purchase from a company ever again because they delay something. As you say, delays happen. I think most people on here are upset because they have been under the misconception that Bulldozer launches "next month", as quoted, every month; the problem is that officially AMD have said very little about a release date, so people just speculate all the time.
 
"On every occasion, NVIDIA always release a GPU which is faster."

6990 disagrees.

6990 is 2 GPUs, not 1. As for the 6990 vs GTX 590, neither is "faster", they are both faster at different games/applications with the majority going to the GTX 590. The 6990 does compete well with the GTX 590, but I'm talking about pure GPU performance, i.e. 6970 vs. 580.
 
Many would say the 6990 is faster.
Given the 6990 is two lower-end single GPUs, and it can even surpass the GTX 590, which is two 580s, and is cheaper, which is the better card? ;)

But yeah, Nvidia tend to have the fastest single GPU.
But AMD versus Intel isn't as clear cut as AMD versus Nvidia.
 
That completely depends on which sites you read. Bit-tech were the only site claiming the GTX 590 to be faster than the 6990; nearly everywhere else said it was the 6990, and that became even clearer when nVidia had to drop the 590's performance to stop them blowing up. In a lot of people's opinion, the number of GPUs is only irrelevant when nVidia aren't at the top with their dual-GPU card.

That wasn't really my point, though; I was just saying it in response to your absolute claim, when it's not really that simple. There's a lot of back and forth between AMD and nVidia in performance terms, but it's not really worth comparing nVidia to Intel, because Intel seem to have some basis of common sense. nVidia don't seem to understand that the way they're operating is suicidal; they're desperate to appear the best. That's not something you can say about Intel: Intel are clearly ahead of AMD, but they know what they're doing and are coping with it very well.

I think people have a hard time understanding how little top-end performance actually matters; it doesn't matter how fast something is if no one will buy it at the price it's been set.
 
Aside from "NVIDIA release a faster card every time" being an asinine statement, it's totally untrue.

In terms of actual shader and compute performance, the AMD cards absolutely annihilate the NVIDIA ones. It's not even remotely close. If anything, it's only close because AMD/ATI have failed to optimise drivers as much, or because of the levelling of the playing field that a very high-level (as opposed to low-level) API like DirectX brings. OpenGL is a bit less bloated, but AMD's drivers appear to be worse for it than NVIDIA's. If games were actually programmed direct to metal (or somewhere close to it), I'm pretty sure you'd see gigantic performance leads for Radeons. There's little chance of any of the next-gen consoles having an NVIDIA GPU, and this is one of the chief reasons why: only the first couple of generations of games tend to use the high-level API initially provided by Sony/MS/Nintendo, then they go direct to metal or to a very low-level API. After that, it's all about absolute performance.

Aside from this, it's all about performance per watt. Here again, NVIDIA have lost terribly for the last 2-3 generations.
 
The 6990 is the faster card for gaming. I had the 590 and now I have the 6990; the 6990 is the better card even though it's a bit noisier.
 
It isn't an asinine statement. We could sit here all day arguing the technical/scientific reasons why an ATI GPU SHOULD be faster, but the fact is, they aren't.

They might have more bells and whistles, they might be restricted by APIs, etc. The whole argument is fairly irrelevant when it's been the same story for as far back as I can remember. Even if it's apparently AMD's drivers to blame, well... does it really matter? If nothing has changed in 10 years, then who cares?

The GPUs don't perform faster in the REAL WORLD. Plus, making a statement like "it has more shaders" is also irrelevant. It's like saying one CPU has "more GHz" or "more cores"; the architectures on NVIDIA and ATI GPUs are completely different, and that is generally what makes the difference.

Even if it were the drivers that made the difference, or the way AMD cards interact with the API, don't you think AMD might have changed their design after all this time if that were the case? If an AMD GPU is better at number crunching... again... who cares?

If someone needs a number-crunching GPU for professional purposes, they would buy a professional GPU, not one designed for gaming. If a gaming graphics card can't perform as fast as another gaming graphics card in pretty much every game, then in relation to gaming (in real-world scenarios) it is slower. There is no point arguing over the fact: if a card produces lower FPS at the same IQ, it is slower.
 
It entirely depends on the game you play or the application you use. If you play COD, a GTX 590 would win; if you play Far Cry 2, the 6990 is faster :).
 
I'd argue that a 590 is not any better for playing CoD than a 4870. There's not much difference between 90 and 160 FPS.
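There's simple arithmetic behind that: FPS is the reciprocal of frame time, so the per-frame gap between two high frame rates is far smaller than the raw FPS numbers suggest:

```python
# Frame-time arithmetic: why 90 vs 160 FPS matters less than it sounds.

def frame_time_ms(fps: float) -> float:
    """Milliseconds spent rendering each frame at a given frame rate."""
    return 1000.0 / fps

for fps in (30, 60, 90, 160):
    print(f"{fps:>3} FPS -> {frame_time_ms(fps):.2f} ms/frame")

# 90 FPS -> 11.11 ms and 160 FPS -> 6.25 ms: a gap of under 5 ms,
# versus the 16.67 ms gap between 30 and 60 FPS. And past the
# monitor's refresh rate (commonly 60 Hz at the time), the extra
# frames aren't even displayed.
```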
 