
Official Bulldozer Reviews

I was hoping Bulldozer would bulldoze all the bullsh*t speculation prior to these reviews. What a shame... I can imagine quite a few people are sitting with a smug look while reading this.
 
What Cliff Maier said back in 2010.

I don't know. It happened before I left, and there was very little cross-engineering going on. What did happen is that management decided there SHOULD BE such cross-engineering, which meant we had to stop hand-crafting our CPU designs and switch to an SoC design style. This results in giving up a lot of performance, chip area, and efficiency. The reason DEC Alphas were always much faster than anything else is that they designed each transistor by hand. Intel and AMD had always done so, at least for the critical parts of the chip. That changed before I left - they started to rely on synthesis tools, automatic place-and-route tools, etc. I had been in charge of our design flow in the years before I left, and I had tested these tools by asking the companies who sold them to design blocks (adders, multipliers, etc.) using their tools. I let them take as long as they wanted. They always came back to me with designs that were 20% bigger and 20% slower than our hand-crafted designs, and which suffered from electromigration and other problems.

Management rarely listens to good advice, unless you're lucky enough to be working for good managers.
 
Is it possible that motherboard has a tweaked BIOS? I didn't see anyone else using it.

Does seem odd that their results are better than most other sites'; in fact, based on their review I'd have no issue upgrading to a BD. :confused:

If you look at the synthetics, such as Cinebench 11.5, they're actually the same or worse.

The whole "Locking a core on a module thing", while it makes the CPU good, it's a 200 quid quad core that the 2500k will rape and pillage..
 
Does today’s FX invoke the Athlon 64 FX-51 that compelled Intel to rebadge a Xeon and come up with the Extreme Edition moniker back in 2003 just to compete? Not really, no. In fact, the chip giant didn’t have to do anything at all. Its nearly year-old 95 W parts fend for themselves without even a price adjustment.

http://www.tomshardware.com/reviews/fx-8150-zambezi-bulldozer-990fx,3043-24.html
 
I registered to make this post; I would like to ask opinions on this review: http://www.hardwareheaven.com/revie...sor-vs-core-i7-2600k-review-introduction.html

Are people perhaps just overreacting to bad testing methods and misleading test data elsewhere, since the gaming tests they ran show that the FX-8150 compares quite favourably to an i7 chip in real-world gaming scenarios in some of the newest games?

Bad testing and misleading data from every other review site? Unlikely.
 
I registered to make this post; I would like to ask opinions on this review: http://www.hardwareheaven.com/revie...sor-vs-core-i7-2600k-review-introduction.html

Are people perhaps just overreacting to bad testing methods and misleading test data elsewhere, since the gaming tests they ran show that the FX-8150 compares quite favourably to an i7 chip in real-world gaming scenarios in some of the newest games?

Using a SINGLE HD 6950 at a high resolution with AA on top, the games are going to be GPU-bound, so there is very little difference to be seen; they should have at least CrossFired two HD 6950s or used a lower resolution so the games would be CPU-bound. That's my guess anyway.
 
Using a SINGLE HD 6950 at a high resolution with AA on top, the games are going to be GPU-bound, so there is very little difference to be seen; they should have at least CrossFired two HD 6950s or used a lower resolution so the games would be CPU-bound. That's my guess anyway.

Yes, that point has been made in their forum discussion of the review, but doesn't that represent a real-world test? It's a direct comparison between two rigs where essentially the only things that differ are the chip and chipset. Who plays games at lower resolutions, and how many of us actually have two £200+ graphics cards? What the test shows is that under normal gaming conditions, most of us would experience similar if not better performance with the AMD chip. I don't see the point in running tests at resolutions that no one would ever use, because it's not a realistic scenario.
 
Yes, that point has been made in their forum discussion of the review, but doesn't that represent a real-world test? It's a direct comparison between two rigs where essentially the only things that differ are the chip and chipset. Who plays games at lower resolutions, and how many of us actually have two £200+ graphics cards? What the test shows is that under normal gaming conditions, most of us would experience similar if not better performance with the AMD chip. I don't see the point in running tests at resolutions that no one would ever use, because it's not a realistic scenario.

No, this is illogical. When a test is GPU-bound, you can't use it to say anything about the CPU.
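To make the GPU-bound point concrete, here's a minimal sketch (Python, with invented per-frame costs; none of these figures come from the review) of the usual bottleneck model: frame time is roughly the larger of the CPU's and the GPU's per-frame work, so at GPU-limited settings two quite different CPUs post identical FPS, and the gap only appears once the GPU load drops.

[CODE]
# Minimal sketch (hypothetical numbers) of why a GPU-bound test hides
# CPU differences: each frame costs roughly the slower of the CPU's and
# the GPU's per-frame work, so once the GPU dominates, a faster CPU
# changes nothing in the measured FPS.

def fps(cpu_ms_per_frame: float, gpu_ms_per_frame: float) -> float:
    """Approximate frames per second as 1000 / max(CPU time, GPU time)."""
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

# Hypothetical per-frame costs: CPU A is 40% faster than CPU B.
cpu_a, cpu_b = 10.0, 14.0        # ms of CPU work per frame
gpu_high_res = 25.0              # ms of GPU work at high res + AA
gpu_low_res = 8.0                # ms of GPU work at a low resolution

print(fps(cpu_a, gpu_high_res))  # 40.0 fps -> GPU-bound: CPUs look identical
print(fps(cpu_b, gpu_high_res))  # 40.0 fps
print(fps(cpu_a, gpu_low_res))   # 100.0 fps -> CPU-bound: the gap appears
print(fps(cpu_b, gpu_low_res))   # ~71.4 fps
[/CODE]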
 
I think BD does exactly what AMD wanted it to do. It's a platform for 16-core CPUs, and they are principally aiming at the server/workstation market.

Outside of enthusiasts, the desktop market is dwindling. Smartphones, tablets and consoles are eating its lunch. Windows 8 is going to run on ARM.

AMD saw the writing on the wall, IMO.
 
I think BD does exactly what AMD wanted it to do. It's a platform for 16-core CPUs, and they are principally aiming at the server/workstation market.

Outside of enthusiasts, the desktop market is dwindling. Smartphones, tablets and consoles are eating its lunch. Windows 8 is going to run on ARM.

AMD saw the writing on the wall, IMO.

Yeah, I agree with all that. However, BD power consumption just seems too high for it to be a massive success in the server market. If it had mediocre absolute performance but excellent performance/watt then they'd have a great chip, but with consumption like that, they don't.
 
Yeah, I agree with all that. However, BD power consumption just seems too high for it to be a massive success in the server market. If it had mediocre absolute performance but excellent performance/watt then they'd have a great chip, but with consumption like that, they don't.

The server versions are rated as low as 65 W, and the power problems seen when overclocking won't come into it.
 
It seems there are ones rated at around 35 W to 40 W:

http://www.cpu-world.com//news_2011/2011100401_AMD_Opteron_4200_lineup_revealed.html

If Titan is using 38,400 16-core Interlagos Opteron 6200 series CPUs, then I suspect they must at least have better performance/watt than the previous 12-core Magny-Cours CPUs.

http://www.cpu-world.com/news_2011/2011092701_New_AMD_Opteron_6200_models_sighted.html

As long as they have better performance/watt than the old ones, that would at least be something.
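For what it's worth, here's a back-of-the-envelope sketch (Python; every figure below is invented for illustration, not taken from AMD's specs) of how that performance/watt comparison could work out: at the same power rating, the 16 Interlagos cores only need to retain most of Magny-Cours' per-core throughput to come out ahead.

[CODE]
# Hedged back-of-the-envelope sketch (all figures hypothetical) of the
# performance-per-watt speculation above: a 16-core part beats a 12-core
# part on perf/watt as long as the extra cores outweigh any per-core
# regression at the same power rating.

def perf_per_watt(cores: int, per_core_perf: float, tdp_w: float) -> float:
    """Aggregate throughput (cores x per-core performance) divided by TDP."""
    return cores * per_core_perf / tdp_w

# Hypothetical inputs: Magny-Cours per-core perf normalised to 1.0 at a
# 115 W rating; suppose each Interlagos core manages only 0.9x of that.
magny_cours = perf_per_watt(12, 1.0, 115.0)  # ~0.104
interlagos = perf_per_watt(16, 0.9, 115.0)   # ~0.125

print(interlagos / magny_cours)  # 1.2 -> 20% better perf/watt in this scenario
[/CODE]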
 