One that caught my eye was the Adobe suite, it seems to perform very well on BD so if you work with graphics and design it's not a bad choice at all.
Which review are you referring to?
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
I don't know. It happened before I left, and there was very little cross-engineering going on. What did happen is that management decided there SHOULD BE such cross-engineering, which meant we had to stop hand-crafting our CPU designs and switch to an SoC design style. This meant giving up a lot of performance, chip area, and efficiency. The reason DEC Alphas were always much faster than anything else is that they designed each transistor by hand. Intel and AMD had always done so too, at least for the critical parts of the chip. That changed before I left - they started to rely on synthesis tools, automatic place-and-route tools, etc. I had been in charge of our design flow in the years before I left, and I had tested these tools by asking the companies who sold them to design blocks (adders, multipliers, etc.) using their tools. I let them take as long as they wanted. They always came back to me with designs that were 20% bigger and 20% slower than our hand-crafted designs, and which suffered from electromigration and other problems.
Is it possible that motherboard has a tweaked BIOS? I didn't see anyone else using it.
Does seem odd that their results are better than most other sites'; in fact, based on their review I'd have no issue upgrading to a BD.
I registered to make this post, I would like to ask opinions on this review: http://www.hardwareheaven.com/revie...sor-vs-core-i7-2600k-review-introduction.html
Are people perhaps just overreacting to bad testing methods and misleading test data elsewhere, since the gaming tests they ran show that the FX 8150 compares quite favorably to an i7 chip in real-world gaming scenarios in some of the newest games?
Using a SINGLE HD6950 at a high resolution with AA on top, the games are going to be GPU-bound, so there is very little difference to be seen. They should have at least crossfired two HD6950s or used a lower resolution so the games would be CPU-bound. That's my guess anyway.
It's a rubbish review. They work out that the AMD CPU system uses 40% more power under CPU load than the Intel system, but then don't comment on that fact at all, giving it 9/10 for design, value and performance. With power consumption that ridiculously high, it should lose points.
Yes, that point has been made on their forum discussion of the review, but doesn't that represent a real-world test? It's a direct comparison between two rigs where the only things essentially differing are the chip and chipset. Who plays games at lower resolutions, and how many of us actually have two £200+ graphics cards? What the test shows is that under normal gaming conditions, most of us would experience similar if not better performance with the AMD chip. I don't see the point in running tests at resolutions that no one would ever use, because it's not a realistic scenario.
I think BD does exactly what AMD wanted it to do. It's a platform for 16-core CPUs, and they are principally aiming for the server/workstation market.
Outside of enthusiasts, the desktop market is dwindling. Smartphones, tablets and consoles are eating its lunch. Windows 8 is going to run on ARM.
AMD saw the writing on the wall IMO
Yeah, I agree with all that. However, BD power consumption just seems too high for it to be a massive success in the server market. If it had mediocre absolute performance but excellent performance/watt then they'd have a great chip, but with consumption like that, they don't.
The server versions are rated at as little as 65W, and the problem of power when overclocking will not come into it.
So does BD have better MFLOPS/watt than Intel chips or not?
It seems there are ones rated at around 35W to 40W:
http://www.cpu-world.com//news_2011/2011100401_AMD_Opteron_4200_lineup_revealed.html
If Titan is using 38,400 16-core Interlagos Opteron 6200-series CPUs, then I suspect it must have at least better performance/watt than the previous 12-core Magny-Cours CPUs.
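The performance/watt comparison above is just a ratio, so it's easy to sanity-check. A minimal sketch, with the caveat that the GFLOPS and wattage figures below are purely illustrative placeholders, not measured values for any real Opteron part:

```python
# Hypothetical numbers only - not real benchmark or TDP data for
# Interlagos or Magny-Cours. This just shows how the ratio works.
def perf_per_watt(gflops: float, watts: float) -> float:
    """Return performance per watt in GFLOPS/W."""
    return gflops / watts

# e.g. a 16-core part at 115 GFLOPS in a 115 W envelope
# vs a 12-core part at 80 GFLOPS in the same 115 W envelope
new_chip = perf_per_watt(115.0, 115.0)  # 1.0 GFLOPS/W
old_chip = perf_per_watt(80.0, 115.0)   # ~0.70 GFLOPS/W
print(f"improvement: {new_chip / old_chip:.2f}x")  # prints "improvement: 1.44x"
```

The point is that at the same socket power, any per-chip throughput gain translates directly into a better FLOPS/watt figure for the whole machine.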