And I got 112W - http://forums.overclockers.co.uk/showpost.php?p=25743641&postcount=816
So I call BS on that power consumption article
Probably mATX as I'm very limited on space atm. Using something like the Corsair Obsidian 350D as the case.
Not really a lot of options for decent FM2+ m-ATX boards atm.
I will be running it with an R9 290, I'm sure some of you will say that's a waste, but due to a lack of space and limited funds this seems to be a viable way for me to make a second pc that I can game on.
Indeed, I'd class all the current mATX boards as budget boards including the Asus A88XM-PRO I have in my HTPC.
I'm sure you know about the MSI Gaming boards due to be released soon, and I have my eye on the mATX board (A88XM Gaming)
That's what I'd be considering as the best candidate for your proposed build.
I just hope the reviews are good and they clock as well as they look.
Yeah, I'm keeping an eye on the MSI A88XM Gaming board, though there don't appear to be any details about it actually being available to buy anywhere, either in the UK or US.
This one looks OK:
http://www.asrock.com/mb/AMD/FM2A88X-ITX+/?cat=Specifications
Power consumption is meant to be decent.
Just wait till the horror stories of burning hot VRMs and throttling on the ASRock in tiny cases start appearing.
It's not just the cheap build quality of ASRock that puts me off; the warranty is worse too.
So here's something for people who've been "misinformed" with the power consumption of the Gigabyte GA-F2A88XN-WIFI mITX motherboard.
I brought my power meter home over the weekend to test this, as I was calling BS on the article that was showing abnormally high power consumption figures.
They used an A10-6800K Processor in that review and while that is a little less power efficient than Kaveri, there's definitely something up with their numbers.
I'm testing in an Overclocked environment and will do the same tests as shown in the graph further up the page. (I'll do some stock tests at some point)
I'll also include some real world results of my own.
Test setup
AMD A10-7850K overclocked to 4.2GHz CPU, 1000MHz iGPU
Thermalright AXP-200 cooler
Gigabyte GA-F2A88XN-WIFI motherboard (F4a BIOS)
2x4GB Team Xtreem LV DDR3-2400, CAS 10-12-12-31 2T
256GB Samsung 830 SSD
500GB HGST 2.5" 7200rpm storage drive
160W PicoPSU with 192W power brick
Silverstone Milo ML06B case
USB mouse and keyboard
Windows 8.1 Pro 64 bit
iGPU on AMD Catalyst 14.1 beta
AMD chipset driver 13.20
All C-states enabled and APM enabled
Onboard Wi-Fi enabled (using even more power)
Figures given are total system power consumption measured at the socket with my trusty Kill A Watt meter.
Here are the tests to compare with the graph. It's not clear what their methodology was for the 3DMark Firestrike testing, but I'd concur that scaled up clock for clock, my overclocked results would be similar to theirs if they'd posted proper numbers and not some skewed kind of average.
Idle power consumption (after 5 mins idle) = 28W
3DMark Firestrike
Graphics Test 1 = 103.8W peak
Graphics Test 2 = 102.4W peak
Physics Test = 101.7W peak
Combined Test = 114.3W peak
Cinebench 11.5 = 104.7W peak
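For anyone wanting load-only figures, here's a quick sketch that just subtracts the 28W idle baseline from the peaks above — pure arithmetic on the numbers in this post, nothing assumed beyond that:

```python
# Subtract the measured idle baseline from each peak to estimate load-only draw.
idle_w = 28.0  # idle figure measured above

peaks_w = {
    "Firestrike Graphics Test 1": 103.8,
    "Firestrike Graphics Test 2": 102.4,
    "Firestrike Physics Test": 101.7,
    "Firestrike Combined Test": 114.3,
    "Cinebench 11.5": 104.7,
}

deltas = {name: round(peak - idle_w, 1) for name, peak in peaks_w.items()}
for name, delta in deltas.items():
    print(f"{name}: +{delta}W over idle")
```

The combined Firestrike test works out to +86.3W over idle at the wall, which is still comfortably inside the 160W PicoPSU's envelope.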
Somehow their CPU-intensive test (Cinebench 11.5) vastly differs from mine. Like I've said previously, Kaveri is a little more power efficient than Trinity, but there is something badly wrong with their result.
This particular test result is the one responsible for the "misinformation", and they didn't even test the board with a Kaveri APU.
Real world results and observations of my own.
Borderlands 2 1080p, Medium Settings, No AA
Co-op play with 3 other friends
Buttery smooth, no jitters
Complete system power draw ~125W with the odd peak of ~130W.
Starcraft 2 1080p Medium Settings No AA
Single player game, jitter free, scrolls nicely
Complete system draw 117W
1080p 10GB Blu-ray rips, played back through XBMC (DXVA)
Complete system power draw = 62W
Got GUIMiner working, 112kH/s hashrate (at 1000MHz iGPU)
Complete system power draw = 96W (shows it's working the iGPU nicely)
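Quick back-of-envelope on what that mining load actually costs, using only the figures above (112kH/s, 96W total, 28W idle):

```python
# Back-of-envelope mining efficiency from the figures in this post.
hashrate_khs = 112.0   # GUIMiner hashrate at 1000MHz iGPU
mining_w = 96.0        # total system wall draw while mining
idle_w = 28.0          # idle wall draw measured earlier

delta_w = mining_w - idle_w                       # extra draw the mining load adds
khs_per_watt = round(hashrate_khs / mining_w, 2)  # efficiency vs total wall draw

print(f"Mining adds {delta_w:.0f}W over idle")
print(f"{khs_per_watt} kH/s per wall watt")
```

So the mining workload itself adds about 68W at the socket, and the system manages roughly 1.17 kH/s per wall watt overall.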
Stock-clock testing and underclocking / undervolting / cTDP function to be tested some time soon
Interestingly, I use Gigabyte in my current main build (plus an old Zotac in another), so I honestly don't see why people have an issue with believing another company might make a better motherboard for certain uses.
If the roles were reversed in the hardware.info review I bet there would have been no issue with saying the Gigabyte was better.
Those are from Hardware.fr with an Asus A88X Plus. They are the biggest computer hardware site in France.
Edit!!
Another thing: major sites tend to use normal PSUs, not pico-PSUs, which hit 90% efficiency at low loads, especially if you choose the correct power brick. The power brick alone makes a big difference to power consumption.
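To put numbers on that: here's a sketch of how supply efficiency skews at-the-wall readings. The 90% pico-PSU figure comes from the post above; the ~70% figure for a normal ATX PSU at very low load and the 25W DC load are illustrative assumptions, not measurements:

```python
# Sketch of how PSU efficiency skews readings taken at the wall socket.
# 90% pico-PSU efficiency is from the discussion above; the ~70% ATX figure
# at very low load and the 25W DC load are illustrative assumptions.
def wall_draw(dc_load_w, efficiency):
    """Socket-side power for a given DC load at a given supply efficiency."""
    return dc_load_w / efficiency

dc_load_w = 25.0  # assumed DC power the components actually consume at idle

pico_w = round(wall_draw(dc_load_w, 0.90), 1)  # pico-PSU + decent brick
atx_w = round(wall_draw(dc_load_w, 0.70), 1)   # typical ATX PSU at low load

print(f"Pico-PSU wall reading: {pico_w}W")
print(f"ATX PSU wall reading:  {atx_w}W")
```

Same components, same actual load, but the wall meter shows roughly 8W more through the less efficient supply — which is exactly why review-site idle figures on normal ATX PSUs read higher than a pico-PSU build.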
Nobody is saying any board is better.
I'm saying that after buying, owning and using previous incarnations of a "certain" board, I wouldn't put my money on that brand again.
I have no issue with whatever someone else chooses to spend their money on, it's their money right?
I'm also saying that the UK hardware.info "comparison" Cinebench 11.5 test "result", using the 6800K in the Gigabyte ITX board, is flat-out wrong.
There's something way off the mark with their result.
It can't pull that much power in the CPU-intensive part of the test (at stock settings).
The Hardware.fr Luxmark result showing the 142W draw is a combined test hammering the CPU and iGPU at the same time;
hardware.info's Cinebench test result is CPU only.
Fire up Cinebench 11.5, Krooton, and show him.
/snip
Originally Posted by MjFrosty View Post
http://www.pcper.com/reviews/General...medium=twitter
Some interesting questions put forward in this piece, including some info on shader translations
Q: How will Mantle tie in with HSA features? For instance, can both halves of an APU collaborate while discrete GPUs are busy drawing? For example, during AI update code which blends serial (logic) and parallel (pathfinding and visibility) tasks?
[Guennadi] This is something we'll need to evaluate in the future. Right now HSA and Mantle try to solve different sets of problems, but there is certainly room for overlap at some point.
I could be wrong, but the answer to one of the questions sounds like we won't be seeing an APU + dedicated GPU gaming setup using Mantle where the physics functions etc. are handed off to the APU's iGPU while the dGPU handles rendering the game itself.
The original article here:
http://www.pcper.com/reviews/Genera...uer?utm_source=twitterfeed&utm_medium=twitter
This was always going to be the case.
It's going to take a long time for HSA to start getting coded into mainstream games and programs.
Mantle also needs HSA coded into it.
[snip]
Until the software catches up, I'd be looking at other options.
I thought Mantle did have HSA stuff like this built into it already.
I mean, AMD are very much focusing on APUs rather than raw CPUs, so it makes sense for them to have their Mantle API able to leverage the full ability of an APU even when it's paired with one of their dedicated GPU cards.