
*** AMD "Zen" thread (inc AM4/APU discussion) ***

Soldato
Joined
9 Nov 2009
Posts
24,844
Location
Planet Earth
I don't agree with you that the Ryzen 5 2500U is a niche CPU.
Actually, it should turn out to be the best-selling mainstream CPU across all notebooks.

Of course, that's assuming Intel drops the corrupt practice of twisting its partners' arms to steer them away from the all-round superior AMD products.

:o :o
It'll sell well if Intel allows :p

https://www.pcgamesn.com/intel-inside-partner-program-cut

Looks like Intel is cutting back on the "Intel Inside" program!

A report from CRN suggests that Intel are planning on undergoing huge changes to this OEM/Intel co-venture program, with Intel supposedly cutting funding across the board. Intel’s proposed cuts are supposedly up to a whopping 60%, which may leave OEMs and enthusiast builders to largely fend for themselves, or possibly start striking deals with alternative primary colour branded competitors.
 
Associate
Joined
12 Mar 2017
Posts
1,115
Location
Ireland
@CAT-THE-FIFTH The 2C/4T CPU is still a 15W part; it has the same power envelope as the new Kaby Lake R parts, hence it doesn't use less power, as you should know. And I mentioned load power because in notebookcheck's load test the dGPU is used too.
And soldering a DIMM to the mobo doesn't really save power; it's usually done for different reasons. The two Envys they compared were pretty obviously different configurations, not just RAM-wise but most likely also display-wise, given they previously had a 2017 x360 i7-8550U Envy with a 186-nit display while the RR one is 128 nits: https://www.laptopmag.com/reviews/laptops/hp-envy-x360-15t
The difference in heat output between the two supposedly "15W" parts is pretty telling that RR is using more than 15W. That's beside the fact that different outlets measured the RR laptop using a lot more power than the 15W envelope should allow.

It's not a bad product, just not as power efficient as its Intel counterparts. On the other hand it has a much better iGPU, while on the CPU side it's close enough. So if you're more interested in GPU performance, it might be more power efficient? But then again it's also almost on par with a 15W CPU + 25W dGPU according to notebookcheck's testing.

@4K8KW10
All vendors have similar programs, examples:
https://www.amd.com/en-us/press-releases/Pages/amd-partner-program-2014mar25.aspx
https://www.amd.com/en/products/embedded-partner-program

The court rulings against Intel were for other types of programs (deals limiting AMD sales in exchange for market development funds in the early 2000s).
 
Permabanned
Joined
2 Sep 2017
Posts
10,490
The court rulings against Intel were for other types of programs (deals limiting AMD sales in exchange for market development funds in the early 2000s).

The MOA (mother of all programs) and the famous quotes about "the best friend money can buy", etc.... very ugly. I don't understand how a serious corporation can be happy with itself with such a dirty conscience.
 
Associate
Joined
12 Mar 2017
Posts
1,115
Location
Ireland
Corporations don't have a conscience though? They've all had their fair share of dirt.
It's going to be interesting to see what happens with Intel's €1 billion EU fine, since they won an appeal and the case is under review again. They got the fine because they offered exclusivity incentives with their old OEM deals (which is against EU single market regulation). Marketing deals apparently are fine, but not if they are tied to exclusivity.

This should be a good opportunity for AMD since they can use their own OEM marketing deals to make their products more attractive while Intel seems to redirect most of them to its data center initiatives.
 
Soldato
Joined
9 Nov 2009
Posts
24,844
Location
Planet Earth

The 2C/4T parts should still consume less power under any load than a 4C/4T part, and you are deliberately ignoring Nvidia Optimus, which switches the card off when not under load. Look at different Intel or AMD CPUs on the desktop with the same TDP: they do not all have the same power consumption. Did you honestly think some of us have forgotten that??

Both the Intel and AMD parts have cTDP up to 25W anyway. Both were tested at 100 nits, so the AMD display was close to maximum brightness, and despite all this the difference was between 6 and 38 minutes. Plus it actually shows slightly better CPU performance in the case of the 2017 systems tested by Laptop Mag, instead of you trawling the internet to find entirely different systems.

[Laptop Mag battery life chart]


So let's put this into consideration: one is 317 minutes and one is 311 minutes. The other test they did, which you quoted, was 311 minutes and 349 minutes.

So the AMD system is, what, 2% to 10% worse off, with the Intel system having a screen at half its max brightness as opposed to one at 80% brightness.
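For anyone wanting to sanity-check those percentages, here's the quick arithmetic on the Laptop Mag run times quoted above:

```python
# Battery life gap between the two Envy x360 runs (minutes, per Laptop Mag)
pairs = [(317, 311), (349, 311)]  # (longer run, AMD run) for each test

for longer, amd in pairs:
    gap_min = longer - amd               # absolute gap in minutes
    gap_pct = 100 * gap_min / longer     # gap as a share of the longer run
    print(f"AMD trails by {gap_min} min ({gap_pct:.1f}%)")
```

That works out to roughly 1.9% and 10.9%, i.e. the "2% to 10%" range above.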


It is also quicker in the CPU tests.

So let me get this straight:
1.) Better CPU performance, by around 10% to 20%, using the same 2017 chassis, and that includes Excel.
2.) Three times better IGP performance.
3.) Comparable battery life with the same battery model.

All in the same chassis, so similar cooling, and the systems have the same dimensions and weight. The AMD laptop is also cheaper.

I love how people on forums were saying RR would be a fail since:
1.) CPU performance wouldn't bear out in reality, despite Ryzen showing otherwise at lower TDPs.
2.) IGP performance would be a fail because it used Vega, despite videos showing even BD-based APUs thrashing a modern CFL IGP.
3.) Battery life would be an utter fail.

In the end, when similar systems are compared, it seems very competitive.

I blame AMD for this - they need to step up their marketing, and be more proactive in making sure reviews compare their products on a level playing field.

Now I understand why people were still buying Pentium 4 CPUs in the era of Athlon 64 CPUs.

Especially with Intel cutting subsidies, laptop prices will rise, so something like Raven Ridge means more competition, which will hopefully keep prices in check as time progresses and the full range is released. You would think hardware geeks would be happy about that, but I sense not.
 
Associate
Joined
12 Mar 2017
Posts
1,115
Location
Ireland
I don't think you understand how TDP works for Intel parts. Regardless of core count, 15W TDP is a hard limit, and a 2C/4T part will use just as much as a theoretical 18C/36T part if they're both set to a maximum 15W TDP. And you're also ignoring the part where I said that in notebookcheck's load scenario the dGPU is used.
For Load Average they use 3DMark06 and for Load Maximum they use Prime95 + Furmark: https://www.notebookcheck.net/Our-Test-Criteria.15394.0.html
https://www.notebookcheck.net/HP-En...ega-8-Laptop-Review.266614.0.html#toc-verdict
[Notebookcheck power consumption chart]
That right there shows the 2500U blowing way past a 15W TDP, possibly over a 25W TDP, under load. It's not a 1:1 comparison since there are of course other hardware differences, but if RR was limited to 15W or even 25W, it wouldn't use that much power.
The two companies arrive at their TDP ratings slightly differently, and on desktop, same as with notebooks, you have lots of variation depending on OEM/motherboard. MCE will blow past TDP on Intel parts, for example, same with "Performance Bias" on some AM4 motherboards.
There are examples of Intel U parts being limited to below 15W by some OEMs; it just varies by design, but the key numbers are always going to be the power consumption ones.
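As a toy illustration of the package-level cap described above (an even per-core split; real boost algorithms such as Intel's PL1/PL2 scheme are far more involved than this, so treat it as a sketch only):

```python
# Toy model: a sustained package power limit is shared across cores, so a
# 2C part and an 18C part capped at the same 15 W draw the same total --
# the 18C part just gets less budget per core. Purely illustrative.

PACKAGE_LIMIT_W = 15.0

def per_core_budget_w(n_cores: int, limit_w: float = PACKAGE_LIMIT_W) -> float:
    """Even share of the package cap per core (real firmware is smarter)."""
    return limit_w / n_cores

for cores in (2, 4, 18):
    print(f"{cores:>2} cores: total capped at {PACKAGE_LIMIT_W:.0f} W, "
          f"{per_core_budget_w(cores):.2f} W per core")
```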

It's quicker in some CPU tests because it uses a lot more power than the Intel counterpart, as per Notebookcheck's testing, which is quite a bit more in-depth than Laptop Mag's.

The main advantages of RR, as I've said several times already, are the much faster iGPU and, seemingly, price. CPU performance at the same TDP is going to be lower than Intel counterparts; alternatively, if it's configured at a 25W TDP like it is on the x360 Envy, CPU performance is going to be close in single-threaded and better in multi, but at the cost of battery life.

The reason why TDP is so important is that you're essentially comparing apples to oranges unless both parts are configured to the same power envelope, if you're interested in perf/W. The biggest fail here in my opinion is AMD launching Raven Ridge with such a lackluster laptop in the HP Envy x360; the models have a LOT of variation when it comes to display brightness (Laptop Mag had 128 nits to 186 nits, TechReport and Notebookcheck had theirs at ~220 nits) and other hardware specifics, which leads to all this ambiguity around Raven Ridge performance.

The big win is that they are competitive on an inferior process, with a much larger die.
 
Soldato
Joined
9 Nov 2009
Posts
24,844
Location
Planet Earth
20 newish games tested on the HP.

https://www.youtube.com/watch?v=MdgMv_9A0z8


00:01 - Destiny 2
03:12 - Titanfall 2
05:32 - Rise of the Tomb Raider
08:09 - Microsoft Flight Simulator X Steam Edition (1080p)
10:51 - Project CARS
12:39 - The Elder Scrolls V: Skyrim
14:56 - Micro Machines World Series
16:56 - Redout
18:53 - The Crew
20:53 - Rainbow Six Siege
22:33 - Crysis 3
25:21 - Total War: Warhammer II
26:21 - Overwatch
29:56 - Battlefield 1 Multiplayer 64
32:40 - Fortnite
35:28 - Middle-earth: Shadow of War
38:30 - Rocket League
40:55 - Dota 2
44:13 - Counter-Strike: Global Offensive (CS:GO)
46:10 - Far Cry Primal

Another 9 tested.

https://www.youtube.com/watch?v=IqGnYR_bNIg


00:01 - Wolfenstein II
02:46 - Pro Evolution Soccer 2018
05:36 - Mad Max
08:43 - Metal Gear Solid V
14:39 - TrackMania Turbo
15:49 - CoD: Infinite Warfare
18:23 - Overwatch
20:23 - Assassin's Creed: Origins
22:33 - Far Cry Primal

Yet another 12 games tested.

https://www.youtube.com/watch?v=l9MYwQbR5bc


00:01 - Watch Dogs 2
02:12 - Deus Ex: Mankind Divided (DirectX 11)
04:13 - X-Plane 11
08:23 - The Surge
10:28 - Team Fortress 2
12:09 - Sniper: Ghost Warrior 3
13:40 - Prey
16:09 - Mortal Kombat XL
19:18 - Killer Instinct
21:04 - Fallout 4
24:25 - The Witcher 3
32:29 - Ashes of the Singularity: Escalation (DirectX 12)

So 41 games tested in total. I am surprised the IGP can even run some of those titles. I am definitely interested to see how the Ryzen 7 Mobile and the Ryzen Mobile Gaming SKUs perform when they are released.


With similar systems, similar cooling, etc., the AMD system was not much lower than the Intel system, and the fact of the matter is Nvidia Optimus switches off the dGPU when it is not under heavy GPU load. I should know, as I have mates whose systems with Nvidia dGPUs do exactly that, so battery life and power consumption are improved. This is not 2011, it's 2017, dude!

Even look at the Intel system you posted: how does it compare to other Intel systems with the same CPU? This is why you need to equalise cooling for both. If the Intel system is throttling more, it will post lower power consumption readings, etc.

The fact is the 2017 models of the HP x360 are not massively different in battery life.

Plus, you seem to forget one thing: if the AMD CPU was consuming FAR more power in the same chassis and with the same cooler as the Intel system, it would probably post lower overall CPU scores, but all indications are that the system does not run massively hotter or throttle any more than the Intel system.

Likewise, if the Intel system were producing less heat, it should boost much higher. There is no indication of any of this in any review I had a quick look at.

The fact of the matter is that battery life is fine when metrics are similar. Remember, you are arguing over 2% to 10% when it comes to battery life, but the AMD system is faster with regards to both CPU and GPU.

Both x360 systems have under 6 hours of battery life and the screens are not bright enough, so ultimately, yes, it's a meh system.

In the end it's a solid chip, and I would argue a better-rounded chip than the Core i5 8250U.

The thing is, cost is on AMD's side - Intel can only really compete by bundling a lower-end Nvidia dGPU, which is less cost effective for OEMs.
 
Associate
Joined
12 Mar 2017
Posts
1,115
Location
Ireland
Optimus won't switch the dGPU off if you're running 3DMark06, Furmark or The Witcher 3.

Battery life is indeed fine, but the HP Envy x360 is just an awful laptop overall. Hopefully we'll see RR in better designs, which might also give us better insight into RR performance and perf/W.
 
Soldato
Joined
9 Nov 2009
Posts
24,844
Location
Planet Earth
Optimus won't switch the dGPU off if you're running 3DMark06, Furmark or The Witcher 3.

Battery life is indeed fine, but the HP Envy x360 is just an awful laptop overall. Hopefully we'll see RR in better designs, which might also give us better insight into RR performance and perf/W.

Look at the normal CPU load readings; they are in roughly the same ballpark. It's not surprising Ryzen performs well in those non-gaming benchmarks, as it's gaming at 1080p with higher-end cards where Ryzen tends to be a bit off the pace relative to Intel.

The fact is, if that model throttles down the CPU due to too much heat being produced, it will skew the power consumption figures, and that has happened with laptops. I would even argue that if the AMD laptop had a mobile RX 460 or something like that, the power consumption increase might not be as noticeable, since the CPU would most likely throttle somewhat too.

OTOH, even looking at the Core i5 7200U and 940MX results, getting better CPU performance and similar GPU performance in an all-in-one solution does impress me. Remember, Ryzen is a true SoC, especially in the case of RR, so the reduction in complexity should help with costs: basically three chips down to one. RR is a bigger chip, but I don't expect the laptops to have a separate chipset, as the APU should have enough PCI-E lanes and SATA ports for a laptop.

Regarding the Envy, it's not ideal, but this is AMD we are talking about - they could have easily sent out pairs of AMD and Intel Envy x360 laptops for reviewers to test, so at least the ranges could be equalised.

Apparently their own marketing is asleep. But then this is the same company which, a few years ago, asked Anandtech to come to their HQ to test some laptops and used prototypes which had worse performance than production units, since some could not run dual-channel RAM. Yet NBC had tested a production version of one of those laptops, and AMD had given them extra RAM to run it in dual channel.

I mentioned a few months ago that the Acer Swift looks OK. It's a 25W TDP chassis with equivalent Intel versions, so hopefully when that comes out we can see the Ryzen 5 2500U and Core i5 8250U in a chassis where they can run at full pelt.

Dell is also making its own Ryzen laptops, going by a thread I saw over on Reddit.

But unless AMD sends two similar models out,I expect only the odd site will bother to buy similarly configured models.

I am interested to see how this model performs though:

https://images.anandtech.com/doci/1...ics_press_deck-legal_final-page-043_575px.jpg

It's much smaller than the other Ryzen Mobile models, weighs just over 1 kg, and runs RAM in single channel. There is an equivalent Intel model too.

I expect IGP performance to not be great due to the limited cooling and memory bandwidth, but I would be interested to see how CPU performance fares in that situation.
 
Associate
Joined
12 Mar 2017
Posts
1,115
Location
Ireland
@humbug It's expected, but the power usage gap between RR and a 15W TDP CPU + 25W TDP GPU is only ~5W/~10% under load, which points to a higher TDP for the RR part.
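Putting numbers on that gap, using the ~45 W vs ~50 W system totals that come up later in the thread (ballpark figures, not exact measurements):

```python
# Ballpark load totals from Notebookcheck-style measurements (assumed values)
rr_system_w = 45.0       # Raven Ridge system draw under load
cpu_plus_dgpu_w = 50.0   # 15 W-class CPU + 25 W-class dGPU system under load

gap_w = cpu_plus_dgpu_w - rr_system_w          # absolute gap in watts
gap_pct = 100 * gap_w / cpu_plus_dgpu_w        # relative to the dGPU system
print(f"RR draws {gap_w:.0f} W less, a {gap_pct:.0f}% gap")
```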

Either way, hopefully the Lenovo and Acer laptops are out soon because those should be much better than this HP x360.
 
Caporegime
Joined
17 Mar 2012
Posts
47,650
Location
ARC-L1, Stanton System
@humbug It's expected, but the power usage gap between RR and a 15W TDP CPU + 25W TDP GPU is only ~5W/~10% under load, which points to a higher TDP for the RR part.

Either way, hopefully the Lenovo and Acer laptops are out soon because those should be much better than this HP x360.

TDP is not a measure of power consumption; Thermal Design Power denotes what level of cooling is recommended.

For one, Prime95 and Furmark are designed to push power consumption way over the design point; in real-life use you will not get to half the levels Prime95 and Furmark push the hardware to. Intel put 140 to 180 watt TDPs on their Skylake-X CPUs, and yet with Prime95 the chips will pull 250 watts. So are Intel under-rating their TDPs? No, Intel didn't envisage you playing Prime95, and nor should they.

Having said all of that, Furmark is to the GPU what Prime95 is to the CPU, so the GPU on the Ryzen APU is being pushed over its TDP, a GPU three times more powerful than Intel's, which in turn is not being used at all, so we do not know what its power consumption is under Furmark load. On the Intel system it's the MX150, an entirely different GPU, not from Intel or AMD. Edit: it's similar to the GT 1030.

CAT is right: you can only get an accurate comparative measure of power consumption by using 'otherwise' identical systems, which is what he posted.
 
Associate
Joined
12 Mar 2017
Posts
1,115
Location
Ireland
Yes, TDP is indeed thermal power, but thermal power is also electrical power when accounting for thermal resistance; they're linked. You can't really push power consumption way over TDP, because it's just not physically possible unless the TDP is higher than initially thought.

Skylake-X is a different matter entirely, because the CPU TDP limit there is determined by the motherboard firmware. Unlike with mobile parts, keeping to TDP isn't really critical for HEDT parts, so most motherboard makers just removed the TDP limits entirely.
Source: http://www.pcgameshardware.de/Core-...ts/Skylake-X-Basin-Falls-Performance-1230913/
Translated: "With the aforementioned UEFI 0501, the MSI motherboard lets the ten-core processor run free when it comes to power consumption, and also pushes the all-core turbo through to 4.0 GHz at all times." "The Asus ROG Strix X299-E gaming, on the other hand, takes the other, more specification-compliant route. After the obligatory "Load optimized defaults", the TDP limit of 140 watts is enforced - if necessary at the expense of the clock rates, which also fall below 4.0 GHz."
A lot of the outlets that measured pretty obscene power consumption numbers in Prime95 w/ AVX for Skylake-X, like Tomshardware, were using MSI motherboards: http://www.tomshardware.com/reviews/intel-core-i9-7900x-skylake-x,5092-4.html
That difference in motherboard set TDP is also the reason why we saw so much variation in Skylake-X benchmarks from different outlets.
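As a rough sketch of why an enforced 140 W limit drags the all-core clock under 4.0 GHz, here's the common back-of-the-envelope scaling P ∝ f³ (voltage tracking frequency). The numbers are illustrative, not measured Skylake-X behaviour:

```python
def sustained_clock_ghz(f_uncapped: float, p_uncapped: float,
                        p_cap: float) -> float:
    """Clock sustainable under a power cap, assuming P scales with f**3."""
    if p_uncapped <= p_cap:
        return f_uncapped  # cap never binds, full clock is sustained
    return f_uncapped * (p_cap / p_uncapped) ** (1 / 3)

# e.g. a chip that would pull ~250 W at an unthrottled 4.0 GHz all-core turbo
print(f"~{sustained_clock_ghz(4.0, 250.0, 140.0):.2f} GHz under a 140 W cap")
```

Which lands in the high-3 GHz range, i.e. "below 4.0 GHz" as the Asus board behaves.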

Also, I do want to point out that the 2500U also had high power consumption numbers in 3DMark06 and The Witcher 3, as per Notebookcheck's testing.
 
Caporegime
Joined
17 Mar 2012
Posts
47,650
Location
ARC-L1, Stanton System
Yes, TDP is indeed thermal power, but thermal power is also electrical power when accounting for thermal resistance; they're linked. You can't really push power consumption way over TDP, because it's just not physically possible unless the TDP is higher than initially thought.

Skylake-X is a different matter entirely, because the CPU TDP limit there is determined by the motherboard firmware. Unlike with mobile parts, keeping to TDP isn't really critical for HEDT parts, so most motherboard makers just removed the TDP limits entirely.
Source: http://www.pcgameshardware.de/Core-...ts/Skylake-X-Basin-Falls-Performance-1230913/
Translated: "With the aforementioned UEFI 0501, the MSI motherboard lets the ten-core processor run free when it comes to power consumption, and also pushes the all-core turbo through to 4.0 GHz at all times." "The Asus ROG Strix X299-E gaming, on the other hand, takes the other, more specification-compliant route. After the obligatory "Load optimized defaults", the TDP limit of 140 watts is enforced - if necessary at the expense of the clock rates, which also fall below 4.0 GHz."
A lot of the outlets that measured pretty obscene power consumption numbers in Prime95 w/ AVX for Skylake-X, like Tomshardware, were using MSI motherboards: http://www.tomshardware.com/reviews/intel-core-i9-7900x-skylake-x,5092-4.html
That difference in motherboard set TDP is also the reason why we saw so much variation in Skylake-X benchmarks from different outlets.

Also, I do want to point out that the 2500U also had high power consumption numbers in 3DMark06 and The Witcher 3, as per Notebookcheck's testing.

Both the Intel system and the AMD system measured power way above their rated TDPs, so all of that ^^^ is completely pointless, as well as an aimless and meaningless argument.

You're comparing two entirely different systems. Your point is completely muddled, just like your reasoning and your need to compare two completely different systems and have us try to imagine that the nVidia discrete GPU is somehow a stand-in for Intel's iGPU and more or less in line with AMD's iGPU.... in order to disprove the findings of two identical systems, you want us to paint the orange green and imagine it's in that way an apples-to-apples comparison. It isn't; what CAT posted is an actual apples-to-apples comparison.

It's this simple: measuring the power consumption of the AMD CPU + iGPU vs the Intel CPU + nVidia discrete GPU to get at the power consumption of the AMD CPU vs the Intel CPU doesn't work.
 
Caporegime
Joined
17 Mar 2012
Posts
47,650
Location
ARC-L1, Stanton System
Both the Intel system and the AMD system measured power way above their rated TDPs, so all of that ^^^ is completely pointless, as well as an aimless and meaningless argument.

You're comparing two entirely different systems. Your point is completely muddled, just like your reasoning and your need to compare two completely different systems and have us try to imagine that the nVidia discrete GPU is somehow a stand-in for Intel's iGPU and more or less in line with AMD's iGPU.... in order to disprove the findings of two identical systems, you want us to paint the orange green and imagine it's in that way an apples-to-apples comparison. It isn't; what CAT posted is an actual apples-to-apples comparison.

It's this simple: measuring the power consumption of the AMD CPU + iGPU vs the Intel CPU + nVidia discrete GPU to get at the power consumption of the AMD CPU vs the Intel CPU doesn't work.

I get the feeling I have to explain why:

The iGPU of the AMD APU is an unknown portion of that total 45 watts, just as the nVidia GPU is an unknown portion of the total 50 watts on the Intel system. If you do not know the measure of the components that make up the totals, then the AMD and Intel CPU portions of those systems are also unknown.
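A quick way to see the problem: any split of a system total is consistent with the one number you actually measured, so the CPU portion stays unknown (the wattages below are made up for illustration):

```python
# One measured total, many consistent component splits -- illustrative only.
amd_system_total_w = 45.0

# (cpu_w, igpu_w, rest_of_system_w): both splits match the measured total,
# and the measurement alone cannot tell them apart.
plausible_splits = [(15.0, 22.0, 8.0), (25.0, 12.0, 8.0)]

for cpu_w, igpu_w, rest_w in plausible_splits:
    assert abs((cpu_w + igpu_w + rest_w) - amd_system_total_w) < 1e-9
    print(f"CPU {cpu_w:.0f} W + iGPU {igpu_w:.0f} W + rest {rest_w:.0f} W "
          f"= {amd_system_total_w:.0f} W")
```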

It's not rocket science.
 