• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

AMD Zen 2 (Ryzen 3000) - *** NO COMPETITOR HINTING ***

Soldato
Joined
4 Jul 2012
Posts
16,911
My 3900X box has a seal label on it, and the only place it shows a boost clock speed it says 4.6GHz max boost. So that's the most people should expect, and it's not an excuse: other factors do come into it.

A MacBook Pro with a 6-core Intel CPU will never hit its max boost speed, and will throttle a lot due to the tiny cooler it uses.
Now, are Intel to blame for Apple not hitting the max rated boost speed on their CPU, or are Apple?

Yes, a CPU should get close to its max boost speed if the right conditions are met. Does Intel hit boost speeds on the supplied cooler? (No, as they don't supply one.)
Do all Dell PCs hit max boost speed on Intel?
Does Apple hit boost speed on all products?

There are a lot of reasons why a CPU, a GPU or even memory can't hit its rated speeds, and let's hope that on the 10th of September AMD do give us users the extra speed on our shiny new CPUs.
Apple are Intel's customer with regards to the CPU and the specs. Apple buys a tray of CPUs from Intel that has no marketing material printed on it. It's then down to Apple to determine how those specs might change dependent on how Apple uses Intel's parts in their own machines.

When we buy a processor in a box, we are the customer, so what AMD tells us and puts on their box is reasonable for us to expect the product to be able to actually achieve. A 3900X 4.6GHz MAX boost should mean that AMD has certified that the particular SKU will boost up to 4.6GHz maximum with the included cooler.
 
Soldato
Joined
14 Apr 2009
Posts
4,816
Location
Cheshire
Mate. What on earth? I can only assume you're young.

I was giving reasons why max boost speeds are not always guaranteed.

There are a lot of CPUs out there that don't hit their max boost speeds, and I gave reasons why.

And I'm 39, so not young![/QUOTE]
Wow.

It's just that you cited MacBooks and think it acceptable that MacBooks advertise a speed achievable only momentarily before the chip downclocks, and you seem to think those justifications are acceptable because you (as many of us do) understand the engineering limits around thermal throttling.

Me... if I buy something that says up to x, I expect to get x easily if I observe some simple and achievable requirements.

Macs are a joke, and Apple and Intel should be hauled over the coals for misleading marketing.

If I can't buy an approved X570 mobo (which essentially all of them are), a decent fan and some compatible RAM and get the advertised speed, then to me I've been lied to.

I don't think it is acceptable, but whether I'm arsed enough to do anything about it is a different matter. 100MHz here or there I don't care about, tbh.

"Up to" should, in most cases, hold for a long duration, i.e. most of the time.
 
Associate
Joined
24 Feb 2010
Posts
213
I'm meant to use slots 2 and 4 for dual channel; the thermal paste fell on slot 3. I'll try cleaning it up when I get home, since I have 90% alcohol.

So last night I tried a fixed CPU voltage and all-core frequency, and my latency dropped to 63.1ns; changing back to auto CPU voltage, latency went back up to 64+ns.
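Figures like 63-64ns come from a memory-latency benchmark (AIDA64 reports numbers in that range), and the idea behind such a measurement can be sketched as a pointer chase through a randomly permuted array, so the prefetcher can't hide the latency of each dependent access. This is an illustration of the method only; in Python the printed number is dominated by interpreter overhead, so don't read it as a real DRAM latency:

```python
import random
import time

def measure_chase_latency(n=1 << 20, hops=1 << 20):
    """Chase pointers through a random cycle over n slots so each load
    depends on the previous one and the prefetcher can't predict it.
    Returns nanoseconds per dependent access. In Python this is
    dominated by interpreter overhead; a real tool does this in
    native code over a buffer larger than the caches."""
    perm = list(range(n))
    random.shuffle(perm)
    # Link the permutation into a single cycle covering every slot.
    chain = [0] * n
    prev = perm[-1]
    for cur in perm:
        chain[prev] = cur
        prev = cur
    idx = 0
    start = time.perf_counter()
    for _ in range(hops):
        idx = chain[idx]  # each access depends on the last
    elapsed = time.perf_counter() - start
    return elapsed / hops * 1e9

if __name__ == "__main__":
    print(f"{measure_chase_latency():.1f} ns per dependent access")
```

A fixed voltage/frequency helping by a nanosecond or so is plausible, since auto boost behaviour can add clock transitions while the benchmark runs.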
 
Soldato
Joined
13 Jun 2009
Posts
6,847
Because intel's CPUs always hit their advertised speeds.
No, they always hit their base clocks and will hit their boost clocks as long as various conditions are met. Sound familiar? There's a reason base clocks and boost clocks are advertised separately.

The AMD Ryzen 3000 lineup is a complete failure for me.
Yeah we know, you mention it in every post.

Two rebrands with Vega graphics
What? The "G" chips are always one generation behind, they aren't rebrands because no Zen+ 12nm APUs existed before this.
two 3600X and 3800X chips that are not needed at all
True, kinda the same with a lot of product line-ups sadly. If you're concerned about SKU bloat though, check out how many Coffee Lake-S chips there are (which are of course still using essentially Skylake cores). There's rather a lot of nVidia Turing SKUs too considering half of them are in the "why are you spending this much you shmuck" category.
one 12-core offer that is below the expected silicon quality.
What?
Which means out of seven offers, only two make any sense - 3600 and 3700X.
I don't see why the R9 3900X doesn't "make any sense" but yes, the R5 3600 and R7 3700X are the two "mainstream" chips that make sense for most people.

So, instead of properly binning all the chiplets and segmenting them in different products, they chose to put bins of different quality under the same moniker - the flagship 3900X.
What does "properly" binning mean? The R9 3900X supposedly has one higher-quality chiplet compared to the other, which is simply efficient when you consider how boost clocks work (only a few cores ever need to clock to a certain point). Yes, it means overclocking potential is reduced, but overclocking has been dead for years: Ryzen has never been worth overclocking (except maybe the R7 1700), and with Intel you can barely squeeze anything beyond simply enabling MCE. Also, it's not as if the R9 3900X is the flagship now only to be superseded later; we have always known the R9 3950X is the flagship, since it was announced alongside all the others.

You are simply ignorant of a lot of details on this topic.
 
Joined
2 Jan 2019
Posts
617
Has anyone tried manually testing each individual core on a 3600 to see how high it can be clocked and successfully pass a 3-minute Ryzen Master stress test?
I've done this for a 3600X and each core could hit and hold 4.5GHz+ for the entire run.
Unless and until I see a 3600 hitting those clocks, then the 3600X is still a justifiable purchase at only £20 more than a 3600 and with a better cooler.

Edit: the rationale being that the 3600X contains the 3900X's good chiplets, and the 3600 contains the 3900X's crap chiplets, of which we know there to be one of each.

Edit2: pre-release it may have been correct to assume that the 3600 and 3700X were just downclocked little brothers, based primarily on how we knew earlier generations of Ryzen behaved. This latest generation does not exhibit the same traits, so the assumptions no longer hold true.
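Ryzen Master handles the per-core pinning and stress natively on Windows; the same idea can be sketched on Linux by pinning a load to one core at a time and reading that core's reported clock. A rough, Linux-only sketch: the busy loop stands in for a proper stress test (nothing like the 3-minute Ryzen Master run above), and the /proc/cpuinfo reading is a momentary snapshot rather than a sustained measurement:

```python
import os
import re
import time

def parse_mhz(cpuinfo_text):
    """Extract every 'cpu MHz' reading from /proc/cpuinfo-style text."""
    return [float(m) for m in re.findall(r"cpu MHz\s*:\s*([\d.]+)", cpuinfo_text)]

def stress_core(core, seconds=1.0):
    """Pin this process to one logical core, spin for `seconds`, then
    report the clock that core is running at."""
    os.sched_setaffinity(0, {core})  # Linux-only call
    deadline = time.monotonic() + seconds
    x = 0
    while time.monotonic() < deadline:
        x += 1  # trivial busy work to pull the core up to its boost clock
    with open("/proc/cpuinfo") as f:
        freqs = parse_mhz(f.read())
    return freqs[core] if core < len(freqs) else float("nan")

if __name__ == "__main__":
    # Demo on the first few available cores to keep the run short.
    for core in sorted(os.sched_getaffinity(0))[:4]:
        print(f"core {core}: {stress_core(core):.0f} MHz")
```

On a single-threaded load like this, whichever core holds the best-quality silicon should report the highest clock, which is exactly the per-core difference being discussed.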
 
Associate
Joined
21 Sep 2018
Posts
895
Has anyone tried manually testing each individual core on a 3600 to see how high it can be clocked and successfully pass a 3-minute Ryzen Master stress test?
I've done this for a 3600X and each core could hit and hold 4.5GHz+ for the entire run.
Unless and until I see a 3600 hitting those clocks, then the 3600X is still a justifiable purchase at only £20 more than a 3600 and with a better cooler.

Edit: the rationale being that the 3600X contains the 3900X's good chiplets, and the 3600 contains the 3900X's crap chiplets, of which we know there to be one of each.

Edit2: pre-release it may have been correct to assume that the 3600 and 3700X were just downclocked little brothers, based primarily on how we knew earlier generations of Ryzen behaved. This latest generation does not exhibit the same traits, so the assumptions no longer hold true.

If you are into benching, the X version is what you need. If you're just gaming, what matters most is tuned RAM, and RAM tuning on both is about the same. With tuned RAM on both, the FPS difference between the two versions is not noticeable.
 
Soldato
Joined
14 Sep 2009
Posts
9,203
Location
Northumberland
First time I've given my 3700 a workout last night. Could convert about 50 FLAC files in under 10 seconds. Did about two hundred in total. Took me twenty times longer to go through and check all the details and locations were right. Big step up over the 4690K. :)
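Batch transcoding parallelises almost perfectly, which is why an 8-core/16-thread chip chews through a job like this. A hedged sketch of that kind of batch conversion; it assumes ffmpeg is on the PATH, and the command line is illustrative rather than taken from the post:

```python
import subprocess
import sys
from concurrent.futures import ThreadPoolExecutor
from functools import partial
from pathlib import Path

def convert_cmd(src, dst_dir):
    """Build an ffmpeg command converting one file to FLAC.
    Illustrative only: the post doesn't say which tool was used."""
    dst = Path(dst_dir) / (Path(src).stem + ".flac")
    return ["ffmpeg", "-y", "-i", str(src), str(dst)]

def convert_one(src, dst_dir):
    """Run one conversion; returns the encoder's exit code."""
    return subprocess.run(convert_cmd(src, dst_dir),
                          capture_output=True).returncode

def convert_all(files, dst_dir, workers=16):
    """Run conversions in parallel. Threads suffice here because the
    heavy lifting happens in the spawned ffmpeg processes, so 16
    workers keeps all 16 threads of an 8c/16t chip busy."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(partial(convert_one, dst_dir=dst_dir), files))

if __name__ == "__main__":
    codes = convert_all(sys.argv[1:], ".")
    print(f"{len(codes)} files processed")
```

With one encode per thread, 50 short files finishing in under 10 seconds is roughly what you'd expect from this layout, versus a 4-core 4690K running a quarter as many encodes at once.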
 
Soldato
Joined
17 Jan 2005
Posts
8,555
Location
Liverpool
Well I was going to sit down tonight and finally play some games on my 3700x, but I go to turn my PC on and I just get a clicking noise from the PSU and no signs of life. Turn it off on the PSU and back on again and it boots so I jump into a game and it powers off after 30 seconds and just clicks. Some testing later and it looks as though my PSU has given up the ghost, typical! :(
 
Caporegime
Joined
17 Mar 2012
Posts
47,662
Location
ARC-L1, Stanton System
Not really the right thread for this but found this on GAF

https://medium.com/performance-at-intel/real-world-performance-ifa-without-compromise-9f2dff21c277

Seems AMD is really getting to Intel :)

Ryan Shrout :rolleyes:


Posted in the comments section.

“Real World Performance”? Ryzen 3000 is faster in rendering, encoding and compiling, wherever Intel's compiler doesn't deliberately gimp anything other than Intel. Those are real-world applications, and because you've lost the performance crown you want to redefine real-world performance as web browsers and MS Word. If that's all you do with your computer, why would you need a 9900K? While in measurable terms those things might be faster, the actual difference you see or feel is no different between a 9900K and a $50 G4560 or a Ryzen 1200.

This is desperate stuff, Intel. My $200 Ryzen 3600 blows a $250 9600K out of the water in any “real-world application” I use, and it's as fast as a $400 8700K. Firefox and MS Word are instant, and while maybe technically faster on an 8700K, it's like measuring the speed of light over a distance of 1m versus 2m: you would never see that difference in real life. What I do, in the real world where it matters, takes minutes and hours, and having Ryzen versus anything at a similar cost from Intel shaves minutes and hours off any given real-world task.

How about you fight AMD, a company one tenth your size, by improving your products instead of refreshing the same old crap over and over again?

This level of marketing is desperate twaddle, and quite obviously so, Ryan. It's not a good look.
 
Caporegime
Joined
17 Mar 2012
Posts
47,662
Location
ARC-L1, Stanton System
*Right thread this time, too many Ryzen threads*

Installed the update, no change...

Just out of interest, I ran the Far Cry 5 benchmark to compare with online reviews of my 3600, to see if my memory overclocking actually makes a difference.

At 720p, to make sure the GPU is not the bottleneck; other than that, everything is the same as Tom's Hardware. And, well, yes, the difference is huge... even though my RAM, while better than stock, is still not that good.

Toms: 82/109
Mine: 94/120 (+15% on the minimums)
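The quoted +15% is just the relative gain on the minimum FPS figures. As a quick sanity check (assuming, per the note above, the first number in each pair is the minimum and the second the average):

```python
def uplift(before, after):
    """Percentage gain of `after` over `before`."""
    return (after - before) / before * 100

# Min/avg FPS pairs quoted above (assumed ordering: minimum first).
toms = (82, 109)   # Tom's Hardware, stock memory settings
mine = (94, 120)   # tuned memory
print(f"minimums: +{uplift(toms[0], mine[0]):.0f}%")  # +15%
print(f"averages: +{uplift(toms[1], mine[1]):.0f}%")  # +10%
```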


Toms Hardware


https://www.tomshardware.com/reviews/amd-ryzen-5-3600-review,6287-6.html

More evidence of this.

Gamers Nexus did some manual DRAM-Calc-tuned RAM testing.

As usual with Gamers Nexus, like a data dump, they filled their slides with masses of unnecessary data. While having the performance difference between about 10 different memory kits across 2 different motherboards is nice, I guess... the only thing that matters is the difference in performance between XMP and manual DRAM Calc tuning.
Luckily those results are all at the top, so we don't need to dig them out from amongst the mass of useless data.
This slide is one of the better ones at demonstrating that difference, but they are almost all pretty good.

So, ignoring the 4200MHz and 2:1 FCLK ratio results (again, why are they even there?):

3800MHz XMP: 113 FPS
3800MHz Manual: 127 FPS (+13%)

So it seems, again as with Ryzen 1000/2000, that whatever RAM you have, if you spend some time with DRAM Calc and tune your timings, you can get a significant uplift in gaming performance. In CPU speed terms, 13% is like the difference between 4.4GHz and 5GHz.


 
Caporegime
Joined
17 Mar 2012
Posts
47,662
Location
ARC-L1, Stanton System
Ryan Shrout :rolleyes:


Posted in the comments section.

Seems I earned a response from him...

Hi, thanks for reading this story, though clearly we don’t see eye to eye much, yet. Here are some thoughts for you though.

My definition of “Real World Performance” is one that centers around looking at the RIGHT workloads for each segment of computing. You mention rendering (I assume offline ray tracing rendering like Cinema 4D) for example. I stated in our event at IFA that this post references that I do not believe running Cinebench is a good way to measure the user experience of a thin and light notebook as that isn’t a workload that people *actually do* on those machines. I did openly state that if you wanted to use that benchmark for testing X-series and W-series products, that seems appropriate. Cinebench is not a bad test, and Maxon’s Cinema 4D is a leading production quality tool, but using the right workloads to test each platform is critical to making the best buying decision for each user.

I would also posit that you would in fact see a difference in real world workloads like productivity and browsing when comparing a Core i9–9900K and a Ryzen 3 1200 CPU. It’s something I’ll try to find a way to quantify later perhaps.

There are a lot of factors that went into YOUR buying decision, it sounds like, and if you are happy with the purchase, I’m happy for you. Some people put more emphasis on having the absolute best and highest level of performance, others are content to balance budget and capability. I believe that Intel offers those options too; consider the 9700K as a great alternative to the 9900K to save some money, while keeping the gaming performance leadership over the competition.

You mention (in different words) that we should get to action and improve on the products and technology we offer to consumers, and I couldn’t agree more. Since I joined Intel I have been a big proponent of roadmap adjustments, shifts in product strategy to better address enthusiasts and gamers, marketing that makes sense and doesn’t insult the community, and being MORE DIRECT with this crowd (hence this blog). I can tell you that we absolutely have changes in the product coming, and new products coming out that will offer consumers in the desktop, enthusiast, and mobile spaces performance and value. (I left data center out as I haven’t done much work in that group yet.)

I’m sorry you view this blog and the attempt to set the stage for communicating with this audience as foolish; that’s not what I want. As I said in a previous comment, though we won’t be able to convince everyone to shift their mindset, please know that I take these comments seriously, and it is one of the primary goals of the company to do so as well.

Not sure I have a response to that; it's saying a whole lot of nothing. I'll think about it.
 