• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

RYZEN 5000 SERIES NOW ONLINE - 5950X, 5900X, 5800X & 5600X COMING NOV 5TH AT 5PM **NO COMPETITORS**

@MrPils don't you think AMD will release Ryzen 5600 and 5700x processors in future then?

If they have leftover dies that aren't selling, yes. If they can collect enough cores over time that don't make the 5800x bin, or that can't have two cores lopped off to make the 5900x bad-CCD bin, yes. Out of the goodness of their hearts? No.

I think they'd rather position the leftover Ryzen 3000 chips to fill that performance and value gap. A 3900x(t) slots in nicely to the potential 5700x cost and performance bracket, a 3700x or 3800x(t) slots in just as nicely below the 5600x, and they have plenty of stock of those. The 3600x and 3600 can round out the low end. The YouTube info posted earlier by the guy who has the 5600x proved that: when manually overclocked, his 5600x was just slightly faster than the 3700x in multicore and was knocking on the door of the 3800x. Single core (and therefore gaming) will come down heavily on the side of the 5000 generation CPUs.

The only pressure to make a value 5000 series will come from OEMs who want to sell their budget prebuilts with a Ryzen 5000 cpu to ride the coattails of success. For that reason alone I think we will see a 5600 at least appear at some point but it could take a while to make it into retail.
 
I can answer with what I have first-hand info on, but I can't help with the decision as I'm in a similar position to you myself. In order to get my 3900XT (which was advertised as an improvement on the 3900X) to beat my manually clocked 4.4GHz all-core 3800X, I had to disable multithreading to get the core clocks up over 4.5GHz on the "good" CCXs on the better-binned CCD, while my "bad" CCXs languish at 4.3GHz. It took a clock speed advantage over my 3800X of around 150MHz on the good half to overcome a clock speed deficit of 100MHz on the bad half of the 3900XT. Disabling multithreading gave around a 100MHz boost when overclocking; without that 100MHz boost on my good CCD I lost frames in games compared to a flat 8 cores at 4.4GHz from the 3800X. I never tested my 3800X with multithreading disabled - I probably should have, but I was already miffed enough at my £520 3900XT purchase at the time. With a near 20% IPC increase and improved clock speeds, the performance gap should in theory now be wider, as each MHz is "worth more".
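To put rough numbers on that tradeoff, here's a back-of-envelope sketch using the clocks quoted above. The "effective gaming clock" metric is just a crude proxy I've made up for illustration (the mean clock of the 8 fastest cores a game would likely be scheduled on), and it ignores the cross-CCD latency penalty entirely:

```python
# Back-of-envelope sketch of the 3900XT vs 3800X tradeoff described above.
# Clocks are the ones quoted in the post; "effective gaming clock" is a crude
# made-up proxy (mean clock of the 8 fastest cores a game would land on) and
# ignores the cross-CCD latency penalty entirely.

def effective_gaming_clock(core_clocks_ghz, threads=8):
    """Mean clock (GHz) of the fastest `threads` cores."""
    fastest = sorted(core_clocks_ghz, reverse=True)[:threads]
    return sum(fastest) / len(fastest)

ryzen_3800x = [4.4] * 8                    # flat 4.4GHz all-core overclock
ryzen_3900xt = [4.55] * 6 + [4.3] * 6      # good CCXs ~4.55GHz, bad CCXs 4.3GHz

print(round(effective_gaming_clock(ryzen_3800x), 4))   # 4.4
print(round(effective_gaming_clock(ryzen_3900xt), 4))  # 4.4875
```

That's only around 90MHz of average clock advantage for the 3900XT across its best 8 cores, which is exactly the kind of margin that cross-CCD traffic can wipe out.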

I also found the extra power requirement added 20°C or more to VRM temperatures, and the CPU draws around 40 watts extra at the wall in a direct-swap situation. The performance gain was hard work and minimal.

The decision of which to get for gaming will boil down to how much worse the 5800x CCD is compared to the good CCD on the 5900x. Out of the box the 5900x will be slightly faster due to the large clock speed advantage. In games that benefit from more than 8 physical cores it will be a lot faster, however not many do (Total War comes immediately to mind as an example). If we find the 5800x can overclock to 4.7-4.8GHz all-core then I can see it beating the 5900x even if the 5900x is 100MHz faster on its best core. Ultimately you're comparing 6 slightly faster and 6 slower cores to 8 fast cores. When you add in the cross-CCD transfers it could get messy.

If I had to make a statement on it now, going just from what we know so far, it would be that the 5900x should always beat the 5600x in gaming and the 5950x should always beat the 5800x when all the CPUs are manually overclocked. I suspect some crossover between the 5800x and 5900x where manual overclocking is concerned, as there is only a 100MHz out-of-box difference in binning between them. We really won't know for sure until they get out in the wild. Putting my neck on the line here a bit lol.

These have been very carefully specced and marketed that's for certain.

I never really take OC into account when comparing stuff. Maybe that's a bad thing, I just always go stock for stock. There is always variance in silicon quality. My 6700k takes more voltage for 4.5 than some do for 4.7.
I wasn't aware the 5900 was going to be 6 cores better than the 5800, and 6 worse. I just assumed they were 12 of the same.
What about single core? If that's as important for gaming as I read, how do you think the stronger single core of the 5900 vs the 5800 will compare? Or is that again just down to the 100MHz?
I'm definitely not stepping up to the 5950!

There certainly seems to be more buzz around the 5900 than the 5800, with a lot of people ruling the 5800 out due to it being too closely priced to the 5900. So if that is AMD's marketing at work, they have done well.

Would be nice if some of the bigger reviewers have the videos already made, ready to upload when the NDA ends. Otherwise I'm going to wing it and take a punt on a 5900 I think, and just hope I don't regret it.
 
Literally right in the manual - it's well known that the more sticks you have, the slower they run. I was actually wrong on this though - with 4 sticks fitted the max speed is "only" 3600MHz:

(screenshot of the manual's memory speed table)
Yes, that's why I went 2x 16GB Corsair Vengeance RGB Pro 3600MHz + 2x Dummy Corsair RGB Vengeance modules. So I have the speed + the looks of 4 modules at once :D

Damn, I missed that. Am planning to fit 4 x 16GB into my Tomahawk. Out of interest, I tried looking up the X570 Unify manual to see if it has the same limitation, but it doesn't seem to list it.

The dummy idea is a good one!
 
I have a quad kit (4x8GB) in my X570 Tomahawk and I've managed to get 3800MHz with 1900MHz IF, at CL 14-15-14-28-288 1T with tight secondary and tertiary timings, and stable too.
 
Damn, I missed that. Am planning to fit 4 x 16GB into my Tomahawk. Out of interest, I tried looking up the X570 Unify manual to see if it has the same limitation, but it doesn't seem to list it.

The dummy idea is a good one!
I have the MSI MEG Unify x570

New BIOS out today - https://www.msi.com/Motherboard/support/MEG-X570-UNIFY#down-bios

Version: 7C35vA75 (Beta version)
Release Date: 2020-11-03
File Size: 18.4 MB
Description: Updated AMD AGESA ComboAm4v2PI 1.1.0.0 Patch C
 
I never really take OC into account when comparing stuff. Maybe that's a bad thing, I just always go stock for stock. There is always variance in silicon quality. My 6700k takes more voltage for 4.5 than some do for 4.7.
I wasn't aware the 5900 was going to be 6 cores better than the 5800, and 6 worse. I just assumed they were 12 of the same.
What about single core? If that's as important for gaming as I read, how do you think the stronger single core of the 5900 vs the 5800 will compare? Or is that again just down to the 100MHz?
I'm definitely not stepping up to the 5950!

There certainly seems to be more buzz around the 5900 than the 5800, with a lot of people ruling the 5800 out due to it being too closely priced to the 5900. So if that is AMD's marketing at work, they have done well.

Would be nice if some of the bigger reviewers have the videos already made, ready to upload when the NDA ends. Otherwise I'm going to wing it and take a punt on a 5900 I think, and just hope I don't regret it.

I honestly don't think you'll regret the 5900x, it's going to be an excellent CPU regardless. AMD have just taken maximum advantage of the fact that they don't advertise an all-core clock speed beyond the CPU base clock. The good CCD / bad CCD thing obviously isn't something AMD wants to advertise, but if you look at the many HWiNFO screenshots around for Ryzen 3900(x/xt) and 3950x CPUs you'll see that it's the case. Sometimes it was quite mixed up with the 3900x and you got 3 good and 3 bad cores on each CCD - that was about the worst result possible, as it led to constant swapping between CCDs while the Windows scheduler tried to keep workloads on the fastest cores with power budget available to boost.

This is why it's so important to get the AMD chipset driver on: it overrides this behaviour, takes into account the latency between CCDs, and will only allow workloads to move where there is a considerable advantage to be gained. It's programmed to be aware of the latency penalty of moving between CCDs, whereas the Windows scheduler has no idea at all. The 3900XT I have only ever seen as 3+3 good and 3+3 bad, and the 3950x as 4+4 good and 4+4 bad. That's die binning at work.

The real giveaway is when you see the 5950x results compared to the 5900x in AMD's presentation. Why would it be slower than the 5900x when it has higher clock speeds available? All those cores mean it only has the power budget to boost higher than the 5900x when the workload is requesting the same number of threads. When you look at the game slides side by side, in the games that lean on very low or very high thread counts the 5950x wins out; everywhere in between, the 5900x wins out. If that 4.9GHz CCD was on the 5800x instead it would destroy everything at up to 16-thread workloads. We all know it's a rare game that benefits from more than 16 threads, and it's likely to continue to be rare into the future.

So how do you stop the 5800x from taking sales off your 5900x? You put the best CCDs onto the good side of the 5950x instead, put lowball CCDs onto its bad side, and everything in the middle goes onto the 5800x. Meanwhile you do the same with your CCDs with failed/bad cores by disabling two of them and applying the same construction technique to the 5900x and 5600x. This "optimises" profits further: the worst and therefore lowest-value CCDs, which are too slow to put on the 6 or 8 core CPUs, get used up without you needing to create a lower-priced CPU to get rid of them. You then adjust prices so you make the same amount of profit based on the value of the die you put in each CPU. This positions the 5800x close enough to the 5900x to tempt in a lot of potential 5800x buyers, and anyone who still buys the 5800x is still giving you the same profit as a 5900x sale. Meanwhile the 5950x becomes a halo product for anyone chasing maximum single core performance as well as anyone looking for true multicore performance.
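Purely to illustrate the allocation scheme I'm describing (this is my speculation about AMD's binning, not anything AMD has confirmed, and the function name and quality scores are made up), the sorting logic looks something like this:

```python
# Toy sketch of the speculated binning scheme: sort CCDs by quality, pair a
# top bin with a lowball bin for each dual-CCD part, and sell the middle of
# the distribution as single-CCD parts. Scores and the pairing rule are
# illustrative assumptions, not a documented AMD process.

def bin_ccds(quality_scores, dual_ccd_parts):
    """Pair the best CCD with the worst for each dual-CCD CPU,
    leaving the mid-range dies for the single-CCD CPU."""
    ranked = sorted(quality_scores, reverse=True)
    pairs = [(ranked[i], ranked[-(i + 1)]) for i in range(dual_ccd_parts)]
    singles = ranked[dual_ccd_parts:len(ranked) - dual_ccd_parts]
    return pairs, singles

scores = [97, 95, 93, 90, 88, 85, 83, 80]   # made-up per-CCD quality scores
dual, single = bin_ccds(scores, 2)
print(dual)    # [(97, 80), (95, 83)] -> "good side" + "bad side" pairs
print(single)  # [93, 90, 88, 85] -> mid-range dies for the single-CCD part
```

The same pairing applied to the defective-core (6-core) CCDs gives you the 5900x and 5600x split, which is why the single-CCD parts end up with the unremarkable middle of the distribution.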

If the 5600x and 5800x got the best binned single CCDs, AMD would not be able to shift all their lower quality CCDs on the 12 and 16 core parts, as in a lot of desktop tasks those would be slower. A higher price for less performance is hard to market to the average consumer, and that's where the buying power is. Desktop parts in the upper price range are mostly for gaming, therefore the most expensive desktop parts have to be the best for gaming out of the box. It's very clever, but it's not good for us - it's good for AMD and their profits. It's also why you don't release reviews until the product is available to buy. You also make sure you're not sitting on enough stock to fulfil demand and let FOMO take care of the rest. And so the value manufacturer perceived as consumer friendly becomes premium, while everyone cheers that the underdog has won and laps it up.

If you think about it, AMD need not have released the 12 and 16 core parts at all to beat Intel; they could have just released the 6 and 8 core parts, with the dual-CCD CPUs following when demand had died down. They need the perfect storm to maximise profits - that's the only motivation behind releasing 12 and 16 core parts when they already have the performance crown over Intel in the multicore arena. Lisa Su is somebody I would never like to cross.
 
I went for the Ballistix 3600 RGB - different chips, but we shall see how well it fares.

Good to hear your results on the Tomahawk!

Pretty certain that's Micron E-die. It'll do 3600C14 without too much hassle. Rumour has it the FCLK will break 1900MHz this time around, so expect to see some super quick benchmarks pop up over the next few days.
 
I think yours is the newer ripped kit from Team Group and 8Pack. Not 100% sure of the differences though - mine's called the Xtreem 8Pack Edition, and I think mine is last year's model? Someone may be able to correct me though as I'm not sure lol.

I picked up the same RAM about a week ago in preparation for the new AMD release.
 