
AMD THREADRIPPER VS INTEL SKYLAKE X

Now there's actual competition they're slowly realising there's actually a lot more involved here. How does Ryzen's infinity fabric work? How does RAM speed and latency work with it?

And it seems a lot of them still don't, as some of the X299 CPU reviews seem to have Ryzen scores taken on old BIOSes, or at least at slower memory speeds than you can hit now. So I'm guessing they either don't have access to Ryzen chips any more or just can't be bothered to update their tests. Why bother, eh?
 
It's strange because it seems like tech reviewers have collectively forgotten what it's like to have a competitive industry. They are used to rolling out their test Windows installs, updating chipset drivers, and running the same bunch of programs as last year. They show the 5% bigger bar in the graph and praise be to Intel, job done.

Now there's actual competition they're slowly realising there's actually a lot more involved here. How does Ryzen's infinity fabric work? How does RAM speed and latency work with it? What happens on Linux? What programs are badly optimised for non-Intel CPUs? What compilers should we use for each system? What GPUs work best with different CPUs? What's more important, average or minimum frames? etc.

In all honesty it's impossible to come to a definitive, matter-of-fact conclusion with so many variables. All you can do is choose the few that are most important to you and test (or find other people's tests of) those variables to make a decision.

Spot on.

Now a person actually has to look specifically at their own use cases across multiple different reviewers to draw any conclusions about how the new tech will perform for them.

They can't simply use the same test methodology any more, and being thorough under Intel's review time constraints doesn't help either.

I look forward to the next few months though; we'll see which reviewers actually put in the most effort for once and try to be thorough.
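
On the "average or minimum frames" question quoted above, here is a minimal sketch of how the two metrics are usually derived from a frame-time log (the frame times below are made-up, illustrative values, not benchmark data):

```python
# Illustrative frame-time log in milliseconds (made-up values, not a benchmark).
frame_times_ms = [16.7, 16.9, 17.1, 16.6, 40.0, 16.8, 17.0, 35.5, 16.7, 16.9]

# Average FPS: total frames over total time.
avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)

# Minimum FPS: the frame rate implied by the single slowest frame,
# which is where stutter and hitching show up.
min_fps = 1000 / max(frame_times_ms)

print(f"Average FPS: {avg_fps:.1f}")  # ~47.6
print(f"Minimum FPS: {min_fps:.1f}")  # 25.0
```

Two runs can have near-identical averages and very different minimums, which is exactly why picking the metric that matches your own use case matters.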
 
The 1800X is just a slightly overclocked 1700 with a marginally higher turbo, and the 1700 is £320. The 7820X is £540. Looking at it like that, the cost of the two chips is a huge deciding factor too. You're not really "dropping down" to anything given the 1800X and 1700 are the same silicon.
No, it IS £282 ;) if you shop around you will find many brand new at this price...

OCUK was selling them for sub-£300 up to last week as well. The same applies to the 1800X: it was going for less than £400, and now it's miraculously £440.
IMHO it's "Intel inflation" pushing up the prices of AMD products at the retailers' hands...
 
Depending on how Skylake-X looks, it'll be between the £280 1700 and the 6c/12t Intel if they release it at under £300. Higher IPC and clockspeed with a decent chipset and RAM speeds might make Skylake-X worthwhile.
 
Depending on how Skylake-X looks, it'll be between the £280 1700 and the 6c/12t Intel if they release it at under £300. Higher IPC and clockspeed with a decent chipset and RAM speeds might make Skylake-X worthwhile.

The 7800X is £340 by the looks of things (may vary slightly between vendors).
 
Depending on how Skylake-X looks, it'll be between the £280 1700 and the 6c/12t Intel if they release it at under £300. Higher IPC and clockspeed with a decent chipset and RAM speeds might make Skylake-X worthwhile.

There is no IPC increase from Skylake to Skylake-X, and the 4-core X299 chip is already well over £300, so don't expect the 6-core to be under £300.
 
As do I!

I also want to know how well those new Noctua coolers will perform on it. I'm eyeing up Threadripper now for my next workstation. I want to see how it competes with Skylake X and Broadwell-E.

Heck, if neither is a big jump up, I might just snag a second-hand 5960X/6900K or 6950X and plop it into my motherboard, then wait another year or two.

As it stands though, the 7900X is not doing too well in Tom's (I Love Intel) Hardware review for my friend's and my needs.
I actually expected the 7900X to do far better in these types of tests, especially the Handbrake ones.

http://www.tomshardware.com/reviews/intel-core-i9-7900x-skylake-x,5092.html
[Attached: benchmark charts from the Tom's Hardware i9-7900X review]

Doesn't this actually show poor performance in effect?

That's 10 4.3 GHz cores from Intel vs 8 3.7 GHz cores from AMD.

With 2 more cores, 16% more clock, and (supposedly) more IPC, shouldn't it be doing better than that?

Also with cooling being an issue even at 10 cores, surely the 16 and 18 cores from Intel are going to HAVE to run significantly lower clockspeeds?

Methinks Threadripper is going to end up the king of multi-thread performance. Not including price even, just absolute performance.
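
As a rough sketch of what that core-count and clock advantage looks like on paper (raw cores x clock only, assuming equal IPC and perfect scaling, which is generous to Intel):

```python
# Naive cores x clock comparison from the figures above -- ignores IPC,
# the memory subsystem and scaling losses, so it's an on-paper upper bound.
intel_cores, intel_ghz = 10, 4.3   # Core i9-7900X
amd_cores, amd_ghz = 8, 3.7        # Ryzen 7 1800X

clock_advantage = intel_ghz / amd_ghz - 1
paper_advantage = (intel_cores * intel_ghz) / (amd_cores * amd_ghz) - 1

print(f"Clock advantage: {clock_advantage:.0%}")                 # ~16%
print(f"On-paper throughput advantage: {paper_advantage:.0%}")   # ~45%
```

If the benchmark gaps are much smaller than ~45%, something (scaling, memory, thermals) is eating the advantage.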
 
That's 10 4.3 GHz cores from Intel vs 8 3.7 GHz cores from AMD.

The problem is the scaling as you add those extra cores, necessitating longer pipelines/queue depths, etc. Two more cores and extra clock speed will likely lead to a significant amount of extra heat, higher inter-core latency, etc., and less performance scaling than on paper.
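
As a minimal Amdahl's-law style illustration of why extra cores buy less than linear scaling (the 95% parallel fraction is an assumed, illustrative number, not a measurement of any real workload):

```python
def amdahl_speedup(cores: int, parallel_fraction: float) -> float:
    """Ideal speedup when only part of the workload parallelises (Amdahl's law)."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# Assume 95% of the work parallelises -- illustrative, not measured.
for n in (8, 10, 16, 18):
    print(f"{n} cores -> {amdahl_speedup(n, 0.95):.2f}x")
# 8 -> ~5.9x, 10 -> ~6.9x, 18 -> ~9.7x: 80% more cores buys roughly 40% more
# speed, before the extra heat and latency are even accounted for.
```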
 
Doesn't this actually show poor performance in effect?

That's 10 4.3 GHz cores from Intel vs 8 3.7 GHz cores from AMD.

With 2 more cores, 16% more clock, and (supposedly) more IPC, shouldn't it be doing better than that?

Also with cooling being an issue even at 10 cores, surely the 16 and 18 cores from Intel are going to HAVE to run significantly lower clockspeeds?

Methinks Threadripper is going to end up the king of multi-thread performance. Not including price even, just absolute performance.

Supposedly, these chips are hitting the wattage limit of the socket. One review had an overclocked 7900X hitting over 300W. If the 16- and 18-core parts can somehow clock really well, I would expect them to consume around, if not greater than, 400W.
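
A crude linear-per-core extrapolation from that ~300W overclocked 7900X figure shows why (this ignores uncore/mesh power, binning and voltage differences, so treat it as illustrative only):

```python
# Crude per-core extrapolation from the ~300 W overclocked 7900X figure above.
# Ignores uncore/mesh power, binning and voltage scaling -- illustrative only.
measured_cores, measured_watts = 10, 300
watts_per_core = measured_watts / measured_cores   # ~30 W per core at those clocks

for cores in (12, 14, 16, 18):
    print(f"{cores} cores at similar clocks -> roughly {cores * watts_per_core:.0f} W")
# 16 cores -> ~480 W, 18 cores -> ~540 W, which is why lower clocks look inevitable.
```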
 
The problem is the scaling as you add those extra cores, necessitating longer pipelines/queue depths, etc. Two more cores and extra clock speed will likely lead to a significant amount of extra heat, higher inter-core latency, etc., and less performance scaling than on paper.

Quite possibly yes.

Although it's basically performing how you'd expect the 8-core to at those clockspeeds vs Zen.

So it's as if the extra 2 cores are dead weight.

Supposedly, these chips are hitting the wattage limit of the socket. One review had an overclocked 7900X hitting over 300W. If the 16- and 18-core parts can somehow clock really well, I would expect them to consume around, if not greater than, 400W.

Isn't there speculation the motherboard socket needs a revision to cope with that much power?

And also, if 10 cores are causing 90-100 degrees C with a 240mm AIO, how are 16 and 18 cores going to be cooled if they can run the same clockspeed?!
 

Isn't there speculation the motherboard socket needs a revision to cope with that much power?

And also, if 10 cores are causing 90-100 degrees C with a 240mm AIO, how are 16 and 18 cores going to be cooled if they can run the same clockspeed?!

Yes, I think they will need to do that.
A sub-zero phase-change PC.
 
Supposedly, these chips are hitting the wattage limit of the socket. One review had an overclocked 7900X hitting over 300W. If the 16- and 18-core parts can somehow clock really well, I would expect them to consume around, if not greater than, 400W.

300W is really phase-change territory, and not somewhere most people should be going with anything other than a very custom water system. 400W is absolute phase-change territory... in fact mine is only tuned for 350W. I say "only" because it was all I ever needed up until now. As I mentioned in another thread... a thermal wall is very, very close for Intel.
 
OK. Because I've had enough of the IPC and "high clock speed rules, ugh" malarkey, tonight I am running the 6800K at stock (RAM running 2133C15 instead of 3600C16). Not even Intel Turbo Boost activated.
The difference, even in single-core games, is almost zero. Yes, some of the cores are under a bigger load: while playing WOT/WOWS, the thread WOT was running on went from the usual 60% to 72%. At the same time, with TS running, internet radio streaming, Steam downloading in the background and 4 tabs open in Firefox, the workload was spread over the other 5 cores and 11 threads.

I even ran BF1 64-man to compare. Pfff, I lost 5 fps... (while all of the above were on as well). In The Division I played the DZ and saw zero difference in FPS.

Yes, I am gaming at 2560x1440 and I am not running circles of 1080p benchmarks to prove something. I don't have time to waste on unrealistic performance.
(And I borrowed a GTX 1080 Ti from a colleague and am playing with the curve ;) so no Fury X 1190/600 bottlenecks.)
 
OK. Because I've had enough of the IPC and "high clock speed rules, ugh" malarkey, tonight I am running the 6800K at stock (RAM running 2133C15 instead of 3600C16). Not even Intel Turbo Boost activated.
The difference, even in single-core games, is almost zero. Yes, some of the cores are under a bigger load: while playing WOT/WOWS, the thread WOT was running on went from the usual 60% to 72%. At the same time, with TS running, internet radio streaming, Steam downloading in the background and 4 tabs open in Firefox, the workload was spread over the other 5 cores and 11 threads.

I even ran BF1 64-man to compare. Pfff, I lost 5 fps... (while all of the above were on as well). In The Division I played the DZ and saw zero difference in FPS.

Yes, I am gaming at 2560x1440 and I am not running circles of 1080p benchmarks to prove something. I don't have time to waste on unrealistic performance.
(And I borrowed a GTX 1080 Ti from a colleague and am playing with the curve ;) so no Fury X 1190/600 bottlenecks.)
I never bothered overclocking my 5930K or my 1800X or any GPU, and performance has always been good. Reviews make it sound like overclocking is needed to play Tetris at 640x480.
 
I never bothered overclocking my 5930K or my 1800X or any GPU, and performance has always been good. Reviews make it sound like overclocking is needed to play Tetris at 640x480.

Exactly this! I posted over in the 7700K binned thread a while back. Even when overclocked to 5.2 GHz, the increase over stock was minimal, like within-margin-of-error stuff. The money would be better spent on the GPU. I'm not saying there isn't a place for it, but once you get over a certain limit you hit very diminishing returns. And it seems that's what Intel have done here: that extra speed equates to almost nothing over the previous gen but has brought a load of heat and power consumption with it.

Back on topic, Threadripper has got this in the bag.
 
My goodness! Alienware also mentioned their Threadrippers will be overclocked, and on all cores. I wonder what max all-core frequency they can get.

If it can hit 3.9-4.0GHz like a Ryzen R7...

It's plausible.

They don't consume that much power with 8 cores at 3.9 GHz on most chips. So double that power, I think, is manageable with watercooling.

Although even then it may not be needed because the dies are far away from each other (in chip terms) and the heatspreader has huge surface area.

I'm going to make a bold prediction (based on hope): the 16-core will do 3.9 GHz on the majority of chips (not just a golden sample) and will be able to be cooled fine, at a reasonably quiet fan speed, by the largest Noctua cooler for X399.
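
For what it's worth, here is the "double the 8-core power" estimate above as a quick sketch (the 150W figure for an R7 at ~3.9GHz all-core is an assumed, illustrative value, not a measurement):

```python
# Sketch of the "double the 8-core power" estimate.
# The 150 W package-power figure is an assumption for illustration only.
r7_cores, r7_watts = 8, 150          # assumed R7 draw at ~3.9 GHz all-core
threadripper_cores = 16

estimated_watts = r7_watts * (threadripper_cores / r7_cores)
print(f"Estimated 16-core draw at the same clocks: ~{estimated_watts:.0f} W")
# ~300 W: a lot, but spread across two dies under a much larger heatspreader.
```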
 