AMD THREADRIPPER VS INTEL SKYLAKE X

Well, Bits and Chips have been pretty good with Ryzen leaks so far and they think an entry-level 16C/32T will be $849:
https://twitter.com/BitsAndChipsEng/status/870373386391891968
So a bit more than 2 x R7-1700X. X399 motherboards are likely to cost a bit more than AM4 ones though.

Good Lord. The Asus board is more expensive :p

If the 16c/32T goes for $849, then what does the 10c/20t go for? $500?
Does that mean that, according to AMD, the new mainstream is 10-core CPUs?

(Not surprised about the Intel 7700K + Z170 firesale atm, explains a lot)
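For what it's worth, a flat price-per-core extrapolation from the rumoured $849 flagship gives a rough floor for the smaller SKUs. This is purely a back-of-envelope sketch: the $849 figure is the leak above, and real SKU pricing rarely scales linearly.

```python
# Back-of-envelope: flat price-per-core extrapolation from the
# rumoured $849 16C/32T part. Treat the results as rough guesses,
# not leaks -- vendors rarely price SKUs linearly.
PRICE_16C = 849
per_core = PRICE_16C / 16  # ~$53 per core

for cores in (10, 12, 14):
    print(f"{cores}C/{cores * 2}T -> ~${per_core * cores:.0f}")
```

By that crude maths a 10C/20T part lands around $530, so the $500 guess is in the right ballpark.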

Huh, see below on Twitter:

Bits And Chips - Eng‏ @BitsAndChipsEng


Replying to @davideneco1

According to my sources, a ThreadRipper 16C/32T (Dies+Package+Testing) cost AMD about 110-120$. 849$-110/120$ = 739/729$ Not bad. :)

I bet AMD planted a money tree and it's time to start harvesting. It cannot be explained otherwise.
And I have no issue with them making such a profit, considering the alternative offerings and their pricing!

(let's not forget the price of the 6950X, or the 6900K)
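The tweet's margin maths, spelled out. This uses only the figures claimed in the tweet (a roughly $110-120 unit cost against an $849 price), which are unconfirmed:

```python
# Gross margin implied by the tweet's claimed numbers:
# unit cost of $110-120 (dies + package + testing) vs an $849 price.
price = 849
for cost in (110, 120):
    gross = price - cost
    print(f"cost ${cost}: gross ${gross} (~{gross / price:.0%} of the price)")
```

That works out to roughly 86-87% gross per chip before R&D, marketing, and channel costs, which is why it only looks like a money tree.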
 
I love these sorts of arguments... It's like a re-run of the whole '980/1080 etc. are mid-range GPUs' debate, when they were, at the time of their respective releases, the fastest consumer GPUs by most metrics, because some forum members had decided that the dies on these GPUs were not physically large enough to qualify them as 'high end'. Personally, I have been on hex cores for years now, since my 980 Westmere...
That wasn't really the issue. It was disdain for nVidia majorly cutting down their chips and selling what they usually sold as mid range as the high end.
 
I love these sorts of arguments... It's like a re-run of the whole '980/1080 etc. are mid-range GPUs' debate, when they were, at the time of their respective releases, the fastest consumer GPUs by most metrics, because some forum members had decided that the dies on these GPUs were not physically large enough to qualify them as 'high end'. Personally, I have been on hex cores for years now, since my 980 Westmere...

What you are saying basically validates the idea that if nVidia had put out, say, the GTX 460 as a GTX 480 and called it high end, coz after all it would have been the fastest GPU out, while charging £400+ for it, that would have been fine. Yet when they do that with the 980, that is OK? You are basically justifying nVidia treating customers with contempt.
 
Yet when they do that with the 980, that is OK? You are basically justifying nVidia treating customers with contempt.

Yes, I was fine with it, as they were (for both the 980 and 1080) the fastest consumer GPUs across a wide range of metrics upon release. You can't 'play' die size, after all... It's like arguing that when Intel went from a 4c/8t Nehalem lineup on X58 to 6c/12t with a smaller die (32nm vs 45nm) for their top-end parts on the platform, they were somehow selling a lesser part...

For anyone obsessed by the 'size' of their equipment as opposed to its actual utility (how very Freudian), there were the respective Titans and, later on, the 980 Ti/1080 Ti.

It's like arguing over what's 'mainstream' and what's 'enthusiast'... It's all very arbitrary, especially at points in the past when Intel's HEDT entry level (5820k) was cheaper than their top-end consumer CPU at the time (the 6700k) at UK retail prices... I remember arguing at the time that people would therefore be better off investing in a 5820k/X99 setup rather than a 6700k/Z170 one... Strangely, a lot of the people who disagreed with me at the time (not so long ago) are now on Ryzen hex-core+ setups! So I guess time has at least partially vindicated me, as I feel no need to upgrade my [email protected] just yet (or my 'mid-range' 1080) until there is a more compelling upgrade on the market.
 
$849 for the 16c/32t model could mean quite a tasty price for the 12c/24t and 14c/28t models... :)

Hopefully the performance scales as well as normal Ryzen and causes Intel to drop prices across the board to compete. Good news for everyone if that does happen (as unlikely as it would be).
 
why would any "normal" person (90% of people on this forum?) need 32 threads?
Why would someone buy a 200+ mph car when the speed limit is 70? :p
You can never have enough power :).
Heck, you can run other resource hungry apps while gaming too.
Many of us could of course do without, but if you have the money and WANT it, why not?
 
why would any "normal" person (90% of people on this forum?) need 32 threads?


I would say my computer usage at home is less than average.
I tend to play a game now and then and sometimes dabble with 3D rendering, CAD/CAM, and photo and video editing.
When I do any of those, I would like the results as fast as possible, as I'm impatient when it comes to computing, so I'm willing to pay for more cores if the price is reasonable.

On my 1700 I can now render decent-quality live previews, whereas years ago I'd have been waiting quite a few minutes for the final render.

I can remember using Cinema 4D on an Amiga and taking all night to render a basic scene at low resolution, heh, and my 1700 is twice as fast as the X5650 I upgraded from. If 16 cores is twice as fast again, then it's certainly tempting, just because I want faster.
 
I love these sort of arguments... It's like a re run of the whole 980/ 1080 etc are 'mid range' Gpu's when they were respectively released as the fastest, at the time, consumer gpu's by most metrics
Well, technically speaking that was correct, as proven when Nvidia finally released the high-end parts (the 980 Ti and 1080 Ti). Releasing only your entry-level and mid-range GPUs and then charging high-end prices for the mid range doesn't make them high end; it just makes them overpriced and your high end delayed (NB: they did the same thing with the GTX 680).

You can't just make up or change terminology to suit, even if you're a manufacturer. Gulftown was not mainstream, and the GTX 980 was a cut-down high-end GPU (AKA mid range).
 
Does it finally run on multicore CPUs?? When I used to play ESO, it was frying 2 cores lol

It always did. The issue is to make sure the config file is set up correctly, and you either use an AMD GPU or pray for a bit of luck if you use an NV card.

Windows 10's Game Mode also helps to spread the usage if you don't want to fiddle with the config files.
 
I love the 64 PCI-E lanes on offer from AMD. If all goes well I will be upgrading to X399 and ideally dropping in a new CPU later if AMD sticks with this socket. I wonder how this will impact the used Ryzen/X99 market after the release. Good times ahead for everyone, I guess. AMD have moved us on from the curse of the quads.


any news on pricing?
 
This is why one vendor (Intel) becoming too dominant is bad for consumers. AMD have blindsided Intel with Threadripper; they know it's MORE than just competitive with Skylake-X, so Intel, to protect their massive margins on their ridiculously priced CPUs, are stripping out and locking down features on the lower-core-count Skylake-X parts to force you to buy the higher-core-count ones to get all of the enthusiast features.

This is calculated corporate BS that they know they can get away with because of their massive mind-share; it's high time Intel's dominance was quashed.

 