Home Lab Threadripper Build Thread.

I'm going to be trying a similar setup with an Asus Zenith Extreme. I believe NVMe RAID has been available on most Asus boards since they applied an update from AMD released on Oct 2nd. A few reports (including the one mentioned here) seem to suggest Threadripper has better support in ESXi 6.5 U1, which was also recently released.
 
It seems like a lot of folks here are using this thread to find out about the issues of ESXi with Threadripper, so I shall put down my own notes with my ASUS PRIME X399-A:

1) USB device passthrough is hit-and-miss on the USB 2.0 ports (which are only available as internal headers): some USB devices work fine, while others get detected by the VMs but do not function correctly (unknown whether a future ESXi/BIOS update may fix things).
As for the USB 3.0 ports, I have had no success at all (USB devices get detected by the VMs but do not function correctly).
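For anyone debugging similar USB passthrough oddities, ESXi (6.5 and later) exposes the per-device passthrough state from the host shell. A hedged sketch; the device address below is a placeholder, substitute the one reported by the list command:

```shell
# List USB devices on the host and whether each is enabled for passthrough
esxcli hardware usb passthrough device list

# Toggle passthrough for one device, addressed as bus:dev:vendorId:productId
# (the address below is a placeholder taken from the list output above)
esxcli hardware usb passthrough device enable -d 1:4:0x46d:0xc52b
```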

2) For those who find the lack of network ports to be an issue: I have an HP NC364T network card plugged into the PCIe x4 slot, and it gets detected and used by both the BIOS and ESXi without any issues thus far (it does not require any driver injection on ESXi).

3) I have an AMD R7 260X as the passthrough GPU for one of my VMs, and ESXi refuses to start the VM (giving a passthrough error) if the GPU is plugged into the first (nearest to the CPU) PCIe x16 slot. It works fine in the second PCIe x16 slot, however.
I have also tested an ATI Radeon HD 4550 in both PCIe x16 slots and had no success in either.
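When a card behaves differently between slots like this, it can help to compare what ESXi reports for the device in each position. A sketch from the host shell; the grep pattern is just an example, adjust it to your card's name:

```shell
# Dump all PCI devices with vendor/device IDs, addresses and capabilities
esxcli hardware pci list | less

# Narrow it down to the GPU by name (pattern is an example for an R7 260X)
esxcli hardware pci list | grep -B 4 -A 20 -i "260X"
```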
 

These are interesting results. I haven't yet had any issues with USB or GPU passthrough. I tried both an RX 480 and a Vega and managed to boot VMs with the GPU passed through in both cases. As for USB devices, I was mainly passing through USB storage devices, mostly on the case's USB ports via headers. I also passed through my Thrustmaster T500 RS as a random device, but I don't know whether that was plugged into USB 2.0 or USB 3.0.

Keep us updated though, as it's interesting and helpful. Which USB devices are you having issues with? I wonder if I have anything similar that I could test.

Gents, I was going to rebuild my rig in the Taichi today; I went out and bought TIM and all that good stuff, but I'm feeling very fragile after getting smashed last night with the wife (wedding anniversary) and really don't feel up to ripping it all apart today. Perhaps I'll give it a go in the week.
 

Congrats on the anniversary!
 
I can confirm that the ASRock Fatal1ty X399 Professional Gaming with BIOS 1.80 works with VMware vSphere Hypervisor ESXi 6.0.0.U3a-5572656.x86_64. I have not done any real testing as yet, though. Both Intel NICs are present, and ESXi detected my 1 TB NVMe SSD as well. I took a gamble on this motherboard and I am glad with how it's working out so far.
 

Don't suppose you have any SATA drives in there? I managed to get everything but SATA working; my NVMe etc. worked without issue. The real problem was getting SATA exposed to ESXi.
 
No Problem and I wish you luck with your build. :)

It looks like AMD had a couple of offers running to shift some Threadripper stock, so it was really nicely priced last week. I paid full release price in excess of £950 and haven't at all been disappointed with what it has to offer in terms of performance, even at that price. It rips through everything I throw at it with relative ease and is working an absolute charm in server environments. Storage-wise it can be whatever you want it to be, really; it has enough PCIe lanes that you can make an NVMe monster if you need the I/O. I would love to build a massive RAID 10 array of NVMe drives and see if it could be exposed as storage to ESXi, because that kind of I/O performance was pretty much out of the question in anything other than some very expensive storage arrays not that long ago. I am thinking along the lines of Pure and 3PAR.

To have that in a single box in a mini form is great and opens up performance in the lab that was out of reach below 10k a year back. Granted, if you are learning how to manage, maintain or provision storage out of something like a 3PAR or EVA then it's about as useful as a chocolate teapot, but if you need I/O performance in the lab then it opens up a sensible option, even if it takes some fettling to get it working.

CAD $300 drop on the 1950X on Monday, so purchase done with the Asus Prime. Just saw the posts on BIOS 1.80; too bad. At least the Asus one was a bit cheaper. I'm just bothered by the eATX form factor.

I will not test any RAID, because this machine will possibly be used for users, with multiple virtual machines as desktops, accessed from clients still running on old hardware. This was a way for my boss to skip the hardware upgrade of each computer here: at $2,500, it's much cheaper than replacing 10 computers. However, the current ESXi build we have runs on an Intel chip at a low frequency and our applications are single-core, so we waste time waiting for the applications to start or to process each request. The Threadripper is a perfect replacement: high frequency at a cheap price.
 

I wouldn't worry about the eATX; it is literally 2 cm wider than the ASRock, not really worth the eATX tag to be honest.
 

I did some further testing, and it appears I can't pass through my video cards or SATA controller. I was able to select the devices for passthrough; after rebooting the host, the devices were available and ready to be assigned to a VM. But once assigned to a VM, the VM would not boot. If I removed the video card I was passing through, the VM booted up fine. I also loaded up the latest ESXi 6.5.0.U1-5969303.x86_64, with the same issue.

On a more interesting note: my old setup is a Xeon E5-2696 v3 with 128 GB of RDIMM. I was planning on migrating over to the Threadripper 1950X due to its higher base clock versus the 2696 v3. I ran some synthetic benchmarks with one of my VMs, to which I had assigned 4 vCores, then copied and migrated that same VM over to the Threadripper host and used Cinebench for the benchmarks. I have to say, I was surprised by the results: in all the tests, the Xeon E5-2696 kept coming out on top. The Threadripper scored 456 and the Xeon E5-2696 v3 scored 495.

Given the passthrough issues with the Threadripper build, I have decided to stay with my current setup for now.
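For what it's worth, a few .vmx tweaks are commonly suggested for consumer GPU passthrough on ESXi; these are not guaranteed fixes, and the datastore path and VM name below are placeholders:

```shell
# Append commonly-suggested passthrough workarounds to the VM's .vmx
# (path and VM name are placeholders; power the VM off first)
cat >> /vmfs/volumes/datastore1/MyVM/MyVM.vmx << 'EOF'
hypervisor.cpuid.v0 = "FALSE"
pciPassthru.use64bitMMIO = "TRUE"
pciPassthru0.msiEnabled = "FALSE"
EOF
```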
 

I can help you get the GPU passthrough working, as there are some modifications you will need to make to some ESXi files to achieve this without faffing around.
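The file edit usually meant here is /etc/vmware/passthru.map, which controls how ESXi resets a passthrough device. A sketch; the vendor/device IDs below are placeholders, look up your card's real IDs with `esxcli hardware pci list`:

```shell
# Each passthru.map line is: vendor-id  device-id  reset-method  fptShareable.
# Append an entry forcing a different reset method for the GPU
# (the IDs below are placeholders for your card's vendor/device IDs)
echo "1002  6658  d3d0  default" >> /etc/vmware/passthru.map

# Reboot the host afterwards for the change to take effect
```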

Sounds like the ASRock might still have some AHCI issues, but I can't confirm as I'm yet to put my board back in. Probably a job for the weekend now.

Performance-wise I am surprised. Out of interest, what RAM do you have and how fast are you running it on the 1950X? Also, if you could throw me the files, I could try to confirm your results and compare against the Asus board before I rebuild.
 
I'm also hitting a wall with the X399 Taichi and ESXi. I can confirm that even with the 1.80 BIOS, ESXi still won't boot on it without disabling the vmw_ahci module. Once it boots up with the old AHCI driver, we still don't see the drives. I don't think ASRock did anything to fix this, and it is a real shame; this board is a perfect fit for a home lab. I don't really care about having SATA drives exposed to ESXi, but not being able to pass the SATA controller to a VM either... I would like at least one of those to work. At the moment this is useless to me, since this build was always meant to run ESXi.
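For reference, disabling the native AHCI module so ESXi falls back to the legacy driver is done from the host shell with the standard esxcli module command (not board-specific):

```shell
# Disable the native AHCI driver; ESXi falls back to the legacy ahci driver
esxcli system module set --enabled=false --module=vmw_ahci

# Confirm the module state, then reboot the host
esxcli system module list | grep ahci
reboot
```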

Is there a support case open with ASRock Support? If only they could do what everyone else is doing... FFS.
 
Can't you just add another SATA RAID card?
If I have to do that, I'll just buy the ASUS...
Thing is, all the other boards work fine, but ASRock managed to screw this one up. We need to force them to fix it, not just add an HBA.

I have pulled the vmkernel log and it shows vmw_ahci probing ports, and it actually finds the SATA HDDs (3 in my case).

After detection I get this:

Code:
2017-11-27T19:15:43.227Z cpu1:65971)PCI: 88: Device 0x384e4302e1e5ddc2 is not an PCI vmkDevice
2017-11-27T19:15:43.227Z cpu1:65971)Device: 2482: Module 0 did not claim device 0x384e4302e1e5ddc2.
2017-11-27T19:15:43.228Z cpu1:65971)PCI: 88: Device 0x124f4302e1e5de6f is not an PCI vmkDevice
2017-11-27T19:15:43.228Z cpu1:65971)Device: 2482: Module 0 did not claim device 0x124f4302e1e5de6f.
2017-11-27T19:15:43.228Z cpu1:65971)PCI: 88: Device 0x3234302e1e5e2fe is not an PCI vmkDevice
2017-11-27T19:15:43.228Z cpu1:65971)Device: 2482: Module 0 did not claim device 0x3234302e1e5e2fe.
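For anyone wanting to pull the same evidence from their own host, the vmkernel log lives at a fixed path on ESXi:

```shell
# Pull all AHCI-related lines from the current vmkernel log
grep -i ahci /var/log/vmkernel.log

# Or watch the probe live across a driver reload / reboot
tail -f /var/log/vmkernel.log | grep -i -E "ahci|vmkDevice"
```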
 

Looks to me like there is still something going on with the Taichi; given this evidence, I am not even sure I will bother putting it back in until we know it is fixed. I have spent hours messing around with it and have conceded that I still have literally no idea why it doesn't work. I still have the Prime in at the moment, but I keep looking at other boards, as I want to settle on one so that I can watercool it all. I keep deliberating over the Designare but still haven't pulled the trigger, as the lack of U.2 and its untested ESXi credentials are kind of putting me off; I could just end up with yet another X399 board that doesn't work with ESXi.

Truth be told, I keep thinking I might just bite the bullet and buy the ROG after Christmas. At least then I won't be tempted to buy another board, as it has probably the best mix of features; even if ASUS customer support sucks, it's still the only fully featured board (apart from the single LAN port) that we know just works. In my opinion the lack of two SATA ports is easily made up for by the DIMM.2 and the three M.2 slots. If I sell the Taichi and the Prime I might get some of the money back, and then I could grab myself a ROG.

If only somebody could confirm that the Designare works, I would buy one straight away, but given Christmas is just around the corner, I think the missus will kill me if I buy yet another board.

For what it is worth, I posted a link to this thread and requested help in the ASRock motherboard thread, which the vendors apparently visit, but as of yet nothing.
 
I might be making some progress with passthrough. The issue is that I don't have a spare M.2 NVMe drive and don't want to wipe my 960 EVO for testing purposes, so tomorrow I will format a USB drive with VMFS, make it a datastore, and see if I can pass the SATA controller to an OpenMediaVault VM. This was always my end goal, since I would prefer that the NAS OS has direct access to the controller and HDDs for NFS, CIFS and iSCSI. The M.2 NVMe will be used as a high-performance datastore once I'm happy I can get the bare minimum working.
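If it helps anyone trying the same, the USB-drive-as-datastore trick needs the USB arbitrator stopped first so the host (rather than VMs) can claim the drive. A sketch; the device path and partition end sector below are placeholders for your particular drive:

```shell
# Stop the USB arbitrator so the host can claim the USB drive
/etc/init.d/usbarbitrator stop
chkconfig usbarbitrator off   # keep it off across reboots

# Find the USB disk's device name (the path below is a placeholder)
esxcli storage core device list | grep -i usb
DISK=/vmfs/devices/disks/mpx.vmhba32:C0:T0:L0

# GPT label, one VMFS partition (end sector is drive-specific), then format
partedUtil mklabel "$DISK" gpt
partedUtil setptbl "$DISK" gpt "1 2048 62500000 AA31E02A400F11DB9590000C2911D1B8 0"
vmkfstools -C vmfs6 -S usb-datastore "${DISK}:1"
```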

I will also be opening a support case with ASRock. I might also see if I can get anything from VMware (I do have access to support, but I'm not sure they will entertain a ticket for unsupported hardware...).
 