Anybody bought any HP EPYC Servers recently?

Man of Honour
Joined
30 Oct 2003
Posts
13,255
Location
Essex
Wondering about current lead times, as I'm trying to plan out a bit of a project and have just realised I need a bunch more servers. Or a bunch more chips to go in the current servers. Either way, I don't have enough servers and more will be required, but I'll need them quickly as it looks like this project will need to move.

Also, has anybody had a play with the 7H12 yet? I'm seriously looking at it as an option for VDI. It's either some sort of VDI setup or 100 or so HP 705 G5 minis (Ryzen 3400GE) racked up in a colo, which, although an option, doesn't sound that smart. I never thought I would see the day where I'm looking to colo an entire estate, but hey, looks like the day is here!
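
For a rough feel of the numbers, here's the back-of-the-envelope sizing I'm playing with (Python). The 7H12's 64 cores / 128 threads is the published spec; the vCPUs per user and the overcommit ratio are pure guesses you'd tune against the actual workload:

```python
# Back-of-the-envelope VDI sizing: one big EPYC 7H12 host vs ~100 mini PCs.
# CORES_PER_7H12 is the published spec; USERS, VCPUS_PER_DESKTOP and
# OVERCOMMIT are assumptions, not measured numbers.
import math

USERS = 100            # roughly matches the 100-or-so 705 G5 minis
VCPUS_PER_DESKTOP = 2  # assumption: light office workload
OVERCOMMIT = 4         # assumption: 4 vCPUs per physical core, a common VDI ratio
CORES_PER_7H12 = 64    # EPYC 7H12: 64 cores / 128 threads

vcpus_needed = USERS * VCPUS_PER_DESKTOP
vcpus_per_socket = CORES_PER_7H12 * OVERCOMMIT
sockets_needed = math.ceil(vcpus_needed / vcpus_per_socket)

print(f"{vcpus_needed} vCPUs needed for {USERS} users")
print(f"~{vcpus_per_socket} vCPUs per 7H12 socket at {OVERCOMMIT}:1 overcommit")
print(f"=> {sockets_needed} socket(s), versus {USERS} discrete minis in a colo")
```

On those (very rough) assumptions a single 7H12 socket covers 100 users with headroom, which is why the VDI route is tempting against a rack full of minis.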
 
Soldato
Joined
15 Sep 2009
Posts
2,895
Location
Manchester
I would ask 3-4 VARs what their lead time is, as it's going to vary I would imagine, and as Crimson says it depends whether it's pre-built or CTO. The likes of CDW/Ingram Micro should be able to give you a lead time on them immediately, or at least their best guesstimates.

Also, speaking on the second 'setup' of 100 or so HP 705s: please take that option off the table immediately. The people who assist/come after you will thank you, or they will chase you to the depths of hell with fiery pitchforks should that configuration ever see the light of day.

We're still slightly old school on a lot of our VDI deployments: Cisco UCS chassis with blades and NetApp storage for non-HCI, and UCS servers with plenty of fast storage plus vSAN for HCI. I don't have anything to add on the EPYCs I'm afraid, as I haven't done anything with AMD for a number of years.
 
Man of Honour
OP
Joined
30 Oct 2003
Posts
13,255
Location
Essex
Hmm, cheers lads, some bits to consider. I am actually considering just buying CPUs: I have 3 spare EPYC sockets and 1.5 spare chassis, so that's an option. The problem I have is a two-week timeline to migrate everything, so it's tight. Right now I am going to lift and shift 100 ProDesk 400 G5/G6 SFF machines that were people's desktops and put them in racks, as I have managed to negotiate space with my DC and have no time to work out anything better. That sorts me in the short term, but I need a more permanent solution moving forward; although my DC have hooked me up, I have made a commitment to do something about the racks and racks full of desktops, and fairly quickly.

I have been looking closely at VMware Horizon, so I am thinking that might be the direction to go in.
 
Man of Honour
OP
Joined
30 Oct 2003
Posts
13,255
Location
Essex
HP had $1.5 billion worth of back orders just a few weeks back, so I don't see anything moving quickly short term. Adding some CPUs may be a quicker solution for you.

HPE will bend over backwards for us (they are a client and we are their client), so I would expect them to be able to turn kit around. I think for now I am just getting it all in and deciding after. The current thinking is to take a 7452 out of EPYC 3 and shove it in EPYC 1, then buy a couple of 64-core chips and VDI the world on EPYC 3! Or something like that.
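
Back of a fag packet, that shuffle works out something like this. The 7452 being a 32-core Rome part is the published spec; how each chassis is currently populated is my assumption from the plan above:

```python
# Core-count sketch of the proposed CPU shuffle. CORES_7452 is the published
# spec; the starting population of each chassis is an assumption.
CORES_7452 = 32   # EPYC 7452: 32-core Rome
CORES_64C = 64    # a 64-core Rome part, e.g. 7702-class

# Assumed starting point: EPYC 3 runs a pair of 7452s, EPYC 1 has one 7452
# and a spare socket.
epyc1 = [CORES_7452]
epyc3 = [CORES_7452, CORES_7452]

epyc1.append(epyc3.pop())       # move one 7452 from EPYC 3 into EPYC 1
epyc3 = [CORES_64C, CORES_64C]  # refill EPYC 3 with two new 64-core chips

print(f"EPYC 1: {sum(epyc1)} cores, EPYC 3: {sum(epyc3)} cores for VDI")
```

So EPYC 3 roughly doubles its core count, which is where the VDI capacity would come from.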
 
Man of Honour
OP
Joined
30 Oct 2003
Posts
13,255
Location
Essex
Good position to be in then. Happy days :)

Yeah, generally HPE treat us very well; well enough that they gave me 3 x pre-production EPYC servers around a month before you could buy them. So I had lots of time to play, and yes, I'm one of those guys who needed to upgrade a Naples BIOS to run Rome. To be honest the plan has now somewhat accelerated and I am making fairly large infrastructure changes every day at the moment. I have been working like a madman provisioning new phone systems, working out how we serve our userbase better than RDS to their machines in the office, etc. I imagine the landscape will be pretty different for me in 3 to 6 months. I will keep you all updated with where I end up anyway.
 
Don
Joined
19 May 2012
Posts
17,179
Location
Spalding, Lincolnshire
Also, speaking on the second 'setup' of 100 or so HP 705s: please take that option off the table immediately. The people who assist/come after you will thank you, or they will chase you to the depths of hell with fiery pitchforks should that configuration ever see the light of day.

But they do rackmount shelves for them and everything! :)

(I may or may not have 5x 800 minis in production for remote access) :D
 
Man of Honour
OP
Joined
30 Oct 2003
Posts
13,255
Location
Essex
But they do rackmount shelves for them and everything! :)

(I may or may not have 5x 800 minis in production for remote access) :D

They do do shelves for them, which is why they are on the cards in the first place :) FWIW we are going to be putting 100 minis in a rack while we don't have office space, and they will eventually go back to an office. We are also looking at the VDI solutions, and it is looking like we will implement Horizon.
 
Man of Honour
OP
Joined
30 Oct 2003
Posts
13,255
Location
Essex
I know, although I fabricated my own out of the desk stands they come with, an old rack shelf and some cable ties :D

Proper ghetto. I like it! The DC won't put them in without some sort of shelves though; at first I was like, can't you just stack 100 ProDesk 400 SFF G5/G6s in a couple of racks? They wouldn't, because of reasons, so pulling the disks and throwing them into the minis is the next best option while we build something more future-proof and robust. To be honest we have had a very good WFH experience so far running RDS to the desktops in the office, so for now replicating that as closely as I can, while thinking forward to what might be the new working standards, seems sensible. If I went full VDI and we have an office in a few months, then I would be in some hot water with no desktops or machines to deploy back to the office. I need to remain flexible while offering a robust solution on a short time frame. It's happening on the 27th whether I am ready or not.
 
Man of Honour
OP
Joined
30 Oct 2003
Posts
13,255
Location
Essex
It's not even that ghetto. I am going to need a bit more density than that though; 100 across one or two racks is hopefully doable. :D After that I can properly think about what comes next. I have been so close to pulling the trigger on new EPYC CPUs a couple of times this week, but I know the right thing to do is move, consolidate and then progress. One thing at a time!
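
For what it's worth, the shelf maths looks roughly doable. A quick sanity-check sketch; the mini chassis size is the nominal ~1-litre HP footprint, and the shelf height, clearance and cabling are all hand-waved assumptions:

```python
# Rough shelf-density check for ~100 HP mini PCs stood on their sides.
# The chassis dimensions are nominal (~177 x 175 x 34 mm) and everything
# else is an assumption, so treat the output as a sanity check, not a plan.
import math

UNITS = 100
UNIT_W_MM = 34          # mini stood on its side
UNIT_H_MM = 177
RACK_USABLE_W_MM = 450  # roughly the usable width inside a 19" rack
RACK_U = 42
MM_PER_U = 44.45
SHELF_U = 1             # assumption: a 1U shelf under each row of minis

per_shelf = RACK_USABLE_W_MM // UNIT_W_MM          # minis side by side
bay_u = math.ceil(UNIT_H_MM / MM_PER_U) + SHELF_U  # shelf + standing mini
bays_per_rack = RACK_U // bay_u
per_rack = bays_per_rack * per_shelf

print(f"{per_shelf} minis per shelf, {bays_per_rack} shelf bays per rack")
print(f"=> up to {per_rack} per rack; {math.ceil(UNITS / per_rack)} rack(s) for {UNITS}")
```

On paper that's 100 in a single rack; in reality the power bricks and cabling will eat a fair chunk of that, hence "one or two racks".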
 
Soldato
Joined
15 Sep 2009
Posts
2,895
Location
Manchester
But they do rackmount shelves for them and everything! :)

(I may or may not have 5x 800 minis in production for remote access) :D

I shuddered :D Taken as it is, it's quite interesting. I just know that I would hate to manage it as a 'VDI'/RDS solution, because I am so ingrained in having something like Horizon/Citrix to manage my estate, rather than a fleet of potentially hundreds of mini PCs with no single pane of glass. But to each their own; we don't all have the budget/capabilities/buy-in for other solutions. 5 of them isn't too bad; I wouldn't take up alcoholism as a career over 5.
 
Man of Honour
OP
Joined
30 Oct 2003
Posts
13,255
Location
Essex
I shuddered :D Taken as it is, it's quite interesting. I just know that I would hate to manage it as a 'VDI'/RDS solution, because I am so ingrained in having something like Horizon/Citrix to manage my estate, rather than a fleet of potentially hundreds of mini PCs with no single pane of glass. But to each their own; we don't all have the budget/capabilities/buy-in for other solutions. 5 of them isn't too bad; I wouldn't take up alcoholism as a career over 5.

I guess that all boils down to how you were set up and where you are headed. The ideal solution is of course VDI for everything, but with all the unknowns in what I am doing at the moment I need to keep flexibility, and need an option to put it all back fairly quickly, which means I need to keep the workstations in the estate, at least in the short term. As things progress that may change relatively quickly.

Effectively I am going with the multi-pronged approach: try loads of stuff and find something middling that works :)
 
Man of Honour
OP
Joined
30 Oct 2003
Posts
13,255
Location
Essex
Well, it's been intense, interesting and slightly mental these last few weeks, but it is done. Sadly I had to ride solo, which meant hiring a van and spending 48 hours solid at the DC with about 2 hours' sleep, but it got done. In the process, because of BT's epic failures, I am now running HA pairs of Netgate XG 7100s rather than the 200Es I had before, and have also had to remove the inept BT engineering team from all infrastructure; the firewalls were a managed BT service, but because they were so shocking I had to take some drastic last-minute action. I also reworked the network design and a load of other stuff, which I can go into in full if anybody wants to hear. Two weeks is all I had in planning (I mean, I knew it was a possibility earlier, but I never thought it would actually happen) and it was delivered in a single weekend. I'm pretty sure it cost me some hair, and some of you may be horrified by what you are about to see:

This row, bar a couple of racks, is mine; the APC rack in the middle I literally wheeled into the DC :D I also have some rack space elsewhere in this DC a couple of rows back. Anyway, I was sending updates to the continuity team at work, so some of the pics have me in, and I'll include a pic I took at the end so you can see the toll the weekend took on a man:

[pics]

Next mission is what to do about the racks and racks of desktops. I also need to clean up the mess of cables at some point.
 
Soldato
Joined
15 Sep 2009
Posts
2,895
Location
Manchester
Nice work Vince, you'll have to let us know how they go in terms of performance. As for the racks and racks of desktops, that's a hell of a homelab for you now, haha.
 
Soldato
Joined
27 Feb 2003
Posts
7,173
Location
Shropshire
Well, it's been intense, interesting and slightly mental these last few weeks, but it is done. Sadly I had to ride solo, which meant hiring a van and spending 48 hours solid at the DC with about 2 hours' sleep, but it got done. In the process, because of BT's epic failures, I am now running HA pairs of Netgate XG 7100s rather than the 200Es I had before, and have also had to remove the inept BT engineering team from all infrastructure; the firewalls were a managed BT service, but because they were so shocking I had to take some drastic last-minute action. I also reworked the network design and a load of other stuff, which I can go into in full if anybody wants to hear. Two weeks is all I had in planning (I mean, I knew it was a possibility earlier, but I never thought it would actually happen) and it was delivered in a single weekend. I'm pretty sure it cost me some hair, and some of you may be horrified by what you are about to see:

Doesn't surprise me with BT. I'm involved with a pilot installation for a project which, if moved into production, is worth a few million quid. The network is outsourced to BT and they are clueless at times.
 