Anybody bought any HP EPYC Servers recently?

@Vince Sorry, bit of a late reply. My Veeam VM didn't have FC connectivity, so I just spun up another VM in my lab and added an FC passthrough card to it. From there I can configure the backup repository in Veeam to use the FC link to the StoreOnce, and it has just this second completed a test backup.

To add the FC card, I just edited the VM settings and added a new PCI device:

[screenshot: the VM's Edit Settings dialog with the new PCI device added]


We're running "Hypervisor: VMware ESXi, 6.5.0, 9298722"
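
For anyone who'd rather script that step than click through the UI, below is a rough pyVmomi sketch of the same reconfigure. The vCenter address, credentials, VM name, and PCI address are all placeholders of mine, so treat it as an outline rather than a drop-in:

```python
# Rough sketch: attach a PCI passthrough device (e.g. an FC HBA) to a VM
# with pyVmomi. Assumes the device is already enabled for passthrough on
# the host and the VM is powered off. All names/addresses are placeholders.
import ssl
from pyVim.connect import SmartConnect, Disconnect
from pyVim.task import WaitForTask
from pyVmomi import vim

si = SmartConnect(host='vcenter.example.local',
                  user='administrator@vsphere.local', pwd='secret',
                  sslContext=ssl._create_unverified_context())
content = si.RetrieveContent()

# Find the VM by name (placeholder name).
view = content.viewManager.CreateContainerView(
    content.rootFolder, [vim.VirtualMachine], True)
vm = next(v for v in view.view if v.name == 'veeam-proxy')
view.Destroy()

# Look up the passthrough-capable device on the VM's current host.
pci_id = '0000:03:00.0'  # placeholder PCI address of the FC card
cfg_target = vm.environmentBrowser.QueryConfigTarget(host=vm.runtime.host)
pt = next(p for p in cfg_target.pciPassthrough if p.pciDevice.id == pci_id)

backing = vim.vm.device.VirtualPCIPassthrough.DeviceBackingInfo(
    id=pt.pciDevice.id,
    deviceId=format(pt.pciDevice.deviceId & 0xFFFF, 'x'),
    systemId=pt.systemId,
    vendorId=pt.pciDevice.vendorId,
    deviceName=pt.pciDevice.deviceName)
dev_spec = vim.vm.device.VirtualDeviceSpec(
    operation=vim.vm.device.VirtualDeviceSpec.Operation.add,
    device=vim.vm.device.VirtualPCIPassthrough(backing=backing))

# Passthrough requires the VM's memory to be fully reserved.
spec = vim.vm.ConfigSpec(deviceChange=[dev_spec],
                         memoryReservationLockedToMax=True)
WaitForTask(vm.ReconfigVM_Task(spec=spec))
Disconnect(si)
```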

I shall give it another bash. Not sure why I was having so many issues. I'm not entirely sure I've tried again since I put the EPYC servers in. I'm going to the DC on Tuesday so will give it another look. Thanks for giving it a bash, dude. I'll let you know how I get on :)
 
Which one, out of interest, if you don't mind saying? (Drop me a message if you don't want to put it public.)

Reason for my interest is I work for a vendor with some stuff that has some overlap on features, so we compete with them now and again.

Expense is one thing I've heard from customers; high false positives were another bugbear for some, but that didn't seem to be everyone's experience.

I think we need to talk in a bit more depth. Currently looking at options in this space. Can probably help or give opinions on some of the other questions you have asked also :)
 
@Armageus - Thought you might like to see the magnificence of this: so nearly finished! I'll pop back to the DC at the weekend to move the last few and swap those last 3 over... They said I was mad... :D

[photos of the rack]
Yes, every last one will be a mini PC, and all in a single rack like an absolute animal!

I still think I might be a bit mad, but the maths works out. They will pay for themselves in just 8 months after the buyback on the old 400 G5s, all the while saving on rack space by condensing 3 racks into just 1.
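
For a back-of-envelope sanity check on that payback, here's a tiny Python sketch. Only the ~8-month claim, the 70-unit count (mentioned later in the thread), and the ~3k/month rack saving come from the thread itself; the per-unit cost and buyback values below are made-up placeholders:

```python
# Napkin maths for the payback period. unit_cost and buyback are
# illustrative placeholders, not the real (undisclosed) figures.
units = 70                # mini PCs in the rack (per a later post)
unit_cost = 450           # hypothetical cost per mini
buyback_per_400g5 = 100   # hypothetical buyback per old 400 G5
rack_saving_pm = 3000     # approx. monthly saving on rack space

net_outlay = units * (unit_cost - buyback_per_400g5)
print(f"Payback in ~{net_outlay / rack_saving_pm:.1f} months")  # ~8.2
```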
 
I hope you make some poor underpaid rookie come into this when it's in full production, just for the sheer hilarity of it! I personally think you're absolutely off your trolley for doing something like this, but I hope it works out! I appreciate that it's not just rammed in there.

They are all labelled, so what could possibly go wrong? :D I even provisioned 4 spares just so we can swing people across should we have any issues. I've been running everything out of the DC now since around July, is it? Still, it's been no trouble at all, not really. They are set to turn themselves back on if they get turned off, and they also schedule power-on requests every day for any that stay off. I think we have had about 4 power-down issues and one machine that refused to restart one day. Other than that it's silky smooth, and we have a big enough pipe to cope.
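
In case anyone wants to replicate that daily power-on sweep: one way to script it (my assumption, not necessarily how it's done here) is a scheduled Wake-on-LAN broadcast, assuming WoL is enabled on the minis and you keep a list of their MACs. Minimal Python sketch with placeholder addresses:

```python
# Minimal Wake-on-LAN sweep: broadcast a magic packet to each mini.
# Run daily from cron/Task Scheduler on the same broadcast domain.
import socket

MINIS = ['aa:bb:cc:dd:ee:01', 'aa:bb:cc:dd:ee:02']  # placeholder MACs

def wake(mac: str) -> None:
    # A magic packet is 6x 0xFF followed by the MAC repeated 16 times.
    payload = bytes.fromhex('ff' * 6 + mac.replace(':', '') * 16)
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(payload, ('255.255.255.255', 9))  # UDP port 9 ("discard")

for mac in MINIS:
    wake(mac)
```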

During these crazy times we do crazy things to adapt and keep the cash flow where it needs to be. For all the craziness, trust me, this is actually also smart: I'll save almost 3k a month on rack space alone. If I told you how much they cost me a unit, I think you would agree :)
 
Oh, I can definitely see some of the advantages while it's running smoothly, and the cost savings. I'm a bit of a traditionalist when it comes to my DC stuff, I guess, haha.

Honestly, from a DC side, yes I agree, it's errr not your norm. :D That's what makes it kinda cool and very random! If it wasn't properly left-field stuff it wouldn't have deserved a post :P
 
@Vince What you using all them machines for? So each user can VPN into a remote desktop? Please tell me this isn't the case? :D

They are in fact users' desktops. The reason they are in here is we had to give up our office for the moment. The problem... well, when we go back to an office we need machines, and we need to be able to move pretty quickly. But what we also want is some flexibility. With a full VDI solution we could have done this, but at what cost? We needed to keep the costs down and the running cheap while we ride this out and work out what we want to do. At current estimations, when would you send your staff back to a central London office? This is a short-term solution to a big problem. Also, it's not all VPN; we have RDS, so they have options: RDS, VPN, or run everything remote, because we have our DMS and email exposed to the web. It's all about offering working flexibility.

The remote desktops allow people to run multiple screens on their side without putting any strain on the core. I dunno, there are many ways to do this, but how do you quickly transition, in the space of 3 weeks, from having an office to not having one? This is the question.
 
A solution is a solution. Plenty of desktops to go around once they return. What's been used for the VPN? How many concurrent connections?

I have 4x Netgate XG 7100 1U firewalls and a 1Gb pipe. I also have routing out for Windows RDS, OWA, Exchange anywhere, DMS anywhere, etc. You can hit almost all services via an RDS session or direct, so you can work in multiple ways without the need for a VPN. 100 concurrent, ish.
 
Great solution. No VPN? Even better! I've not used anything like that; I've always done remote over VPN. Never used Windows RDS, may have to look into it. Did you set this up, or did someone else set it up for you? :)

Are you using single licences per workstation or a server licence? Does it cost much per user?

I have 3x Datacenter licences at the core, with CALs for Core CAL, Exchange, SQL, etc. Licensing is expensive, 50k p/a ish. And yes, I set it all up. We do have a VPN, as the pfSense firewalls can and do run OpenVPN, but it isn't essential with our infrastructure.
 
Ouch! Nice bill there.

It's a bit pricey - and that's without Windows 10 licensing, as they are all OEM! It also is just what we pay MS, so it doesn't include things like vSphere, Zscaler, continuity, DMS, etc. etc... Total licensing p/a is somewhere in the region of 100k. Just got home from the DC actually; left at a nice early 5am and moved the rest over. As soon as the image host website is up I'll show you what she looks like now :D
 
You got an image host website running from that DC? :confused: Why not just use Imgur?

Hell no, I normally use postimg.io, but that appears to be broken so I found a different one :D 70 minis all rocking and rolling. Today I cable-tied every PSU to shelves around the back, moved the remaining machines over, and emptied another rack!


I do have a problem with one of my EPYC servers that I discovered: one NIC seems to be running at 10Mb rather than 1000, so I removed it from the virtual switch and vMotioned a few machines around to move important services off of that host, but I'll have a closer look at it next week when I go there.
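
For catching that sort of thing before it bites, a quick pyVmomi sweep can flag any host NIC that negotiated below gigabit. Rough sketch, with placeholder vCenter details as before:

```python
# Sweep all hosts and report physical NICs linked below 1000 Mb/s.
# Connection details are placeholders.
import ssl
from pyVim.connect import SmartConnect, Disconnect
from pyVmomi import vim

si = SmartConnect(host='vcenter.example.local',
                  user='administrator@vsphere.local', pwd='secret',
                  sslContext=ssl._create_unverified_context())
content = si.RetrieveContent()
view = content.viewManager.CreateContainerView(
    content.rootFolder, [vim.HostSystem], True)
for host in view.view:
    for pnic in host.config.network.pnic:
        speed = pnic.linkSpeed.speedMb if pnic.linkSpeed else 0  # 0 = down
        if speed < 1000:
            print(f'{host.name} {pnic.device}: {speed} Mb/s')
view.Destroy()
Disconnect(si)
```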

Perhaps next week I'll move the top shelf down a bit and put another shelf in there, so that the 4 machines sitting on top can have their own shelf. Still plenty of work to do, and I'm currently a switch down as well, so I'll need to wire that back in at some point. Really I should put that into the core, as my current spare switch is actually a lot better than my current core switch. Problem is my core switch has a load of config, so I'll have to spend a bit of time and work out what is next.
 

Something like this is never going to be that cheap, but it's still less than it would have cost even just to buy the servers for a VDI solution, let alone licensing etc. Plus, then what do I do when we get an office back? A balancing act!
 
What's the intention, as soon as an office opens up, you whip them out of the rack and hand them to the respective users to set up on a desk?

Do the staff not have laptops? You're probably about on par cost-wise with buying these versus buying a laptop for each member of staff - although it would mean less play time for you (aka less faff cabling/racking them up :p).

Every staff member also has a laptop, although they have to remote in as certain legacy stuff won't work over WAN/VPN - namely our case management tools. Everybody has a new Ryzen 4500U ProBook x360 - nice machines!

The IT team are all running HP Envys. Weirdly, we were on Spectres, but we all went with the Ryzen-based 4700U machines this year, and the Spectre line is just not compelling with Intel inside.

I try to keep people off local machines where possible as we work with a lot of sensitive data, so logging into a remote desktop and locking down the ability to pull stuff local for certain user groups allows us to keep a level of control.
 