We have a stand-alone Hyper-V host (Server 2008 R2 with the Hyper-V role installed). How would one limit the disk I/O and network performance of a particular VM for simulation purposes?
I've secured a fairly powerful, well-specced Dell R710, but I want to experiment with how an application behaves on underpowered kit. I know that in VMware you can limit IOPS, and in Workstation you can introduce things like packet loss, but I'm not sure what's possible in Hyper-V.
Any thoughts or recommendations?