I'm completely stuck on a piece of Java work.
It's basically a silly little 'simulation' of a queue/buffer system.
You tell the program you have X servers, requests come in at a rate of Y, each request takes Z 'time units' to process, and you have a buffer queue of size V.
The simulation lasts 1M time units (iterations of the main loop).
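To make the intent clearer, this is roughly what a single run is supposed to do, boiled down to plain counters (my real code models the Requests, Servers and Queue as objects, so the names and structure below are just illustrative, and I'm assuming "rate Y" means one new request every Y time units):

Code:
import java.util.ArrayDeque;
import java.util.Deque;

public class SimSketch {
    public static void main(String[] args) {
        // X, Y, Z, V from the description above; the values here are just placeholders
        int numServers = 3;      // X: number of servers
        int arrivalGap = 2;      // Y: one new request every Y time units
        int serviceTime = 25;    // Z: time units each request needs
        int bufferSize = 100;    // V: maximum queue length

        Deque<Long> queue = new ArrayDeque<>(); // arrival times of waiting requests
        int[] busyFor = new int[numServers];    // time left on each server, 0 = idle
        long total = 0, rejected = 0, completed = 0;

        for (long t = 0; t < 1_000_000; t++) {
            // a request arrives every Y time units; reject it if the buffer is full
            if (t % arrivalGap == 0) {
                total++;
                if (queue.size() >= bufferSize) {
                    rejected++;
                } else {
                    queue.addLast(t);
                }
            }
            // each server ticks down its current request and pulls the next one when idle
            for (int s = 0; s < numServers; s++) {
                if (busyFor[s] > 0 && --busyFor[s] == 0) {
                    completed++;
                }
                if (busyFor[s] == 0 && !queue.isEmpty()) {
                    queue.pollFirst();
                    busyFor[s] = serviceTime;
                }
            }
        }
        System.out.printf("Total: %d  Rejected: %d  Completed: %d  Queue at end: %d%n",
                total, rejected, completed, queue.size());
    }
}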
Example output of a working program is...
Code:
./lab3 10000 3 2 25
Total Requests: 333035
Rejected Requests: 243035
Percentage of requests rejected: 72.975815
Average Queue Size: 9795.746917
Average Response Time: 114658.564625
Requests Completed at the end of the simulation: 79998
Requests Running at the end of the simulation: 2
Queue length at the end of simulation: 10000
My code is here...
http://pastebin.com/FarLtS80
(Requests, Servers, and Queue are modelled as objects stored in an ArrayList)
My problem is that I never get any rejected requests, so I don't think requests ever get added to the servers; either that, or the Queue keeps growing past its limit (hence the override of add() in Queue.java, which tries to cap the size at the value passed in).
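For context, the shape of what I'm trying to do in Queue.java is below (a simplified sketch, not the exact code from the pastebin): refuse to grow past the capacity that was passed in, and signal the rejection by returning false.

Code:
import java.util.ArrayList;

// Simplified sketch of the bounded-queue idea: the list refuses to grow past
// the capacity it was constructed with, and reports that by returning false.
public class BoundedQueue<E> extends ArrayList<E> {
    private final int capacity;

    public BoundedQueue(int capacity) {
        this.capacity = capacity;
    }

    @Override
    public boolean add(E element) {
        if (size() >= capacity) {
            return false;              // buffer full: caller should count this as a rejection
        }
        return super.add(element);     // otherwise behave like a normal ArrayList
    }
}

One thing I'm not sure about is whether my simulation loop actually checks the boolean that add() returns; a plain ArrayList.add() always returns true, so it would be easy to ignore the return value and never count a rejection even when the queue refuses the element.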