I need an expert to work on the following project:
In a data centre, requests for various services arrive randomly at a set of virtual servers. Within the data centre there are computational agents that are able to service various specific requests. The service agents connect to the virtual servers, randomly, to complete any requests in the queue, reducing the length of the queue at the server by [login to view URL] should fully specify a number of service agents in the scenario and design a solution for the data centre, which you then go on to implement in a NetLogo simulation. Your simulation should allow many agents to service a request while minimizing the idleness of the agents. Your simulation should show the length of the queues at a number of virtual servers and the number of requests each service agent has completed.
Additionally, your simulation should show plots of the average and maximum queue length. You should then report on an experiment to determine a roughly optimal number of service agents in your scenario.
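To clarify the dynamics I expect: here is a minimal discrete-time sketch in Python (the actual deliverable must be in NetLogo). All names and parameters here (`simulate`, `arrival_prob`, the agent counts in the sweep) are illustrative assumptions, not part of the specification; it only shows random arrivals, agents connecting to random servers, and the statistics to plot.

```python
import random

def simulate(n_servers, n_agents, n_ticks, arrival_prob=0.5, seed=0):
    """Toy sketch of the scenario: requests arrive randomly at server
    queues; each tick, every agent connects to a random server and
    services one request, reducing that queue's length by one."""
    rng = random.Random(seed)
    queues = [0] * n_servers      # queue length at each virtual server
    completed = [0] * n_agents    # requests completed per service agent
    avg_hist, max_hist = [], []   # data for the average/maximum plots
    for _ in range(n_ticks):
        # random arrivals: each server may receive a new request
        for s in range(n_servers):
            if rng.random() < arrival_prob:
                queues[s] += 1
        # each agent connects to a random server and services a request
        for a in range(n_agents):
            s = rng.randrange(n_servers)
            if queues[s] > 0:
                queues[s] -= 1
                completed[a] += 1
        avg_hist.append(sum(queues) / n_servers)
        max_hist.append(max(queues))
    return avg_hist, max_hist, completed

# rough experiment: sweep the agent count and compare final queue lengths
for n in (2, 4, 8, 16):
    avg, mx, done = simulate(n_servers=5, n_agents=n, n_ticks=1000)
    print(n, round(avg[-1], 2), mx[-1])
```

The sweep at the bottom is the shape of the experiment I want reported: vary the number of service agents and find roughly the smallest count that keeps the queues short without leaving agents idle.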
If you have reached this point, please ask me for the rest of the details of the project. If you do not, I will not respond to you. I have had enough time wasters. Thank you.