In this paper, we investigate the problem of assigning K identical servers to a set of N parallel queues in a time-slotted queueing system. The connectivity of each queue to each server changes randomly over time; each server can serve at most one queue, and each queue can be served by at most one server per time slot. Such queueing systems have been widely applied to model scheduling (or resource allocation) problems in wireless networks. It has been previously proven that Maximum Weighted Matching (MWM) is a throughput-optimal server assignment policy for such queueing systems [1], [2]. In this paper, we prove that for a symmetric system with i.i.d. Bernoulli packet arrivals and connectivities, MWM minimizes, in the stochastic ordering sense, a broad range of cost functions of the queue lengths, including the total queue occupancy (or, equivalently, the average queueing delay).
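To make the MWM policy concrete, the following is a minimal brute-force sketch (not the authors' implementation) of one time slot of MWM server assignment: each server is matched to at most one connected queue so that the total length of the served queues is maximized. The function name, the dictionary-based assignment, and the assumption K ≤ N are illustrative choices, not from the paper.

```python
from itertools import permutations

def mwm_assignment(queue_lengths, connectivity):
    """One slot of Maximum Weighted Matching server assignment (sketch).

    queue_lengths[n]    -- current length of queue n (the matching weight).
    connectivity[n][k]  -- 1 iff queue n is connected to server k this slot.
    Assumes K <= N; brute-force enumeration, fine only for small N and K.
    Returns (assignment, weight): a dict {server: queue} and the total
    length of the queues served under that assignment.
    """
    N = len(queue_lengths)
    K = len(connectivity[0])
    best_weight, best_assignment = -1, None
    # Enumerate every ordered choice of K distinct queues; position k in
    # the tuple is the queue tentatively offered to server k.
    for perm in permutations(range(N), K):
        weight, assignment = 0, {}
        for server, queue in enumerate(perm):
            if connectivity[queue][server]:  # only connected pairs count
                weight += queue_lengths[queue]
                assignment[server] = queue
        if weight > best_weight:
            best_weight, best_assignment = weight, assignment
    return best_assignment, best_weight
```

For example, with queue lengths (5, 3, 2), two servers, and a connectivity matrix where queue 0 sees only server 0, queue 1 sees both, and queue 2 sees only server 1, MWM serves queues 0 and 1 for a total weight of 8. A production implementation would replace the enumeration with a polynomial-time weighted bipartite matching algorithm (e.g. the Hungarian method).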

Additional Metadata
Persistent URL dx.doi.org/10.1109/CDC.2011.6160652
Conference 2011 50th IEEE Conference on Decision and Control and European Control Conference, CDC-ECC 2011
Citation
Halabian, H. (Hassan), Lambadaris, I., & Lung, C.-H. (2011). Delay optimal server assignment to symmetric parallel queues with random connectivities. Presented at the 2011 50th IEEE Conference on Decision and Control and European Control Conference, CDC-ECC 2011. doi:10.1109/CDC.2011.6160652