Going Fast and Fair: Latency Optimization for Cloud-Based Service Chains

Published in IEEE Network, 2017

Recommended citation: Yuchao Zhang, Ke Xu, Haiyang Wang, Qi Li, Tong Li, and Xuan Cao. "Going Fast and Fair: Latency Optimization for Cloud-Based Service Chains". IEEE Network, pp. 138-143, 2017.

State-of-the-art microservices have been attracting growing attention in recent years. A broad spectrum of online interactive applications are now programmed as service chains on the cloud, seeking better system scalability and lower operating costs. Unlike conventional batch jobs, most of these applications consist of multiple stand-alone services that communicate with each other. These step-by-step operations unavoidably introduce higher latency to the delay-sensitive chained services. In this article, we aim to design an optimization approach that reduces the latency of chained services. Specifically, we present a measurement and analysis of chained services on Baidu's cloud platform; our real-world trace indicates that these chained services suffer from significantly high latency because each request is queued multiple times, in different queues, on the cloud servers along the chain. This feature makes it challenging to optimize a microservice's overall queueing delay. To address this problem, we propose a delay-guaranteed approach that accelerates the overall queueing of chained services while maintaining fairness across all workloads. Our evaluations on Baidu servers show that the proposed design reduces the latency of chained services by 35 percent with minimal impact on other workloads.
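The core observation above — that a chained request pays a queueing penalty at every hop — can be illustrated with a minimal sketch. This is not the paper's algorithm; it simply models each hop as an M/M/1 queue (an assumption for illustration) and shows that the chain's mean latency is the sum of per-hop sojourn times:

```python
# Illustrative sketch (not the paper's algorithm): why chained services
# accumulate queueing delay. Each hop is modeled as an M/M/1 queue; a
# request queued k times along the chain pays k queueing penalties.

def mm1_sojourn_time(arrival_rate: float, service_rate: float) -> float:
    """Mean time in system (waiting + service) for a stable M/M/1 queue."""
    assert arrival_rate < service_rate, "queue must be stable"
    return 1.0 / (service_rate - arrival_rate)

def chain_latency(hops: list) -> float:
    """Total mean latency of a service chain: sum of per-hop sojourn times."""
    return sum(mm1_sojourn_time(lam, mu) for lam, mu in hops)

# A 4-hop chain at 80% utilization per hop vs. a single equivalent service:
hops = [(8.0, 10.0)] * 4             # (arrival rate, service rate) per hop
print(chain_latency(hops))           # -> 2.0 (four queueing penalties)
print(chain_latency([(8.0, 10.0)]))  # -> 0.5 (one queueing penalty)
```

Under this toy model the 4-hop chain is 4x slower than a single service even though every hop is individually well-provisioned, which is the effect the proposed approach targets.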

Download paper here