Communications of the ACM

ACM TechNews

AI Could Help Data Centers Run Far More Efficiently


A system developed by researchers at the Massachusetts Institute of Technology learns how to allocate data processing operations most efficiently across thousands of servers.

Credit: MIT News

Massachusetts Institute of Technology (MIT) researchers have created a system that automatically learns how to allocate data processing workloads across thousands of servers, boosting data center efficiency.

The scheduler, called Decima, uses reinforcement learning to make scheduling decisions tailored to specific workloads running on specific server clusters.

Decima experiments with different ways of allocating incoming workloads across the servers, learning by trial and error the best trade-off between computational resource use and processing speed.
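
The loop the article describes, trying allocations and learning from the resulting trade-off, is the core of reinforcement learning. The sketch below is a minimal, hypothetical illustration in Python: a tabular Q-learning agent that learns to route incoming jobs across a toy simulated cluster. It is not Decima's actual algorithm, which learns over real data-processing job graphs; every name, reward, and parameter here is an invented stand-in.

```python
import random

NUM_SERVERS = 3          # toy cluster size (hypothetical)
EPISODES = 2000          # training episodes
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1  # learning rate, discount, exploration

# State: per-server queue lengths, capped so the Q-table stays small.
# Action: index of the server that receives the next incoming job.
Q = {}

def get_q(state, action):
    return Q.get((state, action), 0.0)

def choose_action(state):
    # Epsilon-greedy: mostly exploit the best-known action, sometimes explore.
    if random.random() < EPSILON:
        return random.randrange(NUM_SERVERS)
    return max(range(NUM_SERVERS), key=lambda a: get_q(state, a))

for _ in range(EPISODES):
    loads = [0] * NUM_SERVERS
    for _step in range(30):                      # one job arrives per step
        state = tuple(min(l, 5) for l in loads)
        action = choose_action(state)
        loads[action] += 1
        # Reward penalizes queueing behind other jobs on a busy server,
        # a crude stand-in for slower completion time.
        reward = -loads[action]
        # One randomly chosen server finishes a queued job this step.
        done = random.randrange(NUM_SERVERS)
        loads[done] = max(0, loads[done] - 1)
        next_state = tuple(min(l, 5) for l in loads)
        best_next = max(get_q(next_state, a) for a in range(NUM_SERVERS))
        # Standard Q-learning update.
        Q[(state, action)] = get_q(state, action) + ALPHA * (
            reward + GAMMA * best_next - get_q(state, action))

# The learned greedy policy routes new jobs to the least-loaded server,
# discovered purely from the reward signal.
state = (0, 3, 1)
best = max(range(NUM_SERVERS), key=lambda a: get_q(state, a))
print("Preferred server for queue lengths", state, "->", best)
```

The reward term mirrors the trade-off the article describes: penalizing placements on busy servers teaches the policy to spread work so jobs finish quickly without wasting idle capacity.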

Decima completes jobs about 20% to 30% faster than the best handwritten scheduling algorithms, the researchers say.

MIT’s Hongzi Mao observed that “any slight improvement in utilization, even 1%, can save millions of dollars and a lot of energy in data centers.”

From MIT News


Abstracts Copyright © 2019 SmithBucklin, Washington, DC, USA


