AI system optimally allocates workloads across thousands of servers to cut costs, save energy

A novel system developed by MIT researchers automatically "learns" how to schedule data-processing operations across thousands of servers—a task traditionally reserved for imprecise, human-designed algorithms. Doing so could help today's power-hungry data centers run far more efficiently.
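The article gives no implementation details, but the core idea it describes, a policy that learns from feedback how to place jobs on servers rather than following a fixed hand-written rule, can be illustrated with a toy sketch. The example below is purely hypothetical and not the MIT system: it uses a one-parameter softmax policy trained with a REINFORCE-style update to learn that jobs should go to less-loaded servers (minimizing makespan). All names, the reward choice, and the single-weight policy are illustrative assumptions.

```python
import math
import random

random.seed(0)

NUM_SERVERS = 4  # toy cluster size; real systems schedule across thousands

def softmax(scores):
    """Numerically stable softmax over a list of scores."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def run_episode(w, jobs):
    """Assign each job to a server with the current policy.

    Returns (makespan, trajectory), where makespan is the load of the
    busiest server and trajectory records each stochastic decision.
    """
    loads = [0.0] * NUM_SERVERS
    traj = []
    for job in jobs:
        # Policy: score each server by -w * current load, so a positive w
        # means "prefer less-loaded servers"; sample from the softmax.
        scores = [-w * load for load in loads]
        probs = softmax(scores)
        choice = random.choices(range(NUM_SERVERS), weights=probs)[0]
        traj.append((list(loads), choice, probs))
        loads[choice] += job
    return max(loads), traj

def train(num_episodes=200, lr=0.005):
    """Learn w via REINFORCE with a moving-average baseline."""
    w = 0.0
    baseline = None
    for _ in range(num_episodes):
        jobs = [random.uniform(1, 10) for _ in range(20)]
        makespan, traj = run_episode(w, jobs)
        reward = -makespan  # shorter makespan -> higher reward
        baseline = reward if baseline is None else 0.9 * baseline + 0.1 * reward
        advantage = reward - baseline
        for loads, choice, probs in traj:
            # Gradient of log softmax w.r.t. w for the chosen action:
            # d(score_a)/dw = -loads[a], minus its expectation under probs.
            expected = sum(p * (-l) for p, l in zip(probs, loads))
            grad = (-loads[choice]) - expected
            w += lr * advantage * grad
    return w

learned_w = train()
```

In this sketch the "human-designed algorithm" would be a fixed rule like "always pick the least-loaded server"; the learned policy instead discovers that preference from reward signals alone, which is the general approach the article attributes to the MIT system.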

from News on Artificial Intelligence and Machine Learning https://ift.tt/2Zg0FA9