A novel system developed by MIT researchers automatically "learns" how to schedule data-processing operations across thousands of servers—a task traditionally reserved for imprecise, human-designed algorithms. Doing so could help today's power-hungry data centers run far more efficiently.
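The article gives no implementation details, so as context here is a toy sketch of the kind of human-designed heuristic such learned schedulers aim to improve on: greedily assigning each job to the currently least-loaded server. The function name, the job model (a list of run-time costs), and the makespan metric are all illustrative assumptions, not the MIT system's actual design.

```python
def greedy_schedule(jobs, num_servers):
    """Assign each job (given as a run-time cost) to the least-loaded server.

    This is a classic hand-designed baseline; a learned scheduler would
    instead pick placements from a trained policy. Returns the per-job
    server assignment and the makespan (load of the busiest server).
    """
    loads = [0.0] * num_servers
    assignment = []
    for cost in jobs:
        server = min(range(num_servers), key=lambda i: loads[i])
        loads[server] += cost
        assignment.append(server)
    return assignment, max(loads)

# Example: four jobs on two servers.
assignment, makespan = greedy_schedule([3, 1, 2, 2], num_servers=2)
print(assignment, makespan)  # → [0, 1, 1, 0] 5
```

Note that the greedy choice is not optimal here (splitting the jobs as {3, 1} and {2, 2} gives a makespan of 4), which illustrates why hand-tuned heuristics leave efficiency on the table and why a system that learns placements from workload data can do better.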
Source: News on Artificial Intelligence and Machine Learning — https://ift.tt/2Zg0FA9
AI system optimally allocates workloads across thousands of servers to cut costs, save energy