Abstract
This paper examines the use of an Uninterruptible Power Supply (UPS) to enhance the operational efficiency of data centers. It develops an optimal energy scheduling strategy for a UPS-equipped data center using the Markov decision process (MDP) framework, which models the decision-making involved in minimizing energy costs. The available power output of each unit in the data center is treated as a Markov state, capturing the uncertainty associated with renewable distributed generation; this uncertainty drives the system to transition to other Markov states at subsequent decision times. A recursive optimization model is established for each Markov state at each decision time to guide state-based operation, determining the unit output while accounting for both current and future costs. The high dimensionality arising from the large number of states and actions in the model is addressed by an approximate dynamic programming (ADP) method, which incorporates a decision-state formulation and a forward dynamic programming algorithm to tackle the complexity of the MDP-based model. By employing ADP, the computational burden is reduced, enabling efficient and practical solutions to be obtained.
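The recursive MDP formulation and forward ADP pass described above can be illustrated with a minimal sketch. All numbers here (horizon, prices, demand, renewable levels, penalty) are invented for illustration and are not taken from the paper; the value function is a simple lookup table updated by smoothed forward passes, and the post-decision state is approximated by the current renewable state.

```python
import random

random.seed(0)

# Toy setting (illustrative values, not from the paper):
# each hour the operator chooses how much grid power to buy;
# renewable output is the Markov state and drifts randomly.
T = 24                                 # decision times (hours)
RENEW_STATES = [0, 1, 2]               # renewable output levels (MW)
DEMAND = 3                             # constant load (MW)
PRICE = [0.2] * 8 + [0.5] * 8 + [0.3] * 8  # grid price by hour
ACTIONS = [0, 1, 2, 3]                 # grid purchase levels (MW)

def transition(s):
    """Markov transition of renewable output: drift down, stay, or up."""
    return max(0, min(2, s + random.choice([-1, 0, 1])))

def stage_cost(t, renew, buy):
    """Current cost: energy purchase plus a penalty for unserved load."""
    unmet = max(0, DEMAND - renew - buy)
    return PRICE[t] * buy + 10.0 * unmet

# Value function approximation: lookup table V[t][state],
# refined by repeated forward passes (the ADP idea).
V = [[0.0] * len(RENEW_STATES) for _ in range(T + 1)]

def adp_forward(n_iters=200, alpha=0.1):
    for _ in range(n_iters):
        s = random.choice(RENEW_STATES)
        for t in range(T):
            # Pick the action minimizing current + approximated future cost;
            # the current state stands in for the post-decision state here.
            best = min(ACTIONS,
                       key=lambda a: stage_cost(t, s, a) + V[t + 1][s])
            observed = stage_cost(t, s, best) + V[t + 1][s]
            # Smoothed update of the value estimate for this state.
            V[t][s] = (1 - alpha) * V[t][s] + alpha * observed
            s = transition(s)           # uncertainty moves the Markov state

adp_forward()

# Greedy decision at t=0 when renewables cover 2 of the 3 MW load:
policy_t0 = min(ACTIONS, key=lambda a: stage_cost(0, 2, a) + V[1][2])
```

With these numbers, buying exactly the 1 MW shortfall at the cheap hour dominates both under-buying (heavy unmet-load penalty) and over-buying (wasted purchase), so `policy_t0` is 1; the forward passes play the role of the recursive state-based optimization while avoiding full backward enumeration over all states and actions.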