David Wenzhong Gao
Mohammad A. Matin
Building energy, Deep reinforcement learning, DLMP, Game theory, Power system, Reinforcement learning
Most current game-theoretic demand-side management methods focus primarily on the scheduling of home appliances, and the related numerical experiments are analyzed under various scenarios to reach the corresponding Nash equilibrium (NE) and optimal results. However, little work has been conducted for academic or commercial buildings, and the methods for optimizing academic buildings are distinct from those for home appliances. In this study, we propose a novel methodology to control the operation of heating, ventilation, and air conditioning (HVAC) systems.
We assume that each building on our campus is equipped with a smart meter and communication system, as envisioned in the future smart grid. In academic and commercial buildings, HVAC systems consume considerable electrical energy and affect the personnel inside, an impact that is interpreted as a monetary value in this work. We therefore define social cost as the combination of energy expense and the cost of reduced human working productivity. We apply game theory to formulate a control and scheduling game for the HVAC system, in which the players are the building managers and their strategies are the indoor temperature settings for their corresponding buildings. We use the University of Denver campus power system as the demonstration smart grid, and we assume that the utility company adopts the real-time pricing mechanism demonstrated in this work to reflect energy usage and power system conditions in real time. For general scenarios, the globally optimal result in terms of minimizing social cost is reached at the Nash equilibrium of the formulated objective function. The proposed distributed HVAC control system requires each manager to set the indoor temperature to the best-response strategy to optimize overall management. Building managers will be willing to participate in the proposed game because it saves energy cost while keeping the indoor temperature within a comfortable zone.
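The game structure described above can be sketched in a few lines: managers iterate best responses to one another's setpoints until no one wants to deviate, which is a Nash equilibrium. All numbers below (cost coefficients, the 22 °C comfort point, the linear real-time price model, the 26 °C outdoor temperature) are illustrative assumptions for this sketch, not values from the thesis.

```python
# Toy best-response dynamics for a setpoint-selection game among building
# managers. Each manager minimizes a social cost: energy expense (priced in
# real time against aggregate cooling demand) plus a productivity-loss proxy.
# Every coefficient here is a stand-in chosen for illustration only.

GRID = [20.0 + 0.5 * k for k in range(13)]   # candidate setpoints, 20.0 .. 26.0 C
OUTDOOR = 26.0                               # assumed outdoor temperature, C

def rt_price(total_demand):
    """Real-time price rises linearly with aggregate cooling demand (toy model)."""
    return 0.5 + 0.1 * total_demand

def social_cost(own, others):
    """Energy expense plus a productivity-loss proxy for one building."""
    demand = (OUTDOOR - own) + sum(OUTDOOR - s for s in others)
    energy = rt_price(demand) * (OUTDOOR - own) ** 2
    discomfort = 4.0 * (own - 22.0) ** 2     # deviation from an assumed comfort point
    return energy + discomfort

def best_response(others):
    return min(GRID, key=lambda s: social_cost(s, others))

def nash_by_best_response(setpoints, max_iters=50):
    """Iterate simultaneous best responses until the profile is a fixed point."""
    for _ in range(max_iters):
        new = [best_response(setpoints[:i] + setpoints[i + 1:])
               for i in range(len(setpoints))]
        if new == setpoints:                 # fixed point: no manager deviates
            return new
        setpoints = new
    return setpoints

print(nash_by_best_response([25.0, 25.0, 25.0]))
```

In this toy instance the three managers converge to a symmetric equilibrium in a couple of rounds; the distributed scheme in the abstract follows the same pattern, with each manager only needing its own cost and the posted real-time price.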
With the development of artificial intelligence and computing technologies, reinforcement learning (RL) can be applied in many realistic scenarios and help solve a wide range of real-world problems. RL, often regarded as a cornerstone of future AI, builds the bridge between agents and environments through Markov decision processes or neural networks, yet it has seldom been used in power systems. The appeal of RL is that once a simulator for a specific environment is built, the algorithm can keep learning from that environment. RL is therefore capable of handling constantly changing simulator inputs such as power demand, power system conditions, and outdoor temperature. Compared with existing distribution power system planning mechanisms and the related game-theoretic methodologies, our proposed algorithm can plan and optimize hourly energy usage, and it can operate with even shorter time windows if needed. The combination of deep neural networks and reinforcement learning has propelled research on deep reinforcement learning, and this work contributes to power energy management by developing and applying deep reinforcement learning to control HVAC systems in the distribution power system.
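The agent-environment loop described above can be illustrated with a minimal tabular sketch: an agent observes a discretized real-time price, picks an HVAC setpoint, and receives a negative-cost reward. The two-level price model, the candidate setpoints, and the quadratic cost terms are all assumptions made for this sketch; the thesis itself uses a deep network and a full campus simulator rather than this small table.

```python
import random

# Minimal tabular value-learning sketch of the RL idea: learn, per price level,
# which setpoint minimizes cost. Because the toy environment has no temporal
# state transition, the update below is a bandit-style average of the reward
# rather than a full Q-learning bootstrap. All numbers are illustrative.
random.seed(0)

PRICES = [0.5, 1.5]              # low / high real-time price states (assumed)
ACTIONS = [21.0, 23.0, 25.0]     # candidate setpoints, C (assumed)

def reward(price, setpoint):
    energy = price * (26.0 - setpoint) ** 2   # cooling cost vs. 26 C outdoors
    discomfort = (setpoint - 22.0) ** 2       # productivity-loss proxy
    return -(energy + discomfort)

Q = {(s, a): 0.0 for s in range(len(PRICES)) for a in range(len(ACTIONS))}
alpha, epsilon = 0.1, 0.1

for step in range(20000):
    s = random.randrange(len(PRICES))         # price level drawn each hour
    if random.random() < epsilon:             # epsilon-greedy exploration
        a = random.randrange(len(ACTIONS))
    else:
        a = max(range(len(ACTIONS)), key=lambda x: Q[(s, x)])
    r = reward(PRICES[s], ACTIONS[a])
    Q[(s, a)] += alpha * (r - Q[(s, a)])      # move estimate toward observed reward

# Greedy policy: preferred setpoint for each price level after learning.
policy = {PRICES[s]: ACTIONS[max(range(len(ACTIONS)), key=lambda x: Q[(s, x)])]
          for s in range(len(PRICES))}
print(policy)
```

In this toy environment the learned policy lets the indoor temperature drift warmer when the real-time price is high, which is the qualitative behavior the abstract attributes to the proposed controller.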
Simulation results show that the proposed methodology can set the indoor temperature with respect to real-time prices and the number of occupants, maintain indoor comfort, and reduce both individual building energy costs and the overall campus electricity charges. Compared with the traditional game-theoretic methodology, the RL-based gaming methodology reaches the optimal results much faster.
Copyright is held by the author. User is responsible for all copyright compliance.
Hao, Jun, "Deep Reinforcement Learning for the Optimization of Building Energy Control and Management" (2020). Electronic Theses and Dissertations. 1775.
Received from ProQuest
Electrical engineering, Energy, Engineering