Abstract
Transportation electrification is one of the core policies actively pursued by countries around the world, and the strategic integration of electric vehicles (EVs) holds the key to a reliable and resilient future power grid. To address the challenges of integrating fleet EVs (FEVs) into distribution grid operations, this dissertation presents a comprehensive framework that applies advanced optimization and machine learning techniques to the control and deployment of FEV charging. The framework spans both cloud-based and edge-based approaches: cloud-based solutions provide centralized optimization methods deployed on platforms operated by utilities or distribution system operators, while edge-based solutions enable decentralized control at the grid edge, where advanced methods such as federated reinforcement learning can be applied.

Chapter 2 develops a two-stage stochastic optimization model for the strategic placement of FEV charging stations (FEVCSs) to enhance the resilience of distribution networks. Focusing on high-impact, low-probability (HILP) events such as hurricanes, this centralized model, deployable on a utility's cloud platform, accounts for uncertainties in both the power grid and the transportation network, ensuring operational efficiency and grid reliability during outages.

Chapter 3 shifts the focus to a cloud-based approach for aggregating EV charging and other distributed energy resources for market participation. A bilevel stochastic optimization framework models the FEVCS as part of a Distributed Energy Resource Aggregator (DERA) that provides energy and ancillary services in the ISO day-ahead market. The model demonstrates the potential of FEVCSs to provide ancillary services and capacity reserves while accounting for their impact on locational marginal prices (LMPs).

Chapter 4 introduces a novel cloud-edge collaboration framework based on federated reinforcement learning to enable decentralized control of FEVCSs in the distribution system. The proposed Federated Learning-Enhanced Conflict-Aware Multi-Agent Reinforcement Learning (FLE-CA-MARL) framework allows individual FEVs, modeled as agents, to coordinate their actions locally at the grid edge, contributing to voltage regulation and grid stability. This hybrid approach, which combines cloud-level coordination with edge-based decision-making, effectively addresses the computational complexity, latency, and data-privacy challenges inherent in large-scale EV charging management.

Together, the work in this dissertation enables grid-optimized, intelligent deployment and management of FEV charging by developing a suite of cloud-based optimization models and edge-based distributed control using federated reinforcement learning. Simulation results on different IEEE test systems demonstrate the effectiveness of these methods in enabling FEVs to improve grid resilience, voltage stability, and market participation, offering a scalable and adaptive solution for the future deployment of FEVCSs in modern smart grids.