Title: Data-Driven Framework for Energy Management in Extended Range Electric Vehicles Used in Package Delivery Applications
Author: Wang, Pengyue
Type: Thesis or Dissertation
Issued: 2020-08
Available: 2021-01-13
URI: https://hdl.handle.net/11299/217799
Description: University of Minnesota Ph.D. dissertation. August 2020. Major: Mechanical Engineering. Advisor: William Northrop. 1 computer file (PDF); xi, 163 pages.
Language: en
Keywords: energy management; machine learning; modeling; plug-in hybrid electric vehicles; reinforcement learning

Abstract:
Plug-in Hybrid Electric Vehicles (PHEVs) have the potential to achieve high fuel efficiency and reduce on-road emissions compared to engine-powered vehicles when using well-designed Energy Management Strategies (EMSs). The EMS of PHEVs has been a research focus for many years, and optimal or near-optimal performance has been achieved using control-oriented approaches like Dynamic Programming (DP) and Model Predictive Control (MPC). These approaches require either accurate predictive models of trip information during driving cycles or detailed velocity profiles in advance. However, such detailed information is not feasible to obtain in some real-world applications, like the delivery vehicle application studied in this work. Here, data-driven approaches were developed and tested over real-world trips with the help of two-way Vehicle-to-Cloud (V2C) connectivity.

First, the EMS problem was formulated as a probability density estimation problem and solved by Bayesian inference (a conjugate-update sketch follows this record). The Bayesian algorithm elegantly handles both conditions that characterize the data generated by delivery vehicles: only small amounts of data are available, and parameters must be estimated sequentially. The predicted value of the parameter for the next trip is determined by carefully designed prior information and all the data available for the vehicle so far. The parameter is updated before each delivery task using the latest trip information and stays static during the trip. This method was demonstrated on 13 vehicles with 155 real-world delivery trips in total and achieved an average energy efficiency improvement of 8.9% in MPGe (miles per gallon equivalent).

For vehicles with sufficient data to represent the characteristics of future delivery trips, the EMS problem was instead formulated as a sequential decision-making problem under uncertainty and solved by deep reinforcement learning (DRL) algorithms. An intelligent agent was trained by interacting with a simulated environment built from the vehicle model and historical trips (a toy training loop is sketched below). After training and validation, the optimized EMS parameter was updated by the trained agent during the trip. This method was demonstrated on 3 vehicles with 36 real-world delivery trips in total and achieved an average energy efficiency improvement of 20.8% in MPGe.

Finally, I investigated three problems that could be encountered when the developed DRL algorithms are deployed in real-world applications: model uncertainty, environment uncertainty, and adversarial attacks. For model uncertainty, an uncertainty-aware DRL agent was developed, enabled by a Bayesian ensemble technique (sketched below). Given a state, the agent quantifies the uncertainty about its output action: actions are still calculated for all input states, but the high uncertainty associated with unfamiliar or novel states is captured. For environment uncertainty, a risk-aware DRL agent was built on distributional RL algorithms. Instead of making decisions based on expected returns, as standard RL algorithms do, actions were chosen with respect to Conditional Value at Risk (CVaR), which gives more flexibility to the user and can be adapted to different application scenarios (see the CVaR selection sketch below).
Lastly, the influence of adversarial attacks on the developed neural-network-based DRL agents was quantified (an FGSM-style sketch closes this record). My work shows that, to apply DRL agents to real-world transportation systems, adversarial examples in the form of cyber-attacks should be considered carefully.
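The following sketches illustrate the core techniques named in the abstract; they are generic stand-ins, not the dissertation's implementations. First, sequential Bayesian estimation of a single EMS parameter, assuming (hypothetically) a Gaussian prior and a Gaussian per-trip likelihood so the posterior has a closed-form conjugate update; the actual model, parameterization, and noise levels in the dissertation are not specified here.

```python
# Minimal sketch of sequential Bayesian updating of one EMS parameter.
# Conjugate normal-normal model: the posterior after each trip becomes the
# prior for the next, which suits the small, sequentially arriving data
# generated by delivery vehicles.
import numpy as np

class GaussianParamEstimator:
    def __init__(self, prior_mean, prior_var, obs_var):
        self.mean = prior_mean   # current posterior mean of the parameter
        self.var = prior_var     # current posterior variance
        self.obs_var = obs_var   # assumed per-trip observation noise

    def update(self, observed_value):
        """Fold in the parameter value inferred from the latest trip."""
        precision = 1.0 / self.var + 1.0 / self.obs_var
        self.mean = (self.mean / self.var
                     + observed_value / self.obs_var) / precision
        self.var = 1.0 / precision

    def predict(self):
        """Parameter to hold static during the next delivery trip."""
        return self.mean

# Usage: update once per completed trip, before the next departure.
est = GaussianParamEstimator(prior_mean=0.5, prior_var=0.25, obs_var=0.1)
for trip_value in [0.62, 0.58, 0.66]:   # fabricated example values
    est.update(trip_value)
print(est.predict(), est.var)
```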
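Next, a toy version of training an agent against a simulated trip environment. The environment below (discretized battery state of charge, two engine-power actions, hand-picked rewards) is invented for illustration; the dissertation's simulator is built from the vehicle model and historical trips, and its agents are deep networks rather than this tabular Q-learner.

```python
# Toy RL training loop: the agent learns when to use engine assist versus
# electric-only operation as battery charge depletes.
import numpy as np

rng = np.random.default_rng(0)
N_SOC, N_ACT, N_STEPS = 11, 2, 50   # SOC bins, actions, steps per episode

def step(soc_bin, action):
    """Hypothetical dynamics: action 0 = electric only (SOC drops),
    action 1 = engine assist (SOC held, fuel penalty)."""
    if action == 0:
        next_soc = max(soc_bin - 1, 0)
        reward = -1.0 if next_soc == 0 else 0.0   # penalize depletion
    else:
        next_soc = soc_bin
        reward = -0.3                              # fuel cost
    return next_soc, reward

Q = np.zeros((N_SOC, N_ACT))
alpha, gamma, eps = 0.1, 0.99, 0.1
for episode in range(500):
    soc = N_SOC - 1                                # start fully charged
    for _ in range(N_STEPS):
        a = rng.integers(N_ACT) if rng.random() < eps else int(np.argmax(Q[soc]))
        nxt, r = step(soc, a)
        Q[soc, a] += alpha * (r + gamma * Q[nxt].max() - Q[soc, a])
        soc = nxt
print(np.argmax(Q, axis=1))   # learned action per SOC bin
```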
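For the uncertainty-aware agent, the idea is that an ensemble's disagreement flags unfamiliar states. The sketch below uses a bootstrap ensemble of simple linear value models as a stand-in for the dissertation's Bayesian-ensemble networks; the features, data, and ensemble size are fabricated.

```python
# Sketch of ensemble-based uncertainty on an agent's output: predictions are
# still produced for every input state, but the spread across ensemble
# members is large for states far from the training data.
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 4))                 # fabricated state features
y = X @ np.array([1.0, -2.0, 0.5, 0.0]) + rng.normal(scale=0.1, size=200)

ensemble = []
for _ in range(10):                            # one bootstrap fit per member
    idx = rng.integers(0, len(X), len(X))
    w, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
    ensemble.append(w)

def predict_with_uncertainty(state):
    preds = np.array([state @ w for w in ensemble])
    return preds.mean(), preds.std()          # std quantifies uncertainty

familiar = X[0]
novel = 10.0 * rng.normal(size=4)             # far outside training data
print(predict_with_uncertainty(familiar))     # small std
print(predict_with_uncertainty(novel))        # much larger std
```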
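For the risk-aware agent, the CVaR selection rule described in the abstract can be shown directly: instead of ranking actions by mean return, rank them by the mean of the worst alpha-fraction of return quantiles. The quantile values below are fabricated; a distributional RL critic (e.g., a quantile-regression network) would supply them in practice.

```python
# Sketch of risk-aware action selection via Conditional Value at Risk.
import numpy as np

def cvar(quantiles, alpha):
    """Mean of the lowest alpha-fraction of sorted return quantiles."""
    q = np.sort(quantiles)
    k = max(1, int(np.ceil(alpha * len(q))))
    return q[:k].mean()

# Rows: actions; columns: estimated return quantiles per action.
quantile_table = np.array([
    [-5.0, 0.0, 2.0, 4.0, 9.0],    # higher mean, heavy downside risk
    [ 0.5, 1.0, 1.5, 2.0, 2.5],    # modest mean, low risk
])
for alpha in (1.0, 0.2):           # alpha=1 recovers risk-neutral choice
    scores = [cvar(row, alpha) for row in quantile_table]
    print(alpha, int(np.argmax(scores)))   # picks action 0, then action 1
```

Varying alpha is what gives the user the flexibility mentioned in the abstract: alpha = 1 reproduces the standard expected-return policy, while small alpha makes the agent avoid actions with severe worst-case outcomes.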
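Finally, an adversarial-attack sketch in the spirit of the fast gradient sign method (FGSM): a small, sign-bounded perturbation of the observed state that degrades the policy's confidence in its chosen action. A toy linear-softmax policy is used so the input gradient has a closed form; the dissertation attacks trained deep DRL agents, whose gradients would come from automatic differentiation.

```python
# FGSM-style adversarial perturbation of a policy's state input.
import numpy as np

W = np.array([[ 1.0, -0.5],
              [-1.0,  2.0]])                  # fabricated policy weights

def policy(state):
    logits = W @ state
    e = np.exp(logits - logits.max())
    return e / e.sum()

def fgsm(state, eps):
    """Perturb the state to reduce the chosen action's probability."""
    p = policy(state)
    a = int(np.argmax(p))
    # Gradient of log p(a|s) w.r.t. s for a linear-softmax policy.
    grad = W[a] - p @ W
    return state - eps * np.sign(grad)        # step against the gradient

s = np.array([1.0, 0.2])
print(policy(s))                              # confident in action 0
print(policy(fgsm(s, eps=0.5)))               # confidence collapses
```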