Follow the leader: a deep reinforcement learning framework for safe and efficient autonomous car-following

2026-02-07

Andres L. Marin, Fernando Martínez-Plumed, Michail A. Makridis, Alessandro Tansini, Luca Pulvirenti, Jaime Suarez Corujo, Dimitrios Komnos, María José Ramírez-Quintana, Carlos Monserrat, Georgios Fontaras,
"Follow the leader: a deep reinforcement learning framework for safe and efficient autonomous car-following,"
Journal of Intelligent Transportation Systems,
2025,
ISSN 1547-2450,
https://doi.org/10.1080/15472450.2025.2576907.
(https://www.sciencedirect.com/science/article/pii/S154724502500074X)
Abstract: Autonomous and semi-autonomous driving technologies offer a promising path to safer and more efficient transportation, potentially reducing energy use and emissions. However, these advantages may not materialize without carefully targeted optimization and purpose-built datasets for evaluating optimization performance. Drawing on an analysis of both driver actions and vehicle dynamics in a real-world dataset, we present a novel reward function for the Deep Deterministic Policy Gradient (DDPG) method. The reward function aims to minimize energy consumption and travel-time increases while keeping acceleration within physical limits. The approach yields average energy savings of 7% across vehicle powertrains, including Internal Combustion Engine Vehicles (ICEV), Plug-in Hybrid Electric Vehicles (PHEV), and Battery Electric Vehicles (BEV), while keeping acceleration within realistic ranges, maintaining safe following distances, and limiting trip-time increases to below 1%. The dataset can be accessed at https://data.jrc.ec.europa.eu/collection/id-00437.
Keywords: Car-following; deep reinforcement learning; driver behavior modeling; energy efficiency
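The abstract describes a reward that trades off energy use, travel time, and physically plausible acceleration while preserving a safe following gap. The paper's actual formulation is not reproduced here; the sketch below is a minimal, hypothetical illustration of that kind of multi-objective car-following reward, with all weights, thresholds, and the tractive-power proxy chosen for illustration only.

```python
def car_following_reward(accel, speed, target_speed, gap,
                         min_gap=2.0, max_accel=3.0,
                         w_energy=1.0, w_time=0.5):
    """Illustrative car-following reward (hypothetical, not the paper's).

    accel        -- ego acceleration in m/s^2
    speed        -- ego speed in m/s
    target_speed -- leader or desired speed in m/s
    gap          -- bumper-to-bumper distance to the leader in m
    """
    # Hard penalty when the following gap is unsafe or the commanded
    # acceleration leaves the assumed physical envelope.
    if gap < min_gap or abs(accel) > max_accel:
        return -10.0

    # Energy proxy: |a * v| approximates instantaneous tractive power
    # per unit mass (an assumption; real models use powertrain maps).
    energy_pen = abs(accel * speed)

    # Travel-time term: penalize driving slower than the target speed,
    # which would otherwise stretch the trip duration.
    time_pen = max(0.0, target_speed - speed)

    return -(w_energy * energy_pen + w_time * time_pen)
```

A steady cruise at the target speed with zero acceleration and a safe gap scores 0, the maximum; closing below `min_gap` or exceeding `max_accel` triggers the flat unsafe penalty, steering a DDPG agent toward smooth, efficient following.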