Details

HUDs Up! Investigating the Impact of Heads-Up Displays and Cognitive Workload on Driver Takeover Performance in Automated Driving Scenarios

Year: 2025

Term: Winter

Student Name: Jacob Zekorn

Supervisor: Nadine Moacdieh

Abstract: Autonomous vehicles have become significantly more prevalent in the market, and ensuring that the interaction between these autonomous systems and their human operators is safe remains a critical focus of research. This project investigated how varying levels of cognitive workload induced by non-driving-related tasks (NDRTs) interact with the presence of helpful heads-up display (HUD) elements to influence a driver’s response to a takeover request (TOR). A virtual reality (VR) driving environment was developed in Unity3D, integrating hardware such as a steering system and a VR headset to create an immersive simulation. Forty participants completed driving scenarios featuring three levels of NDRT difficulty (control, 1-Back, 2-Back) under either a HUD or a no-HUD condition. Metrics including steering and pedal response times, average speed and speed deviation, eye-gaze measurements, NDRT performance, and subjective workload questionnaire scores were collected. A descriptive analysis of the collected data yielded some promising trends. Participants in the HUD condition showed a trend toward faster response times and higher average speeds. Subjective workload ratings also increased with NDRT difficulty in both conditions, with participants in the HUD condition reporting greater cognitive load. These findings open the conversation around the design and use of HUD systems in autonomous vehicles; however, further analysis of the data is needed to establish statistical significance.
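The abstract does not describe how the n-back NDRT was implemented, but the general logic of the task is standard: on each trial the participant sees a stimulus and must judge whether it matches the one shown n trials earlier (n = 1 for 1-Back, n = 2 for 2-Back). A minimal illustrative sketch, with hypothetical function names and parameters not taken from the project, might look like this:

```python
import random

def generate_nback_trials(num_trials, n, stimuli="ABCDEF", match_rate=0.3):
    """Generate a stimulus sequence for an n-back task.

    Each trial is a (stimulus, is_target) pair; a target trial repeats
    the stimulus shown n trials earlier. match_rate controls roughly how
    often targets are deliberately inserted (assumed value, not from the
    project).
    """
    trials = []
    for i in range(num_trials):
        if i >= n and random.random() < match_rate:
            stim = trials[i - n][0]        # force an n-back repeat (target)
        else:
            stim = random.choice(stimuli)  # may still match by chance
        is_target = i >= n and stim == trials[i - n][0]
        trials.append((stim, is_target))
    return trials

def score_responses(trials, responses):
    """Proportion of trials where a participant's yes/no response matches
    whether the trial was actually a target (a simple accuracy measure)."""
    correct = sum(resp == trial[1] for trial, resp in zip(trials, responses))
    return correct / len(trials)
```

Raising n from 1 to 2 increases working-memory load without changing the interface, which is what lets the study vary cognitive workload independently of the HUD manipulation.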