Date of Award

4-2025

Degree Name

Doctor of Philosophy

Department

Mechanical and Aerospace Engineering

First Advisor

Zachary D. Asher, Ph.D.

Second Advisor

Richard T. Meyer, Ph.D.

Third Advisor

Guan Yue Hong, Ph.D.

Fourth Advisor

Ali Riza Ekti, Ph.D.

Keywords

ADAS, AI, computer vision, GAN, inclement weather, snow automation

Abstract

Modern vehicles have undergone a transformation, with Advanced Driver Assistance Systems (ADAS) becoming the new standard and set to be mandated by the National Highway Traffic Safety Administration (NHTSA) for all passenger vehicles and light trucks. ADAS features have been shown to prevent or mitigate crashes by alerting or assisting the driver. ADAS typically uses a forward-facing camera, now standard in modern vehicles, to provide limited automation features such as Lane Keeping Assist (LKA) and Lane Centering Assist (LCA) that improve driver safety. These systems rely on the assumption that the vehicle surroundings and lane markings are clear and visible, but when a vehicle operates in adverse weather conditions such as heavy snow, they fail. Additionally, safe autonomous vehicle (AV) operation requires an accurate perception of the driving environment. In this work, we address this research gap using novel methods of obtaining perception data from both on-vehicle and infrastructure-based sensors. First, we outline a novel way to determine the safe driving region in snow-covered lanes using distinctive features such as tire tracks. These findings are anticipated to inform new ways of improving drivable region detection in snow-occluded lanes and of expanding the operational design domain (ODD) of ADAS. The second and third studies extend Study 1: purpose-built deep learning computer vision models were developed to investigate robust drivable region detection techniques. The fourth study details a method that estimates the extent of road snow coverage using an in-vehicle camera and infrastructure weather sensor data as inputs. The fifth study explores the advantages of gathering perception data through Vehicle-to-Infrastructure (V2I) technology, integrating infrastructure information sources (IISs) for real-world on-road vehicle perception and control tasks; it examines the strengths and limitations of an off-the-shelf vision-based ADAS alongside the newly proposed V2I technology for robust and efficient perception and control. Finally, the last study focuses on creating synthetic snow datasets using an open-source AV simulator and modern Deep Learning (DL) methods to translate clear road scenes into snow-covered road scenes, directly addressing the data scarcity challenges associated with adverse weather datasets. Together, these studies take a step toward broadening the ODD of ADAS in challenging weather, unforeseen road conditions, and edge cases.

Access Setting

Dissertation-Open Access
