cff-version: 1.2.0
abstract: "

The growing adoption of electric vehicles, which operate more quietly than internal combustion engine vehicles, raises concerns about their detectability, particularly for vulnerable road users. To address this, regulations mandate exterior sound signals for electric vehicles, specifying minimum sound pressure levels at low speeds. These synthetic exterior sounds are often emitted in noisy urban environments, creating the challenge of enhancing detectability without introducing excessive noise annoyance. This study investigates the design of synthetic exterior sound signals that balance high noticeability with low annoyance. An audiovisual experiment with 14 participants was conducted using 15 virtual reality scenarios featuring a passing car. The scenarios included various sound signals, such as pure, intermittent, and complex tones at different frequencies. Two baseline cases, a diesel engine and tyre noise only, were also tested. Participants rated the sounds for annoyance, noticeability, and informativeness on 11-point ICBEN scales. The findings show that psychoacoustic sound quality metrics predict annoyance ratings better than conventional sound metrics, providing insight into optimising sound design for electric vehicles. By improving pedestrian safety while minimising noise pollution, this research supports the development of effective and user-friendly exterior sound standards for electric vehicles.


The supplementary material contains:

* /data: Contains data collected during the experiment.

* /data/Participant_response: Output files generated by the Unity environment. Each subfolder follows the naming format Participant_{participant_number}_{YYYYMMDD}_{HHMMSS} and includes:

  - The participant's movement data, such as head and hand movements.

  - Responses to in-experiment questions.

  - A mapping file indicating the sequence of trials for each participant.

* /data/Response_form: Contains intake and post-trial questionnaire data and forms:

  - intake-questionnaire.csv: Responses to the intake questionnaire.

  - post-questionnaire.csv: Responses to the post-trial questionnaire.

  - intake-questionnaire.pdf: PDF of the intake questionnaire form.

  - post-questionnaire.pdf: PDF of the post-trial questionnaire form.

* mapping.csv: Mapping of stimuli.

* master_datasheet.mat: Responses from the listening experiment.

* measurements.mat: List of the audio files to be analysed.

* metrics.mat: Sound quality metrics produced by the analysis code.

* /sounds: Audio files used as stimuli in the Unity experimental setup (as emitted by the EV).

* /sounds_raw: Raw audio files used for SQM analysis (as perceived at the observer position).

* /SQAT-feature-EPNL: SQAT framework as downloaded from https://github.com/ggrecow/SQAT.

* sqm_analysis.m: Analysis script that computes the sound quality metrics and produces the figures.

* /unity_environment: Unity project containing the virtual environment used in the experiment.


NOTE:

A public repository with the maintained Unity code is available at: https://github.com/Shaadalam9/sound-ev

"
authors:
  - family-names: Bazilinskyy
    given-names: Pavlo
    orcid: "https://orcid.org/0000-0001-9565-8240"
  - family-names: Alam
    given-names: Md Shadab
    orcid: "https://orcid.org/0000-0001-9184-9963"
  - family-names: Merino-Martínez
    given-names: Roberto
    orcid: "https://orcid.org/0000-0003-2261-9595"
title: "Supplementary material for \"Psychoacoustic assessment of synthetic sounds for electric vehicles in a virtual reality experiment\""
keywords:
version: 1
identifiers:
  - type: doi
    value: 10.4121/1f8ae9be-950b-430e-9b75-e2b420dcaa26.v1
license: CC0
date-released: 2025-05-13