Hybrid auditory based interaction framework for driver assistance system

Research output: Contribution to journal › Article

3 Citations (Scopus)

Abstract

Problem statement: The rapid development of Driver Assistance Systems (DAS) provides drivers with radically enhanced information and functionality. The nature of current DAS requires complex human-machine interaction, which is distracting and may increase the risk of road accidents. The interaction between the driver and the DAS should aid the driving process without interfering with safety and ease of vehicle operation. The speech-based interaction mechanisms currently employed are not sufficiently robust to deal with the distraction and noise present in the vehicle's interior environment. Approach: Thus, suitable hybrid earcon/auditory-icon design principles for DAS were developed. These interfaces were investigated in driving simulators in order to test their durability and robustness. Several evaluation parameters were applied to ensure that driving-related information from the DAS was delivered to the driver without affecting the overall driving process. Results: This study produced auditory design principles for information mapping (visual into non-speech interaction) and a presentation framework. It outlines a representation architecture that enables concurrent auditory driving-related information to be transmitted from four different sources in the vehicle's interior environment. It also outlines a set of hybrid design principles that integrate auditory icons with earcons to map and present real-time driving-related data from a visual to a non-speech auditory interface. Conclusion/Recommendations: The major contribution of this research project is its holistic approach, which considers the entire DAS (safety, navigation and entertainment subsystems). It proposes a hybrid representation strategy based on the driver's cognitive availability and cognitive workload, a significant finding that will aid future DAS auditory interaction design.

Original language: English
Pages (from-to): 1499-1504
Number of pages: 6
Journal: Journal of Computer Science
Volume: 6
Issue number: 12
DOIs: 10.3844/jcssp.2010.1493.1498
Publication status: Published - 2010


Keywords

  • Auditory icon
  • Driver assistance system (DAS)
  • Earcon
  • Earcon recognition
  • Hierarchical architecture
  • Hybrid
  • Psychoacoustics basis
  • Synchronous
  • Synthetic speech

ASJC Scopus subject areas

  • Software
  • Computer Networks and Communications
  • Artificial Intelligence

Cite this

Hybrid auditory based interaction framework for driver assistance system. / Valbir Singh, Dalbir Singh.

In: Journal of Computer Science, Vol. 6, No. 12, 2010, p. 1499-1504.

Research output: Contribution to journal › Article

@article{96b902d3388f45d593b582e7ed1a3a8f,
title = "Hybrid auditory based interaction framework for driver assistance system",
abstract = "Problem statement: The rapid development of Driver Assistance Systems (DAS) provides drivers with radically enhanced information and functionality. The nature of current DAS requires complex human-machine interaction, which is distracting and may increase the risk of road accidents. The interaction between the driver and the DAS should aid the driving process without interfering with safety and ease of vehicle operation. The speech-based interaction mechanisms currently employed are not sufficiently robust to deal with the distraction and noise present in the vehicle's interior environment. Approach: Thus, suitable hybrid earcon/auditory-icon design principles for DAS were developed. These interfaces were investigated in driving simulators in order to test their durability and robustness. Several evaluation parameters were applied to ensure that driving-related information from the DAS was delivered to the driver without affecting the overall driving process. Results: This study produced auditory design principles for information mapping (visual into non-speech interaction) and a presentation framework. It outlines a representation architecture that enables concurrent auditory driving-related information to be transmitted from four different sources in the vehicle's interior environment. It also outlines a set of hybrid design principles that integrate auditory icons with earcons to map and present real-time driving-related data from a visual to a non-speech auditory interface. Conclusion/Recommendations: The major contribution of this research project is its holistic approach, which considers the entire DAS (safety, navigation and entertainment subsystems). It proposes a hybrid representation strategy based on the driver's cognitive availability and cognitive workload, a significant finding that will aid future DAS auditory interaction design.",
keywords = "Auditory icon, Driver assistance system (DAS), Earcon, Earcon recognition, Hierarchical architecture, Hybrid, Psychoacoustics basis, Synchronous, Synthetic speech",
author = "{Valbir Singh}, {Dalbir Singh}",
year = "2010",
doi = "10.3844/jcssp.2010.1493.1498",
language = "English",
volume = "6",
pages = "1499--1504",
journal = "Journal of Computer Science",
issn = "1549-3636",
publisher = "Science Publications",
number = "12",
}

TY - JOUR

T1 - Hybrid auditory based interaction framework for driver assistance system

AU - Valbir Singh, Dalbir Singh

PY - 2010

Y1 - 2010

N2 - Problem statement: The rapid development of Driver Assistance Systems (DAS) provides drivers with radically enhanced information and functionality. The nature of current DAS requires complex human-machine interaction, which is distracting and may increase the risk of road accidents. The interaction between the driver and the DAS should aid the driving process without interfering with safety and ease of vehicle operation. The speech-based interaction mechanisms currently employed are not sufficiently robust to deal with the distraction and noise present in the vehicle's interior environment. Approach: Thus, suitable hybrid earcon/auditory-icon design principles for DAS were developed. These interfaces were investigated in driving simulators in order to test their durability and robustness. Several evaluation parameters were applied to ensure that driving-related information from the DAS was delivered to the driver without affecting the overall driving process. Results: This study produced auditory design principles for information mapping (visual into non-speech interaction) and a presentation framework. It outlines a representation architecture that enables concurrent auditory driving-related information to be transmitted from four different sources in the vehicle's interior environment. It also outlines a set of hybrid design principles that integrate auditory icons with earcons to map and present real-time driving-related data from a visual to a non-speech auditory interface. Conclusion/Recommendations: The major contribution of this research project is its holistic approach, which considers the entire DAS (safety, navigation and entertainment subsystems). It proposes a hybrid representation strategy based on the driver's cognitive availability and cognitive workload, a significant finding that will aid future DAS auditory interaction design.

AB - Problem statement: The rapid development of Driver Assistance Systems (DAS) provides drivers with radically enhanced information and functionality. The nature of current DAS requires complex human-machine interaction, which is distracting and may increase the risk of road accidents. The interaction between the driver and the DAS should aid the driving process without interfering with safety and ease of vehicle operation. The speech-based interaction mechanisms currently employed are not sufficiently robust to deal with the distraction and noise present in the vehicle's interior environment. Approach: Thus, suitable hybrid earcon/auditory-icon design principles for DAS were developed. These interfaces were investigated in driving simulators in order to test their durability and robustness. Several evaluation parameters were applied to ensure that driving-related information from the DAS was delivered to the driver without affecting the overall driving process. Results: This study produced auditory design principles for information mapping (visual into non-speech interaction) and a presentation framework. It outlines a representation architecture that enables concurrent auditory driving-related information to be transmitted from four different sources in the vehicle's interior environment. It also outlines a set of hybrid design principles that integrate auditory icons with earcons to map and present real-time driving-related data from a visual to a non-speech auditory interface. Conclusion/Recommendations: The major contribution of this research project is its holistic approach, which considers the entire DAS (safety, navigation and entertainment subsystems). It proposes a hybrid representation strategy based on the driver's cognitive availability and cognitive workload, a significant finding that will aid future DAS auditory interaction design.

KW - Auditory icon

KW - Driver assistance system (DAS)

KW - Earcon

KW - Earcon recognition

KW - Hierarchical architecture

KW - Hybrid

KW - Psychoacoustics basis

KW - Synchronous

KW - Synthetic speech

UR - http://www.scopus.com/inward/record.url?scp=79251477554&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=79251477554&partnerID=8YFLogxK

U2 - 10.3844/jcssp.2010.1493.1498

DO - 10.3844/jcssp.2010.1493.1498

M3 - Article

VL - 6

SP - 1499

EP - 1504

JO - Journal of Computer Science

JF - Journal of Computer Science

SN - 1549-3636

IS - 12

ER -