Adaptive multimodal interaction in mobile augmented reality

A conceptual framework

Rimaniza Zainal Abidin, Haslina Arshad, Saidatul A.Isyah Ahmad Shukri

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)

Abstract

Augmented Reality (AR) is an emerging technology in many mobile applications. Mobile AR can be defined as a medium for displaying information merged with the real-world environment, mapping augmented content onto the user's surroundings in a single view. There are four main types of mobile augmented reality interfaces, one of which is the multimodal interface. A multimodal interface processes two or more combined user input modes (such as speech, pen, touch, manual gesture, gaze, and head and body movements) in a coordinated manner with multimedia system output. Many frameworks have been proposed to guide designers in developing multimodal applications, including in augmented reality environments, but there has been little work reviewing frameworks for adaptive multimodal interfaces in mobile augmented reality. The main goal of this study is to propose a conceptual framework for an adaptive multimodal interface in mobile augmented reality. We reviewed several frameworks that have been proposed in the fields of multimodal interfaces, adaptive interfaces, and augmented reality, analyzed their components, and assessed which of them can be applied on mobile devices. Our framework can serve as a guide for designers and developers building mobile AR applications with adaptive multimodal interfaces.
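The abstract's notion of processing two or more input modes "in a coordinated manner" is often realized by pairing events from different modalities that arrive close together in time (so-called late fusion, e.g. saying "delete" while touching an object). The paper does not publish its implementation; the sketch below is a hypothetical minimal illustration of that idea, with all class and field names invented for this example.

```python
from dataclasses import dataclass

@dataclass
class InputEvent:
    modality: str      # e.g. "speech", "touch", "gesture"
    payload: str       # recognized command or touched target
    timestamp: float   # seconds since app start

class LateFusionEngine:
    """Pairs events from different modalities that arrive within a
    short time window -- a common late-fusion strategy for
    coordinating multimodal input."""

    def __init__(self, window: float = 1.0):
        self.window = window
        self.pending: list[InputEvent] = []

    def feed(self, event: InputEvent):
        # Discard stale events that fall outside the fusion window.
        self.pending = [e for e in self.pending
                        if event.timestamp - e.timestamp <= self.window]
        # Fuse with the first pending event from a *different* modality.
        for e in self.pending:
            if e.modality != event.modality:
                self.pending.remove(e)
                return (e, event)   # fused multimodal command
        # No partner yet: hold the event and wait.
        self.pending.append(event)
        return None
```

For example, a speech event carrying "delete" followed 0.4 s later by a touch on an object would be returned as one fused pair, while the same two events separated by several seconds would be treated as independent unimodal inputs.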

Original language: English
Title of host publication: 2nd International Conference on Applied Science and Technology 2017, ICAST 2017
Publisher: American Institute of Physics Inc.
Volume: 1891
ISBN (Electronic): 9780735415737
DOI: 10.1063/1.5005483
Publication status: Published - 3 Oct 2017
Event: 2nd International Conference on Applied Science and Technology 2017, ICAST 2017 - Langkawi, Kedah, Malaysia
Duration: 3 Apr 2017 - 5 Apr 2017



ASJC Scopus subject areas

  • Physics and Astronomy (all)

Cite this

Abidin, R. Z., Arshad, H., & Shukri, S. A. I. A. (2017). Adaptive multimodal interaction in mobile augmented reality: A conceptual framework. In 2nd International Conference on Applied Science and Technology 2017, ICAST 2017 (Vol. 1891, Article 020150). American Institute of Physics Inc. https://doi.org/10.1063/1.5005483
