Abstract
Augmented Reality (AR) has recently become an emerging technology in many mobile applications. Mobile AR is defined as a medium that displays information merged with the real-world environment, mapped to the augmented reality surroundings, in a single view. There are four main types of mobile augmented reality interfaces, and one of them is the multimodal interface. A multimodal interface processes two or more combined user input modes (such as speech, pen, touch, manual gesture, gaze, and head and body movements) in a coordinated manner with multimedia system output. Many frameworks have been proposed to guide designers in developing multimodal applications, including in augmented reality environments, but there has been little work reviewing frameworks for adaptive multimodal interfaces in mobile augmented reality. The main goal of this study is to propose a conceptual framework that illustrates an adaptive multimodal interface in mobile augmented reality. We reviewed several frameworks that have been proposed in the fields of multimodal interfaces, adaptive interfaces, and augmented reality. We analyzed the components of these frameworks and assessed which of them can be applied on mobile devices. Our framework can serve as a guide for designers and developers building mobile AR applications with adaptive multimodal interfaces.
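The abstract's notion of processing two or more input modes "in a coordinated manner" is typically realized by fusing events that arrive close together in time. The following is a minimal, hypothetical sketch of such time-window fusion for a speech + touch command (all names and the one-second window are illustrative assumptions, not taken from the paper):

```python
from dataclasses import dataclass

# Illustrative sketch only: pair a spoken command with the touch target
# recognized within a short time window, as a multimodal interface might.

@dataclass
class InputEvent:
    mode: str         # e.g. "speech", "touch", "gesture"
    payload: str      # recognized content (word, screen target, ...)
    timestamp: float  # seconds since interaction start

def fuse(events, window=1.0):
    """Pair each speech event with the nearest touch event within `window` s."""
    speech = [e for e in events if e.mode == "speech"]
    touch = [e for e in events if e.mode == "touch"]
    commands = []
    for s in speech:
        near = [t for t in touch if abs(t.timestamp - s.timestamp) <= window]
        if near:
            t = min(near, key=lambda t: abs(t.timestamp - s.timestamp))
            commands.append((s.payload, t.payload))
    return commands

events = [
    InputEvent("speech", "move", 0.2),
    InputEvent("touch", "chair_3", 0.5),
]
print(fuse(events))  # [('move', 'chair_3')]
```

An adaptive interface, in the sense the abstract describes, could then vary which modes are fused (or the window size) based on context such as device sensors or ambient noise.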
Original language | English |
---|---|
Title of host publication | 2nd International Conference on Applied Science and Technology 2017, ICAST 2017 |
Publisher | American Institute of Physics Inc. |
Volume | 1891 |
ISBN (Electronic) | 9780735415737 |
DOIs | 10.1063/1.5005483 |
Publication status | Published - 3 Oct 2017 |
Event | 2nd International Conference on Applied Science and Technology 2017, ICAST 2017 - Langkawi, Kedah, Malaysia. Duration: 3 Apr 2017 → 5 Apr 2017 |
Other
Other | 2nd International Conference on Applied Science and Technology 2017, ICAST 2017 |
---|---|
Country | Malaysia |
City | Langkawi, Kedah |
Period | 3/4/17 → 5/4/17 |
ASJC Scopus subject areas
- Physics and Astronomy(all)
Cite this
Adaptive multimodal interaction in mobile augmented reality: A conceptual framework. / Abidin, Rimaniza Zainal; Arshad, Haslina; Shukri, Saidatul A.Isyah Ahmad.
2nd International Conference on Applied Science and Technology 2017, ICAST 2017. Vol. 1891. American Institute of Physics Inc., 2017. 020150. Research output: Chapter in Book/Report/Conference proceeding › Conference contribution
TY - GEN
T1 - Adaptive multimodal interaction in mobile augmented reality
T2 - A conceptual framework
AU - Abidin, Rimaniza Zainal
AU - Arshad, Haslina
AU - Shukri, Saidatul A.Isyah Ahmad
PY - 2017/10/3
Y1 - 2017/10/3
N2 - Augmented Reality (AR) has recently become an emerging technology in many mobile applications. Mobile AR is defined as a medium that displays information merged with the real-world environment, mapped to the augmented reality surroundings, in a single view. There are four main types of mobile augmented reality interfaces, and one of them is the multimodal interface. A multimodal interface processes two or more combined user input modes (such as speech, pen, touch, manual gesture, gaze, and head and body movements) in a coordinated manner with multimedia system output. Many frameworks have been proposed to guide designers in developing multimodal applications, including in augmented reality environments, but there has been little work reviewing frameworks for adaptive multimodal interfaces in mobile augmented reality. The main goal of this study is to propose a conceptual framework that illustrates an adaptive multimodal interface in mobile augmented reality. We reviewed several frameworks that have been proposed in the fields of multimodal interfaces, adaptive interfaces, and augmented reality. We analyzed the components of these frameworks and assessed which of them can be applied on mobile devices. Our framework can serve as a guide for designers and developers building mobile AR applications with adaptive multimodal interfaces.
AB - Augmented Reality (AR) has recently become an emerging technology in many mobile applications. Mobile AR is defined as a medium that displays information merged with the real-world environment, mapped to the augmented reality surroundings, in a single view. There are four main types of mobile augmented reality interfaces, and one of them is the multimodal interface. A multimodal interface processes two or more combined user input modes (such as speech, pen, touch, manual gesture, gaze, and head and body movements) in a coordinated manner with multimedia system output. Many frameworks have been proposed to guide designers in developing multimodal applications, including in augmented reality environments, but there has been little work reviewing frameworks for adaptive multimodal interfaces in mobile augmented reality. The main goal of this study is to propose a conceptual framework that illustrates an adaptive multimodal interface in mobile augmented reality. We reviewed several frameworks that have been proposed in the fields of multimodal interfaces, adaptive interfaces, and augmented reality. We analyzed the components of these frameworks and assessed which of them can be applied on mobile devices. Our framework can serve as a guide for designers and developers building mobile AR applications with adaptive multimodal interfaces.
UR - http://www.scopus.com/inward/record.url?scp=85031308186&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85031308186&partnerID=8YFLogxK
U2 - 10.1063/1.5005483
DO - 10.1063/1.5005483
M3 - Conference contribution
AN - SCOPUS:85031308186
VL - 1891
BT - 2nd International Conference on Applied Science and Technology 2017, ICAST 2017
PB - American Institute of Physics Inc.
ER -