Multimodal interaction design of HMI for electric vehicles in China: A study to enhance user experience

Authors

  • Tian Zenghui Department of Multimedia Creative, Faculty of Art, Sustainability & Creative Industry, Universiti Pendidikan Sultan Idris, 35900 Tanjong Malim, Perak, Malaysia
  • Nur Safinas Albakry Department of Multimedia Creative, Faculty of Art, Sustainability & Creative Industry, Universiti Pendidikan Sultan Idris, 35900 Tanjong Malim, Perak, Malaysia

DOI:

https://doi.org/10.37134/kupasseni.vol13.1.9.2025

Keywords:

Electric Vehicles, HMI Design, Multimodal Interaction Design, User Satisfaction, User Experience

Abstract

Against the backdrop of global climate change and environmental pollution, the rapid development of electric vehicles (EVs) is regarded as a key route to sustainable development in the transport sector. EVs not only reduce dependence on fossil fuels and lower greenhouse gas emissions, but also offer users a new driving experience through their advanced technological features. In this transformation, human-machine interaction (HMI) design plays a crucial role, as it directly affects user acceptance of and satisfaction with EVs. This study analyses the application of multimodal interaction technology in HMI design for EVs, which enriches the interaction experience by integrating visual, auditory, and tactile information. Particular attention is paid to how multimodal interaction design can improve user convenience and driving safety, and how personalised interaction can meet the needs of different users. Using a quantitative approach, the experimental data were analysed in SPSS to assess the effectiveness of multimodal interaction design in practical applications. The results reveal the significant advantages of multimodal interaction design for user experience: compared with traditional interaction methods, it shortens task completion times, reduces operation error rates, and markedly improves user satisfaction. These findings suggest that multimodal interaction design provides users with a more intuitive, natural, and enjoyable interaction experience, which is crucial for promoting the wider adoption of electric vehicles. The findings also offer insights for the future development of EV HMI design: as technology advances and user needs become more diverse, HMI design will need to place greater emphasis on user-centred design principles and make full use of multimodal interaction technologies to create a smarter and more personalised driving experience.
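The abstract reports group comparisons (task completion time, operation error rate, and user satisfaction) between traditional and multimodal interaction conditions, analysed in SPSS. The sketch below is not the authors' SPSS workflow; it is a minimal, hypothetical Python illustration of the same kind of comparison, using Welch's independent-samples t-test on made-up data for the three measures.

```python
# Illustrative sketch only: comparing a traditional-HMI group with a
# multimodal-HMI group on task time, error rate, and satisfaction using
# independent-samples (Welch's) t-tests. All values are simulated and
# hypothetical; they are not the study's data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 30  # hypothetical number of participants per condition

# Hypothetical measurements for the two interaction conditions
traditional = {
    "task_time_s":  rng.normal(18.0, 3.0, n),   # seconds per task
    "error_rate":   rng.normal(0.12, 0.04, n),  # proportion of erroneous operations
    "satisfaction": rng.normal(3.4, 0.6, n),    # mean score on a 5-point scale
}
multimodal = {
    "task_time_s":  rng.normal(14.5, 2.5, n),
    "error_rate":   rng.normal(0.07, 0.03, n),
    "satisfaction": rng.normal(4.2, 0.5, n),
}

for measure in traditional:
    # Welch's t-test (does not assume equal variances)
    t, p = stats.ttest_ind(traditional[measure], multimodal[measure],
                           equal_var=False)
    print(f"{measure:>13}: t = {t:6.2f}, p = {p:.4f}")
```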

Author Biography

Tian Zenghui, Department of Multimedia Creative, Faculty of Art, Sustainability & Creative Industry, Universiti Pendidikan Sultan Idris, 35900 Tanjong Malim, Perak, Malaysia

School of Art and Creativity, Guangzhou College of Applied Science and Technology, Guangzhou, China

References

Hu, C., Gu, S., Yang, M., Han, G., Lai, C. S., Gao, M., Yang, Z., & Ma, G. (2024). MDEmoNet: A multimodal driver emotion recognition network for smart cockpit. 2024 IEEE International Conference on Consumer Electronics (ICCE), 1–6. https://doi.org/10.1109/ICCE59016.2024.10444365

Jiang, Z., & Xu, C. (2023). Policy incentives, government subsidies, and technological innovation in new energy vehicle enterprises: Evidence from China. Energy Policy, 177, 113527. https://doi.org/10.1016/j.enpol.2023.113527

Liu, W., Zhu, Y., Huang, R., Ohashi, T., Auernhammer, J., Zhang, X., Shi, C., & Wang, L. (2023). Designing interactive glazing through an engineering psychology approach: Six augmented reality scenarios that envision future car human-machine interface. Virtual Reality and Intelligent Hardware, 5(2), 157–170. https://doi.org/10.1016/j.vrih.2022.07.004

Mohammadi, M., Thornburg, J., & Mohammadi, J. (2023). Towards an energy future with ubiquitous electric vehicles: Barriers and opportunities. Energies, 16(17), 6379. https://doi.org/10.3390/en16176379

Mou, L., Zhao, Y., Zhou, C., Nakisa, B., Rastgoo, M. N., Ma, L., Huang, T., Yin, B., Jain, R., & Gao, W. (2023). Driver emotion recognition with a hybrid attentional multimodal fusion framework. IEEE Transactions on Affective Computing, 14(4), 2970–2981. https://doi.org/10.1109/TAFFC.2023.3250460

Pramuanjaroenkij, A., & Kakaç, S. (2023). The fuel cell electric vehicles: The highlight review. International Journal of Hydrogen Energy, 48(25), 9401–9425. https://doi.org/10.1016/j.ijhydene.2022.11.103

Qin, S., & Xiong, Y. (2024). Differences in the innovation effectiveness of China’s new energy vehicle industry policies: A comparison of subsidized and non-subsidized policies. Energy, 304, 132151. https://doi.org/10.1016/j.energy.2024.132151

Wang, Y., Wijenayake, S., Hoggenmüller, M., Hespanhol, L., Worrall, S., & Tomitsch, M. (2023). My eyes speak: Improving perceived sociability of autonomous vehicles in shared spaces through emotional robotic eyes. Proceedings of the ACM on Human-Computer Interaction, 7(MHCI), 1–30. https://doi.org/10.1145/3604261

Wei, F., Walls, W. D., Zheng, X., & Li, G. (2023). Evaluating environmental benefits from driving electric vehicles: The case of Shanghai, China. Transportation Research Part D: Transport and Environment, 119, 103749. https://doi.org/10.1016/j.trd.2023.103749

Xin, F., Zhang, G., & Huang, Y. (2024). Research on intelligent vehicle cockpit design based on multimodal human-computer interaction technology. Proceedings of the 2024 International Conference on Digital Society and Artificial Intelligence, 130–134. https://doi.org/10.1145/3677892.3677914

Yuan, X., Liu, X., & Zuo, J. (2015). The development of new energy vehicles for a sustainable future: A review. Renewable and Sustainable Energy Reviews, 42, 298–305. https://doi.org/10.1016/j.rser.2014.10.016

Zhou, X., Williams, A. S., & Ortega, F. R. (2022). Eliciting multimodal gesture+speech interactions in a multi-object augmented reality environment. Proceedings of the 28th ACM Symposium on Virtual Reality Software and Technology, 1–10. https://doi.org/10.1145/3562939.3565637

Published

2025-01-14

How to Cite

Zenghui, T., & Albakry, N. S. (2025). Multimodal interaction design of HMI for electric vehicles in China: A study to enhance user experience. KUPAS SENI: Jurnal Seni Dan Pendidikan Seni, 13(1), 83–96. https://doi.org/10.37134/kupasseni.vol13.1.9.2025