AffectAI: An Emotion-Aware conversational AI for Human-Computer Interaction | IJET – Volume 12 Issue 2 | IJET-V12I2P49


International Journal of Engineering and Techniques (IJET)

Open Access • Peer Reviewed • High Citation & Impact Factor • ISSN: 2395-1303

Volume 12, Issue 2  |  Published: April 2026

Authors: Ms. Rajvee Sakariya, Mr. Om Sakariya, Ms. Rauki Yadav

DOI: https://doi.org/{{doi}}

Abstract

Conversational agents are increasingly common in today's digital environment, yet most cannot interpret or respond to human emotion, which makes their interactions feel mechanical. This study introduces AffectAI, a conversational system designed to close this emotional-intelligence gap. The full AffectAI design envisions a multimodal pipeline that combines text sentiment analysis, speech emotion recognition, and facial expression detection to identify a user's affective state. Its conversational engine uses a reinforcement-learning dialogue manager that dynamically adjusts tone, empathy level, and response content according to the detected emotion and the dialogue context. This paper describes the design, development, and testing of the system's core visual component: a real-time facial emotion recognition module. The module is a Convolutional Neural Network (CNN) trained with TensorFlow and optimized for on-device execution on Android via TensorFlow Lite. Experimental evaluation shows that the module achieves strong accuracy in emotion detection and improves user interaction, offering a solid foundation for building more empathetic, context-aware, and human-like conversational agents in mental health, education, and customer service settings.
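The deployment path described in the abstract, training a CNN in TensorFlow and converting it for on-device Android inference with TensorFlow Lite, can be sketched as follows. This is a minimal illustration, not the paper's actual model: the architecture, the 48x48 grayscale input, and the seven emotion classes are assumptions based on common facial-expression-recognition setups.

```python
import numpy as np
import tensorflow as tf

# Hypothetical stand-in for the paper's emotion CNN: a small Keras model
# over 48x48 grayscale face crops with 7 emotion classes (a common FER setup).
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(48, 48, 1)),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(7, activation="softmax"),
])

# Convert to TensorFlow Lite with default (dynamic-range) quantization,
# the usual first step before shipping a model inside an Android app.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()

with open("affectai_fer.tflite", "wb") as f:
    f.write(tflite_model)

# Sanity-check the converted model with the TFLite interpreter,
# the same runtime the Android app would use.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]
interpreter.set_tensor(inp["index"], np.zeros(inp["shape"], dtype=np.float32))
interpreter.invoke()
probs = interpreter.get_tensor(out["index"])  # one probability per emotion class
```

On Android the resulting `.tflite` file would typically be bundled as an asset and run through the TensorFlow Lite Interpreter API from Kotlin or Java.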

Keywords

{{keywords}}

Conclusion

The aim of this paper was to design and test a practical, real-time facial emotion-recognition system that can run inside a mobile application. The development of AffectAI shows that a well-balanced convolutional neural network can identify a range of emotional expressions while maintaining consistent confidence over time. The approach performs well on clearly expressed emotions, while subtler expressions remain an area for further refinement. The system also goes beyond point prediction: by combining classification, temporal tracking, and visual analytics, it follows emotional trends over time. The contribution of this work is to narrow the gap between theoretical emotion-recognition models and real-world implementation; unlike many existing systems that focus on accuracy metrics alone, this framework prioritizes usability, interpretability, and ongoing monitoring. Future studies could improve performance through multimodal integration, larger datasets, and further model optimization. As intelligent systems continue to evolve, understanding and recognizing human emotion will remain crucial to creating digital systems that are more human-centric and sensitive to human feelings.
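The temporal tracking the conclusion mentions, turning noisy per-frame predictions into stable emotion trends, can be sketched with a simple sliding-window average over the classifier's probability vectors. This is an illustrative technique, not the paper's documented method; the emotion label set and window size are assumptions.

```python
from collections import deque

# Assumed 7-class label set, in the model's output order (hypothetical).
EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

def smooth_predictions(prob_stream, window=5):
    """Average the last `window` per-frame probability vectors, then take
    the argmax, so a one-frame misclassification does not flip the label.
    `prob_stream` yields one probability list (len == 7) per video frame."""
    buf = deque(maxlen=window)
    for probs in prob_stream:
        buf.append(probs)
        avg = [sum(col) / len(buf) for col in zip(*buf)]
        yield EMOTIONS[max(range(len(avg)), key=avg.__getitem__)]

# A stream that is mostly "happy" with a single noisy "sad" frame:
happy = [0.0, 0.0, 0.0, 0.9, 0.0, 0.0, 0.1]
blip  = [0.0, 0.0, 0.0, 0.1, 0.8, 0.0, 0.1]
frames = [happy] * 3 + [blip] + [happy] * 3
labels = list(smooth_predictions(frames))
```

Here the raw per-frame argmax would briefly report "sad" at the blip, while the smoothed stream stays "happy" throughout, which is the kind of stability a per-session emotion-trend view needs.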

References

[1] V. S. Manjula and L. D. S. S. Baboo, “Face detection, identification and tracking by PRDIT algorithm using image database for crime investigation,” Int. J. Comput. Appl., vol. 38, no. 10, pp. 40–46, Jan. 2012.
[2] W. Wang, Y. Jie, J. Xiao, L. Sheng, and D. Zhou, “Face recognition based on deep learning,” in Proc. Int. Conf. Human Centered Comput., 2014.
[3] A. Mollahosseini, B. Hasani, and M. H. Mahoor, “AffectNet: A database for facial expression, valence, and arousal computing in the wild,” IEEE Trans. Affective Comput., vol. 10, no. 1, 2017.
[4] Y. Li, J. Zeng, S. Shan, and X. Chen, “Occlusion aware facial expression recognition using CNN with attention mechanism,” IEEE Trans. Image Process., vol. 28, no. 5, 2018.
[5] E. Barsoum, C. Zhang, C. C. Ferrer, and Z. Zhang, “Training deep networks for facial expression recognition with crowd-sourced label distribution,” in Proc. 18th ACM Int. Conf. Multimodal Interaction, Oct. 2016.
[6] C. Tian, J. Xie, L. Li, W. Zuo, Y. Zhang, and D. Zhang, “A perception CNN for facial expression recognition,” IEEE Trans. Image Process., vol. 34, 2025.
[7] F. Xue, Q. Wang, and G. Guo, “TransFER: Learning relation-aware facial expression representations with transformers,” in Proc. IEEE/CVF Int. Conf. Comput. Vis., 2021.
[8] S. Poria, D. Hazarika, N. Majumder, G. Naik, E. Cambria, and R. Mihalcea, “MELD: A multimodal multi-party dataset for emotion recognition in conversations,” in Proc. 57th Annu. Meeting Assoc. Comput. Linguistics, Jul. 2019.
[9] L. F. Barrett et al., “Emotional expressions reconsidered: Challenges to inferring emotion from human facial movements,” Psychol. Sci. Public Interest, vol. 20, no. 1, 2019.
[10] L. Li et al., “A review of face recognition technology,” IEEE Access, vol. 8, pp. 139110–139120, 2020.
[11] C. L. Chowdhary, Ed., Multidisciplinary Applications of Deep Learning-Based Artificial Emotional Intelligence. IGI Global, 2022.
[12] A. Følstad et al., “Future directions for chatbot research: An interdisciplinary research agenda,” Computing, vol. 103, no. 12, 2021.
[13] K. Zhang et al., “Joint face detection and alignment using multitask cascaded convolutional networks,” IEEE Signal Process. Lett., vol. 23, no. 10, 2016.
[14] S. Uniyal and R. Agarwal, “Analyzing facial emotion patterns in AffectNet with deep neural networks,” in Proc. 1st Int. Conf. Adv. Comput., Commun. Netw. (ICAC2N), Dec. 2024.
[15] H. Jiang and E. Learned-Miller, “Face detection with the Faster R-CNN,” in Proc. 12th IEEE Int. Conf. Autom. Face Gesture Recognit. (FG), May 2017.
[16] A. Jaiswal, A. K. Raju, and S. Deb, “Facial emotion detection using deep learning,” in Proc. Int. Conf. Emerg. Technol. (INCET), Jun. 2020.
[17] A. Verma et al., “Face recognition: A review and analysis,” in Computational Intelligence in Data Mining: Proc. ICCIDM 2021, 2022.

Cite this article

APA
Sakariya, R., Sakariya, O., & Yadav, R. (2026, April). AffectAI: An emotion-aware conversational AI for human-computer interaction. International Journal of Engineering and Techniques (IJET), 12(2). https://doi.org/{{doi}}

IEEE
R. Sakariya, O. Sakariya, and R. Yadav, “AffectAI: An Emotion-Aware Conversational AI for Human-Computer Interaction,” International Journal of Engineering and Techniques (IJET), vol. 12, no. 2, April 2026, doi: {{doi}}.