An Extended Reality (XR)-Based Tactical Support System Integrating Edge-AI Threat Detection and a Distributed Sensor Network for High-Risk Operations | IJET – Volume 12 Issue 2 | IJET-V12I2P113

International Journal of Engineering and Techniques (IJET)

Open Access • Peer Reviewed • High Citation & Impact Factor • ISSN: 2395-1303

Volume 12, Issue 2  |  Published: April 2026

Authors: Dr. P. Arivazhagi, B. Farhana Parveen, T.A. Haritha, R. Logeshwari, S. Rakshana Shree

DOI: https://doi.org/{{doi}}

Abstract

This paper presents the design and implementation of an Extended Reality (XR)-based tactical support system intended for high-risk defense operations. High-risk operational environments, such as battlefields, counter-terrorism missions, and disaster response zones, require instantaneous decision-making and seamless situational awareness. Most defense-oriented immersive systems to date have relied primarily on Virtual Reality (VR) and Augmented Reality (AR) interfaces to visualize preloaded mission data, lacking the capability to perform real-time threat identification. Building on these earlier technologies, this paper proposes a next-generation XR-based tactical platform that performs intelligent on-ground analysis using Edge Artificial Intelligence (Edge-AI) and a Distributed Sensor Network (DSN). The XR headset is equipped with a compact camera module that captures real-time visual feeds, which are processed by an AI-driven facial recognition model to instantly match high-risk suspects against an encrypted database. Upon detection, the operator receives an immediate XR alert, while the base station receives the captured imagery over a two-way wireless transceiver link. By offloading processing to the edge, the system dramatically reduces latency and cloud dependency, overcoming the limitations of traditional centralized surveillance. The integration of supporting hardware, including ESP32 microcontrollers, PIR sensors, vibration modules, and OLED displays, creates a resilient, low-power, and highly scalable cooperative defense network.
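The matching step the abstract describes can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: it assumes each watchlist entry is stored as a fixed-length face embedding and that a live frame's embedding is compared by cosine similarity against the (decrypted) entries. The function names, embedding size, and similarity threshold are all assumptions made for illustration.

```python
import numpy as np

# Illustrative constants; the paper does not specify the model or threshold.
EMBEDDING_DIM = 128      # assumed embedding size
MATCH_THRESHOLD = 0.85   # assumed similarity cutoff for raising an XR alert


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def match_suspect(frame_embedding: np.ndarray, watchlist: dict):
    """Return (suspect_id, score) of the best match above the threshold.

    `watchlist` maps suspect IDs to reference embeddings. Returns None when
    no entry clears MATCH_THRESHOLD, i.e. no XR alert is raised.
    """
    best_id, best_score = None, MATCH_THRESHOLD
    for suspect_id, ref_embedding in watchlist.items():
        score = cosine_similarity(frame_embedding, ref_embedding)
        if score > best_score:
            best_id, best_score = suspect_id, score
    return (best_id, best_score) if best_id is not None else None
```

In the architecture described above, a positive return value would trigger the operator's XR alert and queue the captured frame for transmission to the base station over the transceiver link.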

Keywords

Extended Reality (XR), Edge-AI, Distributed Sensor Network, Threat Detection, Situational Awareness, Internet of Things (IoT), Tactical Support.

Conclusion

The proposed Extended Reality (XR)-based tactical support system effectively integrates real-time image processing, Edge-AI, distributed sensor monitoring, and wireless communication to deliver a smart, highly responsive solution. By offloading facial recognition and threat analysis to the edge, the system avoids the latency and reliability pitfalls of traditional cloud-based architectures. The seamless fusion of OLED-driven XR visualization with multi-sensor environmental awareness (motion, vibration) ensures that field operators are continuously protected and informed. Its low-power design, decentralized resilience, and cost-effectiveness make it a cornerstone architecture for modern IoT-based defense, safety, and immersive tactical applications.
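The multi-sensor fusion mentioned above can be sketched at the base-station side. This is a hypothetical sketch, not the authors' design: it assumes each DSN node reports timestamped PIR-motion and vibration events over the wireless link, and that a node is flagged only when both sensor types fire within a short time window (the two-sensor AND rule and the window length are illustrative assumptions).

```python
from dataclasses import dataclass


@dataclass
class SensorEvent:
    """One reading reported by a DSN node (assumed message shape)."""
    node_id: str
    kind: str        # "pir" (motion) or "vibration"
    timestamp: float # seconds


def correlate_events(events, window_s=2.0):
    """Return the set of node IDs where motion and vibration coincide.

    A node is flagged when a PIR event and a vibration event from the same
    node occur within `window_s` seconds of each other, reducing false
    alarms from either sensor firing alone.
    """
    alerts = set()
    last_seen = {}  # node_id -> {kind: latest timestamp}
    for ev in sorted(events, key=lambda e: e.timestamp):
        prev = last_seen.setdefault(ev.node_id, {})
        other = "vibration" if ev.kind == "pir" else "pir"
        if other in prev and ev.timestamp - prev[other] <= window_s:
            alerts.add(ev.node_id)
        prev[ev.kind] = ev.timestamp
    return alerts
```

In a deployment like the one described, the events would arrive from ESP32 sensor nodes and a flagged node would push an alert to the operator's XR display.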


Cite this article

APA
{{author}} (April 2026). {{title}}. International Journal of Engineering and Techniques (IJET), 12(2). https://doi.org/{{doi}}

IEEE
{{author}}, “{{title}},” International Journal of Engineering and Techniques (IJET), vol. 12, no. 2, April 2026, doi: {{doi}}.