AndroidXrApp is an experimental Android application developed as part of a study exploring the integration of computer vision techniques within Extended Reality (XR) environments. The application serves as an interactive educational tool, demonstrating the Android Activity Lifecycle through immersive XR presentations.
Features
- Interactive XR Presentation: Visualizes the Android Activity Lifecycle stages with corresponding slides.
- Audio Narration: Each slide is accompanied by a brief audio explanation to enhance understanding.
- Spatial Panel Integration: Utilizes a virtual interface panel within the 3D environment to present content.
- Orbiter Element: Implements a dynamic visual element that orbits the panel, providing visual cues during transitions.
- ARCore APIs: Leverages ARCore for spatial tracking, environmental understanding, and light estimation so that virtual objects interact realistically with the physical environment (see the light-estimation sketch after this list).
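A minimal sketch of how Environmental HDR light estimation could be enabled and read with the ARCore Android SDK. This is an illustrative assumption rather than the app's actual code: the session setup, function names, and how the values reach the renderer are placeholders, and the ARCore flavor used on Android XR devices may expose this differently.

```kotlin
import com.google.ar.core.Config
import com.google.ar.core.Frame
import com.google.ar.core.LightEstimate
import com.google.ar.core.Session

// Assumption: the app owns an ARCore Session created elsewhere (e.g. in onResume).
fun enableLightEstimation(session: Session) {
    val config = Config(session).apply {
        // Environmental HDR provides main light direction/intensity and ambient spherical harmonics.
        lightEstimationMode = Config.LightEstimationMode.ENVIRONMENTAL_HDR
    }
    session.configure(config)
}

// Called once per rendered frame to read the current estimate.
fun readLightEstimate(frame: Frame) {
    val estimate: LightEstimate = frame.lightEstimate
    if (estimate.state == LightEstimate.State.VALID) {
        val direction: FloatArray = estimate.environmentalHdrMainLightDirection
        val intensity: FloatArray = estimate.environmentalHdrMainLightIntensity // per-channel RGB
        // These values can be fed to the renderer so virtual content matches the room lighting.
    }
}
```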
Technologies Used
- Android Studio (Canary channel): Used for development and for running the experimental Android XR Emulator.
- ARCore: Provides the foundational XR capabilities, including motion tracking and environmental understanding.
- Jetpack Compose for XR: Facilitates the creation of dynamic and engaging user interfaces tailored for immersive environments.
- Spatial Panel & Orbiter: Custom components designed to enhance user interaction within the XR space (a Compose sketch follows this list).
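A minimal sketch of how a slide panel with an orbiting control bar might be declared with Jetpack Compose for XR. The library is still experimental, so exact package paths and modifier functions may differ between Canary releases; `LifecycleSlidePanel`, its parameters, and the panel dimensions are illustrative assumptions, not the app's actual implementation.

```kotlin
import androidx.compose.foundation.layout.Row
import androidx.compose.material3.Button
import androidx.compose.material3.Text
import androidx.compose.runtime.Composable
import androidx.compose.ui.unit.dp
import androidx.xr.compose.spatial.Orbiter
import androidx.xr.compose.spatial.OrbiterEdge
import androidx.xr.compose.spatial.Subspace
import androidx.xr.compose.subspace.SpatialPanel
import androidx.xr.compose.subspace.layout.SubspaceModifier
import androidx.xr.compose.subspace.layout.height
import androidx.xr.compose.subspace.layout.movable
import androidx.xr.compose.subspace.layout.resizable
import androidx.xr.compose.subspace.layout.width

// Hypothetical composable: one lifecycle slide on a spatial panel,
// with an orbiting control bar for moving between slides.
@Composable
fun LifecycleSlidePanel(title: String, onNext: () -> Unit, onPrevious: () -> Unit) {
    Subspace {
        SpatialPanel(
            SubspaceModifier
                .width(1024.dp)
                .height(640.dp)
                .movable()
                .resizable()
        ) {
            // Regular 2D slide content rendered inside the panel.
            Text(text = title)

            // The orbiter floats along the panel's bottom edge and follows it during transitions.
            Orbiter(position = OrbiterEdge.Bottom) {
                Row {
                    Button(onClick = onPrevious) { Text("Back") }
                    Button(onClick = onNext) { Text("Next") }
                }
            }
        }
    }
}
```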
Usage
Upon launching the application:
- Navigate through the XR presentation to explore the different stages of the Android Activity Lifecycle.
- Each slide provides visual and auditory information to facilitate learning (see the narration sketch below).
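A minimal sketch of how per-slide audio narration could be wired up with Android's MediaPlayer. The `LifecycleSlide` model, `NarrationPlayer` class, and resource fields are hypothetical names introduced here for illustration; only the MediaPlayer calls are standard Android APIs.

```kotlin
import android.content.Context
import android.media.MediaPlayer

// Hypothetical slide model: each lifecycle stage pairs slide content with a narration clip.
data class LifecycleSlide(val title: String, val imageRes: Int, val narrationRes: Int)

class NarrationPlayer(private val context: Context) {
    private var player: MediaPlayer? = null

    // Stops any clip in progress, then plays the narration for the newly shown slide.
    fun play(slide: LifecycleSlide) {
        stop()
        player = MediaPlayer.create(context, slide.narrationRes)?.apply { start() }
    }

    fun stop() {
        player?.release()
        player = null
    }
}
```

Calling play() whenever the presentation advances keeps the narration in sync with the visible slide; calling stop() from onPause() also fits the app's own theme, since it prevents audio from continuing once the Activity leaves the foreground.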
Publication
The accompanying paper, "Computer Vision in Extended Reality," was presented at the 2025 International Conference on Software, Telecommunications and Computer Networks (SoftCOM) and published by IEEE: https://ieeexplore.ieee.org/abstract/document/11197395