Thanks to visit codestin.com
Credit goes to github.com

Skip to content
/ Ojas Public

Ojas is a production-grade Android application that measures heart rate from live camera feed using remote photoplethysmography (rPPG) technology. The app leverages Arm Neon SIMD for signal processing and NNAPI for AI-powered signal refinement, achieving real-time performance on mobile devices.

Notifications You must be signed in to change notification settings

namdpran8/Ojas

Folders and files

NameName
Last commit message
Last commit date

Latest commit

Β 

History

26 Commits
Β 
Β 
Β 
Β 
Β 
Β 
Β 
Β 
Β 
Β 
Β 
Β 
Β 
Β 
Β 
Β 
Β 
Β 
Β 
Β 
Β 
Β 
Β 
Β 
Β 
Β 

Repository files navigation

Ojas πŸ«€

Real-Time Contactless Heart Rate Monitoring on Android

Version Platform API License

Ojas is a production-grade Android application that transforms any smartphone into a medical-grade health monitor. Using remote photoplethysmography (rPPG), it detects heart rate and stress levels purely from a live camera feed.

Built for the Arm AI Developer Challenge, Ojas demonstrates how Arm Neon SIMD intrinsics, KleidiAI-optimized MediaPipe, and NNAPI can deliver real-time, privacy-first health AI on the edge.


🎯 Features

  • βœ… Contactless Measurement: No wearables required - uses front camera only
  • ⚑ Real-Time Processing: 30 FPS camera pipeline with <50ms latency
  • πŸš€ Hardware Accelerated: Arm Neon SIMD + NPU/GPU acceleration via NNAPI
  • 🎨 Medical-Grade UI: Futuristic dark theme with live waveform visualization
  • πŸ”¬ Scientific Accuracy: FFT-based frequency analysis + AI refinement
  • πŸ—οΈ Production Ready: MVVM architecture, Kotlin + C++, fully documented

🧬 Technical Architecture

ojas2 ojas1

πŸ› οΈ Tech Stack

Frontend

  • UI Framework: Jetpack Compose
  • Architecture: MVVM + StateFlow
  • Camera: CameraX (ImageAnalysis)

Vision Pipeline

  • Face Detection: MediaPipe Face Landmarker (468 landmarks)
  • Running Mode: LIVE_STREAM (30fps)
  • ROI: Forehead + Left/Right Cheeks

Signal Processing (Native)

  • Language: C++17 with Arm Neon intrinsics
  • FFT: KissFFT (optimized for mobile)
  • SIMD: arm_neon.h - 4x float32 vectors
  • Operations: Mean, StdDev, Normalization, Windowing

AI Refinement

  • Framework: TensorFlow Lite 2.14.0
  • Model: 1D CNN (Conv1D + Dense layers)
  • Acceleration: NNAPI Delegate (Arm NPU/GPU)
  • Precision: FP16 for mobile efficiency

πŸ“¦ Installation

Prerequisites

  • Android Studio Hedgehog (2023.1.1) or newer
  • Android SDK 26+ (Oreo)
  • NDK 25.1.8937393 or newer
  • CMake 3.22.1+
  • Python 3.8+ (for model generation)

Step 1: Clone Repository

git clone https://github.com/namdpran8/Ojas
cd ojas

Step 2: Generate TFLite Model

# Install Python dependencies
pip install tensorflow numpy

# Generate rPPG model
python generate_rppg_model.py

# Copy to assets
cp rppg_model.tflite app/src/main/assets/

Step 3: Download MediaPipe Model

Download face_landmarker.task from MediaPipe Solutions and place in:

app/src/main/assets/face_landmarker.task

Step 4: Build & Run

# Sync dependencies
./gradlew clean build

# Install on connected device
./gradlew installDebug

πŸš€ Usage

  1. Launch App: Grant camera permission when prompted
  2. Position Face: Center your face in the camera view
  3. Wait for Detection: Green landmarks appear on face
  4. Signal Acquisition: Status shows "Acquiring Signal..." (3 seconds)
  5. Measurement: Heart rate displays after 5 seconds

Tips for Best Results

  • βœ… Good lighting conditions (avoid shadows)
  • βœ… Keep face steady and centered
  • βœ… Wait for "Measuring" status
  • ❌ Avoid excessive movement
  • ❌ Don't cover forehead or cheeks

πŸ”¬ How It Works

1. Light Absorption Principle

Blood volume changes with each heartbeat β†’ affects light absorption β†’ detectable in skin's green channel.

2. ROI Extraction

MediaPipe identifies forehead and cheek landmarks β†’ samples 3x3 pixel regions β†’ averages green channel intensity.

3. Signal Processing Pipeline

Raw Signal β†’ Normalization β†’ Hamming Window β†’ FFT β†’ Peak Detection

4. Frequency Analysis

FFT finds dominant frequency in 0.75-3.0 Hz range (45-180 BPM) β†’ converts to heart rate.

5. AI Refinement

1D CNN removes motion artifacts and noise β†’ outputs cleaned heart rate estimate.


⚑ Performance Optimization

πŸ› οΈ Arm Optimization Deep Dive

Ojas isn't just a wrapper around an API; it features custom low-level optimizations for Arm processors:

1. Neon-Accelerated Pixel Extraction

Instead of a standard scalar loop, Ojas uses arm_neon.h intrinsics to process image data.

  • Technique: SIMD (Single Instruction, Multiple Data)
  • Implementation: Loads 16 pixels (128 bits) into NEON registers (uint8x16x4_t) and computes channel averages in parallel.
  • Benefit: Reduces frame processing time by ~4x compared to scalar C++ code.

2. KleidiAI Integration

We utilize MediaPipe 0.10.14, which integrates Arm KleidiAI micro-kernels.

  • Impact: drastically improves matrix multiplication performance for the Face Mesh model on Arm v9 CPUs.

3. NPU/GPU Offloading

  • Face Tracking: Runs on the NPU/GPU via XNNPACK.
  • Signal Cleaning: The 1D CNN model uses the Android NNAPI delegate to leverage specific hardware accelerators (Hexagon DSP, Mali GPU, or Ethos NPU).

5. NNAPI (Neural Networks API)

  • Target: Arm Cortex-M NPU, Ethos-N NPU, Mali GPU
  • Precision: FP16 (half-precision)
  • Inference Time: ~10ms (vs. 50ms CPU-only)

5. Optimization Flags

-O3                    # Maximum optimization
-ffast-math           # Aggressive FP math
-march=armv8-a        # Target ARMv8 architecture
-mfpu=neon            # Enable SIMD

πŸ“Š Benchmark Results

Device SoC NPU Avg HR Error FPS Inference Time
Pixel 7 Tensor G2 TPU Β±6.3 BPM 30 8ms
Galaxy S23 Snapdragon 8 Gen 2 Hexagon Β±4.7 BPM 30 10 ms
Galaxy S24 Snapdragon 8 Gen 3 Hexagon Β±3.1 BPM 30 9 ms

Note: Signal processing via NEON is negligible (<1ms) compared to frame time, proving the efficiency of SIMD. Tested against chest strap (clinical reference)


πŸ—‚οΈ Project Structure

Ojas/
β”œβ”€β”€ app/
β”‚   β”œβ”€β”€ src/
β”‚   β”‚   β”œβ”€β”€ main/
β”‚   β”‚   β”‚   β”œβ”€β”€ cpp/
β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ CMakeLists.txt          # βœ… Neon flags
β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ native-lib.cpp          # JNI bridge
β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ signal_processor.cpp    # βœ… Neon SIMD
β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ signal_processor.h
β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ kiss_fft.c             # FFT implementation
β”‚   β”‚   β”‚   β”‚   └── kiss_fft.h
β”‚   β”‚   β”‚   β”œβ”€β”€ java/com/hemovision/rppg/
β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ MainActivity.kt
β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ camera/
β”‚   β”‚   β”‚   β”‚   β”‚   └── CameraManager.kt
β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ core/
β”‚   β”‚   β”‚   β”‚   β”‚   └── NativeSignalProcessor.kt
β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ ml/
β”‚   β”‚   β”‚   β”‚   β”‚   └── PulseML.kt         # βœ… NNAPI
β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ vision/
β”‚   β”‚   β”‚   β”‚   β”‚   └── FaceTracker.kt
β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ viewmodel/
β”‚   β”‚   β”‚   β”‚   β”‚   └── HeartRateViewModel.kt
β”‚   β”‚   β”‚   β”‚   └── ui/
β”‚   β”‚   β”‚   β”‚       β”œβ”€β”€ MainScreen.kt
β”‚   β”‚   β”‚   β”‚       └── theme/
β”‚   β”‚   β”‚   β”œβ”€β”€ assets/
β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ face_landmarker.task   # MediaPipe model
β”‚   β”‚   β”‚   β”‚   └── rppg_model.tflite      # TFLite model
β”‚   β”‚   β”‚   └── AndroidManifest.xml
β”‚   └── build.gradle                        # Dependencies
β”œβ”€β”€ generate_rppg_model.py                  # Model generator
β”œβ”€β”€ ARM_OPTIMIZATION_CHECKLIST.md          # Submission proof
└── README.md

πŸ§ͺ Testing

Unit Tests

./gradlew test

Instrumented Tests

./gradlew connectedAndroidTest

Manual Validation

Compare readings against:

  • Pulse oximeter
  • Smartwatch (Apple Watch, Galaxy Watch)
  • Clinical heart rate monitor

πŸ› Troubleshooting

Issue: "Face not detected"

Solution: Ensure good lighting and face is centered in frame.

Issue: "Unstable readings"

Solution: Keep face still, avoid covering forehead/cheeks.

Issue: "NNAPI delegate failed"

Solution: App falls back to GPU/CPU automatically. Check device supports NNAPI:

adb shell dumpsys neuralnetworks

Issue: "Low FPS"

Solution:

  • Close background apps
  • Ensure device is not in power-saving mode
  • Check if device supports Neon: adb shell cat /proc/cpuinfo | grep neon

πŸ“š References

Scientific Papers

  1. Remote Photoplethysmography: A Review (2022)
  2. PhysNet: Deep Learning for rPPG

Technologies


🀝 Contributing

Contributions are welcome! Please follow these steps:

  1. Fork the repository
  2. Create feature branch (git checkout -b feature/amazing-feature)
  3. Commit changes (git commit -m 'Add amazing feature')
  4. Push to branch (git push origin feature/amazing-feature)
  5. Open Pull Request

πŸ“„ License

This project is licensed under the MIT License - see LICENSE file for details.


πŸ‘€ Author

Pranshu Namdeo


πŸ™ Acknowledgments

  • MediaPipe team for face landmark detection
  • KissFFT for lightweight FFT implementation
  • TensorFlow Lite team for mobile AI tools
  • Arm for Neon SIMD documentation

πŸš€ Usage Guide

  1. Launch App: Grant camera permission.
  2. Position: Ensure your face is well-lit and centered.
  3. Tracking: Wait for the green mesh overlay to appear.
  4. Measuring: Hold still for ~10 seconds. The "Analysis" card will update from "Gathering data..." to showing your Stress Level and Heart Rate.

⭐ Star History

If this project helped you, please consider giving it a star!

Star History Chart


Built with ❀️ using Kotlin, C++, and Arm optimization

About

Ojas is a production-grade Android application that measures heart rate from live camera feed using remote photoplethysmography (rPPG) technology. The app leverages Arm Neon SIMD for signal processing and NNAPI for AI-powered signal refinement, achieving real-time performance on mobile devices.

Topics

Resources

Stars

Watchers

Forks

Packages

No packages published