MamaGuardian

An AI-powered maternal health Copilot and early warning system built for the Microsoft Fabric Global Hackathon.


Overview

MamaGuardian is a real-time maternal health triage and support system designed for low-resource settings. It combines Azure OpenAI, Microsoft Fabric, and Azure Communication Services to provide:

  1. AI-Powered Triage: Automatic classification of maternal health messages into Emergency/Warning/Routine categories
  2. Conversational Copilot: RAG-enabled Q&A assistant for maternal health education
  3. Real-Time Alerting: Instant notifications to healthcare providers for high-risk cases
  4. Unified Analytics: Streaming data pipeline to Microsoft Fabric for real-time dashboards and historical analysis

Problem Statement

Maternal mortality remains a critical challenge in low-resource settings, with many deaths preventable through early detection of warning signs. MamaGuardian bridges the gap between expectant mothers and healthcare systems using accessible SMS/WhatsApp channels and intelligent AI triage.

Solution Highlights

  • < 5 second end-to-end latency from message receipt to alert
  • Multi-lingual support (English, Swahili) via Azure Translator
  • RAG-powered Q&A using Azure Cognitive Search vector store
  • HIPAA-compliant PII protection with hashing and content redaction
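
The PII protection above rests on salted hashing of phone numbers. A minimal sketch of how that might look (the function name and normalization step are illustrative, not the project's actual code):

```python
import hashlib
import hmac

def hash_phone(phone: str, salt: str) -> str:
    """Return a stable, non-reversible identifier for a phone number.

    Normalizes whitespace first so "+254 712 345 678" and "+254712345678"
    hash to the same value, then applies a keyed HMAC-SHA256 with the salt.
    """
    normalized = phone.strip().replace(" ", "")
    return hmac.new(salt.encode(), normalized.encode(), hashlib.sha256).hexdigest()
```

Using HMAC rather than plain `sha256(salt + phone)` keeps the salt in the key position, so identical numbers under different deployments produce unrelated hashes.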

System Architecture

graph TB
    User[👤 Expectant Mother<br/>SMS/WhatsApp Message]
    
    Twilio[📱 Twilio/Azure<br/>Communication Services]
    
    subgraph Fabric["Microsoft Fabric Platform"]
        Eventstream[⚡ Eventstream<br/>Real-Time Ingestion]
        
        Lakehouse[(🛖 OneLake Lakehouse<br/>Delta Tables<br/>Historical Storage)]
        
        KQL[(📊 KQL Database<br/>Real-Time Analytics<br/>Time-Series Analysis)]
    end
    
    subgraph AI["AI Processing Layer"]
        OpenAI[🤖 Azure OpenAI<br/>GPT-4 NLP Triage<br/>Risk Classification]
        CogSearch[🔍 Cognitive Search<br/>Vector Store<br/>RAG Knowledge Base]
        Copilot[💬 AI Copilot Assistant<br/>FAQ Chatbot<br/>Q&A Support]
    end
    
    subgraph Alerts["Alert & Response System"]
        DataActivator[🔔 Data Activator<br/>Anomaly Detection<br/>Trigger Rules]
        PowerAutomate[⚙️ Power Automate<br/>Notification Flow]
    end
    
    subgraph Output["Output Channels"]
        Clinician[👨‍⚕️ Clinician Alerts<br/>SMS/Teams/Email]
        Dashboard[📈 Power BI Dashboard<br/>Real-Time Triage View<br/>Risk Monitoring]
        UserResponse[📲 User Response<br/>Auto-Reply & Guidance]
    end
    
    User -->|Sends message| Twilio
    Twilio -->|Webhook/API| Eventstream
    
    Eventstream -->|Persist raw data| Lakehouse
    Eventstream -->|Stream events| KQL
    
    Eventstream -->|Message text| OpenAI
    OpenAI -->|Risk category| Lakehouse
    OpenAI -->|Classification| KQL
    
    KQL -->|Monitor conditions| DataActivator
    DataActivator -->|Trigger on Emergency| PowerAutomate
    PowerAutomate -->|Send notification| Clinician
    
    User -->|Ask question| Copilot
    Copilot -->|Retrieve context| CogSearch
    Copilot -->|Generate answer| UserResponse
    OpenAI -->|Auto-response| UserResponse
    UserResponse -->|SMS/WhatsApp| User
    
    Lakehouse -->|Historical data| Dashboard
    KQL -->|Live streaming data| Dashboard
    Dashboard -->|Monitor & analyze| Clinician
    
    classDef userClass fill:#e1f5ff,stroke:#01579b,stroke-width:2px
    classDef fabricClass fill:#fff3e0,stroke:#e65100,stroke-width:2px
    classDef aiClass fill:#f3e5f5,stroke:#4a148c,stroke-width:2px
    classDef alertClass fill:#ffebee,stroke:#b71c1c,stroke-width:2px
    classDef outputClass fill:#e8f5e9,stroke:#1b5e20,stroke-width:2px
    
    class User,Twilio userClass
    class Eventstream,Lakehouse,KQL fabricClass
    class OpenAI,CogSearch,Copilot aiClass
    class DataActivator,PowerAutomate alertClass
    class Clinician,Dashboard,UserResponse outputClass

Architecture Principles

  • Event-Driven: Asynchronous processing via Fabric Eventstreams
  • Stateless Design: Horizontal scalability with container replicas
  • Pluggable Transports: Support for multiple messaging providers (ACS/Twilio)
  • SOLID Principles: Separation of concerns, dependency injection, interface-based design

Key Features

✅ Core Capabilities

  • Real-Time Message Ingestion: SMS/WhatsApp → Fabric Eventstream (< 1s latency)
  • AI-Powered Triage: Azure OpenAI classifies messages into Emergency/Warning/Routine
  • Emergency Alerting: Automatic notifications via Power Automate & Microsoft Teams
  • Conversational Copilot: RAG-enabled Q&A with Azure Cognitive Search vector store
  • Unified Data Platform: Single source of truth in OneLake Lakehouse
  • Real-Time Dashboards: Power BI with live KQL data streaming

🔒 Security & Compliance

  • Phone number hashing with salt-based PII protection
  • Webhook signature verification (Twilio HMAC SHA1, ACS HMAC SHA256)
  • Content safety filtering and mandatory medical disclaimers
  • Structured logging with automatic PII redaction
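
Twilio's webhook scheme signs the full request URL concatenated with each POST parameter's name and value in sorted key order, using HMAC-SHA1 with the account's auth token, base64-encoded. A verification sketch:

```python
import base64
import hashlib
import hmac

def verify_twilio_signature(auth_token: str, url: str, params: dict, signature: str) -> bool:
    """Recompute Twilio's X-Twilio-Signature header and compare in constant time."""
    # URL + name/value of every POST parameter, sorted by parameter name
    payload = url + "".join(key + params[key] for key in sorted(params))
    digest = hmac.new(auth_token.encode(), payload.encode(), hashlib.sha1).digest()
    expected = base64.b64encode(digest).decode()
    return hmac.compare_digest(expected, signature)
```

`hmac.compare_digest` avoids timing side channels that a plain `==` comparison would leak.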

🌍 Accessibility Features

  • Multi-lingual support (English, Swahili) via Azure Translator
  • SMS/WhatsApp accessibility for low-connectivity areas
  • Async processing with retry logic for resilient delivery
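
The retry-with-backoff behavior can be sketched as follows (attempt counts and delays are illustrative; the project's dependency list includes tenacity, which provides this declaratively):

```python
import time

def with_retry(fn, attempts: int = 3, base_delay: float = 0.5, sleep=time.sleep):
    """Call fn, retrying with exponential backoff on any exception.

    The sleep function is injectable so tests can run without real delays.
    """
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of attempts: surface the last failure
            sleep(base_delay * (2 ** attempt))
```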

Technical Stack

Core Technologies

  • Backend: FastAPI (Python 3.11+), Uvicorn, Gunicorn
  • AI/ML: Azure OpenAI (GPT-4, text-embedding-3-large), Azure Cognitive Search
  • Data Platform: Microsoft Fabric (Eventstreams, Lakehouse, KQL Database)
  • Messaging: Azure Communication Services, Twilio
  • Alerting: Power Automate, Microsoft Teams, Data Activator
  • Storage: Azure Blob Storage, OneLake Delta Tables
  • Observability: Structlog, Azure Application Insights

Key Libraries

fastapi==0.104.1
pydantic==2.5.0
openai==1.3.5
azure-identity==1.15.0
azure-eventhub==5.11.5
azure-search-documents==11.4.0
httpx==0.25.2
tenacity==8.2.3

Prerequisites

Required Azure Resources

  1. Azure OpenAI Service (GPT-4 + text-embedding-3-large deployments)
  2. Azure Cognitive Search (Standard tier, semantic search enabled)
  3. Azure Event Hubs Namespace (Standard tier)
  4. Microsoft Fabric Workspace (with Eventstream, Lakehouse, KQL DB)
  5. Azure Communication Services OR Twilio Account
  6. Azure Container Registry (for production deployment)
  7. Azure Key Vault (for secrets management)

Local Development

  • Python 3.11+
  • Docker & Docker Compose
  • Azure CLI (az)
  • Git

Setting Up Microsoft Fabric Environment

Before running the application, you must configure the Microsoft Fabric workspace and resources. This section provides step-by-step instructions for judges and developers to reproduce the complete solution.

Prerequisites for Fabric Setup

  • Microsoft Fabric capacity (F64 or higher recommended)
  • Power BI Premium workspace (or Fabric trial enabled)
  • Permissions: Workspace Admin or Contributor role

Step 1: Create Fabric Workspace

  1. Navigate to Microsoft Fabric Portal
  2. Click Workspaces → + New workspace
  3. Configure workspace:
    • Name: MamaGuardian-Workspace
    • License mode: Fabric capacity or Trial
    • Advanced → Enable all experiences (Data Engineering, Real-Time Intelligence, Data Science)
  4. Click Apply

Step 2: Create Eventstream (Real-Time Ingestion)

  1. In your workspace, click + New → Eventstream
  2. Configure:
    • Name: mamaguardian-eventstream
    • Description: "Real-time maternal health message ingestion"
  3. Click Create

Configure Eventstream Source

  1. In the Eventstream designer, click Add source → Custom App
  2. Configure custom app source:
    • Source name: MessageIngestion
    • Connection: Create new connection
    • Click Add
  3. Copy the connection string displayed in the source details (you'll need this for EVENT_HUB_CONNECTION_STRING)
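
With that connection string, publishing a test event can be sketched as below. This is a minimal sketch: azure-eventhub speaks the Event Hub-compatible endpoint the Custom App source exposes, and the event shape mirrors the triage_messages schema used in later steps.

```python
import json
import os
from datetime import datetime, timezone

def build_event(message_id: str, phone_hash: str, text: str) -> str:
    """Serialize an inbound message in the JSON shape the Eventstream ingests."""
    return json.dumps({
        "message_id": message_id,
        "phone_hash": phone_hash,
        "message_text": text,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    })

if __name__ == "__main__":
    # The Custom App source exposes an Event Hub-compatible endpoint,
    # so the standard azure-eventhub producer works against it.
    from azure.eventhub import EventData, EventHubProducerClient

    producer = EventHubProducerClient.from_connection_string(
        os.environ["EVENT_HUB_CONNECTION_STRING"],
        eventhub_name=os.environ["EVENT_HUB_NAME"],
    )
    with producer:
        batch = producer.create_batch()
        batch.add(EventData(build_event("test-001", "abc123", "test message")))
        producer.send_batch(batch)
```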

Configure Eventstream Destinations

  1. Click Add destination → Lakehouse

    • Destination name: TriageMessages-Lakehouse
    • Workspace: Select your workspace
    • Lakehouse: Create new or select existing (see Step 3)
    • Table name: triage_messages
    • Input data format: JSON
    • Click Add
  2. Click Add destination → KQL Database

    • Destination name: TriageMessages-KQL
    • Workspace: Select your workspace
    • KQL Database: Create new or select existing (see Step 4)
    • Table name: TriageMessages
    • Input data format: JSON
    • Click Add
  3. Click Publish to activate the Eventstream

Step 3: Create Lakehouse (Historical Storage)

  1. In your workspace, click + New → Lakehouse
  2. Configure:
    • Name: MamaGuardian-Lakehouse
    • Description: "Persistent storage for maternal health data"
  3. Click Create

Create Delta Tables

  1. In the Lakehouse, switch to the SQL analytics endpoint
  2. Run the following SQL to create the schema:
-- Create triage messages table
-- (Delta cannot partition by an expression such as DATE(timestamp),
--  so an explicit event_date column is used; populate it from the
--  timestamp at write time)
CREATE TABLE triage_messages (
    message_id STRING,
    phone_hash STRING,
    message_text STRING,
    triage_category STRING,
    confidence DOUBLE,
    keywords_detected ARRAY<STRING>,
    recommended_action STRING,
    alert_triggered BOOLEAN,
    timestamp TIMESTAMP,
    processing_time_ms BIGINT,
    provider STRING,
    language_detected STRING,
    translated BOOLEAN,
    event_date DATE
)
USING DELTA
PARTITIONED BY (event_date)
LOCATION 'Tables/triage_messages';

-- Create QnA logs table (event_date populated from timestamp at write time)
CREATE TABLE qna_logs (
    query_id STRING,
    phone_hash STRING,
    question STRING,
    answer STRING,
    sources ARRAY<STRUCT<title: STRING, url: STRING>>,
    confidence DOUBLE,
    timestamp TIMESTAMP,
    language_detected STRING,
    event_date DATE
)
USING DELTA
PARTITIONED BY (event_date)
LOCATION 'Tables/qna_logs';

-- Create feedback table
CREATE TABLE feedback_logs (
    feedback_id STRING,
    message_id STRING,
    outcome STRING,
    clinician_notes STRING,
    timestamp TIMESTAMP
)
USING DELTA
LOCATION 'Tables/feedback_logs';
  3. Verify the tables were created:
SHOW TABLES;

Step 4: Create KQL Database (Real-Time Analytics)

  1. In your workspace, click + New → Eventhouse
  2. Configure:
    • Name: MamaGuardian-Eventhouse
    • This creates both an Eventhouse and a KQL Database
  3. Click Create

Create KQL Tables

  1. In the KQL Database, click Query → New query
  2. Run the following KQL commands:
// Create raw events table
.create table TriageMessages (
    message_id: string,
    phone_hash: string,
    message_text: string,
    triage_category: string,
    confidence: real,
    keywords_detected: dynamic,
    recommended_action: string,
    alert_triggered: bool,
    timestamp: datetime,
    processing_time_ms: long,
    provider: string,
    language_detected: string,
    translated: bool
)

// Enable streaming ingestion
.alter table TriageMessages policy streamingingestion enable

// Create ingestion mapping for JSON
.create table TriageMessages ingestion json mapping 'TriageMessagesMapping' '[{"column": "message_id", "path": "$.message_id"}, {"column": "phone_hash", "path": "$.phone_hash"}, {"column": "message_text", "path": "$.message_text"}, {"column": "triage_category", "path": "$.triage_category"}, {"column": "confidence", "path": "$.confidence"}, {"column": "keywords_detected", "path": "$.keywords_detected"}, {"column": "recommended_action", "path": "$.recommended_action"}, {"column": "alert_triggered", "path": "$.alert_triggered"}, {"column": "timestamp", "path": "$.timestamp"}, {"column": "processing_time_ms", "path": "$.processing_time_ms"}, {"column": "provider", "path": "$.provider"}, {"column": "language_detected", "path": "$.language_detected"}, {"column": "translated", "path": "$.translated"}]'

  3. Create materialized views for analytics:
// Hourly aggregations
.create materialized-view HourlyTriageStats on table TriageMessages
{
    TriageMessages
    | summarize 
        MessageCount = count(),
        EmergencyCount = countif(triage_category == "Emergency"),
        WarningCount = countif(triage_category == "Warning"),
        RoutineCount = countif(triage_category == "Routine"),
        AvgProcessingTime = avg(processing_time_ms),
        AlertRate = todouble(countif(alert_triggered)) / count() * 100
      by bin(timestamp, 1h), provider
}

// Real-time anomaly detection function
.create function AnomalyDetection() {
    TriageMessages
    | where timestamp > ago(24h)
    | make-series MessageRate = count() default = 0 on timestamp step 1h
    | extend (anomalies, score, baseline) = series_decompose_anomalies(MessageRate, 1.5, -1, 'linefit')
    | mv-expand timestamp to typeof(datetime), MessageRate to typeof(long), anomalies to typeof(long), score to typeof(double), baseline to typeof(long)
    | where anomalies != 0
}

Step 5: Configure Data Activator (Real-Time Alerting)

  1. In your workspace, click + New → Reflex
  2. Configure:
    • Name: MamaGuardian-Emergency-Alerts
    • Description: "Trigger alerts for emergency maternal health cases"
  3. Click Create

Create Alert Trigger

  1. Click Get data → Eventstream

    • Select mamaguardian-eventstream
    • Click Connect
  2. Click New trigger

    • Trigger name: Emergency-Alert-Trigger
    • Condition: triage_category == "Emergency"
    • Action: Microsoft Teams / Email / Power Automate
  3. Configure action:

Option A: Microsoft Teams

  • Select Teams action
  • Click Sign in and authenticate
  • Select channel for notifications
  • Configure message template:
🚨 EMERGENCY MATERNAL HEALTH ALERT

Patient: {{phone_hash}}
Message: {{message_text}}
Classification: {{triage_category}}
Confidence: {{confidence}}%
Time: {{timestamp}}

Recommended Action: {{recommended_action}}

[View Dashboard](https://app.fabric.microsoft.com/...)

Option B: Power Automate

  • Select Custom → Power Automate
  • Create new flow:
    1. Trigger: When a HTTP request is received
    2. Action: Send an SMS (using Twilio/ACS)
    3. Action: Post message in Teams
    4. Action: Send email (to on-call clinician)
  • Copy webhook URL to POWER_AUTOMATE_WEBHOOK_URL in .env
  4. Click Start to activate the trigger

Step 6: Create Power BI Dashboard

  1. In your workspace, click + New → Power BI report
  2. Click Pick a published semantic model → KQL Database
    • Select MamaGuardian-Eventhouse
    • Click Create

Add Real-Time Visuals

Visual 1: Emergency Alert Card

Emergency Cases Today = 
CALCULATE(
    COUNT(TriageMessages[message_id]),
    TriageMessages[triage_category] = "Emergency",
    TriageMessages[timestamp] >= TODAY()
)

Visual 2: Real-Time Message Stream Table

  • Data source: DirectQuery to KQL Database
  • Table: TriageMessages
  • Columns: timestamp, phone_hash, triage_category, message_text
  • Sort: timestamp descending
  • Filter: Last 1 hour
  • Enable Auto page refresh: 5 seconds

Visual 3: Hourly Triage Trend (Line Chart)

  • X-axis: timestamp (hourly bins)
  • Y-axis: Count of messages
  • Legend: triage_category
  • Data source: HourlyTriageStats materialized view

Visual 4: Processing Time Gauge

Avg Processing Time = 
AVERAGE(TriageMessages[processing_time_ms])
  • Target: < 500ms

Visual 5: Alert Rate KPI

Alert Rate = 
DIVIDE(
    COUNTROWS(FILTER(TriageMessages, TriageMessages[alert_triggered] = TRUE)),
    COUNTROWS(TriageMessages)
) * 100
  3. Click File → Save → Name: MamaGuardian-Triage-Dashboard
  4. Click Publish to workspace

Step 7: Verify Fabric Setup

Run the following verification queries in KQL Database:

// Check if tables exist
.show tables

// Test data ingestion (after sending test message)
TriageMessages
| take 10

// Verify streaming ingestion is enabled
.show table TriageMessages policy streamingingestion

// Test anomaly detection function
AnomalyDetection()

// Check hourly aggregations
HourlyTriageStats
| take 10

Step 8: Retrieve Connection Strings

After completing Fabric setup, collect these values for your .env file:

  1. EVENT_HUB_CONNECTION_STRING:

    • Go to Eventstream → Custom App source → Connection details
    • Copy the connection string
  2. Fabric Workspace URLs (for documentation):

    • Eventstream URL: https://app.fabric.microsoft.com/groups/{workspace-id}/eventstreams/{eventstream-id}
    • Lakehouse URL: https://app.fabric.microsoft.com/groups/{workspace-id}/lakehouses/{lakehouse-id}
    • KQL Database URL: https://app.fabric.microsoft.com/groups/{workspace-id}/kqldatabases/{kql-id}
    • Power BI Dashboard URL: https://app.fabric.microsoft.com/groups/{workspace-id}/reports/{report-id}

Troubleshooting Fabric Setup

Issue: Eventstream not receiving data

# Test Event Hub connection
python app/scripts/test_eventhub.py

Issue: KQL table ingestion failing

// Check ingestion failures
.show ingestion failures
| where Table == "TriageMessages"
| top 10 by FailedOn desc

Issue: Data Activator not triggering

  • Verify trigger condition syntax
  • Check Data Activator is in "Started" state
  • Review Activity log for errors

Issue: Power BI not showing real-time data

  • Ensure DirectQuery mode is enabled
  • Verify auto-refresh settings (Fabric capacity required)
  • Check KQL Database permissions

Quick Start

1. Clone Repository

git clone https://github.com/your-org/mamaguardian.git
cd mamaguardian

2. Create Virtual Environment

python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
pip install -r requirements.txt

3. Configure Environment

cp .env.example .env
# Edit .env with your Azure credentials and Fabric connection strings from Step 8 above

4. Initialize Knowledge Base (RAG)

# Embed FAQ documents into Cognitive Search
python app/scripts/embed_knowledge_base.py --input data/maternal_health_faq.json

5. Run Development Server

uvicorn app.main:app --reload --port 8000

6. Test Health Endpoint

curl http://localhost:8000/healthz

7. Simulate Inbound Message

python app/scripts/simulate_inbound.py --message "I have severe headache and blurred vision"

Configuration

Environment Variables (.env)

# ========================================
# REQUIRED - Core Azure OpenAI
# ========================================
AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com/
AZURE_OPENAI_KEY=your_openai_key_here
AZURE_OPENAI_DEPLOYMENT_CHAT=gpt-4-turbo
AZURE_OPENAI_DEPLOYMENT_EMBED=text-embedding-3-large

# ========================================
# REQUIRED - Security
# ========================================
HASH_SALT=random_salt_for_pii_hashing_min_32_chars

# ========================================
# REQUIRED - Azure Cognitive Search (RAG)
# ========================================
AZURE_SEARCH_ENDPOINT=https://your-search.search.windows.net
AZURE_SEARCH_KEY=your_search_admin_key
AZURE_SEARCH_INDEX_NAME=mamaguardian-kb

# ========================================
# REQUIRED - Event Streaming (choose one)
# ========================================
EVENT_HUB_CONNECTION_STRING=Endpoint=sb://your-namespace.servicebus.windows.net/...
EVENT_HUB_NAME=mamaguardian-events

# ========================================
# REQUIRED - Messaging Provider (choose one)
# ========================================
PROVIDER=acs  # or 'twilio'

# Option A: Azure Communication Services
ACS_CONNECTION_STRING=endpoint=https://your-acs.communication.azure.com/;accesskey=...
ACS_PHONE_NUMBER=+1234567890

# Option B: Twilio
TWILIO_ACCOUNT_SID=ACxxxxxxxxxxxxxxxxxxxx
TWILIO_AUTH_TOKEN=your_auth_token
TWILIO_PHONE_NUMBER=+1234567890

# ========================================
# OPTIONAL - Webhook Security
# ========================================
WEBHOOK_SECRET_ACS=your_webhook_secret_for_acs
WEBHOOK_SECRET_TWILIO=your_webhook_secret_for_twilio
DEVELOPMENT_MODE=false  # Set to 'true' to bypass webhook verification

# ========================================
# OPTIONAL - Translation (Multi-lingual)
# ========================================
TRANSLATOR_KEY=your_azure_translator_key
TRANSLATOR_REGION=eastus
TRANSLATOR_ENDPOINT=https://api.cognitive.microsofttranslator.com/

# ========================================
# OPTIONAL - Alerting
# ========================================
POWER_AUTOMATE_WEBHOOK_URL=https://prod-xx.region.logic.azure.com:443/workflows/...
TEAMS_WEBHOOK_URL=https://outlook.office.com/webhook/...

# ========================================
# OPTIONAL - Application Insights
# ========================================
APPLICATIONINSIGHTS_CONNECTION_STRING=InstrumentationKey=...
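
Loading and validating these variables at startup can be sketched with a plain dataclass (the field selection is illustrative; the real app may use pydantic settings instead):

```python
import os
from dataclasses import dataclass

@dataclass(frozen=True)
class Settings:
    openai_endpoint: str
    hash_salt: str
    provider: str

    @classmethod
    def from_env(cls) -> "Settings":
        # Fail fast on weak salts rather than silently degrading PII protection
        salt = os.environ["HASH_SALT"]
        if len(salt) < 32:
            raise ValueError("HASH_SALT must be at least 32 characters")
        provider = os.environ.get("PROVIDER", "acs")
        if provider not in {"acs", "twilio"}:
            raise ValueError(f"PROVIDER must be 'acs' or 'twilio', got {provider!r}")
        return cls(
            openai_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
            hash_salt=salt,
            provider=provider,
        )
```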

Webhook Configuration

For Twilio

  1. Go to Twilio Console → Phone Numbers → Active Numbers
  2. Set Messaging Webhook to: https://your-domain.com/webhooks/sms
  3. Method: POST

For Azure Communication Services

  1. Create Event Grid subscription for Microsoft.Communication.SMSReceived
  2. Set endpoint to: https://your-domain.com/webhooks/sms
  3. Add webhook secret in Event Grid headers

Deployment

Option 1: Docker (Development)

# Build development image
docker build -f Dockerfile.dev -t mamaguardian:dev .

# Run with environment file
docker run -p 8000:8000 --env-file .env mamaguardian:dev

Option 2: Docker (Production)

# Build production image
docker build -f Dockerfile -t mamaguardian:prod .

# Run with multiple workers
docker run -p 8000:8000 --env-file .env \
  -e WORKERS=4 \
  -e WORKER_CLASS=uvicorn.workers.UvicornWorker \
  mamaguardian:prod

Option 3: Azure Container Apps (Recommended)

Prerequisites

# Login to Azure
az login

# Set variables
RESOURCE_GROUP="mamaguardian-rg"
LOCATION="eastus"
ACR_NAME="mamaguardianacr"
APP_NAME="mamaguardian-api"

Deploy

# Create resource group
az group create --name $RESOURCE_GROUP --location $LOCATION

# Create container registry
az acr create --resource-group $RESOURCE_GROUP \
  --name $ACR_NAME --sku Standard

# Build and push image
az acr build --registry $ACR_NAME --image mamaguardian:latest .

# Create Container App environment
az containerapp env create \
  --name mamaguardian-env \
  --resource-group $RESOURCE_GROUP \
  --location $LOCATION

# Deploy Container App
az containerapp create \
  --name $APP_NAME \
  --resource-group $RESOURCE_GROUP \
  --environment mamaguardian-env \
  --image ${ACR_NAME}.azurecr.io/mamaguardian:latest \
  --target-port 8000 \
  --ingress external \
  --registry-server ${ACR_NAME}.azurecr.io \
  --env-vars-file .env.production

Option 4: Azure Web App

See .github/workflows/deploy-azure.yml for automated deployment via publish profile.

API Endpoints

Health & Info

GET /
GET /healthz

Response (/healthz):

{
  "status": "healthy",
  "timestamp": "2025-11-03T10:30:00Z",
  "services": {
    "openai": "healthy",
    "search": "healthy",
    "eventhub": "healthy",
    "messaging": "healthy"
  }
}
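
Aggregating per-dependency probes into that response shape can be sketched as follows (probe names are taken from the example above; "degraded" as the failure status is an assumption):

```python
from datetime import datetime, timezone

def health_payload(checks: dict) -> dict:
    """Fold dependency probe results (name -> bool) into the /healthz shape."""
    return {
        "status": "healthy" if all(checks.values()) else "degraded",
        "timestamp": datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ"),
        "services": {name: "healthy" if ok else "unhealthy" for name, ok in checks.items()},
    }
```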

Webhooks (Production)

POST /webhooks/whatsapp
POST /webhooks/sms

Request Body (Twilio SMS):

{
  "MessageSid": "SM...",
  "From": "+254712345678",
  "To": "+1234567890",
  "Body": "I am 8 months pregnant with severe headache"
}
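
Because the two providers use different field names, a small normalization step keeps downstream processing provider-agnostic. Twilio's names are taken from the example above; the ACS field names here are assumptions for illustration:

```python
def normalize_inbound(provider: str, payload: dict) -> dict:
    """Map provider-specific webhook fields to one canonical event shape."""
    if provider == "twilio":
        return {
            "message_id": payload["MessageSid"],
            "phone": payload["From"],
            "text": payload["Body"],
            "provider": "twilio",
        }
    if provider == "acs":
        # Field names assumed for this sketch; check the SMSReceived event schema
        return {
            "message_id": payload["messageId"],
            "phone": payload["from"],
            "text": payload["message"],
            "provider": "acs",
        }
    raise ValueError(f"unknown provider: {provider}")
```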

Direct APIs (Testing)

Triage Classification

POST /triage
Content-Type: application/json

{
  "phone": "+254712345678",
  "message": "I have severe abdominal pain and bleeding",
  "timestamp": "2025-11-03T10:30:00Z"
}

Response:

{
  "category": "Emergency",
  "confidence": 0.95,
  "keywords_detected": ["severe pain", "bleeding"],
  "recommended_action": "Immediate medical attention required",
  "response_message": "This requires urgent care. Please go to the nearest hospital immediately."
}
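
The production path classifies with GPT-4, but a deterministic keyword fallback of the kind hinted at by `keywords_detected` might look like this (the term lists are illustrative, not the service's actual vocabulary):

```python
EMERGENCY_TERMS = {"bleeding", "seizure", "unconscious", "blurred vision"}
WARNING_TERMS = {"headache", "swelling", "dizziness", "fever"}

def triage_fallback(message: str) -> dict:
    """Classify a message by substring keyword match, most severe first."""
    text = message.lower()
    hits_emergency = sorted(t for t in EMERGENCY_TERMS if t in text)
    hits_warning = sorted(t for t in WARNING_TERMS if t in text)
    if hits_emergency:
        return {"category": "Emergency", "keywords_detected": hits_emergency}
    if hits_warning:
        return {"category": "Warning", "keywords_detected": hits_warning}
    return {"category": "Routine", "keywords_detected": []}
```

A rule-based layer like this is useful as a safety net when the LLM call times out or returns low confidence.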

Q&A Copilot

POST /qna
Content-Type: application/json

{
  "phone": "+254712345678",
  "question": "What foods should I avoid during pregnancy?"
}

Response:

{
  "answer": "During pregnancy, avoid raw or undercooked meats, unpasteurized dairy, raw eggs, certain fish high in mercury (shark, swordfish), and unwashed produce. Also limit caffeine intake.",
  "sources": [
    {"title": "WHO Pregnancy Nutrition Guidelines", "url": "..."},
    {"title": "Safe Foods During Pregnancy", "url": "..."}
  ],
  "disclaimer": "This information is for educational purposes. Consult your healthcare provider for personalized advice."
}
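
The retrieve step behind these answers can be illustrated without Azure dependencies: rank candidate documents by cosine similarity between the query embedding and stored embeddings. This is an in-memory sketch; the real system delegates ranking to Cognitive Search's vector index.

```python
import math

def cosine(a: list, b: list) -> float:
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def top_k(query_vec: list, docs: list, k: int = 2) -> list:
    """docs: list of (doc_id, embedding) pairs. Return the k closest doc ids."""
    ranked = sorted(docs, key=lambda d: cosine(query_vec, d[1]), reverse=True)
    return [doc_id for doc_id, _ in ranked[:k]]
```

The retrieved documents are then stuffed into the GPT-4 prompt as context, which is the "RAG" part of the pipeline.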

Data Flow

1. Message Ingestion Pipeline

User SMS → Twilio/ACS → Webhook → FastAPI → Background Task
  ↓
  ├─ Normalize & Hash PII
  ├─ Detect Language (optional)
  ├─ Translate to English (if needed)
  └─ Publish to Eventstream

2. Triage Processing

Eventstream → Azure OpenAI (GPT-4) → Classification
  ↓
  ├─ Emergency → Trigger Alert (Data Activator)
  ├─ Warning → Log & Monitor
  └─ Routine → Auto-Response

3. Q&A Processing

User Question → Copilot Endpoint
  ↓
  ├─ Generate Embedding (text-embedding-3-large)
  ├─ Vector Search (Cognitive Search)
  ├─ Retrieve Top-K Documents
  └─ Generate Answer (GPT-4 with RAG context)

4. Alert Flow

Emergency Event → Data Activator Rule
  ↓
  └─ Power Automate → SMS/Teams/Email to Clinician

5. Analytics Pipeline

Eventstream
  ↓
  ├─ → Lakehouse (Delta Tables) → Historical Analysis
  └─ → KQL Database → Real-Time Queries → Power BI

Testing

Unit Tests

# Install test dependencies
pip install pytest pytest-cov pytest-asyncio

# Run all tests with coverage
pytest app/tests/ -v --cov=app --cov-report=html

# Run specific test module
pytest app/tests/test_llm.py -v

Integration Tests

# Test webhook signature verification
pytest app/tests/test_webhooks.py::test_twilio_signature_validation

# Test end-to-end triage flow
pytest app/tests/test_integration.py::test_emergency_triage_flow

Simulation Script

# Simulate multiple scenarios
python app/scripts/simulate_inbound.py --scenarios scenarios.json

Example scenarios.json:

[
  {
    "phone": "+254712345678",
    "message": "I have severe headache and blurred vision",
    "expected_category": "Emergency"
  },
  {
    "phone": "+254712345679",
    "message": "What vitamins should I take?",
    "expected_category": "Routine"
  }
]
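
A small evaluation helper can score a simulation run against the expected categories (the results shape, phone → actual category, is hypothetical for this sketch):

```python
def evaluate(scenarios: list, results: dict) -> dict:
    """Compare each scenario's expected_category against the actual result."""
    mismatches = []
    for s in scenarios:
        actual = results.get(s["phone"])
        if actual != s["expected_category"]:
            mismatches.append({
                "phone": s["phone"],
                "expected": s["expected_category"],
                "actual": actual,
            })
    total = len(scenarios)
    return {
        "accuracy": (total - len(mismatches)) / total if total else 0.0,
        "mismatches": mismatches,
    }
```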

CI/CD Pipeline

GitHub Actions Workflow

The repository includes automated workflows at .github/workflows/:

1. Build & Test (ci.yml)

  • Triggers on PR to main
  • Runs unit tests and linting
  • Generates coverage reports

2. Deploy to Azure (deploy-azure.yml)

  • Triggers on push to main
  • Builds Docker image
  • Pushes to Azure Container Registry
  • Updates Azure Container App

Required Secrets:

AZURE_CREDENTIALS          # Service principal JSON
ACR_LOGIN_SERVER          # e.g., mamaguardianacr.azurecr.io
AZURE_RESOURCE_GROUP      # Resource group name
AZURE_WEBAPP_NAME         # Container App name

3. Deploy with Publish Profile (deploy-webapp.yml)

Alternative deployment using Azure Web App publish profile.

Required Secrets:

AZURE_WEBAPP_NAME
AZURE_PUBLISH_PROFILE     # XML content from Azure Portal

Manual Deployment

# Trigger workflow manually
gh workflow run deploy-azure.yml -f environment=production

Monitoring & Observability

Structured Logging

All logs are JSON-formatted with automatic PII redaction:

import structlog
logger = structlog.get_logger()

logger.info("triage_complete",
  phone_hash="abc123...",
  category="Emergency",
  latency_ms=450
)
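
A structlog processor for PII redaction is just a callable over the event dict, run before rendering. A minimal sketch (the regex and placeholder are illustrative):

```python
import re

# Matches international-format phone numbers of 9-15 digits
PHONE_RE = re.compile(r"\+?\d{9,15}")

def redact_pii(logger, method_name, event_dict):
    """structlog processor: scrub phone-number-like strings from every value."""
    for key, value in event_dict.items():
        if isinstance(value, str):
            event_dict[key] = PHONE_RE.sub("[REDACTED]", value)
    return event_dict
```

Registered via `structlog.configure(processors=[redact_pii, ...])`, it runs on every log call before the JSON renderer sees the event.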

Application Insights

Integrated via APPLICATIONINSIGHTS_CONNECTION_STRING:

  • Request/response tracking
  • Exception logging
  • Custom metrics (triage accuracy, latency)
  • Dependency telemetry (OpenAI, Search, Event Hub)

KQL Monitoring Queries

Emergency cases last 24 hours:

TriageMessages
| where timestamp >= ago(1d)
| where triage_category == "Emergency"
| summarize count() by bin(timestamp, 1h)
| render timechart

Average processing latency by provider:

TriageMessages
| summarize avg(processing_time_ms) by provider
| render columnchart

Alert trigger rate:

TriageMessages
| where alert_triggered == true
| summarize count() by bin(timestamp, 1h), triage_category
| render timechart

Roadmap

Phase 1: MVP (Completed)

  • ✅ SMS/WhatsApp ingestion
  • ✅ AI triage classification
  • ✅ Emergency alerting
  • ✅ Q&A Copilot with RAG
  • ✅ Power BI dashboard

Phase 2: Enhanced Intelligence (Q1 2026)

  • Anomaly detection for trending symptoms
  • Predictive risk scoring with ML models
  • Multi-turn conversational memory
  • Voice message support (speech-to-text)

Phase 3: Clinical Integration (Q2 2026)

  • EHR/FHIR integration
  • Bidirectional clinician communication
  • Appointment scheduling
  • Prescription management

Phase 4: Scale & Localization (Q3 2026)

  • Multi-country deployment
  • Support for 10+ languages
  • Community health worker portal
  • Offline-first mobile app

Development Setup

  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/amazing-feature)
  3. Make your changes with tests
  4. Run linting: ruff check app/
  5. Submit a pull request

Code Standards

  • Python: Follow PEP 8, use type hints
  • Async/await throughout
  • Comprehensive error handling
  • Structured logging with context
  • Unit test coverage > 80%

License

This project is licensed under the MIT License - see LICENSE file for details.

Acknowledgments

  • Microsoft Fabric Global Hackathon for the opportunity
  • Azure OpenAI & Cognitive Search teams for cutting-edge AI tools
  • WHO & UNICEF for maternal health guidelines
  • Open-source community for foundational libraries


Built with ❤️ for maternal health using Microsoft Fabric & Azure AI

Submission for Microsoft Fabric Global Hackathon 2025
