Predicting Blood Glucose Levels Over Time Using Interpolation Methods
1. INTRODUCTION
1.1 Relevance of Glucose Level Monitoring
Accurate prediction of glucose levels is critical for managing diabetes, optimizing insulin dosage,
and preventing hyperglycemia or hypoglycemia.
1.2 Mathematical Foundation
Glucose level changes over time can be represented as a function using discrete data points (e.g.,
from continuous glucose monitors). Interpolation is a technique used to estimate intermediate values
within the range of collected data.
1.3 Importance of Interpolation
Unlike curve fitting (regression), interpolation constructs a function that passes exactly through every known data point, which is desirable in medical applications where the measurements themselves are trusted.
1.4 Role of Technology
With digital health records and real-time sensors, implementing interpolation algorithms in Python or
MATLAB helps predict glucose fluctuations efficiently.
1.5 Integration with Applied Mathematics
Interpolation links mathematical theory to real-time patient care, particularly in predictive diagnostics
and continuous health monitoring systems.
2. OBJECTIVES
To model glucose data as a function of time.
To apply interpolation (e.g., Newton's Divided Difference, Lagrange) to estimate glucose levels at
unsampled time points.
To validate interpolation results using real or simulated glucose readings.
To demonstrate the method through Python implementation.
3. OUTCOMES
Understanding interpolation and its relevance in biomedical data analysis.
Ability to estimate unknown glucose levels at unmeasured time points.
Skill development in Python coding for data interpolation.
Enhancement of logical reasoning in healthcare problem-solving.
4. METHODOLOGY
4.1 Problem Description
Given discrete glucose level readings at different times, use interpolation to estimate glucose levels
at in-between time points.
Let's assume:
t = time in minutes
G(t) = glucose level (mg/dL)
4.2 Sample Data
Time (min): [0, 30, 60, 90, 120]
Glucose Level (mg/dL): [100, 140, 160, 130, 110]
Goal: Estimate G(45), G(75), etc.
4.3 Method Used
Lagrange Interpolation formula:
G(t) = Σ (i = 0 to n) [ y_i · Π (j = 0 to n, j ≠ i) (t − t_j) / (t_i − t_j) ]
Newton's Divided Difference Interpolation, which builds the same unique polynomial incrementally from a table of divided differences.
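Since Newton's divided differences are named here but only Lagrange is implemented in Section 5, the method can be sketched as follows (function and variable names are our own; the inputs are assumed to be NumPy arrays):

```python
import numpy as np

def newton_divided_difference(x, X, Y):
    """Estimate the value at x from nodes X and values Y (NumPy arrays)
    using Newton's divided-difference interpolation."""
    n = len(X)
    coef = np.array(Y, dtype=float)
    # Build the divided-difference coefficients in place:
    # after pass k, coef[k] holds f[x_0, ..., x_k].
    for k in range(1, n):
        coef[k:] = (coef[k:] - coef[k-1:-1]) / (X[k:] - X[:n-k])
    # Evaluate the Newton form with Horner's scheme.
    result = coef[-1]
    for k in range(n - 2, -1, -1):
        result = result * (x - X[k]) + coef[k]
    return result

time = np.array([0.0, 30.0, 60.0, 90.0, 120.0])
glucose = np.array([100.0, 140.0, 160.0, 130.0, 110.0])
print(newton_divided_difference(75, time, glucose))
```

Because the interpolating polynomial through a given set of points is unique, this should return the same value as the Lagrange routine in Section 5; Newton's form is simply cheaper to update when a new reading arrives.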
5. PYTHON CODE EXPLANATION
5.1 Libraries
import numpy as np
5.2 Data Definition
time = np.array([0, 30, 60, 90, 120])
glucose = np.array([100, 140, 160, 130, 110])
5.3 Interpolation Function (Lagrange)
def lagrange_interpolation(x, X, Y):
    total = 0.0
    n = len(X)
    for i in range(n):
        term = Y[i]
        for j in range(n):
            if i != j:
                term *= (x - X[j]) / (X[i] - X[j])
        total += term
    return total
5.4 Usage Example
predicted = lagrange_interpolation(75, time, glucose)
print(f"Predicted glucose level at t=75 min: {predicted:.2f} mg/dL")
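The validation objective from Section 2 can be met with a simple leave-one-out check: drop one interior reading, interpolate it from the remaining points, and compare against the actual value. This particular check is our own choice of validation, sketched here using the same Lagrange routine:

```python
import numpy as np

def lagrange_interpolation(x, X, Y):
    """Lagrange interpolation, as in Section 5.3."""
    total = 0.0
    n = len(X)
    for i in range(n):
        term = Y[i]
        for j in range(n):
            if i != j:
                term *= (x - X[j]) / (X[i] - X[j])
        total += term
    return total

time = np.array([0.0, 30.0, 60.0, 90.0, 120.0])
glucose = np.array([100.0, 140.0, 160.0, 130.0, 110.0])

# Leave-one-out validation: remove each interior reading in turn,
# re-interpolate it from the other four points, and compare.
for k in range(1, len(time) - 1):
    t_rest = np.delete(time, k)
    g_rest = np.delete(glucose, k)
    est = lagrange_interpolation(time[k], t_rest, g_rest)
    print(f"t={time[k]:.0f} min: actual {glucose[k]:.0f}, "
          f"estimated {est:.2f} mg/dL")
```

Small leave-one-out errors suggest the readings are dense enough for interpolation to be trustworthy at the in-between times of interest.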
6. CONCLUSION
Summary: Interpolation was used to accurately estimate glucose levels from discrete data.
Practical Use: Helps in diabetes management by predicting levels without continuous sampling.
Accuracy: Exact at the measured points and generally reliable between them over short intervals; it should not be used to extrapolate beyond the data range, and high-degree polynomials can oscillate between widely spaced samples.
Tech Benefit: Easy to implement in patient monitoring systems using Python or mobile apps.