
ANALYSIS AND USE OF HEALTH FACILITY DATA

Guidance for immunization


programme managers

WORKING DOCUMENT – FEBRUARY 2018


© World Health Organization 2018

All rights reserved. This is a working document and should not be quoted, reproduced, translated or
adapted, in part or in whole, in any form or by any means.
MODULE 6. Guidance for immunization
programme managers

LEARNING OBJECTIVES
Immunization data should be used to support:

• operational, managerial and strategic decisions at all levels; and
• monitoring and accountability.
Those in the target audience will find this section useful as they design or improve information
systems, develop training materials, evaluate the monitoring system and the data it produces, or look
for ways to represent and visualize immunization data in ways that support decision making. This
section focuses on use of routine data. Separate guidance is under development on use of data from
multiple sources (health facility surveys, population-based surveys, focused investigations in addition
to the routine information system) to periodically review an immunization program or review sector-
wide progress.

AUDIENCE
This section provides guidance on the analysis and use of routine immunization data. It is aimed at:

• Ministry of health decision makers such as immunization program staff, data managers and health
information system managers;
• Immunization partner staff involved with immunization or health system strengthening;
• Consultants and staff working at research institutes involved with the assessment and
improvement of immunization data and information systems.

SUGGESTED REFERENCES
• Collecting, Assessing, and Using Immunization Data: Reference guide. WHO, Geneva. Working draft, February 2016.
• Data Quality Review: A toolkit for facility data quality assessment. WHO. 2015.

KEY AUTHORS
Jan Grevendonk | Robert Pond | Alain Poy

Contents

Acknowledgements

1. About the data

2. Data quality

3. Core facility indicators

4. Core analysis
   Immunization coverage for each vaccine in the national schedule
   Dropouts – children who start but don’t complete the immunization schedule
   Management of vaccine stocks
   Adverse events following immunizations
   Immunization sessions
   Completeness of reporting


Acknowledgements
This guidance document has been developed by the World Health Organization, with the support of
grants from Bloomberg Philanthropies Data for Health Initiative, Gavi, the Vaccine Alliance, The Global
Fund to Fight AIDS, Tuberculosis and Malaria, and The Norwegian Agency for Development
Cooperation.


1. About the data


Routine immunization data are recorded by staff at health facilities to track the immunization status of
children and pregnant women served by the health facility, any adverse events following immunization
(AEFIs), vaccines and supplies used, and the status of the cold chain. Summary data are periodically
reported to supervisors who monitor performance indicators such as immunization coverage and
vaccine wastage.

This section does not specifically address data related to:
• immunization campaigns;
• surveillance for vaccine-preventable diseases;
• case-based investigation of severe adverse events following immunization; and
• financial and human resource tracking.

This guidance includes limited discussion of some closely related data:
• immunization coverage estimates from population-based surveys;
• estimates of target populations (e.g. surviving infants);
• detailed data for immunization supply chain management.

In lower and middle income countries, immunization data are most often collected through a monthly
aggregation-based reporting system. The main exception to this basic setup occurs when Electronic
Immunization Registries (EIRs) are used. In such systems, the immunization records of each person are
maintained in an online electronic database, and all the data needed for monitoring can be extracted
from this database, making monthly reports redundant. This section focuses on the analysis and use of
aggregate data, whether reported monthly or periodically extracted from an EIR.

Most lower and middle income countries have integrated Health Management Information Systems
(HMIS), used for the systematic reporting and storing of health data, including immunization data. An
example of a commonly used system is the District Health Information System version 2 (DHIS2).
Additionally, in many countries, Excel-based tools are used for the reporting and analysis of
immunization data. They provide a systematic way to enter and store data and produce standardized
visualizations. The District Vaccine Data Management Tool (DVDMT) is an example of this kind of tool.
Locally stored databases have the advantage that data are always accessible, even where on-line
access is difficult.


2. Data quality
All data have limitations that affect their reliability and interpretation, and the data routinely
reported by health facilities are no exception. The data cannot be interpreted without first
knowing how complete they are and examining them for inconsistencies and errors. The data may
need to be adjusted before they can be meaningfully analyzed. Findings from the review of data quality,
and explanations of any adjustments to the data, must be presented explicitly and transparently
alongside the analytic findings.

Whether at district or higher levels, a “data desk review” can proceed with the data immediately
available to the analyst without field investigations. At national level, although a desk review can be
completed with the statistics aggregated to each individual district level, a more revealing review can
be carried out if the fully disaggregated data (for all facilities and all months for the period of analysis)
are available.

If time and resources permit, the desk review should be complemented by a “data verification survey”
of a sample of districts and health facilities to determine to what extent the reported data match with
the source documents (i.e. facility registers and tally sheets) and to assess the data management
system.

WHO has developed a toolkit to support both a desk review and field investigations of data quality.
This toolkit includes an Excel-based DQR tool which, when populated with key data from health
facilities and other sources, analyzes the completeness, internal consistency and external consistency
of the data. For countries using DHIS2 software to manage their routine data, WHO has also
developed the Data Quality Tool, an application which can be installed on the national DHIS2 system
that automatically generates findings from a data desk review at either national or sub-national level.

The remainder of this sub-section discusses the minimal data quality assessments that should be
conducted as an initial step in analysis of routine immunization data.
1. Assess the completeness and timeliness of the data. Facility-level reporting completeness is
defined as the number of facility reports received divided by the number of reports expected.
District-level reporting may be complete, but do the district reports contain the reported doses
from all of their facilities? Timeliness is defined as the fraction of expected reports that were
received before the specified deadline (e.g. the 5th business day of the following month).
Incomplete reporting biases indicators downward. If completeness is roughly constant over time,
the indicators can still show trends for the facilities that reported; if, however, completeness
varies significantly over time, trends should be interpreted with great caution.
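
To make the completeness and timeliness calculations concrete, the minimal sketch below computes both
measures for one month of facility reports. The facility names, dates and deadline are invented for
illustration; they are not taken from the WHO toolkit.

```python
# Illustrative sketch: reporting completeness and timeliness for one month.
from datetime import date

# Hypothetical report log: date each report was received (None if never received).
reports_expected = ["Facility A", "Facility B", "Facility C", "Facility D"]
reports_received = {
    "Facility A": date(2018, 2, 5),   # on time
    "Facility B": date(2018, 2, 19),  # late
    "Facility C": None,               # never received
    "Facility D": date(2018, 2, 2),   # on time
}
deadline = date(2018, 2, 7)  # assumed 5th business day of the following month

received = [f for f in reports_expected if reports_received.get(f) is not None]
on_time = [f for f in received if reports_received[f] <= deadline]

completeness = 100 * len(received) / len(reports_expected)
timeliness = 100 * len(on_time) / len(reports_expected)

print(f"Completeness: {completeness:.0f}%")  # 75%
print(f"Timeliness:   {timeliness:.0f}%")    # 50%
```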

2. Check for internal consistency. The second data quality assessment step is to look for “suspicious”
reported values. Not all suspicious values are wrong. However, suspicious values merit further
investigation.
i. Check for outliers – Outliers are values that differ substantially from the mean. Sometimes
outliers are the result of genuine variations in program performance, for example as the result
of an intensified effort or as the result of a stock out. However, in many cases they are clearly
the result of some kind of data error. The WHO Data Quality Tool for DHIS2 automatically
detects values from a health facility or from a district which are more than 3 standard
deviations above or below the mean value from the same health facility or district for the same
year. Charting data in a time series also provides a good visual way to find outliers.


Figure 1: Chart of trends in doses administered, used to detect outliers

The peaks in the numbers of doses of Penta-1 and Penta-3 in December 2012 and August 2013 warrant
investigation. Not only are they far above the usual numbers, but the number of Penta-3 doses also far
exceeds the number of OPV-3 doses, which are normally given at the same time (an example of
inconsistency between related indicators, as discussed below). This raises suspicion about the actual
number of Penta doses administered. The OPV-3 value for January 2014 also warrants investigation.
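
The 3-standard-deviation outlier check can be sketched as follows. The example assumes a simple table
of monthly dose counts with hypothetical column names; it is not the implementation used by the WHO
Data Quality Tool.

```python
# Illustrative sketch: flag monthly dose counts that lie more than 3 standard
# deviations from the facility's own mean for the year.
import pandas as pd

doses = pd.DataFrame({
    "facility": ["Facility A"] * 12,
    "month": list(range(1, 13)),
    "penta1": [42, 45, 38, 44, 41, 39, 43, 40, 180, 44, 42, 41],  # month 9 looks suspicious
})

grouped = doses.groupby("facility")["penta1"]
z_scores = (doses["penta1"] - grouped.transform("mean")) / grouped.transform("std")
outliers = doses[z_scores.abs() > 3]

print(outliers)  # the value of 180 in month 9 is flagged for investigation
```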
ii. Look for uniformity and patterns – Too much uniformity in the data and the existence of
patterns may also point to issues with data quality. If too many reported values are multiples
of 5 or 10, this may indicate that they are guesses rather than actual counts, or were obtained
by counting vials rather than tallying doses. Repeated numbers may point to mistakes.
Whenever performance seems ‘too perfect’, it may be worth investigating further. For instance,
does a district always achieve 100% coverage, month after month?
iii. Check for inconsistency in related indicators – An example of such inconsistency is when there
is a negative dropout over a full year between the number of children receiving the first dose
of DPT vaccine and the number receiving the third dose of the vaccine. Another example, as
illustrated in Figure 1 above, is inconsistency between the number of children receiving DPT
vaccine and the number of children receiving OPV vaccine. Yet another example is a negative
“wastage rate” (discussed below) – when the number of children reported to have received a
vaccine is greater than the number of doses of the vaccine reported to have been consumed.
iv. Check for inconsistency between reported data and data recorded on source documents –
“Data verification” requires a survey of a representative sample of health facilities to compare
the data they reported for select months to data recorded on immunization registers or tally
sheets for the same months. Such a data verification survey can provide evidence of either
over-reporting or under-reporting, or it can reveal problems with the recording and archiving of
data. During such surveys, health staff at facility and district levels are usually interviewed to
assess the adequacy of the system for data management and use (a minimal calculation sketch
appears at the end of this section).
3. Check for external consistency – Assessment of external consistency involves comparison of
findings from two different data sources. Estimates of immunization coverage based upon routine
data should be compared with estimates derived from population-based surveys such as a DHS, a
MICS or an immunization coverage survey.
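
As a final illustration for this section, the sketch below mimics the comparison made during a data
verification survey (item 2.iv above): doses reported by a few sampled facilities are compared with
doses recounted from their registers or tally sheets for the same months. Facility names and figures
are invented.

```python
# Illustrative sketch: compare reported doses with doses recounted from source
# documents. A ratio well below 1 suggests over-reporting; well above 1 suggests
# under-reporting.
verification_sample = [
    # (facility, reported Penta-3 doses, recounted Penta-3 doses)
    ("Facility A", 120, 118),
    ("Facility B", 95, 60),    # recount much lower: possible over-reporting
    ("Facility C", 80, 82),
]

for facility, reported, recounted in verification_sample:
    print(f"{facility}: recounted/reported = {recounted / reported:.2f}")

total_reported = sum(reported for _, reported, _ in verification_sample)
total_recounted = sum(recounted for _, _, recounted in verification_sample)
print(f"Overall verification ratio: {total_recounted / total_reported:.2f}")
```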


3. Core facility indicators


Core indicator: Immunization coverage rate by vaccine, for each vaccine in the national schedule
  Definition: 100 × (number of children receiving the vaccine) / (estimated size of the target population, e.g. infants less than one year)
  Disaggregations: by vaccine / dose of vaccine; age (<1 year, ≥1 year for infant immunizations; <2 years, ≥2 years for toddler immunizations); status (pregnant women, others) for TT; geographic region

Core indicator: Dropout rate for (a) first to third dose of DPT-containing vaccine, (b) BCG to first dose of measles-containing vaccine, and (c) first to second dose of measles-containing vaccine
  Definition: DPT1 to DPT3 dropout = 100 × (DPT1 doses – DPT3 doses) / DPT1 doses
              BCG to MCV1 dropout = 100 × (BCG doses – MCV1 doses) / BCG doses
              MCV1 to MCV2 dropout = 100 × (MCV1 doses – MCV2 doses) / MCV1 doses
  Disaggregations: geographic region

Core indicator: Vial wastage rate (closed and open) by vaccine, for each vaccine in the national schedule
  Definition: Closed vial wastage = % of doses that were spoiled due to expiry, heat exposure, freezing or breakage
              Open vial wastage = % of doses that were discarded after vials were opened
  Disaggregations: by vaccine; type of spoilage; geographic region

Core indicator: Full availability of vaccines and supplies
  Definition: Percentage of health facilities with no stockouts of any tracer vaccine or injection supply when vaccination is demanded
  Disaggregations: geographic region

Core indicator: Functional status of cold chain equipment
  Definition: Number of functional refrigerators
  Disaggregations: geographic region

Core indicator: Temperature alarms
  Definition: Number of times the temperature inside cold chain equipment exceeds or drops below a reference range

Core indicator: Serious adverse events following immunization (AEFI)
  Definition: Number of AEFI cases
  Disaggregations: non-serious, serious; geographic region

Core indicator: Immunization session completion rate
  Definition: 100 × number of completed immunization sessions / number of planned sessions
  Disaggregations: outreach versus fixed
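
To make the definitions concrete, the short sketch below computes coverage and dropout for a
hypothetical district using the formulas in the table; all counts are invented.

```python
# Illustrative sketch: coverage and dropout indicators from aggregated annual counts.
dpt1_doses = 950
dpt3_doses = 880
bcg_doses = 1010
mcv1_doses = 860
target_population = 1000   # estimated surviving infants (denominator)

dpt1_coverage = 100 * dpt1_doses / target_population
dpt3_coverage = 100 * dpt3_doses / target_population
dpt1_to_dpt3_dropout = 100 * (dpt1_doses - dpt3_doses) / dpt1_doses
bcg_to_mcv1_dropout = 100 * (bcg_doses - mcv1_doses) / bcg_doses

print(f"DPT1 coverage:        {dpt1_coverage:.1f}%")          # 95.0%
print(f"DPT3 coverage:        {dpt3_coverage:.1f}%")          # 88.0%
print(f"DPT1 to DPT3 dropout: {dpt1_to_dpt3_dropout:.1f}%")   # 7.4%
print(f"BCG to MCV1 dropout:  {bcg_to_mcv1_dropout:.1f}%")    # 14.9%
```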


4. Core analysis
IMMUNIZATION COVERAGE FOR EACH VACCINE IN THE NATIONAL
SCHEDULE

Purpose

Immunization coverage is the fraction of a certain target population that is vaccinated with a defined
number of doses for a certain antigen or vaccine, expressed as a percentage. Coverage is the most
direct measure of program performance and is monitored at national, sub-national and facility levels in
order to detect performance issues and take corrective action when needed.

Analysis
Monitor vaccine doses administered and immunization coverage

1) Each health facility can use a cumulative coverage chart to monitor progress towards immunizing its
target population.

2) A map of last month’s BCG coverage by district shows the most recent performance with a vaccine
that is ideally delivered at birth.

3) A map of last month’s Penta 3 coverage by district shows the most recent performance with a vaccine
that is ideally delivered at 14 weeks of age.

4) A map of last month’s MCV1 coverage by district shows the most recent performance with a vaccine
that is ideally delivered at around 9 months of age.


5) Ranking of districts by Penta 1 and Penta 3 coverage shows relative performance and dropout.

6) Doses administered by month show fluctuations in reported services as well as the consistency
between related doses (e.g. DPT 1 – DPT 3 – OPV 1 – OPV 3).

7) Doses administered by year show trends and consistency over time without reference to targets.

8) Coverage by year shows progress towards targets for each dose in the immunization schedule.

9) For DPT3 and/or MCV1, compare the routine estimate of immunization coverage nationwide to the survey
estimate.


Considerations/issues for interpretation

Necessary disaggregations
Data should be disaggregated by vaccine, dose, and age group of the children that received them (e.g.
<1, ≥1).

Monitoring access
Coverage for early doses in the immunization schedule, like BCG or Penta-1, is indicative of access to
immunization services. High first-dose coverage indicates that health services cover a high proportion
of the population and that people also seem to accept vaccination.

Monitoring overall performance of immunization services


Immunization program managers and health officials sometimes want to use routine data to monitor
full immunization coverage. However, it is difficult to set up an aggregate reporting system that
reliably reports the number of children who are fully immunized: those aggregating the data at facility
level must reliably scan all immunizations provided to each child to date to determine whether the
child is fully immunized. Furthermore, with more and more vaccinations being scheduled at older ages
for children and adults, the exact definition of “fully immunized” becomes debatable. Overall
performance is therefore most often measured through a tracer vaccine and dose, typically the third
dose of DTP-containing vaccine (DTP3).

Misclassification
Children given their first or second doses of DPT vaccine can be misclassified as receiving their third
dose of DPT. If this happens frequently enough, the health facility can report DPT3 greater than DPT1
(i.e. a “negative dropout rate” – discussed below), sometimes for an entire year overall.

Monitoring coverage with all vaccines in the schedule


Coverage for all other vaccines in the national schedule should also be monitored, as vaccine specific
uptake or programmatic issues may arise and need to be detected as soon as possible. After a new
vaccine introduction, it is recommended to specifically monitor coverage of that vaccine to make sure
it reaches levels in line with established vaccines.

Age disaggregations
To determine whether children are protected as early as possible, childhood immunization data should
be disaggregated by age. In this way it is possible to monitor “timely coverage” versus “delayed
coverage”. With the possible exception of the first dose of measles-containing vaccine (MCV1), timely
coverage is the percentage of children immunized before 12 months of age while delayed coverage is
the percentage of children immunized at 12 months or older. Some countries choose to define timely
coverage with MCV1 as the percentage of children immunized before their second birthday (MCV1<2)
rather than before their first birthday (MCV1<1). In either case, since the second dose of
measles-containing vaccine (MCV2) is recommended to be given between 15 and 18 months, timely coverage
with MCV2 is the percentage of children immunized before the second birthday (MCV2<2) whereas delayed
coverage is the percentage immunized at or after the second birthday (MCV2≥2).
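
A minimal sketch of this age disaggregation, using the <12 months / ≥12 months split and invented
counts:

```python
# Illustrative sketch: timely versus delayed coverage from age-disaggregated doses.
mcv1_under_12m = 720     # MCV1 doses given before the first birthday
mcv1_12m_or_older = 90   # MCV1 doses given at 12 months or older
surviving_infants = 1000

timely_coverage = 100 * mcv1_under_12m / surviving_infants
delayed_coverage = 100 * mcv1_12m_or_older / surviving_infants

print(f"Timely MCV1 coverage:  {timely_coverage:.0f}%")                      # 72%
print(f"Delayed MCV1 coverage: {delayed_coverage:.0f}%")                     # 9%
print(f"Total MCV1 coverage:   {timely_coverage + delayed_coverage:.0f}%")   # 81%
```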

Unnecessary disaggregations
Sometimes the administered doses are further disaggregated, for example by sex of the child, by the
strategy used to vaccinate the child (i.e. fixed versus outreach), or by whether or not the child lives
within the catchment area of the health facility. These additional disaggregations are not
recommended, as there is scant evidence that reliable data disaggregated in these ways can be
collected or meaningfully used. Moreover, the recording and reporting workload doubles every time a
new level of disaggregation is introduced. The decision to further disaggregate immunization data
therefore needs to be weighed carefully against the benefit of the uses to which the collected data
will be put.


Those designing routine reporting forms should aim to limit the number of cells and rely upon
findings from household surveys to answer many questions more reliably. The analyst can help
promote streamlining of reporting forms by identifying problems with the completeness and
consistency of the separate data elements and bringing such findings to the attention of those
managing the health facility reporting system.

Denominators
For all vaccines in the first year of life other than birth doses (i.e. BCG, OPV 0, hepatitis B vaccine 0), the
target population for calculating coverage is the number of surviving infants (= live births minus infant
deaths). For most countries, official census-derived population estimates are the best source of
denominator data, because they are most likely established using rigorous statistical methods, and are
collected independently from the health system. The use of region-specific demographic estimates of
the crude birth rate and infant mortality rate is preferable to the use of national averages. Such
estimates are, however, not always available for the lowest administrative levels, and the use of uncertain or outdated
population estimates is a major source of error for coverage estimation, especially at the subnational
level. In some cases, local enumeration is used to estimate the size of the target population. Such an
approach can provide good operational targets for health workers at local levels. However, local
enumeration is not recommended for coverage monitoring at national level, as it lacks independence,
and it is likely that the same children that are missed for immunization are also missed by local
enumerations. Coverage in excess of 100% can either indicate an under-estimation of the denominator
(which, at the level of an individual health facility or district can be in part due to mobile populations
seeking services outside of their area of residence) or over-reporting of vaccines administered.
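
Where only a total population figure and demographic rates are available, one common approximation of
the surviving-infant denominator is population × crude birth rate × (1 − infant mortality rate). The
sketch below applies this approximation and flags coverage above 100%; all figures are invented.

```python
# Illustrative sketch: estimating the surviving-infant denominator and flagging
# implausible coverage.
district_population = 150_000
crude_birth_rate = 35        # live births per 1,000 population per year
infant_mortality_rate = 50   # infant deaths per 1,000 live births

live_births = district_population * crude_birth_rate / 1000
surviving_infants = live_births * (1 - infant_mortality_rate / 1000)

penta3_doses_reported = 5600
coverage = 100 * penta3_doses_reported / surviving_infants

print(f"Estimated surviving infants: {surviving_infants:.0f}")   # ~4,988
print(f"Penta-3 coverage: {coverage:.1f}%")                      # ~112.3%

if coverage > 100:
    print("Coverage above 100%: check the denominator and the reported doses.")
```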

Year-to-year fluctuations in the denominator


Abrupt year-to-year increases or decreases in coverage estimates are sometimes the result of
adjustments made to estimates of the target population when findings become available from a new
census. To reveal the influence of such denominator adjustments, instead of graphing coverage by
year (see graph 8), it is revealing to graph numerators by year (see graph 7) and denominators by year
(graph not shown).

When reliable denominator estimates are lacking, there are several options for monitoring
performance:
• Monitor trends in the absolute numbers of vaccinated children (see graph 7) rather than
coverage (graph 8);
• Monitor dropout rates (see the following graphs), as they can be calculated without
estimates of the target population;
• Assess nationwide and provincial/regional coverage with coverage evaluation surveys.

Time series (see graphs 6, 7, 8)


While time series most often serve to show trends in the performance of an immunization program, they can also be used to
assess the consistency of the data. Erratic fluctuations may suggest a problem with data quality or, as
just discussed, adjustments to denominator estimates. For this reason, it is useful to review the trend
in doses administered (see graph 7) as well as the coverage estimates (graph 8).

Trends
Trends must be interpreted with great caution if reporting completeness has changed from year to
year. For example, if the national completeness of facility reporting was 65% in 2015 and rose to 85%
in 2016, no trend should be presented unless “imputations” can be made. Imputation involves making
assumptions about missing or invalid data. Such assumptions should be based upon a good
understanding, gained by conferring with local staff, of why the data are missing or invalid (Was
staffing or supply interrupted? Were staff not yet oriented to a new reporting system?) and


whether the affected facilities or affected months are similar to other facilities or other months for
which valid data were reported.

Comparison with survey estimates


Coverage surveys are often seen as the gold standard for coverage estimation. But while they can be
very useful, it is worth pointing out that they also have inherent limitations and risks:
• Delay in measurement: There is typically a lag of a couple of years between when
immunizations are provided and when survey estimates become available.
• Limited geographic disaggregation: Survey estimates are not commonly available at the
district level, and the estimates are more uncertain even at provincial/regional levels,
given the smaller sample sizes.
• Quality: Not all surveys are conducted with the same quality and adherence to strict
sampling and interview protocols. It is also not always easy to establish the immunization
status of a child, especially if no vaccination card is available, or if the card is incompletely
or incorrectly filled out and the parent does not remember all the vaccines that the child
might have received. DHS and MICS surveys report the confidence intervals of survey
estimates of immunization coverage. These confidence intervals estimate the effect of
sampling error, which declines with sample size. However, surveys are also subject to
non-sampling errors such as those due to selection bias or recall bias. The direction and
size of the non-sampling error of a survey can seldom be estimated, although it can well
exceed the sampling error.


DROPOUTS – CHILDREN WHO START BUT DON’T COMPLETE THE


IMMUNIZATION SCHEDULE

Purpose

The dropout rate between an early and a later dose, for example between DTP1 and DTP3, is defined as
the percentage of children who started the course but did not complete it. Low dropout rates are
indicative of good utilization and therefore of good service quality. The MCV1 to MCV2 dropout rate
assesses the ability of the program to vaccinate beyond the first year of life.

Analysis
Monitor “drop out”

10) A table presents year-to-year trends in dropout rates.

11) The same dropout trends are presented graphically.

12) A table presents dropout rates for the last 12 months, by region.

13) The same dropout rates are presented graphically.


14) Classification of regions based upon Key Performance Indicators: DPT1 coverage and the DPT1 to
DPT3 dropout rate

Considerations/issues for interpretation

Classification of districts/provinces based upon Key Performance Indicators


Figure 14 shows how districts or geographic regions can be classified based upon two Key Performance
Indicators: DPT1 coverage (<90%; ≥90%; > 100%) and DPT1 to DPT3 dropout rate (<10%; ≥10%;
negative).
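
A minimal sketch of this classification logic, applying the cut-offs from figure 14 to invented
district data:

```python
# Illustrative sketch: classify districts by DPT1 coverage and DPT1-to-DPT3 dropout.
districts = [
    # (name, DPT1 doses, DPT3 doses, target population)
    ("District A", 980, 920, 1000),
    ("District B", 700, 500, 1000),
    ("District C", 1150, 1100, 1000),
    ("District D", 850, 870, 1000),
]

for name, dpt1, dpt3, target in districts:
    coverage = 100 * dpt1 / target
    dropout = 100 * (dpt1 - dpt3) / dpt1

    if coverage > 100:
        coverage_class = "DPT1 > 100% (check denominator / reporting)"
    elif coverage >= 90:
        coverage_class = "DPT1 >= 90%"
    else:
        coverage_class = "DPT1 < 90%"

    if dropout < 0:
        dropout_class = "negative dropout (data quality problem)"
    elif dropout < 10:
        dropout_class = "dropout < 10%"
    else:
        dropout_class = "dropout >= 10%"

    print(f"{name}: coverage {coverage:.0f}% ({coverage_class}); "
          f"dropout {dropout:.1f}% ({dropout_class})")
```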


MANAGEMENT OF VACCINE STOCKS

Purpose

Effective supply and cold storage of vaccines is essential for good coverage and efficient use of
expensive vaccines. This sub-section covers only the small subset of stock management indicators for
which aggregated data can be reported monthly by each health facility.

Analysis
Monitor availability of vaccines and injection supplies

15) Percentage of health facilities with “full availability” during the month, by month of the last 12
months, by geographic region
% of HF in the district with full availability for the reporting month
Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec
District A 50% 50% 60% 70% 70% 80% 80%
District B 67% 67% 73% 67% 60% 67% 60%
District C 77% 85% 85% 77% 92% 92% 100%
District D 25% 38% 25% 38% 13% 25% 25%
District E 86% 57% 71% 71% 86% 57% 43%
District F 64% 71% 79% 71% 79% 86% 86%
District G 100% 93% 87% 93% 87% 93% 87%

Monitor vaccine wastage

16) Closed vial wastage, by vaccine, for this year and the last 5 years.

17) Closed vial wastage, by vaccine, for this year and the last 5 years.

18) Closed vial wastage rates for each vaccine, for each month of the last 12 months.


19) Open vial wastage rate, by vaccine, by region, for the last 12 months.

20) Open vial wastage rate, by vaccine, nationwide, for this year and the last 5 years.

Monitor cold chain equipment



21) Number of high temperature alarms and number of low temperature alarms, by month, by region, for
the last 12 months.

22) Number of high temperature alarms and number of low temperature alarms, by region, for 2016.

23) Number of functional refrigerators by month by region for the last 12 months


Considerations/issues for interpretation

Monitoring for full availability


Full availability means that all demand could be met. Data on unmet demand are hard to come by: how
do we know for sure that no children were turned away because of a stockout, or that some of them
received only half of the vaccines they were due for? It is not enough to know whether there was
stock at the end of the month; health staff must track availability continuously whenever
immunizations are demanded.

Monitoring closed vial wastage


Closed vial wastage is the percentage of doses of vaccine that were spoiled during a reporting period
due to expiry, heat exposure, freezing or breakage. It is a good performance indicator because it can
be avoided if the right management practices are in place. It is best practice to keep track of closed
vial wastage by reason code, which allows managers to identify specific shortcomings in the system.
For example, unacceptably high wastage because of heat exposure, as indicated by turned vaccine vial
monitors (VVMs), would point to deficient storage or transport conditions. However, it is hard to
obtain reliable wastage data by reason.

Monitoring open vial wastage


Open vial wastage = (doses used – doses administered) / (doses used). Doses administered is the number
of children reported to have received the vaccine. Doses used = starting balance + additional doses
supplied during the period – ending balance. If the number of children reported to have received the
vaccine during the period is greater than the number of doses used, then the wastage rate is negative.
This indicates a problem with data quality: either doses administered were over-reported or doses used
were under-reported. A minimal worked example follows below.
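
```python
# Illustrative sketch: open vial wastage with the negative-wastage data quality
# check described above. All stock figures are invented.
starting_balance = 500       # doses in stock at the start of the period
doses_supplied = 1000        # additional doses received during the period
ending_balance = 300         # doses in stock at the end of the period
doses_administered = 1050    # children reported to have received the vaccine

doses_used = starting_balance + doses_supplied - ending_balance
open_vial_wastage = 100 * (doses_used - doses_administered) / doses_used

print(f"Doses used: {doses_used}")                       # 1200
print(f"Open vial wastage: {open_vial_wastage:.1f}%")    # 12.5%

if open_vial_wastage < 0:
    print("Negative wastage: doses administered were over-reported "
          "or doses used were under-reported.")
```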

Monitoring vaccine temperature alarms


If digital temperature recorders are being used, then the number of high alarms and the number of low
alarms generated by these devices over the period provide a good indication of the working status of
cold chain equipment and the conditions in which vaccines are being kept.

Monitoring the number of functional refrigerators


The number of functional refrigerators provides a crude indication of cold chain capacity.


ADVERSE EVENTS FOLLOWING IMMUNIZATIONS

Purpose

AEFI should be monitored in two different ways:
• Aggregate reporting systems should be set up to monitor the total number of AEFI recorded in any
health facility or district. These should be classified as either mild or severe AEFI.
• Case reporting forms should be filled out for all severe cases, and each of these cases should be
properly investigated, including an analysis of whether the AEFI was a direct result of the vaccination
or not. Serious cases are defined as those severe cases that involved hospitalization, disability or
death, and the investigation of a serious case might lead to the withdrawal of a vaccine batch from
the market.

Analysis
Monitor AEFIs

24) Non-serious and serious AEFIs, by region, by year, for this year and the last 5 years.

25) Non-serious and serious AEFIs, by region, for 2016.

26) Non-serious and serious AEFIs, by region, by month, for the last 12 months.

Considerations/issues for interpretation

Sensitivity of AEFI surveillance


If fewer than 10 AEFIs are reported per year per 100,000 surviving infants, this could indicate that AEFI
surveillance is not sufficiently sensitive.
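
A minimal sketch of this sensitivity check, with invented figures:

```python
# Illustrative sketch: AEFI reporting rate per 100,000 surviving infants compared
# against the threshold of 10 described above.
aefi_reported_per_year = 45
surviving_infants = 800_000

reporting_rate = aefi_reported_per_year / surviving_infants * 100_000
print(f"AEFI reporting rate: {reporting_rate:.1f} per 100,000 surviving infants")  # 5.6

if reporting_rate < 10:
    print("Below 10 per 100,000: AEFI surveillance may not be sufficiently sensitive.")
```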


IMMUNIZATION SESSIONS

Purpose

This indicator enables supervisors to follow up on the implementation of microplans. Reporting the
number of sessions held versus the number planned might give an indication of planning, operational or
budget issues at the reporting facility.

Analysis
Monitor completion of planned immunization sessions

27) Completion of planned sessions


COMPLETENESS OF REPORTING

Purpose

Complete reporting is required if all routine immunization data are to be compiled and analyzed.
Understanding of trends in completeness of reporting is especially important when interpreting trends
in coverage.

Analysis

28) Completeness 1

29) Completeness 2

30) Completeness 3

Considerations/issues for interpretation

Assess facility reporting completeness


District level reporting may be complete, but do these district reports contain the reported doses from
all their facilities?

Disaggregation of reporting completeness


Examine regional differences in completeness before analyzing regional disparities in immunization
coverage.

