
Welcome to Design, Monitoring & Evaluation Orientation and Training

January 26, 2008


DM&E: Training Objectives

• Learn principles of Design, Monitoring and Evaluation
• Learn basic terminology related to DM&E
• Understand standard DM&E tools – log frame, indicator plan, work plan, causal and logical diagrams
• Recognize different types of indicators and their purpose
• Relate these principles and tools to individual programs
• Improve capacity! (pre- and post-test)

2
The Design, Monitoring & Evaluation Cycle
is Common in our Field

[Diagram: the Program Life Cycle (Needs Assessment → Ongoing Program → Project Wrap-Up), paired with logical articulation of design at the start, ongoing, regular monitoring throughout, and an appropriate evaluation at the end, with lessons learned feeding back into learning and sharing.]

We need to apply this cycle and its elements not just to programs, but
to our work in general – keep learning from our experiences.
3
Elements of Program Design

4
Mercy Corps Uses 3 Primary Tools to
Express Program Design

• Logical Framework – Summary of project logic and indicators
• Work Plan – Schedule of project activities, targets, and resource allocation
• Indicator Plan – A plan detailing what and how we measure

These items are usually part of the proposal process and are used to manage a program. (A minimal data-structure sketch of a log frame follows this slide.)

5
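For readers who find it easier to see the hierarchy spelled out, here is a minimal Python sketch of how the pieces of a log frame nest: one goal, with objectives that each carry their key outputs, major activities, and indicators. This is purely illustrative and not a Mercy Corps template; the class and field names are assumptions, and the example values reuse the Indonesia goal and fishing-sector objective that appear later in this deck.

from dataclasses import dataclass, field
from typing import List

@dataclass
class Objective:
    statement: str                                          # SMART change in knowledge, attitudes, or behavior
    key_outputs: List[str] = field(default_factory=list)    # deliverables that must exist
    major_activities: List[str] = field(default_factory=list)
    indicators: List[str] = field(default_factory=list)     # what we will measure

@dataclass
class LogFrame:
    goal: str                                               # the impact we want to achieve
    objectives: List[Objective] = field(default_factory=list)

# Hypothetical example, reusing the goal and objective from later in this deck
frame = LogFrame(
    goal="Revitalized coastal communities in 8 districts in Indonesia",
    objectives=[
        Objective(
            statement="Members of the fishing sector are back at work by December 31st, 2006",
            indicators=["% of fishing sector back at work"],
        )
    ],
)
print(frame.goal, "-", len(frame.objectives), "objective(s)")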
We Also Use 2 More Tools to Express
Program Design

• Causal Diagram – A sketch of the main assumed project relationships
• Logical Diagram – A structured or hierarchical representation of project causal relationships

These tools are relatively new to Mercy Corps, but they are an important part of the process, and we hope to incorporate them into all new programs.

6
Program Design: Create a Goal

• A Goal is a simple, clear statement of the impact we want to achieve with our project. It is the change we hope to bring about in the community. It should answer the question:
  – What are our target communities like if we are successful?
• What are your programs' goals?

7
Program Design: Create a Goal
Entering the Goal into the Log Frame

• The Goal goes at the top of our log frame, but may be changed as we design our program.

GOAL: Revitalized coastal communities in 8 districts in Indonesia
OBJECTIVES (quantifiable statements of the effects on people's knowledge, attitudes, and behaviors) | KEY OUTPUTS | MAJOR ACTIVITIES | INDICATORS

8
Program Design: Identifying Objectives

• Objectives are the major changes in a population's knowledge, attitudes, or behavior that will make our goal a reality.
  – Where the goal is the vision of the future, our objectives are the effects that create the change we imagine.
  – Objectives should logically and reliably contribute to our goal.
  – Objectives should completely represent the scope and strategy of our work.

9
Program Design: SMART Objectives
What makes an Objective SMART?

• SMART Objectives state the results the project needs and help to identify exactly how the project will reach its goal. SMART Objectives also help in the creation of reliable indicators for good monitoring and evaluation. SMART means:
  – Specific
  – Measurable
  – Achievable
  – Relevant
  – Time-bound (have a clear beginning and end)

10
Program Design: SMART Objectives

• Realistically, objectives aren't always SMART, but it's good to keep the SMART criteria in mind. If objectives are well articulated, they already contain their indicators.
• Note also that EC log frames use different terminology: Overall Purpose (the big goal), Specific Purpose (something between an objective and a goal), and Expected Results. In EC log frames, think of Expected Results as our SMART Objectives.

11
Program Design: Relationship Between
Objectives and Goals

The Goal or Result is an IMPACT on WELL-BEING. The Objective or Intermediary Result is an EFFECT on CAPACITY.

Goal – Result – Impact | Objective – Intermediary Result – Effect
• Increased literacy | • Improved access to education
• Reduced incidence of diarrhea | • Improved access to water
• Improved nutritional status | • Improved access to food
• Reduced mother and infant mortality | • Changed maternal/child health behavior
• Increased farmer income | • Increased productivity of land
• Increased income from business | • Improved access to credit
• Improved local governance | • Increased capacity to solve problems; improved capacity to deliver justice; improved rule of law

12
Program Design: Outputs

• Project outputs are the products, sometimes called "goods and services", that must exist in order to achieve our objective.
  – Usually the result of activities
  – Think of them as "deliverables"
  – A critical assumption links the things Mercy Corps can provide (Outputs) to the changes we hope to see in our target communities (Objectives). The examples below highlight these assumptions.

Output (good or service) | Objective (effect)
Trainings conducted | Participants change behavior
Schools rebuilt | Children resume classes
School biscuits provided | More children attend school

(Each output-to-objective link rests on an assumption.)

13
Program Design: Formulate Outputs
Identify Outputs for the Log Frame

GOAL: Improved natural resource management in Northern Iraq.

OBJECTIVES (quantifiable statements of the effects on people's knowledge, attitudes, and behaviors) | KEY OUTPUTS | MAJOR ACTIVITIES | INDICATORS

OBJECTIVE 1: 50% of farmers adopt new sustainable farming practices by December 2008. (Key outputs to be identified; see the next slide.)

14
Program Design: Formulate Outputs
Example of Log Frame with Outputs

GOAL: Improved natural resource management in Northern Iraq.

OBJECTIVES (quantifiable statements of the effects on people's knowledge, attitudes, and behaviors) | KEY OUTPUTS | MAJOR ACTIVITIES | INDICATORS

OBJECTIVE 1: 50% of farmers adopt new sustainable farming practices by December 2008.
KEY OUTPUTS:
1. Farmers trained in sustainable farming practices
2. Seedlings provided for planting
3. Advocacy meetings with government held on policy development
4. Radio messages developed to promote new practices

15
Program Design: Major Activities

• Activities are the daily efforts that result in the desired outputs and the subsequent objectives and goals.
  – Focus on major activities for the log frame – weigh management needs against log frame content
  – CAUTION: Don't let the activities lead (determine) the log frame process

16
Log Frame Puzzle

• Instructions
– Teams will receive pieces of Log Frame Puzzle
– Goal is to fit them together to make a completed Log
Frame
– 20 minutes to complete activity

• Hints
– Think carefully both about the logic of the log frame
and the characteristics of outputs
– May want to draw a chart to house the puzzle pieces

17
Log Frame Puzzle “Answer” Key

GOAL: All boys and girls have strong basic education opportunities in our target rural communities.

OBJECTIVE 1: Increase overall attendance by 25% and girls' attendance by 40%.
KEY OUTPUTS:
1. 3000 children receive a nutritional supplement when they come to school
2. 3000 children receive water during the school day
3. School attendance incentives are publicized in communities
MAJOR ACTIVITIES:
1. Source nutritional supplements
2. Distribute nutritional supplements
3. Conduct assessment for local water sources
4. Provide water by donkey where wells are not possible
5. Visit communities to inform them about the incentive program
6. Write promotional material for the school nutrition and water program
7. Distribute nutrition program information in appropriate ways to communities

OBJECTIVE 2: Teachers deliver high-quality instruction at every school in our target communities.
KEY OUTPUTS:
1. 300 teachers receive English and math training
2. Teaching handbooks are delivered to all teachers in our communities
3. Teachers complete a basic competency test
MAJOR ACTIVITIES:
1. Interview and hire teacher trainers
2. Coordinate training with the Ministry of Education
3. Create curriculum for teaching handbooks
4. Create a basic competency test for teachers

OBJECTIVE 3: Each school's PTA is capable of sustaining educational efforts.
KEY OUTPUTS:
1. Educational priorities established through parent/teacher community meetings
2. 50 capacity-building training sessions are held for PTAs
MAJOR ACTIVITIES:
1. Conduct assessment to understand the best local media channel
2. Create messages about local educational issues to be broadcast/disseminated
3. Meet with the Ministry of Education
4. Assess best practices among existing model PTAs
5. Engage other NGOs on local capacity-building initiatives
6. Collect stories of "positive deviance" from communities with high-end educational programs

18
Design Process: Select Indicators

• Indicators are measures that allow us to verify our program's progress.
• Indicators are often confused with targets (also called benchmarks or milestones by various organizations). What's the difference?
  – Indicators tell us what we are going to measure
  – Targets quantify those measures and set goals or benchmarks for them – how much or how many, by what time
• Indicators can also be confused with SMART objectives, which often already tell you what the indicator is
(A short sketch of the indicator/target distinction follows this slide.)

19
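To make the indicator/target distinction concrete, below is a minimal, hypothetical Python sketch: the indicator names what we measure, the target says how much and by when. The class and field names are illustrative assumptions, not a standard format; the example values reuse the sustainable farming objective from the Northern Iraq log frame earlier in this deck.

from dataclasses import dataclass

@dataclass
class Indicator:
    name: str          # WHAT we measure, e.g. a percentage or a count

@dataclass
class Target:
    indicator: Indicator
    value: float       # HOW MUCH / HOW MANY we hope the indicator reaches
    deadline: str      # BY WHEN

adoption = Indicator(name="% of farmers adopting new sustainable farming practices")
target = Target(indicator=adoption, value=50.0, deadline="December 2008")
print(f"Measure: {adoption.name}; target: {target.value:g}% by {target.deadline}")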
Design Process: Types of Indicators

• Process indicators – measure the process or procedure of our work: the output indicators
• Results indicators – measure the change occurring as a result of our activities, moving us toward the achievement of our objective
  – Effect indicators – immediate changes in behavior, skills, or knowledge
  – Impact indicators – what those changes mean for people's well-being
(A short sketch of tagging indicators by type follows this slide.)

20
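As a small, hypothetical illustration of these categories, the sketch below tags example indicators with the types named on this slide; the enum and the specific tagging are assumptions for illustration only, reusing indicators that appear elsewhere in this deck.

from enum import Enum

class IndicatorType(Enum):
    PROCESS = "process"   # measures our outputs and how the work is done
    EFFECT = "effect"     # immediate changes in behavior, skills, knowledge
    IMPACT = "impact"     # what those changes mean for well-being

# Hypothetical tagging of two indicators drawn from this deck's examples
indicators = {
    "# of farmers trained in sustainable farming practices": IndicatorType.PROCESS,
    "% of fishing sector back at work": IndicatorType.IMPACT,
}
for name, kind in indicators.items():
    print(f"{kind.value:>7}: {name}")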
Program Design: Indicators in the Log Frame
How do we account for different types?

GOAL: Revitalized coastal communities in 8 districts in Indonesia.

OBJECTIVES (quantifiable statements of the effects on people's knowledge, attitudes, and behaviors) | KEY OUTPUT INDICATORS | MAJOR ACTIVITIES | EFFECT-LEVEL INDICATORS

OBJECTIVE 1: Members of the fishing sector (fishermen, fish mongers, and fish exporters) are back at work by December 31st, 2006.
KEY OUTPUT INDICATORS: Outputs are basically process indicators and still need to be measured, so let's just call them that.
MAJOR ACTIVITIES: List the activities here.
EFFECT-LEVEL INDICATORS: When we talk about indicators here, let's use the ones that tell us more about the effects of our activities and outputs, and which lead to the impact.

Impact indicator: % of the fishing sector back at work.

21
Program Design Process: Indicators
Baseline Data Collection

• What is a Baseline?
  – A measure of our indicators at the beginning of the program or project
  – Should apply to the indicators we have identified to measure our program
  – Necessary to show improvement over time
  – Very difficult to show results in an evaluation without baseline data
  – How is baseline data different from an assessment? (An assessment is generalized to help identify needs; it is not based on specific indicators.)

• What baselines have you participated in?
(A small arithmetic sketch of why baselines matter follows this slide.)

22
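To show in simple arithmetic why a baseline matters, here is a minimal sketch with invented figures for the "% of fishing sector back at work" indicator: the endline number alone cannot tell us how much changed.

# Hypothetical figures for the indicator "% of fishing sector back at work"
baseline = 20.0   # measured at program start
endline = 65.0    # measured at the evaluation

absolute_change = endline - baseline               # 45 percentage points
relative_change = (endline - baseline) / baseline  # 2.25, i.e. a 225% increase over baseline

print(f"Change: +{absolute_change:g} percentage points "
      f"({relative_change:.0%} relative to baseline)")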
Program Design: Review

Identify each of the following as either a goal, objective, activity, process indicator (output), or result indicator (effect or impact):
– "Afghan farmers with improved livelihoods"
– "provide training on production improvements to farmers"
– "number of farmers receiving training"
– "increased livestock production among Helmand farmers by December 2009"
– "% of mothers obtain improved prenatal care"
– "% increase in production of wheat"
– "increase in sales among wheat producers in Helmand"
– "rebuild irrigation canal for community x"
– "improved test scores among agricultural students"

23
Discussion of Program Monitoring

24
Program Monitoring

• What is monitoring?
  – Routine data collection, reflection, and reporting about project implementation, carried out on a regular basis. Monitoring is generally used to check our performance against expected results or "targets", as well as to ensure compliance with donor regulations.
• What information should we actually monitor?
  – Indicators for the objectives – process and results
  – Activities and outputs against targets set forth in the Work Plan (see the sketch after this slide)
  – Quality/technical standards
  – What else?

25
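As a hypothetical illustration of checking activities and outputs against targets, the sketch below compares monitored values to their targets and flags anything that is behind. The targets reuse figures from the Log Frame Puzzle answer key; the actuals and the 80% threshold are invented for illustration.

# Hypothetical monitoring snapshot: (actual so far, target) per output indicator
monitoring = {
    "children receiving a nutritional supplement": (2100, 3000),
    "teachers trained in English and math": (300, 300),
    "PTA capacity-building sessions held": (20, 50),
}

for indicator, (actual, target) in monitoring.items():
    progress = actual / target
    status = "on track" if progress >= 0.8 else "BEHIND"  # 80% threshold is arbitrary
    print(f"{indicator}: {actual}/{target} ({progress:.0%}) - {status}")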
Program Monitoring

How do these items get monitored, and by whom?
– Indicators for the objectives – process and results
  • Indicator Plan – designed by the PM with DM&E help
– Activities set forth in the Work Plan
  • Program management – tracked by the program team through regular meetings
– Quality/technical standards
  • We suggest developing monitoring checklists or forms to serve this purpose – we can help, but program staff have to design and carry them out
– Financial and compliance
  • Documentation and program files

26
Program Monitoring:
Using an Indicator Plan

• The Indicator Plan describes the way we will measure our indicators and should be the first step in developing a monitoring system.
  – Especially useful for complex indicators, like "capacity building".

Indicator | Definition and Targets | Baseline Data | Data Collection Sources & Methods | Frequency of Data Collection | People Responsible
WHAT will be monitored | | | HOW it will be monitored | How often will it be monitored? | WHO will monitor

(A sketch of one indicator plan row as a record follows this slide.)
27
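Below is a minimal sketch of one indicator plan row as a record, mirroring the columns above. The class name, field names, and example content are hypothetical, drawing on the sustainable farming objective used earlier in this deck rather than any real program's plan.

from dataclasses import dataclass

@dataclass
class IndicatorPlanRow:
    indicator: str                # WHAT will be monitored
    definition_and_targets: str
    baseline_data: str
    sources_and_methods: str      # HOW it will be monitored
    frequency: str                # how often it will be monitored
    people_responsible: str       # WHO will monitor

# Hypothetical example row
row = IndicatorPlanRow(
    indicator="% of farmers adopting new sustainable farming practices",
    definition_and_targets="Adoption means regular use of promoted practices; target 50% by December 2008",
    baseline_data="Household survey at program start",
    sources_and_methods="Annual household survey; field observation checklist",
    frequency="Annually, with quarterly field checks",
    people_responsible="Program Manager, with DM&E support",
)
print(row.indicator, "-", row.frequency)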
Program Monitoring:
Frequent Monitoring Challenges

• Several challenges in regular monitoring are:
  – Management challenges in finding the time and resources to conduct good monitoring
  – Lack of clarity concerning what to monitor, how, and who is responsible
  – Mastering data collection techniques
  – Ensuring data and information get to the people who need it for decision-making
  – Multiple reporting formats
  – Others?

28
Discussion of Program Evaluation

29
Evaluation Introduction

• What is an evaluation?
– A periodic review of a project (or program) that is
intended to measure project outcomes and/or impact.
This process serves the twin aims of 1) telling us
how we did and 2) learning how to make programs
better.
• Why do we conduct evaluations?
– Strategic Learning (should be primary motivation!)
– Management request
– Donor Request

30
Program Evaluation:
Understanding the Evaluation

When do we evaluate?

MID-TERM
Purpose: To assess performance and determine next steps.
Benefits: Allows us to refine our approach for better performance or impact.
Costs: Takes time away from implementation; not always possible for short-term projects; takes funds that could be spent on other activities, including monitoring or the final evaluation.

FINAL
Purpose: Documents our experience.
Benefits: Occurs at the end of the project, when impact should be more obvious; allows us to capture our impact and/or lessons learned.
Costs: Information comes too late.

EX-POST
Purpose: Documents our experience one or more years after the end of the program.
Benefits: Helps us capture sustainability; helps us understand long-term impact.
Costs: Information comes too late to improve the project; difficult to conduct in a participatory way if staff are gone; no funding.

31
Program Evaluation:
Understanding the “Science” of Evaluation

What are some evaluation tools?
– Surveys
– Focus groups
– Review of baseline and monitoring data (surveys and capacity indices)
– Observations
– Participatory evaluation
– Case studies
– Document analysis
– Structured interviews
– Semi-structured interviews
– Field visits and tours: transect walks
– Maps

32
DM&E: Review Objectives

• Learn principles of Design, Monitoring and Evaluation
• Learn basic terminology related to DM&E
• Understand standard DM&E tools – log frame, indicator plan, work plan, causal and logical diagrams
• Recognize different types of indicators and their purpose
• Relate these principles and tools to individual programs
• Improve capacity! (pre- and post-test)

33
Thank You For Your Time!

• Thanks so much for your participation and attention!

34
