Step 04
Collecting Evidence

In this step, you will collect stories and other data to answer your process and outcome evaluation questions.

Now it’s time to collect stories and other types of data to answer your process and outcome evaluation questions, but you are probably wondering – how do I do this? Well, we are here to help! This section is broken down into two parts – process evaluation and outcome evaluation. In this step, you will learn how to describe your program’s services, activities, and policies, and gain a better sense of how you will measure your process outcomes. You will also lay the groundwork for conducting your outcome evaluation, which assesses the effects of your program’s chosen interventions.

You will collect quantitative and qualitative data to understand how your program has been implemented (process evaluation) as well as the changes that youth participants are experiencing (outcome evaluation). Both types of evaluation are important because documenting implementation is directly relevant to understanding outcomes. Without knowing exactly what program components have been implemented or to what degree they are implemented, you cannot make use of outcome evaluation results to understand what worked or didn’t work and why. Let’s begin!

Is your program being implemented as intended?
What kinds of stories and data can tell the story of your program’s impact?


Show and tell time! By the end of this step, you’ll have evidence (data + stories) about your program, including how youth experience it and any changes that have resulted for youth.


A process evaluation assesses a program’s approach to service delivery by looking at the day-to-day operations of a program, and typically assesses the following program components:

1. How are services delivered to youth? (e.g., the frequency and intensity of what staff do; the quality and consistency of programming)

2. What administrative mechanisms are in place to support these services? (e.g., program monitoring and documentation, staff qualifications, staff supervision, staff training, hours of availability, and support services)

3. How satisfied are participants with the program overall? (e.g., Did youth enjoy the program? Were the location and timing of the program accessible?)

4. How do external factors influence program delivery? (e.g., funding)

5. Were there any barriers or common problems to program implementation? If so, how were these barriers addressed?

Check out the Q&A section for more on Process Evaluation.


Outcome evaluation assesses the effectiveness of your program in producing change in your youth participants. As noted in the logic model, outcomes can be broken down into short-term (6-12 months; immediate changes in knowledge, awareness, attitudes, skills), medium-term (1-2 years; changes in behaviour), and long-term outcomes (3-5 years; changes in the broader community or population).

But how do you measure the specific youth outcomes that you are interested in? Let’s find out in the Key Actions section!

The Stepping Up: A Strategic Framework to Help Ontario’s Youth Succeed outlines a common vision, seven significant themes, 20 outcomes and a beginning set of indicators framed by seven guiding principles that emphasize a Positive Youth Development (PYD) perspective within a social justice framework. Learn more about the Stepping Up outcomes here.

Key Actions


Formulate Your Process Evaluation Questions

Begin to think critically about your program and come up with questions that can help you better determine how your program is functioning. Here are some questions that can assist you with this process:

  • What is the program’s background?
  • What is the profile of the program’s participants?
  • What is the program’s staff profile?
  • What is the amount of service provided to participants?
  • What are the program’s interventions and activities?
  • What administrative supports are in place?
  • How satisfied are the program’s stakeholders?
  • How efficient is the program?


Select Good Indicators for Your Outcome Evaluation Questions

My-Peer suggests some simple questions to ask yourself when you are deciding what process and outcome indicators you will select for your evaluation:

  • How will we know that the program is having a positive effect on participants?
  • What changes would we expect to see in young people as a result of participating in the program?
  • What features of the program are important in achieving intended effects and therefore need to be monitored?
  • What might indicate that the program is not having its intended effects?

The selection of indicators can be difficult for people who do not have experience with research design, methodology, and terminology.

For example, you may have an idea of what you want to measure (e.g., the degree to which young justice-involved males gain confidence in their ability to exert control over their lives), but you may not know the specific term for the indicator (in this example, the term in the psychological literature is ‘self-efficacy’), or whether a measure of it already exists (yes, there are a few). It is important to investigate whether there are existing measures or tools that you can use, as this can save time and improve the quality of your evaluation. Further, if you do find a measure, you may not know whether it is a good measure (i.e., valid and reliable), or whether it is appropriate for youth populations. This is okay. There are lots of resources available to help you select good indicators for your youth participants.

Essentially, good indicators are S.M.A.R.T. indicators – that is, they are Specific, Measurable, Attainable, Relevant, and Trackable. They can be quantitative/numeric (e.g., the number of youth participants who complete high school or scores on a measure of self-esteem) or qualitative/non-numeric (e.g., responses to an interview question about perceptions of community safety).

After you select your outcome indicators, you can begin to evaluate which data collection processes are most suitable for your organizational and evaluation needs. Depending on available resources and the type of questions you would like to answer, some methodologies may be preferable to others.

When selecting your data collection method, remember to consider:

  • The purpose of your evaluation: Will the method allow you to gather information that can be analyzed and presented in a way that provides enough information to answer your evaluation questions?
  • Participant characteristics: Take factors such as respondent availability, access, literacy levels, and cultural context into consideration when choosing the most appropriate method
  • Available resources: Be sure that you have adequate resources (e.g., time, money, and staff to design, implement, and analyze the information) available to support your methods
  • Type of information you need: Examples include numbers, percentages, comparisons, stories, examples
  • Interruption to participants: Which method would be the least intrusive?


Choosing your Evaluation Methodology

Design: Non-experimental, quasi-experimental and experimental designs

Most program evaluation designs in the grassroots youth sector are non-experimental/descriptive in nature. The type of design that you choose determines the claims that you can make about your research findings. For example, the only way that you can make a causal claim that Program A affected youth in outcomes X, Y, Z is if you use an experimental research design where participants are randomly assigned to either a treatment condition or a control group. If your design does not include random assignment to conditions, then you cannot make any claims about causality. This is important to remember when describing your findings. Non-experimental and quasi-experimental designs are very common in grassroots program evaluations, and especially appropriate because random assignment and control groups are usually not feasible.
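To make random assignment concrete, here is a minimal sketch in Python (the participant names are hypothetical) of randomly splitting enrollees into a treatment group that receives the program and a control group that does not:

```python
import random

# Hypothetical list of enrolled participants.
participants = ["Amal", "Ben", "Chloe", "Dev", "Esi", "Farah"]

# Shuffle so that group membership is determined by chance alone,
# not by who signed up first or who staff thought would benefit most.
random.shuffle(participants)

# The first half receives the program (treatment); the second half
# serves as the control group.
midpoint = len(participants) // 2
treatment_group = participants[:midpoint]
control_group = participants[midpoint:]
```

Because chance alone decides who lands in each group, any later difference in outcomes between the groups can more credibly be attributed to the program itself.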

See here for a discussion of quasi-experimental and experimental research design
See here for some more info on research design


When designing your evaluation methodology, you have some choices. The type of evaluation design and data collection methods that you select will depend on a few things:

1. The program timeline: When will the program begin? Is the program already up and running? Has the program finished?
2. How many youth are involved in the program: Are there fewer than 10 youth in the program? Or are there 50?
3. The type of programming: Is the program resource-based, like a drop-in clinic? Or does the program have a clear timeline with defined enrolment?

Let’s keep these questions in mind as we discuss different ways that we can collect data to measure process and outcome evaluation questions.

Here are some examples of sources for evaluation data:

Document Review

  • Intake forms, activity reports, progress reports
  • Contact logs
  • Meeting minutes
  • Surveys/interviews with participants, clients, or staff

Quantitative/Numeric Data Sources

Quantitative data is numerical (e.g., counts, ratings, scale scores) and is often collected through surveys. Common survey timing designs include:

  • Pre-interim-post or pre-post
  • Post-only
  • Retrospective post-then-pre
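As a rough illustration of the pre-post design, the sketch below (in Python, using hypothetical scores on a 1-5 self-esteem measure) compares each participant’s score before and after the program:

```python
# Hypothetical pre- and post-program scores on a 1-5 self-esteem
# measure; the lists are paired, one entry per youth participant.
pre_scores = [2, 3, 2, 4, 3]
post_scores = [4, 4, 3, 4, 4]

# Per-participant change, then the average change across the group.
changes = [post - pre for pre, post in zip(pre_scores, post_scores)]
average_change = sum(changes) / len(changes)

# A positive average suggests scores rose over the program, though
# without a comparison group this alone does not prove the program
# caused the change.
print(average_change)  # 1.0
```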

Qualitative Data Sources 

Qualitative data is non-numerical and is especially useful for gathering rich, in-depth, descriptive data from a small sample. Some examples of qualitative data sources include:

  • Focus groups
  • In-depth interviews
  • Observations and field notes
  • Arts-based methods
  • Mixed methods

Click here for a PDF with more descriptions of these types of data.


Review Ethics

“Our evaluation will comply with the Tri-Council Policy Statement on the Ethical Conduct for Research Involving Humans. All our processes will also demonstrate cultural awareness, sensitivity, and inclusive practices that reflect on our own values.” – From YouthREX’s Evaluation Service Pledge

Six Key Ethical Principles

  1. Do no harm
  2. Voluntary Participation
  3. Informed Consent
  4. Parental/Guardian Consent for youth under 16 years
  5. Confidentiality
  6. Anonymity

Consider your agency’s interest in and capacity for conducting ethical evaluation with youth by asking these questions:

  • How will you explain the purpose of your evaluation to youth participants?
  • How will you involve youth?
    • Is there a mechanism for youth to contribute to the evaluation design and methods used?
    • Are there supports for youth during and after the evaluation research?

Hot Tips

What activities does your program already do?

The methods used to collect evidence to inform a process evaluation should integrate with existing program activities as much as possible. Program administrative records are an excellent source of process evaluation evidence. Keep this in mind for the next time you run your program – incorporating data collection into programming can be super helpful!

Timing is key.

When data is collected (once, at various points during the program, or continuously throughout the program) depends on your purpose and the method selected.

Try this!

Collect words for a Word Cloud by including this question in all your surveys: “At this point in time, what is one word that best describes _______”
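As a sketch of how those one-word answers could be tallied for a Word Cloud (the responses and function name here are hypothetical), Python’s `collections.Counter` can do the counting:

```python
from collections import Counter

def word_cloud_counts(responses):
    # Normalize case and strip stray whitespace and punctuation so
    # that "Fun", "fun " and "fun!" all count as the same word.
    cleaned = [r.strip().strip(".,!?").lower() for r in responses if r.strip()]
    return Counter(cleaned)

# Hypothetical one-word survey responses.
counts = word_cloud_counts(["Fun", "fun!", "Safe", "welcoming", "fun"])
print(counts["fun"])  # 3
```

Words with larger counts would then be rendered larger in the cloud.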

Choosing between collecting qualitative and quantitative data….

It is important to note that the statistical usefulness of quantitative data depends on sample size. If your sample is very small, it is better to collect qualitative data instead.

Top 6 things to remember when developing survey questions:

1. Avoid leading words/questions
2. Give mutually exclusive choices
3. Ask direct questions
4. Add a ‘prefer not to answer’ option
5. Try to cover all possible answer choices
6. Ask only one question at a time


How to increase response rates:

No Cost Strategies

  • Personalize the study and your data collection instrument
  • Ensure confidentiality
  • Use culturally relevant surveying and/or interviewing practices
  • Respect participants’ time (make their time commitment as small as possible)
  • Be flexible with your deadlines and when scheduling interviews/focus groups

Low Cost Strategies

  • Follow-up with participants
  • Use attention grabbers
  • Make participation easy

Moderate Cost Strategies

  • Offer incentives for participation
  • Translate all materials into participants’ first language

Meaningfully engage youth in your evaluation:

Remember – youth like to be engaged! Make evaluation meaningful for youth participants rather than an extra burden. Keep this in mind when you’re deciding on your data collection tools. For example, youth often find arts-based methods such as art jams engaging.


Check out this factsheet on Art Jams for an example of how to incorporate creative expression into evaluation!


Also, be sure to provide youth with as much information as possible about the evaluation and its rationale, so they can determine the merit of the evaluation and whether or not to participate:

  • Why? The purpose for the evaluation
  • How? What participants will be asked to do
  • When? The timelines


What is process (implementation) evaluation?

Also known as implementation evaluation, a process evaluation is designed to determine whether a program is being delivered as intended. It documents what happens (if activities are conducted as planned and according to schedule); the frequency and intensity of the activities; and the extent to which the participants were reached. Implementation evaluation requires close monitoring of implementation activities and processes. This type of information can be used to adjust activities throughout a program’s lifecycle.

Why do I need to evaluate my program’s process?

  1. To improve your program’s operations: Process evaluation will help you to know whether your program is being implemented as intended.
  2. To generate knowledge for the sector: Process evaluation allows a careful description of a program (the active ingredients) so that others can replicate it.
  3. To estimate cost efficiency: Process evaluation helps you gain a better sense of where costs can be reduced or resources better utilized to meet your program’s objectives.

What are the three purposes of process evaluation?

1. Program Monitoring ensures the program is on track and is usually a normal part of project management.

  • Includes tracking, documenting, and summarizing the inputs, activities, and outputs of a program: the number of staff or volunteers involved in delivering activities, the amount of money spent, the amount and type of activity, and the number and characteristics of people reached

2. Program Improvement provides information that answers WHY and HOW the program works or does not work, which can be used to inform program improvement.

  • Are we meeting the expectations we set out to achieve, and if not, why not?
  • How do participants feel about the program? What improvements have they suggested?

3. Accountability: Process evaluation generates the necessary data to justify expenditures of time and money to program stakeholders.

  • Clearly demonstrates how program inputs produce program outputs
  • Documents compliance with externally imposed standards or criteria established by funders (e.g., the number of youth participants required)

What are some ways that I can strengthen my evaluation design?

  • Adding points in time. Any one-time data collection is limited to that snapshot in time. You can strengthen your evaluation by collecting data at other times.
  • Combining multiple methods of data collection.
  • Using multiple sources of information.  
  • Using comparisons (people, groups, sites). Adding a comparison of one or more groups, individuals, or sites can strengthen your design. Comparison groups refer to groups that are not selected at random but are from the same population.

What should I consider when choosing a measure?

  • Good Validity: does it measure what it is intended to measure?
  • Good Reliability: does it measure consistently (i.e., produce similar results under similar conditions)?
  • Is the measure culturally appropriate?
  • Will participants be able to read and understand the questions?
  • Is the length appropriate to hold their attention?
  • Are any items potentially offensive to participants?

Tools / Templates / Checklists

Generating Process Evaluation Questions and Methods

This tool will help you review the outputs of your logic model and generate questions relevant to your program.

Source: YouthREX

Program Evaluation and Research Ethics Brief

Use this document to ensure you are using proper program evaluation and research ethics.

Source: James Bell Associates

Developing Process Evaluation Questions

Use this brief to help you guide your process evaluation.

Source: Centers for Disease Control and Prevention

Learn More...

Guide: W.K. Kellogg Foundation Logic Model Development Guide

by the Ontario Centre of Excellence for Child and Youth Mental Health


“Not everything that can be counted counts, and not everything that counts can be counted.”

Albert Einstein