PHASE TWO: ACTION
Step 05
Analyzing, Interpreting and Animating Evidence

In Step 5: Analyzing, Interpreting and Animating Evidence, you’ll make sense of your findings and turn that sense-making into learning that strengthens your program and improves the wellbeing of youth participants.

In this step, you will dive into all the evidence you’ve collected in your evaluation plan and identify themes in your qualitative data and patterns or trends in your quantitative data. You’ll interpret this evidence and make recommendations for action: What can be improved? What should stay the same?

This is also the step where you’ll start animating your evidence so that you can effectively visualize and communicate your findings to your stakeholders.

What is the data telling you about your program?
What’s working? What could be improved?
Are there themes, patterns or trends that you can identify from your evaluation evidence?
What recommendations for action can you make?

TAKEAWAY FROM STEP 05

Understand and learn about your program’s effects, especially on your youth participants.

Key Actions

01

Organize Your Data: Before you can get to the meaty part of analyzing and interpreting your data, it’s important to organize it in a clear and helpful way so that you can work effectively with all of it.

  1. Quantitative Data: This includes questionnaires, evaluation feedback forms, intake forms, administrative records etc.
    • Organize forms/questionnaires in one place
    • Check for completeness/accuracy – take out incompletes but keep a record of what has been removed
    • Assign a unique identifier to each form/questionnaire and to each question – these help you keep track of where the individual pieces of data came from
    • Enter your data and analyze manually or using software – Excel (spreadsheet), Survey Monkey (web program) and SPSS (statistical software) are examples of platforms you might use here; a small example of this kind of data entry and clean-up follows this list
  2. Qualitative Data: This includes interviews, focus groups, observations, case studies, testimonials etc.
    • Transcribe audio recordings, pull together answers to open-ended questions
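
If you’re entering your quantitative data into a spreadsheet, a few lines of code can handle the clean-up steps above. The sketch below is only an illustration using Python’s pandas library – the file name and column layout are made-up placeholders, not part of any prescribed template.

  import pandas as pd

  # A minimal sketch: survey answers typed into a CSV file (file name is a placeholder).
  responses = pd.read_csv("post_program_survey.csv")

  # Assign a unique identifier to each form so every answer can be traced back.
  responses["form_id"] = ["F{:03d}".format(i + 1) for i in range(len(responses))]

  # Set aside incomplete forms, but keep a record of what was removed.
  incomplete = responses[responses.isnull().any(axis=1)]
  complete = responses.dropna()
  incomplete.to_csv("removed_incomplete_forms.csv", index=False)
  complete.to_csv("cleaned_survey_data.csv", index=False)

  print(len(complete), "complete forms kept,", len(incomplete), "incomplete forms set aside")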

02

Data Analysis: As you gear up to do your analysis, review the purpose of your evaluation – how do you want to focus your analysis? You could focus it by time period, event, individual or group – these are just some commonly used options.

  1. Qualitative: Read transcripts of your data line by line and assign labels or codes to themes and ideas you find.
    • Highlight quotes that illustrate your themes well
    • Sort and assemble all your data by the themes you’ve pulled out
    • Show the relationships among categories or themes; attach quotes to each theme
    • What is really important and coming through loud and clear? What are particular groups of people saying and does that differ from what other people are saying? What have you learned that will help you with your evaluation question? What are some recommendations you can make, based on what you’ve learned?
  2. Quantitative: Perform calculations using the data you have collected.
    • Calculations include: count (frequencies), percentages, rankings, averages (mean), median, mode, standard deviation
    • Bivariate statistics – used to test whether the change from pre to post is significant; a short example follows this list
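
If your quantitative data is in a spreadsheet, most of the calculations above take only a line or two in a tool such as Python’s pandas, and a paired t-test is one common bivariate statistic for pre-to-post comparisons. The sketch below is only an illustration – the file and column names are placeholders you would swap for your own.

  import pandas as pd
  from scipy import stats

  # A minimal sketch: one row per participant, with made-up columns for a
  # 1–5 confidence rating before and after the program.
  df = pd.read_csv("cleaned_survey_data.csv")

  # Descriptive statistics: count, mean, median, mode, standard deviation.
  post = df["confidence_post"]
  print("Responses:", post.count())
  print("Mean:", post.mean(), "Median:", post.median())
  print("Mode:", post.mode().tolist(), "Std dev:", post.std())

  # Percentages: share of participants choosing each rating.
  print(post.value_counts(normalize=True) * 100)

  # Bivariate pre/post comparison: a paired t-test asks whether the average
  # change from pre to post is larger than you would expect by chance.
  t_stat, p_value = stats.ttest_rel(df["confidence_pre"], df["confidence_post"])
  print("t =", round(t_stat, 2), "p =", round(p_value, 3))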

03

Data Interpretation: This is when you get to examine findings from all of your sources – this is where science turns to art! View the data as a whole and see if you can use it to answer your evaluation question(s).

  • Did different groups show different results? Were there any a-ha moments? Is there anything you still don’t understand well that warrants further research? What are the limitations of your evaluation (the way it was designed or how evidence was collected)?
  • Involve others to make sure your data is interpreted fairly and is free from bias. Doing this ensures other stakeholders are involved in the process and that the data is contextualized within an understanding of the program.
  • Reflect: What did you learn? How can the program improve? What did you learn about participants? Did you discover anything new or unexpected?
  • Recommendations: Create a summary of what you’ve learned, positive or negative. Is the program working? What’s working well? What can the program do better? What is actionable? What can you do now? What can you integrate into an action plan?

04

Data Animation/Visualization: This helps you display your findings in ways that are easier for different audiences to understand. It’s not just about making things look pretty – it’s about helping your stakeholders understand your evaluation and want to take action on the findings!

  • Decide what story you want to tell
  • Choose the findings that help to tell this story
  • Think through what type of visual will help tell your story best
    • Traditional visuals: Charts (bar, line, scatterplot)
      • Keep these simple and clear to the eye, use font sizes to draw attention to what you want people to notice most, and take out any unnecessary clutter (extra lines, text etc.) – a small chart example follows this list
    • Non-Traditional/Creative: word clouds, diagrams, infographics, icons etc.
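
For a traditional chart, any charting tool will do – Excel, Google Sheets or a short script. As one illustration, here is a minimal sketch of a simple bar chart using Python’s matplotlib; the themes and counts are invented placeholders, and the point is the clean, uncluttered style described above.

  import matplotlib.pyplot as plt

  # Invented example data: how often each theme came up in participant feedback.
  themes = ["Leadership", "Confidence", "Belonging", "New skills"]
  mentions = [18, 14, 11, 7]

  fig, ax = plt.subplots()
  ax.bar(themes, mentions, color="steelblue")

  # Keep it clean: a clear title, readable labels and no extra clutter.
  ax.set_title("Most common themes in participant feedback", fontsize=14)
  ax.set_ylabel("Number of mentions")
  ax.spines["top"].set_visible(False)    # remove unnecessary lines
  ax.spines["right"].set_visible(False)

  plt.tight_layout()
  plt.savefig("theme_chart.png", dpi=150)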

Check out the Legacy Phase for more on data animation, as this section links closely to how you communicate your findings, internally and externally!

Hot Tips

Post your key evaluation questions somewhere visible so you can stay focused on how your analysis informs and answers those questions.

Recommendations are concrete statements, supported by evidence, about what should stay the same (what worked well) and what needs to be done to improve the program. Recommendations are only useful if they are feasible within the scope of the program.

Be sure to take good care of your data during data entry, storage and analysis. Scan and save any papers with data, make backups and save throughout!

Q&A

Do I need to have advanced math or statistics skills to do quantitative analysis?

Nope! You do need some basic math skills though. We’ve compiled some tips and instructions on how to do this type of analysis in the Learn More section to help you get a better idea of what it entails.

What happens if there is missing data?

If someone doesn’t answer a question, make sure you note the missing data when you’re organizing it – don’t leave an empty box! Noting that it’s missing means people will know it’s missing rather than assuming you just forgot to enter it. If you decide to take out that participant’s answers entirely, keep a record of where the incomplete data is and store it somewhere safe – just in case!
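
For example, if you track your data with a tool like Python’s pandas, you can mark a skipped question explicitly rather than leaving the cell blank – the values below are invented just to show the idea.

  import pandas as pd
  import numpy as np

  # Invented example: participant F002 skipped question 3, so the answer is
  # recorded as missing (NaN) instead of being left out silently.
  df = pd.DataFrame({"form_id": ["F001", "F002", "F003"],
                     "q3_rating": [4, np.nan, 5]})

  # Count and report missing answers per question before analyzing.
  print(df["q3_rating"].isna().sum(), "missing answer(s) for question 3")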

How can I involve more people in this process?

One great way is to have at least one other person independently code your data in order to have different perspectives and interpretations accounted for in your analysis. This is also a really good way to add rigor to your evaluation.

How should I approach coding my data and coming up with themes?

There are two main approaches. You can use themes you’ve already defined and look for those in your data, or you can start coding your data as you go through it and allow the themes to be emergent. A third approach is to mix these two! Perhaps you know you’ll be looking for themes of youth leadership and resilience, but be open to other themes that emerge as you go through your data. You might be surprised at what comes out!

Should I be worried about reporting any limitations of my evaluation?

Every evaluation has limitations, so don’t be afraid to report these. If you’ve gone through the steps of intentionally and meaningfully planning and executing your evaluation, your findings will reflect that. Reporting on limitations can actually make the claims you do make that much stronger!

Tools / Templates / Checklists

Coming Soon!

Learn More...

Webinar: Basic Data Analysis

Report: Data and Design

By Trina Chiasson

Factsheet: Using Excel for Analyzing Survey Questionnaires


“The data you choose to illustrate should set the context, establish the main points of interest, and explain how these are interconnected. Be intentional in what you present, but do not censor data to further your argument. Your visual story should be based on what the data – and not only what you want to – say.”

Data & Design: A Simple Introduction to Preparing and Visualizing Data