Outputs vs Outcomes: Why This Matters More Than You Think
Stop counting what you do and start measuring what changes. This crucial distinction transforms how you plan, evaluate and communicate impact. Includes practical examples and a six-month implementation plan.
“We delivered 47 workshops to 320 young people this year!”
A charity trustee shared this proudly at a board meeting I was attending. Everyone nodded approvingly. Then someone asked: “That’s brilliant. But what changed for those 320 young people as a result?”
Silence.
They genuinely didn’t know. They’d counted everything they’d done, but hadn’t measured what difference it made.
This is the outputs vs outcomes trap, and nearly every charity falls into it at some point. It’s not because people are bad at evaluation or don’t care about impact. It’s because outputs are concrete, countable, and feel like proof you’re doing something. Outcomes are messier, harder to measure, and require you to ask uncomfortable questions about whether your work is actually creating change.
But here’s why this distinction matters more than you might think: funders, commissioners, and increasingly donors don’t just want to know what you did. They want to know what difference it made. And if you can’t answer that question, you’re at a serious disadvantage.
Let me help you understand the difference, spot when you’re confusing the two, and make the shift to outcome thinking.
What’s the Actual Difference?
At its simplest:
Outputs are what you do and deliver. They’re your activities, services, and products.
Outcomes are what changes as a result of what you do. They’re the benefits, learning, and effects experienced by the people you work with.
Think of it this way: outputs are what you control. Outcomes are what you hope to influence.
Examples that show the difference:
| Output (What you did) | Outcome (What changed) |
| --- | --- |
| Delivered 20 debt advice sessions | 15 clients successfully reduced their debt and felt less anxious |
| Ran 8 parenting workshops | Parents report using positive discipline strategies and feeling more confident |
| Distributed 500 food parcels | Families had adequate nutrition and avoided choosing between heating and eating |
| Provided 30 hours of counselling | Client’s depression scores decreased and they re-engaged with social activities |
| Trained 40 volunteers | Volunteers feel more skilled and the organisation increased service capacity by 25% |
Notice how the outcome column answers an implied “so what?” question. So what if you delivered 20 advice sessions? Well, it meant people reduced their debt and felt less anxious. That’s the outcome that matters.
Why Charities Get Stuck on Outputs
Before we go further, let’s acknowledge why outputs feel safer and easier:
They’re concrete and countable
You know exactly how many workshops you ran. You can’t argue with that number. Outcomes are fuzzier. Did confidence “increase”? By how much? For everyone? It’s harder to pin down.
They’re fully within your control
You decide to run 10 sessions, and you run 10 sessions. Done. Whether those sessions actually improve anyone’s wellbeing depends on multiple factors, many outside your control.
They’re what you’re organised to deliver
Your staff job descriptions say “deliver support groups” not “achieve sustained improvements in mental health.” Your budget is allocated to activities, not outcomes. Your systems track what you do, not what changes.
They feel like evidence of hard work
When you’re exhausted from delivering services with inadequate resources, those output numbers represent real effort and dedication. It can feel like people are dismissing your work when they say “yes, but what was the outcome?”
They’re what administrative data naturally captures
Your database tracks attendance, your registers count participants, your reports list activities. All of that is output data. Outcome data requires additional, deliberate effort to collect.
I get it. Outputs feel safer. But they’re not enough.
Why Funders Care About Outcomes (And Why You Should Too)
Funders don’t ask about outcomes to make your life difficult. They ask because:
Resources are limited
If a funder has £50,000 to invest, they want it to go toward the organisation that will create the most positive change. Two organisations might both deliver 100 sessions, but if one demonstrably changes lives and the other doesn’t, that’s a meaningful difference.
Activities aren’t automatically effective
Just because you delivered something doesn’t mean it worked. I’ve seen charities run programmes for years that weren’t actually helping anyone but kept running because “that’s what we do.” Outcome thinking forces you to check whether your activities are fit for purpose.
They’re accountable to their own stakeholders
Funders need to demonstrate to trustees, government, or donors that their investment created change, not just activity. They’re under the same pressure you are.
But beyond satisfying funders, outcome thinking matters because:
It helps you improve
When you measure outcomes, you discover what’s working and what isn’t. Maybe your six-week course is too short to create lasting change, but your 12-week version does. You only know that if you’re measuring outcomes, not just counting completions.
It keeps you focused on purpose
It’s easy to get caught up in delivering activities and forget why you’re doing them. Outcome thinking constantly asks: what are we trying to change? Are we succeeding? This keeps you connected to your mission.
It motivates staff and volunteers
People want to know their work matters. “We ran 50 sessions this year” is less motivating than “We helped 35 people move into stable housing.” Outcomes give meaning to the hard work.
It builds a compelling case for support
Donors respond to change and transformation. “We serve meals to 200 people each week” is an output. “We reduce hunger and provide a community space where isolated people form friendships” is an outcome. Which one would you donate to?
The “So What?” Test
Here’s the simplest way to spot whether you’re talking about outputs or outcomes: add “so what?” to the end of your statement.
“We delivered 30 CV writing workshops this year.”
So what?
“So participants now have professional CVs.”
So what?
“So they’re more confident applying for jobs.”
So what?
“So they’re getting interviews and eventually employment.”
There’s your outcome: improved confidence, increased interview success, and ultimately employment.
Every time you can answer “so what?” and go deeper, you’re moving from output toward outcome. Keep going until you hit something that’s valuable in itself, not just because it leads to something else.
A youth charity I worked with proudly told me they’d engaged 150 young people in community action projects. I asked: “That’s impressive. So what happened for those young people?”
They weren’t sure. After some reflection, we designed an evaluation to find out. Six months later they had evidence that participants developed leadership skills, felt more connected to their community, and were more likely to vote. Those were the outcomes that mattered, and they’d nearly missed measuring them entirely.
Common Confusions (And How to Untangle Them)
Let me address the most common ways organisations muddle outputs and outcomes:
Confusion 1: “Participants gained skills”
Is this an output or outcome? It depends on how you’re using it.
If you mean “we delivered skills training” (the activity), that’s an output.
If you mean “participants now have skills they didn’t have before” (the change), that’s an outcome.
The test: Did you measure the skill development, or just count the training sessions?
Confusion 2: “Service user satisfaction”
“95% of participants were satisfied with our service” sounds like an outcome. It’s actually closer to an output quality measure.
Satisfaction tells you people liked the experience. It doesn’t tell you whether anything changed for them. You can be very satisfied with a service that made no lasting difference to your life.
Better: “85% of participants reported increased confidence in managing their condition, and 92% were satisfied with the service.” Now you have both outcomes and satisfaction data.
Confusion 3: “Soft outcomes”
Some people use “soft outcomes” to describe things like increased confidence or improved wellbeing, contrasting them with “hard outcomes” like employment or school attendance.
This language is unhelpful. All outcomes are real changes. Confidence and wellbeing are just as meaningful as employment; they’re just harder to measure. Don’t downgrade their importance by calling them “soft.”
Confusion 4: “Intermediate outcomes”
This one’s legitimate but worth clarifying. Sometimes one outcome leads to another:
- Short-term outcome: Participants gain knowledge about nutrition
- Medium-term outcome: Participants change their eating habits
- Long-term outcome: Participants experience better health
All three are outcomes. They’re just sequenced. The knowledge isn’t an output; it’s an outcome that leads to other outcomes. This is fine, but be clear about which level you’re measuring and why.
How to Shift From Output to Outcome Thinking
If you recognise that you’ve been focused on outputs, here’s how to start thinking in outcomes:
Step 1: For each activity, ask “what changes for participants?”
Take your main activities. For each one, complete this sentence: “After participating in [activity], people will be better off because ___________”
Example:
- After participating in our cooking classes, people will be better off because they have practical skills to prepare nutritious meals on a budget and feel more confident in the kitchen.
That’s your outcome.
Step 2: Check whether you’re describing a change
Look at your answer. Is it really a change, or is it just describing the activity differently?
“People will be better off because they attended cooking classes” – Not a change, just restating the output.
“People will be better off because they can cook healthier meals” – Yes, that’s a change.
Step 3: Consider different types of change
Outcomes can be:
- Knowledge/awareness: Understanding something new
- Skills/abilities: Being able to do something new
- Attitudes/beliefs: Thinking differently about something
- Behaviour: Acting differently in daily life
- Circumstances: Changes in life conditions (housing, employment, relationships)
- Wellbeing: Feeling better physically, mentally, emotionally
Your activities might create several different types of outcomes. That’s fine. You don’t need to measure all of them, but it helps to be aware of the range.
Step 4: Prioritise the outcomes that matter most
You can’t measure everything. Choose the outcomes that are:
- Most closely aligned with your mission
- Most meaningful to participants
- Most relevant to your funders
- Feasible to measure with your resources
Often that’s 2-4 key outcomes rather than 10.
Step 5: Design simple ways to measure them
For each priority outcome, ask: “How would we notice if this change happened?”
- Increased confidence: Before and after self-rating scales
- Changed behaviour: Participant self-reporting or observation
- Improved circumstances: Tracking employment status, housing stability, etc.
- Better relationships: Qualitative interviews about social connections
You don’t need perfect measures. You need good-enough measures you’ll actually use.
When Outputs Still Matter
Let me be clear: I’m not saying outputs are worthless. They have important uses:
Outputs tell you about reach and equity
Who are you serving? How many? Are you reaching the people you intend to reach? Output data answers these questions and flags if you’re only engaging certain groups.
Outputs show delivery and efficiency
Funders need to know you actually delivered what you said you would. If you promised 20 workshops and delivered 5, that’s important information about capacity and planning.
Outputs can indicate engagement
High attendance, completion rates, and repeat engagement are outputs that suggest something valuable is happening. They’re not proof of outcomes, but they’re promising signals.
Outputs are building blocks for outcome calculations
To calculate cost-per-outcome or reach, you need output data. “We helped 60% of participants reduce debt” is more informative when you also know you worked with 100 people.
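To make that calculation concrete, here is a minimal sketch of how output and outcome data combine into cost-per-outcome figures. All the numbers are invented for illustration (a hypothetical £50,000 programme, 100 participants, 60 of whom reduced their debt); they are not drawn from any real programme.

```python
# Hypothetical illustration: combining output and outcome data.
# All figures are invented for this example.

participants = 100       # output: people worked with
debt_reduced = 60        # outcome: participants who reduced their debt
programme_cost = 50_000  # total programme spend in pounds

outcome_rate = debt_reduced / participants            # proportion achieving the outcome
cost_per_participant = programme_cost / participants  # cost of reaching one person
cost_per_outcome = programme_cost / debt_reduced      # cost of one person actually helped

print(f"Outcome rate: {outcome_rate:.0%}")
print(f"Cost per participant: £{cost_per_participant:,.0f}")
print(f"Cost per outcome: £{cost_per_outcome:,.0f}")
```

Notice that cost-per-outcome (£833 here) only exists because both numbers were tracked: the output tells you the reach, the outcome tells you the change, and funders increasingly want the ratio between them.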
The problem isn’t tracking outputs. It’s stopping there and assuming outputs are evidence of impact.
Getting the Balance Right
Good evaluation tracks both outputs and outcomes. Here’s how that might look:
A homelessness charity might report:
Outputs:
- Provided 1,247 nights of emergency accommodation to 89 individuals
- Delivered 234 one-to-one support sessions
- Made 67 referrals to specialist services
Outcomes:
- 41 individuals moved into stable housing (sustained 6+ months)
- 68% reported improved mental health and reduced substance use
- 55% successfully engaged with employment support or training
See how the outputs give context and scale, while outcomes show the actual change? Both are valuable. Together, they tell a complete story.
A mentoring programme might report:
Outputs:
- Matched 45 young people with volunteer mentors
- Delivered 520 hours of mentoring support
- Average relationship duration: 14 months
Outcomes:
- 78% of young people reported increased confidence and self-esteem
- 82% showed improved school engagement (attendance and behaviour data)
- 67% said their mentor helped them make better decisions
Again, outputs show what you did. Outcomes show what difference it made.
Making the Shift in Your Organisation
If you’re currently output-focused and want to shift toward outcomes, here’s a practical sequence:
Month 1: Audit your current measurement
List everything you currently measure. Sort it into “outputs” and “outcomes.” For most organisations, 80-90% will be outputs. That’s your baseline.
Month 2: Clarify your intended outcomes
For each programme, describe what change you’re trying to create. Don’t worry about measurement yet. Just get clear on what success looks like in terms of change for participants.
Month 3: Choose 2-3 priority outcomes
You can’t measure everything immediately. Pick the outcomes that matter most and are feasible to measure with your current resources.
Month 4: Design simple measurement approaches
For each priority outcome, create one simple way to track it. Before-and-after questions, exit interviews, or case studies are good starting points.
Month 5: Start collecting data
Begin gathering outcome data alongside your existing output data. Don’t stop tracking outputs; add outcomes to what you already do.
Month 6: Review and adjust
Look at what you’ve learned. Is the data useful? Is collection sustainable? Refine your approach based on what’s working.
By month six, you’ll have meaningful outcome data to include in reports and funding applications. By month twelve, outcome thinking will feel more natural.
A Final Word on Language
One practical tip: audit how you talk about your work, both internally and externally.
Change “We deliver…” language to “We help people to…” language:
- Instead of: “We deliver financial education workshops”
- Try: “We help people develop skills and confidence to manage their money”
- Instead of: “We provide counselling services”
- Try: “We help people recover from trauma and rebuild their lives”
- Instead of: “We run after-school clubs”
- Try: “We help children develop confidence, make friends, and enjoy learning”
This simple language shift forces you to think about change and outcomes rather than just activities and outputs. It also makes your work sound more compelling to others.
Reflection Questions
Before you move on, take a moment to consider:
Look at your most recent impact report or funding application. What percentage is about outputs (what you did) versus outcomes (what changed)? What does that tell you?
Choose one activity your organisation delivers. Complete this sentence: “After participating, people are better off because ___________.” That’s the outcome you should be measuring.
About This Series
This guide is part of a learning series on Measuring Social Impact for Charities and Social Enterprises. We’re here to make evaluation practical, accessible, and useful – not overwhelming.
Want to go deeper? Social Value Lab supports organisations to develop proportionate, practical approaches to measuring and communicating impact. We believe every organisation deserves to understand and communicate their value, regardless of size or budget.
Was this helpful? Share it with a colleague who needs to hear that evaluation doesn’t have to be overwhelming.