Evaluation on a Shoestring: What You Can Do With No Budget

No budget doesn’t mean no evaluation. Discover free tools, lean approaches, and creative strategies for measuring impact using existing resources. Practical guidance for resource-stretched charities to start today.

“We know evaluation matters, but we literally have no budget for it. Our funders expect evidence of impact, but won’t pay for us to gather it. What are we supposed to do?”

This is one of the most frustrating contradictions in the third sector. Funders demand robust evidence of impact while refusing to fund the work needed to produce it. You’re expected to demonstrate outcomes on a monitoring budget that barely covers your database subscription.

It’s maddening. It’s unfair. And it’s the reality most small charities face.

But here’s what I’ve learned supporting hundreds of organisations in exactly this position: evaluation without budget is difficult, but it’s not impossible. You can gather meaningful evidence of your impact using free tools, existing resources, and a modest amount of staff time.

Will it be perfect? No. Will it be proportionate and useful? Absolutely.

Let me show you how.

First, Let’s Be Honest About What “No Budget” Really Means

When we say “no budget for evaluation,” we usually mean no budget for:

  • External evaluators or consultants
  • Specialist software or databases
  • Participant incentives or expenses
  • Printing and postage for surveys
  • Staff time specifically allocated to monitoring and evaluation (M&E)

But that’s not quite the same as having literally nothing. Most organisations do have:

  • Staff who already interact with participants
  • Computers with basic software (Word, Excel, email)
  • Internet access
  • Some form of participant contact details
  • Existing conversations and touchpoints with service users

The real cost of evaluation isn’t money. It’s time. And the question becomes: can you carve out a few hours a month from existing staff time to do this work?

For most organisations, the answer is yes – if you’re strategic about it.

The Foundation: Make Evaluation Part of Normal Practice

The most sustainable zero-budget evaluation doesn’t feel like “extra” work. It’s woven into activities you’re already doing.

Instead of: Creating a separate survey sent weeks after your programme ends (low response rate, chasing people, wasted effort)

Try: Three questions asked in the final session while participants are present (high response rate, immediate data, minimal effort)

Instead of: Scheduling separate evaluation interviews that require booking rooms and coordinating diaries

Try: Adding two reflective questions to check-in conversations that already happen

Instead of: Bolting data collection onto an already-busy service delivery model

Try: Designing your service so that giving feedback is part of the experience

A debt advice service I worked with had no evaluation budget but needed to demonstrate impact to their funder. Their solution was brilliantly simple: they added five minutes to every advice appointment.

In those five minutes, advisors asked: “On a scale of 1-10, how anxious do you feel about your debt right now?” They noted the number in the case file they were already maintaining.

Three months later, when clients came for their follow-up appointment, advisors asked the same question again. The before-and-after comparison showed meaningful improvement in anxiety levels for 73% of clients.

Cost: £0. Additional time per client: 5 minutes. Value to funding application: Transformational.

Free Tools That Actually Work

You don’t need expensive software. Here’s what you can do with tools you already have or can access for free:

For surveys and questionnaires:

Google Forms is completely free with a Gmail account. You can create surveys, share them via link or email, and see responses in a spreadsheet. It’s not sophisticated, but it works perfectly well for most charity evaluation needs.

Microsoft Forms is free if you have a Microsoft 365 account (which many charities do through charity discounts or TechSoup). It’s similar to Google Forms with slightly better accessibility features.

Both allow you to:

  • Create multiple question types (multiple choice, rating scales, open text)
  • Share links via email or social media
  • Collect anonymous responses if needed
  • Export data to Excel for basic analysis

For data management:

Excel or Google Sheets is sufficient for most small-scale evaluation. Yes, proper databases are better. But spreadsheets are free, and you already know how to use them.

Create one master spreadsheet with:

  • One row per participant
  • Columns for basic demographics, dates, activities attended, and outcomes
  • Separate tabs for different programmes if needed

It’s not glamorous, but it works. I’ve seen compelling evaluation reports based entirely on well-organised spreadsheet data.
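If anyone on your team is comfortable with a little scripting, even the "basic analysis" step can be automated. The sketch below is purely illustrative: it assumes a CSV exported from your master spreadsheet, and the column names (`sessions_attended`, `score_start`, `score_end`) are invented for the example, not a standard.

```python
# Minimal sketch: summarising a master spreadsheet exported as CSV
# from Google Sheets or Excel. Column names are illustrative only.
import csv
from io import StringIO

# In practice you would use open("master.csv");
# inline data keeps this sketch self-contained and runnable.
data = StringIO(
    "name,sessions_attended,score_start,score_end\n"
    "A,10,3,7\n"
    "B,2,4,4\n"
    "C,12,5,9\n"
)

rows = list(csv.DictReader(data))

# Average attendance across all participants
avg_attendance = sum(int(r["sessions_attended"]) for r in rows) / len(rows)

# How many participants ended with a higher outcome score than they started with
improved = sum(1 for r in rows if int(r["score_end"]) > int(r["score_start"]))

print(f"Average sessions attended: {avg_attendance:.1f}")
print(f"Participants who improved: {improved} of {len(rows)}")
```

The same two numbers can of course be produced with a spreadsheet formula (`AVERAGE` and a `COUNTIF`-style comparison); the script is only worth it once the dataset or the number of programmes grows.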

For qualitative data:

Word documents are perfectly adequate for storing and analysing interview notes or focus group transcripts. You don’t need complicated research software unless you’re dealing with hundreds of pages of text.

Simple approach:

  • Create one document per interview/focus group
  • Use the ‘Find’ function to search for key themes across multiple documents
  • Copy relevant quotes into a separate “themes” document organised by topic
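The ‘Find’ approach can also be automated if your notes are saved as plain text. This sketch counts how often each theme keyword appears across a set of interview notes; the notes and theme words here are made up for illustration, and using word stems (e.g. "confiden") catches variants like "confident" and "confidence".

```python
# Sketch: automating the manual "Find" search for themes.
# The notes and theme stems below are invented examples.
notes = [
    "I feel more confident now and less lonely.",
    "The group helped my confidence a lot.",
    "Still lonely sometimes, but I know where to get help.",
]
themes = ["confiden", "lonely", "help"]  # stems catch "confident"/"confidence" etc.

# Count occurrences of each theme stem across all notes (case-insensitive)
counts = {t: sum(n.lower().count(t) for n in notes) for t in themes}

for theme, count in counts.items():
    print(theme, count)
```

In real use you would loop over files in a folder (e.g. with `pathlib.Path.glob`) rather than an inline list, and keyword counts are only a starting point: you still need to read the quotes in context before copying them into your themes document.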

For visual data:

Your phone camera is a powerful (and free) evaluation tool. Photos and short videos can capture change that’s hard to quantify:

  • Environmental transformation (community spaces, gardens)
  • Confidence and engagement (participants presenting work, leading activities)
  • Outputs and products (artwork created, newsletters produced, meals cooked)

Just ensure you have proper consent before photographing participants, particularly children or vulnerable adults.

For communication:

Canva has a free version that’s excellent for creating simple infographics, one-page impact summaries, and visual reports. The free tier includes thousands of templates and design elements.

Free Data Collection Methods That Work Well

Some evaluation methods are naturally low-cost. Here are the ones that work best on zero budget:

Before-and-after questions

The simplest, most accessible way to measure change. Ask participants to rate something at the start and end of your intervention:

  • “On a scale of 1-10, how confident are you in [specific skill]?”
  • “How often do you feel lonely?” (Never / Rarely / Sometimes / Often / Always)
  • “I know where to get help when I need it” (Strongly disagree to Strongly agree)

You can ask these verbally, on paper, or using a free online form. They take 30 seconds to complete and provide quantifiable evidence of change.
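Once collected, paired before-and-after ratings reduce to a single headline figure with almost no arithmetic. A minimal sketch, using invented scores, where each pair is one participant's (start, end) rating:

```python
# Sketch: turning paired before-and-after ratings into a headline figure.
# Each tuple is (score_at_start, score_at_end) for one participant;
# the numbers are invented for illustration.
pairs = [(3, 7), (5, 5), (2, 6), (4, 8), (6, 5)]

# Count participants whose end score is higher than their start score
improved = sum(1 for before, after in pairs if after > before)
pct = 100 * improved / len(pairs)

print(f"{improved} of {len(pairs)} participants improved ({pct:.0f}%)")
```

This is exactly the kind of calculation behind a claim like "73% of clients reported reduced anxiety", and it works just as well as a spreadsheet formula if nobody on the team writes code.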

Exit conversations

A 10-minute conversation at the end of someone’s involvement can gather rich data if you ask good questions:

  • What’s different for you now compared to when you started?
  • What was most helpful about this programme?
  • What would have made it even better?
  • Would you recommend us to others? Why/why not?

Take brief notes during or immediately after. File them systematically. Every few months, read through them all to identify patterns.

Observation notes

If you’re already present during activities, take five minutes afterwards to note significant observations:

  • Who engaged actively? Who seemed withdrawn?
  • What moments felt like breakthrough points?
  • What seemed to work well or poorly?
  • Any quotes or comments that stood out?

Over time, these build into valuable qualitative data about your programme’s impact.

Administrative data you’re already collecting

Look at what you’re tracking anyway:

  • Attendance rates (engagement and retention)
  • Completion rates (persistence)
  • Referral patterns (reach and targeting)
  • Follow-up contact (sustained engagement)

These might feel like “just monitoring,” but they’re evidence. Someone who attends 10 out of 12 sessions is demonstrating something different from someone who attends 2 out of 12.

Participant-generated content

Ask service users to capture their own experience:

  • “Take a photo of something that represents what this programme means to you”
  • “Write three words describing how you feel now”
  • Keep a brief journal or diary (even if just one sentence per week)

This shifts some data collection burden to participants (with their consent) while generating authentic, meaningful evidence.

What Good Evaluation Looks Like on Zero Budget

Let me show you three real examples of organisations doing credible evaluation with no dedicated budget:

Example 1: The Food Bank

A volunteer-run food bank had no budget, no database, and one coordinator working 15 hours a week. They needed to show impact beyond “number of food parcels distributed.”

Their solution:

  • Created a simple paper form asking three questions when people collected parcels: “Is this your first visit?” / “How did you hear about us?” / “Is there anything else you need help with?”
  • The coordinator spent 30 minutes weekly entering responses into a free Google Sheet
  • After six months, they had data showing: 60% were repeat users (indicating sustained need), 40% heard about them from healthcare services (showing effective partnerships), and 35% needed help beyond food (informing service development)

Cost: £0. Time: 2 hours per month. Evidence gathered: Sufficient for funding applications.

Example 2: The Mental Health Support Group

A peer-led support group met weekly with no paid staff and no budget. They wanted to demonstrate impact to secure small grants.

Their solution:

  • Started each session with a quick mood check-in where people placed a sticky dot on a scale (1-10) showing “How are you feeling today?”
  • Took a photo of the dot pattern each week (with no identifying information)
  • After 12 weeks, patterns showed most participants’ mood improved over the course of each session
  • They also kept brief notes after each session about themes discussed and significant moments

Cost: £5 for sticky dots. Time: 10 minutes per week. Evidence: Compelling photos showing measurable mood improvements.

Example 3: The Homework Club

An after-school homework club had no evaluation budget but needed to show impact for a renewal application.

Their solution:

  • Asked teachers to provide baseline reading levels for participating children (data schools already collected)
  • Asked the same teachers for updated levels 6 months later
  • In the final session, asked children three questions: “Do you feel more confident with homework?” / “What did you learn?” / “What was best about the club?”
  • Combined quantitative teacher data with qualitative child feedback

Cost: £0. Time: 3 hours total over 6 months. Evidence: Reading levels and confidence data that secured renewed funding.

Notice what these examples have in common: simplicity, integration with existing activities, and a clear focus on answering specific questions rather than measuring everything.

Creative Approaches to the Budget Challenge

Sometimes “no budget” can be stretched slightly with creativity:

Piggyback on existing budgets

Could participant feedback forms be printed alongside programme materials you’re already printing? Could follow-up phone calls for evaluation happen during welfare check-ins you’re already making?

Student partnerships

University students often need placement projects or dissertation research opportunities. A social work, psychology, or public health student might conduct interviews, analyse data, or write up findings as part of their studies. You get free evaluation labour; they get real-world experience. (Just ensure proper supervision and ethical approval.)

Volunteer evaluators

Some people have monitoring and evaluation skills and want to donate them. Try:

  • Reaching out to local professional networks (ask a business forum if any members have research/evaluation skills to volunteer)
  • Posting on volunteer matching websites 
  • Asking your supporters if anyone has relevant expertise

In-kind support

Could a funder cover participant travel expenses to attend evaluation focus groups? Could you use free community space rather than hiring a venue? Could a local business sponsor refreshments for a feedback session?

DIY incentives

If you need to incentivise participation in evaluation, get creative: homemade certificates of appreciation, small thank-you cards, first pick of donated items, or entry into a prize draw for something donated.

What You Can’t Do on Zero Budget (And Why That’s Okay)

Let’s be realistic. Some things genuinely require budget:

You probably can’t:

  • Hire external evaluators for independence and specialist expertise
  • Conduct large-scale quantitative surveys with high response rates
  • Use sophisticated software for complex analysis
  • Offer meaningful participant incentives for evaluation involvement
  • Track long-term outcomes if participants have moved away or changed contact details
  • Produce glossy evaluation reports with professional design

But you can still:

  • Do credible evaluation that meets most funders’ requirements
  • Demonstrate plausible contribution to outcomes
  • Learn what’s working and what isn’t
  • Make evidence-based decisions about your programmes
  • Tell compelling stories backed by data

The key is accepting that “good enough” doesn’t mean “perfect.” Your evaluation might be limited in scope, sample size, or methodological sophistication. That’s fine, as long as you’re transparent about limitations and the evidence you do gather is credible.

Most funders understand that small charities can’t afford gold-standard evaluation. They just want to see you’re trying honestly to understand and improve your impact.

Making It Sustainable

The biggest risk with zero-budget evaluation isn’t that it’s impossible. It’s that it burns out the person doing it.

If evaluation relies entirely on someone’s unpaid overtime or goodwill, it won’t last. Here’s how to make it sustainable:

Protect time ruthlessly

If someone has 3 hours per month for evaluation, that time is sacred. It doesn’t get eaten by other urgent tasks. Put it in the diary as if it’s a meeting with your most important funder.

Keep it simple

Sustainable evaluation does fewer things consistently rather than many things sporadically. One good outcome measure tracked reliably is worth more than five measures you never have time to analyse.

Share the load

Can multiple staff take on small pieces? One person coordinates, but others help with data entry, or asking questions during sessions, or taking photos.

Build it into role descriptions

If someone’s role includes “deliver youth work sessions,” add “and collect simple feedback data.” It’s not extra work; it’s part of how the work is done well.

Review and adjust

Every six months, ask: Is this working? Is it too much? Could we simplify further? Zero-budget evaluation needs to be lean and adaptive.

Your Zero-Budget Action Plan

If you’re convinced but don’t know where to start, here’s a practical four-week plan:

Week 1: Audit what you already have

  • List all data you’re already collecting
  • Identify existing touchpoints with participants where you could gather feedback
  • Check what free tools you already have access to

Week 2: Choose one outcome to measure

  • Pick your most important outcome
  • Design one simple way to measure it (before-and-after question, exit conversation, observation notes)
  • Decide where in your normal delivery this will happen

Week 3: Set up your system

  • Create your data collection tool (form, template, or question script)
  • Set up a simple spreadsheet to store responses
  • Brief any staff who need to be involved

Week 4: Start collecting

  • Begin gathering data consistently
  • Set a reminder to review after one month
  • Be prepared to adjust if something isn’t working

That’s it. You don’t need more resources, more time, or more expertise. You just need to start.

Reflection Questions

Before you move on, take a moment to consider:

What data are you already collecting that could be analysed differently to show impact?

What’s one free tool or method that you could implement this month?

About This Series

This guide is part of a learning series on Measuring Social Impact for Charities and Social Enterprises. We’re here to make evaluation practical, accessible, and useful – not overwhelming.

Want to go deeper? Social Value Lab supports organisations to develop proportionate, practical approaches to measuring and communicating impact. We believe every organisation deserves to understand and communicate their value, regardless of size or budget.

Was this helpful? Share it with a colleague who needs to hear that evaluation doesn’t have to be overwhelming.