Unlocking AI-powered analysis through hands-on learning
Showing how AI can augment existing expertise and expand analytical capabilities through interactive learning that puts real organisational data at the centre of the process.

When WaterAid's Performance and Insight team approached us in 2025, they faced a challenge common to many charity data teams: rich datasets full of potential insights, but limited technical capacity to extract them efficiently. Traditional analysis methods were time-consuming and might miss patterns in unstructured data, whilst the team's deep domain knowledge wasn't being fully leveraged in their analytical work.
Rather than delivering theoretical training about AI possibilities, we designed an interactive workshop that put the team's actual data at the centre of the learning experience. The goal wasn't to turn insight professionals into programmers, but to show how AI could augment their existing expertise and dramatically expand their analytical capabilities.
Learning through real organisational challenges
The workshop used WaterAid's genuine supporter data: 150 survey responses to the question "With so many fantastic charities delivering great work, what is it that makes WaterAid one of the charities you choose to support?" We also worked with their cross-tabulated demographic spreadsheet data, demonstrating how AI could handle both qualitative insights and quantitative analysis.
Working with real data rather than synthetic examples proved crucial. Participants could immediately see the relevance to their daily work, and the results generated authentic conversations about what the findings might mean for WaterAid's supporter engagement strategy. When the AI identified themes in supporter feedback or revealed patterns in demographic data, these weren't abstract possibilities but actionable insights about their actual supporters.
The technical approach used Google Colab to create an accessible environment where team members could interact with AI-powered analysis tools without needing extensive programming knowledge. The system allowed participants to ask natural language questions of their data, with the AI generating and executing the necessary code to provide answers.
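To give a flavour of how such a setup can work, here is a minimal sketch of the ask-your-data loop. The workshop's actual stack isn't detailed here, so everything in the sketch is an assumption: an OpenAI-compatible API, the gpt-4o-mini model, the ask_the_data helper, and the small stand-in DataFrame.

```python
# A hedged sketch of a natural-language-to-code loop for Colab.
# Assumptions (not from the original write-up): an OpenAI-compatible API,
# the gpt-4o-mini model, and a made-up demographic cross-tab.
import pandas as pd
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

# Illustrative stand-in for the cross-tabulated demographic spreadsheet.
df = pd.DataFrame({
    "age_band": ["18-24", "25-34", "35-44", "45-54", "55+"],
    "regular_givers": [120, 340, 410, 390, 620],
    "one_off_donors": [210, 280, 250, 230, 310],
})

def ask_the_data(question: str):
    """Translate a natural-language question into pandas code, then run it."""
    prompt = (
        "You are a data analyst. A pandas DataFrame `df` has columns "
        f"{list(df.columns)}. Write Python code answering the question below, "
        "assigning the answer to a variable `result`. Return code only.\n\n"
        f"Question: {question}"
    )
    reply = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model choice
        messages=[{"role": "user", "content": prompt}],
    )
    code = reply.choices[0].message.content.strip()
    # Strip a Markdown fence if the model wraps its answer in one.
    code = code.removeprefix("```python").removeprefix("```").removesuffix("```").strip()
    print(code)  # surface the generated code so it can be inspected and reused
    namespace = {"df": df}
    exec(code, namespace)  # workshop sandbox only; never run untrusted code blindly
    return namespace.get("result")

print(ask_the_data("Compare how different age groups prefer to support WaterAid."))
```

Printing the generated code before running it mirrors the transparency the team valued: every answer arrives alongside the code that produced it, ready to be questioned or reused.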
From comfort zone to art of the possible
The workshop deliberately moved participants through increasing levels of technical complexity, but always anchored in practical utility. The first session focused on qualitative analysis of supporter feedback, showing how AI could identify themes and patterns that might take weeks of manual coding to extract. Participants watched as the AI processed their 150 responses and generated structured insights about supporter motivations, complete with illustrative quotes and categorisation.
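That thematic pass might look something like the sketch below. The prompt wording, the theme schema, and the three stand-in responses are our assumptions; the workshop's actual materials and the real 150 responses are not reproduced here.

```python
# A hedged sketch of AI-assisted thematic coding of free-text responses.
# Assumptions: an OpenAI-compatible client (as in the earlier sketch) and a
# toy sample standing in for the 150 real survey responses.
import json
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

responses = [
    "Clean water changes everything for a community.",
    "I trust WaterAid to spend my donation well.",
    "A colleague volunteered with them and spoke highly of the work.",
]

prompt = (
    "You are coding qualitative survey data about why supporters choose "
    "WaterAid. Group the responses into named themes and return JSON with a "
    "single key 'themes': a list of objects with keys 'theme', 'description' "
    "and 'quotes' (verbatim examples).\n\n"
    + "\n".join(f"- {r}" for r in responses)
)

reply = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model choice
    messages=[{"role": "user", "content": prompt}],
    response_format={"type": "json_object"},  # request machine-readable output
)

themes = json.loads(reply.choices[0].message.content)["themes"]
for t in themes:
    print(f"{t['theme']}: {t['description']} (e.g. {t['quotes'][0]!r})")
```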
The second session tackled quantitative analysis of their demographic data. Here, the AI demonstrated its ability to generate code based on natural language queries: "What categories contain the word age or years?" or "Compare how different age groups prefer to support WaterAid." Participants could see both the results and the code being generated, building confidence for those with technical backgrounds whilst making the process transparent for everyone.
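The code generated in the session isn't reproduced here, but answers to those two queries would plausibly look like the pandas below, reusing the illustrative cross-tab from the first sketch.

```python
import pandas as pd

# Illustrative cross-tab, standing in for the real spreadsheet data.
df = pd.DataFrame({
    "age_band": ["18-24", "25-34", "35-44", "45-54", "55+"],
    "regular_givers": [120, 340, 410, 390, 620],
    "one_off_donors": [210, 280, 250, 230, 310],
})

# "What categories contain the word age or years?"
matches = [c for c in df.columns if "age" in c.lower() or "years" in c.lower()]
print(matches)  # -> ['age_band']

# "Compare how different age groups prefer to support WaterAid."
comparison = df.set_index("age_band")
comparison["regular_share"] = comparison["regular_givers"] / (
    comparison["regular_givers"] + comparison["one_off_donors"]
)
print(comparison.round(2))
```

Because the output is ordinary pandas, it can be saved and rerun later without the AI in the loop, which is what made taking the code away genuinely practical.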
This approach proved particularly powerful because it addressed different learning preferences and comfort levels within the team. Less technically minded participants could focus on the insights being generated, whilst those with coding experience could evaluate and learn from the generated code. Everyone could take the code away for future use, even without the AI system.
Building capability, not dependency
The workshop revealed that WaterAid's team already possessed the domain expertise needed to ask sophisticated analytical questions; they simply lacked the technical tools to answer them efficiently. By demonstrating how AI could bridge this gap, participants began to see new possibilities for their existing work rather than viewing AI as a replacement for their skills.
The probabilistic nature of AI-generated insights also sparked valuable conversations about data interpretation and analytical rigour. When different participants received slightly different outputs from the same query, it provided natural opportunities to discuss how AI results should be validated and contextualised within broader organisational knowledge.
Perhaps most importantly, the hands-on approach built genuine confidence rather than superficial awareness. Participants left understanding not just what AI could theoretically do, but having experienced how it could practically enhance their day-to-day analytical work. They had moved from their comfort zone to seeing "the art of the possible" with their own data and their own questions.
The session demonstrated that effective AI education for charity teams requires more than explaining what the technology can do. It requires showing how the technology can augment existing expertise, using real organisational data to make the possibilities tangible and immediately relevant. When teams can see AI enhancing rather than threatening their professional capabilities, adoption becomes a natural next step rather than a daunting leap into the unknown.
Technological change continues to accelerate, but only a quarter of charities say they feel prepared to respond to the opportunities and challenges it brings. Let's close the opportunity gap together.

