Make Sense Of It
Arts Marketing Association & National Lottery Heritage Fund

Disciplined exploration in AI product development

Making sure AI genuinely serves sector needs through systematic exploration across desirability, feasibility and viability dimensions.


When we first started work on the National Lottery Heritage Fund-funded Goose, the brief appeared straightforward: create an AI assistant for heritage marketing professionals. However, rather than rushing to build the most obvious solution, we embarked on an extensive exploration process that would ultimately prove essential for developing a product that genuinely served sector needs rather than simply showcasing AI capabilities.

Over months of close collaboration with Arts Marketing Association, we systematically explored fundamentally different approaches across the full spectrum of product development - from core AI functionality to user communication and market positioning. This disciplined divergence before convergence proved crucial for responsible AI development in a mission-critical context.

Exploring the AI functionality landscape

The technical exploration spanned four distinct approaches, each building on learnings from the previous iteration. We built functional prototypes rather than conceptual designs, enabling genuine user feedback on actual capabilities rather than theoretical possibilities.

We began with a basic chatbot approach, role-playing the "Goose" character without access to tools or collaborative features. This quickly revealed fundamental limitations - whilst providing some entertainment value, simple question-answer interactions couldn't address the complex strategic challenges that heritage professionals faced daily.

Building on this insight, our second exploration focused on a vector embedding system using only "trusted" sources, employing cosine similarity search with language model processing to match user queries with verified heritage sector knowledge. While more sophisticated, this approach proved too narrow and brittle - reproducing the information accessibility problems that already constrained heritage professionals rather than addressing their deeper strategic thinking needs.
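The retrieval pattern behind this second exploration can be sketched in a few lines. This is a minimal illustration, not Goose's actual implementation: the bag-of-words `embed` function stands in for a real neural embedding model, and the "trusted" snippets are invented placeholders, but the cosine-similarity matching of a query against verified sources works the same way at full scale.

```python
import math
from collections import Counter

# Toy stand-in for a real embedding model: a bag-of-words count vector.
# The production system would use neural embeddings, but the retrieval
# logic below is identical either way.
def embed(text: str) -> Counter:
    return Counter(text.lower().split())

def cosine_similarity(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    if norm_a == 0 or norm_b == 0:
        return 0.0
    return dot / (norm_a * norm_b)

# Illustrative placeholders for verified heritage-sector knowledge.
TRUSTED_SOURCES = [
    "segment heritage audiences by motivation not just demographics",
    "evaluation frameworks for National Lottery Heritage Fund projects",
    "digital engagement strategies for small heritage organisations",
]

def retrieve(query: str, k: int = 1) -> list[str]:
    """Return the k trusted passages most similar to the query."""
    query_vec = embed(query)
    scored = sorted(
        TRUSTED_SOURCES,
        key=lambda doc: cosine_similarity(query_vec, embed(doc)),
        reverse=True,
    )
    return scored[:k]

# The top-k passages would then be handed to a language model as context.
print(retrieve("how should we segment our heritage audiences"))
```

The brittleness described above follows directly from this design: the system can only ever surface what is already in `TRUSTED_SOURCES`, so it inherits the sector's existing information gaps rather than supporting new strategic thinking.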

The third iteration developed an expert system specifically designed for National Lottery Heritage Fund applications and monitoring forms. This tackled genuine administrative pain points but felt even more constrained - solving bureaucratic friction without addressing the strategic thinking challenges that workshop participants had identified as their primary concerns. The narrow focus meant it would only provide value during specific, infrequent activities.

Our final technical exploration pursued a fully agentic route in which AI models, equipped with tools and memory, could take on substantial workloads autonomously. The "Gander" prototype demonstrated sophisticated task automation but proved too complex and fundamentally misaligned with heritage professionals' needs. Users wanted to retain control and understanding of their strategic decisions rather than delegating them to autonomous systems.

The thinking partners solution that emerged successfully synthesised the best aspects of each approach whilst removing their confusing or limiting elements. It provided the collaborative intelligence insights from the agentic exploration while maintaining user control, addressed the strategic thinking needs revealed through the expert system work, incorporated questioning methodologies from the trusted sources approach, and avoided the limitations exposed by basic chatbot testing.

Parallel exploration of product positioning

Simultaneously, we were developing Replit prototypes to explore how different value propositions might be communicated through landing page designs and user journeys. This wasn't cosmetic work but fundamental product strategy - testing whether we could articulate clear value propositions for each technical approach and how different user segments might discover and engage with various product concepts.

The landing page prototypes forced crucial questions about product-market fit that technical development alone couldn't answer. Could we explain the agentic approach in terms that busy heritage professionals would find compelling rather than intimidating? Did the expert system concept address problems significant enough to drive adoption? How might we communicate the value of "better questions" versus "better answers" in ways that would resonate with sector needs?

These explorations revealed misalignments between impressive technical capabilities and genuine user value. Some approaches that worked well in demos proved difficult to explain compellingly to potential users, while others that seemed technically straightforward presented complex communication challenges.

We even developed a simple game exploring how heritage professionals might engage with AI-accelerated tools in low-stakes environments. While ultimately not incorporated into the final product, this exploration revealed insights about user comfort levels with AI interaction and the importance of maintaining human agency in professional contexts.

The value of structured exploration

This comprehensive exploration process operated through twice-weekly collaborative sessions rather than traditional waterfall documentation. Working directly with Arts Marketing Association stakeholders, we could demonstrate functional prototypes, gather immediate feedback, and pivot based on real user responses rather than predicted requirements.

The organic nature of this process wasn't a documentation failure but the optimal discovery method for complex AI product development. Each session built on learnings from previous explorations, allowing genuine emergent insights rather than predetermined solutions. The thinking partners concept that ultimately defined Goose emerged from this iterative process rather than initial requirements gathering.

The exploration revealed that heritage professionals needed collaborative intelligence rather than automated solutions, strategic thinking support rather than information retrieval, and professional empowerment rather than task replacement. These insights wouldn't have emerged from a single technical approach or traditional user research methods.

Strategic investment in uncertainty

For mission-driven organisations, the cost of building the wrong AI solution far exceeds the investment required for thorough exploration. Heritage professionals can't afford AI tools that compromise their values, create dependency, or fail to address genuine professional needs. The extensive prototyping process was essential risk mitigation rather than expensive inefficiency.

Each prototype taught us something crucial about user needs, technical constraints, or market positioning that informed subsequent development. The agentic approach revealed user preferences for maintaining control. The expert system highlighted the importance of strategic versus administrative support. The trusted sources approach demonstrated limitations of information-focused solutions. The basic chatbot confirmed the need for more sophisticated interaction models.

The parallel exploration of technical capabilities and market communication ensured we were developing solutions that were simultaneously feasible, desirable, and viable - addressing the full product development challenge rather than just technical implementation.

From exploration to implementation

The thinking partners concept that emerged from this exploration drew together the strongest elements of all four approaches: collaborative intelligence without surrendering user control, strategic thinking support rather than administrative shortcuts, structured questioning grounded in trusted knowledge, and interaction far richer than a simple chatbot could offer.

The beta launch with 40 heritage organisations across the UK validated this exploration investment. Usage data showing extended engagement with thinking partners features demonstrated that the disciplined exploration process had identified genuine user value rather than impressive but superficial capabilities.

This comprehensive exploration approach demonstrates that responsible AI development for mission-critical contexts requires systematic investigation of multiple approaches across technical, user experience, and market positioning dimensions. The apparent inefficiency of building multiple prototypes is actually essential for avoiding the far greater cost of implementing AI solutions that fail to serve genuine organisational needs.

Get in touch

Technological change continues to accelerate but only a quarter of charities say they feel prepared to respond to the opportunities and challenges. Let's close the opportunity gap together.