Getting Started with AI in Your Charity
This on-demand training is for charity leaders, trustees, and community groups. It introduces the basics of AI tools, explains why AI is a leadership issue, and offers practical steps to help you set boundaries and make informed decisions about AI use in your organisation.
Hello, and welcome to this on-demand training session on how charities and community groups can engage with AI. This recording forms part of a library of online resources commissioned by Support Cambridgeshire. You’ll find links to additional guidance and information at the end of the accompanying transcript.
My name is Flóra Raffai from Rapport Coaching, and I work with small charities and voluntary organisations to help them think critically about leadership, strategy, and digital tools.
In this session, we’re going to explore why engaging with AI, especially generative AI, is something every charity leader and trustee needs to think about. We will not cover the technical side of AI, but we will signpost to further resources.
AI isn’t just a passing trend or a bit of hype. It’s already influencing how organisations work, how they communicate, and even how they make decisions. Whether or not you’re actively using AI tools right now, they’re shaping the environment your charity works in.
That’s why, as leaders, you have a responsibility to engage with AI thoughtfully. Not to jump on the bandwagon, but to make sure your charity is protected, prepared, and making the most of opportunities.
The 2025 Charity Digital Code of Practice puts this clearly: digital isn’t just about tools – it’s a leadership, ethical, operational, and engagement issue. The Code talks about risks, culture, skills and adaptability – and AI touches all of these areas.
Choosing to ignore AI isn’t a neutral decision. In fact, it could mean missing out on opportunities or exposing your organisation to risks you didn’t see coming.
This session is here to help you start the right conversations in your charity. To reflect on your approach, set clear boundaries, and think about how AI fits into your leadership and governance responsibilities.
Before we go any further, let’s take a moment to explain what we actually mean by AI and why it matters for voluntary and community groups.
Artificial Intelligence, or AI, is a broad term. It covers a whole range of technologies that help analyse data, make predictions, automate tasks, or generate content.
The type of AI that most charities are likely to come across is Generative AI, often called GenAI for short. These are tools like ChatGPT, Copilot, or Gemini. They create text, images, audio, and even videos based on huge amounts of information they’ve been trained on.
Under the hood, these tools work by using Large Language Models, or LLMs. These models have been trained on vast amounts of data, often scraped from the internet and publicly available information. They predict what’s likely to come next in a sequence or what should be included in an image, based on patterns in that data.
So, while GenAI tools can seem impressive, they aren’t thinking, creating, or understanding like a human. They’re predicting.
That means they can be useful for things that have been done before, like:
- Drafting text or reports
- Summarising information
- Generating ideas or suggestions
But it also means they can:
- Get facts wrong
- Reflect biases from their training data
- Generalise or oversimplify complex topics
This is why critical thinking is so important, and this is a point we’ll come back to repeatedly. AI can be a really helpful tool, but only if we remember that it’s not infallible and it’s certainly not a substitute for human judgement.
So, how do you make sure your charity is using GenAI tools safely, responsibly, and in line with your values?
There’s no single right answer, but there are good principles that can guide your approach. One helpful framework is the 4Ds of AI Fluency, developed by Anthropic, one of the major AI research organisations. These four principles work really well alongside the leadership principles in the Charity Digital Code of Practice.
Let’s look at each one in turn.
First, Delegation is about deciding what tasks are appropriate for AI, and what must remain in human hands. For example, you might use AI to help draft meeting minutes based on your notes or review a funding application against a funder’s criteria to identify areas for improvement. But you wouldn’t use it to make decisions about anything sensitive, like safeguarding or service delivery.
This is part of your leadership responsibility: knowing when AI is a helpful assistant and when it would hinder. It means seeing AI as a tool to support learning and development within your team, not a shortcut to avoid building skills. It means recognising that while certain tasks can be delegated, a human should be involved in the act of delegating and in reviewing what is produced.
Next is Description. This is about being clear and specific when you use AI tools. The clearer you are in your instructions (what’s known as a ‘prompt’), the better the output will be. It is useful to cover the following in your prompt:
- What outcome do you want? Be specific about the task. Do you want a summary, a plan, a draft paragraph, a headline, or a checklist?
- Who is writing it? Define the voice or perspective you want the tool to take, e.g. “write this as a fundraising officer speaking to trustees” or “explain this like you’re speaking to a 16-year-old service user.”
- What is important? Include any key details you want to appear, such as values to reflect, tone to match, or elements to include or avoid.
- What are the restrictions? Set clear boundaries: “keep it under 150 words,” “write in bullet points,” “use British English spelling,” etc.
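Putting these four elements together, a complete prompt might look something like this (a hypothetical example — the details about the newsletter and the befriending service are made up for illustration):

```text
Write a first draft of an update for our charity newsletter.   [outcome]
Write as a volunteer coordinator speaking to local supporters. [who is writing]
Highlight our new befriending service and warmly thank our
donors and volunteers.                                         [what is important]
Keep it under 150 words, use British English spelling, and
avoid jargon.                                                  [restrictions]
```

Notice how each line answers one of the four questions above. You can then refine the output with follow-up instructions, such as “make the tone warmer” or “shorten the second paragraph.”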
Clear description helps make sure the AI’s outputs are useful and appropriate, and that they align with your charity’s voice and values.
Third is Discernment: reviewing AI-generated content with a critical eye. You should never assume that an AI-generated output is correct, unbiased, or fit for purpose straight away. AI tools reflect the data they were trained on, which means they can repeat biases or make assumptions that don’t apply to your context. They also have a tendency to simplify, generalise, or even make things up (something known as ‘hallucination’).
Your role is to critically review everything AI produces. Check the facts, figures, and sources. Review whether there are any hidden biases, stereotypes, or harmful assumptions in the language or content. It’s up to you to check, verify, and adapt it before it goes anywhere.
And finally, Diligence. This is about being responsible and ethical in how you use AI.
Never input confidential, sensitive, or personal data into AI tools – especially free versions, where your data may be used to train the system. Make sure your data protection and cyber security policies are followed. Read the terms of use and privacy policies of the platforms, and treat AI platforms as if they’re public spaces. Use only reputable AI platforms and keep your software and security protocols up to date.
Think carefully about the ethical side of AI use. Are you representing your community fairly? Are you protecting your beneficiaries’ privacy? This links directly to the Charity Digital Code’s principles on risk, ethics, and digital culture.
So, in summary: decide carefully what AI is for in your organisation, be clear in your instructions, review outputs with critical thinking, and always use AI tools responsibly and ethically. These principles will help you lead your charity with confidence in an AI-enabled world.
So what does this actually look like in practice for your charity? It’s easy to feel like you need all the answers before you start. But really, this is about beginning the conversation and taking intentional, manageable steps forward.
Here are some practical ways you can start to responsibly integrate AI into your work:
- First, start the conversation in your organisation. Talk with your team, trustees, and volunteers. Ask: What do people already know or think about AI? Are they excited? Wary? Curious? You might be surprised by how many people are already using AI tools informally, like for writing drafts or generating ideas. Having an open conversation helps surface concerns, set shared expectations, and build confidence.
- Next, agree on your boundaries and principles. Use the 4Ds we talked about earlier and your charity’s values to set clear guidelines on how AI will (and will not) be used. Consider things like:
  - What types of tasks is AI appropriate or inappropriate for in our context?
  - How will we make sure AI-generated content is reviewed and adapted before use?
  - How will we protect sensitive data and confidentiality?
  - How will we check for bias and uphold our inclusion commitments?
This isn’t about writing a huge policy straight away; it’s about agreeing on some practical, shared principles to guide your team.
- Third, invest in learning and confidence-building. Encourage your team and trustees to explore basic AI literacy. There are plenty of free resources out there, from Microsoft, Google, Anthropic, and Charity Digital. Keep the focus on learning together, sharing examples, and reflecting on what works for your organisation.
- Fourth, capture how your charity is using AI and what you’re learning along the way. Set up a simple spreadsheet to record how AI is being used, what’s worked well, and what challenges have come up. This helps everyone in the team reflect on their experiences, spot risks or gaps, and share good practice across the organisation. It also gives you a foundation for setting policies and reviewing your digital strategy.
- Finally, start to integrate AI considerations into your policies and organisational strategy. Review your existing policies (like data protection, safeguarding, digital, and communications) and make sure they cover AI use where relevant. Make AI part of your ongoing strategy conversations: about digital, risk management, leadership, and service delivery.
And remember, this isn’t a one-off task. AI tools will keep evolving, and so will your experience with them. Stay curious. Keep learning as a team. Review your approach regularly. The goal isn’t perfection, it’s to make sure AI is helping your charity and that you’re staying in control of how you use it.
So, let’s recap. AI, especially Generative AI, is already shaping the world we work in. As charity leaders and trustees, it’s your responsibility to engage with AI critically and thoughtfully to make sure your organisation is protected, prepared, and making the most of the opportunities AI can offer.
We’ve talked about:
- What AI and GenAI are, and why they matter.
- How to use AI well, using the 4Ds of AI Fluency (Delegation, Description, Discernment, and Diligence) alongside key leadership principles from the Charity Digital Code of Practice.
- And the practical steps you can take to start (or build on) your organisation’s approach to AI.
The key takeaway?
You don’t need to be a tech expert. But you do need to engage. Start the conversation with your team. Agree your principles. Build your confidence together. And stay curious as the technology evolves.
For more detail on this topic, check out the guidance links at the end of the recording transcript. You can contact the Support Cambridgeshire team for guidance on all aspects of running your organisation at info@supportcambridgeshire.org.uk.
You are also welcome to contact me at Rapport Coaching to explore the bespoke support I can offer around leadership, strategy, and digital tools via flora@rapport-coaching.com or https://www.rapport-coaching.com/.
Further Resources
- AI with Intention: A practical guide to critical and safe use of AI for small charities
- The Charity Digital Code of Practice 2025
- AI Fluency: Framework & Foundations – free AI course from Anthropic
- Grow with Google: AI Training Overview – free AI courses from Google
- Introduction to AI Skills for nonprofits – free AI course from Microsoft
- AI for Everyone – free AI course from DeepLearning.AI
- Charity Leadership Essentials – free videos from Zoe Amar Digital in partnership with Microsoft