
Getting Started With AI: Practical Tips for Small Charities

In this article, Ian Pembridge explores why many small charities remain hesitant or slow to adopt AI despite growing pressure on their capacity and funding. He offers practical tips from DSC's own experience, such as starting small, forming a working group and prioritising data protection, to help charities begin using AI confidently and responsibly.

Artificial intelligence (AI) usage is on the rise, but according to Neighbourly’s Spring 2025 Community Survey, many small charities have not yet taken the leap. The survey found that 79% of the charities surveyed across the UK and Ireland are either not using AI or have only just started exploring it.

Many charities are facing increased demand for their services while struggling with capacity and funding. The Neighbourly report suggests that AI could help relieve some of this pressure by supporting administration, communications and fundraising. These are areas in which even small improvements could free up valuable time and resources. However, barriers remain, including a lack of technical skills, confusion over how AI works, limited financial resources and concerns about data privacy.

At DSC, we have started exploring AI ourselves and have written these five tips based on our own experience. Hopefully they will help your charity take its first steps into AI with confidence and care.

1. Start small 

If your organisation is keen to try AI but unsure where to begin, start with some small tasks first. This could mean using AI to summarise meeting notes, draft emails or tidy up fundraising data. Using AI on straightforward tasks can still save your organisation time while giving you a sense of what is possible, without needing a large budget or technical expertise. Small wins build confidence, show what works and get colleagues on board. AI may feel overwhelming to some, but starting small can make it feel more manageable and, most importantly, safe.

2. Form a working group 

AI isn’t just an IT issue: it also touches on ethics, operations, communications and service delivery. That is why we formed an AI working group here at DSC to help guide our AI adoption. The group brings together staff from different teams and management levels, creating a mix of perspectives. It is responsible for exploring AI tools, assessing risks, guiding AI policies and procedures, and sharing learning across the organisation. Members don’t need to be AI experts, just curious and willing to get involved. Taking time to work through things as a group improves decision making and creates shared ownership of the outcomes.

3. Map your needs before choosing tools 

Before starting to use any AI tools at DSC, we first asked each team to list the areas in which they thought AI could help them work more effectively. These were usually tasks that were repetitive, time-consuming or hard to keep on top of. Once we had mapped our needs, we began researching tools that could solve these problems. Starting with your own challenges rather than someone else’s solutions makes it much more likely you’ll find an AI solution that is useful, sustainable and affordable.

4. Prioritise data protection 

Using AI can make data processing significantly more efficient, which is why many charities are exploring its use for donor data. Charities have a duty to protect personal information, so before trying out a new AI tool ask: what data is being used, where is it going, and who can access it? Any AI tools you use should comply with GDPR and with any internal data protection policies your charity has. Sensitive information should be handled cautiously, and any suppliers should be carefully vetted. Protecting people’s personal information protects their trust, and your reputation.

5. Ensure human oversight 

AI should never be allowed to make important decisions without human oversight in place. It may be good at writing and processing data, but it doesn’t always understand context or nuance and can get things wrong. Make sure someone is responsible for reviewing and approving its output. There should be clear responsibility for AI oversight and a plan in place to intervene if any issues arise. Keeping humans in the loop is one of the best ways to use AI safely and responsibly.

If you would like to find out more about using AI in your charity, why not join us at The Charity AI Conference: Setting Yourself Up For Success?