Guidance for using artificial intelligence in fundraising

The purpose of this guidance is to help charitable fundraising institutions comply with the Code of Fundraising Practice (the code) when using artificial intelligence (AI) for fundraising purposes. Follow this guidance whenever you are exploring, preparing for and using AI to carry out charitable fundraising. It is for each charitable fundraising institution to decide if using AI is right for them.  

In this guide, where we say ‘you’ or ‘your’ it means a charitable institution and/or its trustees, and/or a third-party fundraiser, as applicable. Where we say ‘donor’ we mean any potential or current donors. Where we say ‘Artificial Intelligence’ or ‘AI’ we mean generative and predictive AI.  

Exploring the use of AI in fundraising 

See more in the code, including sections 1 and 2 

Accountability and AI 

You are responsible and accountable for your use of AI for fundraising purposes, including that of your third-party fundraisers, and any output that is entirely or partly generated by AI. Be aware that some AI tools might generate incorrect, misleading or inappropriate content but you remain responsible for any outputs you use for your fundraising purposes.  

You must be aware from the start that AI could, for example, produce output that seems believable but is incorrect (known as ‘hallucinations’), or output that incorporates hidden biases, leading to unfair, discriminatory or prejudicial outcomes.  

Make sure you are comfortable with the outputs of AI before using them for your fundraising purposes so that you remain compliant with the code.   

Trustee engagement and oversight 

Your board of trustees should be involved in making strategic decisions to explore, prepare for and use AI solutions, and should maintain ongoing oversight. Take all reasonable steps to avoid a situation where trustee understanding and knowledge about AI is concentrated among a minority of individuals, such as only one or two people. This will help improve engagement and decision-making and avoid power imbalances.  

Supporting your trustees to be equipped to effectively scrutinise opportunities and risks arising from using AI will help your institution to grow in knowledge and confidence. See more in our guide to ‘Documenting your fundraising decisions’.  

Risk assessment 

Carry out a risk assessment proportionate to your intended uses of AI (however limited) and the tool you are considering, in the context of your fundraising activity. Avoid using any AI tool before conducting a risk assessment. Failing to do so might mean you have not fully understood the effects of using your chosen AI tool. There may be unintended consequences leading you to be in breach of the code.  

See more information about carrying out risk assessments in our guides to ‘Due diligence and fundraising’ and ‘Monitoring your fundraising partners’. 

Protecting your organisation 

Even if you decide not to use AI for your fundraising purposes at this time, you may still need to consider how the use of AI by others could affect your fundraising activities. This includes use by those you enter into agreements with, such as contractors, professional fundraisers and commercial participators. See more in our ‘Guidance for charitable institutions working with commercial participators’ and ‘Guidance for charitable institutions working with professional fundraisers’. It also includes use of AI by those who are unrelated to your fundraising.  

Be aware that bad actors may use AI to misappropriate your fundraising content for criminal, immoral or unethical purposes.  

Where proportionate to the risk and feasible for you, consider using appropriate content credentialling (authentication) methods for your fundraising intellectual property to show your ownership.     

Preparing to use AI to support your fundraising 

See more in the code, including sections 1, 2, 4, 6 and 8 

AI policy 

Develop and agree an AI policy before using any AI tools to support your fundraising. This may be part of a wider AI policy or one that only relates to fundraising.  

To maintain trust and transparency, we recommend you publish your policy on how you use AI in your fundraising activities accessibly on your website. This lets donors know when you will or may use AI (for example, to wholly or partly create content, make or inform decisions, or carry out tasks relating to your fundraising) and when you have not. This is particularly important where AI-generated content could be mistaken by donors for real situations or people, such as sound, video and photographic images, where your AI tools use personal data, or where you use chatbots for fundraising purposes.  

Review your AI policy regularly so it remains up to date and reflects all your uses of AI for fundraising purposes. 

User skills 

Anyone involved in implementing, managing and using your chosen AI tool for your fundraising purposes should have adequate knowledge and skills for your intended purposes before using it. Bear in mind that knowing how to use one AI tool may not be relevant or sufficient for using other AI tools. 

Testing or piloting AI tools

We recommend testing or piloting any AI tool before implementing it, and reporting your findings to your board of trustees. This is so they can decide whether it should be used for your fundraising purposes.  

Testing or piloting is especially important where you have identified risks associated with using a particular AI tool for a particular purpose. Ideally, do this in a closed or small-scale environment that replicates the real world but does not affect your actual fundraising activity. This will help you evaluate the benefits and risks before deciding whether to implement an AI tool more fully and widely.  

Data management and security 

At all times make sure you know how any AI tool will use, store and share the material you use to train and prompt it. Make sure you understand how the tool will interact with any other IT systems you use and assure yourself that it will not compromise the security and privacy protections you already have in place.  

You may need to update your policies and strengthen your cyber security and other protective measures to take account of any AI tool you will use.  

Where it applies, ensure your privacy policy sets out how you will use AI to process personal data in relation to your fundraising. See more in our guidance on data protection, the Information Commissioner’s Office guidance on using AI, and the UK Government’s Code of Practice for the Cyber Security of AI. 

Using AI to carry out fundraising  

See more in the code, including sections 1, 6 and 8 

Transparency  

It is important to uphold trust and accountability and grow donor and public confidence in the safe and fair use of AI in fundraising. Be transparent with your donors about your use of AI. The amount of transparency you provide should be in proportion to the risk of misleading donors. The greater the risk of misleading donors through or about your use of AI the more transparent you should be.  

Depending on your specific AI uses, sometimes your published AI policy and/or data protection policy will provide sufficient transparency. At other times it may be necessary to provide precise information relating to a specific instance of AI use so as not to mislead donors.  

Human oversight 

Make sure you have a process in place for a human to check the accuracy, fairness and legality of any content generated by AI before using it for fundraising purposes. It is also important to have measures in place to monitor and audit AI uses where there is no immediate human oversight, for example when using AI-powered virtual assistants such as online chatbots or telephone voicebots.  

Where proportionate to the risk, keep a record of the checks you carry out and the decisions you make in relation to the AI content you generate. This will help you to justify your actions and any decisions you make based on your AI uses if you receive a complaint or if we investigate your fundraising.  

See more in our guides to ‘Documenting your fundraising decisions’ and ‘Due diligence and fundraising’. 

Training AI tools 

We recommend you do not feed information not already in the public domain into publicly available or open access AI, including when training AI systems or making AI prompts. We also recommend you do not use AI tools to make automated decisions relating to your fundraising without an adequate level of human oversight.  

Legal assurance 

Always check that any AI content used for your fundraising purposes is accurate and legal, and that you have the right to use it. Take specialist legal advice where you need to.  

Some uses of AI might risk legal challenge, for example under intellectual property law, contract law, or data protection legislation.  

Always make sure any decisions you make based on AI content are in the best interests of your charitable fundraising institution and uphold the reputation of fundraising.  

Defining key terms 

These definitions do not amount to legal advice. They are indicative, in the context of this guidance, to help charitable fundraising institutions follow it. 

Artificial Intelligence (AI): Computer-based systems using varying levels of autonomy that can carry out tasks that humans normally carry out.  

Bad actor: A person responsible for intentionally malicious or harmful actions that may be illegal, immoral or unethical. 

Content credentials or credentialling: Authentication data securely embedded into digital content to show origin, ownership, copyright and other relevant data. 

Generative AI: A type of AI, based on machine learning models such as large language models, that can create new content, including text, pictures, sounds, video and computer code.  

Hallucinations: Plausible but inaccurate or incorrect content generated by AI. 

Human oversight: Human involvement at some or all stages of AI development and use for a variety of purposes that is proportionate to the risks and in accordance with legal requirements. 

Intellectual property: Owned intangible property, such as ideas and creative content. 

Predictive AI: Using AI to analyse data to identify patterns and trends and to make predictions; sometimes also known as predictive analytics. 
