What is all the AI fuss about?
It’s no surprise that people in nonprofits are using AI tools. AI is everywhere, and it seems to have arrived everywhere almost overnight. According to UBS, ChatGPT gained more than 100 million users in two months.
And it’s easy to see why AI saw such rapid uptake. When the first generative AI tools arrived on the web, people suddenly found they could get quick, detailed and very readable responses to prompts as wide-ranging as:
- “What should be in an agenda for a nonprofit board meeting?”
- “How do I make the perfect lasagna?”
- “Who was to blame for the American Civil War?”
It was the kind of technology that made teachers wonder if they’d be able to set an essay for homework ever again.
Generative AI also held new possibilities for organizations looking for innovative ways to work across all departments. After all, AI’s abilities were far-reaching:
- Translation
- Image creation
- Summarizing what happened in a virtual meeting
- Creating code for web and mobile apps
- Summarizing research findings and presenting them in a graph
- Creating outlines for documents, such as reports or safety briefings
- Editing and suggesting improvements to existing text
- Suggesting potential outlines for fundraising campaigns, membership drives, etc.
- Finding the answers to questions about laws and legislation
Inevitably, AI tools have become an important issue for nonprofits. For some, AI has made their workload seem a lot less intimidating. For others, it has raised a number of very legitimate questions. And for many, it has become one more way to make a positive impact in the world.
Why does AI matter for nonprofits?
AI, when aligned with the human-centered goals of nonprofits, can be a tremendous force for good.
The Google.org accelerator program, for instance, supports nonprofits in developing high-impact applications of generative AI. These include the World Bank – which is looking to make research more accessible – and Jacaranda Health – which is using AI to scale digital health services for mothers in underserved areas.
AI isn’t only for the big organizations developing new AI-powered solutions for the world. It’s for any nonprofit organization that wants to speed up routine tasks so its people can focus more on strategy, building relationships and decision making.
For example, AI can be very helpful for:
- Fundraisers looking to quickly summarize the results of their last campaign.
- Marketers looking to send personalized content out regularly to members.
- Admin assistants looking to speed up day-to-day comms and notetaking, so they can use their eye for detail where it’ll matter most.
Let’s talk about it: “Is AI going to mean I lose my job?”
As far as we can tell, you don’t need to worry about your job. According to the World Economic Forum, the professions most exposed to AI are roles like bank tellers, cashiers, clerks and accountants. And the more you’re ready to keep learning (not just technology skills but other skills, too), the more indispensable you’ll continue to be.
Your role can still be vital. The World Economic Forum’s analysis of the top 10 skills on the rise finds that the two skills set to grow in importance most between now and 2027 are:
- Creative Thinking
- Analytical Thinking
For all its capabilities, AI still can’t actually think. For now, that’s a job reserved solely for humans.
This is what leaders in the nonprofit sector are saying. As Amy Sample Ward, CEO of the Nonprofit Technology Enterprise Network (NTEN), told the Stanford Social Innovation Review: “I think one of the simplest and most important guidelines is that tools should not make decisions. That’s been a core part of NTEN’s internal approach.”
So if you’re someone who wants to make decisions in your organization – thinking creatively or analytically about how you can impact the world – you can probably stop worrying about the job market and start exploring how AI can give you more tools to do exactly that.
An example of how a nonprofit can use ChatGPT (and similar tools)
With an AI chat tool like ChatGPT or Bing Copilot, or even better, Boardable’s Consultant, which is trained specifically for nonprofits, you can quickly get drafts of material you might use in your regular work.
Here’s a prompt we gave ChatGPT 4.0. The aim was to get a first draft and rough structure for an email to a high-value donor.
“Can you write an email to a high-value donor. Write from the perspective of a major sports charity for disadvantaged young people which is looking to build upon successes of the last 12 months.”
When the initial response wasn’t quite what we were looking for, we added more prompts to help get the output we wanted:
- Can you write it with a slightly less formal tone?
- Can you include that we helped 7,562 underserved young people last year?
- Can you invite the donor to visit one of the sports clubs they’re helping through their ongoing support?
The full response is too long to include here, but here’s some of what it came up with:
Hi [Donor’s Name],
I hope this email finds you well! I wanted to reach out and personally thank you for your amazing support over the past year. Your generosity has truly made a world of difference for the young people we serve at [Charity Name].
With your help, we’ve had a fantastic year:
- Reaching More Kids: We’ve been able to support 7,652 underserved young people, giving them opportunities to thrive through sports.
- New Spaces: We opened two new sports centers, creating safe and inspiring places for our participants to grow and play.
- Real Impact: Our surveys show that our participants are healthier, doing better in school, and feeling more confident.
We’d love to show you firsthand the impact of your generosity. Please consider this an open invitation to visit one of our sports clubs. It would be great to have you meet some of the young people whose lives you’ve helped change.
Thank you again for being such a vital part of the [Charity Name] family. Together, we’re making a real difference through the power of sports.
Best regards,
[Your Full Name]
[Your Position]
[Charity Name]
[Contact Information]
[Website URL]
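
As an aside, if your team ever wants to generate drafts like this programmatically (say, to pre-fill templates inside your own systems) rather than working in the chat window, the same prompt-and-refine flow can be scripted. Below is a minimal sketch in Python, assuming the official openai package, an OPENAI_API_KEY environment variable and the gpt-4o model; the prompts are the ones from the example above, and the model name is simply a placeholder for whichever model your organization has access to.

```python
# A minimal sketch of the prompt-and-refine flow above, scripted against the
# OpenAI API. Assumes the official `openai` package (v1+) is installed and
# OPENAI_API_KEY is set in the environment. The model name is a placeholder.
from openai import OpenAI

client = OpenAI()

messages = [
    {
        "role": "user",
        "content": (
            "Can you write an email to a high-value donor. Write from the "
            "perspective of a major sports charity for disadvantaged young "
            "people which is looking to build upon successes of the last 12 months."
        ),
    }
]

# First draft
first = client.chat.completions.create(model="gpt-4o", messages=messages)
draft = first.choices[0].message.content

# Keep the draft in the conversation, then refine it with follow-up prompts,
# just as we did in the chat window.
messages.append({"role": "assistant", "content": draft})
messages.append(
    {
        "role": "user",
        "content": (
            "Can you write it with a slightly less formal tone? Please also "
            "include that we helped 7,562 underserved young people last year, "
            "and invite the donor to visit one of the sports clubs they're helping."
        ),
    }
)

refined = client.chat.completions.create(model="gpt-4o", messages=messages)
print(refined.choices[0].message.content)
```

However the draft is produced, it still needs the same human review we cover in the next section.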
How you can improve the content AI produces for your nonprofit organization
Now, the example of a donor letter above isn’t a bad starting point considering we had a blank page a few seconds earlier, but there are a couple of issues with it. First, ChatGPT made some data points up. Second, the output is a little generic for an email that should feel personal.
That’s totally normal for AI. Often your first few attempts at generating text or images will be off the mark, but there are ways to refine what AI produces for you.
To get closer to the type of content your organization will want to use, you can:
- Refine your prompts and learn the art of “prompt engineering”
- Edit the produced text yourself
- Copy an existing article or email into the prompt box and ask the tool to “write in this style”
For AI-generated images, you can also try the following:
- Use an image editing tool to fix an AI-generated image
- Use your own images instead of a prompt for reference (this is possible with tools such as Adobe Firefly)
More AI tools you can use for drafting
A survey published in the We and AI report Grassroots and nonprofit perspectives on generative AI found that most nonprofits are using AI for advertising, marketing, PR and comms. This is also where most AI tools seem to put their focus.
iWave’s NonprofitOS, for instance, is a fundraising tool that offers ChatGPT-like functionality tuned for the nonprofit sector. Grantable works in a similar way but focuses specifically on grant writing. And Gravyty’s Raise combines text generation with CRM-like analytics, so it can not only draft donor letters but also recommend who you should send them to and when.
Some larger nonprofit organizations are committing developers to integrate AI with their own recruitment software. An internal recruiter can ask the recruitment tool to help them hire a new policy worker, for instance, and the tool will suggest role-appropriate content for the job description, along with interview questions.
How you can use AI to access and summarize information
In the We and AI report we mentioned earlier, the second most popular use of AI in nonprofits was found to be research and development. There are a number of tools available to help here, from workplace integrations like Google Gemini to tools like Census GPT that can pull in external data.
One worker in a government organization told us how they use Microsoft Copilot (available through Microsoft 365) to pull together and summarize documents in their database. They say it’s been a handy way to interrogate a mass of information and bring it together clearly.
Similarly, the National Geographic Society uses AI-powered image recognition software to catalog images so they can be retrieved again with ease. If your organization has a large archive of images or other data, a similar tool (whether it’s Google Cloud Vision or Amazon Rekognition) could be vital.
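
To give a flavor of what that looks like in practice, here is a minimal sketch of asking Amazon Rekognition to suggest labels for a single archived photo, so the labels can be stored as searchable tags. It assumes the boto3 package, AWS credentials already configured, and placeholder bucket and file names; Google Cloud Vision offers a very similar label-detection call.

```python
# A minimal sketch: ask Amazon Rekognition to suggest labels for one archived
# photo so it can be catalogued and searched later. Assumes boto3 is installed,
# AWS credentials are configured, and the bucket/key names are placeholders.
import boto3

rekognition = boto3.client("rekognition")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "our-image-archive", "Name": "events/club-open-day.jpg"}},
    MaxLabels=10,
    MinConfidence=80,
)

# Each label comes back with a confidence score; store these as searchable tags.
for label in response["Labels"]:
    print(f'{label["Name"]}: {label["Confidence"]:.1f}%')
```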
Another use is in sentiment analysis. If you have a number of free-text responses in a survey, AI tools can help you to identify patterns quickly – though you’ll undoubtedly want to verify the findings thoroughly if you’re looking to publish them or make business-critical decisions.
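
To show how simple a pilot can be, here is a minimal sketch that scores a few invented survey responses with NLTK’s VADER sentiment analyzer, a lightweight, non-generative approach. It assumes the nltk package is installed; a generative model could do the same job with a classification prompt, but either way a person should review the results before they inform decisions.

```python
# A minimal sketch of quickly scoring free-text survey responses for sentiment
# using NLTK's VADER analyzer. The example responses are invented; real
# findings should still be verified by a person before being acted on.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-off download of the lexicon

responses = [
    "The new sports club has been brilliant for my daughter's confidence.",
    "Sessions are too far away for us to attend regularly.",
    "It's okay, but we'd love more weekend activities.",
]

analyzer = SentimentIntensityAnalyzer()
for text in responses:
    # "compound" ranges from -1 (very negative) to +1 (very positive)
    score = analyzer.polarity_scores(text)["compound"]
    print(f"{score:+.2f}  {text}")
```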
How you can reduce the risks of AI misuse at a nonprofit
While it’s good for nonprofits to experiment with AI, you want to reduce the risks involved. So your organization will want to appoint someone to provide oversight of your nonprofit’s use of AI – and perhaps you’re the right person for the task!
You’ll want to develop policies that guide your organization in AI’s proper use. Here are a few checks to consider if you’re just getting started:
1. Always check AI-generated content for accuracy
Even ChatGPT’s welcome message warns that it’s prone to error, so AI’s fallibility is no secret. And it’s not just AI-generated text that can hallucinate. Errors are also common in generated images: AI will often misspell any words that appear in an image, and sometimes body proportions come out wrong.
For instance, the image below was created using Adobe Firefly, an image generation tool. And…just look at the hand in the bottom left.
A margin for error is not a major problem in and of itself. You just need to ensure you have a process for checking the details are in order before using AI-generated content in research or public communications.
2. Define when AI tools are appropriate for content creation
Here’s the thing: AI outputs are shaped by the data the model was trained on – and you can’t control that training data.
If accurate, authentic image depiction is going to be more important than speed, you might be better off arranging to use an actual photo rather than generating a mockup of one.
Alternatively, you might be better off depicting the situation in a less photorealistic way.
For instance, when we asked Adobe Firefly to provide an image of “A mangrove conservation project by the coast in the Philippines”, the images it produced were uncanny – highly realistic but incorrect in a few crucial ways.
We then changed the image settings so Firefly would produce the same image in the style of a painting. The result offered a far more sensitive depiction of its subject matter, even if the leaves look a little autumnal for a tropical region – an issue that might be fixed with further prompting.
3. Define when research must be done by a human
Some policy and research workers in nonprofit organizations are using AI to analyze government guidance documents and summarize their contents. Others would argue that the reduced level of scrutiny, and the margin for error that comes with it, is unacceptable. You’re going to have to decide as an organization where to draw the line.
One manager in a major mental health charity in London told us they’d been looking into using an AI chatbot as a qualitative research tool. After weighing the issue, however, the organization decided against using the tool for safeguarding reasons – for now.
This is just the beginning of AI development – so stay tuned
Generative AI technology is still in its infancy. But just as children can suddenly develop rapidly and all at once, AI is going through its own growth spurt. If you try out an AI tool once and then return to it a few months later, you’ll notice just how much it has grown up.
Software platforms that embed AI into their products – Boardable, for instance – are putting safeguards in place to protect against many of its potential pitfalls. You shouldn’t be dissuaded from using AI; you just need to use it with your eyes wide open.
Visit here for more details on the latest innovations at Boardable.