It's 11:14 PM on a Tuesday. You just left the hospital after sitting with a family whose father is in the ICU. You're in the parking lot, engine running, and you open ChatGPT on your phone. You type: "Write a follow-up email to a church family after a hospital visit. Warm, pastoral, mention that we're praying."
Thirty seconds later, you have a perfectly fine email. Warm. Compassionate. Well-structured.
And something feels off.
Not because the email is bad. It's actually better than what you'd write at 11 PM with nothing left in the tank. The discomfort is somewhere deeper. You just spent two hours being fully present with a family in pain. Now you're outsourcing the follow-up to a machine. Where's the line? Should there be one?
That tension is worth paying attention to. Because right now, most churches are adopting AI much faster than they're thinking about it.
The Gap Between Adoption and Intention
Churches are using AI at a remarkable rate, but almost none of them have decided how they should be using it. Exponential NEXT's 2025 State of AI in the Church survey found that 91% of church leaders support AI use in ministry and 61% use it frequently. Yet 73% have no AI policy of any kind.
A Barna Group and Pushpay study from late 2025 (surveying 1,306 church leaders) puts the number even lower: only 5% of churches have formal AI guidelines.
Think about that. Nine out of ten church leaders are on board with AI. Fewer than one in ten have talked through what "on board" actually means for their context.
This isn't a technology problem. It's a leadership problem. And it's understandable. AI tools arrived fast, they're immediately useful, and the daily pressure of running a church doesn't leave much room for philosophical reflection. But "we'll figure it out later" isn't a strategy. It's how you end up with a volunteer using ChatGPT to write a condolence card to a grieving widow, with nobody realizing it happened until the widow notices.
Why Existing Tech Ethics Don't Quite Fit
Secular AI ethics conversations tend to focus on bias, privacy, labor displacement, and misinformation. Those concerns matter for churches too. Barna's research shows 83% of church leaders are concerned about data privacy and 51% about plagiarism and message integrity.
But ministry adds dimensions that most ethics frameworks weren't built for.
When a pastor prepares a sermon, the struggle with the text is part of the calling. Ed Stetzer, writing for Biola's Good Book Blog, argues that the process of wrestling with Scripture is "formative for the preacher." Skip that process, and you might deliver a polished message, but you've bypassed something the Holy Spirit uses to shape both the sermon and the person delivering it.
The Vatican's January 2025 document "Antiqua et Nova" makes a broader point: AI must always respect "the supreme value of the dignity of every human being." The ERLC's Evangelical Statement of Principles on AI centers the same conviction. Human dignity. Both documents are worth reading. But neither one will tell your worship pastor whether it's okay to use AI to draft the weekly email newsletter.
That's the gap. The theology is sound. The practical application is missing.
Five Questions for Every AI Use in Ministry
Here's a framework. Five questions you can ask before using AI for any task in your church. They're not perfect, and your answers will differ from mine. That's fine. The point is to have a structured way to think through each decision instead of defaulting to "it's faster, so why not?"
1. Does this task require my presence, or my output?
Some ministry tasks matter because you did them. Sitting at a hospital bedside. Praying with a couple before their wedding. Looking a teenager in the eye and telling them they belong. The value is in your presence, not the words.
Other tasks just need to get done well. A volunteer schedule. A facilities request. A reminder email about the potluck. AI can handle output tasks without anyone losing something important.
The hospital follow-up email from the opening of this post? That's in the gray zone. The visit itself required presence. The email is output. But it's output connected to a deeply personal moment. Worth pausing on.
2. Am I shortcutting a process that's spiritually formative?
Sermon preparation is the clearest example. Only 12% of pastors say they're comfortable using AI to write sermons, and the instinct behind that number is sound. The hours spent sitting with a passage, reading commentaries, praying through application: that work shapes the preacher as much as the sermon.
But this principle extends beyond preaching. Writing a prayer for a specific person. Thinking through how to counsel a couple in conflict. Preparing a lesson for a small group. If the preparation itself is part of how you grow as a minister, outsourcing it costs more than time saved.
3. Would my congregation feel deceived if they knew?
This is the transparency test. If your church found out you used AI to draft the annual giving letter, would they shrug or would they feel misled? What about the welcome email new visitors receive? What about a blog post on your church website?
The answer varies by context, and that's exactly why you need to answer it deliberately. Barna's data shows that 49% of church leaders already worry about loss of authenticity. Your congregation is probably thinking about this too, even if they haven't said so.
My opinion: default to disclosure. "Our team uses AI tools to help draft communications, and every message is reviewed and edited by a real person." That sentence costs nothing and builds trust.
4. Does this protect or expose the people we serve?
Churches handle sensitive information. Prayer requests about marriages falling apart. Giving records. Counseling notes. Health updates shared in confidence.
Before typing any of that into an AI tool, ask who else might see it. Most consumer AI tools (ChatGPT, Gemini, Claude) may use inputs to improve their models unless you opt out. That prayer request about a member's addiction could theoretically become training data. The data privacy concern Barna documented among church leaders is justified.
Practical rule: never put personally identifiable information about a church member into a consumer AI tool without first understanding and configuring its privacy settings.
5. Who benefits most: the people we serve, or our own convenience?
This is the motive check. Using AI to free up ten hours a week so you can make hospital visits, mentor young leaders, and be more present with your family? That's using a tool in service of your calling. Using AI to do the same work in less time so you can do... less? That's a different thing.
The question isn't whether AI saves time. It does. The question is what you do with the time it saves.
Where the Lines Seem Clearest
Opinions will differ on the gray areas, and that's healthy. But some categories seem fairly settled among the church leaders I've talked to and the research I've read.
Widely accepted uses: Scheduling and calendar coordination. Data entry and record keeping. First drafts of administrative communications that get reviewed by a person. Facility management. Social media post ideas (not final copy). Research and brainstorming.
Widely concerning uses: AI-generated pastoral care without human review. Unedited AI sermon content delivered as the pastor's own words. Theological counsel from a chatbot. Inputting confidential member information into consumer AI tools.
Gray areas worth discussing with your team: Visitor follow-up email templates. Giving campaign messaging. Blog and newsletter content. Bulletin announcements. Small group curriculum suggestions.
The gray areas are where your five-question framework earns its keep.
Writing Your Church's AI Ethics Statement
You don't need a 30-page document. You need 90 minutes with your staff and these five questions applied to the ten most common tasks where someone might reach for AI.
Write down what you agree on. Where you disagree, note the disagreement and revisit it in six months. Publish a simple statement to your congregation: here's where we use AI tools, here's where we don't, and here's why.
The ERLC has published a free guide for churches working through these questions. It's a solid starting point.
The churches that will use AI well are the ones that decided how to use it before a problem forced the conversation. That window is still open. But with 91% of leaders supporting AI and only 5% of churches holding policies, it won't be open long.
Written by the Flowbudd Team. For more on church technology and leadership, subscribe to our newsletter for weekly insights that help your church run better.