Doors Open AI

Here are the industries adopting AI functions at the highest and lowest rates.


Highest rate of adoption

18.1% information
12% professional, scientific, and technical services
9.1% educational services

Lowest rate of adoption

1.4% construction
1.4% agriculture, forestry, fishing, and hunting
1.5% transportation and warehousing


Data taken from the U.S. Census Bureau’s Business Trends and Outlook Survey “AI supplement,” which combines estimates from data collected December 4, 2023, to February 25, 2024.

If you’re regularly conducting serious Bible study, or you’ve taken a formal theology course in the last decade, you’re likely familiar with Logos Bible Software. Actually, chances are you’re one of its two million users. If you’re not familiar, here’s the deal: Logos is basically a digital library of Bibles and around 250,000 Bible-related reference works. The selling point is that a few keystrokes produce the kind of research that once took hours or days, and untold miles of travel.

By now, it goes without saying that any business that depends on electricity is rolling out something with — or at least labeled with — artificial intelligence. Logos is no exception. Yet the stakes — and opportunities — seem different for a tool in many cases accessed for the spiritual growth of users and those they influence, don’t they? The addition of AI at least presents new questions about what happens when technology mixes with Bible study.

In December, Common Good editors posed some of these questions to Logos’ chief product officer, Phil Gons.

You’re a tech company just as much as a research company. How do you evaluate technologies for use?

We evaluate all technologies through a biblical worldview that considers mankind’s creation in God’s image, the creation mandate to rule over and care for the world he’s placed us over, the Fall, redemption, and the long-awaited restoration of all things. Technology is the application of our God-given knowledge to solve problems and steward the created order, and every technology can be used for great good or great evil. We consider the risks and rewards, the benefits and the drawbacks, to each new technology, and we make every effort to ensure that our users will use it responsibly.

Everybody wants to talk about AI. It’s the latest and greatest technological advancement that everyone is consumed with, and it deserves to be talked about. But generative AI is merely the next evolution in information technology. Aspects may be unique, but at one level it’s just another application of our God-given skill and ingenuity to solve problems — just like humanity has been doing for millennia.

And you’re integrating AI functions into your search?

Search is one of the most important parts of Logos. In the past we had what is called lexical search, which focuses on the words or phrases you’re interested in finding. If you search for a term like “w-i-n-d,” it could mean the verb “to wind” or it could mean “the wind” that blows, and lexical search doesn’t know the difference between them. It’s just going to find the term, and then you have to sort through to find what you’re looking for.

But semantic search is different; it’s interested in meaning, not in terms. That’s what Google’s been doing for a long time. People have wanted Logos to be able to search like Google for many years. Google has vastly more engineers and resources; it’s hard for us to keep up with Google and other major tech companies. But AI has really changed the game, because everyone is opening up access to their APIs. We can now finally deliver a semantic search experience that finds what you’re looking for without requiring you to search with the right words. This enables users to ask questions about the Bible or theology and get relevant results back regardless of the particular terms they use.
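To make the distinction concrete, here is a toy sketch of the two approaches Gons describes. The documents, the tiny hand-built “embedding” vectors, and the cosine-similarity ranking are illustrative stand-ins only (a real system would get vectors from a language model via an API); this is not Logos’ implementation.

```python
import math

docs = {
    "doc1": "The wind blew across the Sea of Galilee.",
    "doc2": "Wind the scroll back to the first column.",
    "doc3": "A storm and strong gusts struck the boat.",
}

def lexical_search(query, docs):
    """Match the literal term: both senses of 'wind' hit, synonyms don't."""
    q = query.lower()
    return [d for d, text in docs.items() if q in text.lower()]

# Toy 2-D "embeddings" standing in for model-generated vectors:
# first coordinate ~ weather-related meaning, second ~ twisting/turning.
embeddings = {
    "doc1": [0.9, 0.1],
    "doc2": [0.1, 0.9],
    "doc3": [0.8, 0.2],          # weather-related, never says "wind"
    "weather wind": [0.95, 0.05],
}

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

def semantic_search(query, docs):
    """Rank documents by similarity of meaning, not shared words."""
    qv = embeddings[query]
    return sorted(docs, key=lambda d: cosine(embeddings[d], qv), reverse=True)

print(lexical_search("wind", docs))           # ['doc1', 'doc2'] — wrong sense included
print(semantic_search("weather wind", docs))  # ['doc1', 'doc3', 'doc2'] — synonym doc surfaces
```

The lexical pass returns the scroll-winding document and misses the storm document; the semantic pass ranks both weather documents above it, which is the behavior the answer above attributes to meaning-based search.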

Is there a part of integrating this technology into your product that concerns you?

One risk that comes with AI is plagiarism, stealing the ideas and words of someone else and falsely presenting them as your own. That’s not unique to AI: plagiarism isn’t new. Sometimes it happens accidentally; sometimes it happens on purpose. AI makes it easier to accidentally plagiarize, because it often obscures the source of the ideas and the words that it generates.

Our use of AI anticipates this risk and makes an effort to ensure that the user knows where the ideas and words come from, taking them to the sources, which they can quote or paraphrase and cite appropriately. How to reference AI is an evolving space, and we’re still working some of this out. But we want to make it easy for users to cite their sources and acknowledge appropriately when they utilize AI in some form.

Can you say more about that?

We know how to footnote someone else’s work. Do we yet know how to cite help we get from AI? With text, generative AI is an advanced form of autocompletion. Today, when you use just about any editor, like an email editor or a word processor, it’s going to try to complete your phrase or your clause based on patterns. That’s a large language model making predictions based on the words you just typed. If you hit the “tab” key and accept an autocompletion, do you need to cite that? What if you weren’t going to type those exact words? Do you need to credit AI? How much text does AI need to generate before you need to give it credit? There’s not an easy answer to that question yet. But if you go to AI and say, “Generate me a sermon on this or that topic,” and then copy and paste it and use it as is, you’ve crossed a line.

We want to help people take thoughtful, cautious steps in using AI primarily in places where it’s safe, which is helping people find information, learn, and then create something from what they’ve learned — rather than creating the outputs for them. We’re focused primarily on information retrieval and ideation rather than content creation.

And then there’s the issue of study habits, of making Bible study something just to knock out. Could there be a threat there, too?

As with all forms of technology, it can promote laziness. One of the benefits of virtually every technology is that it saves us time. When it comes to our use of technology, the question becomes what to do with that extra time. Do you go watch more Netflix or swipe through TikTok videos, or do you take that saved time and engage more deeply and thoughtfully with Scripture because you can do more Scripture engagement in a smaller amount of time? Or do you use that to go minister to people, to spend time with your family? If you’re a busy pastor, do you have more time to counsel and visit people, or do you have more time for the golf course? Technology is an enabler, and it’s up to us to use what it enables to love God and others more.

One of the concerns I’ve heard from ethicists is the inability of most people to fact-check AI outputs, or even fully understand some of what these algorithms do. How do you mitigate that?

Citing the sources for where the ideas in our search synopsis come from — and highlighting the relevant section in the book — is critical to our responsible use of AI, as this allows users to dig in and verify for themselves that these ideas weren’t made up by AI. We’re very explicit about telling users when we’ve used AI to produce results, and we make it clear that the output may not be comprehensive, accurate, or relevant. We encourage our users to use discernment and check the sources for themselves. We see AI as a way to get users pointed in the right direction with access to the most relevant information faster, so they have more time to study — or serve.

It’s worth remembering that human authors are fallible, too. Just because you found it in a book doesn’t mean it’s true. We also need to be responsible and check the sources behind human-generated content. Human authors aren’t inerrant any more than machines are. One of the things that we try to do in general is encourage people to be like the Berean Jews in Acts 17, who questioned Paul and searched the Scriptures to see if the things he said were true. Don’t take AI’s word for it — or any human author’s, for that matter. Dig in and validate it for yourself.