Dan Kelley

A few months ago, when the ideas just weren’t flowing and I needed some creative defibrillation, I started experimenting with AI-generated content.

I was hoping that AI would make writing easier. But the results fell well short of the level of professional copy that any of us would share with clients. The results were too generic, often off-point and sometimes bizarre.

Still, I came to the conclusion that, despite those shortcomings, AI-generated content is already quite pervasive.

Take, for example, a chicken soup recipe. A search on Google or Bing will return thousands of results. The algorithms decide what appears at the top. They prioritize long articles and pages where readers scroll deep into the content. To the algorithms, those signals indicate not that the recipe is good, or even edible, but that the content itself is “useful.”

I’m convinced that a good amount of this copy has been generated by AI. It’s repetitive, vapid, and bland. But if a creator can slap a chicken soup recipe at the bottom of a 2,000-word behemoth and readers scroll all the way down to find it, search algorithms mistake that deep scrolling for quality. The filler isn’t meant to be read. It’s merely meant to fool a search engine into believing that the content is useful.

OpenAI’s ChatGPT changes everything. The copy it generates is actually pretty good. It’s still bland, but it’s far more serviceable than previous generations of AI.

Framing the Discussion

It’s time for the PR industry and its clients to think about what the AI content future could look like:

The big question first. Will AI come for our jobs? I’m not worried. The most likely scenario for content creators is one in which humans and tools like ChatGPT work in tandem: we might rely on an AI to produce a competent first draft, while our role will be to give the content voice and personality, the kind that can only come from a close working relationship with the companies we represent.

Our clients hire us for our communications and messaging expertise, industry experience and judgment, and AI can’t replicate that. We’ll retain our roles as trusted advisors.

What about copyright and originality? A big question in the world of AI content is whether it’s just plagiarism by another name. Because many AIs are black boxes, the honest answer is that we don’t know how they arrive at the results they offer, and therefore can’t entirely rule out a claim that the work is less than original. There are plenty of other legal and ethical considerations along these lines, but these issues will likely get ironed out over the next three to 10 years. For most agencies, I think we’ll see an environment in which AI is deployed selectively while they develop competencies in this area.

On reputation. Assume for a moment that the legal and ethical challenges have been resolved. There remain other challenges. Compliance with OpenAI’s content policy requires an acknowledgement of its use. Would there be reputational issues associated with that? Would CEOs on whose behalf we craft thought leadership want to share a byline with an AI? Would publications accept articles that contain AI-generated content?

What it means for search and news. Over the past few years, news organizations and blogs have devoted significant resources toward answering simple, frequently searched questions like “What is ChatGPT?” or “How can I lose weight?” Even the New York Times does it. While evergreen articles may not have the immediate traffic juice of a Kardashian exclusive or political scandal, they have long-term search value against which to sell advertising.

Evergreen content strategies are probably headed for extinction. ChatGPT makes content so easy and so inexpensive to produce that the coming years will see a flood of it. There will almost certainly be a change in search algorithms to account for it. And there’s nothing stopping search engines from simply answering these questions themselves using ChatGPT (or even generating chicken soup recipes without making readers slog through a 2,000-word Sumerian history lesson). Users would almost certainly benefit.

On style. AI produces content that is, for the most part, generic. There are contexts in which commodity content is completely appropriate.

At the same time, content that is authentic, that has a clear and distinctive voice, that delights and surprises, and that includes details and expertise drawn from human experience will rise in importance. Ironically, the folks working on search algorithms will likely need to train an AI to recognize such content.

Humans will still have to decide which situations call for creativity, and which call for commodity.

AI can’t provide the value that we provide. In our work with businesses, we become extensions of our clients. We know their industries and their nuances. We build strong relationships with executives, learning their voices and their values. We help shape their messages and thought leadership so they are best received by the audiences they need to reach. None of this can be farmed out to an AI.

Where I’ve used AI. I’ve never used AI in client content. ChatGPT is certainly capable of producing client-ready copy, but the legal and ethical issues make me wary. One area where it has been useful, however, is in “brainstorming” questions for subject-matter experts. I recently asked ChatGPT to come up with six questions for a briefing. Four of them were usable, and only one of those required a little editing. I have tried to coax blog post ideas from it, but the results were still too generic. I imagine AI might be good for more complex tasks in the not-too-distant future.

A Bland Conclusion

I toyed briefly with offering my grandmother’s chicken soup recipe as proof that this essay was not written by an AI.

But my grandmother wasn’t a great cook. So I put the headline of this piece into ChatGPT’s interface. I offer you this essay, as written by ChatGPT and completely unedited:

‘The use of artificial intelligence (AI) to generate content is expected to have a significant impact on the PR industry. AI-generated content, such as news articles and social media posts, can be produced quickly and at a large scale, potentially making it a cost-effective alternative to human-generated content. This technology could also be used to personalize content for specific audiences, making it more engaging and effective.

However, the use of AI-generated content raises concerns about the potential loss of jobs in the PR industry, as well as the potential for the spread of misinformation. It will be important for PR professionals to stay informed about the development and use of AI in content creation, and to consider the ethical implications of this technology. Additionally, PR professionals will need to adapt their skills and expertise to continue to provide value in an industry that is being transformed by AI.’

It’s bland like raw tofu. You could swap PR for almost any other content-dominated industry and the results would be just as accurate. But it’s also absolutely pitch-perfect for certain situations, and it took less than 20 seconds to produce.

It’s a starting point—one we all need to think about.

***

Dan Kelley is content director at FINN Partners.