Lately, I have noticed a pattern in meetings with clients and colleagues. At some point, the conversation always turns to artificial intelligence (AI). The concern is usually the same: are we heading toward a point where AI replaces communications teams?
It’s a fair question, especially given how quickly these tools have become part of our daily work.
For instance, the 2025 Global AI Survey by McKinsey & Company found that 88 percent of businesses report regular AI use in at least one business function, up from 78 percent a year earlier, with generative AI adoption continuing to rise sharply.
In communications, that shift is already visible in day-to-day work, with AI tools now routinely used to draft everything from internal memos to full campaign messaging.
According to the 2026 State of PR report, 81.5 percent of senior professionals are very familiar with AI tools like ChatGPT and Gemini and actively use them to ideate, draft, and produce messaging.
There’s no denying the benefits. Work that used to take hours can now be done in minutes, and the output is often clean, structured, and easy to work with. But having spent years in this field, I don’t think job replacement is the real issue we should be worried about.
What concerns me more is something less obvious, but more damaging over time: the rise of what I would call fake authenticity. AI is very good at producing language that sounds right.
It can create content that feels polished, thoughtful, and balanced. The challenge is that many of these tools are trained on similar datasets, so they tend to produce similar outputs.
Over time, you begin to notice that messages across different brands and organisations start to sound alike. Everything is well-written, but very little feels distinctive.
As communications professionals, we know that effectiveness is not just about how something is written. It’s about the thinking behind it, the clarity of the message, and the audience’s understanding.
AI can support the writing process, but it cannot replace the judgment and context that give communication its meaning. While audiences may not always be able to explain why something doesn’t resonate, they can almost always feel it.
When a message is too generic or lacks a clear point of view, people disengage, and it fails to leave a lasting impression.
This becomes even more important in a context where trust is already fragile.
According to the 2025 Edelman Trust Barometer, trust in institutions remains under pressure globally, and audiences are becoming more selective about what they choose to believe. In that kind of environment, sounding polished is not enough; to communicate effectively, a message must feel credible, authentic, and grounded.
We are all witnesses to this on digital platforms. Content that connects tends to be direct, clear, and human, even if it is not perfectly refined. On the other hand, overly polished messaging often struggles to gain traction because it feels distant or overly manufactured. In African markets, for instance, this challenge is even more pronounced.
Communication here is deeply shaped by culture, language, and lived experience. Tone matters. Context matters. Nuance matters. Yet many AI tools are trained primarily on Western data, which means that without careful adaptation, the output can feel generic or slightly out of place.
That is where the real risk lies. Not that AI replaces communicators, but that it gradually erodes the distinctiveness of how we communicate. It also creates a temptation to take shortcuts.
It becomes easier to generate content quickly rather than think through what really needs to be said. It becomes easier to rely on familiar patterns instead of developing original ideas. Over time, that can weaken the quality of communication, even if the output appears strong on the surface.
The way forward is not to avoid AI, but to use it more deliberately. It works best as a support tool, helping to structure ideas and improve efficiency, while leaving the core thinking to the communicator. It should help sharpen your message, not define it.
There is also a need for a stronger focus on point of view. In a space where anyone can generate content, the real differentiator is clarity of perspective. Audiences respond to messages that feel intentional and specific, not ones that try to say everything without saying anything clearly.
In the end, the role of communication has not changed. It is still about building trust and creating meaningful connections. AI can help us do that more efficiently, but it cannot do it for us. And in a world where so much content sounds human, the real advantage will belong to those who remain genuinely so.
Abel Muhatia is a senior consultant in crisis PR, strategic communications, and reputation management.