For the past year, conversations in newsrooms around the world have been dominated by artificial intelligence. Some journalists worry that AI will eventually replace them; others believe it could transform the industry overnight. The reality inside most newsrooms is far less dramatic: AI tools are being adopted slowly, with only minimal integration into daily workflows.
This hesitation is not for lack of technology. AI applications that assist with research, writing, and content formatting are readily available. Nor are journalists unwilling or unable to learn them; many are eager for anything that eases their workload. The root cause lies deeper, in the structural setup of news organizations, a problem that remains largely unaddressed and often unacknowledged.
Recent research from Anthropic, the company behind the AI model Claude, sheds light on this issue. By analyzing millions of real-world AI conversations across professions, the study examined how AI is actually being used in workplaces rather than speculating about its future impact. Comparing those findings with AI's theoretical potential in each field revealed a striking disparity between what AI could assist with and what it actually does.
Theoretical AI exposure is the share of a job's tasks that AI could technically support. Journalists read reports, conduct background research, draft articles, and adapt content for multiple platforms, and AI can assist with most of these activities, which places journalism among the professions with the highest potential for integration. Yet actual use of AI tools remains low, leaving a wide gap between potential and practice.
The numbers illustrate the pattern. Computer and mathematical occupations show 92 percent theoretical AI exposure but only 30 percent actual usage. In business and finance, the potential stands at 85 percent against 14 percent adoption. Office administration roles have 87 percent potential but only 11 percent utilization, and the legal sector, despite 80 percent potential, sees a mere 6 percent actual use. Arts and media, the category that includes journalism, has roughly 62 percent theoretical coverage, yet only about 13 percent of tasks are currently supported by AI.
This gap holds across knowledge-based professions and beyond. Even in construction, where AI's theoretical exposure is only 14 percent, actual use is just 3 percent. No one worries about AI replacing plumbers, yet in journalism the alarm is far out of proportion to the actual level of adoption: the fear of displacement grows louder while practical use of AI tools remains minimal.
It is crucial to understand what AI cannot do before advocating for its widespread adoption in newsrooms. The technology is incapable of performing some of the most essential and nuanced aspects of journalism. AI cannot build relationships with sources, nor can it interpret the subtle cues during a tense interview with a government official. It lacks the instinctive judgment of an experienced editor who senses when to hold a story back for further verification. Moreover, AI cannot foster the trust within communities that leads to exclusive information and unique insights.
These human elements lie at the heart of quality journalism and remain beyond the reach of AI. Therefore, the fear that AI will replace journalists is misplaced. Instead, the real challenge AI poses is to the routine, mechanical tasks that often surround genuine reporting in many newsrooms. This includes rewriting stories from other outlets simply to meet content quotas or producing filler material that serves little purpose beyond occupying space. If AI makes it harder to justify such low-value work, it is not undermining journalism; rather, it is helping to refocus efforts on meaningful reporting.
The tools that could transform newsroom efficiency are already available and functional. Consider the traditional workflow: an eighty-page government report would require hours of painstaking reading to identify the key points. With AI, this process can be reduced to mere minutes, allowing journalists to quickly extract essential information while still verifying facts before publication. Similarly, a recorded forty-five-minute interview that once demanded several hours of transcription can now be converted into searchable text in under ten minutes, freeing up valuable time for analysis and storytelling.
Furthermore, adapting a single story for multiple platforms—such as websites, social media, and mobile notifications—used to take twenty minutes or more. AI can now accomplish this in just a couple of minutes. Editors can experiment with multiple headline variations in the time it once took to craft just one. These efficiencies do not replace editorial judgment but rather eliminate the tedious tasks that add no value to readers, allowing journalists to dedicate more time to investigation, verification, and thoughtful reporting.
Despite these clear benefits, many newsrooms have yet to make the shift. I saw this firsthand at a news website whose traffic was declining because its digital strategy had stagnated. The site was essentially republishing print stories online, with little original material created for digital audiences. I encouraged the team to produce more content tailored for the web, reaching beyond traditional readership, and proposed integrating AI tools, not to replace reporting, but to handle time-consuming tasks like transcribing interviews and summarizing complex documents.
The pushback was immediate. Some team members dismissed AI as a threat to the integrity of journalism, as "not real journalism." Ironically, this was the same group that spent much of its time rewording stories from other outlets to fill content quotas, the very practice AI could streamline or eliminate. The experience underscored a vital truth: authentic journalism is defined not by the absence of technology but by how journalists apply the tools at their disposal.
When journalists gather information, verify facts, analyze data, and produce content that serves the public interest, the quality of journalism is preserved regardless of the tools used. AI is simply the latest advancement in a long line of technologies—from search engines to transcription software—that have enhanced the newsroom toolkit over the years.
However, careless adoption of AI introduces risks of its own. A notable example occurred in Pakistan in November 2025, when a leading English-language newspaper published a business story on automobile sales that still carried, at the bottom, leftover AI chatbot text offering to produce a snappier headline. The line quickly went viral on social media, exposing a failure of editorial oversight rather than a flaw in the technology.
This incident serves as a cautionary tale: AI does not eliminate the responsibility of thorough editing and fact-checking. In fact, it raises the stakes, making it imperative for newsrooms to maintain rigorous review processes. When editors fail to carefully proofread content before publication, errors—whether factual or procedural—can slip through, undermining trust and credibility.
Another challenge lies in the underutilization of data analytics to inform editorial decisions. Recently, a colleague observed unusual patterns in website traffic, but the conversation did not progress beyond that initial observation. Critical questions about changes in search queries, the performance of specific pages, or variations in reader engagement across different story formats were left unexplored. Although the necessary data was readily available in analytics dashboards, it was not being leveraged effectively to guide content strategy or improve audience reach.
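Answering those questions does not require a data team. A few lines of analysis over an exported dashboard spreadsheet can surface how engagement varies by story format. A minimal sketch in Python with pandas, using a made-up table in place of a real export (the column names and figures here are illustrative assumptions, not from any actual dashboard):

```python
import pandas as pd

# Hypothetical analytics export: one row per story.
# In practice this would come from pd.read_csv("analytics_export.csv").
stories = pd.DataFrame({
    "format": ["explainer", "explainer", "rewrite", "rewrite", "original", "original"],
    "pageviews": [1200, 900, 300, 250, 2400, 1800],
    "avg_read_seconds": [95, 110, 20, 25, 140, 160],
})

# Engagement by story format: which formats actually earn readers' time?
by_format = (
    stories.groupby("format")
    .agg(total_views=("pageviews", "sum"),
         mean_read=("avg_read_seconds", "mean"))
    .sort_values("mean_read", ascending=False)
)
print(by_format)
```

Even a toy summary like this makes the trade-off visible, for instance that rewritten wire copy may draw clicks while holding readers for only seconds, a pattern a raw traffic graph never reveals.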
AI holds real promise for journalism: it can automate routine tasks and free time for reporting. But its adoption will remain limited until newsrooms confront the structural barriers and cultural resistance that hold it back. Embracing AI responsibly requires not only technological investment but also editorial discipline and a willingness to rethink traditional workflows. Done well, it gives journalism back the time and resources to focus on what truly matters: producing accurate, insightful, and impactful stories that serve the public interest.
