Friend or foe? Exploring AI’s impact on broadcast journalism

Posted on May 21, 2025 by FEED Staff
Artificial intelligence is making its mark on just about every industry. We hear from four industry experts on how best to move forward and leverage it for innovation
Words Katie Kasperson
While it’s been top of mind for years, it seems like artificial intelligence has finally hit a critical juncture. Companies are considering how and when to use it – but use it they feel they must, even if merely to keep pace with whatever industry they belong to. A contentious topic in the latest SAG-AFTRA and WGA strikes, AI feels most threatening to the creative fields (journalism most certainly included), and with recent waves of layoffs at companies like CNN, NBC News, ABC News and Fox Entertainment, it’s easy to understand all this existential dread.
Advancements in AI present exciting business opportunities and vast potential for innovation. With technological evolution always comes uncertainty, and perhaps an exaggerated fear of the unknown. It also raises new moral dilemmas, the likes of which we're wrestling with in real time.
We’re also witnessing a surge in both public and private discussions about AI, and it’s these very conversations that will shape how we ultimately adapt to it.
In the interest of healthy debate, we invited input from tech employees and broadcast journalists on the risks and rewards of incorporating AI. What follows is broad agreement from both sides, reassuring us that the ethical considerations around AI's use matter deeply to everyone.
Transformational impact
AI – artificial intelligence – can be difficult to define. It often conjures up distinct meanings for different people. Across the board, though, there appears to be a consensus: it holds the potential to transform how we work, consume information and interact with the world around us.
“This year, AI has taken significant strides towards becoming an integral part of our professional and personal lives,” begins CGI vice president Michael Pfitzner. “While apocalyptic predictions about its impact are unlikely to materialise any time soon, one thing is clear: AI will profoundly reshape our world, and at a rapid pace. It will be much like electricity, mobile phones, microcomputers and the internet once did, but on an even greater scale.”
Myriam Samake, multimedia journalist at KTAL NBC 6 News, shares Pfitzner’s positivity. “AI is fascinating,” she states. “I feel as though we were told AI would happen when I was a kid, and now it’s here,” she says, acknowledging that she – like almost everyone alive today – ‘got to live in a world without AI too’.
Samake, while excited about AI, suggests that it does have a dark side. Tyler Soso, Emmy Award-winning associate producer at the MLB Network, shares this nuanced perspective. “Today’s AI comes in several forms, so it’s hard to be fully for or against it, as I’ve seen both positives and negatives in its use,” he shares, mentioning misinformation as a potential drawback. “While AI can be an extremely useful tool in research and knowledge-gathering, I also believe that, in some ways, it is doing more harm than good.”
Dave MacKinnon, vice president of product management at Clear-Com, has a different view. “AI has incredible potential to improve how we work, especially in journalism,” he believes. “It’s not about replacing people, but about making journalists more powerful; it’s a ‘force multiplier’. When used thoughtfully, AI can support creativity and efficiency in ways previously unimaginable.”
From ink to algorithm
Should AI have a place in broadcast journalism and, if so, what place is that? All four interviewees believe that the answer, in some capacity, is yes, and they also largely agree on how it should be used. For instance, Pfitzner argues that AI can “automate tedious tasks, freeing up journalists for creative and investigative work.”
More specifically, he continues, “AI could help draft new forms of content like summaries, newsletters or audio scripts. It can also analyse large volumes of source material to identify promising stories; this includes datasets, PDFs, document translation, as well as monitoring social media, government websites and financial reports.”
MacKinnon adds that AI is best used “as an assistant, doing heavy lifting behind the scenes, handling repetitive tasks like transcription or analysing large amounts of data quickly, so reporters and producers can focus more on the big stories.”
Soso echoes much of what’s been said, naming ‘research, article ideation, fact checking and data analysis’ as possible applications of AI; while Samake sees it more as a jumping-off point, helping strapped-for-time journalists with generating or summarising ideas. AI can be particularly beneficial for smaller newsrooms that are understaffed or low on budget.
Despite AI’s potential positives, maintaining credibility is the highest priority for journalists, with Pfitzner suggesting that “using AI for creative content generation carries inherent risks around eroding trust.”
Other words, like transparency and plagiarism, also came up in conversation, with Soso arguing that “using ChatGPT to generate full articles to then pass off as your own is not something that should be encouraged.” He suggests regulation to avoid this. Meanwhile, Samake argues that AI’s use should always be flagged. She believes the technology “should never be used to pass as a genuine piece of media. For example, to generate a video or photo and claim that it is ‘real’. This leads to misinformation and distrust.” In a world where many already lack trust in the media, this risk is unaffordable.
MacKinnon neatly summarises the issue: “AI has a role to play, but it’s all about balance. It can be a great tool, but should never replace the human element that makes journalism what it is. It’s the journalists and producers who bring the heart, context and credibility to the final product.”
Friend or foe?
With generative AI on the rise, many creatives (that includes writers and reporters) are becoming fearful of job displacement and copyright infringement. "There's a reason why it was such a topic of contention during the 2023 Hollywood labour disputes, specifically the WGA strike," states Soso. There's a growing sentiment that the journalism industry – currently rife with mass layoffs – is also under siege.
“The practice of scraping content without the proper compensation poses a serious existential threat to creators, with the potential to undermine their livelihoods and the value of original work,” says Pfitzner. From a technological perspective, he adds: “Cybersecurity threats to newsrooms are on the rise; AI enables more deepfakes, advanced disinformation campaigns and targeted cyberattacks through sophisticated scams.”
There's the question of whether AI can adequately imitate and ultimately replace human labour; Soso doesn't think so, but he acknowledges the possibility. MacKinnon agrees that human judgement and oversight are irreplaceable and remain the most valuable tools for minimising the spread of misinformation. Samake believes that, while AI poses a threat to creatives, it might be more perilous to those who lack media literacy.
Risk vs reward
AI’s primary benefit, according to our interviewees, is its efficiency. Everyone knows that time is money, and AI has innumerable operational and economic benefits. “AI is an innovative tool that – if used correctly – can make a huge difference,” states Soso. As long as it gets results, AI is here to stay, though Soso suggests it be used as a workflow supplement rather than a replacement.
“AI boosts efficiency through the automation of repetitive tasks, enabling reporters to focus on more valuable work,” begins Pfitzner. “It also enhances coverage, allowing newsrooms to delve deeper into a wider range of topics, and empowers better-informed audiences through data-driven insights.”
He doesn’t stop there though, taking stock of its drawbacks: “AI poses challenges like the potential erosion of audience trust when used to generate creative content, risks of inaccuracies or biases from flawed training data and privacy concerns regarding the use of sensitive information in developing models.” Importantly, AI-generated output is only as good as the input.
For MacKinnon, AI’s leading risks include “losing the personal, human touch in stories or allowing bias to creep in accidentally if the data isn’t handled carefully. On the flip side,” he adds, “AI can help journalists work faster and dig deeper, which allows for more compelling and accurate reporting. It’s about making the most of these tools without losing sight of what really matters.”
Getting a piece of the AI pie
As with any burgeoning technology, there are early adopters and there are laggards. Economically speaking, it’s better to be on the earlier side. In short, “organisations that fail to adopt AI risk falling behind,” states Pfitzner. “AI offers significant benefits that include increased productivity, higher revenue and reduced costs, leaving non-adopters struggling to match these advantages.”
Whether we like it or not, ‘AI is here now’, Samake argues. “To not at least acknowledge it is like ignoring an elephant in a room.” She predicts that AI will become an inextricable element of journalistic work, and it seems like we’re already headed in that direction.
“The media industry is changing quickly, and staying out of the AI conversation could mean missing out on opportunities to improve how our stories are told or even losing relevance with audiences,” MacKinnon concludes. “If others are using AI to speed up workflows and make them more engaging, those who don’t will struggle to keep up.”
While many companies are racing towards AI, some are purposefully opting out. Pfitzner sees this as potentially being a good business strategy: “Avoiding AI could serve as a unique quality marker in an AI-saturated market – like the appeal of handcrafted goods in today’s automated world.”
Rules, regulation and responsibility
In the grand scheme of things, it’s still early days for AI. ChatGPT launched in 2022, but widespread adoption of AI-based technology is only now getting into full swing.
“The industry is still figuring it out,” admits MacKinnon. “There are conversations happening, but transparency is key, and audiences should know when AI is involved. It’s up to us to make sure it’s enhancing the content experience for audiences and not simply replacing human decision-making in journalism.”
Samake alludes to ‘some guidelines’ but doesn’t necessarily see them being followed. “On social media platforms like Facebook, I don’t always see statements that an image has been created artificially (when I know that it has). That is incredibly dangerous,” she thinks.
Since our conversation, Meta has announced that it would end its fact-checking program – an undeniable step backwards and an invitation for rampant misinformation. While Meta’s decision is a disturbing one, the bigger picture needn’t be so bleak. “Across industries, focus on ethical AI is gaining momentum, supported by initiatives such as the EU AI Act,” describes Pfitzner.
“By prioritising transparency, accountability and innovation, early adopters of responsible AI practices are setting a powerful example. Aligning with these principles fosters trust and ensures that AI continues to serve as a force for good.”
Integrity is irreplaceable
Most newsrooms are privately funded, so staying in business is the first, most basic goal. Once they’ve avoided bankruptcy, companies can then deliver on their mission – for newsrooms, this is usually to provide high-quality, trustworthy journalism. There are two key ways to do this: to hire talented humans and follow ethical workplace practices.
“Studios and companies can strike a balance between profitability and journalistic integrity by prioritising responsible AI usage,” suggests Pfitzner. This involves installing ‘robust cybersecurity measures’ and ensuring ‘transparency about AI practices’.
He also encourages companies to advocate for fair compensation and to participate in industry-wide discussions. As AI evolves, updated guidelines will have to follow.
Samake believes that ‘talented, passionate writers and journalists’ are the backbone of any newsroom. “Every reporter learns about ethics in school or on the job,” she claims.
MacKinnon agrees: “The key is keeping people at the centre of it all.”
He continues: “While studios can save time and money by using AI for repetitive tasks, the real investment should go into training up their teams and making sure human oversight is part of every step.
“At the end of the day, journalism is about trust – something that’s been heavily eroded in the past few years,” he admits, “and we don’t want to compromise it more with any missteps while leveraging AI.”
This feature was first published in the Spring 2025 issue of FEED.