OFF Radio Krakow, a radio station in Poland, has made a groundbreaking move and replaced its human presenters with AI.
And this is just the beginning.
In the not-too-distant future, many radio and television stations will likely begin to replace their presenters with AI due to advancements in natural language processing and video synthesis technologies. AI technology can also help translate scripts and interviews from one language to another. Imagine CNN, MSNBC, or FOX using AI instead of human staff.
According to a BBC report, “In Kuwait, an AI persona by the name of Fedha ran through the headlines for Kuwait News. Hermes presented the news in May 2023 for Greek state broadcaster ERT. South Korean broadcaster SBS handed over the duties of news presenting to Zae-In, an AI-generated deepfake, for five months this year. There are others in India and Taiwan, too – all created by AI.”
It may not be a bad idea.
For one, this could reduce mistakes, gaffes, and unpleasant moments. It could also cut costs.
According to The Wrap, Anderson Cooper is said to earn an estimated $20 million a year; Wolf Blitzer, $15 million; Jake Tapper, $8.5 million; and Chris Wallace, about $8 million.
Other networks no doubt pay their hosts similar salaries, so it is no wonder that companies are keenly interested in how AI could work for them and whether the move is worthwhile. So far, the numbers suggest it would pay off financially.
In terms of programming, AI hosts can deliver news 24 hours a day without getting tired. They can also present breaking news within seconds, don't need make-up, and don't go on vacation.
While these are, of course, great advantages for networks seeking to cut costs and operate continuously, they don’t solve every problem.
Networks will keep human broadcasters in roles that require deep contextual knowledge, personal experience, or emotional intelligence, such as investigative journalism and complex live reporting.
There are other reasons as well.
While in much of the world, journalists view the use of artificial intelligence as a looming threat to livelihoods, in Venezuela – where showing your face on a news report can conceivably land you in jail – many people view it as protection.
In an interesting development in this field, Microsoft and OpenAI are investing up to $10 million to help local news organizations implement artificial intelligence in their operations.
Each recipient outlet will receive funding to hire an AI expert for a two-year term, along with $5 million worth of combined cash and technology credits split between Microsoft Azure and OpenAI services. The program aims to help local news organizations develop new revenue streams and improve their operations using AI tools.
The program is part of the Lenfest Institute's Local Independent News Coalition, which includes eight of the largest independently owned metropolitan news organizations in the United States. These outlets will work together to develop AI solutions that uphold journalistic standards while strengthening their business models.
According to Open Society Foundations, AI and Large Language Models (LLMs), in particular, “are likely to bring about significant and lasting structural change to information ecosystems as we know them.”
Concerns about the use of AI in news production, and about the misinformation it can enable, are growing worldwide, according to a report published by the Reuters Institute for the Study of Journalism.
The report found that consumers are suspicious about the use of AI to create news content, particularly for sensitive subjects such as politics.
According to the survey, 52% of US respondents and 63% of UK respondents said they would be uncomfortable with news produced mostly with AI. The report surveyed 2,000 people in each country, noting that respondents were more comfortable with behind-the-scenes uses of AI to make journalists' work more efficient.
However, this particular poll concerns journalists using AI to write stories rather than newsrooms using AI characters to deliver the news on television.
Part of the problem is that AI avatars cannot interpret or express humor or exercise complex ethical judgment. Certainly in a round-table setting, AI will not be able to engage in banter or offer varying depths of analysis.
For this type of journalism, perhaps the most important kind, where the public relies on journalists to decipher and analyze important news, AI will remain insufficient, and the need for human presenters will continue.