Dr. T. Franklin Waddell

For several years, newsrooms have been adopting artificial intelligence-driven applications, which have proven effective for everything from fact-checking to data gathering and, in some cases, are now even being used to report on current events with little or no human collaboration.

Research conducted at the University of Florida’s College of Journalism and Communications, however, suggests that readers perceive news written by algorithms to be far less credible than news written by humans, even when no evidence is presented to support this belief.

That research, which was recently published in the journal Digital Journalism, was conducted by Frank Waddell, assistant professor in UF’s Department of Journalism. Waddell conducted two separate experimental studies to examine the impact automated authorship has on people’s perceptions of news credibility.

In the first study, Waddell recruited participants to read one of two then-current news articles that were identical save for the byline: one read “Kelly Richards, Reporter,” while the other read “Automated Insights, Robot Reporter.” Readers were then asked to respond to a series of questions evaluating whether they found the article to be accurate, authentic, believable and newsworthy.

That study revealed that the article attributed to the robot was perceived to be less credible and less newsworthy than the identical article attributed to a person.

“Entities that are perceived as less human-like are less likely to be perceived as capable of completing human tasks,” Waddell wrote in his Digital Journalism article.

Waddell’s second study measured readers’ feelings toward and prior exposure to AI, including robots in films and TV, and again presented readers with an article written either by a person or a machine. Readers were then asked whether they found the source credible and whether it met their prior expectations.

This portion of the study suggests that our biases against machine writing stem from “source anthropomorphism,” or the perception that people simply do a better job at tasks, such as journalism, than machines do.

“It appears that news writing is still largely perceived as ‘a human’s job,’” Waddell wrote. “Newsreaders appear to prefer journalists who are similar to the self, even when the level of identification is merely an affinity based on possessing human-like (rather than machine-like) appearance.”

Waddell’s research also found that readers’ exposure to robots in popular culture plays a big role in shaping their feelings about what machines can do: participants who recalled seeing robots in films were less likely to express trepidation about automation technology.

A similar July report released by FleishmanHillard, which gauged the public’s perceptions of AI, found that while people are excited about the future of artificial intelligence, they remain wary of potentially job-eliminating technology that they admit they still don’t know much about. Like Waddell’s research, that study also found that positive sentiment toward AI seems to correlate with greater experience with the technology.