Creating Future Truth
Kristian Hammond is paying close attention to how the journalism industry incorporates artificial intelligence. Dangers — and opportunities — abound, he says.
For decades, Sports Illustrated was the gold standard for sports magazines. It featured the best design, the best photography, and the best writing by the best sportswriters in the nation.
So, when in late 2023 the magazine was outed by the website Futurism for running articles generated by artificial intelligence (AI) and filed under the names of fake writers, it represented a stunning fall for the periodical – and was a sign of the increasingly complex relationship between journalism and AI.
Kristian Hammond has been examining that relationship carefully. As director of Northwestern Engineering's Master of Science in Artificial Intelligence (MSAI) program, he is closely monitoring how AI influences numerous professions.
With journalism, Hammond doesn't believe the biggest danger is the creation of substandard sports stories by AI-created “writers.” In his eyes, the biggest danger lies in the algorithms that deliver most people's news today.
“You end up getting recommendations for the news based upon what you read before and what other people like you read before,” Hammond said. “This creates this world of news-filter bubbles and eventually weaponization of information.”
In other words, if you read stories talking favorably about issues championed by one political party, algorithms will serve up more similarly themed articles and keep those with differing viewpoints away from you, all in an effort to keep your eyes on that site’s content for as long as possible.
Today’s algorithms can easily make it appear as if countering views don’t exist. That makes those who hold such views seem more fringe, radical, and “other,” Hammond said.
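The feedback loop Hammond describes can be seen in even the simplest recommendation logic. The sketch below is hypothetical, not any real news site's system: it ranks unseen articles purely by how often the reader has already engaged with each topic, so familiar viewpoints climb and unfamiliar ones sink out of sight.

```python
from collections import Counter

def recommend(user_history, all_articles, top_n=3):
    """Rank unseen articles by overlap with the topics a user already reads.

    A deliberately naive sketch of a filter-bubble feedback loop; all
    article data and field names here are made up for illustration.
    """
    topic_weights = Counter(article["topic"] for article in user_history)
    seen_titles = {article["title"] for article in user_history}
    unseen = [a for a in all_articles if a["title"] not in seen_titles]
    # Articles on already-read topics score high; topics the reader has
    # never touched score 0 and sink to the bottom of every ranking.
    ranked = sorted(unseen, key=lambda a: topic_weights[a["topic"]],
                    reverse=True)
    return ranked[:top_n]

history = [{"title": "Tax bill passes", "topic": "politics-A"},
           {"title": "Senator rallies base", "topic": "politics-A"}]
pool = [{"title": "Party A wins debate", "topic": "politics-A"},
        {"title": "Party B policy brief", "topic": "politics-B"}]

# With top_n=1, the reader only ever sees more of the same topic.
print(recommend(history, pool, top_n=1)[0]["title"])
```

Real recommenders add collaborative signals ("what other people like you read," as Hammond puts it), but the narrowing dynamic is the same: engagement with one viewpoint raises its weight, which drives more engagement.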
Which isn’t to suggest that AI-generated news isn’t a concern. It is, Hammond said, and journalists and news consumers alike need to be aware of the limitations large language models (LLMs) have in actually creating content.
To understand those limitations, Hammond points to what separates journalists from content creators. Both produce content, whether written, audio, or video, but only journalists are bound by ethical reporting standards to gather and verify facts about what is happening right now.
The distinction is important because LLMs are built by being fed already-existing information that was online before the news of the day occurred. Someday, LLMs might be able to consume today’s news and incorporate it into the answers they give to user prompts. But that time hasn’t arrived.
Yet.
Because of how LLMs are developed, they are prone to producing “journalism” that does not accurately reflect the current, right-now situation. AI doesn’t report, Hammond said. Humans do.
AI might help reporters verify facts, tell the story of the five-car pileup that led to three fatalities, or explain the city council's decision to rezone land from agriculture to residential. But even then, it needs human fact-checkers to ensure the story it tells is accurate and fair.
“You’ve got to make sure it’s true. That is how journalism has always worked,” Hammond said. “Now we're in a place where we’ve got to figure out how LLMs fit into that.”
Hammond and the MSAI program are helping future journalists learn how to do just that. He and his colleagues are working with Northwestern’s Medill School of Journalism, Media, Integrated Marketing Communications to create classes aimed at these issues. The goal is to work with a new generation of journalists on new ways to check facts, validate information, and research stories of interest and value to the general public.
These lessons are not aimed at journalists alone, however. Because the internet gives anyone a platform and AI can help anyone create content, everyone needs to understand the time-tested principles journalists follow to produce accurate, fair information, since that information will in turn be fed into existing LLMs and used to train new ones.
“Good prompt in, bad content out,” Hammond cautioned.
“We want to integrate our students with the processes that journalists go through in the same way we want to integrate them in the processes that lawyers go through and people on the medical side go through,” Hammond said. “Journalists are, by their nature, supposed to be telling us the truth. The notion of how you turn things like language models into truth tellers is a huge issue, and journalism is perfect for it.”