Reporting the news traditionally required faxes, printers, landline phones, and physical tape decks before a story could make its way to the public. In an increasingly digital world, that's no longer the case.
The days when people like Dean Coombs at the Saguache Crescent newspaper would set lines of type on a Linotype machine and press the inked metal onto paper to form sentences are long gone. Mechanics are the past, digital is the present, and artificial intelligence appears to be the future. But the advent of AI raises a question: At what point will the machines be able to do it all themselves?
Mike Humphrey is an assistant professor of journalism at Colorado State University, and he has been teaching about the consequences of AI in his classroom for the better part of the last eight years, even before the emergence of programs like ChatGPT.
"I started asking the question: 'Ten years from now, when I own a media company, why should I hire you instead of some kind of artificial intelligence? You know you are not going to be as efficient, you're going to be sick at times, you're not going to want to work 24/7 — why should I hire you?'" said Humphrey.
That question is playing out in news organizations across the country. In 2014, The Associated Press became one of the first major outlets to adopt AI when it started automating stories about corporate earnings.
The Washington Post later began using bots to publish articles from real-time election data. And in December, The New York Times announced it had hired its first editorial director of artificial intelligence initiatives.
A recent study from the London School of Economics and Political Science found that 75% of the 120 editors and journalists surveyed around the globe had used AI somewhere in the chain of newsgathering, production, and distribution. But with the acceptance of AI come new concerns.
"There's an ethical question, for sure," said Humphrey. "Like, just basic ethics: What's the right and wrong thing to do? But journalism already has a lot of trust issues, and if we want to exacerbate that problem as quickly as possible, we will be very unclear about the way we are using AI. So, to really build confidence with an audience, we need to be very, very transparent."