Can machine intelligence be “taught” to be unbiased? This is more than an academic question, as Google has created an AI tool called Genesis that’s designed to act as a “helpmate” for journalists in the newsroom. Google has already demonstrated Genesis to several major media companies, including the New York Times, the Washington Post, and the Wall Street Journal’s parent company News Corp.
Beyond bias, how reliable could Genesis be in a newsroom where getting the facts right is paramount?
How Genesis works is still proprietary information, but sources tell the New York Times that it can take in information, including details on current events, and then generate news content—presumably basic “who, what, when, where, and why” articles.
One of three people familiar with the product said that Google believed it could serve as a kind of personal assistant for journalists, automating some tasks to free up time for others, and that the company saw it as responsible technology that could help steer the publishing industry away from the pitfalls of generative AI.
Some executives who saw Google’s pitch described it as unsettling, asking not to be identified discussing a confidential matter. Two people said it seemed to take for granted the effort that went into producing accurate and artful news stories.
What is it about the news business that those connected to it treat the generation of news as some kind of sacred rite? We see it when there are large layoffs at media companies, and there is much wailing and gnashing of teeth over lost jobs in a dying industry. AI is “unsettling” to these executives because it’s proving how superfluous writers and editors will eventually be. It’s probably something akin to how blacksmiths and wheelwrights felt watching the first automobile drive past their shops.
The future is always scary, and sometimes for good reason. Artificial intelligence will, like all other technological advances, come with enormous costs as well as enormous benefits. Thus it has always been; thus it will always be.
Google spokesperson Jenn Crider said in a statement that “in partnership with news publishers, especially smaller publishers, we’re in the earliest stages of exploring ideas to potentially provide A.I.-enabled tools to help their journalists with their work.”
“Quite simply, these tools are not intended to, and cannot, replace the essential role journalists have in reporting, creating and fact-checking their articles,” she added.
News organizations around the world are grappling with whether to use artificial intelligence tools in their newsrooms. Many, including The Times, NPR and Insider, have notified employees that they intend to explore potential uses of AI to see how it might be responsibly applied to the high-stakes realm of news, where seconds count and accuracy is paramount.
But Google’s new tool is sure to spur anxiety, too, among journalists who have been writing their own articles for decades. Some news organizations, including The Associated Press, have long used AI to generate stories about matters such as corporate earnings reports, but such stories remain a small fraction of the wire service’s output compared with those written by journalists.
Now those college-educated journalists know exactly how the guy on the assembly line feels when he sees automation taking more and more of his friends off the line because they’re redundant components.
Journalism isn’t art; at least, it shouldn’t be. It’s a craft that, until J-schools started churning out mini-Woodwards and Bernsteins by the bushelful, was the province of high school grads who learned their trade by starting out as copy boys and gofers. News stories weren’t created; they were “constructed,” built the way one might build a house, brick by brick, stone by stone.
Now, reporters are considered high priests, and editors are popes. They’re judged for their creative writing skills as much as their ability to tell a story. There are still good journalists, but they’re far outnumbered by rock stars who are rewarded for their skills as dramatists and playwrights, not necessarily for getting the story right.
I’m willing to give AI in the newsroom a chance. How much worse can it be than biased human reporters?