You read the news. But if Wikipedia co-founder Jimmy Wales’ hunch is right, you’ll want to edit it, too.
Wales is raising money to bring a new model of ad-free news creation to the web: one that would mix professional journalists with volunteer editors. Wales, like so many other idealists who believe in a better public discourse, wants to fix the fake news problem he sees as driven by a clickbait economy where accuracy comes second to intrigue. Unsurprisingly given Wikipedia's success, he sees crowdsourcing as the remedy. Don't readers, after all, have just as much stake in the information they read being true? Wikitribune, as he calls the nonprofit project, will carry no ads when it launches in December. By eliminating the profit motive and opening up the editing process, he hopes Wikitribune will be a place where truth flourishes and inaccuracies are self-corrected—just like Wikipedia is at its best.
“The way I envision this is journalists working side by side with community members,” Wales says. “This isn’t just about fake news [but] fake media. There are also things that are run that are very, very poorly sourced, and communities are good at spotting that if they pause to think about it.”
But in removing professional editors from the process, Wales sets up an interesting dilemma: Can the crowd ensure not just accuracy but also fairness and all the other ethical values traditionally preserved by journalistic independence? In theory at least, the whole point of a free and independent press is to have disinterested observers standing apart from the crowd.
Wales and company are still finalizing Wikitribune's workflow. He says a content management system will let logged-in members participate not just in story editing after publication but in story conception and planning. Editors will be able to advise on sourcing, brainstorm ideas, and fact-check in real time. Once a story goes live, anyone can log in and suggest changes. Unlike on Wikipedia, those changes won't appear immediately in the story; either a staff member or a designated community moderator will have to approve them. As on Wikipedia, though, you'll be able to see who made the changes. Wales' goal is to get enough donations to hire 10 full-time reporters, who will report to him as editor in chief.
In traditional journalism (like this article you’re reading) an editor assigns a story, sets a deadline, and checks in with the reporter throughout the reporting process. After a story comes in, the editor reads it, points out holes and problems, and revises for structure and clarity, sourcing and fairness. At Wikitribune, all those jobs are yours.
Why not just go with a fully crowdsourced model and have the crowd write stories too? Wikipedia has already tried that: It’s called Wikinews. But by Wales’ own estimation, it’s a failure because it relies entirely on volunteers who don’t necessarily know how to report news. That’s why Wikitribune will have professional writers doing the reporting and let the crowd act as support.
“We’re going to let communities do a lot of the kinds of things that communities do really well,” Wales says of the hybrid model, by which he mostly means fact-checking. Of course, communities are sometimes terrible at fact-checking, too. Think of all the times online mobs have spread misinformation on Twitter or Facebook. Or what about when self-designated internet detectives have tried to track down criminals, only to finger the wrong person? These perils exist, but as the creator of the single most successful crowdsourced information hub in history, Wales believes that with the right design he can encourage good behavior from a community.
“Yes, the system will be gamed. But if anyone knows how to deal with it, it’s Jimmy Wales,” says Jeff Jarvis, a professor of journalism at the City University of New York and well-known news futurist who will act as an advisor on the project. Jarvis also oversees the Facebook-backed News Integrity Initiative, which plans to invest money in Wikitribune, though as of press time Jarvis hadn’t yet decided how much. Jarvis is excited about an idea he sees as innovative—the kind of original thinking that the industry needs to embrace to solve problems from fake news to dwindling ad budgets.
Whether or not the Wikitribune model can actually succeed at addressing these issues, the project will at least serve as a fascinating experiment. For one, the pros often do miss what’s really news. The Standing Rock protests stand as a stark example. Though I was aware that people were protesting the Dakota Access Pipeline, I had no plans to report on it because I didn’t see a good WIRED angle. Then protesters took to Facebook Live to broadcast their confrontations with police, and viewers watched the live streams by the thousands. I wrote a report on the clashes, and it remains the most-read story of my career.
Wales hopes that’s how Wikitribune will work much of the time. Though the crowd won’t exactly assign stories to the professional reporters he hires, he hopes that it will play a large role in helping determine the editorial agenda of the site. That has the potential to align reader interest with news content in a satisfying way for everyone.
Another upside is that, as with Wikipedia, Wikitribune will harness the expertise of the crowd to effectively edit and update news. If a Wikitribune writer is reporting about neuroscience, neuroscientists themselves can log in and contribute to the story. Wales sees this volunteer expertise as key to fighting misinformation.
But those outside contributions also pose the greatest risk for conflicts of interest. Journalists don’t let sources write or edit their stories. You don’t show a source a story before you publish it, and you don’t let a source dictate changes to a story afterward, beyond correcting genuine factual errors. Otherwise, you’re just doing PR. Wikitribune will be walking a very fine line between giving readers more power and giving sources and subjects direct access to shape stories to suit their interests.
“Who are these people who are making the changes?” asks Eugene Kiely, director of FactCheck.org, a site where professional journalists do the job Wales will be asking the public to do instead. “Do they have a political agenda? Do they have a financial stake? That would concern me, making sure there is some level of transparency.”
Wales agrees that transparency is paramount. “You’ve got to be very careful and vigilant about questions of bias,” he says. In the end, he believes stamping out bias comes down to community design, in particular a moderator system that discourages people from creating their own little fiefdoms. That challenge gets at the key questions undergirding not just Wales’ experiment but the whole journalistic enterprise: Who should have the power to create news, and who will use that power most responsibly? Implicit in the question is the admonishment that journalists aren’t necessarily the best protectors of the truth. But is humanity at large? Wikitribune will try to find out.