Wikipedia is better prepared for Election Day than Facebook or Twitter
If you're looking for up-to-the-minute results on election night, Wikipedia might be one of the first sites to pop up in your Google search. But in this case, the crowd-sourced encyclopedia of human knowledge likely won't have the immediate answers you seek. And that's by design.
In yet another election cycle defined by copious amounts of misinformation from a variety of sources, Wikipedia aims, and is set up, to be a carefully curated resource of impartial facts. There's no rush to be the first to declare a winner (quite the opposite, in fact). It's also difficult for trolls to vandalize associated pages, let alone keep those edits up for long or spread them widely.
For the 2020 United States presidential election page, as well as the pages for presidential candidates Donald Trump and Joe Biden and vice presidential candidate Kamala Harris, only editors whose accounts are at least 30 days old and who have made at least 500 edits can change the articles. This is what Wikipedians, the editors who run the site, call "extended confirmed protection."
The election page lock was put in place on October 21 by Molly White, who goes by the handle GorillaWarfare on the site. She's been a Wikipedia editor for almost 15 years and also serves as an administrator, which gives her some additional abilities, like the power to lock pages. But White is not anticipating any major issues on Wikipedia with regard to the upcoming election.
"For the most part, things will be business as usual on Wikipedia," White told Recode. "Wikipedia editors and administrators have plenty of tools at our disposal to ensure that our readers are only seeing accurate information, even as things are changing quickly behind the scenes."
This probably won't be the case elsewhere online. Like Wikipedia, social media companies run on user-generated content, and they're once again scrambling to come up with ways to stop the spread of misinformation and disinformation on their platforms. After being blamed for influencing the outcome of the 2016 election, Facebook is particularly concerned with how it will handle Election Day this year.
But Wikipedia, which will be 20 years old on January 15, has been around longer than Facebook, Twitter, and YouTube. This will be the fifth presidential election in Wikipedia's lifetime, and the site's all-volunteer army of thousands of editors has used those years of experience to develop and refine methods of combating lies and inaccuracies during prominent breaking news events, while also identifying and deleting anything incorrect or poorly sourced that happens to make it onto their pages.
Wikipedia editors are currently discussing how to handle Election Day and its results in public forums on the site. They're debating how many sources to use for election-related updates, which ones to rely on when a presumptive winner is declared, and how long after polls close to start adding the results to the page.
"Wikipedia is intended to be an encyclopedia, not a news organization, and so we are much more concerned with being accurate than we are with being quick," White said.
Indeed, Wikipedia's stated mission is to be a repository for all human knowledge. The site has 55 million articles across its 300 language versions; the most popular version, English, has 6.2 million articles. Wikipedia is also one of the most-read websites in the world, with 1.5 billion unique visitors per month.
So while huge social media platforms tend to expose their users to content that generally fits their existing worldview and political sensibilities, Wikipedia has quietly emerged as a website for people who are actively seeking accurate information. Behind that effort is a community that strives to make the information it provides as neutral and as accurately sourced as possible.
Wikipedia is ruled by consensus, its articles are fluid, and discussions over how and why they should be changed are ongoing. Its handling of information about the presidential election is no different.
Most pages associated with the election and candidates have some kind of edit protection on them, though the level of protection varies. For example, while Harris's page currently has extended confirmed protection, the page for her opponent, Mike Pence, is only semi-protected. That means edits can only be made by registered users whose accounts are at least four days old and have made at least 10 edits, though, again, this might change as Election Day nears.
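To make those two thresholds concrete, here is a minimal sketch of the eligibility check in Python. The numbers come straight from the protection levels described above, but the function, the rules dictionary, and the sample account are hypothetical illustrations; MediaWiki itself enforces these limits through user groups ("autoconfirmed" and "extended confirmed"), not through code like this.

```python
from datetime import datetime, timezone

# Thresholds as described above; illustrative only. MediaWiki enforces these
# via user groups, not an ad hoc check like this one.
PROTECTION_RULES = {
    "semi": {"min_age_days": 4, "min_edits": 10},
    "extended": {"min_age_days": 30, "min_edits": 500},
}

def can_edit(protection_level, registered_on, edit_count):
    """Return True if an account meets the thresholds for a given protection level."""
    rule = PROTECTION_RULES.get(protection_level)
    if rule is None:
        return True  # unprotected page: anyone may edit
    age_days = (datetime.now(timezone.utc) - registered_on).days
    return age_days >= rule["min_age_days"] and edit_count >= rule["min_edits"]

# A hypothetical account with 120 edits, registered in September 2020:
account_created = datetime(2020, 9, 18, tzinfo=timezone.utc)
print(can_edit("semi", account_created, 120))      # True: old enough, and 120 >= 10 edits
print(can_edit("extended", account_created, 120))  # False: 120 < 500 edits
```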
Similarly, many pages associated with United States politics are subject to additional rules, such as limits on reversing a previous edit or requirements that any challenged edit gain consensus before it is applied. To reach consensus, editors typically argue their respective viewpoints on an article's accompanying talk page, citing various Wikipedia rules and procedures to back up their case until a majority of editors agree on what to do next. Administrators can block or ban editors who don't follow those rules.
When it comes to the election results, editors are still hashing out whether the Associated Press's projections are a good enough single source or if at least three news sources should be used. They're also considering just locking certain pages from edits for everyone except administrators for a set period of time.
With standards, rules, and a community of editors to uphold them, "moving slowly has been a Wikipedia superpower," Noam Cohen recently wrote in Wired. That, Cohen added, makes the site a less attractive target to those bent on campaigns of misinformation with immediate payoffs. Vandalism is hard to add, usually doesn't stay up for long, and therefore doesn't spread widely.
While Facebook and Google have spent billions of dollars on content moderators and other measures to combat misinformation and abuse on their platforms, Wikipedia's editors do this work for free. Wikipedia is hosted by the nonprofit Wikimedia Foundation, which covers its associated costs, including servers, software, and legal fees. The Foundation relies on donations and gifts and gets a lot of them: The organization received $113 million last year alone.
"The Foundation's role is to support those folks in every way that they need us to," Ryan Merkley, the Wikimedia Foundation's chief of staff, told Recode. "That means everything from keeping the servers up and running, to running our security operation, to communications, fundraising. But also working with trust and safety, and then supporting [editors] with the tools that they need in order to edit."
Some of those tools include bots that can quickly detect article vandalism and either get rid of it or flag it to an editor. Editors can also add articles to their watchlists to be immediately alerted of any changes (nearly 550 editors have put the 2020 US presidential election page on their watchlists). And administrators can lock pages that have become, or might become, targets for vandalism.
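For a rough sense of what that kind of monitoring automates, here is a small sketch that polls the public MediaWiki API for the most recent revisions of the election article and flags any edit that removes a large chunk of text. This is only an illustrative heuristic with an arbitrary threshold, not how Wikipedia's real anti-vandalism bots work; those (ClueBot NG, for example) score the actual content of each edit with machine-learning models, and automated editing on Wikipedia is governed by its bot policy.

```python
import requests

API = "https://en.wikipedia.org/w/api.php"
HEADERS = {"User-Agent": "wiki-watch-sketch/0.1 (illustrative example)"}

def recent_revisions(title, limit=20):
    """Fetch metadata for the most recent revisions of a page (newest first)."""
    params = {
        "action": "query",
        "prop": "revisions",
        "titles": title,
        "rvprop": "ids|timestamp|user|comment|size",
        "rvlimit": limit,
        "format": "json",
    }
    data = requests.get(API, params=params, headers=HEADERS, timeout=10).json()
    page = next(iter(data["query"]["pages"].values()))
    return page.get("revisions", [])

def flag_large_removals(revisions, threshold_bytes=2000):
    """Crude heuristic: report edits that shrank the article by more than the threshold."""
    for newer, older in zip(revisions, revisions[1:]):
        removed = older["size"] - newer["size"]
        if removed > threshold_bytes:
            print(f"{newer['timestamp']} {newer.get('user', '?')}: "
                  f"removed {removed} bytes ({newer.get('comment', '')!r})")

flag_large_removals(recent_revisions("2020 United States presidential election"))
```

A human editor watching the page would then review any flagged diff before deciding whether it was vandalism or a legitimate trim.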
The Foundation has also done some of its own work to prepare for the election.
"We put together an internal task force, with staff representatives from every part of the foundation who relate to disinformation," Merkley said. "So that includes the security team, trust and safety, legal policy, communications, our partnerships group that works with the other platforms that engage with Wikimedia content."
The guiding principle behind Wikipedia is that anyone can contribute anything to it. This being the internet, not everyone operates in good faith or knows what they're talking about, so the site has a longstanding reputation for inaccuracy. That's no longer wholly deserved, but Wikipedia itself will tell you that it's not a reliable source for this very reason.
The site has also been criticized for systemic bias, with a lack of representation from certain demographics (a lot of the people who contribute are white, English-speaking men) that can create a hostile environment for minority editors. The lack of diversity also creates the potential for bias to make it into the articles themselves. The Wikimedia Foundation and Wikipedians have made efforts to improve this, but they still have work to do.
Other things get overlooked on a site as big as Wikipedia, too. For instance, you might stumble across vandalized articles, usually lurking in Wikipedia's lower-trafficked corners, that have managed to escape the notice of editors. You may even find a version of Wikipedia that contains thousands of articles written by someone who doesn't really know the language they're supposed to be written in.
While anyone can become a Wikipedia editor, only a tiny fraction of Wikipedia's readers actually will. And it's deceptively difficult. The initial process of making an edit is as simple as signing in and changing some text, but Wikipedia's editorial rules and processes, and the various code words and language around them, can be a barrier to doing it correctly, which is necessary for the edit to be accepted.
But the people who do get it, like White, may spend a considerable amount of their time doing unpaid work on the site. They might also become targets of harassment as a result. White, who spends two or three hours a day working on Wikipedia, said she's been doxxed and threatened with violence and lawsuits, and that people have even tried to get her fired from her day job because of her work on the site.
"It is at best frustrating and at worst extremely frightening, but I both care deeply about the importance of Wikipedia and I am also a very stubborn person who does not like to feel like I am giving in to threats," White said, attributing some of that harassment to her position as an administrator, her gender, and the controversial articles and topics she often works on (she created the Boogaloo movement page, for example).
And Wikipedia is important. It's one of the top results for most internet searches, and so, for better or worse, Wikipedia is the site people are most likely to visit when they want more information about something. That means the stakes are high when big topics are involved.
Notably, its coverage of Covid-19 has drawn praise. This involved the creation of a WikiProject dedicated to the virus, with over 200 participating editors (anyone can join!) who may focus on pandemic case data, the virus's impact on specific locations, or the industries affected. One professor who studies misinformation told the Washington Post that Wikipedia was "a ray of hope in a sea of pollution" and handled the virus exceptionally well.
"There's a lot of really great work done through these WikiProjects, especially during times of crisis where a lot of hard-hitting, late-breaking stuff is coming out," Zachary J. McDowell, an assistant professor in the Department of Communication at the University of Illinois at Chicago, told Recode.
So if Wikipedia, with its high visibility and wide-open door for anyone's contributions, can still provide readers with well-sourced, neutral articles, why can't the social media platforms that play such a big role in the spread of misinformation do the same? Clearly, some of them see the merits of Wikipedia's work; Facebook and Google use Wikipedia articles to provide additional knowledge in user searches.
Social media platforms are designed to keep users around for as long as possible, both to show them as many ads as possible and to collect their data, which is then used to show them even more ads. They're incentivized to keep your attention, not to ensure that what you're reading or seeing is accurate. That business model is unlikely to change anytime soon. Meanwhile, Wikipedia's model is quite different.
"[Wikipedia has] no algorithms designed to serve content in certain ways to some people," Merkley said. "None of that structure exists which can be later gamed, in order to advance this post about a person or to target this message to that person."
Wikipedia is also very transparent, Merkley said. An article's associated history and talk pages will tell you, in great and granular detail, all the edits that have been made, who made them, and any associated discussions between editors about them.
This transparency helps create trust, but good luck getting, say, Facebook to implement it. Facebook is notoriously secretive about its algorithms, which determine what you see on the site, from ads to posts from your friends to recommendations for groups you should join or people you should befriend. These algorithms create filter bubbles of information that tends to line up with your political viewpoints, offering little exposure to anything that might conflict with them. You get what Facebook thinks you want to hear, or watch what YouTube thinks you want to watch, and that's not always what's true.
"It is essentially a game where the entire system is already rigged for disinformation, fake news," McDowell said. "It's monetarily incentivized to get people riled up and to click. It will always be a game where those who are trying to control the information flow will be the ones who are one step behind."
McDowell's research includes Wikipedia's value as a teaching tool for information literacy. He stresses that Wikipedia itself shouldn't be seen as a source but rather as a collection of clearly cited information that users can follow up on if they want to learn more or verify what they've read.
"Having a critical eye toward information is absolutely imperative right now," McDowell said. "And a lot of people don't."
For their part, social media platforms have, in recent years, tried to hold back the flow of misinformation in some cases, including during the election. Facebook has made rules around political ads, voter suppression, and even premature declarations of victory. But social media still receives plenty of criticism from both sides of the aisle, and it will almost certainly be blamed for influencing the outcome of the election in some way, regardless of the winner.
Wikipedia, on the other hand, will just tell you who reliable sources say the winner is as soon as its editors reach a consensus on what those sources are.