Wikipedia Article Approval Rates: What 1,000+ AfC Submissions Reveal

Wikipedia is more than just an online encyclopedia: it's the backbone of search visibility, a primary training source for LLMs, and a validation of a topic's notability. There is enormous demand for new articles, especially from brands and startups looking to influence how they appear in search. To address this interest, a diverse ecosystem of vendors—ranging from prominent PR firms to fly-by-night operatives—has emerged to handle the research, drafting, and submission of new entries.

Despite these vendors' promises, getting a new Wikipedia article approved remains extraordinarily difficult.

Lumino's exclusive audit of 1,009 Articles for creation (AfC) submissions on the English-language Wikipedia offers a rare, data-driven look at what actually happens at the platform’s front door. The findings challenge common assumptions about how Wikipedia works and reveal deeper structural dynamics shaping the encyclopedia's growth and guidelines.


Key Findings from 1,009 Wikipedia Submissions:

  • 68 percent overall rejection rate: More than two-thirds of the submissions in our dataset were declined, highlighting just how difficult it is to get a new Wikipedia article approved.

  • Most submissions lack notability: The most frequent reason (57 percent) drafts were rejected was that they did not meet Wikipedia's notability guidelines, meaning the submission content and sourcing did not convey that the topic was important enough to warrant a standalone encyclopedia entry.

  • Approval rates vary by article type: AfC submissions in the Arts & Culture category had the highest approval rate at 48 percent. Startups and other tech companies fared especially poorly, with only a 6 percent approval rate. Business executives likewise had a low approval rate: just 12 percent.

  • AI use is a big problem: In March 2026, Wikipedia’s editor community took the bold step of banning the use of AI to generate new articles or rewrite existing ones. Our dataset underscored the challenge this technology poses to the encyclopedia: 16 percent of AfC submissions were flagged for AI/LLM concerns.

  • Month-long average review time: Approved submissions had an average review duration of 30.5 days, while declined submissions took slightly longer, averaging 31.7 days.

  • Limited number of reviewing editors: Nearly 40 percent of submission reviews were handled by just 11 especially prolific editors.

The full report answers the following questions: 

  • How do editors determine if a subject meets Wikipedia's notability guidelines? Is that analysis subjective? 

  • Why do submissions about entrepreneurs, startups, and tech-based businesses face such scrutiny? 

  • Do resubmitted drafts have a higher or lower approval rate? 

  • Why can't Manon Bannerman from Katseye have an article?

The report is organized as follows:

  • We provide a high-level summary of how Wikipedia's Articles for creation (AfC) workflow operates, why it's so important, and what our analysis of 1,009 submissions uncovered. 

  • Wikipedia terminology can be difficult to parse, especially as there are numerous closely related terms with subtle differences in meaning. We offer definitions for key terms and provide examples of how they're deployed by editors.

  • We summarize why there is such a large demand for new Wikipedia articles and how vendors attempt to satisfy that market need. Then we explain our own role in that ecosystem and what we wanted to know about the AfC process.

  • We present our methods in two parts: (1) Wikipedia’s methodology for reviewing AfC submissions, and (2) our methodology for compiling a dataset of submissions and assessing relevant datapoints.

  • We summarize our findings across data analysis categories: Overall approval rate, Reason for decline, Approval rates by category, Reviewing editors, Time to review, Resubmission status, and LLM usage.

  • We provide a deep dive into two core areas that stood out in the data: the widespread use of LLM tools (a new problem facing Wikipedia) and the subjective nature of notability (a contentious subject since the encyclopedia's launch in 2001).

  • Articles for creation serves as a human check on AI-generated hallucinations and prose, and on low-quality, poorly sourced content more generally. But like all human decision-making, there is an interpretive element to the AfC assessments—especially regarding notability—that leads to unpredictable approval decisions.