Anyone can try to edit Grokipedia 0.2, but Grok is running the show

December 3, 2025

Elon Musk envisions Grokipedia — xAI’s AI-generated, anti-woke spin on Wikipedia — as a definitive monument to human knowledge, something complete and truthful enough to etch in stone and preserve in space. In reality, it’s a hot mess, and it’s only getting worse now that anyone can suggest edits.

Grokipedia was not always editable. When it first launched in October, its roughly 800,000 Grok-written articles were locked. I thought it was a mess then, too — racist, transphobic, awkwardly flattering to Musk, and in places straight-up cloned from Wikipedia — but at least it was predictable. That changed a few weeks ago, when Musk rolled out version 0.2 and opened the door for anyone to propose edits.

Proposing edits on Grokipedia is simple, so simple that the site apparently doesn't feel the need to explain how to do it. You highlight some text, click the "Suggest Edit" button, and fill in a form with a summary of the proposed change, with an option to suggest content and provide supporting sources. Reviewing those suggestions is Grok, xAI's problematic, Musk-worshipping AI chatbot. Grok, yes, the chatbot, is also the one making the actual changes to the site. Most edits on Wikipedia don't require approval, but an active community of human editors watches the "recent changes" page closely.

It's not clear what changes Grok is actually making, though; the system is confusing and far from transparent. Grokipedia tells me there have been "22,319" approved edits so far, but I have no way of seeing what those edits were, which pages they touched, or who suggested them. That contrasts with Wikipedia's well-documented editing logs, which can be sorted by page, user, or, in the case of anonymous users, IP address. My hunch is that many of Grokipedia's edits add internal links to other Grokipedia pages within articles, though I have no firm evidence beyond scrolling through a few of them.

The closest I got to seeing where edits were actually happening was on the homepage. There’s a small panel below the search bar displaying five or so recent updates on a rotation, though these only give the name of the article and say that an unspecified edit has been approved. Not exactly comprehensive. These are entirely at the mercy of whatever users feel like suggesting, leading to a confusing mix of stories. Elon Musk and religious pages were the only things that seemed to come up frequently when I looked, interspersed with things like the TV shows Friends and The Traitors UK and requests to note the potential medical benefits of camel urine.

On Wikipedia, there is a clear timeline of edits outlining what happened, who did what, and why, with viewable discussion logs for contentious issues. There are also copious guidelines on editing style, sourcing requirements, and processes, and you can directly compare versions of a page to see exactly what changed and where. Grokipedia had no such guidelines, and it showed: many requests were a jumbled mess. It did have an editing log, but it was a nightmare that only hinted at transparency. The log, which shows just a timestamp, the suggestion, and Grok's decision along with its often-convoluted AI-generated reasoning, must be scrolled through manually in a tiny pop-up at the side of the page, with no way to skip ahead or sort by time or type of edit. It's frustrating even with only a few edits, and it doesn't show where changes were actually implemented; with more edits, it would be completely unusable.

Unsurprisingly, Grok doesn't seem to be the most consistent editor. Its logs make for confounding reading at times and betray the lack of clear guidelines for wannabe editors. The editing log for Musk's biographical page, for example, shows many suggestions about his daughter, Vivian, who is transgender. Some editors suggested using her name and pronouns in line with her gender identity; others pushed for those assigned at birth. While it's almost impossible to follow precisely what happened, Grok's decision to edit incrementally left a confusing mix of both throughout the page.

As a chatbot, Grok is amenable to persuasion. In one proposed edit to Musk's biographical page, a user wrote that "the veracity of this statement should be verified," referring to a quote linking the fall of Rome to low birth rates. In a reply far wordier than it needed to be, Grok rejected the suggestion as unnecessary. For a similar request with different phrasing, Grok reached the opposite conclusion, accepting the suggestion and adding the kind of information it had previously called unnecessary. It isn't too taxing to imagine how one might game requests to ensure edits are accepted.

While this is all technically possible on Wikipedia, the site has a small army of volunteer administrators — selected after a review process or election — to keep things in check. They enforce standards by blocking accounts or IP addresses from editing and locking down pages in cases of page vandalism or edit wars. It’s not clear Grokipedia has anything in place to do the same, leaving it completely at the mercy of random people and a chatbot that once called itself MechaHitler. The issue showed itself on several pages related to World War II and Hitler, for example. I found repeated (rejected) requests to note the dictator was also a painter and that far fewer people had died in the Holocaust than actually did. The corresponding pages on Wikipedia were “protected,” meaning they could only be edited by certain accounts. There were also detailed logs explaining the decision to protect them. If the editing system — or site in general — were easier to navigate, I’m sure I’d find more examples.

Pages like these are obvious targets for abuse, and it's no surprise they're among the first hit by malicious editors. They won't be the last, and with Grokipedia's chaotic editing system and Grok's limited guardrails, it may soon be hard to tell what's vandalism and what isn't. At this rate, Grokipedia doesn't feel poised for the stars; it feels poised to collapse into a swamp of barely readable disinformation.

