Code wikis are documentation theater as a service
Code Wiki, a new AI tool from Google similar to DeepWiki, claims to generate a complete set of docs, including diagrams, from code repos. The landing page goes so far as to say “Stop documenting. No more stale docs. Ever”, a claim that made me stagger and reach for the nearest chair. That these tools are laughably bad isn’t reassuring; their emergence hints at a deeper, more unsettling cultural problem.
Ain’t nobody got time for docs – let’s regurgitate words and boxes!
The way these wikis work is as follows: you select or add a code repository, and the LLM generates a seemingly complete set of docs. Both apps let you ask questions about the docs through an agent. The process takes around ten minutes for large codebases. I tried it on one of my pet projects, and after a while it produced, almost magically, an entire wiki full of dev docs.
These tools can fulfill the role of project summarization. They’re helpful when trying to explore poorly documented codebases and get a sense of how the pieces fit together, but they stop short of anything that could be called “docs” (and perhaps that’s why their authors shyly chose to go with “wiki” instead). I would describe their output as repo byproducts, the result of an LLM digestion process (yes, blergh).
It’s impressive for the first ten seconds, but as you start reading, you begin to notice that for most complex projects, the wiki is at best dry reference, at worst a dumpster fire of hallucinations, Escherian information architecture, and subtly wrong facts and diagrams. This is not a lazy dismissal: read the comments on Hacker News to get an idea of how dismal these wikis can be despite the semblance of completeness.
I can already see the pitch meetings where someone pulls up one of these wikis and says, “Look, we’ve got docs now”. The bar for what counts as documentation gets lower every time we pretend these outputs are good enough.
You can choose to insult your intelligence by calling them docs
What makes me furious with the heat of a thousand Suns is that some devs could be lured into thinking that these ersatz manuals can check the docs box at all. That some could find these self-inflicted wikis a valid replacement for human-made docs shows the same kind of ignorance (or cynicism) about technical writing that folks display when praising art made by AI.
Docs are a product. They require sustained and intelligent effort over a long period of time, as well as a strong community ready to contribute. Docs are both mirrors and lore of your code and design, instruments that help you think about what you are architecting and developing. Docs serve purposes and needs. Great docs explain, guide, help, and illustrate. They’re deeply flawed and human. They must be.
These wikis, on the other hand, are cargo cult documentation as a service. Their structure is that of brutalist housing blocks randomly assembled from piles of bricks, their nav a barely plausible FrankenBar that always feels the same. Of course docs must have diagrams, they say, we’re not dummies. While it’s very positive that developers see the lack of docs as an issue, these wikis are not a solution.
Augmentation means opening doors faster, not crashing through them
Tom Johnson recently wrote that the tech industry is shifting to a narrative where “AI provides tools to speed up your work in a constantly accelerating environment”. Acceleration, in this context, means you can’t be bothered with docs work, at least according to the creators of Code Wiki. You end up sacrificing quality on the altar of delivering even more stuff for the sake of delivering it.
This is a case where some docs aren’t better than none, because they signify the triumph of intellectual laziness over trust and accountability. While I’m not concerned about their immediate impact (any enterprise customer would tear you to pieces if you shipped docs like those), they represent a trend we must fight, because docs written entirely by AI are wrong.
It would be wrong to stop at the surface, though. If products like DeepWiki gain traction, it’s also because developers need to interact more with knowledge and to break through the barrier of increasingly complex, increasingly abundant information. When speed and noise increase, the role of tech writers as curators, preserving what Rahel Anne Bailie calls Content Integrity, is more important than ever.
Let’s draw the line: docs generated entirely by AI should never ship to users. You can use AI judiciously to audit your docs for gaps, to spot inconsistencies, to generate first drafts that you’ll edit later. But the moment you let any LLM have the final word on what users will read, you’ve abandoned the basic contract of docs: that someone who understands the system has taken responsibility for explaining it truthfully.
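To make the “audit, don’t author” stance concrete: the deterministic part of a docs audit needs no LLM at all. Here is a minimal, hypothetical sketch that flags public Python functions and classes lacking docstrings, the kind of gap-spotting you can automate while a human stays responsible for the words users actually read.

```python
import ast

def undocumented_symbols(source: str) -> list[str]:
    """Return names of public functions and classes that lack a docstring."""
    tree = ast.parse(source)
    missing = []
    for node in ast.walk(tree):
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef, ast.ClassDef)):
            # Skip private helpers; only the public API needs user-facing docs.
            if not node.name.startswith("_") and ast.get_docstring(node) is None:
                missing.append(node.name)
    return missing

sample = '''
def documented():
    """Has a docstring."""

def bare():
    pass

class Widget:
    pass
'''

print(undocumented_symbols(sample))  # prints ['bare', 'Widget']
```

In this division of labor, an LLM may draft prose for the gaps the script surfaces, but a writer who understands the system reviews and signs off before anything ships.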