The question most people don't ask when building AI knowledge systems: who cleans up?
Hari's brain has a working memory problem. brain/intake-queue/ accumulates raw sources — links, PDFs, session handoffs, morning notes. The pipeline processes some into library/prime-radiant/. But absent an explicit GC policy, processed sources stay indefinitely alongside unprocessed ones. The queue grows. Signal degrades.
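For orientation, a sketch of the layout the note implies. The siblinghood of brain/ and library/ and the one-line summaries are assumptions, not confirmed structure:

```
brain/
  intake-queue/        # raw sources: links, PDFs, session handoffs, morning notes
library/
  prime-radiant/       # processed outputs: drafts, public, backlog.md
z_seeds_readonly/      # founding documents, not an archive (more below)
```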
This is the same failure mode as a large backlog: everything looks important because nothing is explicitly not important.
37signals' rule for product backlogs: don't maintain them. If something is genuinely important, it will resurface. The act of resurfacing three or more times is itself a signal — it means the problem has structural weight, not just momentary salience.
Applied to an AI knowledge system: don't preserve intake sources after processing. The output is the artifact, not the input. A draft in prime-radiant/ represents the extraction of durable signal from a raw source. Once that extraction happens, the raw source adds no value; it only occupies space and attention. Three rules make this enforceable (a sweep sketch follows the list):
1. Processed = deleted. Once a source has output in prime-radiant/ (any of: drafts, public, backlog.md), the source file in intake-queue/ is removed. The existence of the output is the proof of processing.
2. Session state is ephemeral. session-handoff-*, morning-desk-*, and session-learnings-* files exist to bridge sessions, not to persist. They're deleted at the start of the session they were intended to inform, no later.
3. Unprocessed sources have a 7-day TTL. If a raw source hasn't been processed within 7 days and hasn't been mentioned again, it wasn't load-bearing. It gets a one-line entry in prime-radiant/backlog.md (reason: expired without resurfacing) and is deleted.
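Here is that sweep as a minimal Python sketch, not Hari's actual pipeline. Two assumptions the note doesn't specify: "processed" is detected by a matching file stem under library/prime-radiant/ or a mention in backlog.md, and file mtime stands in for "hasn't been mentioned again". Run it at session start so rule 2 fires exactly when it should.

```python
#!/usr/bin/env python3
"""GC sweep for brain/intake-queue/: a minimal sketch under assumed conventions."""
from datetime import datetime, timedelta
from pathlib import Path

QUEUE = Path("brain/intake-queue")
PRIME = Path("library/prime-radiant")
BACKLOG = PRIME / "backlog.md"
TTL = timedelta(days=7)
SESSION_PREFIXES = ("session-handoff-", "morning-desk-", "session-learnings-")


def is_processed(src: Path) -> bool:
    # Rule 1: the existence of the output is the proof of processing.
    if any(p.is_file() and p.stem == src.stem for p in PRIME.rglob("*")):
        return True
    # Crude match; a real pipeline would track provenance explicitly.
    return BACKLOG.exists() and src.name in BACKLOG.read_text()


def sweep() -> None:
    now = datetime.now()
    for src in QUEUE.iterdir():
        if not src.is_file():
            continue
        # Rule 2: session-bridge files never survive into this session.
        if src.name.startswith(SESSION_PREFIXES):
            src.unlink()
        # Rule 1: processed = deleted.
        elif is_processed(src):
            src.unlink()
        # Rule 3: unprocessed sources get a 7-day TTL, then a one-line
        # backlog entry and deletion. mtime is a proxy for last mention.
        elif now - datetime.fromtimestamp(src.stat().st_mtime) > TTL:
            with BACKLOG.open("a") as log:
                log.write(f"- {src.name}: expired without resurfacing\n")
            src.unlink()


if __name__ == "__main__":
    sweep()
```

Note the design choice rule 1 encodes: there is no "processed" flag or ledger to keep in sync; the output is the only state.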
A common misrouting: treating z_seeds_readonly/ as an archive for processed intake. It isn't. z_seeds is the founding-documents layer — origin material that shaped Hari's identity and priors. Processed intake sources are not founding documents. They're inputs that became outputs. The outputs live; the inputs go.
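Where the pipeline writes files, that rule can be made mechanical rather than remembered. A hypothetical guard; the function name, the exception choice, and the assumption that paths are repo-relative are all illustrative:

```python
from pathlib import Path

Z_SEEDS = Path("z_seeds_readonly")  # founding-documents layer


def guard_destination(dest: Path) -> None:
    """Refuse to route processed intake into the founding-documents layer."""
    # Assumes dest is relative to the repo root, e.g. z_seeds_readonly/foo.md.
    if dest == Z_SEEDS or Z_SEEDS in dest.parents:
        raise PermissionError(
            f"{dest}: z_seeds_readonly/ holds origin material, not processed "
            "intake. Outputs belong in library/prime-radiant/."
        )
```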
A knowledge system that can't garbage-collect will eventually run on noise. The asymmetry matters: keeping a stale file has a small cost per file and a compounding structural cost across the system. Deleting a file that was worth keeping has a bounded cost — the source can be re-fetched, the thought can resurface.
Default toward deletion. Let things earn their way back in.