distill — for Claude Code

Tags: onebrain, community, ide-skills, ai-agent, claude-code, gemini-cli, knowledge-base, local-first

v1.0.0

About this skill

When to use it: ideal for AI agents that need the /distill [topic] command. Description: Your personal AI OS — memory that learns and evolves with you, 24+ skills, and a full local stack for Claude Code, Gemini CLI, or any AI agent. It covers ai-agent, ai-os, and claude-code workflows.

Capabilities

Usage: /distill [topic]
Step 1: Identify the Topic
If a topic was provided after the command, use it directly.
Step 2: Gather Source Material
Use qmd if available for content searches; Grep/Glob as fallback.

kengio · Updated: 4/15/2026

Killer-Skills Review

Decision support comes first. Repository text comes second.

Reference-Only Page Review Score: 8/11

This page remains useful for operators, but Killer-Skills treats it as reference material instead of a primary organic landing page.

The review adds an original recommendation layer, concrete use-case guidance, and explicit limitations and cautions.

Review Score: 8/11
Quality Score: 49
Canonical Locale: en
Detected Body Locale: en


Why use this skill

Recommendation: distill gives agents the /distill [topic] command. It comes from a personal AI OS with memory that learns and evolves with you, 24+ skills, and a full local stack for Claude Code, Gemini CLI, or any AI agent.

Best suited for

When to use it: ideal for AI agents that need the /distill [topic] command.

Practical use cases for distill

Use case: running /distill [topic] on a completed research thread or recurring theme
Use case: identifying the topic to distill (Step 1)
Use case: using a topic supplied after the command directly

Safety and limitations

  • Limitation: if no sources are found, exit and do not proceed to Step 3.
  • Limitation: if the user picks option 1, call AskUserQuestion immediately (do not wait or proceed).

Why this page is reference-only

  • The current locale does not satisfy the locale-governance contract.
  • The underlying skill quality score is below the review floor.

Source Boundary

The section below is imported from the upstream repository and should be treated as secondary evidence. Use the Killer-Skills review above as the primary layer for fit, risk, and installation decisions.

After The Review

Decide the next action before you keep reading repository material

Killer-Skills should not stop at opening repository instructions. It should help you decide whether to install this skill, when to cross-check against trusted collections, and when to move into workflow rollout.

Labs Demo

Try this agent in a zero-setup browser sandbox powered by WebContainers. No installation required.

FAQ & Installation Steps

These questions and steps mirror the structured data on this page for better search understanding.

Frequently Asked Questions

What is distill?

distill compresses a research thread or recurring theme into a structured knowledge note. It is ideal for AI agents that need the /distill [topic] command, and it is part of a personal AI OS — memory that learns and evolves with you, 24+ skills, and a full local stack for Claude Code, Gemini CLI, or any AI agent. It covers ai-agent, ai-os, and claude-code workflows.

How do I install distill?

Run the command: npx killer-skills add kengio/onebrain. It works with Cursor, Windsurf, VS Code, Claude Code, and 19+ other IDEs.

What are the use cases for distill?

Key use cases include: running /distill [topic] on a completed research thread, identifying the topic to distill, and using a topic supplied after the command directly.

Which IDEs are compatible with distill?

This skill is compatible with Cursor, Windsurf, VS Code, Trae, Claude Code, OpenClaw, Aider, Codex, OpenCode, Goose, Cline, Roo Code, Kiro, Augment Code, Continue, GitHub Copilot, Sourcegraph Cody, and Amazon Q Developer. Use the Killer-Skills CLI for universal one-command installation.

Are there any limitations for distill?

Limitation: if no sources are found, exit rather than proceeding to Step 3. Limitation: if the user picks option 1, call AskUserQuestion immediately (do not wait or proceed).

How To Install

  1. Open your terminal

    Open the terminal or command line in your project directory.

  2. Run the install command

    Run: npx killer-skills add kengio/onebrain. The CLI will automatically detect your IDE or AI agent and configure the skill.

  3. Start using the skill

    The skill is now active. Your AI agent can use distill immediately in the current project.

Reference-Only Mode

This page remains useful for installation and reference, but Killer-Skills no longer treats it as a primary indexable landing page. Read the review above before relying on the upstream repository instructions.

Upstream Repository Material

The section below is imported from the upstream repository and should be treated as secondary evidence. Use the Killer-Skills review above as the primary layer for fit, risk, and installation decisions.

Upstream Source

distill

Your personal AI OS — memory that learns and evolves with you, 24+ skills, and a full local stack for Claude Code, Gemini CLI, or any AI agent. It covers ai-agent, ai-os, and claude-code workflows.

SKILL.md (read-only)

Supporting Evidence

Distill

Take a completed research thread, brainstorming topic, or recurring theme and compress it into a single, structured knowledge note. Unlike /wrapup (session-focused), /distill is topic-focused and spans multiple sessions.

Usage: /distill [topic]


Step 1: Identify the Topic

If a topic was provided after the command, use it directly. If not, ask:

What topic do you want to distill? (e.g. "OneBrain memory architecture", "Mac Mini purchase decision", "MCP server setup")


Step 2: Gather Source Material

Search across the vault for notes related to the topic. Use 2–3 specific keywords or phrases from the topic (prefer proper nouns and multi-word phrases over generic single words):

Use qmd if available for content searches, with Grep/Glob as a fallback; a sketch of the fallback follows the source list below.

  1. Session logs: Search [logs_folder]/**/*.md for topic keywords — extract matching ## Key Decisions, ## Action Items, ## Open Questions sections
  2. Inbox: Search [inbox_folder]/*.md for related content
  3. memory/ files: Search [agent_folder]/memory/ for related entries — match topic keywords against filename and frontmatter topics: field
  4. Project/knowledge notes: Search [projects_folder]/**/*.md, [knowledge_folder]/**/*.md, and [resources_folder]/**/*.md — filter by note title or first 100 words
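To make the fallback concrete, here is a minimal sketch under stated assumptions: TOPIC_KEYWORDS, LOGS_FOLDER, INBOX_FOLDER, and KNOWLEDGE_FOLDER are hypothetical stand-ins for the topic phrases and the configured [logs_folder], [inbox_folder], and [knowledge_folder] paths, and the qmd invocation is an assumption rather than documented syntax.

```bash
#!/usr/bin/env bash
# Hypothetical sketch of the search step; variable names are placeholders
# for the [logs_folder], [inbox_folder], and [knowledge_folder] settings.
TOPIC_KEYWORDS="OneBrain memory"   # 2-3 specific phrases from the topic

if command -v qmd >/dev/null 2>&1; then
  qmd search "$TOPIC_KEYWORDS"     # assumed qmd syntax; verify locally
else
  # Grep fallback: list markdown files mentioning the topic keywords.
  grep -rilF "$TOPIC_KEYWORDS" --include='*.md' \
    "$LOGS_FOLDER" "$INBOX_FOLDER" "$KNOWLEDGE_FOLDER" 2>/dev/null
fi
```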

Report to user (N = total matches across all sources; Q = project/knowledge/resource notes combined): Found {N} sources: {M} session logs, {P} inbox notes, {Q} knowledge notes

If N = 0: Stop and inform the user: 🔴 No notes found matching '{topic}'. Try a broader keyword or check the topic name.

Exit — do not proceed to Step 3.

If N > 20: Too many results — the keywords may be too broad. Use AskUserQuestion:

Found N sources for '[topic]' — that's a lot. Do you want to:

  1. Narrow the scope (I'll ask for more specific keywords or a date range)
  2. Continue with all N sources

If user picks option 1, call AskUserQuestion immediately (do not wait or proceed):

Please provide more specific keywords or a date range (e.g. "focus on MCP setup decisions from March 2026"):

Use the user's answer as refined search criteria and re-run the search from the top of Step 2. If the refined search still returns > 20 sources, inform the user and proceed with all N rather than asking again (one clarification cycle maximum). If user picks option 2 (continue with all N), proceed.
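For clarity, the guard's branching can be sketched as shell pseudologic. This is illustrative only; the real flow runs through agent tool calls such as AskUserQuestion, and TOPIC is a hypothetical variable.

```bash
# Illustrative guard sketch; one clarification cycle maximum.
asked=0
guard() {
  local n=$1
  if [ "$n" -eq 0 ]; then
    echo "No notes found matching '$TOPIC'."   # exit; never reach Step 3
    return 1
  fi
  if [ "$n" -gt 20 ] && [ "$asked" -eq 0 ]; then
    asked=1
    # AskUserQuestion: narrow the scope (refine and re-run once) or continue.
    # A refined result set is not guarded again, even if still > 20.
    :
  fi
  return 0   # proceed to Step 3
}
```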


Step 3: Synthesize

Extract and consolidate across all sources:

  • Core question — what was being explored or decided?
  • What we found — key findings, facts, conclusions
  • Key decisions made — explicit choices that were committed to
  • Lessons — generalizable insights worth keeping long-term
  • Open questions — still unresolved as of the most recent source
  • Entities involved — tools, projects, people mentioned

Present a brief synthesis preview to the user before writing.


Step 4: Choose Destination

Suggest a subfolder in [knowledge_folder]/:

  • Infer topic category (e.g. "OneBrain memory architecture" → [knowledge_folder]/ai-systems/)
  • Present to user using AskUserQuestion: "I'd file this under [knowledge_folder]/[suggested-path]/. OK, or would you like a different path?"
  • If user declines, ask for the preferred path or subfolder name before proceeding.
  • If user cancels entirely, offer one more option via AskUserQuestion: "Save a draft to [inbox_folder]/YYYY-MM-DD-[topic]-draft.md instead?" If yes, save the Step 4 synthesis there. If no, discard.
  • Use the confirmed path for file creation.

Step 5: Write the Digest Note

Before writing: Check if [knowledge_folder]/[subfolder]/[Topic].md already exists.

  • If the file does not exist: create it.

  • If the file already exists: use AskUserQuestion to ask:

    A distilled note for "[Topic]" already exists. How do you want to handle this?

    1. Overwrite — replace with a fresh synthesis
    2. Append — add a ## Update — YYYY-MM-DD section with new findings
    3. Cancel

    If Append is chosen: before writing new content, read the existing digest note and check for any lessons that appear hedged or uncertain in phrasing (e.g. "might", "possibly", "unclear if", or legacy [conf:low] markers). If any exist, surface them:

    This note has M low-confidence lessons. Want to re-evaluate any before appending? (list them) User may promote or leave them as-is. If none exist, skip this silently and proceed to append.

    If Overwrite is chosen: read the existing file first to extract its created: date — this marks when the topic was first distilled and must be preserved. If no created: field exists in the file, use the file's filesystem modification date as a best-effort fallback; if that is also unavailable, use today's date and note the uncertainty in sources_span. Update sources_span to span from the original start date to today's date.
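A minimal sketch of that date handling, assuming the frontmatter format from the template below and the same hypothetical folder variables as earlier:

```bash
# Hypothetical sketch: keep the original created: date when overwriting.
note="$KNOWLEDGE_FOLDER/$SUBFOLDER/$TOPIC.md"
created=$(grep -m1 '^created:' "$note" | awk '{print $2}')
if [ -z "$created" ]; then
  # Best-effort fallback: file modification date (GNU date, then BSD stat),
  # else today's date, noting the uncertainty in sources_span.
  created=$(date -r "$note" +%F 2>/dev/null \
    || stat -f %Sm -t %Y-%m-%d "$note" 2>/dev/null \
    || date +%F)
fi
echo "sources_span: $created to $(date +%F)"
```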

Create or update [knowledge_folder]/[subfolder]/[Topic].md:

```markdown
---
tags: [distilled, topic-tag]
created: YYYY-MM-DD
source: /distill
sources_span: YYYY-MM-DD to YYYY-MM-DD
---

# [Topic]

> **Distilled:** YYYY-MM-DD
> **Sources:** N session logs, M notes

## Core Question

[What was being explored or decided]

## What We Found

[Key findings and conclusions, bullet list]

## Key Decisions

[Explicit decisions made, with dates if known]

## Lessons

[Generalizable insights — use /learn to promote to memory/]
- [list generalizable insights]

## Open Questions

[Still unresolved]

## Related

[[link to related notes]]
```

Say:

```
──────────────────────────────────────────────────────────────
🧪 Distilled
──────────────────────────────────────────────────────────────
{knowledge_folder}/{subfolder}/{Title}.md
```

→ To promote a lesson to long-term memory: /learn [lesson text]


Step 6: Update qmd Index

```bash
bash ".claude/plugins/onebrain/startup/scripts/qmd-update.sh"
```

Known Gotchas

  • synthesized_from_checkpoints: true logs are recovery summaries. Checkpoint-synthesized session logs contain less detail than manually written ones — they summarize what was captured by the hook, not a full session review. Treat them as supporting context rather than authoritative sources when distilling decisions.

  • The > 20 sources guard does not re-apply after refinement. If the user refines the search and gets 18 results, proceed — do not re-apply the guard to the refined set if the user already approved or narrowed it.

  • Case-insensitive filename check for existing digest notes. Machine Learning.md and machine learning.md represent the same topic. Check case-insensitively when determining if a digest note already exists before deciding to create vs. append.
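A minimal sketch of that check, reusing the hypothetical folder variables from the earlier sketches; find's -iname performs the case-insensitive match:

```bash
# Hypothetical sketch: case-insensitive lookup for an existing digest note.
existing=$(find "$KNOWLEDGE_FOLDER/$SUBFOLDER" -maxdepth 1 \
  -iname "$TOPIC.md" -print -quit 2>/dev/null)
if [ -n "$existing" ]; then
  echo "Existing digest note: $existing"   # ask Overwrite / Append / Cancel
fi
```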

Related skills

Looking for an alternative to distill or another community skill for your workflow? Explore these related open-source skills.


openclaw-release-maintainer (openclaw)

Your own personal AI assistant. Any OS. Any Platform. The lobster way. 🦞

widget-generator (f)

Creating customizable widget plugins for the prompts.chat news feed system

flags (vercel)

The React framework

pr-review (pytorch)

Tensors and Dynamic neural networks in Python with strong GPU acceleration