distill — onebrain, community, ai-agent, ide-skills, claude-code, gemini-cli, knowledge-base, local-first, markdown

v1.0.0

About This Skill

Ideal for Knowledge Management Agents requiring advanced note-taking and research organization capabilities. Your personal AI OS — memory that learns and evolves with you, 24+ skills, and a full local stack for Claude Code, Gemini CLI, or any AI agent. Built on Obsidian + plain Markdown.

# Core Topics

kengio
Updated: 4/15/2026

Killer-Skills Review

Decision support comes first. Repository text comes second.

Reference-Only Page Review Score: 7/11

This page remains useful for operators, but Killer-Skills treats it as reference material instead of a primary organic landing page.

  • Original recommendation layer
  • Concrete use-case guidance
  • Explicit limitations and caution
Review Score: 7/11
Quality Score: 39
Canonical Locale: en
Detected Body Locale: en


Core Value

Empowers agents to condense complex research threads into structured notes using qmd, Grep, and Glob searches, facilitating efficient information retrieval with Claude Code and similar AI tools.

Suitable Agent Types

Ideal for Knowledge Management Agents requiring advanced note-taking and research organization capabilities.

Key Capabilities · distill

Consolidating knowledge from multiple sessions
Generating structured notes for topics and projects
Automating research organization with keyword searches

Usage Limitations & Requirements

  • Requires access to vault and note folders
  • Limited to 20 sources without user refinement
  • Dependent on proper noun and multi-word phrase searches for accuracy

Why this page is reference-only

  • Current locale does not satisfy the locale-governance contract.
  • The underlying skill quality score is below the review floor.

Source Boundary

The section below is supporting source material from the upstream repository. Use the Killer-Skills review above as the primary decision layer.


FAQ & Installation Steps

The questions and steps below match the page's structured data, which helps search engines understand the page content.

FAQ

What is distill?

Ideal for Knowledge Management Agents requiring advanced note-taking and research organization capabilities. Your personal AI OS — memory that learns and evolves with you, 24+ skills, and a full local stack for Claude Code, Gemini CLI, or any AI agent. Built on Obsidian + plain Markdown.

How do I install distill?

Run: npx killer-skills add kengio/onebrain/distill. It supports 19+ IDEs/agents, including Cursor, Windsurf, VS Code, and Claude Code.

What scenarios is distill suited for?

Typical scenarios include: consolidating knowledge from multiple sessions, generating structured notes for topics and projects, and automating research organization with keyword searches.

Which IDEs or agents does distill support?

The skill is compatible with Cursor, Windsurf, VS Code, Trae, Claude Code, OpenClaw, Aider, Codex, OpenCode, Goose, Cline, Roo Code, Kiro, Augment Code, Continue, GitHub Copilot, Sourcegraph Cody, and Amazon Q Developer. It can be installed universally with a single Killer-Skills CLI command.

What are distill's limitations?

It requires access to vault and note folders; it is limited to 20 sources without user refinement; and its accuracy depends on proper-noun and multi-word phrase searches.

Installation Steps

  1. Open a terminal

    Open a terminal or command line in your project directory.

  2. Run the install command

    Run: npx killer-skills add kengio/onebrain/distill. The CLI automatically detects your IDE or AI agent and completes the configuration.

  3. Start using the skill

    distill is now enabled and can be invoked immediately in the current project.

Reference-Page Mode

This page can still serve as an installation and lookup reference, but Killer-Skills no longer treats it as a primary indexable landing page. Read the review conclusions above first, then decide whether to consult the upstream repository documentation.

Imported Repository Instructions


Supporting Evidence

distill

Install distill, an AI Agent Skill for AI agent workflows and automation. It supports Claude Code, Cursor, and Windsurf with one-command installation.

SKILL.md

Distill

Take a completed research thread, brainstorming topic, or recurring theme and compress it into a single, structured knowledge note. Unlike /wrapup (session-focused), /distill is topic-focused and spans multiple sessions.

Usage: /distill [topic]


Step 1: Identify the Topic

If a topic was provided after the command, use it directly. If not, ask:

What topic do you want to distill? (e.g. "OneBrain memory architecture", "Mac Mini purchase decision", "MCP server setup")


Step 2: Gather Source Material

Search across the vault for notes related to the topic. Use 2–3 specific keywords or phrases from the topic (prefer proper nouns and multi-word phrases over generic single words):

Use qmd if available for content searches; Grep/Glob as fallback.

  1. Session logs: Search [logs_folder]/**/*.md for topic keywords — extract matching ## Key Decisions, ## Action Items, ## Open Questions sections
  2. Inbox: Search [inbox_folder]/*.md for related content
  3. memory/ files: Search [agent_folder]/memory/ for related entries — match topic keywords against filename and frontmatter topics: field
  4. Project/knowledge notes: Search [projects_folder]/**/*.md, [knowledge_folder]/**/*.md, and [resources_folder]/**/*.md — filter by note title or first 100 words

Report to user (N = total matches across all sources; Q = project/knowledge/resource notes combined):

Found N sources: M session logs, P inbox notes, Q knowledge notes
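
As a rough sketch, the Grep/Glob fallback for this counting-and-report step could look like the following. The vault layout, folder names, and sample notes here are hypothetical stand-ins for the bracketed placeholders above, not the skill's actual configuration:

```shell
#!/bin/sh
# Hedged sketch: count topic matches per source group, then report.
# Build a throwaway vault so the example is self-contained.
VAULT=$(mktemp -d)
mkdir -p "$VAULT/logs" "$VAULT/inbox" "$VAULT/knowledge"
printf '## Key Decisions\nChose MCP server setup option A.\n' > "$VAULT/logs/2026-03-01.md"
printf 'Follow-up on MCP server setup.\n' > "$VAULT/inbox/note.md"

TOPIC="MCP server setup"

# grep -rl lists files containing the topic; wc -l counts them
M=$(grep -rl "$TOPIC" "$VAULT/logs" 2>/dev/null | wc -l)
P=$(grep -rl "$TOPIC" "$VAULT/inbox" 2>/dev/null | wc -l)
Q=$(grep -rl "$TOPIC" "$VAULT/knowledge" 2>/dev/null | wc -l)

N=$((M + P + Q))
echo "Found $N sources: $M session logs, $P inbox notes, $Q knowledge notes"

rm -rf "$VAULT"
```

A real run would substitute the configured [logs_folder], [inbox_folder], and project/knowledge/resource folders, and would extract the matching sections rather than only counting files.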

If N = 0: Stop and inform the user:

No notes found matching '[topic]'. Try a broader keyword or check the topic name.

Exit — do not proceed to Step 3.

If N > 20: Too many results — the keywords may be too broad. Use AskUserQuestion:

Found N sources for '[topic]' — that's a lot. Do you want to:

  1. Narrow the scope (I'll ask for more specific keywords or a date range)
  2. Continue with all N sources

If user picks option 1, call AskUserQuestion immediately (do not wait or proceed):

Please provide more specific keywords or a date range (e.g. "focus on MCP setup decisions from March 2026"):

Use the user's answer as refined search criteria and re-run the search from the top of Step 2. If the refined search still returns > 20 sources, inform the user and proceed with all N rather than asking again (one clarification cycle maximum). If user picks option 2 (continue with all N), proceed.


Step 3: Synthesize

Extract and consolidate across all sources:

  • Core question — what was being explored or decided?
  • What we found — key findings, facts, conclusions
  • Key decisions made — explicit choices that were committed to
  • Lessons — generalizable insights worth keeping long-term
  • Open questions — still unresolved as of the most recent source
  • Entities involved — tools, projects, people mentioned

Present a brief synthesis preview to the user before writing.


Step 4: Choose Destination

Suggest a subfolder in [knowledge_folder]/:

  • Infer topic category (e.g. "OneBrain memory architecture" → [knowledge_folder]/ai-systems/)
  • Present to user using AskUserQuestion: "I'd file this under [knowledge_folder]/[suggested-path]/. OK, or would you like a different path?"
  • If user declines, ask for the preferred path or subfolder name before proceeding.
  • If user cancels entirely, offer one more option via AskUserQuestion: "Save a draft to [inbox_folder]/YYYY-MM-DD-[topic]-draft.md instead?" If yes, save the Step 3 synthesis there. If no, discard.
  • Use the confirmed path for file creation.

Step 5: Write the Digest Note

Before writing: Check if [knowledge_folder]/[subfolder]/[Topic].md already exists.

  • If the file does not exist: create it.

  • If the file already exists: use AskUserQuestion to ask:

    A distilled note for "[Topic]" already exists. How do you want to handle this?

    1. Overwrite — replace with a fresh synthesis
    2. Append — add a ## Update — YYYY-MM-DD section with new findings
    3. Cancel

    If Append is chosen: before writing new content, read the existing digest note and check for any lessons that appear hedged or uncertain in phrasing (e.g. "might", "possibly", "unclear if", or legacy [conf:low] markers). If any exist, surface them:

    This note has M low-confidence lessons. Want to re-evaluate any before appending? (list them) User may promote or leave them as-is. If none exist, skip this silently and proceed to append.

    If Overwrite is chosen: read the existing file first to extract its created: date — this marks when the topic was first distilled and must be preserved. If no created: field exists in the file, use the file's filesystem modification date as a best-effort fallback; if that is also unavailable, use today's date and note the uncertainty in sources_span. Update sources_span to span from the original start date to today's date.
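
The created:-preservation rule in the Overwrite branch can be sketched in shell. The note content below is a hypothetical example written to a temp file; a real note would follow this skill's digest frontmatter:

```shell
#!/bin/sh
# Hedged sketch: extract the original created: date before overwriting,
# then widen sources_span to end today.
NOTE=$(mktemp)
printf -- '---\ntags: [distilled]\ncreated: 2026-01-10\nsources_span: 2026-01-10 to 2026-02-01\n---\n' > "$NOTE"

TODAY=$(date +%Y-%m-%d)

# Pull the existing created: value from the YAML frontmatter
CREATED=$(sed -n 's/^created: //p' "$NOTE" | head -n1)

# Fallback: if no created: field, use the file's modification date (GNU date -r)
[ -n "$CREATED" ] || CREATED=$(date -r "$NOTE" +%Y-%m-%d)

echo "created: $CREATED"
echo "sources_span: $CREATED to $TODAY"

rm -f "$NOTE"
```

The fresh synthesis would then be written with the preserved created: date and the widened sources_span, per the Overwrite instructions above.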

Create or update [knowledge_folder]/[subfolder]/[Topic].md:

```markdown
---
tags: [distilled, topic-tag]
created: YYYY-MM-DD
source: /distill
sources_span: YYYY-MM-DD to YYYY-MM-DD
---

# [Topic]

> **Distilled:** YYYY-MM-DD
> **Sources:** N session logs, M notes

## Core Question

[What was being explored or decided]

## What We Found

[Key findings and conclusions, bullet list]

## Key Decisions

[Explicit decisions made, with dates if known]

## Lessons

[Generalizable insights — use /learn to promote to memory/]
- [list generalizable insights]

## Open Questions

[Still unresolved]

## Related

[[link to related notes]]
```

Report:

Distilled into [path].

If you want any lesson to persist in long-term memory, promote it manually:

To promote a lesson: /learn [lesson text]


Step 6: Update qmd Index

If qmd_collection is set in vault.yml, run:

```bash
qmd update -c [qmd_collection]
```

Related Skills

Looking for an alternative to distill, or similar community skills to pair with it? Explore the related open-source skills below.

  • openclaw-release-maintainer (openclaw) — Your own personal AI assistant. Any OS. Any platform. The lobster way. 🦞
  • widget-generator (f) — Generates customizable plugin widgets for the prompts.chat feedback system
  • flags (vercel) — React framework
  • pr-review (pytorch) — Tensors and dynamic neural networks in Python with strong GPU acceleration