Skill Overview
Start with fit, limitations, and setup before diving into the repository.
xrk-llm is an AI agent skill for the LLM factory (src/factory/llm/llmfactory.js) in the sunflowermm/XRK-AGT repository.
Core Value
xrk-llm helps AI agents work with the LLM factory (src/factory/llm/llmfactory.js) and handle repository-specific developer workflows using documented implementation details.
Ideal Agent Persona
Ideal for AI agents that need to work with the LLM factory (src/factory/llm/llmfactory.js).
↓ Capabilities Granted for xrk-llm
! Prerequisites & Limits
- Requires repository-specific context from the skill documentation
- Works best when the underlying tools and dependencies are already configured
About The Source
The section below comes from the upstream repository. Use it as supporting material alongside the fit, use-case, and installation summary on this page.
Browser Sandbox Environment
⚡️ Ready to try it?
Experience this agent in a zero-setup browser environment powered by WebContainers. No installation required.
FAQ & Installation Steps
These questions and steps mirror the structured data on this page for better search understanding.
? Frequently Asked Questions
What is xrk-llm?
xrk-llm is an AI agent skill for the LLM factory (src/factory/llm/llmfactory.js) in the sunflowermm/XRK-AGT repository, ideal for agents that need its repository-specific context.
How do I install xrk-llm?
Run the command: npx killer-skills add sunflowermm/XRK-AGT/xrk-llm. It works with Cursor, Windsurf, VS Code, Claude Code, and 19+ other IDEs.
What are the use cases for xrk-llm?
Key use cases include: applying the LLM factory (src/factory/llm/LLMFactory.js), applying the OpenAI Chat protocol utilities (src/utils/llm/openai-chat-utils.js), and applying the v3 gateway (core/system-Core/http/ai.js, POST /api/v3/chat/completions).
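Since the v3 gateway exposes an OpenAI Chat-style route, a client call can be sketched as below. Only the route POST /api/v3/chat/completions comes from the skill docs; the base URL, port, model name, and response shape are assumptions based on the OpenAI Chat Completions convention, not confirmed details of this repository.

```javascript
// Build an OpenAI Chat Completions-style request body.
function buildChatRequest(model, userText) {
  return {
    model,
    messages: [{ role: "user", content: userText }],
  };
}

// Send the request to the v3 gateway. The host/port are hypothetical;
// substitute wherever your XRK-AGT instance actually listens.
async function chat(userText) {
  const res = await fetch("http://localhost:3000/api/v3/chat/completions", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildChatRequest("gpt-4o-mini", userText)),
  });
  const data = await res.json();
  // OpenAI-style responses put the reply under choices[0].message.content.
  return data.choices[0].message.content;
}
```

If the gateway follows the OpenAI protocol closely, existing OpenAI client libraries pointed at this base URL should also work.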
Which IDEs are compatible with xrk-llm?
This skill is compatible with Cursor, Windsurf, VS Code, Trae, Claude Code, OpenClaw, Aider, Codex, OpenCode, Goose, Cline, Roo Code, Kiro, Augment Code, Continue, GitHub Copilot, Sourcegraph Cody, and Amazon Q Developer. Use the Killer-Skills CLI for universal one-command installation.
Are there any limitations for xrk-llm?
Requires repository-specific context from the skill documentation. Works best when the underlying tools and dependencies are already configured.
↓ How To Install
1. Open your terminal
Open the terminal or command line in your project directory.
2. Run the install command
Run: npx killer-skills add sunflowermm/XRK-AGT/xrk-llm. The CLI will automatically detect your IDE or AI agent and configure the skill.
3. Start using the skill
The skill is now active. Your AI agent can use xrk-llm immediately in the current project.
! Source Notes
This page remains useful as an installation and source reference. Before relying on it, review the fit, limitations, and upstream repository notes above.
Upstream Repository Material
xrk-llm
Install xrk-llm, an AI agent skill for repository-specific LLM workflows and automation. Explore its features, use cases, limitations, and setup guidance.