streaming — community skill from yashness/yash-full-stack-personal

v0.0.1

About this Skill

A guide to the assistant-stream package and its streaming protocols, for AI agents that need real-time data processing from AI backends. Use when implementing streaming backends, custom protocols, or debugging stream issues.

yashness
Updated: 3/12/2026

Killer-Skills Review

Decision support comes first. Repository text comes second.

Reviewed. Landing Page Review Score: 9/11

Killer-Skills keeps this page indexable because it adds recommendation, limitations, and review signals beyond the upstream repository text.

  • Original recommendation layer
  • Concrete use-case guidance
  • Explicit limitations and caution
  • Quality floor passed for review
  • Locale and body language aligned
Review Score
9/11
Quality Score
56
Canonical Locale
en
Detected Body Locale
en


Core Value

Empowers agents to handle streaming data from AI backends using the assistant-stream package, enabling seamless integration with AI SDK data streams and native assistant-ui formats like assistant-transport, with support for various encoders and decoders.

Ideal Agent Persona

Perfect for AI Agents needing real-time data processing from AI backends using the assistant-stream package.

Capabilities Granted for streaming

Streaming AI-generated content in real-time
Processing live data streams from AI backends
Integrating AI SDK data streams with native assistant-ui formats

! Prerequisites & Limits

  • Requires the assistant-stream package
  • Dependent on AI backend data stream availability
  • Must consult the latest API documentation at assistant-ui.com/llms.txt

Source Boundary

The section below is imported from the upstream repository and should be treated as secondary evidence. Use the Killer-Skills review above as the primary layer for fit, risk, and installation decisions.

After The Review

Decide The Next Action Before You Keep Reading Repository Material

Killer-Skills should not stop at surfacing repository instructions. It should help you decide whether to install this skill, when to cross-check it against trusted collections, and when to move into workflow rollout.

Labs Demo

Browser Sandbox Environment

⚡️ Ready to unleash?

Experience this Agent in a zero-setup browser environment powered by WebContainers. No installation required.

Boot Container Sandbox

FAQ & Installation Steps

These questions and steps mirror the structured data on this page for better search understanding.

? Frequently Asked Questions

What is streaming?

A guide to the assistant-stream package and its streaming protocols, for AI agents that need real-time data processing from AI backends. Use when implementing streaming backends, custom protocols, or debugging stream issues.

How do I install streaming?

Run the command: npx killer-skills add yashness/yash-full-stack-personal/streaming. It works with Cursor, Windsurf, VS Code, Claude Code, and 19+ other IDEs.

What are the use cases for streaming?

Key use cases include: Streaming AI-generated content in real-time, Processing live data streams from AI backends, Integrating AI SDK data streams with native assistant-ui formats.

Which IDEs are compatible with streaming?

This skill is compatible with Cursor, Windsurf, VS Code, Trae, Claude Code, OpenClaw, Aider, Codex, OpenCode, Goose, Cline, Roo Code, Kiro, Augment Code, Continue, GitHub Copilot, Sourcegraph Cody, and Amazon Q Developer. Use the Killer-Skills CLI for universal one-command installation.

Are there any limitations for streaming?

Requires the assistant-stream package. Dependent on AI backend data stream availability. Must consult the latest API documentation at assistant-ui.com/llms.txt.

How To Install

  1. Open your terminal

    Open the terminal or command line in your project directory.

  2. Run the install command

    Run: npx killer-skills add yashness/yash-full-stack-personal/streaming. The CLI will automatically detect your IDE or AI agent and configure the skill.

  3. Start using the skill

    The skill is now active. Your AI agent can use streaming immediately in the current project.

Upstream Repository Material


Upstream Source

streaming

Install streaming, a community skill for AI agent workflows and automation. Review the use cases, limitations, and setup path before rollout.

SKILL.md
Supporting Evidence

assistant-ui Streaming

Always consult assistant-ui.com/llms.txt for latest API.

The assistant-stream package handles streaming from AI backends.

References

When to Use

Using Vercel AI SDK?
├─ Yes → toUIMessageStreamResponse() (no assistant-stream needed)
└─ No → assistant-stream for custom backends

Installation

```bash
npm install assistant-stream
```

Custom Streaming Response

```ts
import { createAssistantStreamResponse } from "assistant-stream";

export async function POST(req: Request) {
  return createAssistantStreamResponse(async (stream) => {
    stream.appendText("Hello ");
    stream.appendText("world!");

    // Tool call example
    const tool = stream.addToolCallPart({ toolCallId: "1", toolName: "get_weather" });
    tool.argsText.append('{"city":"NYC"}');
    tool.argsText.close();
    tool.setResponse({ result: { temperature: 22 } });

    stream.close();
  });
}
```

With useLocalRuntime

useLocalRuntime expects ChatModelRunResult chunks. Yield content parts for streaming:

```tsx
import { useLocalRuntime } from "@assistant-ui/react";

const runtime = useLocalRuntime({
  model: {
    async *run({ messages, abortSignal }) {
      const response = await fetch("/api/chat", {
        method: "POST",
        body: JSON.stringify({ messages }),
        signal: abortSignal,
      });

      const reader = response.body?.getReader();
      const decoder = new TextDecoder();
      let buffer = "";

      while (reader) {
        const { done, value } = await reader.read();
        if (done) break;

        buffer += decoder.decode(value, { stream: true });
        const parts = buffer.split("\n");
        buffer = parts.pop() ?? "";

        for (const chunk of parts.filter(Boolean)) {
          yield { content: [{ type: "text", text: chunk }] };
        }
      }
    },
  },
});
```

Debugging Streams

```ts
import { AssistantStream, DataStreamDecoder } from "assistant-stream";

const stream = AssistantStream.fromResponse(response, new DataStreamDecoder());
for await (const event of stream) {
  console.log("Event:", JSON.stringify(event, null, 2));
}
```

Stream Event Types

  • part-start with part.type = "text" | "reasoning" | "tool-call" | "source" | "file"
  • text-delta with streamed text
  • result with tool results
  • step-start, step-finish, message-finish
  • error strings
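As a hedged illustration of how a consumer might dispatch on these event types, the sketch below accumulates `text-delta` payloads into the final message text. The event shapes are simplified assumptions for illustration only, not the package's actual type definitions; consult assistant-ui.com/llms.txt for those.

```typescript
// Simplified, assumed event union (real types live in assistant-stream).
type StreamEvent =
  | { type: "part-start"; part: { type: "text" | "reasoning" | "tool-call" | "source" | "file" } }
  | { type: "text-delta"; textDelta: string }
  | { type: "result"; result: unknown }
  | { type: "message-finish" }
  | { type: "error"; error: string };

function accumulateText(events: StreamEvent[]): string {
  let text = "";
  for (const event of events) {
    switch (event.type) {
      case "text-delta":
        text += event.textDelta; // partial text arrives as deltas
        break;
      case "error":
        throw new Error(event.error); // surface stream errors to the caller
      default:
        break; // part-start, result, message-finish handled elsewhere
    }
  }
  return text;
}
```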

Common Gotchas

Stream not updating UI

  • Check Content-Type is text/event-stream
  • Check for CORS errors
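A minimal sketch of a streaming Response that sets the headers these checks look for. The helper, route payload, and wide-open CORS policy are illustrative assumptions, not something prescribed by assistant-stream:

```typescript
// Build a streaming Response with an SSE content type and a CORS header
// (uses the Web Response/ReadableStream globals available in Node 18+).
function streamingResponse(chunks: string[]): Response {
  const encoder = new TextEncoder();
  const body = new ReadableStream({
    start(controller) {
      for (const c of chunks) controller.enqueue(encoder.encode(c));
      controller.close();
    },
  });
  return new Response(body, {
    headers: {
      "Content-Type": "text/event-stream",
      "Access-Control-Allow-Origin": "*", // tighten this for real deployments
    },
  });
}
```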

Tool calls not rendering

  • addToolCallPart needs both toolCallId and toolName
  • Register tool UI with makeAssistantToolUI

Partial text not showing

  • Use text-delta events for streaming

Related Skills

Looking for an alternative to streaming or another community skill for your workflow? Explore these related open-source skills.

  • openclaw-release-maintainer (openclaw): Your own personal AI assistant. Any OS. Any Platform. The lobster way. 🦞
  • widget-generator (f): Generate customizable widget plugins for the prompts.chat feed system
  • flags (vercel): The React Framework
  • pr-review (pytorch): Tensors and Dynamic neural networks in Python with strong GPU acceleration