KS
Killer-Skills

build-and-deploy: setup and deployment guide for LangChain.js agents using LangGraph, the Supabase vector store, and the Vercel AI SDK

v1.0.0
GitHub

About this Skill

build-and-deploy is a LangChain.js agent skill that uses LangGraph, the Supabase vector store, and the Vercel AI SDK for streamlined deployment to Vercel. It is ideal for LangChain.js agents that need Supabase vector store integration.

Features

Utilizes Vercel AI SDK for seamless deployment
Employs LangGraph for retrieval tool functionality
Integrates with Supabase vector store for data management
Requires `vercel build --prod` and `vercel deploy --prebuilt --prod` for deployment
Supports environment variable setup via `.env` files
Follows critical rules for Git author permission issues in VM environments

# Core Topics

rebyteai-template
Updated: 3/6/2026

Quality Score

36 (Excellent, Top 5%), based on code quality & docs
Installation
Universal Install (auto-detects Cursor, Windsurf, and VS Code IDEs):
> npx killer-skills add rebyteai-template/langchain-retrieval-agent

Agent Capability Analysis

The build-and-deploy MCP Server by rebyteai-template is an open-source community integration for Claude and other AI agents, enabling seamless task automation and capability expansion. It covers build-and-deploy setup and LangChain.js agent deployment.

Ideal Agent Persona

Perfect for LangChain.js Agents needing streamlined deployment to Vercel with Supabase vector store integration.

Core Value

Empowers agents to deploy LangChain.js applications using LangGraph and Vercel AI SDK, simplifying the deployment process with `vercel build --prod` and `vercel deploy --prebuilt --prod` commands, while leveraging environment variables from `.env` files.

Capabilities Granted for build-and-deploy MCP Server

Deploying LangChain Retrieval Agents to Vercel
Configuring Supabase vector stores for AI models
Streamlining LangChain.js application deployment with automated build and deploy scripts

Prerequisites & Limits

  • Requires Vercel account and Git author permissions
  • Must use `vercel build --prod` and `vercel deploy --prebuilt --prod` for successful deployment
  • Dependent on LangGraph and Vercel AI SDK compatibility
Project files:

  • SKILL.md (3.0 KB)
  • .cursorrules (1.2 KB)
  • package.json (240 B)

# Tags

[No tags]
SKILL.md

Build and Deploy LangChain Retrieval Agent

CRITICAL: For Vercel, you MUST use `vercel build --prod` then `vercel deploy --prebuilt --prod`. Do NOT use `vercel --prod` or `vercel deploy --prod` directly; these will fail due to Git author permission issues in VM environments.

Workflow

1. Setup Environment Variables

Read `.env.example` to see all required variables:

```bash
cat .env.example
```

Create `.env` by reading values from the current environment:

For each variable in `.env.example`, read the value from the current environment and write it to `.env`. Example approach:

```bash
# Read .env.example and create .env with values from current environment
while IFS= read -r line || [[ -n "$line" ]]; do
  # Skip comments and empty lines
  [[ "$line" =~ ^#.*$ || -z "$line" ]] && continue
  # Extract variable name (before = sign)
  var_name=$(echo "$line" | cut -d'=' -f1)
  # Get value from environment
  var_value="${!var_name}"
  # Write to .env
  echo "${var_name}=${var_value}" >> .env
done < .env.example
```

Or manually inspect `.env.example` and create `.env` with the required values from environment variables.
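Once `.env` is generated, a quick check that no variable was left empty can catch a missing credential before the build. This is a sketch, not part of the skill itself; the sample file and variable names below are illustrative only.

```shell
#!/usr/bin/env bash
# Sample .env standing in for the generated file (illustrative values only).
cat > /tmp/demo.env <<'EOF'
OPENAI_API_KEY=sk-demo
SUPABASE_DB_URL=
EOF

# Count variables whose value ended up empty.
missing=0
while IFS='=' read -r key value; do
  # Skip comments and empty lines
  [[ "$key" =~ ^#.*$ || -z "$key" ]] && continue
  if [[ -z "$value" ]]; then
    echo "WARNING: $key is empty"
    missing=$((missing + 1))
  fi
done < /tmp/demo.env
echo "empty variables: $missing"
```

Here the check reports `SUPABASE_DB_URL` as empty; run the same loop against the real `.env` before proceeding.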

2. Install Dependencies

```bash
yarn install
```

3. Setup Supabase Vector Store

Get the database connection URL from Supabase Dashboard:

  • Go to Settings → Database → Connection string → URI
  • Copy the URI and add it to `.env` as `SUPABASE_DB_URL`
```bash
yarn setup-db
```

This creates the `pgvector` extension, `documents` table, and `match_documents` function in your Supabase project.
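Before running `yarn setup-db`, it can help to sanity-check that `SUPABASE_DB_URL` actually looks like a Postgres connection URI, since a malformed value will make setup fail. A minimal sketch; the sample URI below is a placeholder, not a real project:

```shell
#!/usr/bin/env bash
# Placeholder URI in the shape Supabase provides (Settings → Database → URI).
SUPABASE_DB_URL="postgresql://postgres:secret@db.example.supabase.co:5432/postgres"

# Accept both postgres:// and postgresql:// schemes.
if [[ "$SUPABASE_DB_URL" =~ ^postgres(ql)?:// ]]; then
  echo "SUPABASE_DB_URL looks like a Postgres URI"
else
  echo "SUPABASE_DB_URL does not look like a Postgres URI" >&2
  exit 1
fi
```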

4. Build

```bash
yarn build
```

5. Deploy

Vercel:

All `vercel` CLI commands require `-t <token>` or `--token <token>` for authentication.

```bash
# Pull project settings (also links project, creates .vercel/project.json)
vercel pull --yes -t $VERCEL_TOKEN

# Push env vars to Vercel (first time only)
# Must add to each environment separately
while IFS='=' read -r key value; do
  [[ "$key" =~ ^#.*$ || -z "$key" || -z "$value" ]] && continue
  for env in production preview development; do
    printf '%s' "$value" | vercel env add "$key" $env -t $VERCEL_TOKEN
  done
done < .env

# Build locally for production
vercel build --prod -t $VERCEL_TOKEN

# Deploy prebuilt
vercel deploy --prebuilt --prod --yes -t $VERCEL_TOKEN
```

Netlify:

```bash
# Import all env vars from .env (first time only)
netlify env:import .env

# Deploy
netlify deploy --prod
```

Critical Notes

  • VERCEL PREBUILT MODE IS MANDATORY: Always use `vercel build --prod` followed by `vercel deploy --prebuilt --prod`. Never use `vercel --prod` or `vercel deploy --prod` without the `--prebuilt` flag.
  • Supabase Required: You need a Supabase project (the free tier works)
  • Vector Store Setup: Run `yarn setup-db` to create tables automatically
  • Environment Variables: All values come from the current environment; inspect `.env.example` for required variables
  • OpenAI for Embeddings: `OPENAI_API_KEY` is always required for vector embeddings
  • No Dev Server: Never run `yarn dev` in the VM environment
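The environment-variable notes above can be enforced with a preflight check that fails fast when a variable named in `.env.example` is unset. This is a hypothetical helper, not part of the skill; the sample file and variables make the sketch self-contained.

```shell
#!/usr/bin/env bash
# Sample .env.example standing in for the real one (illustrative names only).
cat > /tmp/example.env <<'EOF'
# Required for vector embeddings
OPENAI_API_KEY=
DEMO_EXTRA_VAR=
EOF

# Simulate a fully configured environment for the demo.
export OPENAI_API_KEY="sk-demo"
export DEMO_EXTRA_VAR="value"

# Report every required variable that is missing from the environment.
ok=1
while IFS= read -r line || [[ -n "$line" ]]; do
  # Skip comments and empty lines
  [[ "$line" =~ ^#.*$ || -z "$line" ]] && continue
  var_name="${line%%=*}"
  if [[ -z "${!var_name}" ]]; then
    echo "missing: $var_name"
    ok=0
  fi
done < /tmp/example.env
[[ "$ok" -eq 1 ]] && echo "preflight OK"
```

Point the loop at the real `.env.example` and run it before `vercel build --prod` to avoid deploying with incomplete configuration.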

Related Skills

Looking for an alternative to build-and-deploy, or building a community AI agent? Explore these related open-source MCP Servers.

View All

widget-generator

f

widget-generator is an open-source AI agent skill for creating widget plugins that are injected into prompt feeds on prompts.chat. It supports two rendering modes: standard prompt widgets using default PromptCard styling and custom render widgets built as full React components.

149.6k
0
Design

chat-sdk

lobehub

chat-sdk is a unified TypeScript SDK for building chat bots across multiple platforms, providing a single interface for deploying bot logic.

73.0k
0
Communication

zustand

lobehub

The ultimate space for work and life — to find, build, and collaborate with agent teammates that grow with you. We are taking agent harness to the next level — enabling multi-agent collaboration, effortless agent team design, and introducing agents as the unit of work interaction.

72.8k
0
Communication

data-fetching

lobehub


72.8k
0
Communication