Killer-Skills

omi-backend-patterns — Community

v1.0.0
GitHub

About this Skill

Ideal for Conversation Analysis Agents requiring advanced backend integration with LangGraph, Firestore, Pinecone, or Redis. From Omi, the AI wearable: put it on, speak, and it transcribes automatically.

BasedHardware
Updated: 3/4/2026

Quality Score

Top 5% — 55 (Excellent), based on code quality & docs.
Installation
Universal install (auto-detect; works with Cursor, Windsurf, and VS Code):
> npx killer-skills add BasedHardware/omi/omi-backend-patterns

Agent Capability Analysis

The omi-backend-patterns MCP Server by BasedHardware is an open-source community integration for Claude and other AI agents, enabling seamless task automation and capability expansion.

Ideal Agent Persona

Ideal for Conversation Analysis Agents requiring advanced backend integration with LangGraph, Firestore, Pinecone, or Redis.

Core Value

Empowers agents to process conversations, extract memories, and integrate with the LangGraph chat system using Python, while providing robust backend patterns for API endpoint implementation and data storage with Firestore, Pinecone, or Redis.

Capabilities Granted for omi-backend-patterns MCP Server

Implementing new API endpoints for conversation processing
Extracting memories from user interactions
Integrating LangGraph with existing chat systems
Optimizing backend code in the `backend/` directory

Prerequisites & Limits

  • Requires Python backend environment
  • Specific to Omi backend and LangGraph integration
  • Needs access to Firestore, Pinecone, or Redis for data storage
Project
SKILL.md
4.9 KB
.cursorrules
1.2 KB
package.json
240 B

# Tags

[No tags]
SKILL.md

Omi Backend Patterns Skill

This skill provides guidance for working with the Omi backend, including conversation processing, memory extraction, chat system, and LangGraph integration.

When to Use

Use this skill when:

  • Working on backend Python code in backend/
  • Implementing new API endpoints
  • Processing conversations or extracting memories
  • Working with the LangGraph chat system
  • Integrating with Firestore, Pinecone, or Redis

Key Patterns

Conversation Processing

The conversation processing pipeline follows this flow:

  1. Audio arrives via WebSocket (/v4/listen)
  2. Transcription via Deepgram/Soniox/Speechmatics
  3. Conversation creation in Firestore (status: "in_progress")
  4. Processing trigger via POST /v1/conversations or timeout
  5. LLM extraction of structured data:
    • Title and overview
    • Action items
    • Calendar events
    • Memories (user facts)
  6. Storage in Firestore and Pinecone

Key Function: utils/conversations/process_conversation.py::process_conversation()
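The extraction and storage stages of the pipeline can be sketched as follows. This is a minimal illustration, not the real `process_conversation()`: the `Conversation` fields and the stubbed extraction logic are assumptions, and a real implementation would call an LLM for structured extraction and then write to Firestore and Pinecone.

```python
from dataclasses import dataclass, field


@dataclass
class Conversation:
    # Illustrative subset of the fields described in the pipeline above.
    transcript: str
    status: str = "in_progress"
    title: str = ""
    action_items: list = field(default_factory=list)
    memories: list = field(default_factory=list)


def process_conversation(conversation: Conversation) -> Conversation:
    # Steps 5-6, stubbed: a real implementation uses an LLM to extract
    # the title, overview, action items, calendar events, and memories,
    # then persists the result to Firestore and Pinecone.
    conversation.title = conversation.transcript.split(".")[0][:80]
    conversation.status = "completed"
    return conversation
```

The key state transition is `status` moving from `"in_progress"` to `"completed"`, which is what the processing trigger (explicit POST or timeout) ultimately drives.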

Memory Extraction

Memories are extracted from conversations using LLM:

```python
from utils.llm.conversation_processing import _extract_memories

memories = await _extract_memories(
    transcript=transcript,
    existing_memories=existing_memories,
)
```

Categories: personal, health, work, relationships, preferences

Chat System Architecture

The chat system uses LangGraph for routing:

  1. Classification: requires_context() determines path
  2. Simple Path: Direct LLM response (no context needed)
  3. Agentic Path: Full tool access with LangGraph ReAct agent
  4. Persona Path: Persona app responses

Key File: utils/retrieval/graph.py
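The four-way routing above can be sketched as a simple dispatcher. Only the name `requires_context()` and the path names come from the text; the keyword stub and the `route()` function below are illustrative assumptions (the real classifier is an LLM call inside the LangGraph graph).

```python
from typing import Optional


def requires_context(message: str) -> bool:
    # Stub classifier; the real requires_context() uses an LLM to decide
    # whether the message needs memories, conversations, or tools.
    return any(w in message.lower() for w in ("my", "remember", "yesterday"))


def route(message: str, persona: Optional[str] = None) -> str:
    if persona is not None:
        return "persona"   # 4. Persona Path: persona app responses
    if not requires_context(message):
        return "simple"    # 2. Simple Path: direct LLM response
    return "agentic"       # 3. Agentic Path: LangGraph ReAct agent with tools
```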

Module Hierarchy

CRITICAL: Always follow the import hierarchy:

  1. database/ - Data access (lowest)
  2. utils/ - Business logic
  3. routers/ - API endpoints
  4. main.py - Application entry

Lower levels must never import from higher levels!
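The rule can be stated as a tiny layering check: an importer may only depend on its own layer or a lower one. This checker is purely illustrative (it is not part of the Omi codebase), but it captures the constraint exactly.

```python
# Layer ranks from the hierarchy above: lower number = lower layer.
LAYERS = {"database": 0, "utils": 1, "routers": 2, "main": 3}


def import_allowed(importer: str, imported: str) -> bool:
    """A module may import only from its own layer or a lower one."""
    return LAYERS[imported] <= LAYERS[importer]
```

So `routers/` importing from `database/` is fine, while `database/` importing from `routers/` violates the hierarchy.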

Database Patterns

  • Firestore: Primary database for conversations, memories, users
  • Pinecone: Vector embeddings for semantic search
  • Redis: Caching (speech profiles, enabled apps, user names)
  • GCS: Binary files (audio, photos, speech profiles)
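The Redis role described above is classic cache-aside: read through the cache, fall back to the authoritative store, then populate the cache. The sketch below is an assumption, not Omi code; the key format, TTL, and `fetch_user_name` helper are invented for illustration.

```python
def get_user_name(redis_client, db, uid: str) -> str:
    """Cache-aside read: try Redis first, fall back to the primary database."""
    key = f"user_name:{uid}"
    cached = redis_client.get(key)
    if cached is not None:
        # redis-py returns bytes unless decode_responses=True is set.
        return cached.decode() if isinstance(cached, bytes) else cached
    name = db.fetch_user_name(uid)         # authoritative read (e.g. Firestore)
    redis_client.set(key, name, ex=3600)   # cache for one hour
    return name
```

The same shape applies to the other cached items listed (speech profiles, enabled apps).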

API Endpoint Patterns

  • Use FastAPI routers in routers/
  • Keep routers thin - business logic in utils/
  • Use dependency injection for auth
  • Return consistent error formats

Common Tasks

Adding a New API Endpoint

  1. Create router function in appropriate routers/*.py
  2. Add business logic in utils/
  3. Use database functions from database/
  4. Follow error handling patterns
  5. Add to router in main.py

Processing Conversations

  1. Use process_conversation() from utils/conversations/process_conversation.py
  2. Handle extraction results
  3. Store in Firestore and Pinecone
  4. Trigger app webhooks if needed

Adding a Chat Tool

  1. Create tool function in utils/retrieval/tools/
  2. Use @tool decorator from LangChain
  3. Add to tool loading in utils/retrieval/tools/app_tools.py
  4. Tool will be available in agentic chat path

Related Documentation

The docs/ folder is the single source of truth for all user-facing documentation, deployed at docs.omi.me.

  • Backend Deep Dive: docs/doc/developer/backend/backend_deepdive.mdx
  • Chat System: docs/doc/developer/backend/chat_system.mdx
  • Data Storage: docs/doc/developer/backend/StoringConversations.mdx
  • Transcription: docs/doc/developer/backend/transcription.mdx
  • Backend Setup: docs/doc/developer/backend/Backend_Setup.mdx
  • Backend Architecture: .cursor/rules/backend-architecture.mdc

Related Cursor Resources

Rules

  • .cursor/rules/backend-architecture.mdc - System architecture and module hierarchy
  • .cursor/rules/backend-api-patterns.mdc - FastAPI router patterns
  • .cursor/rules/backend-database-patterns.mdc - Database storage patterns
  • .cursor/rules/backend-llm-patterns.mdc - LLM integration patterns
  • .cursor/rules/backend-testing.mdc - Testing patterns
  • .cursor/rules/backend-imports.mdc - Import rules
  • .cursor/rules/memory-management.mdc - Memory management

Subagents

  • .cursor/agents/backend-api-developer/ - Uses this skill for API development
  • .cursor/agents/backend-llm-engineer/ - Uses this skill for LLM integration
  • .cursor/agents/backend-database-engineer/ - Uses this skill for database work

Commands

  • /backend-setup - Uses this skill for setup guidance
  • /backend-test - Uses this skill for testing patterns
  • /backend-deploy - Uses this skill for deployment patterns

Related Skills

Looking for an alternative to omi-backend-patterns, or building a community AI agent? Explore these related open-source MCP servers.


widget-generator

f

widget-generator is an open-source AI agent skill for creating widget plugins that are injected into prompt feeds on prompts.chat. It supports two rendering modes: standard prompt widgets using default PromptCard styling and custom render widgets built as full React components.

149.6k
0
Design

chat-sdk

lobehub

chat-sdk is a unified TypeScript SDK for building chat bots across multiple platforms, providing a single interface for deploying bot logic.

73.0k
0
Communication

zustand

lobehub

The ultimate space for work and life — to find, build, and collaborate with agent teammates that grow with you. We are taking agent harness to the next level — enabling multi-agent collaboration, effortless agent team design, and introducing agents as the unit of work interaction.

72.8k
0
Communication

data-fetching

lobehub


72.8k
0
Communication