KS
Killer-Skills

database-patterns — a setup and reference guide for database schema design: normalization, UUID primary keys, timestamps, and secure data management for AI agents.

v1.0.0
GitHub

About this Skill

Perfect for Data Analysis Agents needing robust database design principles with normalization and UUID implementation. database-patterns is a skill that generates autonomous projects with best practices for database design, including normalization, UUIDs, and timestamps, for secure and efficient data management.

Features

Applies normalization principles (3NF minimum) for efficient data storage
Utilizes UUIDs for distributed systems and auto-increment for simple cases
Includes timestamps on every table for robust data tracking
Generates projects with custom skills, agents, and security-reviewed code
Supports test-driven development (TDD) with 80%+ coverage
Enforces schema design principles for optimal database performance

# Core Topics

Wania-Kazmi
Updated: 3/6/2026

Quality Score

Top 5%: 39 (Excellent), based on code quality & docs
Installation
Universal Install (auto-detect): Cursor IDE, Windsurf IDE, VS Code
> npx killer-skills add Wania-Kazmi/claude-code-autonomous-agent-workflow/database-patterns

Agent Capability Analysis

The database-patterns MCP Server by Wania-Kazmi is an open-source community integration for Claude and other AI agents, enabling seamless task automation and capability expansion.

Ideal Agent Persona

Perfect for Data Analysis Agents needing robust database design principles with normalization and UUID implementation.

Core Value

Empowers agents to design optimal database schemas using normalization techniques, UUIDs for distributed systems, and timestamps for data tracking, ensuring data consistency and integrity across various applications and protocols like SQL.

Capabilities Granted for database-patterns MCP Server

Designing normalized database schemas for large-scale applications
Implementing UUID-based primary keys for distributed systems
Automating timestamp tracking for data modification and creation

Prerequisites & Limits

  • Requires understanding of database design principles
  • Limited to relational databases with support for UUID and timestamp data types
Project

  • SKILL.md (10.4 KB)
  • .cursorrules (1.2 KB)
  • package.json (240 B)

# Tags

[No tags]
SKILL.md

Database Patterns & Best Practices

Schema Design Principles

1. Normalization (3NF minimum)

  • No repeating groups
  • No partial dependencies
  • No transitive dependencies
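
As an illustration of the third rule, a table that stores a customer's city alongside each order has a transitive dependency (order → customer → city); moving the attribute to the table it actually depends on restores 3NF. A minimal sketch (table and column names are hypothetical):

```sql
-- NOT 3NF: customer_city depends on customer_id, not on the order key
CREATE TABLE orders_denormalized (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    customer_id UUID NOT NULL,
    customer_city VARCHAR(100),  -- transitive dependency
    total NUMERIC(10, 2) NOT NULL
);

-- 3NF: the transitive attribute moves to the table it depends on
CREATE TABLE customers (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    city VARCHAR(100)
);

CREATE TABLE orders (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    customer_id UUID NOT NULL REFERENCES customers(id),
    total NUMERIC(10, 2) NOT NULL
);
```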

2. Use UUIDs vs Auto-Increment

```sql
-- GOOD: UUID for distributed systems
id UUID PRIMARY KEY DEFAULT gen_random_uuid()

-- OK: Auto-increment for simple cases
id SERIAL PRIMARY KEY
```

3. Timestamps on Every Table

```sql
CREATE TABLE users (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    name VARCHAR(100) NOT NULL,
    email VARCHAR(255) UNIQUE NOT NULL,
    created_at TIMESTAMP WITH TIME ZONE DEFAULT NOW(),
    updated_at TIMESTAMP WITH TIME ZONE DEFAULT NOW()
);
```
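
Note that `DEFAULT NOW()` only sets `updated_at` on insert; PostgreSQL does not refresh it on `UPDATE`. One common fix is a trigger (ORMs like Prisma handle this at the application layer via `@updatedAt` instead); a sketch for the `users` table above:

```sql
-- Keep updated_at current on every row update
CREATE OR REPLACE FUNCTION set_updated_at() RETURNS trigger AS $$
BEGIN
    NEW.updated_at = NOW();
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER users_set_updated_at
    BEFORE UPDATE ON users
    FOR EACH ROW
    EXECUTE FUNCTION set_updated_at();
```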

Prisma ORM Patterns

Schema Definition

```prisma
// prisma/schema.prisma
// (Profile and Tag models omitted in this excerpt)
generator client {
  provider = "prisma-client-js"
}

datasource db {
  provider = "postgresql"
  url      = env("DATABASE_URL")
}

model User {
  id        String   @id @default(uuid())
  email     String   @unique
  name      String
  role      Role     @default(USER)
  posts     Post[]
  profile   Profile?
  createdAt DateTime @default(now())
  updatedAt DateTime @updatedAt

  @@index([email])
  @@map("users")
}

model Post {
  id        String   @id @default(uuid())
  title     String
  content   String?
  published Boolean  @default(false)
  author    User     @relation(fields: [authorId], references: [id])
  authorId  String
  tags      Tag[]
  createdAt DateTime @default(now())
  updatedAt DateTime @updatedAt

  @@index([authorId])
  @@index([published])
  @@map("posts")
}

enum Role {
  USER
  ADMIN
}
```

CRUD Operations

```typescript
import { PrismaClient, User } from '@prisma/client'

const prisma = new PrismaClient()

// Create
async function createUser(data: { name: string; email: string }) {
  return prisma.user.create({
    data,
    select: {
      id: true,
      name: true,
      email: true,
      createdAt: true
    }
  })
}

// Read with relations
async function getUserWithPosts(id: string) {
  return prisma.user.findUnique({
    where: { id },
    include: {
      posts: {
        where: { published: true },
        orderBy: { createdAt: 'desc' },
        take: 10
      },
      profile: true
    }
  })
}

// Update
async function updateUser(id: string, data: Partial<User>) {
  return prisma.user.update({
    where: { id },
    data
  })
}

// Delete (soft delete pattern; requires a nullable deletedAt field on the model)
async function deleteUser(id: string) {
  return prisma.user.update({
    where: { id },
    data: { deletedAt: new Date() }
  })
}
```

Transactions

```typescript
// Interactive transaction
async function transferFunds(fromId: string, toId: string, amount: number) {
  return prisma.$transaction(async (tx) => {
    const from = await tx.account.update({
      where: { id: fromId },
      data: { balance: { decrement: amount } }
    })

    if (from.balance < 0) {
      throw new Error('Insufficient funds')
    }

    await tx.account.update({
      where: { id: toId },
      data: { balance: { increment: amount } }
    })

    return tx.transaction.create({
      data: { fromId, toId, amount }
    })
  })
}

// Sequential transaction (CreateUserInput is an app-defined type)
async function createUserWithProfile(data: CreateUserInput) {
  return prisma.$transaction([
    prisma.user.create({ data: data.user }),
    prisma.profile.create({ data: data.profile })
  ])
}
```

Pagination

```typescript
// Cursor-based (recommended for large datasets)
async function getUsers(cursor?: string, limit = 20) {
  const users = await prisma.user.findMany({
    take: limit + 1,
    cursor: cursor ? { id: cursor } : undefined,
    orderBy: { createdAt: 'desc' },
    skip: cursor ? 1 : 0 // Skip the cursor itself
  })

  const hasMore = users.length > limit
  const data = hasMore ? users.slice(0, -1) : users

  return {
    data,
    nextCursor: hasMore ? data[data.length - 1].id : null
  }
}

// Offset-based
async function getUsersWithOffset(page = 1, limit = 20) {
  const [users, total] = await Promise.all([
    prisma.user.findMany({
      skip: (page - 1) * limit,
      take: limit
    }),
    prisma.user.count()
  ])

  return {
    data: users,
    meta: { total, page, limit, totalPages: Math.ceil(total / limit) }
  }
}
```

Query Optimization

Avoid N+1 Queries

```typescript
// BAD: N+1 problem
const users = await prisma.user.findMany()
for (const user of users) {
  const posts = await prisma.post.findMany({
    where: { authorId: user.id }
  })
}

// GOOD: Include in single query
const users = await prisma.user.findMany({
  include: { posts: true }
})

// GOOD: Select only needed fields
const users = await prisma.user.findMany({
  select: {
    id: true,
    name: true,
    posts: {
      select: {
        id: true,
        title: true
      }
    }
  }
})
```

Use Indexes Properly

```prisma
model Post {
  id        String   @id
  authorId  String
  status    Status
  createdAt DateTime

  // Compound index for common query patterns
  @@index([authorId, status])
  @@index([status, createdAt])
}
```
```sql
-- For frequently queried columns
CREATE INDEX idx_posts_author_status ON posts(author_id, status);

-- For text search
CREATE INDEX idx_posts_title_gin ON posts USING gin(to_tsvector('english', title));

-- For JSON columns
CREATE INDEX idx_metadata_gin ON posts USING gin(metadata);
```

Batch Operations

```typescript
// Batch create
await prisma.user.createMany({
  data: users,
  skipDuplicates: true
})

// Batch update with raw SQL (when needed)
await prisma.$executeRaw`
  UPDATE posts
  SET status = 'archived'
  WHERE created_at < NOW() - INTERVAL '1 year'
`

// Batch delete (rows soft-deleted more than 30 days ago)
await prisma.user.deleteMany({
  where: {
    deletedAt: { not: null, lt: thirtyDaysAgo }
  }
})
```

Migration Patterns

Prisma Migrations

```bash
# Create migration
npx prisma migrate dev --name add_user_role

# Apply in production
npx prisma migrate deploy

# Reset database (dev only)
npx prisma migrate reset
```

Safe Schema Changes

```prisma
// Step 1: Add nullable column
model User {
  newField String? // nullable first
}
```
```typescript
// Step 2: Backfill data
await prisma.$executeRaw`
  UPDATE users SET new_field = 'default' WHERE new_field IS NULL
`
```
```prisma
// Step 3: Make non-nullable
model User {
  newField String @default("default")
}
```

Rollback Strategy

```sql
-- Always create down migrations
-- up.sql
ALTER TABLE users ADD COLUMN phone VARCHAR(20);

-- down.sql
ALTER TABLE users DROP COLUMN phone;
```

Connection Management

Connection Pooling

```typescript
import { PrismaClient } from '@prisma/client'

// Singleton pattern for Prisma
const globalForPrisma = globalThis as unknown as {
  prisma: PrismaClient | undefined
}

export const prisma = globalForPrisma.prisma ?? new PrismaClient({
  log: ['error', 'warn'],
  datasources: {
    db: {
      url: process.env.DATABASE_URL
    }
  }
})

if (process.env.NODE_ENV !== 'production') {
  globalForPrisma.prisma = prisma
}
```

Serverless Configuration

```typescript
// For serverless (Vercel, AWS Lambda)
import { PrismaClient } from '@prisma/client'
import { Pool } from '@neondatabase/serverless'
import { PrismaNeon } from '@prisma/adapter-neon'

const pool = new Pool({ connectionString: process.env.DATABASE_URL })
const adapter = new PrismaNeon(pool)
const prisma = new PrismaClient({ adapter })
```

Soft Delete Pattern

```prisma
model User {
  id        String    @id @default(uuid())
  email     String    @unique
  deletedAt DateTime?

  @@index([deletedAt])
}
```
```typescript
// Middleware for automatic filtering
prisma.$use(async (params, next) => {
  if (params.model === 'User') {
    if (params.action === 'findMany' || params.action === 'findFirst') {
      params.args.where = {
        ...params.args.where,
        deletedAt: null
      }
    }
  }
  return next(params)
})

// Soft delete
async function softDelete(id: string) {
  return prisma.user.update({
    where: { id },
    data: { deletedAt: new Date() }
  })
}

// Hard delete (permanent)
async function hardDelete(id: string) {
  return prisma.user.delete({ where: { id } })
}
```

Audit Trail Pattern

```prisma
model AuditLog {
  id         String   @id @default(uuid())
  entityType String
  entityId   String
  action     String // CREATE, UPDATE, DELETE
  changes    Json?
  userId     String?
  createdAt  DateTime @default(now())

  @@index([entityType, entityId])
  @@index([userId])
  @@index([createdAt])
}
```
```typescript
// Middleware for automatic audit logging
// (getCurrentUserId() is an app-defined helper, e.g. reading request context)
prisma.$use(async (params, next) => {
  const result = await next(params)

  if (['create', 'update', 'delete'].includes(params.action)) {
    await prisma.auditLog.create({
      data: {
        entityType: params.model!,
        entityId: result.id,
        action: params.action.toUpperCase(),
        changes: params.args.data,
        userId: getCurrentUserId()
      }
    })
  }

  return result
})
```

Multi-Tenant Pattern

```prisma
model Organization {
  id    String @id @default(uuid())
  name  String
  users User[]
  posts Post[]
}

model User {
  id             String       @id @default(uuid())
  organization   Organization @relation(fields: [organizationId], references: [id])
  organizationId String

  @@index([organizationId])
}
```
```typescript
// Row-level security with Prisma extension
// (getTenantId() and hasTenantField() are app-defined helpers)
const prismaWithTenant = prisma.$extends({
  query: {
    $allModels: {
      async $allOperations({ args, query, model }) {
        const tenantId = getTenantId()

      if (tenantId && hasTenantField(model)) {
          args.where = { ...args.where, organizationId: tenantId }
        }

        return query(args)
      }
    }
  }
})
```

Raw SQL When Needed

```typescript
// Complex aggregations (Stats is an app-defined result type)
const stats = await prisma.$queryRaw<Stats[]>`
  SELECT
    DATE_TRUNC('day', created_at) as date,
    COUNT(*) as count,
    SUM(amount) as total
  FROM transactions
  WHERE created_at >= ${startDate}
  GROUP BY DATE_TRUNC('day', created_at)
  ORDER BY date DESC
`

// Full-text search
const results = await prisma.$queryRaw<Post[]>`
  SELECT * FROM posts
  WHERE to_tsvector('english', title || ' ' || content)
    @@ plainto_tsquery('english', ${searchTerm})
  ORDER BY ts_rank(
    to_tsvector('english', title || ' ' || content),
    plainto_tsquery('english', ${searchTerm})
  ) DESC
  LIMIT ${limit}
`
```

Checklist

  • UUIDs for distributed systems
  • Timestamps on all tables
  • Proper indexes for query patterns
  • N+1 queries avoided (use include/join)
  • Transactions for multi-step operations
  • Soft delete where appropriate
  • Connection pooling configured
  • Migrations tested (up and down)
  • Audit logging for sensitive data
  • Query performance monitored

Related Skills

Looking for an alternative to database-patterns or building a community AI Agent? Explore these related open-source MCP Servers.

View All

widget-generator

f

widget-generator is an open-source AI agent skill for creating widget plugins that are injected into prompt feeds on prompts.chat. It supports two rendering modes: standard prompt widgets using default PromptCard styling and custom render widgets built as full React components.

149.6k
0
Design

chat-sdk

lobehub

chat-sdk is a unified TypeScript SDK for building chat bots across multiple platforms, providing a single interface for deploying bot logic.

73.0k
0
Communication

zustand

lobehub

The ultimate space for work and life — to find, build, and collaborate with agent teammates that grow with you. We are taking agent harness to the next level — enabling multi-agent collaboration, effortless agent team design, and introducing agents as the unit of work interaction.

72.8k
0
Communication

data-fetching

lobehub


72.8k
0
Communication