
How to Deploy & Use langfuse


1. Prerequisites

Runtime & Databases:

  • Node.js 18+ (for the TypeScript/Node.js backend)
  • Docker & Docker Compose (recommended for self-hosting)
  • PostgreSQL 15+ (primary database)
  • ClickHouse 23.3+ (analytics database for traces and observability data)
  • Redis (caching and job queue)

Optional Integrations:

  • OpenAI API key (or other LLM provider keys) for evaluations and prompt management
  • OpenTelemetry compatible instrumentation for tracing
  • LangChain, OpenAI SDK, or LiteLLM for native integration

Accounts (Optional):

  • Cloud deployment platforms (Vercel, Railway, AWS, GCP, Azure)
  • Docker Hub account for container registry

2. Installation

Clone the Repository

git clone https://github.com/langfuse/langfuse.git
cd langfuse

Install Dependencies

# Install root dependencies
npm install

# Install workspace dependencies
npm run install:all

Docker Setup (Recommended)

# Using Docker Compose (includes all services)
docker-compose up -d

# Or pull individual images
docker pull langfuse/langfuse:latest
docker pull clickhouse/clickhouse-server:latest
docker pull postgres:15
docker pull redis:alpine

Manual Database Setup

If not using Docker Compose:

  1. PostgreSQL: Create database langfuse
  2. ClickHouse: Create database langfuse
  3. Redis: Standard setup

3. Configuration

Environment Variables

Create a .env file in the root directory:

# Database
DATABASE_URL="postgresql://postgres:password@localhost:5432/langfuse"
CLICKHOUSE_URL="http://localhost:8123"
CLICKHOUSE_USER="default"
CLICKHOUSE_PASSWORD=""
REDIS_URL="redis://localhost:6379"

# Application
NEXTAUTH_SECRET="your-secret-key-here"
NEXTAUTH_URL="http://localhost:3000"
# 256-bit key as 64 hexadecimal characters (generate with: openssl rand -hex 32)
ENCRYPTION_KEY=""

# Optional: Authentication providers
AUTH_GOOGLE_CLIENT_ID=""
AUTH_GOOGLE_CLIENT_SECRET=""
AUTH_GITHUB_CLIENT_ID=""
AUTH_GITHUB_CLIENT_SECRET=""

# Optional: Email (SMTP)
SMTP_HOST=""
SMTP_PORT=""
SMTP_USER=""
SMTP_PASSWORD=""
SMTP_FROM=""

# Optional: LLM API Keys for evaluations
OPENAI_API_KEY=""
ANTHROPIC_API_KEY=""
GROQ_API_KEY=""
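
The placeholder secrets above should be generated rather than hand-typed. A minimal sketch using OpenSSL (assuming openssl is installed):

```shell
# NEXTAUTH_SECRET: any long random string works; base64 of 32 bytes is common
openssl rand -base64 32

# ENCRYPTION_KEY: a 256-bit key, i.e. 64 hexadecimal characters
openssl rand -hex 32
```

Paste each output into the corresponding variable in .env.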

API Keys Configuration

For evaluation jobs and prompt management, configure model services in the web interface or via environment variables:

// Example configuration from evalService.ts
const evalModelService = new DefaultEvalModelService({
  openaiApiKey: process.env.OPENAI_API_KEY,
  anthropicApiKey: process.env.ANTHROPIC_API_KEY,
  // ... other model configurations
});
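
Not every provider key needs to be set; evaluation jobs can only use providers whose keys are actually present. A hypothetical sketch of that check (resolveEvalProviders is an illustrative name, not part of the Langfuse source):

```typescript
// Hypothetical helper: collect the providers that are actually configured,
// so eval jobs can skip providers with no credentials.
type ProviderKeys = Record<string, string | undefined>;

function resolveEvalProviders(env: ProviderKeys): string[] {
  const candidates: Record<string, string | undefined> = {
    openai: env.OPENAI_API_KEY,
    anthropic: env.ANTHROPIC_API_KEY,
    groq: env.GROQ_API_KEY,
  };
  return Object.entries(candidates)
    .filter(([, key]) => key !== undefined && key.length > 0)
    .map(([name]) => name);
}
```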

Database Schema Initialization

# Run database migrations
npm run db:migrate

# Seed development data (optional)
npm run db:seed

4. Build & Run

Development Mode

# Start all services in development
npm run dev

# Or start individually:
npm run dev:web      # Frontend (Next.js)
npm run dev:worker   # Background worker (eval jobs)
npm run dev:ingest   # Ingestion API

Production Build

# Build all packages
npm run build

# Build individual packages
npm run build:web
npm run build:worker
npm run build:ingest
npm run build:shared

Running Production Build

# Start production server
npm run start

# Or using PM2 (recommended for production)
npm install -g pm2
pm2 start ecosystem.config.js
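
The ecosystem.config.js file referenced above is not shown in this guide; a minimal sketch, assuming the npm start scripts described earlier (process names and script arguments are assumptions to adapt):

```
// Hypothetical PM2 config; adjust the scripts to your actual build output.
module.exports = {
  apps: [
    { name: "langfuse-web", script: "npm", args: "run start" },
    { name: "langfuse-worker", script: "npm", args: "run start:worker" },
  ],
};
```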

Docker Production Build

# Build Docker images
docker build -t langfuse-web -f Dockerfile.web .
docker build -t langfuse-worker -f Dockerfile.worker .
docker build -t langfuse-ingest -f Dockerfile.ingest .

# Run with Docker Compose
docker-compose -f docker-compose.prod.yml up -d

5. Deployment

Platform Recommendations

Based on the TypeScript/Next.js stack with PostgreSQL, ClickHouse, and Redis:

Easiest (All-in-one):

  • Docker Compose on any VPS (DigitalOcean, AWS EC2, Hetzner)
  • Railway or Render (handles databases automatically)

Scalable (Managed services):

  • Frontend: Vercel (optimized for Next.js)
  • Backend/Worker: AWS ECS/EKS, Google Cloud Run, Azure Container Apps
  • Databases:
    • PostgreSQL: AWS RDS, Google Cloud SQL, Azure Database
    • ClickHouse: ClickHouse Cloud (available on AWS/GCP) or self-managed
    • Redis: Redis Cloud, AWS ElastiCache, Google Memorystore

Kubernetes (Production):

# Example deployment with Helm
helm install langfuse ./charts/langfuse \
  --set postgresql.enabled=true \
  --set clickhouse.enabled=true \
  --set redis.enabled=true

Deployment Steps

  1. Build container images (see Docker Production Build above)
  2. Push to container registry:
    docker tag langfuse-web your-registry/langfuse-web:latest
    docker push your-registry/langfuse-web:latest
    
  3. Configure production environment variables
  4. Deploy using your platform's orchestration tool

Health Checks

After deployment, verify:

  • Web UI: http://your-domain.com
  • Health endpoint: http://your-domain.com/api/public/health
  • Public API: http://your-domain.com/api/public

6. Troubleshooting

Common Issues

Database Connection Errors:

# Check PostgreSQL
pg_isready -h localhost -p 5432

# Check ClickHouse
curl "http://localhost:8123/ping"

# Check Redis
redis-cli ping

Migration Failures:

# Reset and re-run migrations
npm run db:reset
npm run db:migrate

Evaluation Jobs Not Running:

  • Check worker logs: npm run dev:worker or check Docker logs
  • Verify Redis connection for job queue
  • Check ClickHouse for trace data existence (eval jobs query traces)
  • Verify model API keys are configured

ClickHouse Query Errors: Check the query builder in event-query-builder.ts for syntax issues:

// Example from source: ensure proper field mapping
const EVENTS_FIELDS = {
  id: "e.span_id as id",
  traceId: 'e.trace_id as "trace_id"',
  projectId: 'e.project_id as "project_id"',
  // ... other fields
};
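
A quick way to rule out mapping mistakes is to assemble the SELECT clause from the field map and inspect it. A hypothetical sketch of the idea (the real builder in event-query-builder.ts is more involved):

```typescript
// Hypothetical sketch: turn a field map like EVENTS_FIELDS into a SELECT
// clause, rejecting unknown fields so typos surface immediately.
const FIELDS: Record<string, string> = {
  id: "e.span_id as id",
  traceId: 'e.trace_id as "trace_id"',
  projectId: 'e.project_id as "project_id"',
};

function buildSelect(fields: string[]): string {
  const exprs = fields.map((f) => {
    const expr = FIELDS[f];
    if (!expr) throw new Error(`Unknown field: ${f}`);
    return expr;
  });
  return `SELECT ${exprs.join(", ")} FROM events e`;
}
```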

Memory Issues: Langfuse uses ClickHouse for analytics. Ensure:

  • ClickHouse has sufficient memory (≥4GB recommended)
  • PostgreSQL connection pool configured properly
  • Redis eviction policy: use noeviction when the same Redis instance backs the job queue, since queued jobs must not be evicted
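
If one Redis instance serves both the cache and the job queue, note that BullMQ-style queues assume Redis never evicts their keys, so noeviction is the safe policy there (a sketch; adjust the memory limit to your instance):

```
# redis.conf fragment (or set at runtime via redis-cli CONFIG SET)
maxmemory 2gb
maxmemory-policy noeviction
```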

Authentication Problems:

  • Ensure NEXTAUTH_SECRET is set and consistent
  • Verify OAuth provider configurations
  • Check session storage (uses Redis)

Performance Issues:

  1. Database Indexes: Check PostgreSQL and ClickHouse indexes
  2. Query Optimization: Use trace filtering utilities:
    // From traceFilterUtils.ts
    const filter = mapTraceFilterColumn(filterColumn, value);
    
  3. Caching: Redis cache for prompts and frequent queries
  4. Connection Pooling: Configure database connection limits
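
The mapTraceFilterColumn call above follows the whitelist pattern: UI filter columns are translated to known database columns so user input never reaches the query text directly. A hypothetical sketch (names are illustrative, not the actual traceFilterUtils.ts code):

```typescript
// Hypothetical sketch: translate a UI-facing filter column into a database
// column plus a bound value, rejecting unknown columns outright.
const TRACE_FILTER_COLUMNS: Record<string, string> = {
  name: "t.name",
  userId: "t.user_id",
  sessionId: "t.session_id",
};

function mapTraceFilter(column: string, value: string) {
  const dbColumn = TRACE_FILTER_COLUMNS[column];
  if (!dbColumn) throw new Error(`Unsupported filter column: ${column}`);
  return { column: dbColumn, value };
}
```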

Getting Help

Logs and Monitoring

# View application logs
docker-compose logs -f web worker ingest

# Check specific service
docker logs langfuse-web

# Monitor background jobs
npm run dev:worker  # Shows eval job processing

Debugging Evaluations

From evalService.ts, key debugging points:

  1. Check eval template compilation
  2. Verify variable extraction from traces
  3. Check LLM API responses
  4. Monitor job execution status:
    // Job execution states
    type JobExecutionState = "PENDING" | "RUNNING" | "COMPLETED" | "FAILED";
    
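When watching job execution status, a per-state tally makes a stuck or failing queue obvious at a glance. A hypothetical helper (re-declares the state union for self-containment; not from the Langfuse source):

```typescript
type JobExecutionState = "PENDING" | "RUNNING" | "COMPLETED" | "FAILED";

// Hypothetical helper: tally job executions by state, e.g. to spot a
// growing FAILED count while debugging evaluations.
function countByState(
  jobs: { state: JobExecutionState }[],
): Record<JobExecutionState, number> {
  const counts: Record<JobExecutionState, number> = {
    PENDING: 0,
    RUNNING: 0,
    COMPLETED: 0,
    FAILED: 0,
  };
  for (const job of jobs) counts[job.state] += 1;
  return counts;
}
```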

Reset and Recovery

# Full reset (development only!)
npm run db:reset
docker-compose down -v
docker-compose up -d

# Preserve data but restart
docker-compose restart
docker system prune -f  # Clean up unused containers/images (volumes are kept)