
LibreChat Deployment and Usage Guide

1. Prerequisites

Before installing LibreChat, ensure you have the following:

Runtime & Tools:

  • Node.js (v18 or higher; recent releases may require v20 - check the engines field in package.json) - Required for both frontend and backend
  • npm or yarn or pnpm - Package manager
  • Git - For cloning the repository
  • Docker (Optional) - For containerized deployment
  • Python 3.8+ (Optional) - For optional Python-based companion services (e.g., the RAG API used for file search)

API Accounts & Keys:

  • At least one AI provider API key:
    • OpenAI API key (for GPT models)
    • Anthropic API key (for Claude models)
    • Google AI/Vertex AI key (for Gemini models)
    • Azure OpenAI credentials
    • AWS Bedrock access
    • OpenRouter API key
    • Custom endpoint (for Ollama, LM Studio, LocalAI, etc.)
  • MongoDB instance (local or cloud-based like MongoDB Atlas)
  • Optional: Redis for caching (recommended for production)
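
A quick way to confirm the required tools above are on the PATH before proceeding (a minimal sketch; extend the list with docker or python3 as needed):

```shell
# Check that each required tool from the list above is installed
missing=""
for cmd in node npm git; do
  command -v "$cmd" >/dev/null 2>&1 || missing="$missing $cmd"
done

if [ -z "$missing" ]; then
  echo "all required tools found"
else
  echo "missing tools:$missing"
fi
```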

2. Installation

Clone and set up the project:

# Clone the repository
git clone https://github.com/danny-avila/LibreChat.git
cd LibreChat

# Install dependencies (using npm as shown in package.json)
npm install

# Or with pnpm (if preferred)
pnpm install

Note: LibreChat uses a monorepo structure: shared packages live in /packages, the frontend in /client, and the API server in /api.

3. Configuration

Environment Variables

Create a .env file in the root directory (the repository ships a .env.example you can copy as a starting point) with the following essential variables:

# Server
HOST=localhost
PORT=3080

# MongoDB Connection
MONGO_URI=mongodb://127.0.0.1:27017/LibreChat

# Auth secrets (generate each with: openssl rand -hex 32)
JWT_SECRET=your-jwt-secret
JWT_REFRESH_SECRET=your-jwt-refresh-secret

# AI Provider API Keys (configure at least one)
OPENAI_API_KEY=sk-your-openai-key
ANTHROPIC_API_KEY=your-anthropic-key
GOOGLE_KEY=your-google-ai-key

# Optional: Redis for caching
REDIS_URI=redis://localhost:6379

# Optional: Enable specific features
ALLOW_REGISTRATION=true
APP_TITLE=LibreChat
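
The secret values above can be generated from the shell, for example (assuming openssl is installed):

```shell
# Generate a 32-byte random secret, hex-encoded (64 characters)
secret=$(openssl rand -hex 32)
echo "generated secret of length ${#secret}"
```

Run it once per secret variable and paste each value into .env.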

Advanced Configuration

For production deployments or advanced features, create a librechat.yaml configuration file in the project root. Its schema is versioned and evolves between releases; the sketch below shows the custom-endpoints section (consult the official configuration reference for the full, current schema). Features such as web search, image generation, and the Code Interpreter are also configured in this file - see the docs for their exact keys.

# Example librechat.yaml (illustrative; check the docs for your version's schema)
version: 1.2.1
cache: true

endpoints:
  custom:
    - name: "OpenRouter"
      apiKey: "${OPENROUTER_KEY}"
      baseURL: "https://openrouter.ai/api/v1"
      models:
        default: ["meta-llama/llama-3.1-70b-instruct"]  # placeholder model name
        fetch: true  # fetch the live model list from the endpoint

API Key Configuration for Different Providers

  1. OpenAI: Set OPENAI_API_KEY in .env
  2. Anthropic: Set ANTHROPIC_API_KEY in .env
  3. Google: Set GOOGLE_KEY in .env (Vertex AI instead uses a service-account key; see the docs)
  4. Azure OpenAI: Configure via librechat.yaml with your endpoint details and API key
  5. OpenRouter and other custom endpoints: Configure under endpoints.custom in librechat.yaml
  6. Local Models (Ollama): Add a custom endpoint with base URL http://localhost:11434/v1 (Ollama's OpenAI-compatible API)
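
Extending item 6, a custom endpoint for a local Ollama server might look like this in librechat.yaml (field names follow the custom-endpoint schema in the official configuration docs; the model name is a placeholder):

```yaml
endpoints:
  custom:
    - name: "Ollama"
      apiKey: "ollama"                      # Ollama ignores the key, but the field is required
      baseURL: "http://localhost:11434/v1"  # Ollama's OpenAI-compatible API
      models:
        default: ["llama3"]                 # placeholder; any locally pulled model works
        fetch: true                         # fetch the model list from the server
```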

4. Build & Run

Development Mode

# Script names below follow the repository's package.json; verify them there
# Start the backend API with hot reload (port 3080)
npm run backend:dev

# In a second terminal, start the frontend dev server
npm run frontend:dev

The API serves at http://localhost:3080; the frontend dev server prints its own local URL when it starts.

Production Build

# Build the client (script names per package.json)
npm run frontend

# Start the production server (serves the built client on port 3080)
npm run backend

Docker Deployment

# Run with Docker Compose (the repository ships a docker-compose.yml)
docker compose up -d

# Or build and run the image directly
docker build -t librechat .
docker run -p 3080:3080 --env-file .env librechat

5. Deployment

Recommended Platforms

Based on the tech stack (TypeScript, React, Node.js/Express, MongoDB), consider these deployment options:

Platform-as-a-Service (PaaS):

  • Railway (Recommended) - One-click deploy via Railway template
  • Zeabur - Deploy via Zeabur template
  • Render - Good for Node.js applications with MongoDB
  • Fly.io - Good for global distribution

Container Platforms:

  • Sealos - Deploy via Sealos template
  • Kubernetes (EKS, GKE, AKS) - For scalable production deployments
  • Docker Swarm - For simpler container orchestration

Traditional VPS:

  • DigitalOcean, Linode, Vultr - Manual deployment with Node.js
  • AWS EC2, Google Compute Engine, Azure VMs - Cloud VMs with full control

Railway Deployment (Simplest)

  1. Click the "Deploy on Railway" button in the README
  2. Connect your GitHub repository
  3. Add environment variables in Railway dashboard
  4. Deploy - add a MongoDB service in Railway (the template provisions one) and Railway builds and runs the application

Manual VPS Deployment

# On your server
git clone https://github.com/danny-avila/LibreChat.git
cd LibreChat

# Install dependencies (dev dependencies are needed for the build step)
npm install

# Build the client (script names per package.json)
npm run frontend

# Set up process manager (PM2 recommended)
npm install -g pm2

# Start the API server with PM2
pm2 start npm --name "librechat" -- run backend

# Save PM2 configuration
pm2 save
pm2 startup

# Set up reverse proxy (Nginx example)
sudo nano /etc/nginx/sites-available/librechat

Nginx configuration:

server {
    listen 80;
    server_name your-domain.com;
    
    location / {
        proxy_pass http://localhost:3080;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection 'upgrade';
        proxy_set_header Host $host;
        proxy_cache_bypass $http_upgrade;
    }
}
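
Once a certificate is issued (e.g., via certbot), the site can be served over HTTPS with a parallel server block; the certificate paths below assume Let's Encrypt defaults and are illustrative:

```nginx
server {
    listen 443 ssl;
    server_name your-domain.com;

    # Paths assume certbot's default layout
    ssl_certificate     /etc/letsencrypt/live/your-domain.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/your-domain.com/privkey.pem;

    location / {
        proxy_pass http://localhost:3080;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection 'upgrade';
        proxy_set_header Host $host;
        proxy_cache_bypass $http_upgrade;
    }
}
```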

6. Troubleshooting

Common Issues and Solutions

1. MongoDB Connection Issues

# Error: "MongoDB connection failed"
# Solution: Verify MongoDB is running and accessible
mongod --version  # Check if MongoDB is installed
sudo systemctl status mongod  # Check service status
# Update MONGO_URI in .env to the correct connection string

2. Port Already in Use

# Error: "Port 3080 already in use"
# Solution: Change port or kill existing process
lsof -i :3080  # Find process using port
kill -9 <PID>  # Kill the process
# OR change port in .env: PORT=3081

3. API Key Errors

# Error: "Invalid API key" or "No API key provided"
# Solution: Verify API keys are set correctly
echo $OPENAI_API_KEY  # Check if environment variable is set
# Ensure .env file is in root directory and properly formatted
# Restart the application after changing .env
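
A small sketch for checking that a .env file defines at least one provider key (it writes a demo file here for illustration; point the grep at your real .env):

```shell
# Create a demo .env for illustration (substitute your real .env)
cat > .env.demo <<'EOF'
OPENAI_API_KEY=sk-your-openai-key
APP_TITLE=LibreChat
EOF

# Look for any known provider key at the start of a line
found=no
grep -Eq '^(OPENAI_API_KEY|ANTHROPIC_API_KEY|GOOGLE_)' .env.demo && found=yes
echo "provider key present: $found"
```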

4. Build Failures

# Error: "TypeScript compilation failed"
# Solution: Clear build cache and reinstall
npm run clean  # If available
rm -rf node_modules client/dist  # LibreChat builds the client with Vite, not Next.js
npm install
npm run frontend  # build script name per package.json

5. Memory Issues (Code Interpreter)

# Error: "Sandbox timeout" or memory errors
# Solution: Increase the execution timeout or limit uploaded file sizes
# In librechat.yaml (key names below are illustrative; check your version's
# configuration reference for the exact schema):
features:
  codeInterpreter:
    sandboxTimeout: 60000  # Increase to 60 seconds
    maxFileSize: 10485760  # 10MB limit

6. Authentication Issues

# Error: "invalid signature" or users are logged out unexpectedly
# Solution: Ensure JWT_SECRET and JWT_REFRESH_SECRET are set in .env
openssl rand -hex 32  # Generate a new secret
# Update DOMAIN_CLIENT and DOMAIN_SERVER to match your deployment URL

7. Docker Container Fails to Start

# Error: Container exits immediately
# Solution: Check logs and environment variables
docker logs <container_id>
docker run -it --env-file .env librechat /bin/sh  # Debug shell
# Ensure all required env vars are in .env file

Getting Help:

  • The official LibreChat documentation site
  • GitHub Issues and Discussions on the danny-avila/LibreChat repository
  • The LibreChat community Discord server