Source Repository
This documentation is from amiable-dev/litellm-langfuse-railway.
Last synced: 2026-01-03 | Commit: 5a45454
Quick Setup Guide: LiteLLM + Langfuse on Railway¶
This guide walks you through manually setting up the stack on Railway using the web dashboard.
Prerequisites¶
- Railway account (free tier works for testing)
- LLM API keys (OpenAI, Anthropic, etc.)
Step 1: Create a New Project¶
- Go to railway.app
- Click "New Project"
- Select "Empty Project"
- Name it "LiteLLM-Langfuse"
Step 2: Deploy Infrastructure Services¶
2.1 PostgreSQL¶
- Click "+ New" → "Database" → "PostgreSQL"
- Wait for deployment
- Copy the DATABASE_URL from the service's variables (you'll need it later)
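For orientation only, the connection string Railway generates usually has the shape below; the hostname, database name, and password here are illustrative assumptions, so always copy the real value from the PostgreSQL service's variables:

```bash
# Illustrative shape only - use the actual value from the PostgreSQL service's variables.
DATABASE_URL=postgresql://postgres:<password>@postgres.railway.internal:5432/railway
```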
2.2 Redis¶
- Click "+ New" → "Database" → "Redis"
- Wait for deployment
- Note the connection details:
  - REDIS_HOST = private domain (e.g., redis.railway.internal)
  - REDIS_PORT = 6379
  - REDIS_PASSWORD = from the service's variables
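To sanity-check these values, you can run a quick PING from a shell inside any other service in the same Railway project (the .railway.internal domain is not reachable from outside the project). This is a minimal sketch assuming redis-cli is available and the values noted above:

```bash
# Assumes redis-cli is installed and REDIS_PASSWORD holds the value from the Redis service
redis-cli -h redis.railway.internal -p 6379 -a "$REDIS_PASSWORD" ping
# Expected reply: PONG
```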
2.3 ClickHouse¶
- Click "+ New" → "Docker Image"
- Enter: clickhouse/clickhouse-server:24
- Add a volume mount: /var/lib/clickhouse
- Add environment variables (the values this guide assumes are sketched below)
- Deploy
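The exact variables aren't listed here, but they must line up with what Langfuse is given in step 3.1 (CLICKHOUSE_USER=clickhouse and a password). A minimal sketch using the official image's standard variables, with the user and database names as assumptions:

```bash
# Assumed ClickHouse service variables (must match the Langfuse settings in step 3.1)
CLICKHOUSE_DB=default
CLICKHOUSE_USER=clickhouse
CLICKHOUSE_PASSWORD=<generate-clickhouse-password>
```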
2.4 MinIO (S3-compatible storage)¶
- Click "+ New" → "Docker Image"
- Enter: minio/minio
- Add a volume mount: /data
- Set start command: minio server /data --console-address :9001
- Add environment variables (the values this guide assumes are sketched below)
- Deploy
- After deployment, access the MinIO console (port 9001) and create a bucket named langfuse
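As with ClickHouse, the variables aren't spelled out here; a minimal sketch that matches the MinIO credentials Langfuse is given in step 3.1 (access key minioadmin plus a generated password) would be:

```bash
# Assumed MinIO service variables (must match the Langfuse S3 settings in step 3.1)
MINIO_ROOT_USER=minioadmin
MINIO_ROOT_PASSWORD=<generate-minio-password>
```

If you prefer the CLI over the web console, the langfuse bucket can also be created with MinIO's mc client from inside the project:

```bash
mc alias set railway http://minio.railway.internal:9000 minioadmin "$MINIO_ROOT_PASSWORD"
mc mb railway/langfuse
```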
Step 3: Deploy Langfuse¶
3.1 Langfuse Web¶
- Click "+ New" → "Docker Image"
- Enter: langfuse/langfuse:3
- Add environment variables:

```
PORT=3000
NEXTAUTH_URL=https://<your-langfuse-domain>.up.railway.app
NEXTAUTH_SECRET=<generate-32-char-secret>
SALT=<generate-salt>
ENCRYPTION_KEY=<generate-64-char-hex>

# Database
DATABASE_URL=<from-postgres>

# ClickHouse
CLICKHOUSE_URL=http://clickhouse.railway.internal:8123
CLICKHOUSE_MIGRATION_URL=clickhouse://clickhouse.railway.internal:9000
CLICKHOUSE_USER=clickhouse
CLICKHOUSE_PASSWORD=<clickhouse-password>

# Redis
REDIS_HOST=redis.railway.internal
REDIS_PORT=6379
REDIS_AUTH=<redis-password>

# MinIO
LANGFUSE_S3_EVENT_UPLOAD_ENDPOINT=http://minio.railway.internal:9000
LANGFUSE_S3_EVENT_UPLOAD_BUCKET=langfuse
LANGFUSE_S3_EVENT_UPLOAD_ACCESS_KEY_ID=minioadmin
LANGFUSE_S3_EVENT_UPLOAD_SECRET_ACCESS_KEY=<minio-password>
LANGFUSE_S3_EVENT_UPLOAD_FORCE_PATH_STYLE=true
LANGFUSE_S3_EVENT_UPLOAD_REGION=us-east-1

# Disable telemetry
TELEMETRY_ENABLED=false
```

- Enable a public domain under Settings → Networking
- Deploy
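Once the container is up, you can verify it is serving before moving on to the worker. This check assumes Langfuse's public health endpoint lives at /api/public/health:

```bash
# Should return an OK-style JSON response once Langfuse is healthy
curl https://<your-langfuse-domain>.up.railway.app/api/public/health
```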
3.2 Langfuse Worker¶
- Click "+ New" → "Docker Image"
- Enter: langfuse/langfuse-worker:3
- Copy all environment variables from Langfuse Web EXCEPT:
  - Change PORT=3030
  - Remove NEXTAUTH_URL
- Deploy (no public domain needed)
Step 4: Deploy LiteLLM¶
- Click "+ New" → "Docker Image"
- Enter: docker.litellm.ai/berriai/litellm-database:main-stable
- Add environment variables:

```
PORT=4000
LITELLM_MASTER_KEY=sk-<generate-secure-key>
LITELLM_SALT_KEY=<generate-salt>

# Database
DATABASE_URL=<from-postgres>

# Redis (for caching)
REDIS_HOST=redis.railway.internal
REDIS_PORT=6379
REDIS_PASSWORD=<redis-password>

# Langfuse integration
LANGFUSE_PUBLIC_KEY=<get-from-langfuse-ui>
LANGFUSE_SECRET_KEY=<get-from-langfuse-ui>
LANGFUSE_HOST=https://<your-langfuse-domain>.up.railway.app

# UI settings
STORE_MODEL_IN_DB=True
UI_USERNAME=admin
UI_PASSWORD=<generate-password>
```

- Enable a public domain
- Deploy
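As a quick smoke test, LiteLLM ships liveness and readiness endpoints on the proxy; a sketch of checking both from your machine:

```bash
# Liveness: the proxy process is running
curl https://<your-litellm-domain>.up.railway.app/health/liveliness

# Readiness: includes whether the database connection is up
curl https://<your-litellm-domain>.up.railway.app/health/readiness
```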
Step 5: Configure LiteLLM with Langfuse¶
- Open the Langfuse UI: https://<your-langfuse-domain>.up.railway.app
- Create an account
- Create a new project
- Go to Settings → API Keys
- Create a new API key
- Copy the public and secret keys
- Update LiteLLM's environment variables with these keys
- Restart LiteLLM service
Step 6: Add LLM Models to LiteLLM¶
Access LiteLLM Admin UI: https://<your-litellm-domain>.up.railway.app/ui
Or via API:
```bash
# Add OpenAI GPT-4o
curl -X POST 'https://<litellm-domain>/model/new' \
  -H 'Authorization: Bearer sk-<your-master-key>' \
  -H 'Content-Type: application/json' \
  -d '{
    "model_name": "gpt-4o",
    "litellm_params": {
      "model": "openai/gpt-4o",
      "api_key": "sk-<your-openai-key>"
    }
  }'

# Add Claude Sonnet
curl -X POST 'https://<litellm-domain>/model/new' \
  -H 'Authorization: Bearer sk-<your-master-key>' \
  -H 'Content-Type: application/json' \
  -d '{
    "model_name": "claude-sonnet",
    "litellm_params": {
      "model": "anthropic/claude-sonnet-4-20250514",
      "api_key": "sk-ant-<your-anthropic-key>"
    }
  }'
```
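After adding models, you can confirm they are registered. LiteLLM exposes the OpenAI-compatible model listing endpoint, so a check like this should work with the master key or any virtual key:

```bash
# Lists the models the proxy will accept in requests
curl 'https://<litellm-domain>/v1/models' \
  -H 'Authorization: Bearer sk-<your-master-key>'
```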
Step 7: Test the Setup¶
```python
from openai import OpenAI

client = OpenAI(
    api_key="sk-<your-litellm-master-key>",
    base_url="https://<your-litellm-domain>.up.railway.app"
)

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello, world!"}]
)

print(response.choices[0].message.content)
```
Then check Langfuse UI - you should see the trace!
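If you'd rather not install the Python client, the same request can be made with curl against the OpenAI-compatible endpoint:

```bash
curl 'https://<your-litellm-domain>.up.railway.app/v1/chat/completions' \
  -H 'Authorization: Bearer sk-<your-litellm-master-key>' \
  -H 'Content-Type: application/json' \
  -d '{
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Hello, world!"}]
  }'
```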
Generating Secrets¶
Use these commands to generate secure values:
```bash
# 32-character secret
openssl rand -base64 24

# 64-character hex key (for ENCRYPTION_KEY)
openssl rand -hex 32

# LiteLLM master key (must start with sk-)
echo "sk-$(openssl rand -hex 24)"
```
Troubleshooting¶
Langfuse shows "ClickHouse migration failed"¶
- Ensure ClickHouse is healthy: check logs
- Verify CLICKHOUSE_URL uses port 8123 (HTTP) and CLICKHOUSE_MIGRATION_URL uses port 9000 (native); a quick connectivity check is sketched below
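From a shell in another service in the same project, ClickHouse's built-in HTTP ping endpoint is a cheap way to confirm the HTTP port is reachable (internal hostname assumed from step 2.3):

```bash
# Returns "Ok." when the ClickHouse HTTP interface on port 8123 is up
curl http://clickhouse.railway.internal:8123/ping
```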
LiteLLM can't connect to Langfuse¶
- Verify Langfuse public domain is enabled
- Check that LANGFUSE_HOST starts with https://
- Confirm the API keys are correct (a quick check is sketched below)
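One way to confirm the key pair itself is valid is to call Langfuse's public API with HTTP basic auth, using the public key as the username and the secret key as the password; the /api/public/projects path used here is an assumption, and any authenticated public endpoint will do:

```bash
# A 200 response listing your project means the keys work; 401 means they don't
curl -u 'pk-lf-<public-key>:sk-lf-<secret-key>' \
  'https://<your-langfuse-domain>.up.railway.app/api/public/projects'
```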
Traces not appearing¶
- Check Langfuse Worker logs
- Verify Redis is connected
- Ensure the MinIO bucket langfuse exists
Cost Estimate¶
With moderate usage:

- PostgreSQL: ~$5/month
- Redis: ~$3/month
- ClickHouse: ~$8/month
- MinIO: ~$3/month
- Langfuse (web + worker): ~$10/month
- LiteLLM: ~$8/month
Total: ~$35-40/month
Next Steps¶
- Create virtual keys with budgets for team members (a sketch is shown below)
- Set up prompt templates in Langfuse
- Configure evaluations and scoring
- Add more LLM providers for fallbacks
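For the first item, LiteLLM's proxy can mint scoped virtual keys over its API. A minimal sketch using the master key; the model list, budget, duration, and metadata values are just example assumptions:

```bash
# Creates a virtual key limited to two models, with a $25 budget that resets every 30 days
curl -X POST 'https://<your-litellm-domain>.up.railway.app/key/generate' \
  -H 'Authorization: Bearer sk-<your-master-key>' \
  -H 'Content-Type: application/json' \
  -d '{
    "models": ["gpt-4o", "claude-sonnet"],
    "max_budget": 25,
    "budget_duration": "30d",
    "metadata": {"team": "example-team"}
  }'
```

The response includes the new key, which you can hand to a team member in place of the master key.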