🚀 OpenLLM Monitor

Real-time LLM Observability Dashboard

License: MIT Node.js Version MongoDB React Docker

Monitor, analyze, and optimize your LLM usage across multiple providers in real time

Created and Developed by Prajeesh Chavan
Full-Stack Developer & AI Enthusiast
📄 View Full Credits & Project Journey

🚀 Quick Start • 📊 Features • 🔧 Installation • 📖 Documentation • 🤝 Contributing


📊 Dashboard Preview

🎯 Main Dashboard

Dashboard Overview Real-time monitoring of all your LLM requests with comprehensive analytics

📋 Request Logs

Request Logs Detailed logging of all LLM API calls with filtering and search

📋 Log Details

Log Details Drill into an individual request to inspect the full prompt, response, and metrics

🔄 Prompt Replay & Comparison

Prompt Replay Test and compare prompts across different providers and models


โญ Why OpenLLM Monitor?

๐ŸŽฏ Zero-Code Integration

  • Drop-in proxy servers
  • No code changes required
  • Works with existing applications

๐Ÿ“Š Real-Time Analytics

  • Live request monitoring
  • Cost tracking & optimization
  • Performance insights

๐Ÿ”„ Multi-Provider Support

  • OpenAI, Ollama, OpenRouter
  • Unified monitoring interface
  • Easy provider comparison

🚀 Features

🔔 Smart Alerts & Monitoring

  • 🚨 Intelligent Alert System - Proactive monitoring with real-time notifications and an interactive bell icon
  • 📊 Performance Alerts - Latency, error-rate, and retry monitoring with detailed analysis and visual indicators
  • 💰 Cost Alerts - Budget threshold notifications with spending insights and trends
  • 🔍 Detailed Alert Analysis - In-depth analysis with actionable recommendations and impact assessment
  • 📈 Trend Monitoring - Continuous tracking of key performance metrics with rich visualizations
  • 🎛️ Notification Management - Granular notification controls with test notification capabilities

🧪 Model Testing & Experimentation

  • 🔬 Redesigned Testing Interface - Model testing rebuilt around template categories and a step-by-step wizard
  • 🎯 Template Categories - Organized templates for Quick Start, Development, Creative, and Analysis use cases
  • 🔄 Multi-Model Comparison - Side-by-side comparison with detailed performance metrics
  • 💡 Smart Template Library - Pre-built prompts with categorization, icons, and estimated completion times
  • 📊 Real-time Cost Estimation - Accurate cost preview with token counting before running tests
  • 🔁 Batch Testing - Test multiple configurations with progress tracking and results analysis
📱 Modern UI/UX

  • ✨ Beautiful Loading Experience - Animated loading screen with gradient backgrounds and smooth transitions
  • 🎨 Responsive Design - Fully optimized layouts for desktop, tablet, and mobile devices
  • ⌨️ Keyboard Shortcuts - Built-in shortcuts for power users (Cmd/Ctrl+K, R, Escape)
  • 🔄 Live Feed Mode - Real-time activity monitoring with a toggleable live feed and activity counters
  • 🎭 Smooth Animations - CSS animations and transitions throughout the interface
  • 📊 Interactive Components - Tables, modals, and other interactive elements with improved UX
  • ⚡ Quick Examples - Pre-built prompts for different use cases
  • 💰 Cost Estimation - Preview costs before running expensive tests
  • 📊 Performance Benchmarking - Compare latency, quality, and costs

📊 Core Monitoring

  • 📊 Real-time Request Logging - Monitor all LLM API calls with detailed metrics
  • 🔄 Prompt Replay & Comparison - Re-run prompts across different providers/models
  • 💰 Cost Tracking & Analysis - Track spending across providers with detailed breakdowns
  • ⚡ Performance Monitoring - Latency tracking, retry analysis, and error monitoring
  • 🚨 Error Tracking - Comprehensive error analysis and alerting

🎨 Latest UI/UX Enhancements

OpenLLM Monitor has received major UI/UX improvements and feature enhancements!

🚀 What's New:

  • ✨ Beautiful Loading Experience with animated backgrounds
  • 🔔 Smart Alerts System with interactive bell notifications
  • 📱 Enhanced Mobile Experience with responsive design
  • ⌨️ Keyboard Shortcuts for power users
  • 🔄 Live Feed Mode with real-time activity monitoring
  • 🧪 Redesigned Model Testing with template categories
  • 📊 Interactive Components with smooth animations

📖 See Complete Enhancement Guide: Enhanced Features Documentation

๐ŸŒ Provider Support

Provider Status Models Supported
OpenAI โœ… GPT-3.5, GPT-4, GPT-4o, DALL-E
Ollama โœ… Llama2, Mistral, CodeLlama, Custom
OpenRouter โœ… 100+ models via unified API
Mistral AI โœ… Mistral-7B, Mistral-8x7B, Mixtral
Anthropic ๐Ÿ”œ Claude 3, Claude 2

🔧 Integration Options

  • 🚀 Zero-code Proxy Servers - Monitor without changing your code
  • 📦 SDK Wrappers - Drop-in replacements for popular libraries
  • 💻 CLI Monitoring - Track command-line LLM usage
  • 🔌 Custom Middleware - Integrate with your existing applications (see the sketch below)
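
If you already run an Express backend, custom middleware is the most direct route. A minimal sketch, assuming a POST /api/logs ingestion endpoint and a fire-and-forget reporting style; llmLoggerMiddleware and the request body fields are illustrative assumptions, not the project's published SDK surface:

// Hypothetical middleware sketch - names and fields are assumptions.
const express = require("express");

function llmLoggerMiddleware({ monitorUrl }) {
  return (req, res, next) => {
    const startedAt = Date.now();
    res.on("finish", () => {
      // Report the finished request to the monitor backend; swallow errors
      // so logging can never break the application itself.
      fetch(`${monitorUrl}/api/logs`, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({
          path: req.path,
          status: res.statusCode,
          latencyMs: Date.now() - startedAt,
        }),
      }).catch(() => {});
    });
    next();
  };
}

const app = express();
app.use(llmLoggerMiddleware({ monitorUrl: "http://localhost:3001" }));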

📊 Dashboard Features

  • 🔴 Real-time WebSocket Updates - Live dashboard with instant updates
  • 📊 Comprehensive Analytics - Usage patterns, trends, and insights
  • 📤 Export Capabilities - CSV, JSON export for logs and analytics (example below)
  • 🌐 Multi-Environment Support - Dev, staging, and production environments
  • 🎨 Customizable Views - Personalized dashboards and filtering
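
Exports can also be driven from code via the POST /api/logs/export endpoint documented later in this README. A hedged sketch; the body fields format and dateRange are assumptions about the endpoint's parameters:

// Sketch: pull a CSV export of recent logs. Field names are assumptions.
const res = await fetch("http://localhost:3001/api/logs/export", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ format: "csv", dateRange: "last-7-days" }),
});
const csv = await res.text(); // CSV payload, ready to save or parse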

🚀 Quick Start

🌱 Demo Data Generation (New!)

Want to showcase your system immediately? Generate comprehensive seed data:

# Windows PowerShell (Recommended)
cd "scripts"
.\generate-seed-data.ps1

# Or use Node.js directly
cd scripts
npm install
node seed-data.js

✨ What you get:

  • 🔢 1,000+ realistic LLM requests across 30 days
  • 🏢 Multi-provider coverage (OpenAI, Ollama, Mistral, OpenRouter)
  • 🎪 7 diverse use cases (coding, analysis, support, creative, etc.)
  • 📊 Analytics-ready data for impressive demos
  • 💰 Cost tracking with real pricing models
  • ⚡ Performance metrics and error patterns

📖 Complete Seed Data Guide | ⚙️ Advanced Configuration

๐Ÿณ Docker (Recommended)

Get up and running in less than 2 minutes:

# Clone the repository
git clone https://github.com/prajeesh-chavan/openllm-monitor.git
cd openllm-monitor

# Start with Docker (includes everything)
docker-compose up -d

# Or use our setup script
./docker-setup.sh  # Linux/Mac
.\docker-setup.ps1 # Windows PowerShell

๐ŸŒ Access your dashboard: http://localhost:3000

⚡ Manual Setup

Click to expand manual installation steps

Prerequisites

  • Node.js 18+ and npm
  • MongoDB (local or cloud)
  • Git

1. Clone & Install

git clone https://github.com/prajeesh-chavan/openllm-monitor.git
cd openllm-monitor

# Backend setup
cd backend
npm install
cp ../.env.example .env

# Frontend setup
cd ../frontend
npm install

2. Configure Environment

Edit backend/.env:

MONGODB_URI=mongodb://localhost:27017/openllm-monitor
PORT=3001
OPENAI_API_KEY=your-openai-key-here
OLLAMA_BASE_URL=http://localhost:11434

3. Start Services

# Terminal 1: MongoDB
mongod

# Terminal 2: Backend
cd backend && npm run dev

# Terminal 3: Frontend
cd frontend && npm run dev

🌐 Open: http://localhost:5173


🔧 Installation

🔧 System Requirements

Component  Minimum  Recommended
Node.js    18.x     20.x LTS
Memory     4GB RAM  8GB RAM
Storage    10GB     20GB SSD
MongoDB    4.4+     6.0+

📦 Installation Methods

🐳 Docker Fastest & Easiest

docker-compose up -d

✅ Everything included
✅ Zero configuration
✅ Production ready

💻 Manual Install Full Control

npm install

✅ Customizable
✅ Development friendly
✅ Great for learning

☁️ Cloud Deploy Production Scale

docker build -t openllm-monitor .

✅ Scalable
✅ High availability
✅ Enterprise ready

🚀 One-Click Setup Scripts

Windows:

# PowerShell (Recommended)
.\docker-setup.ps1

# Command Prompt
docker-setup.bat

Linux/macOS:

# Make executable and run
chmod +x docker-setup.sh
./docker-setup.sh

Validation:

# Check if everything is configured correctly
.\docker\docker-validate.ps1  # Windows
./docker/docker-validate.sh   # Linux/Mac

๐Ÿ—๏ธ Architecture

graph TB
    A[Client Applications] --> B[OpenLLM Monitor Proxy]
    B --> C{LLM Provider}
    C --> D[OpenAI]
    C --> E[Ollama]
    C --> F[OpenRouter]
    C --> G[Mistral AI]

    B --> H[Backend API]
    H --> I[MongoDB]
    H --> J[WebSocket Server]
    J --> K[React Dashboard]

    H --> L[Analytics Engine]
    H --> M[Cost Calculator]
    H --> N[Token Counter]
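The Cost Calculator and Token Counter nodes above turn raw usage into spend figures. A sketch of the kind of arithmetic involved, with illustrative placeholder rates rather than any provider's real pricing:

// Per-request cost from token counts and per-1K-token rates (rates are placeholders).
function estimateCost({ promptTokens, completionTokens }, rates) {
  return (
    (promptTokens / 1000) * rates.inputPer1K +
    (completionTokens / 1000) * rates.outputPer1K
  );
}

// 420 prompt + 180 completion tokens at the sample rates => 0.00048
estimateCost(
  { promptTokens: 420, completionTokens: 180 },
  { inputPer1K: 0.0005, outputPer1K: 0.0015 }
);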

๐Ÿ“ Project Structure

openllm-monitor/
โ”œโ”€โ”€ ๐ŸŽฏ backend/                 # Node.js + Express + MongoDB
โ”‚   โ”œโ”€โ”€ controllers/           # ๐ŸŽฎ API request handlers
โ”‚   โ”œโ”€โ”€ models/               # ๐Ÿ“Š Database schemas & models
โ”‚   โ”œโ”€โ”€ routes/               # ๐Ÿ›ฃ๏ธ API route definitions
โ”‚   โ”œโ”€โ”€ middlewares/          # ๐Ÿ”Œ Custom middleware (LLM logger)
โ”‚   โ”œโ”€โ”€ services/             # ๐Ÿ”ง LLM provider integrations
โ”‚   โ”œโ”€โ”€ utils/                # ๐Ÿ› ๏ธ Helper functions & utilities
โ”‚   โ””โ”€โ”€ config/               # โš™๏ธ Configuration management
โ”‚
โ”œโ”€โ”€ ๐ŸŽจ frontend/               # React + Vite + Tailwind
โ”‚   โ”œโ”€โ”€ src/components/       # ๐Ÿงฉ Reusable UI components
โ”‚   โ”œโ”€โ”€ src/pages/            # ๐Ÿ“„ Page-level components
โ”‚   โ”œโ”€โ”€ src/services/         # ๐ŸŒ API communication layer
โ”‚   โ”œโ”€โ”€ src/hooks/            # ๐Ÿช Custom React hooks
โ”‚   โ”œโ”€โ”€ src/store/            # ๐Ÿ—„๏ธ State management (Zustand)
โ”‚   โ””โ”€โ”€ public/               # ๐Ÿ“‚ Static assets
โ”‚
โ”œโ”€โ”€ ๐Ÿณ docker/                 # Docker configuration
โ”œโ”€โ”€ ๐Ÿ“š docs/                   # Documentation & guides
โ”œโ”€โ”€ ๐Ÿงช scripts/               # Setup & utility scripts
โ””โ”€โ”€ ๐Ÿ“ README.md              # You are here!

๐Ÿ› ๏ธ Tech Stack

Backend Frontend Database DevOps
Node.js React MongoDB Docker
Express.js Vite Socket.io Nginx

📊 Usage Examples

🔄 Automatic Monitoring (Zero Code Changes)

# Start the proxy server
npm run proxy

# Your existing code works unchanged!
# All OpenAI calls are automatically logged
const response = await openai.chat.completions.create({
  model: "gpt-3.5-turbo",
  messages: [{ role: "user", content: "Hello!" }]
});
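
For the proxy to observe your traffic, the client must send requests through it. A minimal sketch, assuming the proxy exposes an OpenAI-compatible endpoint on localhost:8080; check your proxy's startup output for the actual address:

// Route the official OpenAI SDK through the monitoring proxy.
// The baseURL is an assumption - substitute the address your proxy prints.
const OpenAI = require("openai");

const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
  baseURL: "http://localhost:8080/v1", // hypothetical proxy address
});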

🎯 Direct Integration

// Add to your existing application
const { LLMLogger } = require("openllm-monitor");

const logger = new LLMLogger({
  apiUrl: "http://localhost:3001",
});

// Wrap your LLM calls
const response = await logger.track(async () => {
  return await openai.chat.completions.create({
    model: "gpt-4",
    messages: [{ role: "user", content: "Explain quantum computing" }],
  });
});

📈 Analytics & Insights

// Get comprehensive analytics
const res = await fetch("/api/analytics", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    dateRange: "last-7-days",
    providers: ["openai", "ollama"],
    groupBy: "model",
  }),
});
const analytics = await res.json(); // parse the Response before reading fields

console.log(analytics);
// {
//   totalRequests: 1247,
//   totalCost: 23.45,
//   averageLatency: 850,
//   topModels: [...]
// }

🔄 Prompt Replay & Comparison

// Compare the same prompt across providers
const res = await fetch("/api/replay/compare", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    prompt: "Write a haiku about coding",
    configurations: [
      { provider: "openai", model: "gpt-3.5-turbo" },
      { provider: "ollama", model: "llama2:7b" },
      { provider: "openrouter", model: "anthropic/claude-2" },
    ],
  }),
});
const comparison = await res.json();

๐ŸŽ›๏ธ Configuration

โš™๏ธ Environment Variables

Create backend/.env from the template:

cp .env.example .env

Essential Configuration:

# ๐Ÿ—„๏ธ Database
MONGODB_URI=mongodb://localhost:27017/openllm-monitor

# ๐Ÿš€ Server
PORT=3001
NODE_ENV=development
FRONTEND_URL=http://localhost:5173

# ๐Ÿค– LLM Provider API Keys
OPENAI_API_KEY=sk-your-openai-key-here
OPENROUTER_API_KEY=sk-your-openrouter-key-here
MISTRAL_API_KEY=your-mistral-key-here

# ๐Ÿฆ™ Ollama (Local)
OLLAMA_BASE_URL=http://localhost:11434

# ๐Ÿ” Security
JWT_SECRET=your-super-secret-jwt-key
RATE_LIMIT_MAX_REQUESTS=100

🎯 Provider Setup Guide

🤖 OpenAI Setup
  1. Visit OpenAI Platform
  2. Create new API key
  3. Add to .env: OPENAI_API_KEY=sk-...
  4. Set usage limits if needed

🦙 Ollama Setup
  1. Install Ollama
  2. Start Ollama: ollama serve
  3. Pull a model: ollama pull llama2
  4. Configure in .env: OLLAMA_BASE_URL=http://localhost:11434

🌐 OpenRouter Setup
  1. Sign up at OpenRouter
  2. Get API key from Keys page
  3. Add to .env: OPENROUTER_API_KEY=sk-or-...
  4. Browse available models in dashboard

🤖 Mistral AI Setup
  1. Create account at Mistral Console
  2. Generate API key
  3. Add to .env: MISTRAL_API_KEY=...
  4. Choose from available models
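
After wiring up keys, you can verify a provider from code as well as from the dashboard. A hedged sketch against the POST /api/providers/:id/test endpoint documented below; the id "openai" is an assumed value, use whatever id your instance lists:

// Sanity-check a provider connection. The ":id" value is an assumption.
const res = await fetch("http://localhost:3001/api/providers/openai/test", {
  method: "POST",
});
console.log(res.ok ? "provider reachable" : `test failed: ${res.status}`);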

๐Ÿ—„๏ธ Database Setup

๐Ÿณ Docker (Recommended)

# Start MongoDB with Docker
docker-compose up -d mongodb

# Access MongoDB Admin UI
open http://localhost:8081  # admin/admin

โ˜๏ธ MongoDB Atlas (Cloud)

  1. Create free account at MongoDB Atlas
  2. Create cluster and get connection string
  3. Add to .env: MONGODB_URI=mongodb+srv://...

๐Ÿ’ป Local Installation

Windows
# Download and install MongoDB Community Server
# https://www.mongodb.com/try/download/community

# Or with Chocolatey
choco install mongodb

# Start MongoDB service
net start MongoDB
macOS
# Install with Homebrew
brew tap mongodb/brew
brew install mongodb-community

# Start MongoDB
brew services start mongodb/brew/mongodb-community
Linux (Ubuntu)
# Install MongoDB
wget -qO - https://www.mongodb.org/static/pgp/server-6.0.asc | sudo apt-key add -
echo "deb [ arch=amd64,arm64 ] https://repo.mongodb.org/apt/ubuntu focal/mongodb-org/6.0 multiverse" | sudo tee /etc/apt/sources.list.d/mongodb-org-6.0.list
sudo apt-get update && sudo apt-get install -y mongodb-org

# Start MongoDB
sudo systemctl start mongod
sudo systemctl enable mongod

๐Ÿ› ๏ธ Automated Setup Scripts

# Use our setup scripts
./scripts/setup-mongodb.sh     # Linux/Mac
.\scripts\setup-mongodb.ps1    # Windows PowerShell
.\scripts\setup-mongodb.bat    # Windows CMD

🚀 Deployment

🐳 Production Docker

# Production build and deploy
docker-compose -f docker/docker-compose.prod.yml up -d

# With custom environment
docker-compose -f docker/docker-compose.prod.yml --env-file .env.production up -d

☁️ Cloud Deployment

🚀 Deploy to AWS

# Build and push to ECR
aws ecr get-login-password --region us-west-2 | docker login --username AWS --password-stdin 123456789012.dkr.ecr.us-west-2.amazonaws.com
docker build -t openllm-monitor .
docker tag openllm-monitor:latest 123456789012.dkr.ecr.us-west-2.amazonaws.com/openllm-monitor:latest
docker push 123456789012.dkr.ecr.us-west-2.amazonaws.com/openllm-monitor:latest

# Deploy with ECS or EKS

🌊 Deploy to DigitalOcean

# Use DigitalOcean App Platform
doctl apps create --spec .do/app.yaml

# Or deploy to Droplet
docker-compose -f docker/docker-compose.prod.yml up -d

⚡ Deploy to Vercel/Netlify

# Frontend only (with separate backend)
cd frontend
npm run build

# Deploy frontend to Vercel
vercel --prod

# Deploy backend separately to Railway/Render

🔧 Environment Configuration

Production Environment Variables:

NODE_ENV=production
MONGODB_URI=mongodb+srv://production-cluster/openllm-monitor
JWT_SECRET=super-secure-production-secret
CORS_ORIGIN=https://your-domain.com
RATE_LIMIT_MAX_REQUESTS=1000
LOG_LEVEL=info

Security Checklist:

  • Change default passwords
  • Use HTTPS in production
  • Set up proper CORS
  • Configure rate limiting (see the sketch below)
  • Enable MongoDB authentication
  • Use environment-specific secrets
  • Set up monitoring and alerts
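
For the CORS and rate-limiting items, a minimal Express sketch, assuming the backend uses the cors and express-rate-limit packages; the actual middleware in this project may be wired differently:

// Hedged sketch: apply CORS_ORIGIN and RATE_LIMIT_MAX_REQUESTS in Express.
const express = require("express");
const cors = require("cors");
const rateLimit = require("express-rate-limit");

const app = express();
app.use(cors({ origin: process.env.CORS_ORIGIN })); // only allow the configured origin
app.use(
  rateLimit({
    windowMs: 15 * 60 * 1000, // 15-minute window (assumed default)
    max: Number(process.env.RATE_LIMIT_MAX_REQUESTS) || 1000, // cap per client IP
  })
);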

📖 API Documentation

🏥 Health & Status

Endpoint     Method  Description
/api/health  GET     Service health check
/api/info    GET     API version & information
/api/status  GET     System status & metrics
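
A one-line liveness probe; http://localhost:3001 is the default backend address from .env, adjust if you changed the port:

// Minimal health check against the documented endpoint.
const res = await fetch("http://localhost:3001/api/health");
console.log(res.status === 200 ? "service up" : "service down");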

📊 Logs & Analytics

Endpoint          Method  Description
/api/logs         GET     Retrieve logs with filtering
/api/logs/:id     GET     Get specific log details
/api/logs/stats   GET     Dashboard statistics
/api/logs/export  POST    Export logs (CSV/JSON)
/api/analytics    POST    Advanced analytics queries
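
Since /api/logs is a GET endpoint, filters travel as query parameters. A hedged sketch; the parameter names provider and limit are assumptions based on typical list endpoints, not confirmed names:

// Fetch recent logs filtered by provider. Query parameter names are assumptions.
const params = new URLSearchParams({ provider: "openai", limit: "50" });
const res = await fetch(`http://localhost:3001/api/logs?${params}`);
const logs = await res.json();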

🔄 Replay & Testing

Endpoint              Method  Description
/api/replay           POST    Replay a prompt
/api/replay/compare   POST    Compare across providers
/api/replay/estimate  POST    Get cost estimates
/api/replay/models    GET     Available models list

🌐 Provider Management

Endpoint                 Method  Description
/api/providers           GET     List provider configs
/api/providers/:id       PUT     Update provider settings
/api/providers/:id/test  POST    Test provider connection

🔌 WebSocket Events

// Connect with the Socket.io client (the backend defaults to port 3001)
const { io } = require("socket.io-client");
const socket = io("http://localhost:3001");

// Real-time events
socket.on("new-log", (log) => {
  console.log("New request:", log);
});

socket.on("stats-update", (stats) => {
  console.log("Updated stats:", stats);
});

socket.on("error-alert", (error) => {
  console.log("Error detected:", error);
});

🧪 Testing

🚀 Run Tests

# Backend tests
cd backend
npm test                # Run all tests
npm run test:watch      # Watch mode
npm run test:coverage   # With coverage

# Frontend tests
cd frontend
npm test                # Run all tests
npm run test:ui         # UI test runner
npm run test:coverage   # With coverage

🎯 Test Categories

  • Unit Tests - Individual component testing
  • Integration Tests - API endpoint testing
  • E2E Tests - Full user journey testing
  • Performance Tests - Load and stress testing

📊 Test Coverage

Component            Coverage  Status
Backend API          85%       ✅ Good
Frontend Components  78%       ✅ Good
Integration Tests    92%       ✅ Excellent
E2E Tests            65%       ⚠️ Needs Work

๏ฟฝ๏ธ Development

๐Ÿ”ง Development Workflow

# 1. Fork & Clone
git clone https://github.com/prajeesh-chavan/openllm-monitor.git
cd openllm-monitor

# 2. Create Feature Branch
git checkout -b feature/amazing-feature

# 3. Start Development
npm run dev:all  # Start all services

# 4. Make Changes & Test
npm test
npm run lint

# 5. Commit & Push
git commit -m "feat: add amazing feature"
git push origin feature/amazing-feature

# 6. Create Pull Request

๐ŸŽฏ Project Status

Component Status Progress
โœ… Backend API Complete 100%
โœ… Database Models Complete 100%
โœ… Provider Services Complete 95%
โœ… WebSocket Server Complete 100%
โœ… Frontend Dashboard Complete 90%
โœ… Analytics Engine Complete 85%
๐Ÿšง Mobile App In Progress 30%
๐Ÿ“‹ API v2 Planned 0%

๐Ÿ—๏ธ Contributing Areas

We welcome contributions in these areas:

  • ๐Ÿ› Bug Fixes - Help us squash bugs
  • โœจ New Features - Implement requested features
  • ๐Ÿ“š Documentation - Improve guides and docs
  • ๐Ÿงช Testing - Add more test coverage
  • ๐ŸŽจ UI/UX - Enhance user experience
  • ๐Ÿš€ Performance - Optimize speed and efficiency

๐Ÿค Contributing

We love contributions! Here's how you can help make OpenLLM Monitor even better:

๐Ÿš€ Quick Contribution Guide

  1. ๐Ÿด Fork the repository
  2. ๐ŸŒŸ Star the project (if you like it!)
  3. ๐Ÿ”ง Create your feature branch (git checkout -b feature/AmazingFeature)
  4. ๐Ÿ’พ Commit your changes (git commit -m 'feat: Add some AmazingFeature')
  5. ๏ฟฝ Push to the branch (git push origin feature/AmazingFeature)
  6. ๐ŸŽฏ Open a Pull Request

๐ŸŽฏ Ways to Contribute

๐Ÿ› Bug Reports Found a bug?

Report it

โœจ Feature Requests Have an idea?

Suggest it

๐Ÿ“š Documentation Improve our docs

Help here

๐Ÿงช Testing Add test coverage

View tests

๐Ÿท๏ธ Commit Convention

We use Conventional Commits:

feat: add new dashboard widget
fix: resolve login issue
docs: update API documentation
test: add unit tests for analytics
refactor: optimize database queries
chore: update dependencies

๐ŸŽ–๏ธ Contributors

Thanks to all the amazing people who have contributed to this project!

Contributors

🆘 Support & Community

💡 Get Help

📖 Documentation Read our comprehensive guides

View Docs

💬 Discussions Join the community

GitHub Discussions

🐛 Issues Report bugs or request features

GitHub Issues

📧 Email Direct support

[email protected]

๐Ÿ› ๏ธ Troubleshooting

Common Issues:

๐Ÿ”ด MongoDB Connection Failed
# Check if MongoDB is running
docker ps | grep mongo

# Restart MongoDB
docker-compose restart mongodb

# Check logs
docker-compose logs mongodb
๐Ÿ”ด Port Already in Use
# Find what's using the port
netstat -tulpn | grep :3001

# Kill the process
kill -9 <PID>

# Or change port in .env
PORT=3002
๏ฟฝ Ollama Not Connecting
# Check Ollama status
ollama ps

# Restart Ollama
ollama serve

# Check logs
tail -f ~/.ollama/logs/server.log


📄 License

This project is licensed under the MIT License - see the LICENSE file for details.

Copyright (c) 2024-2025 Prajeesh Chavan

MIT License

Copyright (c) 2024-2025 Prajeesh Chavan

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

๐Ÿ‘จโ€๐Ÿ’ป About the Creator

Prajeesh Chavan

Full-Stack Developer & AI Enthusiast

Portfolio Blog LinkedIn GitHub Twitter

About Me: I'm a passionate full-stack developer with expertise in modern web technologies and AI/ML systems. I built OpenLLM Monitor to solve the real-world challenge of monitoring and optimizing LLM usage across different providers. This project represents my commitment to creating tools that help developers work more efficiently with AI technologies.

Skills & Technologies:

  • ๐Ÿš€ Full-Stack Development (React, Node.js, MongoDB)
  • ๐Ÿค– AI/ML Integration & LLM Applications
  • ๐Ÿณ DevOps & Cloud Deployment (Docker, AWS)
  • ๐Ÿ“Š Data Analytics & Visualization
  • ๐Ÿ”ง System Architecture & API Design

Connect with me if you have questions about the project, want to collaborate, or discuss opportunities!

🚀 Project Journey

This project represents months of dedicated development and continuous improvement. Here's what makes it special:

  • 🔬 Research-Driven: Extensive research into LLM monitoring needs and best practices
  • 🏗️ Built from Scratch: Every component carefully designed and implemented
  • 🎯 Problem-Solving: Addresses real-world challenges faced by LLM developers
  • 📈 Continuous Evolution: Regular updates and feature enhancements
  • 🌐 Community-First: Open source with detailed documentation and support

If you find this project valuable, please:

  • ⭐ Star the repository to show your support
  • 🤝 Connect with me on LinkedIn or Twitter
  • 💬 Share your feedback or suggestions
  • 🎯 Consider hiring me for your next project!

🌟 Star History

Star History Chart


🎯 What's Next?

🎉 Ready to get started?

Get Started View Demo Join Community

🎖️ Project Roadmap

  • ✅ v1.0 - Core monitoring and analytics
  • 🚧 v1.1 - Advanced filtering and exports
  • 📋 v1.2 - Mobile app and notifications
  • 🔮 v2.0 - AI-powered insights and recommendations

Built with ❤️ by Prajeesh Chavan for the LLM developer community

This project is the result of extensive research, development, and testing to provide the best LLM monitoring experience. If this project helped you, please consider giving it a ⭐ star on GitHub and connecting with me!

⭐ Star on GitHub 🐛 Report Bug ✨ Request Feature

Creator: Prajeesh Chavan • License: MIT • Year: 2024-2025
