Easy-to-use NPX wrapper for Zen MCP Server - gives Claude access to multiple AI models (Gemini, OpenAI, OpenRouter, Ollama) for enhanced development capabilities.
```bash
npx zen-mcp-server-199bio
```
That's it! No Docker required. 🎉
Zen MCP Server gives Claude Desktop access to multiple AI models for:
- 🧠 Extended reasoning with Gemini 2.0 Pro's thinking mode
- 💬 Collaborative development with multiple AI perspectives
- 🔍 Code review and architectural analysis
- 🐛 Advanced debugging with specialized models
- 📊 Large context analysis (Gemini: 1M tokens, O3: 200K tokens)
- 🔄 Conversation threading - AI models maintain context across multiple calls
- ✅ No Docker required - Runs directly with Python
- 🚀 Fast startup - No container overhead
- 💾 Lightweight - Minimal resource usage
- 🔧 Auto-setup - Handles Python dependencies automatically
- 📦 Virtual environment - Isolated dependencies
- 🌍 Cross-platform - Works on macOS, Windows, Linux
On first run, the wrapper will:
- Check that Python 3.11+ is installed
- Clone Zen MCP Server to `~/.zen-mcp-server`
- Create a `.env` file and prompt for API keys
- Set up a Python virtual environment
- Install dependencies automatically
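Everything ends up under `~/.zen-mcp-server` after that. A rough sketch of the resulting layout (exact contents depend on the upstream repository):

```text
~/.zen-mcp-server/
├── .env              # API keys, created and filled in on first run
├── venv/             # isolated Python virtual environment
├── requirements.txt  # dependencies installed into venv
└── ...               # the rest of the cloned server code
```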
Choose one or more:
- Gemini: Google AI Studio
- OpenAI: OpenAI Platform
- OpenRouter: OpenRouter (access to 100+ models)
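The keys end up in `~/.zen-mcp-server/.env`. A filled-in example, assuming the `.env` uses the same variable names as the Claude Desktop config below (any one key is enough):

```bash
# ~/.zen-mcp-server/.env  (only the providers you actually use need real values)
GEMINI_API_KEY=your_gemini_key_here
OPENAI_API_KEY=your_openai_key_here
OPENROUTER_API_KEY=your_openrouter_key_here
```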
Add to your `claude_desktop_config.json`:
```json
{
  "mcpServers": {
    "zen": {
      "command": "npx",
      "args": ["zen-mcp-server-199bio"],
      "env": {
        "GEMINI_API_KEY": "your_gemini_key_here",
        "OPENAI_API_KEY": "your_openai_key_here",
        "OPENROUTER_API_KEY": "your_openrouter_key_here"
      }
    }
  }
}
```
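You only need the keys for providers you actually use; for example, a Gemini-only entry (same shape as above, placeholder key value):

```json
{
  "mcpServers": {
    "zen": {
      "command": "npx",
      "args": ["zen-mcp-server-199bio"],
      "env": {
        "GEMINI_API_KEY": "your_gemini_key_here"
      }
    }
  }
}
```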
That's it! Just restart Claude Desktop and you're ready to go.
Location of config file:
- macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
- Windows: `%APPDATA%\Claude\claude_desktop_config.json`
- Linux: `~/.config/Claude/claude_desktop_config.json`
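For example, on macOS you can open it in TextEdit straight from the terminal (any editor works; this just uses the macOS path above):

```bash
open -e "$HOME/Library/Application Support/Claude/claude_desktop_config.json"
```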
For Claude Code (CLI), you can register the server with:

```bash
claude mcp add zen "npx" "zen-mcp-server-199bio"
```
Once configured, Claude will have access to these tools:
- `zen` - Default tool for quick AI consultation (alias for chat)
- `chat` - Collaborative development discussions
- `thinkdeep` - Extended reasoning (Gemini 2.0 Pro)
- `codereview` - Professional code review
- `precommit` - Pre-commit validation
- `debug` - Advanced debugging assistance
- `analyze` - Smart file and codebase analysis
Quick Usage: Just say "use zen" for quick AI consultations!
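For instance, once Claude Desktop is restarted, prompts along these lines route to the tools above (illustrative phrasing, not fixed commands):

```text
"Use zen to sanity-check this error message"
"Use thinkdeep to reason through this caching design"
"Use codereview on the file I just pasted"
```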
If Python 3.11+ isn't found:
- macOS: `brew install python@3.11`
- Windows: Download from python.org
- Linux: `sudo apt install python3.11`
The wrapper tries to install automatically, but if it fails:
```bash
cd ~/.zen-mcp-server
python3 -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
pip install -r requirements.txt
```
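To confirm the manual install worked, `pip check` reports broken or conflicting packages; the import line assumes the MCP Python SDK (`mcp`) is listed in `requirements.txt`, so check that file if it fails:

```bash
# still inside the activated venv
pip check
python -c "import mcp; print('mcp SDK importable')"  # assumes the MCP SDK is among the requirements
```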
If you see API key errors:
- Check that `~/.zen-mcp-server/.env` has valid keys
- Ensure at least one API key is configured
- For OpenRouter, check your credits/limits
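As a quick sanity check of a Gemini key, listing models against Google's Generative Language API should return JSON rather than an error (a sketch; adapt for whichever provider you use):

```bash
# valid key -> JSON list of models; invalid key -> an error payload
curl -s "https://generativelanguage.googleapis.com/v1beta/models?key=$GEMINI_API_KEY" | head -c 300; echo
```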
To run, you'll need:
- Python 3.11+
- Node.js >= 14.0.0
- Git
- At least one API key (Gemini, OpenAI, or OpenRouter)
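A quick way to confirm the toolchain is in place before the first run:

```bash
python3 --version   # needs 3.11 or newer
node --version      # needs 14.0.0 or newer
git --version
```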
We removed Docker because:
- Faster startup - No container overhead
- Less resource usage - No Redis, no Docker daemon
- Simpler - Just Python and your API keys
- Same features - Conversation threading works perfectly with in-memory storage
Apache 2.0 - See LICENSE