UI and local proxy for Ollama — chat with LLaMA and other models from any device on your network.
llamaUI is a lightweight web interface and proxy server that allows you to chat with Ollama models running on your host machine — from mobile, tablet, or other computers. It exposes a simple UI and REST endpoint so you don’t have to stay on the same PC.
✨ Features
- 🔄 Model Selector – switch between installed models (e.g. llama3, mistral, gemma, etc.)
- 💬 Chat Interface – stream responses from models in real-time
- 🌈 Output Highlighting – style model responses for readability
- 🧠 Code Highlighting – syntax-aware blocks for code in model replies
- 📋 Copy Code Button – quickly copy code snippets with one click
- ⚡ Keyboard Shortcuts – navigate UI and send messages efficiently
- 🧩 Proxy API – exposes Ollama's /api/chat endpoint over your LAN
- 📝 Chat History – persist chat sessions for seamless follow-up conversations
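As a sketch of what the proxied /api/chat endpoint accepts — the proxy is assumed to forward Ollama's standard chat request schema unchanged to the local Ollama server:

```python
import json

# Example request body for the proxied /api/chat endpoint.
payload = {
    "model": "llama3",                                    # any pulled model
    "messages": [{"role": "user", "content": "Hello!"}],  # conversation so far
    "stream": True,                                       # stream tokens as generated
}
body = json.dumps(payload).encode()
# POST `body` with Content-Type: application/json
# to http://<host-ip>:8000/api/chat
```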
📦 Requirements
- python3 (no external dependencies required)
- Ollama installed and running on your host
- Models pulled via ollama pull llama3 (or any other model)
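To confirm Ollama is reachable before starting llamaUI, you can probe its default address (Ollama serves on localhost:11434 out of the box). This helper is a convenience sketch, not part of host.py:

```python
import urllib.request
import urllib.error

def ollama_up(base: str = "http://localhost:11434") -> bool:
    """Return True if an Ollama server answers at `base`."""
    try:
        with urllib.request.urlopen(base, timeout=2) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False
```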
🚀 Run
python3 host.py
Then open in a browser:
📱 http://<host-ip>:8000
(accessible from mobile and other devices on the same network)
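If you don't know which address to use for <host-ip>, a common trick is to open a UDP socket toward a public address and read the local end — no packets are actually sent. This helper is a convenience sketch, not part of llamaUI:

```python
import socket

def lan_ip() -> str:
    """Best-effort guess at this machine's LAN IP address."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        # Connecting a UDP socket picks a local interface without sending data.
        s.connect(("8.8.8.8", 80))
        return s.getsockname()[0]
    except OSError:
        return "127.0.0.1"  # fallback when no route is available
    finally:
        s.close()
```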
⚙️ Configuration
- To change the default port (8000), edit host.py
- Resizable input box with multiline support
- Keyboard shortcuts:
  - Enter: send message
  - Ctrl+D: start a new chat
  - Ctrl+S: stop output
- Automatic scroll to the latest response
- Persistent chat history
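The port mentioned under Configuration would typically live near the top of host.py. A minimal sketch of that setup — the names HOST, PORT, and Handler are hypothetical, and the real host.py also proxies /api/chat to Ollama:

```python
from http.server import HTTPServer, BaseHTTPRequestHandler

HOST = "0.0.0.0"  # listen on all interfaces so LAN devices can reach the UI
PORT = 8000       # edit this value to serve on a different port

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Serve the chat UI page.
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(b"<h1>llamaUI</h1>")

def run() -> None:
    # Binds HOST:PORT and serves until interrupted.
    HTTPServer((HOST, PORT), Handler).serve_forever()
```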
🔒 Security
This app exposes your Ollama instance to the network. To restrict access:
- Use firewall/IP filtering
- Add authentication (coming soon)
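One way to do IP filtering in Python — a hypothetical sketch, not something host.py ships — is to check each client against an allowlisted subnet before handling its request:

```python
import ipaddress

# Adjust to match your LAN; anything outside these networks is rejected.
ALLOWED_NETS = [ipaddress.ip_network("192.168.1.0/24")]

def is_allowed(client_ip: str) -> bool:
    """Return True if client_ip falls inside an allowlisted network."""
    addr = ipaddress.ip_address(client_ip)
    return any(addr in net for net in ALLOWED_NETS)
```

A request handler could call is_allowed(self.client_address[0]) and respond with 403 when it returns False.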
🛣️ Roadmap
- Multi-user sessions
- Auth layer for public deployment
- Prompt presets
- Voice input & output
- Mobile PWA install
Ollama is powerful — but it's limited to your local machine 😭. llamaUI lets you access local models like llama3 from any device on your network 🤩, with a lightweight and user-friendly web interface. No need to install heavy apps or frameworks — just run this simple web app and start chatting. 😎
MIT License — feel free to fork, contribute, and make it better!