A modern web interface for Ollama, featuring a clean design and essential chat features.
- 🚀 Real-time streaming responses (see the sketch after this list)
- 💬 Multi-conversation management
- 🔄 Conversation history
- 📝 Markdown and code syntax highlighting support
- 🌓 Clean and modern UI design
- 📋 List available models
- 🔄 Auto-select first available model
- 🎯 Easy model switching
- ⚙️ Configurable Ollama server URL
- 💾 Settings persistence
- 📤 Export chat history to JSON
- 📥 Import chat history from JSON
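Under the hood, real-time streaming maps onto Ollama's `/api/chat` endpoint, which returns newline-delimited JSON chunks. The following is a minimal sketch of consuming such a stream in TypeScript; it is not the project's actual client code, and the model name and server URL are placeholders.

```typescript
// Minimal sketch: stream a chat response from Ollama's /api/chat endpoint.
// The model name and server URL below are placeholders, not project defaults.
async function streamChat(prompt: string, onToken: (token: string) => void) {
  const res = await fetch("http://127.0.0.1:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3",
      messages: [{ role: "user", content: prompt }],
      stream: true,
    }),
  });
  if (!res.ok || !res.body) throw new Error(`Ollama request failed: ${res.status}`);

  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  let buffer = "";
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    // Each complete line in the response body is a self-contained JSON object.
    const lines = buffer.split("\n");
    buffer = lines.pop() ?? "";
    for (const line of lines) {
      if (!line.trim()) continue;
      const chunk = JSON.parse(line);
      if (chunk.message?.content) onToken(chunk.message.content);
      if (chunk.done) return;
    }
  }
}
```

A caller would append each token to the open conversation as it arrives, which is what produces the streaming effect in the chat view.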
- Ollama installed and running on your machine
- Node.js 18+ installed (for development)
- Docker (optional, for containerized deployment)
- Using Docker Compose (includes Ollama service):
wget https://raw.githubusercontent.com/oslook/ollama-webui/main/docker-compose.yml
docker compose up -d
- Using Docker directly:
docker run -d -p 3000:3000 ghcr.io/oslook/ollama-webui:latest
- Clone the repository:
git clone https://github.com/oslook/ollama-webui.git
cd ollama-webui
- Install dependencies:
npm install
- Start the development server:
npm run dev
- Open http://localhost:3000 in your browser
- Ensure your Ollama server is running (default: http://127.0.0.1:11434)
- Select a model from the dropdown menu
- Start chatting!
- Click the settings icon (⚙️) to:
  - Configure Ollama server URL (see the sketch below this list)
  - Export chat history
  - Import chat history
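Model discovery in the dropdown corresponds to Ollama's `GET /api/tags` endpoint, which lists locally available models. The sketch below shows one way to fetch that list from a configurable server URL and fall back to the first entry, mirroring the auto-select behavior; it is illustrative only, and the function and parameter names are not taken from the project.

```typescript
// Sketch: list models from a configurable Ollama server and pick the first one.
// The serverUrl parameter stands in for whatever the settings dialog stores.
interface OllamaModel {
  name: string;
}

async function loadModels(serverUrl = "http://127.0.0.1:11434"): Promise<string[]> {
  const res = await fetch(`${serverUrl}/api/tags`);
  if (!res.ok) throw new Error(`Failed to reach Ollama at ${serverUrl}: ${res.status}`);
  const data: { models: OllamaModel[] } = await res.json();
  return data.models.map((m) => m.name);
}

// Auto-select the first available model, as described in the feature list.
async function pickDefaultModel(serverUrl?: string): Promise<string | undefined> {
  const models = await loadModels(serverUrl);
  return models[0];
}
```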
Available Docker image tags:

- latest: Latest stable release
- main: Latest development build
- vX.Y.Z: Specific version releases
| Variable | Description | Default |
|----------|-------------|---------|
| NODE_ENV | Node environment | production |
| PORT | Port to run the server on | 3000 |
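When a variable is unset, the documented default applies. As a rough illustration (not the project's actual startup code), a Node server could resolve these values like this:

```typescript
// Sketch: resolve runtime configuration from the environment variables above.
// The fallbacks mirror the documented defaults; this is illustrative only.
const nodeEnv = process.env.NODE_ENV ?? "production";
const port = Number(process.env.PORT ?? 3000);

console.log(`Starting ollama-webui in ${nodeEnv} mode on port ${port}`);
```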
# Build the image
docker build -t ollama-webui .
# Run the container
docker run -d -p 3000:3000 ollama-webui
The included docker-compose.yml provides two services:

- ollama-webui: The web interface
- ollama: The Ollama server (optional)
To use only the web interface:
docker compose up -d ollama-webui
To run both services:
docker compose up -d
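When both services run under Compose, the web interface reaches the Ollama service by its service name on the shared network. The snippet below is a hedged sketch of such a connectivity check, assuming the service is named ollama as listed above and that Ollama's standard /api/version endpoint is reachable; adjust the hostname if your compose file differs.

```typescript
// Sketch: verify the web UI can reach the Ollama service inside the Compose network.
// "ollama" is the service name from docker-compose.yml; 11434 is Ollama's default port.
async function checkOllama(baseUrl = "http://ollama:11434"): Promise<void> {
  const res = await fetch(`${baseUrl}/api/version`);
  if (!res.ok) {
    throw new Error(`Ollama not reachable at ${baseUrl} (status ${res.status})`);
  }
  const { version } = await res.json();
  console.log(`Connected to Ollama ${version}`);
}
```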
- Next.js - React Framework
- Tailwind CSS - CSS Framework
- DaisyUI - UI Components
- TypeScript - Type Safety
Feel free to submit issues and enhancement requests!
This project is licensed under the MIT License - see the LICENSE file for details.
Note: This project was generated by an AI agent (Cursor) and has been human-verified for functionality and best practices. The implementation combines modern web development patterns with practical user experience considerations.