A lightweight, minimalistic frontend built with React.js for interacting with the Ollama API.
Designed with minimal dependencies for simplicity, speed, and easy customization.
🚀 Features
Minimal dependencies - built with React only
Streamed conversations with Ollama models
Persistent conversation history
Markdown rendering with syntax highlighting
Supports two modes: Chat and Completion
Define or tweak the system prompt for better control
Copy code blocks or entire messages easily
Automatic title generation for conversations
Exclusive Reasoning component
Clean, responsive UI
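Streaming works by consuming Ollama's newline-delimited JSON chat responses, where each line carries a partial `message.content` and a `done` flag. A minimal sketch of accumulating those deltas (the type and function names are illustrative, not taken from Lumina's source):

```typescript
// Shape of one streamed chunk from Ollama's /api/chat endpoint.
interface OllamaChatChunk {
  message?: { role: string; content: string };
  done: boolean;
}

// Concatenate the content deltas from a raw NDJSON buffer.
function accumulateOllamaStream(ndjson: string): string {
  let text = "";
  for (const line of ndjson.split("\n")) {
    if (!line.trim()) continue; // skip blank lines between chunks
    const chunk = JSON.parse(line) as OllamaChatChunk;
    if (chunk.message) text += chunk.message.content;
    if (chunk.done) break; // final chunk signals end of the stream
  }
  return text;
}
```

In a real UI you would feed each chunk into state as it arrives rather than waiting for the full buffer, which is what makes the conversation appear to "type" in real time.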
📸 Screenshot
🛠️ Tech Stack
React.js
Tailwindcss
shadcn/ui
📦 Getting Started
Clone the repository and install dependencies:
git clone https://github.com/cushydigit/lumina.git
cd lumina
npm install
npm run dev
Or run Lumina with Docker:
Build the Docker image:
docker build -t lumina .
Run the container:
docker run -p 4173:4173 lumina
Make sure your Ollama server is running locally at localhost:11434, or update the API URL if needed.
⚙️ Configuration
If your Ollama instance is running elsewhere, edit API_BASE_URL in the api.ts file:
const API_BASE_URL = "http://localhost:11434";
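As a sketch of how that constant feeds into a request, here is one way to build a payload for Ollama's /api/chat endpoint. The `buildChatRequest` helper is hypothetical and not part of Lumina's api.ts; only the base URL and the endpoint's payload shape come from the Ollama API:

```typescript
const API_BASE_URL = "http://localhost:11434";

interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Assemble the URL and fetch options for a streamed chat request.
function buildChatRequest(model: string, messages: ChatMessage[]) {
  return {
    url: `${API_BASE_URL}/api/chat`,
    init: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ model, messages, stream: true }),
    },
  };
}
```

Keeping the base URL in one place like this is what lets you point the whole frontend at a remote Ollama instance by changing a single line.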
📄 License
This project is licensed under the MIT License.
🙌 Contributing
Pull requests, suggestions, and feedback are welcome!
🛣️ Roadmap / Upcoming Features
Delete and Retry Messages
Allow users to delete messages or retry sending failed messages.
Model Pulling Support
UI for pulling, updating, and managing Ollama models directly.
Conversation Pinning
Pin important conversations to the top for quick access.
Search Conversations
Quickly search across conversations by keywords.
Export / Import Conversations
Allow users to back up and restore chats in JSON or Markdown format.
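For the planned export feature, one possible shape for a Markdown exporter is sketched below. The `Conversation` type and `exportToMarkdown` function are hypothetical, not existing Lumina code:

```typescript
interface Message {
  role: "user" | "assistant";
  content: string;
}

interface Conversation {
  title: string;
  messages: Message[];
}

// Render a conversation as a Markdown document: title heading,
// then each message under a bold role label.
function exportToMarkdown(conv: Conversation): string {
  const lines = [`# ${conv.title}`, ""];
  for (const m of conv.messages) {
    const label = m.role === "user" ? "User" : "Assistant";
    lines.push(`**${label}:**`, "", m.content, "");
  }
  return lines.join("\n");
}
```

A JSON export would be even simpler (`JSON.stringify` of the same structure), and importing would be its inverse, which is why JSON is the natural backup format and Markdown the natural sharing format.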