Production-ready Node.js/TypeScript API server for the Vezlo AI Assistant platform - complete backend APIs with advanced RAG (chunk-based semantic search + adjacent retrieval), Docker deployment, and database migrations.

Changelog | Report Issue | Discussions
New chunk-based architecture with adjacent retrieval for better code understanding.
- Database Schema: New `vezlo_knowledge_chunks` table and RPC functions
- Embedding Model: Upgraded to `text-embedding-3-large` (3072 dimensions)
- Migration: Automatic via `npm run migrate:latest` (migration 006)
- Rollback: Supported via `npm run migrate:rollback`
Upgrade Steps:

```bash
npm install @vezlo/assistant-server@latest
npm run migrate:latest
```

This release introduced multi-tenancy with authentication. Existing data is not auto-migrated.
See CHANGELOG.md for complete migration guide.
- Backend APIs - RESTful API endpoints for AI chat and knowledge management
- Real-time Communication - WebSocket support for live chat with Supabase Realtime broadcasting
- Human Agent Handoff - Agent join/leave workflows with realtime status updates and message synchronization
- Advanced RAG System - Chunk-based semantic search with adjacent retrieval using OpenAI text-embedding-3-large (3072 dims) and pgvector
- Conversation Management - Persistent conversation history with agent support
- Slack Integration - Direct query bot with full AI responses, conversation history, and reaction-based feedback (setup guide)
- Feedback System - Message rating and improvement tracking
- Database Migrations - Knex.js migration system for schema management
- Production Ready - Docker containerization with health checks
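To make the "chunk-based semantic search + adjacent retrieval" idea concrete, here is an illustrative TypeScript sketch. It is not the server's actual implementation: `splitIntoChunks` and `withAdjacent` are hypothetical names, and in the real system chunks live in the `vezlo_knowledge_chunks` table and are matched with pgvector embeddings.

```typescript
// Illustrative sketch of chunking with overlap and adjacent retrieval.
// CHUNK_SIZE / CHUNK_OVERLAP mirror the env vars documented below;
// the function names are hypothetical, not the server's real API.

const CHUNK_SIZE = 1000;
const CHUNK_OVERLAP = 200;

// Split a document into fixed-size chunks that overlap, so text
// straddling a boundary appears in two chunks.
function splitIntoChunks(
  text: string,
  size = CHUNK_SIZE,
  overlap = CHUNK_OVERLAP
): string[] {
  if (overlap >= size) throw new Error("overlap must be smaller than chunk size");
  const chunks: string[] = [];
  const step = size - overlap;
  for (let start = 0; start < text.length; start += step) {
    chunks.push(text.slice(start, start + size));
    if (start + size >= text.length) break;
  }
  return chunks;
}

// After semantic search returns the index of the best-matching chunk,
// "adjacent retrieval" also pulls its neighbours so the model sees the
// surrounding context (e.g. the rest of a function body).
function withAdjacent(chunks: string[], matchIndex: number, radius = 1): string[] {
  const from = Math.max(0, matchIndex - radius);
  const to = Math.min(chunks.length, matchIndex + radius + 1);
  return chunks.slice(from, to);
}
```

With `radius = 1`, a hit on chunk *k* returns chunks *k-1..k+1*, so code that spans a chunk boundary still reaches the model intact.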
```bash
# Install globally
npm install -g @vezlo/assistant-server

# Or install in your project
npm install @vezlo/assistant-server
```

Or clone from source:

```bash
git clone https://github.com/vezlo/assistant-server.git
cd assistant-server
npm install
```

Recommended for Vercel Users - Deploy with automated setup:
The Vercel Marketplace integration provides:
- Guided Configuration - Step-by-step setup wizard
- Automatic Environment Setup - No manual configuration needed
- Database Migration - Automatic table creation
- Production Optimization - Optimized for Vercel's serverless platform
Learn more about the marketplace integration →
- Node.js 20+ and npm 9+
- Supabase project
- OpenAI API key
The fastest way to get started is with our interactive setup wizard:
```bash
# If installed globally
vezlo-setup

# If installed locally
npx vezlo-setup

# Or if cloned from GitHub
npm run setup
```

The wizard will guide you through:
- Supabase Configuration - URL, Service Role Key, DB host/port/name/user/password (with defaults)
- OpenAI Configuration - API key, model, temperature, max tokens
- Validation (non-blocking) - Tests Supabase API and DB connectivity
- Migrations - Runs Knex migrations if DB validation passes; otherwise shows how to run them later
- Environment - Generates `.env` (does not overwrite it if it already exists)
- Default Data Seeding - Creates default admin user and company
- API Key Generation - Generates an API key for the default company
After setup completes, start the server:
```bash
vezlo-server
```

If you prefer manual configuration:

```bash
# Copy example file
cp env.example .env

# Edit with your credentials
nano .env
```

Get your Supabase credentials from:
- Dashboard → Settings → API
- Database → Settings → Connection string
```bash
# Supabase Configuration
SUPABASE_URL=https://your-project.supabase.co
SUPABASE_ANON_KEY=your-anon-key
SUPABASE_SERVICE_KEY=your-service-role-key

# Database Configuration for Migrations
SUPABASE_DB_HOST=db.your-project.supabase.co
SUPABASE_DB_PORT=5432
SUPABASE_DB_NAME=postgres
SUPABASE_DB_USER=postgres
SUPABASE_DB_PASSWORD=your-database-password

# OpenAI Configuration
OPENAI_API_KEY=sk-your-api-key
AI_MODEL=gpt-4o

# Migration Security
MIGRATION_SECRET_KEY=your-secure-migration-key-here
```

```bash
# Using Knex migrations (primary method)
npm run migrate:latest

# Or via API after server is running
curl "http://localhost:3000/api/migrate?key=$MIGRATION_SECRET_KEY"
```

```bash
# Create default admin user and company (if not exists)
npm run seed-default

# Generate API key for library integration
npm run generate-key
```

Optional fallback (not recommended if using migrations):

```bash
# Run raw SQL in Supabase Dashboard → SQL Editor
cat database-schema.sql
```

```bash
# Verify database connection and tables
vezlo-validate

# Or with npm
npm run validate
```

```bash
# If installed globally
vezlo-server

# If installed locally
npx vezlo-server

# Or from source
npm run build && npm start
```

- Copy the environment template and fill in your Supabase/OpenAI values:

  ```bash
  cp env.example .env   # edit .env with your credentials before continuing
  ```

- Build and start the stack. The entrypoint runs migrations, seeds the default org/admin, and generates an API key automatically:

  ```bash
  docker-compose build
  docker-compose up -d
  ```

- View container logs:

  ```bash
  docker-compose logs -f vezlo-server
  ```
Deploy to Vercel's serverless platform with multiple options. The Marketplace integration collects your credentials during configuration and sets environment variables automatically.
Deploy via Vercel Marketplace - Automated setup with guided configuration:
Benefits:
- ✅ Guided Setup - Step-by-step configuration wizard
- ✅ Automatic Environment Variables - No manual env var configuration needed
- ✅ Database Migration - Automatic table creation and schema setup
- ✅ Production Ready - Optimized for Vercel's serverless platform
After Installation:
- Run the migration URL: `https://your-project.vercel.app/api/migrate?key=YOUR_MIGRATION_SECRET`
- Verify deployment: `https://your-project.vercel.app/health`
- Access API docs: `https://your-project.vercel.app/docs`
This will:
- Fork the repository to your GitHub
- Create a Vercel project
- Require marketplace integration setup
- Deploy automatically
```bash
# Install Vercel CLI
npm i -g vercel

# Deploy
vercel

# Follow prompts to configure
```

- Supabase project (URL, Service Role key, DB host/port/name/user/password)
- OpenAI API key
- If not using the Marketplace, add environment variables in Vercel project settings
- Disable Vercel Deployment Protection if the API needs to be publicly accessible; otherwise Vercel shows its SSO page and the browser never reaches your server.
See docs/VERCEL_DEPLOYMENT.md for detailed deployment guide.
Edit .env file with your credentials:
```bash
# REQUIRED - Supabase Configuration
SUPABASE_URL=https://your-project-id.supabase.co
SUPABASE_SERVICE_KEY=your-service-role-key

# REQUIRED - Database Configuration for Knex.js Migrations
SUPABASE_DB_HOST=db.your-project.supabase.co
SUPABASE_DB_PORT=5432
SUPABASE_DB_NAME=postgres
SUPABASE_DB_USER=postgres
SUPABASE_DB_PASSWORD=your-database-password

# REQUIRED - OpenAI Configuration
OPENAI_API_KEY=sk-your-openai-api-key
AI_MODEL=gpt-4o
AI_TEMPERATURE=0.7
AI_MAX_TOKENS=1000

# REQUIRED - Database Migration Security
MIGRATION_SECRET_KEY=your-secure-migration-key-here

# REQUIRED - Authentication
JWT_SECRET=your-super-secret-jwt-key-here-change-this-in-production
DEFAULT_ADMIN_EMAIL=admin@vezlo.org
DEFAULT_ADMIN_PASSWORD=admin123

# OPTIONAL - Server Configuration
PORT=3000
NODE_ENV=production
LOG_LEVEL=info

# OPTIONAL - CORS Configuration
CORS_ORIGINS=http://localhost:3000,http://localhost:5173

# OPTIONAL - Swagger Base URL
BASE_URL=http://localhost:3000

# OPTIONAL - Rate Limiting
RATE_LIMIT_WINDOW=60000
RATE_LIMIT_MAX=100

# OPTIONAL - Organization Settings
ORGANIZATION_NAME=Vezlo
ASSISTANT_NAME=Vezlo Assistant

# OPTIONAL - Knowledge Base (uses text-embedding-3-large, 3072 dims)
CHUNK_SIZE=1000
CHUNK_OVERLAP=200
```

The package provides these command-line tools:
- `vezlo-setup` - Interactive setup wizard that guides you through configuration.
- `vezlo-seed-default` - Creates default admin user and company.
- `vezlo-generate-key` - Generates an API key for the default admin's company; the key is used by the src-to-kb library.
- `vezlo-validate` - Validates the database connection and verifies all tables exist.
- `vezlo-server` - Starts the API server.

Once the server is running:

- API Base: `http://localhost:3000/api`
- Swagger UI: `http://localhost:3000/docs`
- Health Check: `http://localhost:3000/health`
- `POST /api/conversations` - Create new conversation (public widget endpoint)
- `GET /api/conversations` - List company conversations (agent dashboard)
- `GET /api/conversations/:uuid` - Get conversation with messages
- `DELETE /api/conversations/:uuid` - Delete conversation
- `POST /api/conversations/:uuid/join` - Agent joins a conversation
- `POST /api/conversations/:uuid/messages/agent` - Agent sends a message
- `POST /api/conversations/:uuid/close` - Agent closes a conversation
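As a sketch of how an agent dashboard might call the handoff endpoints above (a hypothetical helper, not an official SDK; the `content` body field is an assumption):

```typescript
// Hypothetical helper that builds requests for the agent-handoff
// endpoints listed above; not an official client, and the body
// shape is an assumption for illustration.
type ApiRequest = { method: "GET" | "POST" | "DELETE"; path: string; body?: unknown };

function agentJoin(conversationUuid: string): ApiRequest {
  return { method: "POST", path: `/api/conversations/${conversationUuid}/join` };
}

function agentMessage(conversationUuid: string, content: string): ApiRequest {
  return {
    method: "POST",
    path: `/api/conversations/${conversationUuid}/messages/agent`,
    body: { content },
  };
}

function closeConversation(conversationUuid: string): ApiRequest {
  return { method: "POST", path: `/api/conversations/${conversationUuid}/close` };
}
```

Each descriptor can then be sent with `fetch` against your server's base URL.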
- `POST /api/conversations/:uuid/messages` - Create user message
- `POST /api/messages/:uuid/generate` - Generate AI response
- `POST /api/knowledge/items` - Create knowledge item (supports raw content, pre-chunked data, or chunks with embeddings)
- `GET /api/knowledge/items` - List knowledge items
- `GET /api/knowledge/items/:uuid` - Get knowledge item
- `PUT /api/knowledge/items/:uuid` - Update knowledge item
- `DELETE /api/knowledge/items/:uuid` - Delete knowledge item
Knowledge Ingestion Options:
- Raw Content: Send a `content` field; the server creates chunks and embeddings
- Pre-chunked: Send a `chunks` array with `hasEmbeddings: false`; the server generates embeddings
- Chunks + Embeddings: Send a `chunks` array with embeddings and `hasEmbeddings: true`; the server stores them directly
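The three options might translate into request bodies like the following sketches. Only `content`, `chunks`, and `hasEmbeddings` come from the list above; `title` and the chunk shape are illustrative assumptions (check the Swagger docs for the real schema):

```typescript
// Sketches of the three POST /api/knowledge/items payloads described
// above. `title` and the chunk object shape are assumptions.

// Option 1: raw content - the server chunks and embeds it.
const rawItem = {
  title: "API guide",
  content: "Full document text...",
};

// Option 2: pre-chunked - the server generates the embeddings.
const preChunked = {
  title: "API guide",
  hasEmbeddings: false,
  chunks: [{ content: "Chunk one..." }, { content: "Chunk two..." }],
};

// Option 3: chunks with embeddings - the server stores them directly.
// text-embedding-3-large vectors have 3072 dimensions.
const withEmbeddings = {
  title: "API guide",
  hasEmbeddings: true,
  chunks: [{ content: "Chunk one...", embedding: new Array(3072).fill(0) }],
};
```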
- `GET /api/migrate?key=<secret>` - Run pending database migrations
- `GET /api/migrate/status?key=<secret>` - Check migration status
Migration Workflow:
- Create Migration: Use `npm run migrate:make migration_name` to create new migration files
- Check Status: Use `/api/migrate/status` to see pending migrations
- Run Migrations: Use `/api/migrate` to execute pending migrations remotely
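A file created by `npm run migrate:make` is a Knex migration exporting `up` and `down`; a minimal sketch (the table and columns here are hypothetical, and `knex` is typed loosely to keep the example self-contained):

```typescript
// Hypothetical Knex migration sketch; the table and columns are
// examples only. In the real project the parameter is typed as
// Knex from the "knex" package.
export async function up(knex: any): Promise<void> {
  await knex.schema.createTable("example_table", (table: any) => {
    table.increments("id").primary();
    table.string("name").notNullable();
    table.timestamps(true, true); // created_at / updated_at with defaults
  });
}

export async function down(knex: any): Promise<void> {
  await knex.schema.dropTableIfExists("example_table");
}
```

`down` should always reverse `up`, which is what makes `npm run migrate:rollback` safe.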
Migration Endpoints Usage:
```bash
# Check migration status
curl "http://localhost:3000/api/migrate/status?key=your-migration-secret-key"

# Run pending migrations
curl "http://localhost:3000/api/migrate?key=your-migration-secret-key"
```

Required Environment Variable:

- `MIGRATION_SECRET_KEY` - Secret key for authenticating migration requests
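The migration endpoints presumably just compare the `key` query parameter against `MIGRATION_SECRET_KEY`; a minimal Express-style sketch of such a guard (an assumption about the implementation, not the server's actual code):

```typescript
// Hypothetical guard comparing ?key=... against the configured secret.
// Express types are stubbed with `any` to keep the sketch self-contained.
function requireMigrationKey(secret: string) {
  return (req: any, res: any, next: () => void) => {
    if (!secret || req.query?.key !== secret) {
      res.status(401).json({ error: "Invalid migration key" });
      return;
    }
    next();
  };
}

// Usage sketch:
// app.get("/api/migrate", requireMigrationKey(process.env.MIGRATION_SECRET_KEY ?? ""), runMigrations);
```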
Migration Creation Example:
```bash
# Create a new migration
npm run migrate:make add_users_table

# This creates: src/migrations/002_add_users_table.ts
# Edit the file to add your schema changes
# Then run via endpoint or command line
```

Search:

- `POST /api/knowledge/search` - Search knowledge base

Feedback:

- `POST /api/feedback` - Submit message feedback (Public API)
- `DELETE /api/feedback/:uuid` - Delete/undo message feedback (Public API)

WebSocket Events:

- `join-conversation` - Join conversation room
- `conversation:message` - Real-time message updates
The conversation system follows the industry-standard 2-API flow pattern for AI chat applications:
`POST /api/conversations/{conversation-uuid}/messages`

- Purpose: Store the user's message in the conversation
- Response: Returns the user message with UUID

`POST /api/messages/{message-uuid}/generate`

- Purpose: Generate AI response based on the user message
- Response: Returns the AI assistant's response
This pattern is the globally recognized standard because:
✅ Separation of Concerns
- User message storage is separate from AI generation
- Allows for message persistence even if AI generation fails
- Enables message history and conversation management
✅ Reliability & Error Handling
- User messages are saved immediately
- AI generation can be retried independently
- Partial failures don't lose user input
✅ Scalability
- AI generation can be queued/processed asynchronously
- Different rate limits for storage vs generation
- Enables streaming responses and real-time updates
✅ Industry Standard
- Used by OpenAI, Anthropic, Google, and other major AI platforms
- Familiar pattern for developers
- Enables advanced features like message regeneration, threading, and branching
```bash
# 1. User sends message
curl -X POST /api/conversations/abc123/messages \
  -d '{"content": "How do I integrate your API?"}'
# Response: {"uuid": "msg456", "content": "How do I integrate your API?", ...}

# 2. Generate AI response
curl -X POST /api/messages/msg456/generate \
  -d '{}'
# Response: {"uuid": "msg789", "content": "To integrate our API...", ...}
```

Use the built-in migration endpoints to create/upgrade tables:
```bash
# Run pending migrations
curl "http://localhost:3000/api/migrate?key=your-migration-secret-key"

# Check migration status
curl "http://localhost:3000/api/migrate/status?key=your-migration-secret-key"
```

These endpoints execute Knex migrations and keep the schema versioned.
If you prefer manual setup, run the SQL schema in Supabase SQL Editor:
```bash
# View the schema SQL locally
cat database-schema.sql

# Copy into Supabase Dashboard → SQL Editor and execute
```

The database-schema.sql file contains all required tables and functions.
```bash
# Start services
docker-compose up -d

# View logs
docker-compose logs -f vezlo-server

# Stop services
docker-compose down

# Rebuild and start
docker-compose up -d --build

# View running containers
docker-compose ps

# Access container shell
docker exec -it vezlo-server sh
```

Health check:

```bash
curl http://localhost:3000/health
```

Test the conversation flow:

```bash
# 1. Create conversation
CONV_UUID=$(curl -X POST http://localhost:3000/api/conversations \
  -H "Content-Type: application/json" \
  -d '{"title": "Test Conversation", "user_uuid": 12345, "company_uuid": 67890}' \
  | jq -r '.uuid')

# 2. Send user message
MSG_UUID=$(curl -X POST http://localhost:3000/api/conversations/$CONV_UUID/messages \
  -H "Content-Type: application/json" \
  -d '{"content": "Hello, how can you help me?"}' \
  | jq -r '.uuid')

# 3. Generate AI response
curl -X POST http://localhost:3000/api/messages/$MSG_UUID/generate \
  -H "Content-Type: application/json" \
  -d '{}'
```

Search the knowledge base:

```bash
curl -X POST http://localhost:3000/api/knowledge/search \
  -H "Content-Type: application/json" \
  -d '{
    "query": "How to use the API?",
    "limit": 5,
    "threshold": 0.7,
    "type": "hybrid"
  }'
```

```bash
# Install dependencies
npm install

# Build TypeScript
npm run build

# Start server (Node)
npm start

# Or start via CLI wrapper
npx vezlo-server

# Run tests
npm test
```

```
vezlo/
├── docs/                      # Documentation
│   ├── DEVELOPER_GUIDELINES.md
│   └── MIGRATIONS.md
├── src/
│   ├── config/                # Configuration files
│   ├── controllers/           # API route handlers
│   ├── middleware/            # Express middleware
│   ├── schemas/               # API request/response schemas
│   ├── services/              # Business logic services
│   ├── storage/               # Database repositories
│   ├── types/                 # TypeScript type definitions
│   ├── migrations/            # Database migrations
│   └── server.ts              # Main application entry
├── scripts/                   # Utility scripts
├── Dockerfile                 # Production container
├── docker-compose.yml         # Docker Compose configuration
├── knexfile.ts                # Database configuration
├── env.example                # Environment template
├── package.json               # Dependencies and scripts
└── tsconfig.json              # TypeScript configuration
```
Ensure all required environment variables are set:
- `SUPABASE_URL` and `SUPABASE_SERVICE_KEY` (required)
- `SUPABASE_DB_HOST`, `SUPABASE_DB_PASSWORD` (required for migrations)
- `OPENAI_API_KEY` (required)
- `MIGRATION_SECRET_KEY` (required for migration endpoints)
- `JWT_SECRET` (required for authentication)
- `DEFAULT_ADMIN_EMAIL` and `DEFAULT_ADMIN_PASSWORD` (required for initial setup)
- `NODE_ENV=production`
- `CORS_ORIGINS` (set to your domain)
- `BASE_URL` (optional, for custom Swagger server URL)
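A fail-fast startup check for the required variables above could look like this sketch (a hypothetical helper, not the server's actual startup code):

```typescript
// Sketch: fail fast at startup if a required variable is missing.
// The list mirrors the checklist above; the helper name is hypothetical.
const REQUIRED_VARS = [
  "SUPABASE_URL",
  "SUPABASE_SERVICE_KEY",
  "OPENAI_API_KEY",
  "MIGRATION_SECRET_KEY",
  "JWT_SECRET",
] as const;

function missingVars(env: Record<string, string | undefined>): string[] {
  return REQUIRED_VARS.filter((name) => !env[name]);
}

// e.g. at server startup:
// const missing = missingVars(process.env);
// if (missing.length) throw new Error(`Missing env vars: ${missing.join(", ")}`);
```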
```bash
# Build production image
docker build -t vezlo-server .

# Run production container
docker run -d \
  --name vezlo-server \
  -p 3000:3000 \
  --env-file .env \
  vezlo-server
```

- Health check endpoint: `/health`
- Docker health check configured
- Logs available in the `./logs/` directory
```bash
# Check migration status
curl "https://your-domain.com/api/migrate/status?key=your-migration-secret-key"

# Run pending migrations
curl "https://your-domain.com/api/migrate?key=your-migration-secret-key"
```

- Fork the repository
- Create feature branch: `git checkout -b feature/new-feature`
- Make changes and test locally
- Run tests: `npm test`
- Commit: `git commit -m 'Add new feature'`
- Push: `git push origin feature/new-feature`
- Submit pull request
- TypeScript - Full type safety required
- ESLint - Code formatting and quality
- Prettier - Consistent code style
- Tests - Unit tests for new features
- Documentation - Update README for API changes
- Follow RESTful conventions
- Use proper HTTP status codes
- Include comprehensive error handling
- Update Swagger documentation
- Add request/response schemas
- Response Time: Optimized for fast API responses
- Concurrent Users: Supports multiple concurrent users
- Memory Usage: Efficient memory management
- Database: Supabase vector operations integration
- Rate Limiting - Configurable request limits
- CORS Protection - Configurable origins
- Input Validation - Request schema validation
- Error Handling - Secure error responses
- Health Monitoring - Application logs and Docker health checks
- Developer Guidelines - Development workflow, coding standards, and best practices
- Database Migrations - Complete guide to Knex.js migration system
- API Documentation - Interactive Swagger documentation (when running)
This project is dual-licensed:
- Non-Commercial Use: Free under AGPL-3.0 license
- Commercial Use: Requires a commercial license - contact us for details
Status: ✅ Production Ready | Version: 2.9.0 | Node.js: 20+ | TypeScript: 5+