
# Configuration

Omelette is configured via environment variables. Copy `.env.example` to `.env` and customize.
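For orientation, a minimal `.env` might look like the following. This is an illustrative sketch: the values shown are the documented defaults, and the choice of `mock` for `LLM_PROVIDER` is just one option among the supported backends.

```shell
# Application
APP_ENV=development
APP_DEBUG=true
APP_HOST=0.0.0.0
APP_PORT=8000

# Database
DATABASE_URL=sqlite:///./data/omelette.db

# LLM provider (mock works without any API keys)
LLM_PROVIDER=mock

# Embeddings
EMBEDDING_PROVIDER=local
EMBEDDING_MODEL=BAAI/bge-m3
```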

## Application

| Variable | Description | Default |
| --- | --- | --- |
| `APP_ENV` | `development`, `production`, or `testing` | `development` |
| `APP_DEBUG` | Enable debug mode | `true` |
| `APP_HOST` | Backend bind address | `0.0.0.0` |
| `APP_PORT` | Backend port | `8000` |
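A settings loader along these lines would apply the defaults above. This is a sketch, not Omelette's actual code; the function name and the set of truthy strings are illustrative assumptions.

```python
import os


def app_settings(env=os.environ):
    """Resolve application settings, falling back to the documented defaults."""
    return {
        "env": env.get("APP_ENV", "development"),
        # Treat common truthy spellings as enabling debug mode (assumption).
        "debug": env.get("APP_DEBUG", "true").lower() in ("1", "true", "yes"),
        "host": env.get("APP_HOST", "0.0.0.0"),
        "port": int(env.get("APP_PORT", "8000")),
    }
```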

## Database

| Variable | Description | Default |
| --- | --- | --- |
| `DATABASE_URL` | SQLite connection string | `sqlite:///./data/omelette.db` |

## Data Storage

| Variable | Description |
| --- | --- |
| `DATA_DIR` | Base path for PDFs, OCR output, ChromaDB |
| `PDF_DIR` | PDF storage (default: `{DATA_DIR}/pdfs`) |
| `OCR_OUTPUT_DIR` | OCR output (default: `{DATA_DIR}/ocr_output`) |
| `CHROMA_DB_DIR` | ChromaDB path (default: `{DATA_DIR}/chroma_db`) |
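Path resolution can be sketched as follows. This is illustrative, not the project's actual code; in particular, the `./data` fallback for `DATA_DIR` is an assumption (the table gives no default for it, though it matches the default database path).

```python
import os
from pathlib import Path


def data_paths(env=os.environ):
    """Resolve storage paths; subdirectories default to locations under DATA_DIR."""
    data_dir = Path(env.get("DATA_DIR", "./data"))  # fallback is an assumption
    return {
        "pdf": Path(env.get("PDF_DIR", data_dir / "pdfs")),
        "ocr": Path(env.get("OCR_OUTPUT_DIR", data_dir / "ocr_output")),
        "chroma": Path(env.get("CHROMA_DB_DIR", data_dir / "chroma_db")),
    }
```

Setting only `DATA_DIR` moves all three stores at once; setting an individual variable overrides just that one path.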

## LLM Provider

Set `LLM_PROVIDER` to one of: `openai`, `anthropic`, `aliyun`, `volcengine`, `ollama`, `mock`.
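Validation of this value can be sketched as below. The function name and error behavior are hypothetical, not Omelette's actual implementation; only the set of provider names comes from this page.

```python
import os

# Provider names documented on this page.
SUPPORTED_PROVIDERS = {"openai", "anthropic", "aliyun", "volcengine", "ollama", "mock"}


def resolve_llm_provider(env=os.environ):
    """Return the configured provider name, rejecting unknown values."""
    provider = env.get("LLM_PROVIDER", "").lower()
    if provider not in SUPPORTED_PROVIDERS:
        raise ValueError(f"Unsupported LLM_PROVIDER: {provider!r}")
    return provider
```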

### OpenAI

| Variable | Description |
| --- | --- |
| `OPENAI_API_KEY` | OpenAI API key |
| `OPENAI_MODEL` | Model name (default: `gpt-4o-mini`) |

### Anthropic

| Variable | Description |
| --- | --- |
| `ANTHROPIC_API_KEY` | Anthropic API key |
| `ANTHROPIC_MODEL` | Model name (default: `claude-sonnet-4-20250514`) |

### Aliyun Bailian

| Variable | Description |
| --- | --- |
| `ALIYUN_API_KEY` | Aliyun API key |
| `ALIYUN_BASE_URL` | OpenAI-compatible endpoint |
| `ALIYUN_MODEL` | Model name (e.g. `qwen3.5-plus`) |

### Volcengine Doubao

| Variable | Description |
| --- | --- |
| `VOLCENGINE_API_KEY` | Volcengine API key |
| `VOLCENGINE_BASE_URL` | OpenAI-compatible endpoint |
| `VOLCENGINE_MODEL` | Model name |

### Ollama (local)

| Variable | Description |
| --- | --- |
| `OLLAMA_BASE_URL` | Ollama server URL (default: `http://localhost:11434`) |
| `OLLAMA_MODEL` | Model name (default: `llama3`) |

### Mock

Use `LLM_PROVIDER=mock` for testing without API keys. No additional variables are required.

## Embeddings

| Variable | Description | Default |
| --- | --- | --- |
| `EMBEDDING_PROVIDER` | `local`, `api`, or `mock` | `local` |
| `EMBEDDING_MODEL` | Model name (local: HuggingFace; api: OpenAI-compatible) | `BAAI/bge-m3` |

- `local`: uses sentence-transformers with GPU auto-detection
- `api`: uses an OpenAI-compatible embedding API
- `mock`: deterministic mock for tests

## GPU

| Variable | Description | Default |
| --- | --- | --- |
| `CUDA_VISIBLE_DEVICES` | Comma-separated GPU IDs for OCR/embeddings | `0,3` |
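`CUDA_VISIBLE_DEVICES` is interpreted by CUDA itself, but application code that needs the ID list (for example, to split OCR and embedding work across GPUs) could parse it like this. The function is a hypothetical sketch; only the `0,3` default comes from the table above.

```python
import os


def visible_gpu_ids(env=os.environ):
    """Parse CUDA_VISIBLE_DEVICES into a list of integer GPU IDs."""
    raw = env.get("CUDA_VISIBLE_DEVICES", "0,3")  # documented default
    return [int(part) for part in raw.split(",") if part.strip()]
```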

## Proxy

| Variable | Description |
| --- | --- |
| `HTTP_PROXY` | HTTP proxy URL |
| `HTTPS_PROXY` | HTTPS proxy URL |

Example: `HTTP_PROXY=http://127.0.0.1:20171/`

## External APIs

| Variable | Description |
| --- | --- |
| `SEMANTIC_SCHOLAR_API_KEY` | Optional; increases the Semantic Scholar rate limit |
| `UNPAYWALL_EMAIL` | Required for Unpaywall PDF lookup |

## Frontend Settings

The LLM provider, model, temperature, and API keys can also be configured on the Settings page (`/settings`) in the web UI. These settings are stored in the database and override environment variables for the current user; use them for per-user customization without editing `.env`.

Released under the MIT License.