# Local Deployment
This guide walks through the complete process of installing Docker and running the OmniBox cloud service with Docker Compose on a freshly installed Debian 12 system, starting from the command line as the root user.
## Deployment Requirements
- Have a Linux server with access to the `root` user and a command line
- Possess basic computer knowledge
- Be able to access GitHub and ghcr.io
- By default, ports 8080, 8025, and 9000 are publicly accessible (or change them in the .env file)
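Before starting, you can verify that none of the default ports are already taken. A minimal sketch using `ss` (part of iproute2, preinstalled on Debian 12); the port list mirrors the defaults above:

```shell
# Report whether each default OmniBox port is free or already bound.
PORTS="8080 8025 9000"
for PORT in $PORTS; do
  if ss -tln | grep -q ":${PORT} "; then
    echo "port ${PORT} is already in use"
  else
    echo "port ${PORT} is free"
  fi
done
```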
## Environment Overview
| Environment | Description |
|---|---|
| CPU | i5-12600KF |
| RAM | 64GB |
| Storage | 1 TB NVMe |
| OS | Debian 12 |
| Server IP | 192.168.0.100 |
## Install Docker
Install Docker using Tsinghua Mirror:
```shell
export DOWNLOAD_URL="https://mirrors.tuna.tsinghua.edu.cn/docker-ce"
wget -O- https://raw.githubusercontent.com/docker/docker-install/master/install.sh | sh
```

## Clone the Project
```shell
GIT_LFS_SKIP_SMUDGE=1 git clone https://github.com/import-ai/omnibox.git
cd omnibox
cp example.env .env
```

## Run the Project
```shell
docker compose -f compose.yaml -f compose/deps.yaml up -d
```

If there are no errors, the project has started successfully.
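Once the command returns, a quick way to confirm the stack is healthy is to check whether the web UI answers. A minimal sketch, assuming `curl` is installed and `192.168.0.100` is your server's address:

```shell
# Check whether the OmniBox web UI responds on port 8080.
# SERVER_IP is an assumption from this guide -- substitute your own address.
SERVER_IP="192.168.0.100"
if curl -fsS --max-time 5 -o /dev/null "http://${SERVER_IP}:8080"; then
  echo "web UI reachable"
else
  echo "web UI not reachable yet; inspect the container logs"
fi
```

`docker compose -f compose.yaml -f compose/deps.yaml ps` also lists the state of each container.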
## Register the First Account
Visit http://192.168.0.100:8080 and register with any email, e.g., omnibox@qq.com, then go to http://192.168.0.100:8025 to view the email verification code.
After successful registration and login, you can start using OmniBox.
## Configure Environment Variables
After the project is running, you still need to configure some environment variables; otherwise, features involving AI and file upload/download will report errors.
There are 3 main parts:

- `OBW_VECTOR`: vector search related
- `OBW_GRIMOIRE`: web collection and LLM Q&A related
- `OBB_S3_PUBLIC_ENDPOINT`: file upload/download related
Edit .env:
```shell
# >>> AI-related configuration >>>
OBW_VECTOR_EMBEDDING_API_KEY="sk-***"
OBW_VECTOR_EMBEDDING_BASE_URL="https://api.openai.com/v1"
OBW_VECTOR_EMBEDDING_MODEL="text-embedding-3-small"
OBW_GRIMOIRE_OPENAI_DEFAULT_API_KEY="***"
OBW_GRIMOIRE_OPENAI_DEFAULT_BASE_URL="https://api.openai.com/v1"
OBW_GRIMOIRE_OPENAI_DEFAULT_MODEL="gpt-4o"
OBW_GRIMOIRE_OPENAI_MINI_MODEL="gpt-4o-mini"
OBW_GRIMOIRE_OPENAI_LARGE_MODEL="gpt-4.1"
OBW_GRIMOIRE_OPENAI_LARGE_THINKING_MODEL="o3"
# <<< AI-related configuration <<<

# Our server IP here is 192.168.0.100; replace it with your server's external address and make sure it is directly reachable from the browser
OBB_S3_PUBLIC_ENDPOINT="http://192.168.0.100:9000"
```

After editing, run again:
```shell
docker compose -f compose.yaml -f compose/deps.yaml up -d
```
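Because `OBB_S3_PUBLIC_ENDPOINT` is resolved in the user's browser, it is worth testing it from a client machine on the same network rather than only from the server. A hedged sketch, assuming `curl` and the example address from this guide:

```shell
# Any HTTP response (even an error status) means the endpoint is reachable;
# only a connection failure or timeout indicates a problem.
S3_ENDPOINT="http://192.168.0.100:9000"   # assumption: the example address above
if curl -sS --max-time 5 -o /dev/null "$S3_ENDPOINT"; then
  echo "S3 endpoint reachable"
else
  echo "S3 endpoint not reachable from this machine"
fi
```

If this fails from the client but works on the server, check firewall rules and that the address is the server's external one, not a loopback or internal address.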