
Local Deployment

Local deployment does not include PDF and audio/video parsing, as these are cloud-based premium features

This guide walks through the complete process of installing Docker and running OmniBox with Docker Compose on a freshly installed Debian 12 system, working from the command line as the root user.

Deployment Requirements

  1. Have a Linux server with root access and a command line
  2. Possess basic computer knowledge
  3. Be able to access GitHub and ghcr.io
  4. Ensure ports 8080, 8025, and 9000 are publicly accessible (these are the defaults; they can be changed in the .env file)
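
Once the services are running, requirement 4 can be checked from the server itself. A minimal sketch using `ss` (part of iproute2, assumed installed on Debian 12):

```shell
# Check whether the three default ports are listening locally
for port in 8080 8025 9000; do
  if ss -tln 2>/dev/null | grep -q ":${port} "; then
    echo "port ${port}: listening"
  else
    echo "port ${port}: not listening"
  fi
done
```

For external reachability you would additionally need to check any firewall or cloud security-group rules in front of the server.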

Environment Overview

Environment    Description
CPU            i5-12600KF
RAM            64 GB
Storage        1 TB NVMe
OS             Debian 12
Server IP      192.168.0.100

Install Docker

Install Docker using the Tsinghua University (TUNA) mirror:

shell
export DOWNLOAD_URL="https://mirrors.tuna.tsinghua.edu.cn/docker-ce"
wget -O- https://raw.githubusercontent.com/docker/docker-install/master/install.sh | sh
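
The `export` on the first line matters: the install script runs in a child `sh` and only sees `DOWNLOAD_URL` because it was exported into the environment. A quick illustration with throwaway variable names:

```shell
# Only exported variables reach child processes
PLAIN="shell-only"
export SHARED="visible"
sh -c 'echo "PLAIN=${PLAIN:-unset} SHARED=${SHARED:-unset}"'
# → PLAIN=unset SHARED=visible
```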

Clone the Project

shell
GIT_LFS_SKIP_SMUDGE=1 git clone https://github.com/import-ai/omnibox.git
cd omnibox
cp example.env .env

Run the Project

shell
docker compose -f compose.yaml -f compose/deps.yaml up -d

If no errors appear, the project has started successfully.
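
To confirm, you can list the services and scan recent logs. A sketch (service names and health states come from the project's compose files; `|| true` keeps the checks non-fatal if Docker is not available yet):

```shell
# Same file arguments as the up command
COMPOSE_FILES="-f compose.yaml -f compose/deps.yaml"
# All services should report "running" (or "healthy")
docker compose $COMPOSE_FILES ps || true
# Scan recent output for startup errors
docker compose $COMPOSE_FILES logs --tail=50 || true
```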

Register the First Account

Visit http://192.168.0.100:8080 and register with any email address, e.g., omnibox@qq.com, then open http://192.168.0.100:8025 to view the verification code sent by email.

After registering and logging in successfully, you can start using OmniBox.

Configure Environment Variables

After the project is running, you still need to configure some environment variables; otherwise, the AI features and file upload/download will fail.

There are three main parts:

  1. OBW_VECTOR: vector search related
  2. OBW_GRIMOIRE: LLM Q&A related
  3. OBB_S3_PUBLIC_ENDPOINT: file upload/download related
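
As a rough pre-flight check before restarting, you can grep .env for these keys. A hypothetical helper (the key names are the ones used in this guide; the pattern only checks that each key is present with a non-empty value):

```shell
# Hypothetical helper: list keys that are missing or empty in an env file
check_env() {
  file="$1"; shift
  for key in "$@"; do
    # matches KEY=value with a non-empty value (quoted or unquoted)
    if ! grep -Eq "^${key}=\"?[^\"[:space:]]" "$file"; then
      echo "unset: ${key}"
    fi
  done
}
check_env .env OBW_VECTOR_EMBEDDING_API_KEY OBW_GRIMOIRE_OPENAI_DEFAULT_API_KEY OBB_S3_PUBLIC_ENDPOINT
```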

Edit .env:

shell
# >>> AI-related configuration >>>
OBW_VECTOR_EMBEDDING_API_KEY="sk-***"
OBW_VECTOR_EMBEDDING_BASE_URL="https://api.openai.com/v1"
OBW_VECTOR_EMBEDDING_MODEL="text-embedding-3-small"

OBW_GRIMOIRE_OPENAI_DEFAULT_API_KEY="***"
OBW_GRIMOIRE_OPENAI_DEFAULT_BASE_URL="https://api.openai.com/v1"
OBW_GRIMOIRE_OPENAI_DEFAULT_MODEL="gpt-4o"
OBW_GRIMOIRE_OPENAI_MINI_MODEL="gpt-4o-mini"
OBW_GRIMOIRE_OPENAI_LARGE_MODEL="gpt-4.1"
OBW_GRIMOIRE_OPENAI_LARGE_THINKING_MODEL="o3"
# <<< AI-related configuration <<<

# Our server IP here is 192.168.0.100; replace it with your own server's IP
OBB_S3_PUBLIC_ENDPOINT="http://192.168.0.100:9000"

After editing, run the command again to apply the changes:

shell
docker compose -f compose.yaml -f compose/deps.yaml up -d
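
`docker compose up -d` only recreates containers whose configuration changed, so this picks up the new values. To double-check, you can render the merged configuration and look for the S3 endpoint (assuming the variable is interpolated in the compose files; the endpoint value below is the one used in this guide):

```shell
# Assumed value from this guide; replace with your own endpoint
EXPECTED="http://192.168.0.100:9000"
# The rendered config should contain the endpoint if the restart picked it up
docker compose -f compose.yaml -f compose/deps.yaml config | grep "$EXPECTED" \
  || echo "endpoint not found in rendered config"
```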