Every enterprise AI deployment eventually hits the same wall. Your compliance team needs data sovereignty guarantees. Your security team needs air-gapped network support. Your infrastructure team needs to run the stack on hardware they control, behind firewalls they manage, with audit trails they own.

Cloud-hosted agent platforms can't satisfy these requirements. The moment your agent telemetry, decision logs, or customer data crosses a network boundary you don't control, you've introduced risk that no SLA can mitigate. Regulated industries — financial services, healthcare, defense, legal — need a different model entirely.

Oceum Enterprise is the self-hosted distribution of Oceum's governed agent infrastructure. It runs the full platform — every API endpoint, every cron job, every fleet management capability — inside your own infrastructure using Docker Compose. No cloud dependency. No telemetry. No external calls except the ones your agents make.

Architecture overview

The self-hosted stack consists of three containers orchestrated by Docker Compose: the API server, the cron runner, and the Postgres database.

An optional fourth container — nginx — provides TLS termination and reverse proxying for production deployments. If you already have a load balancer or ingress controller, you can skip it.

The key architectural pattern is the DB Adapter. The codebase uses an environment variable DB_PROVIDER=postgres to route all database calls through a Postgres adapter backed by the pg package. There is zero Supabase SDK usage in the enterprise build. Every query is raw SQL.
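As a sketch of what that adapter boundary can look like (the function and file names here are illustrative, not the actual enterprise source):

```javascript
// Illustrative sketch of the DB Adapter pattern: one factory keyed off
// DB_PROVIDER, one query(text, params) surface, raw parameterized SQL only.
function createDb(env = process.env) {
  if (env.DB_PROVIDER === 'postgres') {
    let pool = null; // lazily created pg Pool
    return {
      provider: 'postgres',
      async query(text, params = []) {
        if (!pool) {
          const { Pool } = require('pg'); // assumes the `pg` package
          pool = new Pool({ connectionString: env.DATABASE_URL });
        }
        const { rows } = await pool.query(text, params);
        return rows;
      },
    };
  }
  throw new Error(`Unsupported DB_PROVIDER: ${env.DB_PROVIDER}`);
}

// Callers never import a vendor SDK; they only see db.query():
// const agents = await createDb().query(
//   'SELECT id, name FROM agents WHERE org_id = $1', [orgId]);

module.exports = { createDb };
```

Because callers depend only on the `query` surface, swapping the backing store means adding one more branch to the factory, not touching call sites.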

Prerequisites

Before you begin, ensure you have Docker with the Compose plugin installed, access to the enterprise repository, your Oceum license key, and an API key for your AI provider.

Step-by-step deployment

1. Get the enterprise repository. Enterprise customers receive access to the private repository after licensing. Once you have access:

git clone <your-enterprise-repo-url>
cd oceum-enterprise

Don't have access yet? Contact us to get started with an enterprise license.

2. Configure your environment. Copy the example file and fill in your values:

cp .env.example .env

The key variables:

# Database
POSTGRES_USER=oceum
POSTGRES_PASSWORD=your-secure-password
POSTGRES_DB=oceum
DATABASE_URL=postgresql://oceum:your-secure-password@db:5432/oceum

# Application
DB_PROVIDER=postgres
JWT_SECRET=your-jwt-secret-min-32-chars
NODE_ENV=production
PORT=3000

# AI Provider
ANTHROPIC_API_KEY=sk-ant-...

# License
OCEUM_LICENSE_KEY=your-license-key
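Before starting the stack, it can be worth failing fast on an incomplete .env. A hypothetical pre-flight guard (the variable list comes from the example above; the helper itself is not part of the product):

```javascript
// Hypothetical pre-flight check: throw before boot if required
// configuration is missing, rather than failing mid-request later.
const REQUIRED = [
  'DATABASE_URL', 'DB_PROVIDER', 'JWT_SECRET',
  'ANTHROPIC_API_KEY', 'OCEUM_LICENSE_KEY',
];

function checkEnv(env = process.env, required = REQUIRED) {
  const missing = required.filter((key) => !env[key]);
  if (missing.length > 0) {
    throw new Error(`Missing required env vars: ${missing.join(', ')}`);
  }
  // JWT_SECRET must be at least 32 characters (see the example above).
  if (env.JWT_SECRET.length < 32) {
    throw new Error('JWT_SECRET must be at least 32 characters');
  }
}

module.exports = { checkEnv };
```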

3. Start the stack.

docker compose up -d

This pulls the Node 20 Alpine and Postgres 16 images, creates the containers, and starts them. The API server waits for Postgres to report healthy before accepting connections.
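Compose expresses that "wait for Postgres" behavior with a healthcheck plus a dependency condition. A sketch of the relevant fragment (service names and check options here are assumptions, not the shipped file):

```yaml
services:
  db:
    image: postgres:16
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U ${POSTGRES_USER} -d ${POSTGRES_DB}"]
      interval: 5s
      timeout: 5s
      retries: 10
  api:
    image: node:20-alpine
    depends_on:
      db:
        condition: service_healthy
```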

4. Run the setup script. On first deployment, initialize the database schema and create your admin organization, first user, and API key:

docker compose exec api node scripts/setup-db.js

The script outputs your admin credentials and an API key. Store these securely — the API key is shown only once.

5. Verify the deployment. Hit the health endpoint:

curl http://localhost:3000/api/health

A healthy response returns:

{
  "status": "ok",
  "version": "enterprise",
  "db": "connected",
  "crons": "running",
  "uptime": 12
}
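If you script this check, for CI gates or post-deploy smoke tests, a small helper against the response shape above is enough. The polling function and its defaults are illustrative:

```javascript
// Returns true only when the health payload indicates a fully ready stack.
function isHealthy(payload) {
  return payload.status === 'ok'
    && payload.db === 'connected'
    && payload.crons === 'running';
}

// Illustrative poll loop: retry until the endpoint reports healthy.
async function waitForHealthy(url, attempts = 30, delayMs = 1000) {
  for (let i = 0; i < attempts; i++) {
    try {
      const res = await fetch(url);
      if (res.ok && isHealthy(await res.json())) return true;
    } catch {
      // server not accepting connections yet
    }
    await new Promise((resolve) => setTimeout(resolve, delayMs));
  }
  return false;
}
```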

License tiers

Oceum Enterprise ships with three license tiers. Each unlocks different fleet capacity and platform features:

Pro ($49/mo). Unlimited agents. 10,000 governed executions per month. Full API access, core integrations, zero-knowledge vault, 7-day audit trail. Ideal for builders and small teams.

Team ($999/mo). 250,000 governed executions. Drift Engine included. Advanced approval workflows, 90-day audit trail, up to 5 protocol adapters, priority support. Built for operations teams scaling governed autonomy.

Enterprise (custom, $30k+ ARR). Everything in Team, plus self-hosted or air-gapped deployment, Orion LLM for sovereign environments, full protocol adapter suite, SSO/SAML, SLA, and dedicated architecture support.

License enforcement is local. The server validates the key on startup and checks tier limits before agent creation. No license server. No phone-home. The key is a signed JWT that encodes the tier, org ID, and expiration. It works fully offline.

What you get

The self-hosted build is not a stripped-down version. It is the full platform: every API endpoint, every cron job, every fleet management capability that ships in the hosted product.

Three utility scripts ship with the repo, including the scripts/setup-db.js initializer used above.

Optional: nginx and TLS

For production deployments exposed beyond localhost, enable the nginx container in docker-compose.yml:

# Uncomment the nginx service in docker-compose.yml
# Place your TLS certificate and key in ./nginx/certs/
# nginx/conf.d/default.conf routes to the API container

docker compose --profile production up -d

The included nginx configuration handles TLS termination, proxies all /api/* routes to the Express server, and serves static assets. If you're running behind an existing reverse proxy or Kubernetes ingress, point it directly to the API container on port 3000 and skip the nginx container entirely.
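A minimal default.conf along those lines might look like the following. The certificate paths and upstream name are assumptions based on the steps above, not the shipped config:

```nginx
server {
    listen 443 ssl;
    ssl_certificate     /etc/nginx/certs/fullchain.pem;
    ssl_certificate_key /etc/nginx/certs/privkey.pem;

    # Proxy all API traffic to the Express server on the Compose network.
    location /api/ {
        proxy_pass http://api:3000;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto https;
    }
}
```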

For air-gapped environments, pre-pull the Docker images on a connected machine, export them with docker save, transfer via secure media, and load with docker load. The stack requires no internet access once the images and your model provider keys are in place.

Same platform. Your infrastructure. Oceum Enterprise gives you the full agent management stack — every endpoint, every cron, every fleet capability — running on hardware you control. No telemetry. No cloud dependency. No data leaving your perimeter. For regulated industries and security-conscious organizations, this is not a feature. It is the prerequisite.