Oceum is BYO Agent. Bring whatever you've built — a Python script, a LangChain pipeline, a CrewAI crew, a raw HTTP loop — and Oceum manages it. Monitoring, reputation, fleet-wide visibility, and autonomy tiers. No vendor lock-in. No proprietary agent format. If your code can make an HTTP request, it can join the fleet.
This tutorial walks through the full connection flow: get an API key, send your first heartbeat, report task lifecycles, and integrate with five different stacks.
Step 1: Get your API key
Sign up at /portal and create a workspace. Navigate to Settings and copy your API key. Every request to the Oceum webhook requires this key in the x-api-key header.
You also need an agent ID — a unique string that identifies this agent in your fleet. Use something descriptive: support-triage-v2, content-writer, data-pipeline-prod.
Step 2: Send your first heartbeat
A heartbeat tells Oceum your agent is alive. Send one on startup, then on a regular interval (every 30–60 seconds is typical). If heartbeats stop, Oceum's Uptime monitoring flags the agent as degraded.
curl -X POST https://oceum.ai/api/webhook \
-H "Content-Type: application/json" \
-H "x-api-key: YOUR_API_KEY" \
-d '{
"agent_id": "support-triage-v2",
"event": "heartbeat",
"data": {}
}'
That's it. Your agent now appears in the Oceum fleet dashboard with a green liveness indicator.
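In a long-running agent you won't run curl by hand; the usual pattern is a background loop. Here is a minimal Python sketch under stated assumptions — the helper names (`heartbeat_payload`, `start_heartbeat`) are mine, not part of any Oceum SDK, and the 30-second default follows the interval suggested above:

```python
import os
import threading
import time
import requests

OCEUM_URL = "https://oceum.ai/api/webhook"

def heartbeat_payload(agent_id):
    # Matches the curl body above: heartbeat event, empty data object.
    return {"agent_id": agent_id, "event": "heartbeat", "data": {}}

def start_heartbeat(agent_id, interval=30):
    """Post a heartbeat every `interval` seconds from a daemon thread."""
    headers = {
        "Content-Type": "application/json",
        # .get() so the sketch imports cleanly; set the real key in production.
        "x-api-key": os.environ.get("OCEUM_API_KEY", "")
    }
    def loop():
        while True:
            try:
                requests.post(OCEUM_URL, json=heartbeat_payload(agent_id),
                              headers=headers, timeout=10)
            except requests.RequestException:
                pass  # one missed beat is fine; the next one restores liveness
            time.sleep(interval)
    threading.Thread(target=loop, daemon=True).start()
```

The daemon thread means the loop dies with your process, so a crashed agent stops heartbeating — which is exactly the signal Oceum uses to flag it.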
Step 3: Report a task lifecycle
Agents do work. Oceum tracks that work through two events: taskStart and taskComplete. Start a task when execution begins, complete it when it finishes. This gives you timing, success rates, and a full activity log.
# Start a task
curl -X POST https://oceum.ai/api/webhook \
-H "Content-Type: application/json" \
-H "x-api-key: YOUR_API_KEY" \
-d '{
"agent_id": "support-triage-v2",
"event": "taskStart",
"data": { "name": "classify-ticket-4812" }
}'
# Complete the task
curl -X POST https://oceum.ai/api/webhook \
-H "Content-Type: application/json" \
-H "x-api-key: YOUR_API_KEY" \
-d '{
"agent_id": "support-triage-v2",
"event": "taskComplete",
"data": {
"name": "classify-ticket-4812",
"meta": { "category": "billing", "confidence": 0.94 }
}
}'
The meta field is freeform. Pass whatever context is useful — confidence scores, output summaries, token counts, latency.
Integration examples
Five ways to connect. Pick the one that fits your stack.
1. curl (raw HTTP)
Already covered above. The webhook is a single POST endpoint. Any language or tool that can make HTTP requests works.
POST https://oceum.ai/api/webhook
Content-Type: application/json
x-api-key: YOUR_API_KEY
{
  "agent_id": "my-agent",
  "event": "heartbeat" | "taskStart" | "taskComplete" | "error",
  "data": { ... }
}
2. Node.js SDK
The oceum npm package wraps the webhook with a clean API. Zero dependencies.
npm install oceum
import Oceum from 'oceum';
const agent = new Oceum({
  apiKey: process.env.OCEUM_API_KEY,
  agentId: 'support-triage-v2'
});

// Heartbeat
await agent.heartbeat();

// Task lifecycle
await agent.taskStart('classify-ticket-4812');
// ... do work ...
await agent.taskComplete('classify-ticket-4812', {
  category: 'billing',
  confidence: 0.94
});

// Error reporting
await agent.error('OpenAI rate limit exceeded', {
  retryIn: 30
});

// Wrap a function (auto taskStart + taskComplete)
const result = await agent.wrap('classify-ticket-4812', async () => {
  return await classifyTicket(ticket);
});
3. Python (requests)
Direct HTTP calls with the requests library. No SDK required.
import requests
import os
OCEUM_URL = "https://oceum.ai/api/webhook"
HEADERS = {
    "Content-Type": "application/json",
    "x-api-key": os.environ["OCEUM_API_KEY"]
}
AGENT_ID = "data-pipeline-prod"

def heartbeat():
    requests.post(OCEUM_URL, json={
        "agent_id": AGENT_ID,
        "event": "heartbeat",
        "data": {}
    }, headers=HEADERS)

def task_start(name):
    requests.post(OCEUM_URL, json={
        "agent_id": AGENT_ID,
        "event": "taskStart",
        "data": {"name": name}
    }, headers=HEADERS)

def task_complete(name, meta=None):
    requests.post(OCEUM_URL, json={
        "agent_id": AGENT_ID,
        "event": "taskComplete",
        "data": {"name": name, "meta": meta or {}}
    }, headers=HEADERS)

# Usage
heartbeat()
task_start("etl-run-2026-03-18")
# ... do work ...
task_complete("etl-run-2026-03-18", {
    "rows_processed": 14200,
    "duration_ms": 3420
})
4. LangChain callback
Wrap your LangChain agent execution with Oceum reporting. The callback fires on chain start, chain end, and chain error.
import requests
import os
from langchain.callbacks.base import BaseCallbackHandler
class OceumCallback(BaseCallbackHandler):
    def __init__(self, agent_id):
        self.agent_id = agent_id
        self.task_name = "langchain-task"
        self.url = "https://oceum.ai/api/webhook"
        self.headers = {
            "Content-Type": "application/json",
            "x-api-key": os.environ["OCEUM_API_KEY"]
        }

    def _send(self, event, data):
        requests.post(self.url, json={
            "agent_id": self.agent_id,
            "event": event,
            "data": data
        }, headers=self.headers)

    def on_chain_start(self, serialized, inputs, **kwargs):
        # Remember the chain's name so taskComplete reports the same task.
        self.task_name = serialized.get("name", "langchain-task")
        self._send("taskStart", {"name": self.task_name})

    def on_chain_end(self, outputs, **kwargs):
        self._send("taskComplete", {
            "name": self.task_name,
            "meta": {"output_keys": list(outputs.keys())}
        })

    def on_chain_error(self, error, **kwargs):
        self._send("error", {"message": str(error)})
# Usage
callback = OceumCallback("langchain-support-agent")
agent.run("Classify this ticket", callbacks=[callback])
5. CrewAI
Hook into CrewAI's execution flow. Report each crew kickoff as a task, and report completion when the crew finishes.
import requests
import os
from crewai import Crew, Agent, Task
OCEUM_URL = "https://oceum.ai/api/webhook"
HEADERS = {
    "Content-Type": "application/json",
    "x-api-key": os.environ["OCEUM_API_KEY"]
}
AGENT_ID = "crewai-research-crew"

def oceum_event(event, data):
    requests.post(OCEUM_URL, json={
        "agent_id": AGENT_ID,
        "event": event,
        "data": data
    }, headers=HEADERS)
# Define your crew
researcher = Agent(role="Researcher", ...)
writer = Agent(role="Writer", ...)
task = Task(description="Research and write report", ...)
crew = Crew(agents=[researcher, writer], tasks=[task])
# Run with Oceum tracking
oceum_event("heartbeat", {})
oceum_event("taskStart", {"name": "research-report-q1"})
try:
    result = crew.kickoff()
    oceum_event("taskComplete", {
        "name": "research-report-q1",
        "meta": {"output_length": len(str(result))}
    })
except Exception as e:
    oceum_event("error", {"message": str(e)})
What happens next
Once your agent sends its first event, three things happen immediately:
- Fleet dashboard. The agent appears in your fleet with real-time status, liveness indicator, and reputation score.
- Uptime monitoring. Oceum's monitoring system begins tracking heartbeat intervals, task durations, and error rates. If the agent goes silent or starts failing, you get alerted.
- Activity log. Every event — heartbeats, task starts, completions, errors — shows up in the agent's activity log with full metadata and timestamps.
From there, you can configure autonomy tiers, set up workflow rules, and connect the agent to fleet-wide memory. But the foundation is this: one API key, one endpoint, and a few lines of code.
Five minutes. Any framework. Full visibility. That's the Oceum integration model. No proprietary agent format, no SDK requirement, no migration path. Your agent stays yours — Oceum just makes it observable, manageable, and ready for production.