Integrate in minutes, not weeks.
Chronicle exposes three integration surfaces: an enforcement endpoint, an OpenAI-compatible LLM gateway, and an MCP tool gateway. If your system makes HTTP calls, Chronicle can wrap it.
/enforce — the decision point
Every action your AI system is about to take should pass through /enforce first. Chronicle evaluates it against your policy pack and returns a verdict synchronously.
- Low latency verdict on the enforcement path
- Policy applied without model changes
- Evidence encrypted and emitted asynchronously
- Decision ID enables full forensic trace
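In client code, the allow/deny check reduces to a small gate in front of every action. A hypothetical Python helper is sketched below; only the ALLOWED verdict appears in the example response, so treating every other verdict as a block is a safe-side assumption, and the function name is illustrative:

```python
# Hypothetical gate around an /enforce response.
# Only "ALLOWED" is documented; anything else blocks the action.
def gate_action(enforce_response: dict) -> bool:
    """Return True only when Chronicle explicitly allowed the action."""
    verdict = enforce_response.get("verdict")
    if verdict == "ALLOWED":
        return True
    # Keep the decision_id: it is the handle for the forensic trace.
    decision_id = enforce_response.get("decision_id", "<unknown>")
    print(f"blocked by policy (decision {decision_id})")
    return False
```

Run the gate on the parsed /enforce response before executing the action itself, and log the decision_id either way so the forensic trace stays linkable.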
# Enforce a decision
curl -X POST http://localhost:8300/enforce \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer <jwt>" \
  -d '{
    "decision_id": "dec_01HXYZ...",
    "session_id": "ses_01HXYZ...",
    "actor_id": "agent:my-orchestrator",
    "action": "execute_sql_query",
    "subject": "production.customers",
    "context": {
      "query": "SELECT * FROM customers LIMIT 100"
    }
  }'
# Response
{
  "verdict": "ALLOWED",
  "decision_id": "dec_01HXYZ...",
  "replay_mode": "L2",
  "latency_ms": 1.4,
  "policy_trace": { "layer": "rules", "matched": "sql_read_allowed" }
}

# Drop-in LLM gateway (OpenAI-compatible)
import openai

client = openai.OpenAI(
    base_url="http://localhost:8300/v1",  # Chronicle LLM gateway
    api_key="<your-openai-key>"
)

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Summarize Q3 results"}],
    extra_headers={
        "X-Chronicle-Session": "ses_01HXYZ...",
        "X-Chronicle-Actor": "agent:analyst",
    }
)
# Chronicle captures pre/post-call evidence automatically

OpenAI-compatible. Zero refactoring.
Point your existing LLM client at Chronicle's gateway endpoint instead of the model provider. Chronicle transparently proxies requests, normalizes streaming SSE, and emits pre/post-call evidence.
- Works with any OpenAI-compatible client
- Pre/post-call evidence — full request + response captured
- Streaming SSE normalized and passed through intact
- Actor identity injected via session headers
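Streaming works through the same client. The sketch below factors the session headers into a hypothetical helper and wraps a streaming call (stream=True is standard OpenAI client usage; the header names mirror the example above, and the function names are assumptions):

```python
# Hypothetical helpers for streaming through Chronicle's gateway.
def chronicle_headers(session_id: str, actor_id: str) -> dict:
    """Build the session headers Chronicle uses to attribute evidence."""
    return {
        "X-Chronicle-Session": session_id,
        "X-Chronicle-Actor": actor_id,
    }


def stream_completion(client, prompt: str, session_id: str, actor_id: str):
    """Yield streamed tokens; Chronicle normalizes the SSE in transit."""
    stream = client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": prompt}],
        stream=True,
        extra_headers=chronicle_headers(session_id, actor_id),
    )
    for chunk in stream:
        delta = chunk.choices[0].delta.content
        if delta:
            yield delta
```

Pass in any OpenAI-compatible client constructed against the gateway base URL; nothing in the call path changes except where the request is sent.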
Docker-first deployment.
Chronicle ships as a Docker image. The full client stack — runtime, Vault, Redis, Postgres, MinIO — stands up with a single docker compose up.
- ghcr.io/stoe/chronicle-client — the enforcement runtime
- Vault AppRole bootstrapped from Docker secrets
- All secrets fetched from Vault at startup — no env var secrets in prod
- Convenience scripts: up / rotate / status
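The status script boils down to a readiness probe against the runtime port. A minimal sketch in Python follows; the /health path is an assumption, not a documented endpoint, so substitute the real status path:

```python
import urllib.request


def chronicle_ready(base_url: str = "http://localhost:8300",
                    timeout: float = 2.0) -> bool:
    """Probe the runtime; any connection failure counts as not ready."""
    try:
        # /health is a hypothetical endpoint name.
        with urllib.request.urlopen(f"{base_url}/health", timeout=timeout) as resp:
            return resp.status == 200
    except OSError:
        return False
```

The same probe fits a Compose healthcheck or a CI smoke test, since it exits cleanly on connection refusal instead of raising.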
# docker-compose.yml (minimal)
services:
  chronicle:
    image: ghcr.io/stoe/chronicle-client:latest
    ports:
      - "8300:8300"
    environment:
      VAULT_ADDR: http://vault:8200
      CHRONICLE_TENANT_ID: my-tenant
      CHRONICLE_API_URL: https://api.stoe-ai.com
      PROMPTSHIELD_RULE_PACK_PATH: /rules/v0_1_0.yaml
    volumes:
      - ./rules:/rules:ro
    depends_on: [vault, redis, postgres]

Coming Q2 2026
Full developer documentation.
Chronicle's full API reference, integration guides, policy pack specification, replay level documentation, and SDK references are in active development. Request early access to get notified when they ship.
Get notified