Documentation Index
Fetch the complete documentation index at: https://docs.steward.fi/llms.txt
Use this file to discover all available pages before exploring further.
Proxy API
The Steward Proxy runs on a separate port (default: 8080) and handles credential injection for outbound API requests. Agents make requests to the proxy, and Steward injects real credentials before forwarding.
Base URL
```
http://steward-proxy:8080
```
Typically accessed from within the Docker network. Not exposed to the public internet.
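As a purely illustrative sketch of that setup (service and image names are assumptions, not Steward defaults), a Compose file might place the agent and proxy on the same network without publishing the proxy port to the host:

```yaml
# Hypothetical docker-compose sketch. The agent reaches the proxy by service
# name ("steward-proxy") on the shared default network; "expose" keeps port
# 8080 reachable inside the network only, never on the public internet.
services:
  steward-proxy:
    image: steward/proxy   # illustrative image name
    expose:
      - "8080"
  agent:
    image: my-agent        # your agent container
    environment:
      STEWARD_PROXY_URL: http://steward-proxy:8080
```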
Authentication
All proxy requests require an agent JWT:
```
Authorization: Bearer stwd_agent_jwt_...
```
Making Requests
Via Named Alias
Named aliases provide a clean URL pattern:
```shell
# OpenAI
curl -X POST http://steward-proxy:8080/openai/v1/chat/completions \
  -H "Authorization: Bearer agent-jwt" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Hello"}]
  }'

# Anthropic
curl -X POST http://steward-proxy:8080/anthropic/v1/messages \
  -H "Authorization: Bearer agent-jwt" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "claude-sonnet-4-20250514",
    "max_tokens": 1024,
    "messages": [{"role": "user", "content": "Hello"}]
  }'
```
The proxy:

- Strips the agent's `Authorization` header
- Resolves `openai` → `api.openai.com`
- Finds the matching route for `api.openai.com/*`
- Decrypts and injects the real credential
- Forwards the request to `https://api.openai.com/v1/chat/completions`
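The steps above can be sketched in TypeScript. This is an illustrative model only, with invented names and an in-memory credential map; the real proxy's internals (route matching, decryption, policy checks) are not shown here:

```typescript
// Illustrative sketch of the forwarding steps: strip the agent JWT,
// resolve the alias to a host, inject the real credential, build the
// upstream URL. Returns null when no route exists (the proxy would 403).
function buildForwardRequest(
  incoming: { path: string; headers: Record<string, string> },
  aliases: Record<string, string>,      // alias → host
  credentials: Record<string, string>,  // host → real (decrypted) API key
): { url: string; headers: Record<string, string> } | null {
  const [, alias, ...rest] = incoming.path.split("/");
  const host = aliases[alias];
  if (!host || !(host in credentials)) return null; // no route configured

  const headers = { ...incoming.headers };
  delete headers["Authorization"];                          // strip agent JWT
  headers["Authorization"] = `Bearer ${credentials[host]}`; // inject real credential

  return { url: `https://${host}/${rest.join("/")}`, headers };
}
```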
Via Direct Host
For APIs without a named alias:
```shell
curl "http://steward-proxy:8080/proxy/public-api.birdeye.so/defi/price?address=So11..." \
  -H "Authorization: Bearer agent-jwt"
```

Note that the URL is quoted so the shell does not interpret the `?` in the query string.
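The same direct-host pattern from code comes down to building the `/proxy/:host/*` URL. A small helper (the function name is this example's invention) makes that explicit:

```typescript
// Builds a direct-host proxy URL of the form {proxyBase}/proxy/{host}{pathAndQuery}.
// Fetch the result with the agent JWT in the Authorization header, as usual.
function directHostUrl(proxyBase: string, host: string, pathAndQuery: string): string {
  return `${proxyBase}/proxy/${host}${pathAndQuery}`;
}

const url = directHostUrl(
  "http://steward-proxy:8080",
  "public-api.birdeye.so",
  "/defi/price?address=So11...",
);
// then: await fetch(url, { headers: { Authorization: `Bearer ${agentJwt}` } });
```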
SDK Integration
Most LLM SDKs support custom base URLs, which makes proxy integration straightforward:
```typescript
import OpenAI from "openai";

const openai = new OpenAI({
  // The SDK sends this as "Authorization: Bearer ..."; the proxy strips it
  // and injects the real OpenAI key before forwarding.
  apiKey: process.env.STEWARD_AGENT_TOKEN,
  baseURL: `${process.env.STEWARD_PROXY_URL}/openai/v1`,
});

const response = await openai.chat.completions.create({
  model: "gpt-4o",
  messages: [{ role: "user", content: "Hello" }],
});
```
The Anthropic SDK sends its `apiKey` in the `x-api-key` header rather than `Authorization`, so the agent JWT is supplied via `defaultHeaders`:

```typescript
import Anthropic from "@anthropic-ai/sdk";

const anthropic = new Anthropic({
  apiKey: "steward", // dummy value; the proxy injects the real key
  baseURL: `${process.env.STEWARD_PROXY_URL}/anthropic`,
  defaultHeaders: {
    Authorization: `Bearer ${process.env.STEWARD_AGENT_TOKEN}`, // agent JWT for the proxy
  },
});

const response = await anthropic.messages.create({
  model: "claude-sonnet-4-20250514",
  max_tokens: 1024,
  messages: [{ role: "user", content: "Hello" }],
});
```
Or call the proxy directly with `fetch`:

```typescript
const response = await fetch(
  `${process.env.STEWARD_PROXY_URL}/openai/v1/chat/completions`,
  {
    method: "POST",
    headers: {
      "Authorization": `Bearer ${process.env.STEWARD_AGENT_TOKEN}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model: "gpt-4o",
      messages: [{ role: "user", content: "Hello" }],
    }),
  }
);
```
Proxy Endpoints
```
ANY /:alias/*       → Proxied via named alias (e.g., /openai/v1/...)
ANY /proxy/:host/*  → Proxied via direct host (e.g., /proxy/api.example.com/...)
```
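A minimal sketch of how a client (or a test) can classify a path into these two shapes, assuming `/proxy/:host/*` takes precedence over `/:alias/*` (this precedence is an assumption based on the endpoint table, not confirmed Steward behavior):

```typescript
// Classifies a proxy request path into its direct-host or named-alias form.
type Route =
  | { kind: "direct"; host: string; path: string }
  | { kind: "alias"; alias: string; path: string };

function classify(path: string): Route {
  const parts = path.split("/").filter(Boolean);
  if (parts[0] === "proxy") {
    // /proxy/:host/* → direct host
    return { kind: "direct", host: parts[1], path: "/" + parts.slice(2).join("/") };
  }
  // /:alias/* → named alias
  return { kind: "alias", alias: parts[0], path: "/" + parts.slice(1).join("/") };
}
```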
Error Responses
| Status | Meaning |
|---|---|
| 401 | Invalid or missing agent JWT |
| 403 | No route configured for this host/path, or policy denied |
| 502 | Upstream API error |
| 504 | Upstream API timeout |
Error responses share a common JSON body:

```json
{
  "ok": false,
  "error": "No credential route configured for this endpoint"
}
```
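Client-side handling of these statuses can be sketched as follows. Treating 502/504 as retryable is this example's assumption, not Steward guidance; 401/403 indicate configuration or policy problems that retries will not fix:

```typescript
// Interprets a proxy failure: parses the { ok, error } body shape shown
// above into a human-readable message and a retry hint.
interface ProxyErrorBody {
  ok: false;
  error: string;
}

function describeFailure(
  status: number,
  body: ProxyErrorBody,
): { retryable: boolean; message: string } {
  return {
    // 502/504 are upstream errors/timeouts and may be transient.
    retryable: status === 502 || status === 504,
    message: `Steward proxy ${status}: ${body.error}`,
  };
}
```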
Configuring Aliases
Aliases are configured per-tenant via route definitions. When you create a route for `api.openai.com`, the alias `openai` is automatically available.
Default alias mappings:
| Alias | Host |
|---|---|
| openai | api.openai.com |
| anthropic | api.anthropic.com |
| birdeye | public-api.birdeye.so |
Custom aliases can be configured by the platform operator.