# Welcome to Coalex AI Documentation
- **Quick Start (Python)**: Get started with Coalex AI in Python in 5 minutes
- **Quick Start (TypeScript)**: Get started with Coalex AI in TypeScript/Node.js
- **Dashboard Guide**: Learn how to use the Coalex AI dashboard
## What is Coalex AI?
Coalex AI is an LLM observability and evaluation platform that helps you:
- Monitor your AI agents in production with real-time tracing
- Track performance metrics including quality, cost, and sustainability
- Implement human-in-the-loop approval workflows for low-confidence outputs
- Optimize your AI applications with detailed insights
## Key Features
- Real-time observability for LLM calls
- Multi-LLM provider support (see compatibility table below)
- Human-in-the-loop workflows
- Performance and sustainability metrics
- Easy integration with OpenTelemetry (see the sketch after this list)
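Because `register()` hands back a tracer provider (the TypeScript quick example below calls `shutdown()` on it), it should compose with standard OpenTelemetry tooling. Here is a minimal sketch, assuming `register()` returns a regular OpenTelemetry SDK `TracerProvider`; the console exporter is purely illustrative:

```python
from opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleSpanProcessor

from coalex import register

# Assumption: register() returns a standard OpenTelemetry SDK TracerProvider,
# so the usual OTel APIs are available on top of Coalex's own export pipeline.
tracer_provider = register(agent_id="your-agent-id")

# Illustrative only: mirror spans to the console for local debugging.
tracer_provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))
```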
## Provider Compatibility
Coalex AI supports any LLM provider that has an OpenInference instrumentation package. Since Coalex uses OpenInference on the client side and accepts standard OpenTelemetry traces with GenAI semantic conventions, you automatically get support for all instrumented providers.
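The pattern looks roughly like the sketch below, which assumes only the `register()` call shown in the Quick Example further down. OpenAI is used purely as an illustration; any other OpenInference instrumentor slots in the same way:

```python
from coalex import register
from openinference.instrumentation.openai import OpenAIInstrumentor

# Register once, then hand the tracer provider to any OpenInference instrumentor.
tracer_provider = register(agent_id="your-agent-id")
OpenAIInstrumentor().instrument(tracer_provider=tracer_provider)

# From here on, calls made through the openai client library are traced
# automatically; switching providers means switching only the instrumentor.
```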
### Python SDK
Supports 31+ providers via OpenInference, including:
- OpenAI (GPT-4, GPT-3.5, etc.)
- Anthropic (Claude models)
- Google Vertex AI (PaLM, Gemini)
- Google GenAI (Gemini)
- AWS Bedrock (Claude, Titan, etc.)
- MistralAI (Mistral models)
- Groq (fast inference)
- LiteLLM (Multi-provider proxy)
- LangChain (All LangChain integrations)
- LlamaIndex (All LlamaIndex integrations)
- DSPy, CrewAI, Haystack, Instructor, Guardrails, and many more
👉 View complete Python provider list
### TypeScript SDK
Supports 10+ providers via OpenInference, including:
- OpenAI (GPT-4, GPT-3.5, etc.)
- Anthropic (Claude models)
- AWS Bedrock (Claude, Titan, etc.)
- LangChain.js (All LangChain.js integrations)
- Vercel AI SDK (Vercel AI integrations)
- Google Gemini (via custom instrumentation)
- And more
👉 View complete TypeScript provider list
Note: Token tracking, cost calculation, and sustainability metrics work automatically for every supported provider.
## Quick Example
**Python**

```python
from coalex import register, coalex_context
from openinference.instrumentation.vertexai import VertexAIInstrumentor
from vertexai.generative_models import GenerativeModel

# Step 1: Register with your agent ID from the dashboard
tracer_provider = register(agent_id="your-agent-id")

# Step 2: Instrument your LLM provider
VertexAIInstrumentor().instrument(tracer_provider=tracer_provider)

# Step 3: Use coalex_context for proper tracking
model = GenerativeModel("gemini-1.5-flash")  # any Vertex AI model; assumes vertexai.init() has run
with coalex_context(request_id="req_001", prompt_version="v1.0.0"):
    response = model.generate_content("Your prompt here")
```
**TypeScript**

```typescript
import { register, coalexContext } from '@coalex-ai/sdk';

// Step 1: Register with your agent ID from the dashboard
const tracerProvider = register({ agentId: 'your-agent-id' });

// Step 2: Use coalexContext for proper tracking
await coalexContext(
  { requestId: 'req_001', promptVersion: 'v1.0.0' },
  async () => {
    // Your LLM call here
    const response = await llm.complete('Your prompt here');
    return response;
  }
);

// Don't forget to shut down, so buffered telemetry is flushed
await tracerProvider.shutdown();
```
## Getting Started
Ready to start monitoring your AI agents? Choose your language:
- Python: Follow our Python Quickstart Guide
- TypeScript/Node.js: Follow our TypeScript Quickstart Guide
You'll learn how to:
- Create an agent in the Coalex AI dashboard
- Install the SDK
- Instrument your code
- View real-time traces
- Submit custom metrics
- Implement approval workflows