RDK Python SDK

RDK (Reinforcement Development Kit) is a lightweight Python library for tracing LLM applications. It provides automatic instrumentation for popular LLM providers and frameworks, giving you complete visibility into your AI application’s behavior.

Why RDK?

Building LLM applications requires understanding what’s happening under the hood. RDK helps you:
  • Debug faster - See exactly what prompts are sent and responses received
  • Monitor costs - Track token usage across all your LLM calls
  • Optimize performance - Identify slow calls and bottlenecks
  • Ensure compliance - Automatically redact PII from traces

Key Features

Auto-instrumentation

Automatic tracing for Anthropic, OpenAI, LangChain, and Gemini SDKs

PII Redaction

Built-in redaction for emails, phone numbers, SSNs, API keys, and custom patterns
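
For example, custom patterns can be supplied at initialization. The sketch below is illustrative only: the redact_patterns keyword is an assumed name, not confirmed RDK API.

from rdk import init

# Sketch only — redact_patterns is an assumed parameter name; check the RDK reference
init(
    endpoint="https://collector.021labs.ai",
    api_key="your-api-key",
    redact_patterns=[r"\bEMP-\d{6}\b"],  # hypothetical custom employee-ID pattern
)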

Minimal Overhead

Async batching with configurable flush intervals
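
Spans are buffered and exported in the background. The sketch below shows how batching might be tuned; flush_interval and max_batch_size are assumed parameter names, not confirmed RDK API.

from rdk import init

# Sketch only — the tuning knobs below are illustrative names
init(
    endpoint="https://collector.021labs.ai",
    api_key="your-api-key",
    flush_interval=2.0,   # seconds between background exports (assumed)
    max_batch_size=100,   # spans per export request (assumed)
)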

Framework Agnostic

Works with FastAPI, Flask, or any Python application
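
For instance, the same init() and @observe entry points shown in the Quick Example work inside a web framework. A sketch assuming a FastAPI app:

from fastapi import FastAPI
from rdk import init, observe

init(endpoint="https://collector.021labs.ai", api_key="your-api-key")

app = FastAPI()

@app.get("/answer")
@observe(name="answer-endpoint")  # each request becomes a traced span
def answer(q: str) -> dict:
    return {"answer": f"You asked: {q}"}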

Supported Integrations

Provider      | Auto-instrumentation | Notes
------------- | -------------------- | ---------------------------------------------
Anthropic     | Yes                  | Auto-instrumented at init()
OpenAI        | Yes                  | Auto-instrumented at init()
LangChain     | Yes                  | Auto-instrumented at init()
Google Gemini | Yes                  | Auto-instrumented at init()
BAML          | Manual               | Requires b = instrument_baml(b) after init()
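
BAML is the one integration wired up by hand. A sketch of the pattern from the table above; the baml_client import path, and the assumption that instrument_baml is exported from rdk, are not confirmed.

from rdk import init, instrument_baml
from baml_client import b  # assumed import path for your generated BAML client

init(
    endpoint="https://collector.021labs.ai",
    api_key="your-api-key",
)

# Wrap the generated client after init(), as noted in the table above
b = instrument_baml(b)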

Quick Example

from rdk import init, observe
from anthropic import Anthropic

# Initialize RDK
init(
    endpoint="https://collector.021labs.ai",
    api_key="your-api-key"
)

# All LLM calls are automatically traced
@observe(name="my-chat")
def chat(message: str) -> str:
    client = Anthropic()
    response = client.messages.create(
        model="claude-sonnet-4-6",
        max_tokens=1024,
        messages=[{"role": "user", "content": message}]
    )
    return response.content[0].text

result = chat("Hello, world!")

Next Steps