Installation

Install RDK using pip or uv:
pip install rdk --extra-index-url https://pypi.fury.io/021labs/
# or, with uv:
uv pip install rdk --extra-index-url https://pypi.fury.io/021labs/

Basic Setup

1. Set your API key

export RDK_API_KEY="your-api-key"

2. Initialize and trace

Call init() to start tracing — all LLM calls are captured automatically:
from rdk import init, shutdown
from anthropic import Anthropic

init()

client = Anthropic()
response = client.messages.create(
    model="claude-sonnet-4-6",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hello!"}]
)

shutdown()
Use @observe to group multiple calls into a single trace:
from rdk import observe
from anthropic import Anthropic

@observe(name="chat-completion")
def chat(message: str) -> str:
    client = Anthropic()
    response = client.messages.create(
        model="claude-sonnet-4-6",
        max_tokens=1024,
        messages=[{"role": "user", "content": message}]
    )
    return response.content[0].text
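Every LLM call made inside the decorated function lands in the same trace. As a sketch, the two-step draft-then-polish flow below is illustrative (the function and prompt wording are ours, not part of the RDK API), but both messages.create calls would appear under one trace:

```python
from rdk import observe
from anthropic import Anthropic

@observe(name="draft-and-polish")
def draft_and_polish(topic: str) -> str:
    client = Anthropic()
    # First call: produce a rough draft.
    draft = client.messages.create(
        model="claude-sonnet-4-6",
        max_tokens=1024,
        messages=[{"role": "user", "content": f"Write a short draft about {topic}."}],
    )
    # Second call: refine it. Both calls share the "draft-and-polish" trace.
    polished = client.messages.create(
        model="claude-sonnet-4-6",
        max_tokens=1024,
        messages=[{"role": "user", "content": f"Polish this draft:\n\n{draft.content[0].text}"}],
    )
    return polished.content[0].text
```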

3. Shutdown gracefully

Flush remaining traces before your app exits:
from rdk import shutdown

shutdown()
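If your app can exit from several code paths, one option is to register shutdown() with the standard library's atexit module so the final flush always runs; a minimal sketch:

```python
import atexit

from rdk import shutdown

# Flush any buffered traces when the interpreter exits, whatever the exit path.
atexit.register(shutdown)
```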

Complete Example

from rdk import observe, shutdown
from anthropic import Anthropic

@observe(name="summarize")
def summarize(text: str) -> str:
    client = Anthropic()
    response = client.messages.create(
        model="claude-sonnet-4-6",
        max_tokens=500,
        messages=[{"role": "user", "content": f"Summarize this text:\n\n{text}"}]
    )
    return response.content[0].text

result = summarize("Your long text here...")
print(result)

shutdown()

Custom Configuration

Need custom settings? Call init() before your first @observe call:
from rdk import init

init(
    sample_rate=0.1,  # Capture 10% of traces
    redact_pii=True,  # Remove emails, phones, SSNs, etc.
)
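The redactor option accepts a callable for custom redaction. A minimal sketch of one such callable, assuming it maps a string to a redacted string (the email regex and that signature are our assumptions, not from the RDK docs):

```python
import re

def mask_emails(text: str) -> str:
    # Replace anything that looks like an email address with a placeholder
    # before spans leave the process. Hypothetical helper, not part of RDK.
    return re.sub(r"[\w.+-]+@[\w-]+\.[\w.-]+", "[EMAIL]", text)

# Pass it to init() as the custom redactor:
# init(redact_pii=True, redactor=mask_emails)
```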

Configuration Options

Parameter              Type      Default                         Description
api_key                str       RDK_API_KEY env var             Your RDK API key
endpoint               str       https://collector.021labs.ai    Collector URL
batch_size             int       10                              Number of spans per batch
flush_interval         float     5.0                             Seconds between auto-flushes
timeout                float     30.0                            HTTP request timeout in seconds
sample_rate            float     1.0                             Fraction of traces to capture (0–1)
redact_pii             bool      False                           Enable built-in PII redaction
redactor               Callable  None                            Custom redaction function
debug                  bool      False                           Enable verbose debug logging
enabled                bool      True                            Set to False to disable all tracing
mode                   str       "default"                       Operating mode: "default", "test", or "eval"
on_error               Callable  None                            Callback for transport errors
instrument_langchain   bool      True                            Auto-instrument LangChain
instrument_anthropic   bool      True                            Auto-instrument Anthropic SDK
instrument_openai      bool      True                            Auto-instrument OpenAI SDK
instrument_gemini      bool      True                            Auto-instrument Gemini SDK
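For instance, on_error lets you observe transport failures instead of losing them silently. A sketch of a callback, assuming the callback receives the raised exception (that signature is our assumption; check the init() reference):

```python
import logging

def log_transport_error(exc: Exception) -> None:
    # Hypothetical on_error callback: log export failures rather than raising,
    # so tracing problems never take down the application itself.
    logging.getLogger("rdk").warning("trace export failed: %s", exc)

# init(on_error=log_transport_error)
```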
See init() for full documentation.

Environment Variables

Variable        Effect
RDK_API_KEY     API key for authentication
RDK_ENDPOINT    Override the default collector URL
RDK_MODE        Operating mode (default, test, eval)
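For example, to point the SDK at a self-hosted collector and run in test mode (the collector URL below is a placeholder, not a real endpoint):

```shell
export RDK_API_KEY="your-api-key"
export RDK_ENDPOINT="https://collector.internal.example.com"
export RDK_MODE="test"
```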

What Gets Traced?

RDK automatically captures all LLM calls:
  • Token Usage — Prompt, completion, and total tokens
  • Cost — Calculated automatically based on model pricing
  • Timing — Start time, end time, and duration
  • Model Info — Model name and provider
  • Errors — Exception messages
Input messages and output content are not captured by default. Enable them with capture_input=True and capture_output=True on @observe.
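As a sketch, assuming those flags are accepted per-decorator as described above, opting a single function into message capture looks like:

```python
from rdk import observe
from anthropic import Anthropic

# capture_input/capture_output opt only this function into message capture;
# other traced calls keep the default (messages not recorded).
@observe(name="chat-verbose", capture_input=True, capture_output=True)
def chat_verbose(message: str) -> str:
    client = Anthropic()
    response = client.messages.create(
        model="claude-sonnet-4-6",
        max_tokens=1024,
        messages=[{"role": "user", "content": message}],
    )
    return response.content[0].text
```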

Next Steps

  • Anthropic Integration: Set up Claude tracing
  • PII Redaction: Protect sensitive data
  • Testing: Write tests without real API calls
  • Synthetic Worlds: Simulate tool calls for fast iteration
  • API Reference: Full parameter reference