
Installation

Install RDK using pip or uv:
pip install rdk --extra-index-url https://pypi.fury.io/021labs/
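If you prefer uv, the same package index works through its pip interface:
uv pip install rdk --extra-index-url https://pypi.fury.io/021labs/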

Basic Setup

1. Initialize RDK

Initialize RDK at the start of your application:
import os
from rdk import init

init(
    endpoint=os.environ["RDK_ENDPOINT"],
    api_key=os.environ["RDK_API_KEY"],
)
Store your API key in environment variables and never commit it to source control.
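For local development, you can export both values in your shell (the endpoint and key below are placeholders):
export RDK_ENDPOINT="https://your-collector.example.com"
export RDK_API_KEY="your-api-key"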

2. Add the @observe Decorator

Wrap your functions with @observe to create traces:
from rdk import observe
from anthropic import Anthropic

@observe(name="chat-completion")
def chat(message: str) -> str:
    client = Anthropic()
    response = client.messages.create(
        model="claude-sonnet-4-6",
        max_tokens=1024,
        messages=[{"role": "user", "content": message}]
    )
    return response.content[0].text
The name parameter is optional — if omitted, the function name is used.

3. Shutdown Gracefully

Flush remaining traces before your app exits:
from rdk import shutdown

# At app shutdown
shutdown()
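If your application has no single shutdown path, one option is to register the flush with Python's standard atexit module so it runs when the interpreter exits:
import atexit

from rdk import shutdown

# Flush any buffered traces when the interpreter exits.
atexit.register(shutdown)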

Complete Example

import os
from rdk import init, observe, shutdown
from anthropic import Anthropic

# Initialize
init(
    endpoint=os.environ["RDK_ENDPOINT"],
    api_key=os.environ["RDK_API_KEY"],
)

@observe(name="summarize")
def summarize(text: str) -> str:
    client = Anthropic()
    response = client.messages.create(
        model="claude-sonnet-4-6",
        max_tokens=500,
        messages=[{
            "role": "user",
            "content": f"Summarize this text:\n\n{text}"
        }]
    )
    return response.content[0].text

# Use your function
result = summarize("Your long text here...")
print(result)

# Clean shutdown
shutdown()

Configuration Options

| Parameter | Type | Default | Description |
|---|---|---|---|
| endpoint | str | required | URL of your trace collector |
| api_key | str | required | Your RDK API key |
| batch_size | int | 10 | Number of spans per batch |
| flush_interval | float | 5.0 | Seconds between auto-flushes |
| timeout | float | 30.0 | HTTP request timeout in seconds |
| sample_rate | float | 1.0 | Fraction of traces to capture (0–1) |
| redact_pii | bool | False | Enable built-in PII redaction |
| redactor | Callable | None | Custom redaction function |
| debug | bool | False | Enable verbose debug logging |
| enabled | bool | True | Set to False to disable all tracing |
| test_mode | bool | None | Use null transport + mock tools |
| on_error | Callable | None | Callback for transport errors |
| instrument_langchain | bool | True | Auto-instrument LangChain |
| instrument_anthropic | bool | True | Auto-instrument Anthropic SDK |
| instrument_openai | bool | True | Auto-instrument OpenAI SDK |
| instrument_gemini | bool | True | Auto-instrument Gemini SDK |
See init() for full documentation.
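For example, a configuration that adjusts batching, sampling, and redaction might look like this (the values are illustrative, not recommendations):
import os
from rdk import init

init(
    endpoint=os.environ["RDK_ENDPOINT"],
    api_key=os.environ["RDK_API_KEY"],
    batch_size=50,            # send spans in larger batches
    flush_interval=2.0,       # flush buffered spans every 2 seconds
    sample_rate=0.25,         # capture 25% of traces
    redact_pii=True,          # enable built-in PII redaction
    instrument_openai=False,  # skip OpenAI auto-instrumentation
)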

Environment Variables

| Variable | Effect |
|---|---|
| RDK_TEST_MODE=1 | Equivalent to test_mode=True |
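For example, you can enable test mode for a single test run without changing code (assuming a pytest-based suite):
RDK_TEST_MODE=1 pytest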

What Gets Traced?

When you use @observe, RDK automatically captures LLM calls within the function:
  • Token Usage — Prompt, completion, and total tokens
  • Timing — Start time, end time, and duration
  • Model Info — Model name and provider
  • Errors — Exception messages
Input messages and output content are not captured by default. To capture them, use capture_input=True and capture_output=True on @observe, or enable PII-safe redaction first.
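For example, to opt in for a single function (pair this with redact_pii=True or a custom redactor in init() if the payloads may contain sensitive data):
from rdk import observe

@observe(name="chat-completion", capture_input=True, capture_output=True)
def chat(message: str) -> str:
    ...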

Next Steps