## Installation

Install RDK using pip or uv:

```bash
pip install rdk --extra-index-url https://pypi.fury.io/021labs/
```
## Basic Setup

### 1. Initialize RDK

Initialize RDK at the start of your application:

```python
import os

from rdk import init

init(
    endpoint=os.environ["RDK_ENDPOINT"],
    api_key=os.environ["RDK_API_KEY"],
)
```

Store your API key in environment variables and never commit it to source control.
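For example, the two variables read above can be set in your shell before launching the app. The endpoint URL and key below are placeholders, not real values:

```bash
# Placeholder values -- substitute your real collector URL and API key
export RDK_ENDPOINT="https://collector.example.com/v1/traces"
export RDK_API_KEY="your-api-key-here"
```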
### 2. Add the `@observe` Decorator

Wrap your functions with `@observe` to create traces:

```python
from anthropic import Anthropic

from rdk import observe

@observe(name="chat-completion")
def chat(message: str) -> str:
    client = Anthropic()
    response = client.messages.create(
        model="claude-sonnet-4-6",
        max_tokens=1024,
        messages=[{"role": "user", "content": message}],
    )
    return response.content[0].text
```

The `name` parameter is optional; if omitted, the function name is used.
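RDK's internals aren't shown here, but the name-defaulting behavior described above is the standard decorator-factory pattern. The sketch below illustrates it; `observe_sketch` is a hypothetical stand-in, not RDK's actual code:

```python
import functools

def observe_sketch(name=None):
    """Hypothetical stand-in for @observe, showing only the name default."""
    def wrap(fn):
        # Fall back to the decorated function's own name when none is given.
        span_name = name or fn.__name__

        @functools.wraps(fn)
        def inner(*args, **kwargs):
            # A real tracer would open a span called span_name around this call.
            return fn(*args, **kwargs)

        inner.span_name = span_name
        return inner
    return wrap

@observe_sketch()  # no name given, so the span is named after the function
def chat(message: str) -> str:
    return message.upper()

print(chat.span_name)  # -> chat
```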
### 3. Shut Down Gracefully

Flush remaining traces before your app exits:

```python
from rdk import shutdown

# At app shutdown
shutdown()
```
## Complete Example

```python
import os

from anthropic import Anthropic

from rdk import init, observe, shutdown

# Initialize
init(
    endpoint=os.environ["RDK_ENDPOINT"],
    api_key=os.environ["RDK_API_KEY"],
)

@observe(name="summarize")
def summarize(text: str) -> str:
    client = Anthropic()
    response = client.messages.create(
        model="claude-sonnet-4-6",
        max_tokens=500,
        messages=[{
            "role": "user",
            "content": f"Summarize this text:\n\n{text}",
        }],
    )
    return response.content[0].text

# Use your function
result = summarize("Your long text here...")
print(result)

# Clean shutdown
shutdown()
```
## Configuration Options

| Parameter | Type | Default | Description |
|---|---|---|---|
| `endpoint` | `str` | required | URL of your trace collector |
| `api_key` | `str` | required | Your RDK API key |
| `batch_size` | `int` | `10` | Number of spans per batch |
| `flush_interval` | `float` | `5.0` | Seconds between auto-flushes |
| `timeout` | `float` | `30.0` | HTTP request timeout in seconds |
| `sample_rate` | `float` | `1.0` | Fraction of traces to capture (0–1) |
| `redact_pii` | `bool` | `False` | Enable built-in PII redaction |
| `redactor` | `Callable` | `None` | Custom redaction function |
| `debug` | `bool` | `False` | Enable verbose debug logging |
| `enabled` | `bool` | `True` | Set to `False` to disable all tracing |
| `test_mode` | `bool` | `None` | Use null transport + mock tools |
| `on_error` | `Callable` | `None` | Callback for transport errors |
| `instrument_langchain` | `bool` | `True` | Auto-instrument LangChain |
| `instrument_anthropic` | `bool` | `True` | Auto-instrument Anthropic SDK |
| `instrument_openai` | `bool` | `True` | Auto-instrument OpenAI SDK |
| `instrument_gemini` | `bool` | `True` | Auto-instrument Gemini SDK |

See `init()` for full documentation.
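Head-sampling with a `sample_rate` like the one above typically amounts to a per-trace coin flip. The sketch below shows that general behavior; `should_sample` is hypothetical, not part of RDK's API:

```python
import random

def should_sample(sample_rate: float) -> bool:
    # Keep a trace with probability sample_rate; 1.0 keeps everything.
    return random.random() < sample_rate

random.seed(0)  # deterministic for the demo
kept = sum(should_sample(0.25) for _ in range(10_000))
print(kept)  # roughly a quarter of 10,000 traces
```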
## Environment Variables

| Variable | Effect |
|---|---|
| `RDK_TEST_MODE=1` | Equivalent to `test_mode=True` |
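Flags like this are conventionally read as a string comparison against the process environment. A generic sketch of that convention (`env_flag` is illustrative, not RDK's internals, and which truthy spellings RDK accepts beyond `1` is an assumption):

```python
import os

def env_flag(name: str) -> bool:
    # Treat "1" (and common truthy spellings) as enabled; unset means disabled.
    return os.environ.get(name, "").strip().lower() in {"1", "true", "yes"}

os.environ["RDK_TEST_MODE"] = "1"
print(env_flag("RDK_TEST_MODE"))  # -> True
```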
## What Gets Traced?

When you use `@observe`, RDK automatically captures LLM calls within the function:

- **Token Usage** — Prompt, completion, and total tokens
- **Timing** — Start time, end time, and duration
- **Model Info** — Model name and provider
- **Errors** — Exception messages

Input messages and output content are not captured by default. To capture them, pass `capture_input=True` and `capture_output=True` to `@observe`, or enable PII-safe redaction first.
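Per the configuration table above, the `redactor` option accepts a callable. A custom redactor might look like the sketch below; the email pattern and the `redact_emails` name are illustrative, and the assumption that RDK passes plain strings through such a callable is mine, not documented here:

```python
import re

EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

def redact_emails(text: str) -> str:
    # Illustrative redactor: mask email addresses before content is exported.
    return EMAIL.sub("[EMAIL]", text)

print(redact_emails("Reply to alice@example.com by Friday"))
# -> Reply to [EMAIL] by Friday
```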
## Next Steps