Signature
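No signature block survives in this section; the following stub is inferred from the Parameters and Returns descriptions below (the parameter name `client` is an assumption, not the SDK's published name):

```python
def instrument_baml(client):
    """Wrap a generated BAML client; returns a proxy-wrapped client with tracing enabled."""
    ...
```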
Overview
Unlike the Anthropic, OpenAI, LangChain, and Gemini clients, which are auto-instrumented when `init()` is called, BAML clients are code-generated per project and cannot be monkey-patched. You must explicitly wrap the client with `instrument_baml()` after calling `init()`.
Parameters
The BAML client instance to wrap. Typically imported as `b` from `baml_client`.
Returns
A proxy-wrapped client with tracing enabled. Assign the result back to your client variable.
Example
How It Works
`instrument_baml()` wraps the BAML client in a proxy that intercepts method calls. When a BAML function is called:
- The proxy detects the active trace (from `@observe` or `trace()`)
- A span is created with the function name and arguments
- The real BAML function executes normally
- The span is completed with the structured result and token usage
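The four steps above can be sketched as a minimal proxy. This is a hedged illustration, not the SDK's actual implementation; `TracingProxy` and the span dict shape are invented for this sketch:

```python
import functools


class TracingProxy:
    """Minimal sketch of a client-wrapping proxy (not the SDK's real class)."""

    def __init__(self, client):
        self._client = client
        self.spans = []  # completed spans, kept here for illustration

    def __getattr__(self, name):
        attr = getattr(self._client, name)
        if not callable(attr):
            return attr

        @functools.wraps(attr)
        def traced(*args, **kwargs):
            # Steps 1-2: attach to the active trace and open a span
            # with the function name and arguments.
            span = {"type": "LLM", "name": f"baml.{name}", "input.args": args}
            # Step 3: the real BAML function executes normally.
            result = attr(*args, **kwargs)
            # Step 4: complete the span with the structured result.
            span["output.result"] = result
            self.spans.append(span)
            return result

        return traced
```

A call like `proxy.ClassifyInquiry(...)` then runs the real method unchanged while recording a span around it.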
What Gets Captured
| Field | Description |
|---|---|
| `type` | LLM |
| `name` | BAML function name (e.g., `baml.ClassifyInquiry`) |
| `input.args` | Function arguments |
| `output.result` | Structured result |
| `token_usage` | Token counts |
| `metadata.provider` | `"baml"` |
See Also
- BAML Integration — Full setup guide
- @observe — Create a trace context

