Wrap an LLM client with automatic tracing
Parameters:
- client: an OpenAI, Anthropic, or Bedrock client instance
- context (optional): context attached to traces (sessionId, userId, etc.)

Returns: the wrapped client, with the same type as the input
import { observe } from '@lelemondev/sdk';
import OpenAI from 'openai';

const openai = observe(new OpenAI());

// All calls are now automatically traced
const response = await openai.chat.completions.create({...});
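Because the return type matches the input type, a wrapper like this is commonly built with a Proxy that intercepts method calls, records them, and forwards to the original client. The sketch below illustrates that general pattern only; `observeSketch` and `TraceContext` are hypothetical names, and the real SDK's internals may differ.

```typescript
// Illustrative sketch of Proxy-based call tracing (not the SDK's actual code).
type TraceContext = { sessionId?: string; userId?: string };

function observeSketch<T extends object>(client: T, context: TraceContext = {}): T {
  return new Proxy(client, {
    get(target, prop, receiver) {
      const value = Reflect.get(target, prop, receiver);
      if (typeof value === "function") {
        // Return a wrapper that logs the call, then forwards to the original method.
        return (...args: unknown[]) => {
          console.log(`[trace] ${String(prop)} called`, context);
          return (value as Function).apply(target, args);
        };
      }
      return value; // non-function properties pass through unchanged
    },
  });
}
```

Note that this naive sketch only intercepts top-level methods; tracing a nested call chain like `openai.chat.completions.create` requires recursively wrapping nested objects as well.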