@lelemondev/sdk - v0.9.9

    Function observe

  • Wraps an LLM client so that every call it makes is automatically traced

      Type Parameters

      • T

      Parameters

      • client: T

        OpenAI, Anthropic, or Bedrock client instance

  • Optional options: ObserveOptions

        Optional context (sessionId, userId, etc.)

      Returns T

  The same client instance, wrapped, with its original type preserved

      import { observe } from '@lelemondev/sdk';
      import OpenAI from 'openai';

  const openai = observe(new OpenAI());

  // All calls are now automatically traced
  const response = await openai.chat.completions.create({
    model: 'gpt-4o-mini', // any model/params the client supports
    messages: [{ role: 'user', content: 'Hello' }],
  });
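      The second argument attaches request context to the traces. A minimal sketch, assuming `sessionId` and `userId` are top-level fields of `ObserveOptions` (the parameter description above lists them as examples; any other fields implied by "etc." are not shown):

      ```ts
      import { observe } from '@lelemondev/sdk';
      import OpenAI from 'openai';

      // Wrap the client with context so traces can be grouped
      // per session and per user in the tracing backend.
      const openai = observe(new OpenAI(), {
        sessionId: 'session-123', // hypothetical value for illustration
        userId: 'user-456',       // hypothetical value for illustration
      });

      // Subsequent calls carry this context automatically.
      const response = await openai.chat.completions.create({
        model: 'gpt-4o-mini',
        messages: [{ role: 'user', content: 'Hello' }],
      });
      ```

      Because `observe` returns the client with its original type `T`, the wrapped instance is a drop-in replacement: no call sites need to change.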