OpenTelemetry-native observability for LLM applications with automatic instrumentation for all major AI providers
OpenLLMetry is a lightweight, open-source OpenTelemetry-based observability SDK for LLM applications. Developed by Traceloop, it provides automatic instrumentation for various LLM providers (OpenAI, Anthropic, Cohere, and more), vector databases, and frameworks with minimal configuration.
OpenLLMetry automatically captures key metrics like latency, token usage, costs, and full prompt/response pairs without requiring manual span creation or extensive code changes. It's designed to work seamlessly with the OpenTelemetry ecosystem, making it easy to integrate with existing observability platforms like Dash0.
This guide shows how to set up OpenLLMetry (the Traceloop SDK) to send traces from your LLM application to Dash0 via the OpenTelemetry Protocol (OTLP).
Install OpenLLMetry (Traceloop SDK) alongside your LLM provider SDK (anthropic, openai, cohere, etc.). For example: `pip install traceloop-sdk anthropic`. For other providers, install the matching package instead, such as `pip install traceloop-sdk openai`.
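Once installed, initialization is a single call made as early as possible in your application. A minimal sketch (the `app_name` value is an arbitrary example; without further configuration the SDK exports to its default backend):

```python
# Initialize OpenLLMetry before any LLM client is created,
# so the SDK can instrument the provider libraries.
from traceloop.sdk import Traceloop

Traceloop.init(app_name="my-llm-service")  # app_name becomes service.name in traces
```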
For production deployments, route traces through an OpenTelemetry Collector:
The collector then forwards traces to Dash0. For collector configuration, see the OpenTelemetry Collector documentation.
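One way to point the SDK at a local collector is the `api_endpoint` parameter. A sketch, assuming a collector with an OTLP/HTTP receiver listening on the default port 4318:

```python
from traceloop.sdk import Traceloop

# Export OTLP traces to a local OpenTelemetry Collector instead of a SaaS
# endpoint; the collector handles batching, retries, and forwarding to Dash0.
Traceloop.init(
    app_name="my-llm-service",
    api_endpoint="http://localhost:4318",  # collector's OTLP/HTTP receiver
)
```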
For simpler setups, send traces directly to Dash0:
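A sketch of direct export with placeholder values; the actual ingress endpoint and auth token come from your Dash0 organization settings:

```python
from traceloop.sdk import Traceloop

# Send traces straight to Dash0's OTLP endpoint, authenticating with a bearer token.
# Both values below are placeholders; replace them with your Dash0 settings.
Traceloop.init(
    app_name="my-llm-service",
    api_endpoint="https://ingress.<region>.dash0.com",  # placeholder endpoint
    headers={"Authorization": "Bearer <your-dash0-auth-token>"},  # placeholder token
)
```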
Once initialized, OpenLLMetry automatically instruments all supported LLM calls:
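For example, a plain OpenAI call needs no extra tracing code once `Traceloop.init()` has run. A sketch, assuming an `OPENAI_API_KEY` in the environment; the model name and prompt are illustrative:

```python
from openai import OpenAI
from traceloop.sdk import Traceloop

Traceloop.init(app_name="my-llm-service")

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# This call is traced automatically: OpenLLMetry records a span with latency,
# model, token usage, and (by default) the prompt and response content.
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Say hello"}],
)
print(response.choices[0].message.content)
```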
Control what data is captured in traces:
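OpenLLMetry reads the `TRACELOOP_TRACE_CONTENT` environment variable: setting it to `false` keeps prompts and completions out of span attributes while latency, model, and token metrics are still recorded. A minimal sketch:

```python
import os

# Must be set before Traceloop.init() runs, since the SDK reads it at startup.
# "false" disables capture of prompt/response content in spans.
os.environ["TRACELOOP_TRACE_CONTENT"] = "false"

# from traceloop.sdk import Traceloop
# Traceloop.init(app_name="my-llm-service")
```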
After running your application with OpenLLMetry:
- Traces appear in Dash0 with `service.name` matching your `app_name`
- LLM spans include `gen_ai.*` attributes

For the latest list of supported integrations, see the OpenLLMetry documentation.