Traceloop (OpenLLMetry) - Tracing LLMs with OpenTelemetry

Traceloop is a platform for monitoring and debugging the quality of your LLM outputs. It provides a way to track the performance of your LLM application, roll out changes with confidence, and debug issues in production. It is built on OpenTelemetry, so it can provide full visibility into your LLM requests, as well as vector DB usage and other infrastructure in your stack.

Getting Started​

First, sign up to get an API key on the Traceloop dashboard.

Then, install the Traceloop SDK:

pip install traceloop-sdk

Use just one line of code to instantly log your LLM responses:

litellm.success_callback = ["traceloop"]

When running your app, make sure to set the TRACELOOP_API_KEY environment variable to your API key.

To get better visualizations on how your code behaves, you may want to annotate specific parts of your LLM chain. See Traceloop docs on decorators for more information.

Exporting traces to other systems (e.g. Datadog, New Relic, and others)​

Since Traceloop SDK uses OpenTelemetry to send data, you can easily export your traces to other systems, such as Datadog, New Relic, and others. See Traceloop docs on exporters for more information.
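For example, the SDK can be pointed at any OTLP-compatible collector through environment variables (the endpoint URL and header value below are placeholders — check the Traceloop exporter docs for the exact settings your backend needs):

```shell
# Send traces to a different OTLP-compatible backend instead of Traceloop.
export TRACELOOP_BASE_URL="https://otlp-collector.example.com:4318"
# Authentication headers for that backend, if it requires them.
export TRACELOOP_HEADERS="x-api-key=<your-backend-key>"
```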

Support​

For any questions or issues with the integration, you can reach out to the Traceloop team on Slack or via email.