
OpenLIT

OpenLIT simplifies the development of generative AI applications and large language models (LLMs), provides comprehensive observability support, and reports the observability data to Guance.

Configuration

Before sending APM data to DataKit via OTEL, make sure the Collector is configured. In addition, add the `customer_tags` whitelist to the OpenTelemetry input's configuration file as follows:

[[inputs.opentelemetry]]
  ## customer_tags acts as a whitelist: only the listed attributes are sent to the data center as tags.
  ## Every "." in a key is replaced with "_", e.g. gen_ai.system becomes gen_ai_system.
  customer_tags = ["gen_ai.application_name","gen_ai.request.model","gen_ai.prompt","gen_ai.completion","gen_ai.request.temperature","gen_ai.usage.input_tokens","gen_ai.usage.output_tokens","gen_ai.usage.total_tokens","gen_ai.endpoint","gen_ai.system"]

  ...
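As the comment in the configuration notes, DataKit rewrites each `.` in an attribute key to `_` before storing the value as a tag. A quick sketch of the resulting tag names (plain Python, only to illustrate the renaming rule):

```python
# The whitelist from the configuration above
customer_tags = [
    "gen_ai.application_name", "gen_ai.request.model", "gen_ai.prompt",
    "gen_ai.completion", "gen_ai.request.temperature",
    "gen_ai.usage.input_tokens", "gen_ai.usage.output_tokens",
    "gen_ai.usage.total_tokens", "gen_ai.endpoint", "gen_ai.system",
]

# DataKit replaces every "." with "_" when turning attributes into tags
datakit_tag_names = [key.replace(".", "_") for key in customer_tags]
print(datakit_tag_names[0])   # gen_ai_application_name
print(datakit_tag_names[-1])  # gen_ai_system
```

These underscored names are what you will see on spans in Guance, so filter and group by `gen_ai_request_model`, not `gen_ai.request.model`.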

After making these adjustments, restart DataKit.

Install OpenLIT SDK

pip install openlit

Initialize OpenLIT in your application

import openlit

openlit.init(otlp_endpoint="http://127.0.0.1:9529/otel")
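The endpoint above is DataKit's default local HTTP listener (port 9529) plus its `/otel` route. If your DataKit listens on a different host or port, only those two parts change; a small sketch of assembling the endpoint (the host and port here are assumed defaults, substitute your own):

```python
# Assumed defaults: DataKit's HTTP listener on 127.0.0.1:9529, OTEL route at /otel
datakit_host = "127.0.0.1"
datakit_port = 9529
otlp_endpoint = f"http://{datakit_host}:{datakit_port}/otel"
print(otlp_endpoint)  # http://127.0.0.1:9529/otel
```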

Sample code for monitoring OpenAI usage:

from openai import OpenAI
import openlit

# Initialize OpenLIT, pointing it at the local DataKit OTLP endpoint
openlit.init(
    otlp_endpoint="http://127.0.0.1:9529/otel",
    application_name="openlit_demo"
)

client = OpenAI(
    api_key="YOUR_OPENAI_KEY"
)

# Any OpenAI call made after openlit.init() is traced automatically
chat_completion = client.chat.completions.create(
    messages=[
        {
            "role": "user",
            "content": "What is LLM observability?",
        }
    ],
    model="gpt-3.5-turbo",
)

print(chat_completion.choices[0].message.content)

