Experimental Feature: This integration is experimental and may change at any time. The OpenTelemetry GenAI spec itself is evolving rapidly, which may affect this integration. This service might not be available in all variations of SGP.
The OpenTelemetry integration lets you instrument your agents with the standard OpenTelemetry integrations for your framework. By passing two headers (your SGP API key and account ID), traces are automatically sent to SGP for observability. If you prefer native SDK integration instead, see Tracing SDK Initialization.

How It Works

This experimental forwarder service acts as a proxy between OpenTelemetry-instrumented applications and SGP’s tracing system. When you send OTLP traces to the SGP OpenTelemetry endpoint, the service automatically converts them into SGP’s internal tracing format. The conversion process maps OpenTelemetry’s GenAI semantic conventions to SGP metadata (such as model names, token counts, and provider information), auto-detects frameworks like Pydantic AI to extract input and output data, and infers operation types including completions, retrieval, reranking, and other operations. For detailed information on GenAI semantic conventions, refer to the OpenTelemetry GenAI attributes specification.
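The attribute mapping described above can be sketched as a simple translation table. This is an illustrative example only, not the forwarder's actual implementation: the OpenTelemetry attribute names follow the published GenAI semantic conventions, but the SGP-side field names here are hypothetical.

```python
# Sketch of how GenAI semantic-convention attributes on an OTLP span
# could be mapped to flat metadata fields (field names are illustrative).

def map_genai_attributes(span_attributes: dict) -> dict:
    """Translate GenAI semantic-convention attributes into flat metadata."""
    mapping = {
        "gen_ai.request.model": "model_name",
        "gen_ai.system": "provider",
        "gen_ai.operation.name": "operation_type",
        "gen_ai.usage.input_tokens": "input_tokens",
        "gen_ai.usage.output_tokens": "output_tokens",
    }
    # Keep only the attributes that are actually present on the span.
    return {
        sgp_key: span_attributes[otel_key]
        for otel_key, sgp_key in mapping.items()
        if otel_key in span_attributes
    }

attrs = {
    "gen_ai.request.model": "gpt-4o-mini",
    "gen_ai.system": "openai",
    "gen_ai.operation.name": "chat",
    "gen_ai.usage.input_tokens": 12,
    "gen_ai.usage.output_tokens": 34,
}
print(map_genai_attributes(attrs))
```

In practice the forwarder performs this kind of translation for you; the sketch only shows which span attributes carry the model, provider, operation, and token-count information.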

Prerequisites

To use the OpenTelemetry integration, you’ll need:
  1. SGP API credentials: Your API key and account ID
  2. OpenTelemetry SDK: Install the appropriate SDK for your language
  3. OTLP endpoint: https://opentelemetry.<sgp-url-placeholder>
Configuration:
export OTEL_EXPORTER_OTLP_ENDPOINT="https://opentelemetry.<sgp-url-placeholder>"
export OTEL_EXPORTER_OTLP_HEADERS="x-sgp-api-key=YOUR_API_KEY,x-sgp-account-id=YOUR_ACCOUNT_ID"
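The headers variable uses the standard OTLP exporter format: comma-separated key=value pairs. As a quick sanity check before starting your app, you can parse the value the same way exporters do and confirm both SGP headers are present (a minimal sketch, not part of any SDK):

```python
import os

def parse_otlp_headers(raw: str) -> dict:
    """Parse a comma-separated list of key=value pairs into a dict."""
    headers = {}
    for pair in raw.split(","):
        key, _, value = pair.partition("=")
        headers[key.strip()] = value.strip()
    return headers

raw = os.environ.get(
    "OTEL_EXPORTER_OTLP_HEADERS",
    "x-sgp-api-key=YOUR_API_KEY,x-sgp-account-id=YOUR_ACCOUNT_ID",
)
headers = parse_otlp_headers(raw)

# Both SGP headers must be set, or the forwarder cannot attribute traces.
missing = {"x-sgp-api-key", "x-sgp-account-id"} - headers.keys()
print("missing headers:", missing or "none")
```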

Usage Examples

These examples show how to use ecosystem integrations with SGP’s OpenTelemetry forwarder.

OpenAI with OpenLLMetry

OpenLLMetry provides automatic instrumentation for OpenAI and other LLM providers:
Python
import os
from openai import OpenAI
from traceloop.sdk import Traceloop

# Set SGP credentials
os.environ["OTEL_EXPORTER_OTLP_ENDPOINT"] = "https://opentelemetry.<sgp-url-placeholder>"
os.environ["OTEL_EXPORTER_OTLP_HEADERS"] = "x-sgp-api-key=YOUR_KEY,x-sgp-account-id=YOUR_ACCOUNT_ID"

# Initialize Traceloop - automatically instruments OpenAI
Traceloop.init(
    app_name="my-app",
    api_endpoint=os.environ["OTEL_EXPORTER_OTLP_ENDPOINT"],
    disable_batch=False,
)

# Use OpenAI normally - traces are automatically sent to SGP
client = OpenAI()
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "What is the capital of France?"}],
)
OpenLLMetry automatically captures model names, token usage, prompts, completions, function calls, and performance metrics.

OpenAI Agents SDK with Logfire

Logfire can instrument the OpenAI Agents SDK for comprehensive observability:
Python
import os
import logfire
from agents import Agent, Runner

# Set SGP credentials
os.environ["OTEL_EXPORTER_OTLP_ENDPOINT"] = "https://opentelemetry.<sgp-url-placeholder>"
os.environ["OTEL_EXPORTER_OTLP_HEADERS"] = "x-sgp-api-key=YOUR_KEY,x-sgp-account-id=YOUR_ACCOUNT_ID"

# Configure Logfire to use custom OTLP endpoint (respects OTEL env vars)
logfire.configure(send_to_logfire=False)
logfire.instrument_openai_agents()

# Create and run agent - traces are automatically sent to SGP
agent = Agent(name="assistant", instructions="You are a helpful assistant")
result = Runner.run_sync(agent, "Write a haiku about recursion in programming.")

Pydantic AI with Logfire

Pydantic AI integrates with Logfire for observability. You can redirect Logfire traces to SGP through the standard OpenTelemetry environment variables:
Python
import os
import logfire
from pydantic_ai import Agent

# Set SGP credentials
os.environ["OTEL_EXPORTER_OTLP_ENDPOINT"] = "https://opentelemetry.<sgp-url-placeholder>"
os.environ["OTEL_EXPORTER_OTLP_HEADERS"] = "x-sgp-api-key=YOUR_KEY,x-sgp-account-id=YOUR_ACCOUNT_ID"

# Configure Logfire to use custom OTLP endpoint (respects OTEL env vars)
logfire.configure(send_to_logfire=False)
logfire.instrument_pydantic_ai()

# Create and run agent - traces are automatically sent to SGP
agent = Agent("openai:gpt-4o-mini")
result = agent.run_sync("What is the capital of France?")

LangChain with LangSmith

LangChain’s observability can be configured to export traces via OpenTelemetry. Note that LangChain requires the full /v1/traces path in the endpoint URL:
Python
import os
from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate

# Enable LangSmith OpenTelemetry export to SGP
os.environ["LANGSMITH_OTEL_ENABLED"] = "true"
# Note: LangChain requires the full /v1/traces path in the endpoint
os.environ["OTEL_EXPORTER_OTLP_ENDPOINT"] = "https://opentelemetry.<sgp-url-placeholder>/v1/traces"
os.environ["OTEL_EXPORTER_OTLP_HEADERS"] = "x-sgp-api-key=YOUR_KEY,x-sgp-account-id=YOUR_ACCOUNT_ID"

# Use LangChain normally - traces are automatically sent to SGP
prompt = ChatPromptTemplate.from_template("Tell me a joke about {topic}")
model = ChatOpenAI(model="gpt-4o-mini")
chain = prompt | model
result = chain.invoke({"topic": "programming"})

Viewing Traces

After sending traces, view them in the SGP UI:
  1. Navigate to your application in the SGP dashboard
  2. Go to the Traces tab
  3. Filter and drill down into your OTLP-generated traces
Traces will appear with the operation types and metadata automatically extracted from your OTLP spans. For more information about SGP tracing features and concepts, see the Tracing Overview.