This guide provides instructions for integrating LangSmith with the TrueFoundry AI Gateway to export OpenTelemetry traces.

What is LangSmith?

LangSmith is LangChain’s observability and evaluation platform for LLM applications. It provides comprehensive tracing, debugging, and monitoring capabilities to help teams build reliable AI applications.

Key Features of LangSmith

  • LLM Tracing: Capture detailed traces of all LLM interactions including inputs, outputs, latency, and token usage with automatic instrumentation
  • Evaluation & Testing: Run evaluations on your LLM outputs with custom evaluators, datasets, and automated testing pipelines
  • Prompt Hub: Version control and manage prompts with collaboration features and A/B testing capabilities

Prerequisites

Before integrating LangSmith with TrueFoundry, ensure you have:
  1. TrueFoundry Account: Create a TrueFoundry account and follow the instructions in our Gateway Quick Start Guide
  2. LangSmith Account: Sign up for a LangSmith account
  3. LangSmith API Key: Generate an API key from your LangSmith settings

Integration Steps

TrueFoundry AI Gateway supports exporting OpenTelemetry traces to LangSmith, allowing you to monitor and analyze your LLM requests in LangSmith’s observability platform.

Step 1: Get Your LangSmith API Key

  1. Log into your LangSmith dashboard
  2. Navigate to Settings → API Keys
  3. Create a new API key or copy an existing one

Step 2: Configure OTEL Export in TrueFoundry

  1. Navigate to the Configs tab in the AI Gateway section
  2. Click on OTEL Config
  3. Toggle on OTEL Traces Exporter Configuration
  4. Select HTTP Configuration
  5. Enter the LangSmith traces endpoint: https://api.smith.langchain.com/otel/v1/traces
  6. Set Encoding to Proto
[Screenshot: TrueFoundry OTEL Traces Exporter Configuration showing the HTTP configuration with the LangSmith endpoint]

Step 3: Configure Headers

Add the required header for LangSmith authentication:
  • x-api-key: your LangSmith API key
Click Save to apply your configuration.
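The gateway handles this export once the configuration is saved. For reference, the sketch below shows how the same endpoint, header, and Proto encoding look in a standalone OpenTelemetry setup in Python; it assumes the opentelemetry-sdk and opentelemetry-exporter-otlp-proto-http packages, and the API key value is a placeholder.

```python
# Standalone sketch of the same OTLP export that the gateway performs.
# Requires: pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

# Same endpoint, x-api-key header, and Proto encoding configured in steps 2 and 3.
exporter = OTLPSpanExporter(
    endpoint="https://api.smith.langchain.com/otel/v1/traces",
    headers={"x-api-key": "<your-langsmith-api-key>"},  # placeholder
)

provider = TracerProvider()
provider.add_span_processor(BatchSpanProcessor(exporter))
trace.set_tracer_provider(provider)

# Emit a single test span; it should appear in your LangSmith project.
tracer = trace.get_tracer("truefoundry-langsmith-check")
with tracer.start_as_current_span("otel-export-test"):
    pass

provider.shutdown()  # flush the batch processor before the script exits
```

Running a check like this outside the gateway can help confirm that the endpoint and API key are valid before troubleshooting the gateway configuration itself.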

Step 4: Verify the Integration

  1. Make a few requests through the TrueFoundry AI Gateway (see the example after this list)
  2. Navigate to the Monitor section in TrueFoundry to verify traces are being generated
  3. Log into your LangSmith dashboard and navigate to the Projects section
  4. Verify that traces from TrueFoundry are appearing in your project
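To generate test traffic for step 1, you can call any model through the gateway's OpenAI-compatible API. The snippet below is only a sketch: the base URL, API key, and model name are placeholders, so substitute the values shown for your gateway in the TrueFoundry UI.

```python
# Hypothetical test request through the TrueFoundry AI Gateway using the
# OpenAI Python SDK. All quoted values below are placeholders.
from openai import OpenAI

client = OpenAI(
    api_key="<your-truefoundry-api-key>",            # placeholder
    base_url="<your-truefoundry-gateway-base-url>",  # placeholder, copy from the gateway UI
)

response = client.chat.completions.create(
    model="<provider/model-id-from-your-gateway>",   # placeholder model identifier
    messages=[{"role": "user", "content": "Hello from the TrueFoundry AI Gateway!"}],
)
print(response.choices[0].message.content)
```

Each request routed through the gateway should produce a trace in the Monitor section and, once the exporter is configured, in your LangSmith project.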
[Screenshot: LangSmith dashboard showing traces exported from the TrueFoundry AI Gateway]

Configuration Options

LangSmith Endpoint

LangSmith uses a single endpoint for OTEL trace ingestion:
  • Traces Endpoint: https://api.smith.langchain.com/otel/v1/traces
  • Protocol: HTTP
  • Encoding: Proto or JSON
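
If an application exports OTLP traces to LangSmith directly rather than through the gateway, the same values map onto the standard OpenTelemetry exporter environment variables. A minimal sketch, assuming the standard OTEL SDK environment-variable conventions, with the API key as a placeholder:

```python
import os

# Standard OpenTelemetry exporter environment variables expressing the same
# configuration; OTLP-capable SDKs read these at startup.
os.environ["OTEL_EXPORTER_OTLP_TRACES_ENDPOINT"] = (
    "https://api.smith.langchain.com/otel/v1/traces"
)
os.environ["OTEL_EXPORTER_OTLP_TRACES_HEADERS"] = "x-api-key=<your-langsmith-api-key>"
os.environ["OTEL_EXPORTER_OTLP_TRACES_PROTOCOL"] = "http/protobuf"
```

Note that this applies only to applications doing their own OTLP export; the gateway's export is configured entirely through the UI steps above.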