# OpenInference

May 2, 2026

OpenInference is a set of conventions and plugins that is complementary to OpenTelemetry and enables tracing of AI applications. OpenInference is natively supported by arize-phoenix, but it can be used with any OpenTelemetry-compatible backend.

## Specification

The OpenInference specification is edited in markdown files found in the spec directory. It is designed to provide insight into the invocation of LLMs and the surrounding application context, such as retrieval from vector stores and the usage of external tools such as search engines or APIs. The specification is transport- and file-format-agnostic and is intended to be used in conjunction with other specifications and formats such as JSON, Protobuf, and DataFrames.
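As a rough illustration of the convention (not normative — the attribute names below follow the published semantic conventions, but the spec directory is the authoritative source), an LLM invocation is recorded as flat span attributes, with nested lists encoded by index:

```python
# Illustrative sketch: OpenInference encodes an LLM invocation as flat span
# attributes; list items (e.g. chat messages) are addressed by position.
def flatten_messages(messages, prefix="llm.input_messages"):
    """Flatten chat messages into OpenInference-style flat span attributes."""
    attributes = {}
    for i, message in enumerate(messages):
        for key, value in message.items():
            attributes[f"{prefix}.{i}.message.{key}"] = value
    return attributes

attributes = {
    "openinference.span.kind": "LLM",
    "llm.model_name": "gpt-4",
    **flatten_messages([
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What is OpenInference?"},
    ]),
}
# attributes now contains keys such as "llm.input_messages.1.message.role"
```

Because every value lives at a flat, predictable key, the same payload can be serialized to JSON, Protobuf, or a DataFrame row without structural changes.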

## Instrumentation

OpenInference provides a set of instrumentations for popular machine learning SDKs and frameworks in a variety of languages.

### Python

#### Libraries

| Package | Description | Version |
| --- | --- | --- |
| `openinference-semantic-conventions` | Semantic conventions for tracing of LLM Apps. | PyPI Version |
| `openinference-instrumentation` | Reusable utilities, decorators, configurations, and helpers for instrumentation. | PyPI Version |
| `openinference-instrumentation-agno` | OpenInference Instrumentation for Agno Agents. | PyPI Version |
| `openinference-instrumentation-openai` | OpenInference Instrumentation for OpenAI SDK. | PyPI Version |
| `openinference-instrumentation-openai-agents` | OpenInference Instrumentation for OpenAI Agents SDK. | PyPI Version |
| `openinference-instrumentation-claude-agent-sdk` | OpenInference Instrumentation for Claude Agent SDK. | PyPI Version |
| `openinference-instrumentation-llama-index` | OpenInference Instrumentation for LlamaIndex. | PyPI Version |
| `openinference-instrumentation-dspy` | OpenInference Instrumentation for DSPy. | PyPI Version |
| `openinference-instrumentation-bedrock` | OpenInference Instrumentation for AWS Bedrock. | PyPI Version |
| `openinference-instrumentation-langchain` | OpenInference Instrumentation for LangChain. | PyPI Version |
| `openinference-instrumentation-mcp` | OpenInference Instrumentation for MCP. | PyPI Version |
| `openinference-instrumentation-mistralai` | OpenInference Instrumentation for MistralAI. | PyPI Version |
| `openinference-instrumentation-portkey` | OpenInference Instrumentation for Portkey. | PyPI Version |
| `openinference-instrumentation-guardrails` | OpenInference Instrumentation for Guardrails. | PyPI Version |
| `openinference-instrumentation-vertexai` | OpenInference Instrumentation for VertexAI. | PyPI Version |
| `openinference-instrumentation-crewai` | OpenInference Instrumentation for CrewAI. | PyPI Version |
| `openinference-instrumentation-haystack` | OpenInference Instrumentation for Haystack. | PyPI Version |
| `openinference-instrumentation-litellm` | OpenInference Instrumentation for liteLLM. | PyPI Version |
| `openinference-instrumentation-groq` | OpenInference Instrumentation for Groq. | PyPI Version |
| `openinference-instrumentation-instructor` | OpenInference Instrumentation for Instructor. | PyPI Version |
| `openinference-instrumentation-anthropic` | OpenInference Instrumentation for Anthropic. | PyPI Version |
| `openinference-instrumentation-beeai` | OpenInference Instrumentation for BeeAI. | PyPI Version |
| `openinference-instrumentation-google-genai` | OpenInference Instrumentation for Google GenAI. | PyPI Version |
| `openinference-instrumentation-google-adk` | OpenInference Instrumentation for Google ADK. | PyPI Version |
| `openinference-instrumentation-autogen-agentchat` | OpenInference Instrumentation for Microsoft Autogen AgentChat. | PyPI Version |
| `openinference-instrumentation-pydantic-ai` | OpenInference Instrumentation for PydanticAI. | PyPI Version |
| `openinference-instrumentation-smolagents` | OpenInference Instrumentation for smolagents. | PyPI Version |
| `openinference-instrumentation-pipecat` | OpenInference Instrumentation for Pipecat. | PyPI Version |
| `openinference-instrumentation-agentspec` | OpenInference Instrumentation for Open Agent Specification. | PyPI Version |
| `openinference-instrumentation-strands-agents` | OpenInference Instrumentation for Strands Agents. | PyPI Version |
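The instrumentation packages share a common pattern: each exposes an instrumentor with an `instrument()` entry point that hooks the target SDK into an OpenTelemetry tracer provider. A minimal setup sketch for the OpenAI instrumentation, guarded so it degrades gracefully when `openinference-instrumentation-openai` and the OpenTelemetry SDK are not installed in the current environment:

```python
# Sketch: enable OpenInference tracing for the OpenAI SDK, exporting spans
# to the console. Assumes `openinference-instrumentation-openai` and
# `opentelemetry-sdk` are installed; the ImportError guard keeps this sketch
# runnable even when they are not.
try:
    from opentelemetry.sdk.trace import TracerProvider
    from opentelemetry.sdk.trace.export import (
        ConsoleSpanExporter,
        SimpleSpanProcessor,
    )
    from openinference.instrumentation.openai import OpenAIInstrumentor

    tracer_provider = TracerProvider()
    tracer_provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))
    # After this call, OpenAI SDK requests emit OpenInference spans.
    OpenAIInstrumentor().instrument(tracer_provider=tracer_provider)
    instrumented = True
except ImportError:
    instrumented = False  # packages not installed in this environment
```

To export to a collector instead of the console, swap the exporter for an OTLP exporter pointed at your backend; the instrumentor call is unchanged.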

#### Span Processors

Span processors that normalize spans emitted by other instrumentation libraries, converting their data into a unified OpenInference format.

| Package | Description | Version |
| --- | --- | --- |
| `openinference-instrumentation-openlit` | OpenInference Span Processor for OpenLIT traces. | PyPI Version |
| `openinference-instrumentation-openllmetry` | OpenInference Span Processor for OpenLLMetry (Traceloop) traces. | PyPI Version |

#### Examples

| Name | Description | Complexity Level |
| --- | --- | --- |
| Agno | Agno agent examples | Beginner |
| OpenAI SDK | OpenAI Python SDK, including chat completions and embeddings | Beginner |
| Claude Agent SDK | Claude Agent SDK | Beginner |
| MistralAI SDK | MistralAI Python SDK | Beginner |
| VertexAI SDK | VertexAI Python SDK | Beginner |
| LlamaIndex | LlamaIndex query engines | Beginner |
| DSPy | DSPy primitives and custom RAG modules | Beginner |
| Boto3 Bedrock Client | Boto3 Bedrock client | Beginner |
| LangChain | LangChain primitives and simple chains | Beginner |
| LiteLLM | A lightweight LiteLLM framework | Beginner |
| LiteLLM Proxy | LiteLLM Proxy to log OpenAI, Azure, Vertex, Bedrock | Beginner |
| Groq | Groq and AsyncGroq chat completions | Beginner |
| Anthropic | Anthropic Messages client | Beginner |
| BeeAI | Agentic instrumentation in the BeeAI framework | Beginner |
| Haystack | A Haystack QA RAG application | Intermediate |
| OpenAI Agents | OpenAI Agents with handoffs | Intermediate |
| Autogen AgentChat | Microsoft Autogen Assistant Agent and Team Chat | Intermediate |
| PydanticAI | PydanticAI agent examples | Intermediate |
| Pipecat | Pipecat application examples | Intermediate |

### JavaScript

#### Libraries

| Package | Description | Version |
| --- | --- | --- |
| `@arizeai/openinference-semantic-conventions` | Semantic conventions for tracing of LLM Apps. | NPM Version |
| `@arizeai/openinference-core` | Reusable utilities, configuration, and helpers for instrumentation. | NPM Version |
| `@arizeai/openinference-instrumentation-bedrock` | OpenInference Instrumentation for AWS Bedrock. | NPM Version |
| `@arizeai/openinference-instrumentation-bedrock-agent-runtime` | OpenInference Instrumentation for AWS Bedrock Agent Runtime. | NPM Version |
| `@arizeai/openinference-instrumentation-beeai` | OpenInference Instrumentation for BeeAI. | NPM Version |
| `@arizeai/openinference-instrumentation-langchain` | OpenInference Instrumentation for LangChain.js. | NPM Version |
| `@arizeai/openinference-instrumentation-mcp` | OpenInference Instrumentation for MCP. | NPM Version |
| `@arizeai/openinference-instrumentation-openai` | OpenInference Instrumentation for OpenAI SDK. | NPM Version |
| `@arizeai/openinference-instrumentation-anthropic` | OpenInference Instrumentation for the Anthropic SDK. | NPM Version |
| `@arizeai/openinference-instrumentation-claude-agent-sdk` | OpenInference Instrumentation for Claude Agent SDK. | NPM Version |
| `@arizeai/openinference-vercel` | OpenInference Support for Vercel AI SDK. | NPM Version |
| `@arizeai/openinference-tanstack-ai` | OpenInference middleware for TanStack AI. | NPM Version |
| `@arizeai/openinference-genai` | OpenInference Support for GenAI conventions. | NPM Version |

### Java

#### Libraries

| Package | Description | Version |
| --- | --- | --- |
| `openinference-semantic-conventions` | Semantic conventions for tracing of LLM Apps. | Maven Central Version |
| `openinference-instrumentation` | Base instrumentation utilities. | Maven Central Version |
| `openinference-instrumentation-langchain4j` | OpenInference Instrumentation for LangChain4j. | Maven Central Version |
| `openinference-instrumentation-springAI` | OpenInference Instrumentation for Spring AI. | Maven Central Version |
| `openinference-instrumentation-annotation` | Annotation-based manual tracing with ByteBuddy. | Maven Central Version |

#### Examples

| Name | Description | Complexity Level |
| --- | --- | --- |
| LangChain4j Example | Simple example using LangChain4j with OpenAI | Beginner |
| Spring AI Example | Spring AI example with OpenAI and tool calling | Beginner |
| Annotation Example | Annotation-based tracing with `@Chain`, `@LLM`, `@Tool`, `@Agent` | Beginner |
| Programmatic Example | Manual tracing with typed span classes (`LLMSpan`, `AgentSpan`, etc.) | Beginner |

## Supported Destinations

OpenInference supports a variety of observability backends as span collectors, including arize-phoenix natively and any OpenTelemetry-compatible collector.

## Community

Join our community to connect with thousands of machine learning practitioners and LLM observability enthusiasts!

  • 🌍 Join our Slack community.
  • πŸ’‘ Ask questions and provide feedback in the #phoenix-support channel.
  • 🌟 Leave a star on our GitHub.
  • 🐞 Report bugs with GitHub Issues.
  • 𝕏 Follow us on X.
  • πŸ—ΊοΈ Check out our roadmap to see where we're heading next.