# OpenInference

OpenInference is a set of conventions and plugins that is complementary to OpenTelemetry, enabling the tracing of AI applications. OpenInference is natively supported by arize-phoenix, but can also be used with any OpenTelemetry-compatible backend.
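As a minimal setup sketch (assuming `openinference-instrumentation-openai` and `opentelemetry-sdk` are installed), wiring an OpenInference instrumentor into a standard OpenTelemetry tracer provider looks like this; spans are printed to the console here, but an OTLP exporter can send them to a backend instead:

```python
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleSpanProcessor

from openinference.instrumentation.openai import OpenAIInstrumentor

# Standard OpenTelemetry plumbing: a tracer provider with a span exporter.
provider = TracerProvider()
provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))
trace.set_tracer_provider(provider)

# After this call, OpenAI SDK calls emit spans that follow the
# OpenInference semantic conventions.
OpenAIInstrumentor().instrument()
```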

## Specification

The OpenInference specification is edited in markdown files found in the spec directory. It is designed to provide insight into the invocation of LLMs and the surrounding application context, such as retrieval from vector stores and the usage of external tools like search engines or APIs. The specification is transport and file-format agnostic and is intended to be used in conjunction with formats such as JSON, ProtoBuf, and DataFrames.
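Concretely, the conventions boil down to a flat set of span attributes. The sketch below assembles attributes for a hypothetical LLM span using a few names drawn from the semantic conventions (`openinference.span.kind`, `llm.model_name`, `input.value`, `output.value`); consult the spec directory for the authoritative list:

```python
def llm_span_attributes(model_name: str, prompt: str, completion: str) -> dict:
    """Build OpenInference-style attributes for an LLM span (illustrative)."""
    return {
        # Marks the span as an LLM invocation; other span kinds include
        # RETRIEVER and TOOL for vector-store lookups and tool calls.
        "openinference.span.kind": "LLM",
        "llm.model_name": model_name,
        "input.value": prompt,
        "output.value": completion,
    }

attrs = llm_span_attributes(
    "gpt-4", "What is OpenInference?", "A set of tracing conventions."
)
```

These attributes would typically be set on an OpenTelemetry span rather than kept in a plain dict; the dict form just makes the naming scheme visible.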

## Instrumentation

OpenInference provides a set of instrumentations for popular machine learning SDKs and frameworks in a variety of languages.

### Python

#### Libraries

| Package | Description |
|---------|-------------|
| `openinference-semantic-conventions` | Semantic conventions for tracing of LLM apps |
| `openinference-instrumentation-openai` | OpenInference instrumentation for the OpenAI SDK |
| `openinference-instrumentation-llama-index` | OpenInference instrumentation for LlamaIndex |
| `openinference-instrumentation-dspy` | OpenInference instrumentation for DSPy |
| `openinference-instrumentation-bedrock` | OpenInference instrumentation for AWS Bedrock |
| `openinference-instrumentation-langchain` | OpenInference instrumentation for LangChain |
| `openinference-instrumentation-mistralai` | OpenInference instrumentation for MistralAI |
| `openinference-instrumentation-guardrails` | OpenInference instrumentation for Guardrails |
| `openinference-instrumentation-vertexai` | OpenInference instrumentation for VertexAI |
| `openinference-instrumentation-crewai` | OpenInference instrumentation for CrewAI |
| `openinference-instrumentation-haystack` | OpenInference instrumentation for Haystack |
| `openinference-instrumentation-litellm` | OpenInference instrumentation for LiteLLM |
| `openinference-instrumentation-groq` | OpenInference instrumentation for Groq |
| `openinference-instrumentation-instructor` | OpenInference instrumentation for Instructor |
| `openinference-instrumentation-anthropic` | OpenInference instrumentation for Anthropic |

#### Examples

| Name | Description | Complexity Level |
|------|-------------|------------------|
| OpenAI SDK | OpenAI Python SDK, including chat completions and embeddings | Beginner |
| MistralAI SDK | MistralAI Python SDK | Beginner |
| VertexAI SDK | VertexAI Python SDK | Beginner |
| LlamaIndex | LlamaIndex query engines | Beginner |
| DSPy | DSPy primitives and custom RAG modules | Beginner |
| Boto3 Bedrock Client | Boto3 Bedrock client | Beginner |
| LangChain | LangChain primitives and simple chains | Beginner |
| LiteLLM | A lightweight LiteLLM framework | Beginner |
| LiteLLM Proxy | LiteLLM Proxy to log OpenAI, Azure, Vertex, Bedrock | Beginner |
| Groq | Groq and AsyncGroq chat completions | Beginner |
| Anthropic | Anthropic Messages client | Beginner |
| LlamaIndex + Next.js Chatbot | A fully functional chatbot using Next.js and a LlamaIndex FastAPI backend | Intermediate |
| LangServe | A LangChain application deployed with LangServe using custom metadata on a per-request basis | Intermediate |
| DSPy | A DSPy RAG application using FastAPI, Weaviate, and Cohere | Intermediate |
| Haystack | A Haystack QA RAG application | Intermediate |

### JavaScript

#### Libraries

| Package | Description |
|---------|-------------|
| `@arizeai/openinference-semantic-conventions` | Semantic conventions for tracing of LLM apps |
| `@arizeai/openinference-core` | Core utility functions for instrumentation |
| `@arizeai/openinference-instrumentation-openai` | OpenInference instrumentation for the OpenAI SDK |
| `@arizeai/openinference-instrumentation-langchain` | OpenInference instrumentation for LangChain.js |
| `@arizeai/openinference-vercel` | OpenInference support for the Vercel AI SDK |

#### Examples

| Name | Description | Complexity Level |
|------|-------------|------------------|
| OpenAI SDK | OpenAI Node.js client | Beginner |
| LlamaIndex Express App | A fully functional LlamaIndex chatbot with a Next.js frontend and a LlamaIndex Express backend, instrumented using `openinference-instrumentation-openai` | Intermediate |
| LangChain OpenAI | A simple script to call OpenAI via LangChain, instrumented using `openinference-instrumentation-langchain` | Beginner |
| LangChain RAG Express App | A fully functional LangChain chatbot that uses RAG to answer user questions. It has a Next.js frontend and a LangChain Express backend, instrumented using `openinference-instrumentation-langchain` | Intermediate |
| Next.js + OpenAI | A Next.js 13 project bootstrapped with create-next-app that uses OpenAI to generate text | Beginner |

## Supported Destinations

OpenInference supports the following destinations as span collectors.
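Because OpenInference spans are ordinary OpenTelemetry spans, sending them to a collector is a matter of pointing a standard OTLP exporter at the backend's endpoint. A sketch, assuming `opentelemetry-sdk` and `opentelemetry-exporter-otlp-proto-http` are installed and a collector (e.g. a local arize-phoenix instance) is listening at the illustrative URL below:

```python
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

# Batch spans and ship them over OTLP/HTTP to the collector endpoint.
provider = TracerProvider()
provider.add_span_processor(
    BatchSpanProcessor(OTLPSpanExporter(endpoint="http://localhost:6006/v1/traces"))
)
trace.set_tracer_provider(provider)
```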

## Community

Join our community to connect with thousands of machine learning practitioners and LLM observability enthusiasts!