OpenInference is a set of conventions and plugins, complementary to OpenTelemetry, that
enables tracing of AI applications. OpenInference is natively supported
by arize-phoenix, but can be used with any OpenTelemetry-compatible backend as
well.
Specification
The OpenInference specification is edited in markdown files found in the spec directory. It is designed to
provide insight into the invocation of LLMs and the surrounding application context, such as retrieval from vector stores
and the use of external tools such as search engines or APIs. The specification is transport- and file-format-agnostic,
and is intended to be used in conjunction with formats and representations such as JSON, Protobuf, and DataFrames.
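To make the format-agnostic point concrete, here is a minimal sketch of a single LLM invocation rendered as an OpenInference-style span and serialized as JSON. Attribute names like `openinference.span.kind`, `llm.model_name`, `input.value`, and `output.value` follow the spec's semantic conventions, but the exact span shape and values here are illustrative, not normative.

```python
import json

# One LLM invocation as an OpenInference-style span (illustrative values).
# Attribute keys follow the spec's semantic conventions.
span = {
    "name": "ChatCompletion",
    "attributes": {
        "openinference.span.kind": "LLM",
        "llm.model_name": "gpt-4",
        "input.value": "What is OpenInference?",
        "output.value": "A set of tracing conventions for AI applications.",
    },
}

# Because the spec is file-format agnostic, the same span could equally be
# encoded as Protobuf or a DataFrame row; JSON is shown here.
payload = json.dumps(span)
print(payload)
```

The same attribute dictionary could be attached to an OpenTelemetry span and exported to any compatible backend.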
Instrumentation
OpenInference provides a set of instrumentations for popular machine learning SDKs and frameworks in a variety of
languages.
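Conceptually, each instrumentation wraps an SDK's entry points so that every call emits a span recording the inputs, outputs, and timing. The sketch below uses only the standard library to illustrate that pattern; the `Client` class and its `complete` method are hypothetical stand-ins for a real SDK, and the in-memory `SPANS` list stands in for an OpenTelemetry exporter.

```python
import functools
import time

SPANS = []  # stand-in for a span exporter

class Client:
    """Hypothetical SDK client, standing in for e.g. an LLM provider's client."""
    def complete(self, prompt):
        return f"echo: {prompt}"

def instrument(cls, method_name):
    """Replace cls.method_name with a wrapper that records a span-like dict."""
    original = getattr(cls, method_name)

    @functools.wraps(original)
    def wrapper(self, *args, **kwargs):
        start = time.time()
        result = original(self, *args, **kwargs)
        SPANS.append({
            "name": method_name,
            "openinference.span.kind": "LLM",
            "input.value": args[0] if args else kwargs.get("prompt"),
            "output.value": result,
            "duration_s": time.time() - start,
        })
        return result

    setattr(cls, method_name, wrapper)

instrument(Client, "complete")
print(Client().complete("hello"))  # the call behaves as before, but a span is recorded
```

Real instrumentations follow the same shape but patch the actual SDK and emit genuine OpenTelemetry spans rather than dictionaries.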
Examples
A fully functional LlamaIndex chatbot with a Next.js frontend and a LlamaIndex Express backend, instrumented using openinference-instrumentation-openai
A fully functional LangChain chatbot that uses RAG to answer user questions, with a Next.js frontend and a LangChain Express backend, instrumented using openinference-instrumentation-langchain