AI Observability and Evaluation

Phoenix is an open-source AI observability platform designed for experimentation, evaluation, and troubleshooting. It provides:

  • Tracing - Trace your LLM application's runtime using OpenTelemetry-based instrumentation.
  • Evaluation - Leverage LLMs to benchmark your application's performance using response and retrieval evals.
  • Datasets - Create versioned datasets of examples for experimentation, evaluation, and fine-tuning.
  • Experiments - Track and evaluate changes to prompts, LLMs, and retrieval.
  • Playground - Optimize prompts, compare models, adjust parameters, and replay traced LLM calls.
  • Prompt Management - Manage and test prompt changes systematically using version control, tagging, and experimentation.

Phoenix is vendor and language agnostic with out-of-the-box support for popular frameworks (OpenAI Agents SDK, LangGraph, Vercel AI SDK, Mastra, CrewAI, LlamaIndex, DSPy) and LLM providers (OpenAI, Anthropic, Google GenAI, Google ADK, AWS Bedrock, LiteLLM, and more). For details on auto-instrumentation, check out the OpenInference project.

Phoenix runs practically anywhere, including your local machine, a Jupyter notebook, a containerized deployment, or in the cloud.

Installation

Install Phoenix via pip or conda

pip install arize-phoenix

Phoenix container images are available via Docker Hub and can be deployed using Docker or Kubernetes. Arize AI also provides cloud instances at app.phoenix.arize.com.

Packages

The arize-phoenix package includes the entire Phoenix platform. If you already have a Phoenix deployment, lighter-weight Python sub-packages and TypeScript packages are available for use alongside it.

Python Subpackages

  • arize-phoenix-otel - Lightweight wrapper around OpenTelemetry primitives with Phoenix-aware defaults
  • arize-phoenix-client - Lightweight client for interacting with the Phoenix server via its OpenAPI REST interface
  • arize-phoenix-evals - Tooling to evaluate LLM applications, including RAG relevance, answer relevance, and more

TypeScript Subpackages

  • @arizeai/phoenix-otel - Lightweight wrapper around OpenTelemetry primitives with Phoenix-aware defaults
  • @arizeai/phoenix-client - Client for the Arize Phoenix API
  • @arizeai/phoenix-evals - TypeScript evaluation library for LLM applications (alpha release)
  • @arizeai/phoenix-mcp - MCP server implementation for Arize Phoenix, providing a unified interface to Phoenix's capabilities
  • @arizeai/phoenix-cli - CLI for fetching traces, datasets, and experiments for use with Claude Code, Cursor, and other coding agents

Tracing Integrations

Phoenix is built on top of OpenTelemetry and is vendor, language, and framework agnostic. For details about tracing integrations and example applications, see the OpenInference project.

Python Integrations

  • OpenAI - openinference-instrumentation-openai
  • OpenAI Agents - openinference-instrumentation-openai-agents
  • LlamaIndex - openinference-instrumentation-llama-index
  • DSPy - openinference-instrumentation-dspy
  • AWS Bedrock - openinference-instrumentation-bedrock
  • LangChain - openinference-instrumentation-langchain
  • MistralAI - openinference-instrumentation-mistralai
  • Google GenAI - openinference-instrumentation-google-genai
  • Google ADK - openinference-instrumentation-google-adk
  • Guardrails - openinference-instrumentation-guardrails
  • VertexAI - openinference-instrumentation-vertexai
  • CrewAI - openinference-instrumentation-crewai
  • Haystack - openinference-instrumentation-haystack
  • LiteLLM - openinference-instrumentation-litellm
  • Groq - openinference-instrumentation-groq
  • Instructor - openinference-instrumentation-instructor
  • Anthropic - openinference-instrumentation-anthropic
  • Smolagents - openinference-instrumentation-smolagents
  • Agno - openinference-instrumentation-agno
  • MCP - openinference-instrumentation-mcp
  • Pydantic AI - openinference-instrumentation-pydantic-ai
  • Autogen AgentChat - openinference-instrumentation-autogen-agentchat
  • Portkey - openinference-instrumentation-portkey
  • Agent Spec - openinference-instrumentation-agentspec

Span Processors

Span processors normalize trace data emitted by other instrumentation libraries so it can be ingested and unified by Phoenix.

  • openinference-instrumentation-openlit - OpenInference span processor for OpenLIT traces
  • openinference-instrumentation-openllmetry - OpenInference span processor for OpenLLMetry (Traceloop) traces

JavaScript Integrations

  • OpenAI - @arizeai/openinference-instrumentation-openai
  • LangChain.js - @arizeai/openinference-instrumentation-langchain
  • Vercel AI SDK - @arizeai/openinference-vercel
  • BeeAI - @arizeai/openinference-instrumentation-beeai
  • Claude Agent SDK - @arizeai/openinference-instrumentation-claude-agent-sdk
  • Mastra - @mastra/arize

Java Integrations

  • LangChain4j - openinference-instrumentation-langchain4j
  • SpringAI - openinference-instrumentation-springAI

Platforms

  • BeeAI - AI agent framework with built-in observability
  • Dify - Open-source LLM app development platform
  • Envoy AI Gateway - AI gateway built on Envoy Proxy for AI workloads
  • LangFlow - Visual framework for building multi-agent and RAG applications
  • LiteLLM Proxy - Proxy server for LLMs

Security & Privacy

We take data security and privacy very seriously. For more details, see our Security and Privacy documentation.

Telemetry

By default, Phoenix collects basic web analytics (e.g., page views, UI interactions) to help us understand how Phoenix is used and improve the product. None of your trace data, evaluation results, or any sensitive information is ever collected.

You can opt out of telemetry by setting the environment variable PHOENIX_TELEMETRY_ENABLED=false.
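When Phoenix is launched from Python rather than the shell, the same opt-out can be applied programmatically; the variable must be set before Phoenix starts.

```python
import os

# Disable Phoenix's built-in web analytics. This must run before the
# Phoenix server is imported or launched in this process.
os.environ["PHOENIX_TELEMETRY_ENABLED"] = "false"
```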

Community

Join our community to connect with thousands of AI builders.

Breaking Changes

See the migration guide for a list of breaking changes.

Copyright, Patent, and License

Copyright 2025 Arize AI, Inc. All Rights Reserved.

Portions of this code are patent protected by one or more U.S. Patents. See the IP_NOTICE.

This software is licensed under the terms of the Elastic License 2.0 (ELv2). See LICENSE.
