Setting up HoneyHive in your Python Package Manager

Learn how to properly include HoneyHive in your project’s pyproject.toml file using optional dependency groups for clean, targeted installations.

Overview

HoneyHive provides optional dependency groups that bundle the SDK with specific LLM provider instrumentors and SDKs. This approach offers:

  • 🎯 Targeted Dependencies: Only install what you need

  • 📦 Automatic Resolution: Correct versions guaranteed to work together

  • 🚀 Zero Configuration: Everything ready after installation

  • 🔄 Easy Switching: Change providers by updating dependency group

  • 📊 Clear Intent: Your pyproject.toml shows exactly which providers you use

Single Provider Integration

Most Common Pattern - Add one provider:

[project]
name = "my-llm-app"
version = "0.1.0"
dependencies = [
    "honeyhive[openinference-openai]",  # OpenAI + instrumentor + SDK
    "fastapi>=0.100.0",
    "uvicorn>=0.20.0"
]

Available Single Provider Options:

dependencies = [
    "honeyhive[openinference-openai]",        # OpenAI GPT models
    "honeyhive[openinference-anthropic]",     # Anthropic Claude models
    "honeyhive[openinference-google-ai]",     # Google Gemini models
    "honeyhive[openinference-bedrock]",   # AWS Bedrock multi-model
    "honeyhive[openinference-azure-openai]",  # Azure-hosted OpenAI
]

Multiple Provider Integration

Production Apps with Multiple Providers:

[project]
name = "my-multi-provider-app"
version = "1.0.0"
dependencies = [
    "honeyhive[openinference-openai,openinference-anthropic,openinference-google-ai]",  # Multiple providers
    "fastapi>=0.100.0",
    "pydantic>=2.0.0"
]

Popular Provider Combination:

dependencies = [
    "honeyhive[openinference-llm-providers]",  # OpenAI + Anthropic + Google (most popular)
]

Framework-Specific Integration

LangChain Applications:

[project]
name = "my-langchain-app"
dependencies = [
    "honeyhive[openinference-langchain]",     # LangChain + instrumentor
    "honeyhive[openai]",        # Add your LLM provider
    "chromadb>=0.4.0"
]

LlamaIndex RAG Applications:

[project]
name = "my-rag-app"
dependencies = [
    "honeyhive[llamaindex]",    # LlamaIndex + instrumentor
    "honeyhive[openai]",        # Add your LLM provider
    "pinecone-client>=2.0.0"
]

DSPy Programming Framework:

[project]
name = "my-dspy-app"
dependencies = [
    "honeyhive[dspy]",          # DSPy + instrumentor
    "honeyhive[openai]",        # Add your LLM provider
]
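The two honeyhive entries in these framework examples can also be collapsed into a single specifier, since pip merges all extras given on one requirement. Sketched here with the LangChain example (assuming both extras exist in your installed HoneyHive version):

```toml
[project]
name = "my-langchain-app"
dependencies = [
    "honeyhive[openinference-langchain,openinference-openai]",  # framework + provider in one line
    "chromadb>=0.4.0",
]
```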

All Integrations (Kitchen Sink)

Enterprise Apps with Comprehensive Provider Support:

[project]
name = "enterprise-llm-platform"
version = "2.0.0"
dependencies = [
    "honeyhive[all-integrations]",  # All providers + frameworks
    "fastapi>=0.100.0",
    "sqlalchemy>=2.0.0",
    "redis>=4.0.0"
]

Note: Only use all-integrations if you actually need multiple providers. For most apps, specific provider groups are better.

Tool-Specific Examples

requirements.txt (pip)

# Core app dependencies
honeyhive[openinference-openai,openinference-anthropic]>=1.0.0
fastapi>=0.100.0
uvicorn>=0.20.0

# Framework integration example
# honeyhive[openinference-langchain]>=1.0.0

# Multiple providers
# honeyhive[openinference-llm-providers]>=1.0.0

# Install from requirements.txt
pip install -r requirements.txt

# Or install directly
pip install "honeyhive[openinference-openai,openinference-anthropic]>=1.0.0"

uv

# Initialize new project with uv
uv init my-llm-app
cd my-llm-app

# Add HoneyHive with providers
uv add "honeyhive[openinference-openai]"
uv add "honeyhive[openinference-anthropic]"

# Or add multiple providers at once
uv add "honeyhive[openinference-openai,openinference-anthropic]"

# Add framework integration
uv add "honeyhive[openinference-langchain]"

# Run your application
uv run python main.py

# pyproject.toml (generated by uv)
[project]
name = "my-llm-app"
version = "0.1.0"
dependencies = [
    "honeyhive[openinference-openai,openinference-anthropic]>=1.0.0",
    "fastapi>=0.100.0",
]

Poetry

[tool.poetry.dependencies]
python = "^3.11"
honeyhive = {extras = ["openinference-openai", "openinference-anthropic"], version = "^1.0.0"}
fastapi = "^0.100.0"

pip-tools (requirements.in)

# Core app dependencies
honeyhive[openinference-openai,openinference-anthropic]>=1.0.0
fastapi>=0.100.0
uvicorn>=0.20.0

# Compile to requirements.txt
pip-compile requirements.in

# Install
pip-sync requirements.txt

Pipenv

[packages]
honeyhive = {extras = ["openinference-openai"], version = "*"}
fastapi = "*"

Hatch

[project]
dependencies = [
    "honeyhive[openinference-google-ai]",
]

[project.optional-dependencies]
dev = ["honeyhive[openinference-openai,openinference-anthropic]"]  # More providers for testing

Available Optional Dependencies

🤖 LLM Providers

Extra                        What's Included
openinference-openai         OpenAI SDK + OpenInference OpenAI instrumentor
openinference-anthropic      Anthropic SDK + OpenInference Anthropic instrumentor
openinference-google-ai      Google Generative AI SDK + OpenInference Google instrumentor
openinference-google-adk     Google Agent Development Kit + OpenInference ADK instrumentor
openinference-bedrock        Boto3 + OpenInference Bedrock instrumentor
openinference-azure-openai   OpenAI SDK + Azure Identity + OpenInference OpenAI instrumentor
openinference-mcp            OpenInference MCP instrumentor for Model Context Protocol

🔧 Framework Integrations

Extra                        What's Included
openinference-langchain      LangChain + OpenInference LangChain instrumentor
openinference-llamaindex     LlamaIndex + OpenInference LlamaIndex instrumentor
openinference-dspy           DSPy + OpenInference DSPy instrumentor

🌟 Additional Providers

Extra                        What's Included
openinference-cohere         Cohere SDK + OpenInference Cohere instrumentor
openinference-huggingface    Transformers + OpenInference HuggingFace instrumentor
openinference-mistralai      Mistral AI SDK + OpenInference Mistral instrumentor
openinference-groq           Groq SDK + OpenInference Groq instrumentor
openinference-ollama         Ollama SDK + OpenInference Ollama instrumentor
openinference-litellm        LiteLLM + OpenInference LiteLLM instrumentor

📦 Convenience Groups

Extra                        What's Included
openinference-llm-providers  OpenAI + Anthropic + Google AI (most popular providers)
all-integrations             All available instrumentors and SDKs

Best Practices

✅ Do This

# Good: Specific providers you actually use
dependencies = ["honeyhive[openai,anthropic]"]

# Good: Let users choose in a library
[project.optional-dependencies]
openai = ["honeyhive[openinference-openai]"]

❌ Avoid This

# Avoid: Installing everything when you only use OpenAI
dependencies = ["honeyhive[all-integrations]"]

# Avoid: Manual instrumentor management
dependencies = [
    "honeyhive",
    "openinference-instrumentation-openai",  # Use honeyhive[openinference-openai] instead
    "openai"
]

🎯 Choosing the Right Pattern

  • Application: Use specific provider extras like honeyhive[openinference-openai]

  • Library: Use optional dependencies to let users choose

  • Enterprise: Consider honeyhive[openinference-llm-providers] for popular providers

  • Testing: Use honeyhive[all-integrations] for comprehensive testing
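For the library case, the per-provider pattern can be spelled out like this (a sketch; my-llm-library and the extra names on its side are illustrative):

```toml
[project]
name = "my-llm-library"
dependencies = ["honeyhive"]  # core SDK only; users opt into providers

[project.optional-dependencies]
openai = ["honeyhive[openinference-openai]"]
anthropic = ["honeyhive[openinference-anthropic]"]
```

Users then install my-llm-library[openai] and pull in exactly the provider stack they need, nothing more.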

Migration from Manual Installation

Before (Manual):

dependencies = [
    "honeyhive",
    "openinference-instrumentation-openai",
    "openinference-instrumentation-anthropic",
    "openai",
    "anthropic"
]

After (Optional Dependencies):

dependencies = [
    "honeyhive[openai,anthropic]"  # Much cleaner!
]

Benefits of Migration:

  • Fewer Dependencies: One line instead of five

  • Version Compatibility: Guaranteed to work together

  • Easier Maintenance: Update one package instead of tracking multiple

  • Clearer Intent: Obvious which providers you use

Troubleshooting

Import Errors After Installation

Make sure you installed the right extra:

# If using OpenAI
pip install "honeyhive[openinference-openai]"

# If using multiple providers
pip install "honeyhive[openinference-openai,openinference-anthropic]"
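If imports still fail, a quick way to see which pieces actually landed in your environment is to probe for the modules each extra should provide. The module names below are assumptions based on the packages listed in the tables above:

```python
import importlib.util

def module_available(name: str) -> bool:
    """True if the module can be found without importing it."""
    try:
        return importlib.util.find_spec(name) is not None
    except ModuleNotFoundError:  # raised when a parent package is missing
        return False

# Modules expected from honeyhive[openinference-openai] (assumed names)
for mod in ("honeyhive", "openai", "openinference.instrumentation.openai"):
    status = "ok" if module_available(mod) else "MISSING"
    print(f"{mod}: {status}")
```

A MISSING line tells you which extra (or which manual package) is absent from the active environment.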

Version Conflicts

The optional dependencies are curated to avoid conflicts. If you see version conflicts:

  1. Use the optional dependency groups instead of manual installation

  2. Update to the latest HoneyHive version

  3. Check that you’re not manually specifying conflicting versions

Missing Provider Support

If a provider isn’t available as an optional dependency:

# Fall back to manual installation
pip install honeyhive
pip install openinference-instrumentation-<provider>
pip install <provider-sdk>

# Then file an issue to request the provider be added!
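With the packages installed manually, you also wire the instrumentor yourself. A hedged sketch for OpenAI, assuming the OpenInference instrumentor follows the usual Instrumentor().instrument() convention (check the instrumentor's own README for the exact call):

```python
# Manual wiring sketch -- the import path and API are assumptions based on
# the OpenInference instrumentation packages, not HoneyHive-specific code.
instrumented = False
try:
    from openinference.instrumentation.openai import OpenAIInstrumentor

    OpenAIInstrumentor().instrument()  # registers tracing for OpenAI calls
    instrumented = True
except ImportError:
    print("openinference-instrumentation-openai is not installed")

print(f"instrumented: {instrumented}")
```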

Next Steps