Setting up HoneyHive in your Python Package Manager
Learn how to properly include HoneyHive in your project’s pyproject.toml file using optional dependency groups for clean, targeted installations.
Overview
HoneyHive provides optional dependency groups that bundle the SDK with specific LLM provider instrumentors and SDKs. This approach offers:
🎯 Targeted Dependencies: Only install what you need
📦 Automatic Resolution: Correct versions guaranteed to work together
🚀 Zero Configuration: Everything ready after installation
🔄 Easy Switching: Change providers by updating dependency group
📊 Clear Intent: Your pyproject.toml shows exactly which providers you use
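One way to sanity-check the "zero configuration" claim after installing an extra is to list the versions that actually resolved. This is a minimal stdlib-only sketch; the distribution names below are illustrative of what honeyhive[openinference-openai] is expected to pull in, so check your lockfile for the authoritative set:

```python
from importlib import metadata

def installed_versions(packages):
    """Map each distribution name to its installed version, or None if absent."""
    versions = {}
    for pkg in packages:
        try:
            versions[pkg] = metadata.version(pkg)
        except metadata.PackageNotFoundError:
            versions[pkg] = None
    return versions

# Illustrative distribution names; the exact set an extra installs may differ.
print(installed_versions(["honeyhive", "openai", "openinference-instrumentation-openai"]))
```

If any entry prints as None right after installation, the corresponding extra likely was not included in the install command.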
Single Provider Integration
Most Common Pattern - Add one provider:
[project]
name = "my-llm-app"
version = "0.1.0"
dependencies = [
"honeyhive[openinference-openai]", # OpenAI + instrumentor + SDK
"fastapi>=0.100.0",
"uvicorn>=0.20.0"
]
Available Single Provider Options:
dependencies = [
"honeyhive[openinference-openai]", # OpenAI GPT models
"honeyhive[openinference-anthropic]", # Anthropic Claude models
"honeyhive[openinference-google-ai]", # Google Gemini models
"honeyhive[openinference-bedrock]", # AWS Bedrock multi-model
"honeyhive[openinference-azure-openai]", # Azure-hosted OpenAI
]
Multiple Provider Integration
Production Apps with Multiple Providers:
[project]
name = "my-multi-provider-app"
version = "1.0.0"
dependencies = [
"honeyhive[openinference-openai,openinference-anthropic,openinference-google-ai]", # Multiple providers
"fastapi>=0.100.0",
"pydantic>=2.0.0"
]
Popular Provider Combination:
dependencies = [
"honeyhive[openinference-llm-providers]", # OpenAI + Anthropic + Google (most popular)
]
Framework-Specific Integration
LangChain Applications:
[project]
name = "my-langchain-app"
dependencies = [
"honeyhive[openinference-langchain]", # LangChain + instrumentor
"honeyhive[openinference-openai]", # Add your LLM provider
"chromadb>=0.4.0"
]
LlamaIndex RAG Applications:
[project]
name = "my-rag-app"
dependencies = [
"honeyhive[openinference-llamaindex]", # LlamaIndex + instrumentor
"honeyhive[openinference-openai]", # Add your LLM provider
"pinecone-client>=2.0.0"
]
DSPy Programming Framework:
[project]
name = "my-dspy-app"
dependencies = [
"honeyhive[openinference-dspy]", # DSPy + instrumentor
"honeyhive[openinference-openai]", # Add your LLM provider
]
Optional Dependencies Pattern (Recommended)
Flexible User Choice - Let users pick providers:
[project]
name = "my-flexible-library"
version = "0.1.0"
dependencies = [
"honeyhive", # Core SDK only - no provider lock-in
"pydantic>=2.0.0",
"httpx>=0.24.0"
]
[project.optional-dependencies]
# Let users choose their providers
openai = ["honeyhive[openinference-openai]"]
anthropic = ["honeyhive[openinference-anthropic]"]
google = ["honeyhive[openinference-google-ai]"]
aws = ["honeyhive[openinference-bedrock]"]
azure = ["honeyhive[openinference-azure-openai]"]
# Framework integrations
langchain = ["honeyhive[openinference-langchain]"]
llamaindex = ["honeyhive[openinference-llamaindex]"]
# Convenience groups
popular = ["honeyhive[openinference-llm-providers]"] # OpenAI + Anthropic + Google
all-providers = ["honeyhive[all-integrations]"] # Everything
# Development dependencies
dev = [
"honeyhive[openinference-openai,openinference-anthropic]", # Test with multiple providers
"pytest>=7.0.0",
"black>=23.0.0",
"mypy>=1.0.0"
]
Users can then install with:
# Install your library with OpenAI support
pip install "my-flexible-library[openai]"
# Install with multiple providers
pip install "my-flexible-library[openai,anthropic]"
# Install with all providers for testing
pip install "my-flexible-library[all-providers]"
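When a library exposes providers as optional extras like this, its runtime code has to cope with a provider SDK that may not be installed. A common pattern is a lazy import that fails with a message naming the extra to install. This is a sketch: require_extra and my-flexible-library are hypothetical names for illustration, not part of the HoneyHive API:

```python
def require_extra(module: str, extra: str):
    """Import an optional provider module, or fail with a message naming the extra.

    Both this helper and the 'my-flexible-library' name are illustrative.
    """
    try:
        return __import__(module)
    except ImportError as exc:
        raise ImportError(
            f"Missing optional dependency '{module}'. "
            f"Install it with: pip install my-flexible-library[{extra}]"
        ) from exc

# Usage sketch: guard OpenAI-specific code paths behind the 'openai' extra.
# openai_sdk = require_extra("openai", "openai")
```

This keeps the core import of your library cheap and makes the failure mode actionable for users who skipped the extra.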
All Integrations (Kitchen Sink)
Enterprise Apps with Comprehensive Provider Support:
[project]
name = "enterprise-llm-platform"
version = "2.0.0"
dependencies = [
"honeyhive[all-integrations]", # All providers + frameworks
"fastapi>=0.100.0",
"sqlalchemy>=2.0.0",
"redis>=4.0.0"
]
Note: Only use all-integrations if you actually need multiple providers. For most apps, specific provider groups are better.
Tool-Specific Examples
requirements.txt (pip)
# Core app dependencies
honeyhive[openinference-openai,openinference-anthropic]>=1.0.0
fastapi>=0.100.0
uvicorn>=0.20.0
# Framework integration example
# honeyhive[openinference-langchain]>=1.0.0
# Multiple providers
# honeyhive[openinference-llm-providers]>=1.0.0
# Install from requirements.txt
pip install -r requirements.txt
# Or install directly
pip install "honeyhive[openinference-openai,openinference-anthropic]>=1.0.0"
uv
# Initialize new project with uv
uv init my-llm-app
cd my-llm-app
# Add HoneyHive with providers
uv add "honeyhive[openinference-openai]"
uv add "honeyhive[openinference-anthropic]"
# Or add multiple providers at once
uv add "honeyhive[openinference-openai,openinference-anthropic]"
# Add framework integration
uv add "honeyhive[openinference-langchain]"
# Run your application
uv run python main.py
# pyproject.toml (generated by uv)
[project]
name = "my-llm-app"
version = "0.1.0"
dependencies = [
"honeyhive[openinference-openai,openinference-anthropic]>=1.0.0",
"fastapi>=0.100.0",
]
Poetry
[tool.poetry.dependencies]
python = "^3.11"
honeyhive = {extras = ["openinference-openai", "openinference-anthropic"], version = "^1.0.0"}
fastapi = "^0.100.0"
pip-tools (requirements.in)
# Core app dependencies
honeyhive[openinference-openai,openinference-anthropic]>=1.0.0
fastapi>=0.100.0
uvicorn>=0.20.0
# Compile to requirements.txt
pip-compile requirements.in
# Install
pip-sync requirements.txt
Pipenv
[packages]
honeyhive = {extras = ["openinference-openai"], version = "*"}
fastapi = "*"
Hatch
[project]
dependencies = [
"honeyhive[openinference-google-ai]",
]
[project.optional-dependencies]
dev = ["honeyhive[openinference-openai,openinference-anthropic]"] # More providers for testing
Available Optional Dependencies
🤖 LLM Providers
| Extra | What's Included |
|---|---|
| openinference-openai | OpenAI SDK + OpenInference OpenAI instrumentor |
| openinference-anthropic | Anthropic SDK + OpenInference Anthropic instrumentor |
| openinference-google-ai | Google Generative AI SDK + OpenInference Google instrumentor |
| openinference-google-adk | Google Agent Development Kit + OpenInference ADK instrumentor |
| openinference-bedrock | Boto3 + OpenInference Bedrock instrumentor |
| openinference-azure-openai | OpenAI SDK + Azure Identity + OpenInference OpenAI instrumentor |
| openinference-mcp | OpenInference MCP instrumentor for Model Context Protocol |
🔧 Framework Integrations
| Extra | What's Included |
|---|---|
| openinference-langchain | LangChain + OpenInference LangChain instrumentor |
| openinference-llamaindex | LlamaIndex + OpenInference LlamaIndex instrumentor |
| openinference-dspy | DSPy + OpenInference DSPy instrumentor |
🌟 Additional Providers
| Extra | What's Included |
|---|---|
| openinference-cohere | Cohere SDK + OpenInference Cohere instrumentor |
| openinference-huggingface | Transformers + OpenInference HuggingFace instrumentor |
| openinference-mistral | Mistral AI SDK + OpenInference Mistral instrumentor |
| openinference-groq | Groq SDK + OpenInference Groq instrumentor |
| openinference-ollama | Ollama SDK + OpenInference Ollama instrumentor |
| openinference-litellm | LiteLLM + OpenInference LiteLLM instrumentor |
📦 Convenience Groups
| Extra | What's Included |
|---|---|
| openinference-llm-providers | OpenAI + Anthropic + Google AI (most popular providers) |
| all-integrations | All available instrumentors and SDKs |
Best Practices
✅ Do This
# Good: Specific providers you actually use
dependencies = ["honeyhive[openinference-openai,openinference-anthropic]"]
# Good: Let users choose in a library
[project.optional-dependencies]
openai = ["honeyhive[openinference-openai]"]
❌ Avoid This
# Avoid: Installing everything when you only use OpenAI
dependencies = ["honeyhive[all-integrations]"]
# Avoid: Manual instrumentor management
dependencies = [
"honeyhive",
"openinference-instrumentation-openai", # Use honeyhive[openinference-openai] instead
"openai"
]
🎯 Choosing the Right Pattern
Application: Use specific provider extras like honeyhive[openinference-openai]
Library: Use optional dependencies to let users choose
Enterprise: Consider honeyhive[openinference-llm-providers] for popular providers
Testing: Use honeyhive[all-integrations] for comprehensive testing
Migration from Manual Installation
Before (Manual):
dependencies = [
"honeyhive",
"openinference-instrumentation-openai",
"openinference-instrumentation-anthropic",
"openai",
"anthropic"
]
After (Optional Dependencies):
dependencies = [
"honeyhive[openinference-openai,openinference-anthropic]" # Much cleaner!
]
Benefits of Migration:
Fewer Dependencies: One line instead of five
Version Compatibility: Guaranteed to work together
Easier Maintenance: Update one package instead of tracking multiple
Clearer Intent: Obvious which providers you use
Troubleshooting
Import Errors After Installation
Make sure you installed the right extra:
# If using OpenAI
pip install "honeyhive[openinference-openai]"
# If using multiple providers
pip install "honeyhive[openinference-openai,openinference-anthropic]"
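If imports still fail after installing the extra, a quick stdlib check shows which modules are actually importable in the active environment. The module paths below are the usual ones for this stack, but verify them against your own install:

```python
import importlib.util

def check_importable(modules):
    """Return {module: bool} indicating whether each module can be found."""
    results = {}
    for name in modules:
        try:
            results[name] = importlib.util.find_spec(name) is not None
        except ModuleNotFoundError:
            # A missing parent package also counts as "not importable".
            results[name] = False
    return results

# Module paths assumed typical for this stack; adjust to your providers.
print(check_importable([
    "honeyhive",
    "openai",
    "openinference.instrumentation.openai",
]))
```

A False entry usually means the wrong virtual environment is active or the extra was not included in the install command.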
Version Conflicts
The optional dependencies are curated to avoid conflicts. If you see version conflicts:
Use the optional dependency groups instead of manual installation
Update to the latest HoneyHive version
Check that you’re not manually specifying conflicting versions
Missing Provider Support
If a provider isn’t available as an optional dependency:
# Fall back to manual installation
pip install honeyhive
pip install openinference-instrumentation-<provider>
pip install <provider-sdk>
# Then file an issue to request the provider be added!
Next Steps
Quick Start: How-to Guides - Choose your provider integration
Examples: Tutorials - See complete examples
Deployment: Production Deployment Guide - Take your setup to production