AI Chat SDK

Our AI Chat SDK provides a unified interface for working with AI models from multiple providers, with built-in support for streaming, voice, retrieval, telemetry, and policy controls.

Installation

pip install zeebee-ai-chat-sdk

Quick Start

from zeebee_ai import ChatClient

# Initialize client with your API key
client = ChatClient(api_key="your_api_key")

# Send a message
response = client.chat(
    messages=[{"role": "user", "content": "Hello, who are you?"}],
    model="claude-3-5-sonnet-20241022"
)

print(response["content"])

Multiple Providers

One API for OpenAI, Anthropic, Google, and Mistral models; switch providers by changing the model identifier.
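
For example, the same call shown in Quick Start can target a different provider by changing only the model string (the identifiers below are illustrative; see the Supported Models table and the API reference for the exact strings your account supports):

# Same request, different providers: only the model string changes.
# Model identifiers here are examples, not guaranteed SDK values.
for model in ["gpt-4o", "claude-3-5-sonnet-20241022", "gemini-1.5-pro", "mistral-large-latest"]:
    response = client.chat(
        messages=[{"role": "user", "content": "Summarize the benefits of a unified SDK."}],
        model=model,
    )
    print(model, "->", response["content"])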

Real-time Integration

WebSocket support for streaming responses with minimal latency.
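
A minimal streaming sketch, continuing from the Quick Start client and assuming a hypothetical chat_stream method that yields incremental chunks over the WebSocket connection (see the API reference for the actual method name and payload shape):

# Streaming sketch. `chat_stream` and the chunk fields are assumptions
# for illustration; the SDK's real streaming interface may differ.
for chunk in client.chat_stream(
    messages=[{"role": "user", "content": "Write a haiku about latency."}],
    model="claude-3-5-sonnet-20241022",
):
    print(chunk.get("delta", ""), end="", flush=True)
print()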

Voice Capabilities

Built-in speech-to-text and text-to-speech for voice-based interactions.
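
A rough voice round-trip sketch; the transcribe and synthesize method names are hypothetical placeholders for the SDK's speech endpoints:

# Voice round-trip sketch (client from Quick Start). `transcribe` and
# `synthesize` are hypothetical names; see the SDK reference for the
# actual speech-to-text / text-to-speech calls.
text = client.transcribe(audio_file="question.wav")   # speech-to-text
reply = client.chat(
    messages=[{"role": "user", "content": text}],
    model="gpt-4o",
)
audio_bytes = client.synthesize(reply["content"])      # text-to-speech
with open("reply.mp3", "wb") as f:
    f.write(audio_bytes)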

SDK Features

Model Context Protocol

Advanced context management for more relevant AI responses. The Model Context Protocol (MCP) provides structured ways to include external knowledge and instructions.
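
A sketch of passing structured context with a request, assuming a hypothetical context parameter (the actual MCP interface is documented in the API reference):

# MCP sketch. The `context` argument and its schema are illustrative
# assumptions; the SDK's actual context interface may differ.
response = client.chat(
    messages=[{"role": "user", "content": "What is our refund policy?"}],
    model="claude-3-5-sonnet-20241022",
    context={
        "instructions": "Answer only from the provided documents.",
        "documents": [
            {"title": "Refund Policy", "text": "Customers may request a refund within 30 days."}
        ],
    },
)
print(response["content"])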

Vector Search Integration

Built-in semantic search capabilities for retrieving relevant information from your knowledge base and integrating it with AI responses.
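
A sketch of combining retrieval with a chat request, assuming a hypothetical retrieval parameter and collection name:

# Retrieval sketch. The `retrieval` argument and its fields are assumptions;
# the real parameter names depend on the SDK's vector search API.
response = client.chat(
    messages=[{"role": "user", "content": "How do I rotate my API keys?"}],
    model="gpt-4o",
    retrieval={"collection": "docs-kb", "top_k": 3},
)
print(response["content"])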

Telemetry & Analytics

Comprehensive usage tracking and performance analytics through Langfuse integration. Monitor model performance, costs, and user interactions.
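
A configuration sketch using Langfuse's standard environment variables; exactly how the SDK wires them into its telemetry is an assumption here, so check the configuration docs:

# Langfuse credentials are typically supplied via environment variables.
# Whether the SDK reads them automatically or takes explicit telemetry
# settings is an assumption for this sketch.
import os

from zeebee_ai import ChatClient

os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-your-public-key"
os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-your-secret-key"
os.environ["LANGFUSE_HOST"] = "https://cloud.langfuse.com"

client = ChatClient(api_key="your_api_key")  # traces then appear in Langfuse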

Policy Enforcement

Configurable policy engine to ensure AI outputs meet your organization's standards for safety, accuracy, and compliance.
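
A sketch of per-request policy configuration, assuming a hypothetical policies parameter and rule names:

# Policy sketch. The `policies` argument and rule names are hypothetical;
# the actual policy engine configuration is documented in the SDK reference.
response = client.chat(
    messages=[{"role": "user", "content": "Draft a customer apology email."}],
    model="gpt-4o",
    policies={"redact_pii": True, "blocked_topics": ["medical_advice"]},
)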

Supported Models

Provider   | Models                                            | Features
OpenAI     | GPT-4o, GPT-4, GPT-3.5 Turbo                      | Chat, Function Calling, Vision, Embedding
Anthropic  | Claude 3.5 Sonnet, Claude 3 Opus, Claude 3 Sonnet | Chat, Vision, Tool Use
Google     | Gemini 1.5 Pro, Gemini 1.0 Pro                    | Chat, Vision, Function Calling
Mistral    | Mistral Large, Mistral Medium, Mistral Small      | Chat, Function Calling

Ready to get started?

Create your account today and start building AI-powered experiences.

Sign Up for Free

No credit card required for the Free tier.