LangDB Self-Hosted API Reference
Complete API reference for LangDB self-hosted AI gateway services.
Introduction
This API is documented in OpenAPI format and provides access to LangDB's self-hosted AI gateway services.
LangDB Self-Hosted offers a comprehensive platform for managing and interacting with LLMs through your own infrastructure. The API supports:
- Multiple Model Providers: Access models from OpenAI, Anthropic, Google, Meta, and many more
- Unified Interface: Use OpenAI-compatible API format for all models
- Self-Hosted Deployment: Full control over your AI infrastructure and data
- Thread Management: Track and manage conversation threads across models
- Usage Analytics: Monitor your API usage and costs in real-time
All API endpoints use Bearer authentication with JWT tokens. Include your API key in the Authorization header:
Authorization: Bearer <your-api-key>
Additionally, include your Project ID in the x-project-id header for all requests:
x-project-id: <your-project-id>
For self-hosted deployments, you can configure the base URL to point to your own LangDB instance.
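Putting the pieces above together, a request can carry both headers and target your own instance. This is a minimal sketch; the base URL, API key, and project ID values are placeholders you would replace with your own:

```python
# Placeholders for your self-hosted deployment (assumptions, not real values).
BASE_URL = "https://your-langdb-instance.example.com"
API_KEY = "your-api-key"
PROJECT_ID = "your-project-id"

def auth_headers(api_key: str, project_id: str) -> dict:
    """Build the headers required on every LangDB self-hosted request:
    a Bearer API key and the project ID."""
    return {
        "Authorization": f"Bearer {api_key}",
        "x-project-id": project_id,
        "Content-Type": "application/json",
    }

headers = auth_headers(API_KEY, PROJECT_ID)
```

These same headers are reused by every endpoint in the sections below.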
API Specifications
- Analytics API
- Completions API
- Misc API
- Threads API
Analytics API
Analytics and usage tracking endpoints for monitoring your self-hosted LangDB instance.
Completions API
Chat completions and text generation endpoints for interacting with LLMs on your self-hosted instance.
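Since the gateway exposes an OpenAI-compatible format, a chat completion request can be sketched with the standard library alone. The /v1/chat/completions path, model name, and base URL here are assumptions based on that OpenAI compatibility, not confirmed routes:

```python
import json
import urllib.request

# Placeholders; substitute your instance URL and credentials.
BASE_URL = "https://your-langdb-instance.example.com"
HEADERS = {
    "Authorization": "Bearer your-api-key",
    "x-project-id": "your-project-id",
    "Content-Type": "application/json",
}

def build_chat_request(model: str, user_message: str) -> dict:
    """OpenAI-compatible chat completion body; model name is illustrative."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    }

def chat_completion(base_url: str, headers: dict, body: dict) -> dict:
    # Path assumes the OpenAI-style /v1/chat/completions route (an assumption).
    req = urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(body).encode(),
        headers=headers,
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Usage (requires a live instance):
# reply = chat_completion(BASE_URL, HEADERS,
#                         build_chat_request("gpt-4o-mini", "Hello"))
```

Because the request body follows the OpenAI schema, existing OpenAI client libraries can typically be pointed at the gateway by overriding their base URL.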
Misc API
Miscellaneous endpoints including embeddings and other utility functions for self-hosted deployments.
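For the embeddings utility, the request shape can again be sketched in the OpenAI-compatible format. The /v1/embeddings path, the model name, and the response layout (one embedding object per input under "data") are assumptions carried over from that convention:

```python
import json
import urllib.request

def build_embedding_request(model: str, texts: list) -> dict:
    """OpenAI-compatible embeddings body; model name is illustrative."""
    return {"model": model, "input": texts}

def embed(base_url: str, headers: dict, model: str, texts: list) -> list:
    # Path assumes the OpenAI-style /v1/embeddings route (an assumption).
    req = urllib.request.Request(
        f"{base_url}/v1/embeddings",
        data=json.dumps(build_embedding_request(model, texts)).encode(),
        headers=headers,
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    # Assumed response layout: one embedding per input, in order.
    return [item["embedding"] for item in data["data"]]

# Usage (requires a live instance):
# vectors = embed(BASE_URL, HEADERS, "text-embedding-3-small", ["hello world"])
```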
Threads API
Thread management endpoints for managing conversation threads on your self-hosted instance.