LLM Providers (Python)
A unified source of truth for LLM providers, models, pricing, and capabilities.
Overview
Python bindings for the LLM Providers registry. Provides zero-latency access to a curated dataset of LLM providers, models, pricing, and capabilities — powered by Rust via PyO3.
The registry is embedded in the native extension at build time from data/providers.json via build.rs (no runtime JSON parsing).
Features
- 🚀 Zero-Latency: Data is compiled into the binary; no runtime I/O or API calls.
- 🐍 Pythonic API: Simple functions and typed objects.
- 🔄 Unified Schema: Consistent data structure across all providers (OpenAI, Anthropic, DeepSeek, etc.).
- 📦 Rich Metadata: Includes pricing, context length, and tool support flags.
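As a rough illustration of the unified schema, the fields that appear in the usage examples (`name`, `input_price`, `base_url`, `region`) could be modeled as plain dataclasses. This is a sketch only: the real objects come from the native extension, and any field not shown in the examples (`context_length`, `supports_tools`) is an assumption based on the "Rich Metadata" feature description.

```python
from dataclasses import dataclass

# Illustrative stand-ins only; the real objects are provided by the
# native extension. `context_length` and `supports_tools` are assumed
# fields, and the price below is a placeholder, not a quoted rate.
@dataclass(frozen=True)
class Endpoint:
    base_url: str
    region: str

@dataclass(frozen=True)
class Model:
    name: str
    input_price: float   # USD per 1M input tokens
    context_length: int
    supports_tools: bool

m = Model(name="gpt-4o", input_price=2.5,
          context_length=128_000, supports_tools=True)
print(f"Model: {m.name}, Price: ${m.input_price}/1M tokens")
```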
Installation
pip install llm-providers-list
Usage
import llm_providers_list

# 1. List all provider families
print(llm_providers_list.list_providers())
# Output: ['aliyun', 'anthropic', 'deepseek', 'openai', ...]

# 2. List all endpoint IDs (for direct configuration)
print(llm_providers_list.list_endpoints())
# Output: ['aliyun:cn', 'anthropic:global', 'moonshot:cn', 'moonshot:global', ...]

# 3. Get endpoint details by ID
family_id, ep = llm_providers_list.get_endpoint("moonshot:global")
print(f"Family: {family_id}, Base URL: {ep.base_url}, Region: {ep.region}")

# 4. Get specific model details
model = llm_providers_list.get_model("openai", "gpt-4o")
print(f"Model: {model.name}, Price: ${model.input_price}/1M tokens")
Supported Providers
- OpenAI (GPT-4o, GPT-3.5, o1)
- Anthropic (Claude 3.5 Sonnet, Haiku, Opus)
- DeepSeek (Chat, Reasoner)
- Aliyun (Qwen Max, Plus, Turbo)
- Tencent (Hunyuan)
- Moonshot (Kimi; CN and Global endpoints)
- MiniMax (CN and Global endpoints)
- Zhipu (GLM-4; BigModel for CN, Z.ai for Global)
- Volcengine (Doubao)
- LongCat
Contributing
Contributions are welcome! See the main repository for details.
License
MIT
File details
Details for the file llm_providers_list-0.8.1-cp38-abi3-macosx_11_0_arm64.whl.
File metadata
- Download URL: llm_providers_list-0.8.1-cp38-abi3-macosx_11_0_arm64.whl
- Size: 299.3 kB
- Tags: CPython 3.8+, macOS 11.0+ ARM64
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.11.12
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 7de6dc5dfd8f83b03052a44b973e4a3dbf1f1f2f6f4a9feebaca15b1cf692fe6 |
| MD5 | 441cc6558b3ebbd54fdf468b3cace33c |
| BLAKE2b-256 | 162f5a06df309936899276fa16c8271512238770895315a13c6b036ece1bae29 |