Provider
13 documents categorized under Provider.
Anthropic
Cloud API provider for Claude models via the Anthropic Messages API, a non-OpenAI-compatible backend.
DeepSeek
Cloud API provider for DeepSeek chat and reasoning models via api.deepseek.com.
Fireworks AI
Cloud API provider for fast open-model inference via api.fireworks.ai.
Google Gemini
Cloud API provider for Gemini models via the OpenAI-compatible endpoint at generativelanguage.googleapis.com.
Groq
Cloud API provider for fast inference on Groq LPU hardware via api.groq.com.
llama.cpp
High-performance local inference engine with an OpenAI-compatible server mode.
Mistral AI
Cloud API provider for Mistral models via api.mistral.ai.
Ollama
Local model runner with an OpenAI-compatible API. The default backend for musegpt (see the client sketch after this list).
OpenAI
Cloud API provider for GPT-4o, o1, and other OpenAI models via api.openai.com.
OpenRouter
Cloud API aggregator providing access to many models from multiple providers through a single OpenAI-compatible endpoint.
Together AI
Cloud API provider known for hosting open-weight models, via api.together.xyz.
vLLM
High-throughput LLM serving engine with an OpenAI-compatible API.
whisper.cpp
Local speech recognition engine for audio-to-text transcription in the musegpt audio pipeline.
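Most of the backends above expose OpenAI-compatible chat-completion endpoints, so a single client can switch between them by changing only the base URL and API key. The sketch below is a minimal illustration, not code from musegpt: it assumes the official openai Python package, the commonly documented default endpoints (Ollama at http://localhost:11434/v1, OpenRouter at https://openrouter.ai/api/v1), and placeholder model names.

```python
# Minimal sketch: one OpenAI-compatible client, with the provider selected
# purely by base URL and API key. Endpoints and model names below are
# assumptions drawn from each provider's public docs, not from musegpt.
from openai import OpenAI

PROVIDERS = {
    # Local Ollama server; "ollama" is a dummy key since none is required.
    "ollama": {"base_url": "http://localhost:11434/v1", "api_key": "ollama"},
    # OpenRouter aggregator; requires a real key from openrouter.ai.
    "openrouter": {"base_url": "https://openrouter.ai/api/v1", "api_key": "sk-or-..."},
    # OpenAI itself; this is the SDK's default base URL.
    "openai": {"base_url": "https://api.openai.com/v1", "api_key": "sk-..."},
}

def chat(provider: str, model: str, prompt: str) -> str:
    """Send a single chat-completion request to the chosen backend."""
    cfg = PROVIDERS[provider]
    client = OpenAI(base_url=cfg["base_url"], api_key=cfg["api_key"])
    resp = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content

if __name__ == "__main__":
    # Model name is an illustrative placeholder.
    print(chat("ollama", "llama3.2", "Suggest a chord progression in D minor."))
```

Anthropic is the exception called out above: its Messages API uses a different request shape (an x-api-key header, a required max_tokens field, and a /v1/messages path), so it cannot be reached through an OpenAI-compatible client like this one.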