Providers

A provider is an upstream LLM vendor: OpenAI, Anthropic, Google, Azure OpenAI, AWS Bedrock, Groq, or Fireworks. Modelux proxies your requests using provider credentials you supply (bring-your-own keys), and we don’t mark up per-token costs.

Supported providers

  Provider          Status
  ----------------  -----------
  OpenAI            Shipped
  Anthropic         Shipped
  Google (Gemini)   Shipped
  Azure OpenAI      Shipped
  AWS Bedrock       Shipped
  Groq              In progress
  Fireworks         In progress

Adding a provider

  1. Open Providers in the dashboard.
  2. Click Add provider.
  3. Select the vendor, paste your API key, and optionally set a base URL for self-hosted or regional endpoints.
  4. Modelux stores the credential encrypted and runs a verification call before marking it active.
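The verify-then-activate flow above can be sketched as a small state machine. This is a minimal illustration, not Modelux’s actual implementation; the `Provider` dataclass, `ProviderStatus` enum, and the injectable `verify` callback are all hypothetical names chosen for this example.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Callable, Optional

class ProviderStatus(Enum):
    PENDING = "pending"   # created, verification call not yet run
    ACTIVE = "active"     # verification succeeded
    FAILED = "failed"     # verification call rejected the key

@dataclass
class Provider:
    vendor: str
    api_key: str                      # stored encrypted in practice
    base_url: Optional[str] = None    # custom endpoint, if any
    status: ProviderStatus = ProviderStatus.PENDING

def add_provider(vendor: str, api_key: str, base_url: Optional[str] = None,
                 verify: Callable[[Provider], bool] = lambda p: True) -> Provider:
    """Create a provider, run a verification call, then mark it active
    only if the call succeeds (mirroring step 4 above)."""
    provider = Provider(vendor, api_key, base_url)
    ok = verify(provider)
    provider.status = ProviderStatus.ACTIVE if ok else ProviderStatus.FAILED
    return provider
```

The key point the sketch captures: a provider is never marked active before a verification call has passed, so a mistyped key surfaces immediately rather than at request time.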

Health monitoring

Modelux tracks provider health continuously:

  • Success rate — share of 2xx responses versus 4xx/5xx errors over a rolling window
  • p50 latency — per-model, per-region where applicable
  • Last check timestamp — indicates how fresh the health signal is
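The two numeric signals above can be computed from a rolling window of recent request outcomes. A minimal sketch, assuming a fixed-size window (the `HealthWindow` class and its window size of 100 are illustrative choices, not documented Modelux internals):

```python
from collections import deque
import statistics

class HealthWindow:
    """Rolling window of recent request outcomes for one provider."""

    def __init__(self, size: int = 100):
        self.outcomes = deque(maxlen=size)   # True = 2xx, False = 4xx/5xx
        self.latencies = deque(maxlen=size)  # milliseconds

    def record(self, status_code: int, latency_ms: float) -> None:
        self.outcomes.append(200 <= status_code < 300)
        self.latencies.append(latency_ms)

    def success_rate(self) -> float:
        # Treat an empty window as healthy rather than failing.
        if not self.outcomes:
            return 1.0
        return sum(self.outcomes) / len(self.outcomes)

    def p50_latency(self) -> float:
        return statistics.median(self.latencies) if self.latencies else 0.0
```

Because the deque is bounded, old samples age out automatically, which is what makes the success rate a rolling rather than lifetime figure.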

When a provider is marked unhealthy, health-aware routing strategies automatically prefer other providers until it recovers.
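One way to picture health-aware preference: filter out providers below a success-rate threshold, then pick the fastest of the rest, falling back to the full set if nothing passes the filter. The threshold value, field names, and tie-break rule here are all assumptions for illustration, not Modelux’s documented routing strategy.

```python
def pick_provider(providers: list[dict], threshold: float = 0.9) -> str:
    """providers: dicts with 'name', 'success_rate', and 'p50_ms' keys.
    Prefer healthy providers; if every provider is unhealthy, fall back
    to the full list so requests still have somewhere to go."""
    healthy = [p for p in providers if p["success_rate"] >= threshold]
    pool = healthy or providers
    # Among the candidates, prefer the lowest median latency.
    return min(pool, key=lambda p: p["p50_ms"])["name"]
```

Note that a provider with great latency but a failing success rate loses to a slower, healthier one — exactly the behavior described above while the unhealthy provider recovers.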

Credential rotation

Rotate a provider’s API key without downtime:

  1. Edit the provider in the dashboard
  2. Paste the new key and save
  3. Modelux verifies the new key, then atomically swaps it

Old in-flight requests finish with the old key; new requests pick up the new key immediately.
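The rotation semantics above — in-flight requests keep the key they started with, new requests see the new key, and a failed verification leaves the old key untouched — can be sketched with a lock-guarded holder. The `Credential` class and `verify` callback are hypothetical; Modelux’s real storage is encrypted and server-side.

```python
import threading
from typing import Callable

class Credential:
    """Holds the current API key for a provider. Each request captures
    the key once via current() at start, so an in-flight request keeps
    its old key even if rotate() swaps in a new one mid-request."""

    def __init__(self, key: str):
        self._key = key
        self._lock = threading.Lock()

    def current(self) -> str:
        with self._lock:
            return self._key

    def rotate(self, new_key: str,
               verify: Callable[[str], bool] = lambda k: True) -> None:
        # Verify first; only swap atomically once the new key checks out.
        if not verify(new_key):
            raise ValueError("verification failed; keeping the old key")
        with self._lock:
            self._key = new_key
```

The atomicity matters: a request either sees the complete old key or the complete new one, never a partially rotated credential.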

Custom base URLs

For Azure OpenAI deployments, self-hosted vLLM endpoints, or regional Bedrock routes, set a custom base URL when creating the provider. Modelux will use that URL for all requests routed to this provider.
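The resolution rule is simple: a custom base URL, when set, overrides the vendor default. A minimal sketch — the `resolve_base_url` helper and the default-URL table are assumptions for illustration (the OpenAI and Anthropic defaults shown are those vendors’ public API endpoints):

```python
from typing import Optional

# Vendor defaults used only when no custom base URL is configured.
DEFAULT_BASE_URLS = {
    "openai": "https://api.openai.com/v1",
    "anthropic": "https://api.anthropic.com",
}

def resolve_base_url(vendor: str, custom_base_url: Optional[str] = None) -> str:
    """A custom base URL (Azure deployment, self-hosted vLLM, regional
    Bedrock route) takes precedence over the vendor default."""
    return custom_base_url or DEFAULT_BASE_URLS[vendor]
```

For example, pointing an OpenAI-shaped provider at a self-hosted vLLM server just means passing its URL as `custom_base_url`; every request routed to that provider then targets it.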