Mirror of https://github.com/openclaw/openclaw.git (synced 2026-05-06 10:50:44 +00:00)
feat(providers): add DeepInfra provider plugin (#73038)
* feat(providers): add DeepInfra provider plugin
* feat(deepinfra): add media provider surfaces
* fix(deepinfra): satisfy provider boundary checks
* docs: add gitcrawl maintainer skill
* test: include deepinfra in live media sweeps
* fix: remove stale tts contract import
Committed by GitHub.
Parent: 1fde7dbc0e
Commit: 0294aebe6f
@@ -84,8 +84,8 @@ See [Models](/providers/models) for pricing config and [Token use & costs](/refe

 Inbound media can be summarized/transcribed before the reply runs. This uses model/provider APIs.

-- Audio: OpenAI / Groq / Deepgram / Google / Mistral.
-- Image: OpenAI / OpenRouter / Anthropic / Google / MiniMax / Moonshot / Qwen / Z.AI.
+- Audio: OpenAI / Groq / Deepgram / DeepInfra / Google / Mistral.
+- Image: OpenAI / OpenRouter / Anthropic / DeepInfra / Google / MiniMax / Moonshot / Qwen / Z.AI.
 - Video: Google / Qwen / Moonshot.

 See [Media understanding](/nodes/media-understanding).
@@ -94,8 +94,8 @@ See [Media understanding](/nodes/media-understanding).

 Shared generation capabilities can also spend provider keys:

-- Image generation: OpenAI / Google / fal / MiniMax
-- Video generation: Qwen
+- Image generation: OpenAI / Google / DeepInfra / fal / MiniMax
+- Video generation: DeepInfra / Qwen

 Image generation can infer an auth-backed provider default when
 `agents.defaults.imageGenerationModel` is unset. Video generation currently
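The `agents.defaults.imageGenerationModel` key referenced in this hunk can also be pinned explicitly so no auth-backed default is inferred. A minimal JSON5-style sketch; the model ID shown is illustrative, not a documented default:

```json5
{
  agents: {
    defaults: {
      // Illustrative model ID; setting this skips auth-backed default inference
      imageGenerationModel: "deepinfra/example-image-model",
    },
  },
}
```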
@@ -113,6 +113,7 @@ Semantic memory search uses **embedding APIs** when configured for remote provid
 - `memorySearch.provider = "gemini"` → Gemini embeddings
 - `memorySearch.provider = "voyage"` → Voyage embeddings
 - `memorySearch.provider = "mistral"` → Mistral embeddings
+- `memorySearch.provider = "deepinfra"` → DeepInfra embeddings
 - `memorySearch.provider = "lmstudio"` → LM Studio embeddings (local/self-hosted)
 - `memorySearch.provider = "ollama"` → Ollama embeddings (local/self-hosted; typically no hosted API billing)
 - Optional fallback to a remote provider if local embeddings fail
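The provider choices above map onto config along these lines. A minimal JSON5-style sketch of local embeddings with a remote fallback (surrounding file structure is illustrative):

```json5
{
  memorySearch: {
    // Local embeddings first; fall back to a remote provider if they fail
    provider: "lmstudio",
    fallback: "deepinfra",
  },
}
```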
@@ -46,12 +46,12 @@ See [Active Memory](/concepts/active-memory) for the activation model, plugin-ow

 ## Provider selection

-| Key        | Type      | Default          | Description |
-| ---------- | --------- | ---------------- | ------------------------------------------------------------------------------------------------------------- |
-| `provider` | `string`  | auto-detected    | Embedding adapter ID: `bedrock`, `gemini`, `github-copilot`, `local`, `mistral`, `ollama`, `openai`, `voyage`  |
-| `model`    | `string`  | provider default | Embedding model name |
-| `fallback` | `string`  | `"none"`         | Fallback adapter ID when the primary fails |
-| `enabled`  | `boolean` | `true`           | Enable or disable memory search |
+| Key        | Type      | Default          | Description |
+| ---------- | --------- | ---------------- | -------------------------------------------------------------------------------------------------------------------------- |
+| `provider` | `string`  | auto-detected    | Embedding adapter ID: `bedrock`, `deepinfra`, `gemini`, `github-copilot`, `local`, `mistral`, `ollama`, `openai`, `voyage`  |
+| `model`    | `string`  | provider default | Embedding model name |
+| `fallback` | `string`  | `"none"`         | Fallback adapter ID when the primary fails |
+| `enabled`  | `boolean` | `true`           | Enable or disable memory search |

 ### Auto-detection order
||||
@@ -76,6 +76,9 @@ When `provider` is not set, OpenClaw selects the first available:
   <Step title="mistral">
     Selected if a Mistral key can be resolved.
   </Step>
+  <Step title="deepinfra">
+    Selected if a DeepInfra key can be resolved.
+  </Step>
   <Step title="bedrock">
     Selected if the AWS SDK credential chain resolves (instance role, access keys, profile, SSO, web identity, or shared config).
   </Step>
||||
@@ -87,15 +90,16 @@ When `provider` is not set, OpenClaw selects the first available:

 Remote embeddings require an API key. Bedrock uses the AWS SDK default credential chain instead (instance roles, SSO, access keys).

-| Provider       | Env var                                            | Config key                        |
-| -------------- | -------------------------------------------------- | --------------------------------- |
-| Bedrock        | AWS credential chain                               | No API key needed                 |
-| Gemini         | `GEMINI_API_KEY`                                   | `models.providers.google.apiKey`  |
-| GitHub Copilot | `COPILOT_GITHUB_TOKEN`, `GH_TOKEN`, `GITHUB_TOKEN` | Auth profile via device login     |
-| Mistral        | `MISTRAL_API_KEY`                                  | `models.providers.mistral.apiKey` |
-| Ollama         | `OLLAMA_API_KEY` (placeholder)                     | --                                |
-| OpenAI         | `OPENAI_API_KEY`                                   | `models.providers.openai.apiKey`  |
-| Voyage         | `VOYAGE_API_KEY`                                   | `models.providers.voyage.apiKey`  |
+| Provider       | Env var                                            | Config key                          |
+| -------------- | -------------------------------------------------- | ----------------------------------- |
+| Bedrock        | AWS credential chain                               | No API key needed                   |
+| DeepInfra      | `DEEPINFRA_API_KEY`                                | `models.providers.deepinfra.apiKey` |
+| Gemini         | `GEMINI_API_KEY`                                   | `models.providers.google.apiKey`    |
+| GitHub Copilot | `COPILOT_GITHUB_TOKEN`, `GH_TOKEN`, `GITHUB_TOKEN` | Auth profile via device login       |
+| Mistral        | `MISTRAL_API_KEY`                                  | `models.providers.mistral.apiKey`   |
+| Ollama         | `OLLAMA_API_KEY` (placeholder)                     | --                                  |
+| OpenAI         | `OPENAI_API_KEY`                                   | `models.providers.openai.apiKey`    |
+| Voyage         | `VOYAGE_API_KEY`                                   | `models.providers.voyage.apiKey`    |

 <Note>
 Codex OAuth covers chat/completions only and does not satisfy embedding requests.
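Per the credentials table above, a DeepInfra key can be supplied either through the `DEEPINFRA_API_KEY` environment variable or through the config key. A JSON5-style sketch; the key value is a placeholder:

```json5
{
  models: {
    providers: {
      deepinfra: {
        // Placeholder value; alternatively export DEEPINFRA_API_KEY in the environment
        apiKey: "<your-deepinfra-api-key>",
      },
    },
  },
}
```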