Mirror of https://github.com/openclaw/openclaw.git (synced 2026-05-06 15:00:41 +00:00)
fix(memory): keep llama runtime optional (#71425)
* fix(memory): keep llama runtime optional
* fix(memory): harden optional llama runtime guard
@@ -38,8 +38,9 @@ To set a provider explicitly:
 
 Without an embedding provider, only keyword search is available.
 
-To force the built-in local embedding provider, point `local.modelPath` at a
-GGUF file:
+To force the built-in local embedding provider, install the optional
+`node-llama-cpp` runtime package next to OpenClaw, then point `local.modelPath`
+at a GGUF file:
 
 ```json5
 {
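The hunk above truncates the `json5` snippet after the opening brace. For context, a complete block might look like the following sketch; the nesting under `memorySearch` and the model path are illustrative assumptions, since only `provider` and `local.modelPath` are named in the diff:

```json5
// Sketch only: key nesting and the path are assumptions, not the exact file.
{
  memorySearch: {
    provider: "local",
    local: {
      // Point this at a GGUF embedding model on disk.
      modelPath: "/path/to/embedding-model.gguf",
    },
  },
}
```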
@@ -66,7 +67,7 @@ GGUF file:
 | Voyage | `voyage` | Yes | |
 | Mistral | `mistral` | Yes | |
 | Ollama | `ollama` | No | Local, set explicitly |
-| Local | `local` | Yes (first) | GGUF model, ~0.6 GB download |
+| Local | `local` | Yes (first) | Optional `node-llama-cpp` runtime |
 
 Auto-detection picks the first provider whose API key can be resolved, in the
 order shown. Set `memorySearch.provider` to override.
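As a sketch of the override mentioned in the context lines above, pinning one provider skips auto-detection entirely; the nesting is assumed from the dotted `memorySearch.provider` name, and `voyage` is just one ID from the table:

```json5
// Sketch: pin a provider instead of relying on auto-detection order.
{
  memorySearch: {
    provider: "voyage", // any provider ID from the table above
  },
}
```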
@@ -15,7 +15,8 @@ binary, and can index content beyond your workspace memory files.
 - **Reranking and query expansion** for better recall.
 - **Index extra directories** -- project docs, team notes, anything on disk.
 - **Index session transcripts** -- recall earlier conversations.
-- **Fully local** -- runs via Bun + node-llama-cpp, auto-downloads GGUF models.
+- **Fully local** -- runs with the optional node-llama-cpp runtime package and
+  auto-downloads GGUF models.
 - **Automatic fallback** -- if QMD is unavailable, OpenClaw falls back to the
   builtin engine seamlessly.
 
@@ -29,8 +29,8 @@ explicitly:
 }
 ```
 
-For local embeddings with no API key, use `provider: "local"` (requires
-node-llama-cpp).
+For local embeddings with no API key, install the optional `node-llama-cpp`
+runtime package next to OpenClaw and use `provider: "local"`.
 
 ## Supported providers
 
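A minimal sketch of the keyless local setup described in the hunk above; nesting under `memorySearch` is an assumption based on the `memorySearch.provider` setting named elsewhere in this diff:

```json5
// Sketch: local embeddings with no API key. Requires the optional
// node-llama-cpp runtime package installed next to OpenClaw.
{
  memorySearch: {
    provider: "local",
  },
}
```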