feat(file-transfer): add bundled plugin for binary file ops on nodes

New extensions/file-transfer/ plugin exposing four agent tools
(file_fetch, dir_list, dir_fetch, file_write) and four matching
node-host commands (file.fetch, dir.list, dir.fetch, file.write).
Lets agents read and write files on paired nodes by absolute path,
bypassing the bash output cap (200KB) and the live tool-result
text cap that would otherwise truncate base64 payloads.

Public surface
--------------
- file_fetch({ node, path, maxBytes? })
  Image MIMEs return image content blocks; small text (<=8 KB) inlines
  as text content; everything else returns a saved-media-path text
  block. sha256-verified end-to-end.
- dir_list({ node, path, pageToken?, maxEntries? })
  Structured directory listing — name, path, size, mimeType, isDir,
  mtime. Paginated. No content transfer.
- dir_fetch({ node, path, maxBytes?, includeDotfiles? })
  Server-side tar -czf streamed back, unpacked into the gateway media
  store, returns a manifest of saved paths. Single round-trip.
  60s wall-clock timeouts on tar create/unpack. Unpacking without
  tar -P strips leading slashes, neutralizing absolute paths in
  archive entries.
- file_write({ node, path, contentBase64, mimeType?, overwrite?,
              createParents? })
  Atomic write (temp + rename). Refuses to overwrite by default.
  Refuses to write through symlinks (lstat check). Buffer-side
  sha256 (no read-back race). Pair with file_fetch to round-trip
  files between nodes — DO NOT use exec/cp for file copies.
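The file_fetch rendering rules above can be sketched as a small dispatch. The constant and function name are illustrative, not the tool's actual code:

```typescript
// Sketch of file_fetch result rendering: images become image content
// blocks, small text inlines directly, everything else is saved to the
// media store and returned as a path.
const INLINE_TEXT_MAX_BYTES = 8 * 1024; // the "<=8 KB" threshold above

type RenderMode = "image-block" | "inline-text" | "saved-media-path";

function renderModeFor(mimeType: string, sizeBytes: number): RenderMode {
  if (mimeType.startsWith("image/")) return "image-block";
  if (mimeType.startsWith("text/") && sizeBytes <= INLINE_TEXT_MAX_BYTES) {
    return "inline-text";
  }
  return "saved-media-path";
}
```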

All four commands gated by:
  - dangerous-by-default node command policy
    (gateway.nodes.allowCommands opt-in)
  - per-node path policy (gateway.nodes.fileTransfer)
  - optional operator approval prompt (ask: off | on-miss | always)

16 MB raw-byte ceiling per single-frame round-trip (a 25 MB WS frame,
minus ~33% base64 overhead and the JSON envelope). Defaults to 8 MB
per call.
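A quick sanity check on that arithmetic (pure math, assuming MiB-based limits; not code from this PR):

```typescript
// 16 MiB raw inflates by 4/3 to ~22.4 MB of base64, leaving headroom
// inside a 25 MiB WebSocket frame for the JSON envelope.
const HARD_MAX_RAW_BYTES = 16 * 1024 * 1024;
const base64Bytes = Math.ceil(HARD_MAX_RAW_BYTES / 3) * 4; // 22,369,624
const frameLimitBytes = 25 * 1024 * 1024;                  // 26,214,400
const envelopeHeadroom = frameLimitBytes - base64Bytes;    // ~3.7 MiB spare
```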

Path policy and approvals
-------------------------
Default behavior is DENY. The operator must explicitly opt in:

  {
    "gateway": {
      "nodes": {
        "fileTransfer": {
          "<nodeId-or-displayName>": {
            "ask":              "off" | "on-miss" | "always",
            "allowReadPaths":   ["~/Screenshots/**", "/tmp/**"],
            "allowWritePaths":  ["~/Downloads/**"],
            "denyPaths":        ["**/.ssh/**", "**/.aws/**"],
            "maxBytes":         16777216
          },
          "*": { "ask": "on-miss" }
        }
      }
    }
  }

ask modes:
  off       — silent: allow if matched, deny if not (default)
  on-miss   — silent allow if matched; prompt on miss
  always    — prompt every call (denyPaths still hard-deny)
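A simplified model of this decision order (the real logic lives in src/shared/policy.ts; the glob matcher here is a toy translation for illustration only):

```typescript
type Ask = "off" | "on-miss" | "always";
type Decision = "allow" | "deny" | "prompt";

// Toy glob: "**" matches any depth, "*" matches within one path segment.
function globMatch(pattern: string, p: string): boolean {
  const re = pattern
    .replace(/[.+^${}()|[\]\\]/g, "\\$&") // escape regex specials
    .replace(/\*\*/g, "\u0000")           // protect "**" before "*"
    .replace(/\*/g, "[^/]*")
    .replace(/\u0000/g, ".*");
  return new RegExp(`^${re}$`).test(p);
}

function decide(ask: Ask, allow: string[], deny: string[], p: string): Decision {
  if (deny.some((g) => globMatch(g, p))) return "deny"; // denyPaths always wins
  const matched = allow.some((g) => globMatch(g, p));
  if (ask === "always") return "prompt";                // prompt every call
  if (matched) return "allow";                          // silent allow on match
  return ask === "on-miss" ? "prompt" : "deny";         // off: silent deny on miss
}
```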

denyPaths always wins. allow-always from the prompt persists the
exact path back into allowReadPaths/allowWritePaths via
mutateConfigFile so subsequent matching calls go silent.
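In shape, the persistence step is an idempotent append into the per-node policy object. The function name matches src/shared/policy.ts, but this body is an illustrative sketch, not the real mutateConfigFile plumbing:

```typescript
type FtPolicy = { allowReadPaths?: string[]; allowWritePaths?: string[] };

// Append the exact approved path so the next matching call goes silent.
function persistAllowAlways(policy: FtPolicy, op: "read" | "write", p: string): FtPolicy {
  const key = op === "read" ? ("allowReadPaths" as const) : ("allowWritePaths" as const);
  const list = policy[key] ?? [];
  // Idempotent: re-approving the same path does not duplicate the entry.
  return list.includes(p) ? policy : { ...policy, [key]: [...list, p] };
}
```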

Reuses existing primitives — no new gateway methods:
  plugin.approval.request / plugin.approval.waitDecision
  decision: allow-once | allow-always | deny

Pre-flight against requested path AND post-flight against the
canonicalPath returned by the node — closes symlink-escape attacks
where the requested path matched policy but realpath resolves
somewhere else.
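Sketched, with the policy matcher abstracted to a predicate (the reason strings mirror the audit decisions in src/shared/audit.ts; the function itself is illustrative):

```typescript
// Policy must pass for BOTH the path the agent asked for (pre-flight)
// and the canonical path the node reports back (post-flight). A symlink
// inside an allowed directory fails the second check.
function checkBothPaths(
  requested: string,
  canonical: string,
  allowed: (p: string) => boolean, // stand-in for the real policy matcher
): { ok: boolean; reason?: string } {
  if (!allowed(requested)) return { ok: false, reason: "denied:policy" };
  if (!allowed(canonical)) return { ok: false, reason: "denied:symlink_escape" };
  return { ok: true };
}
```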

Audit log
---------
JSONL at ~/.openclaw/audit/file-transfer.jsonl. Records every
decision (allow/allowed-once/allowed-always/denied/error) with
timestamp, op, nodeId, displayName, requestedPath, canonicalPath,
decision, error code, sizeBytes, sha256, durationMs. Best-effort
writes; never propagates failure.
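A representative record, with field names taken from the FileTransferAuditRecord type later in this diff (all values illustrative):

```json
{"timestamp":"2026-04-29T06:13:00.000Z","op":"file.fetch","nodeId":"node-abc123","nodeDisplayName":"Office iMac","requestedPath":"/tmp/report.pdf","canonicalPath":"/private/tmp/report.pdf","decision":"allowed","sizeBytes":48211,"sha256":"9f86d08…","durationMs":212}
```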

Plugin layout
-------------
extensions/file-transfer/
  index.ts                       definePluginEntry, nodeHostCommands
  openclaw.plugin.json           contracts.tools registration
  package.json
  src/node-host/{file-fetch,dir-list,dir-fetch,file-write}.ts
  src/tools/{file-fetch,dir-list,dir-fetch,file-write}-tool.ts
  src/shared/
    mime.ts        single-source extension->MIME map + image/text sets
    errors.ts      shared error code enum and helpers
    params.ts      shared param-validation helpers + GatewayCallOptions
    policy.ts      evaluateFilePolicy, persistAllowAlways
    approval.ts    plugin.approval.request wrapper
    gatekeep.ts    one-stop policy + approval + audit orchestrator
    audit.ts       JSONL audit sink

Core touch points
-----------------
- src/infra/node-commands.ts: NODE_FILE_FETCH_COMMAND,
  NODE_DIR_LIST_COMMAND, NODE_DIR_FETCH_COMMAND,
  NODE_FILE_WRITE_COMMAND, NODE_FILE_COMMANDS array
- src/gateway/node-command-policy.ts: all four added to
  DEFAULT_DANGEROUS_NODE_COMMANDS
- src/security/audit-extra.sync.ts: audit detail mentions file ops
- src/agents/tools/nodes-tool-media.ts: MEDIA_INVOKE_ACTIONS entry
  for file.fetch redirects raw nodes(action=invoke) callers to the
  dedicated file_fetch tool to prevent base64 context bloat
- src/agents/tools/nodes-tool.ts: nodes tool description points to
  the dedicated file_fetch tool

Known limitations / follow-ups
------------------------------
- No tests in this PR. For a security-sensitive surface this is a
  gap; will follow up with a test pass.
- Direct CLI invocation (openclaw nodes invoke --command file.fetch)
  bypasses the plugin policy entirely. Plugin-side gating is the
  realistic threat model (agent on iMessage requesting paths it
  shouldn't), but for true defense-in-depth, policy belongs in the
  gateway-side node.invoke dispatch. Move-policy-to-core is a
  separate PR.
- file_watch (long-lived filesystem event subscription) is not
  included; it needs a new node-protocol primitive for streaming
  event channels and was descoped from this PR.
- dir_fetch supports only includeDotfiles: true for now; on macOS,
  BSD tar --exclude patterns for dotfiles reliably collapse the
  archive to empty. Reliable filtering needs a
  `find ! -name ".*" | tar -T -` pipeline; deferred.
- dir_fetch du -sk preflight is a heuristic (du * 4 vs maxBytes);
  the mid-stream byte cap is the actual safety net.
This commit is contained in:
Omar Shahine
2026-04-29 06:13:00 +00:00
parent 5cc834a11a
commit 844f3d62e9
23 changed files with 2764 additions and 2 deletions


@@ -0,0 +1,65 @@
import {
  definePluginEntry,
  type AnyAgentTool,
  type OpenClawPluginNodeHostCommand,
} from "openclaw/plugin-sdk/plugin-entry";
import { handleDirFetch } from "./src/node-host/dir-fetch.js";
import { handleDirList } from "./src/node-host/dir-list.js";
import { handleFileFetch } from "./src/node-host/file-fetch.js";
import { handleFileWrite } from "./src/node-host/file-write.js";
import { createDirFetchTool } from "./src/tools/dir-fetch-tool.js";
import { createDirListTool } from "./src/tools/dir-list-tool.js";
import { createFileFetchTool } from "./src/tools/file-fetch-tool.js";
import { createFileWriteTool } from "./src/tools/file-write-tool.js";

const fileTransferNodeHostCommands: OpenClawPluginNodeHostCommand[] = [
  {
    command: "file.fetch",
    cap: "file",
    handle: async (paramsJSON) => {
      const params = paramsJSON ? JSON.parse(paramsJSON) : {};
      const result = await handleFileFetch(params);
      return JSON.stringify(result);
    },
  },
  {
    command: "dir.list",
    cap: "file",
    handle: async (paramsJSON) => {
      const params = paramsJSON ? JSON.parse(paramsJSON) : {};
      const result = await handleDirList(params);
      return JSON.stringify(result);
    },
  },
  {
    command: "dir.fetch",
    cap: "file",
    handle: async (paramsJSON) => {
      const params = paramsJSON ? JSON.parse(paramsJSON) : {};
      const result = await handleDirFetch(params);
      return JSON.stringify(result);
    },
  },
  {
    command: "file.write",
    cap: "file",
    handle: async (paramsJSON) => {
      const params = paramsJSON ? JSON.parse(paramsJSON) : {};
      const result = await handleFileWrite(params);
      return JSON.stringify(result);
    },
  },
];

export default definePluginEntry({
  id: "file-transfer",
  name: "File Transfer",
  description: "Fetch, list, and write files on paired nodes via dedicated node commands.",
  nodeHostCommands: fileTransferNodeHostCommands,
  register(api) {
    api.registerTool(createFileFetchTool() as AnyAgentTool);
    api.registerTool(createDirListTool() as AnyAgentTool);
    api.registerTool(createDirFetchTool() as AnyAgentTool);
    api.registerTool(createFileWriteTool() as AnyAgentTool);
  },
});


@@ -0,0 +1,17 @@
{
  "id": "file-transfer",
  "activation": {
    "onStartup": true
  },
  "enabledByDefault": true,
  "name": "File Transfer",
  "description": "Fetch, list, and write files on paired nodes via dedicated node commands. Bypasses bash stdout truncation by using base64 over node.invoke for binaries up to 16 MB.",
  "contracts": {
    "tools": ["file_fetch", "dir_list", "dir_fetch", "file_write"]
  },
  "configSchema": {
    "type": "object",
    "additionalProperties": false,
    "properties": {}
  }
}


@@ -0,0 +1,17 @@
{
  "name": "@openclaw/file-transfer",
  "version": "2026.4.27",
  "description": "OpenClaw file transfer plugin (file_fetch, dir_list, dir_fetch, file_write)",
  "type": "module",
  "devDependencies": {
    "@openclaw/plugin-sdk": "workspace:*"
  },
  "openclaw": {
    "extensions": [
      "./index.ts"
    ],
    "bundle": {
      "stageRuntimeDependencies": false
    }
  }
}


@@ -0,0 +1,263 @@
import { spawn, spawnSync } from "node:child_process";
import crypto from "node:crypto";
import fs from "node:fs/promises";
import path from "node:path";
export const DIR_FETCH_HARD_MAX_BYTES = 16 * 1024 * 1024;
export const DIR_FETCH_DEFAULT_MAX_BYTES = 8 * 1024 * 1024;
export type DirFetchParams = {
path?: unknown;
maxBytes?: unknown;
includeDotfiles?: unknown;
};
export type DirFetchOk = {
ok: true;
path: string;
tarBase64: string;
tarBytes: number;
sha256: string;
fileCount: number;
};
export type DirFetchErrCode =
| "INVALID_PATH"
| "NOT_FOUND"
| "IS_FILE"
| "TREE_TOO_LARGE"
| "READ_ERROR";
export type DirFetchErr = {
ok: false;
code: DirFetchErrCode;
message: string;
canonicalPath?: string;
};
export type DirFetchResult = DirFetchOk | DirFetchErr;
function clampMaxBytes(input: unknown): number {
if (typeof input !== "number" || !Number.isFinite(input) || input <= 0) {
return DIR_FETCH_DEFAULT_MAX_BYTES;
}
return Math.min(Math.floor(input), DIR_FETCH_HARD_MAX_BYTES);
}
function classifyFsError(err: unknown): DirFetchErrCode {
const code = (err as { code?: string } | null)?.code;
if (code === "ENOENT") {
return "NOT_FOUND";
}
return "READ_ERROR";
}
async function preflightDu(dirPath: string, maxBytes: number): Promise<boolean> {
// du -sk reports size in 1 KiB blocks on both macOS and Linux (-k pins
// the block size; without it, macOS defaults to 512-byte blocks).
// Allow up to maxBytes * 4 on disk as a rough ceiling (generous, since
// gzip compresses the tree).
const heuristicKb = Math.ceil((maxBytes * 4) / 1024);
return new Promise((resolve) => {
const du = spawn("du", ["-sk", dirPath], { stdio: ["ignore", "pipe", "ignore"] });
let output = "";
du.stdout.on("data", (chunk: Buffer) => {
output += chunk.toString();
});
du.on("close", (code) => {
if (code !== 0) {
// du failed; be permissive and let tar catch the overflow
resolve(true);
return;
}
const match = /^(\d+)/.exec(output.trim());
if (!match) {
resolve(true);
return;
}
const sizeKb = parseInt(match[1], 10);
resolve(sizeKb <= heuristicKb);
});
du.on("error", () => {
// du not available; skip preflight
resolve(true);
});
});
}
function countTarEntries(tarBuffer: Buffer): number {
const result = spawnSync("tar", ["-tzf", "-"], {
input: tarBuffer,
maxBuffer: 32 * 1024 * 1024,
timeout: 10000,
});
if (result.status !== 0 || !result.stdout) {
return 0;
}
const lines = (result.stdout as Buffer)
.toString("utf-8")
.split("\n")
.filter((l) => l.trim().length > 0 && l !== "./");
return lines.length;
}
export async function handleDirFetch(params: DirFetchParams): Promise<DirFetchResult> {
const requestedPath = params.path;
if (typeof requestedPath !== "string" || requestedPath.length === 0) {
return { ok: false, code: "INVALID_PATH", message: "path required" };
}
if (requestedPath.includes("\0")) {
return { ok: false, code: "INVALID_PATH", message: "path contains NUL byte" };
}
if (!path.isAbsolute(requestedPath)) {
return { ok: false, code: "INVALID_PATH", message: "path must be absolute" };
}
const maxBytes = clampMaxBytes(params.maxBytes);
const includeDotfiles = params.includeDotfiles === true;
let canonical: string;
try {
canonical = await fs.realpath(requestedPath);
} catch (err) {
const code = classifyFsError(err);
return {
ok: false,
code,
message: code === "NOT_FOUND" ? "directory not found" : `realpath failed: ${String(err)}`,
};
}
let stats: Awaited<ReturnType<typeof fs.stat>>;
try {
stats = await fs.stat(canonical);
} catch (err) {
const code = classifyFsError(err);
return { ok: false, code, message: `stat failed: ${String(err)}`, canonicalPath: canonical };
}
if (!stats.isDirectory()) {
return {
ok: false,
code: "IS_FILE",
message: "path is not a directory",
canonicalPath: canonical,
};
}
// Preflight size check using du
const withinBudget = await preflightDu(canonical, maxBytes);
if (!withinBudget) {
return {
ok: false,
code: "TREE_TOO_LARGE",
message: `directory tree exceeds estimated size limit (${maxBytes} bytes raw)`,
canonicalPath: canonical,
};
}
// Build tar args. Pin the system tar at /usr/bin/tar on POSIX rather
// than resolving via PATH.
// -cz: create + gzip
// -C <dir>: change to directory so paths in archive are relative
// .: include everything from that directory
// v1: includeDotfiles is accepted in the API but not enforced. BSD tar's
// --exclude pattern matching is unreliable for dotfiles (every plausible
// pattern except "*/.*" collapses the archive on macOS). Reliable filtering
// requires a `find ! -name '.*' | tar -T -` pipeline; deferred to v2.
// For now we always archive everything in the directory.
void includeDotfiles;
const tarArgs: string[] = ["-czf", "-", "-C", canonical, "."];
// Capture tar output with a hard byte cap and a wall-clock timeout.
// SIGTERM if the byte cap is exceeded; SIGKILL if the timeout fires
// (covers tar hanging on a slow filesystem or symlink loop).
const TAR_HARD_TIMEOUT_MS = 60_000;
const tarBuffer = await new Promise<Buffer | "TOO_LARGE" | "TIMEOUT" | "ERROR">((resolve) => {
const tarBin = process.platform !== "win32" ? "/usr/bin/tar" : "tar";
const child = spawn(tarBin, tarArgs, {
stdio: ["ignore", "pipe", "pipe"],
});
const chunks: Buffer[] = [];
let totalBytes = 0;
let aborted = false;
const watchdog = setTimeout(() => {
if (aborted) return;
aborted = true;
try {
child.kill("SIGKILL");
} catch {
/* already gone */
}
resolve("TIMEOUT");
}, TAR_HARD_TIMEOUT_MS);
child.stdout.on("data", (chunk: Buffer) => {
if (aborted) return;
totalBytes += chunk.byteLength;
if (totalBytes > maxBytes) {
aborted = true;
clearTimeout(watchdog);
child.kill("SIGTERM");
resolve("TOO_LARGE");
return;
}
chunks.push(chunk);
});
child.on("close", (code) => {
clearTimeout(watchdog);
if (aborted) return;
if (code !== 0) {
resolve("ERROR");
return;
}
resolve(Buffer.concat(chunks));
});
child.on("error", () => {
clearTimeout(watchdog);
if (!aborted) {
resolve("ERROR");
}
});
});
if (tarBuffer === "TOO_LARGE") {
return {
ok: false,
code: "TREE_TOO_LARGE",
message: `tarball exceeded ${maxBytes} byte limit mid-stream`,
canonicalPath: canonical,
};
}
if (tarBuffer === "TIMEOUT") {
return {
ok: false,
code: "READ_ERROR",
message: "tar command exceeded 60s wall-clock timeout (slow filesystem or symlink loop?)",
canonicalPath: canonical,
};
}
if (tarBuffer === "ERROR") {
return {
ok: false,
code: "READ_ERROR",
message: "tar command failed",
canonicalPath: canonical,
};
}
const sha256 = crypto.createHash("sha256").update(tarBuffer).digest("hex");
const tarBase64 = tarBuffer.toString("base64");
const tarBytes = tarBuffer.byteLength;
const fileCount = countTarEntries(tarBuffer);
return {
ok: true,
path: canonical,
tarBase64,
tarBytes,
sha256,
fileCount,
};
}


@@ -0,0 +1,166 @@
import fs from "node:fs/promises";
import path from "node:path";
import { mimeFromExtension } from "../shared/mime.js";
export const DIR_LIST_DEFAULT_MAX_ENTRIES = 200;
export const DIR_LIST_HARD_MAX_ENTRIES = 5000;
export type DirListParams = {
path?: unknown;
pageToken?: unknown;
maxEntries?: unknown;
};
export type DirListEntry = {
name: string;
path: string;
size: number;
mimeType: string;
isDir: boolean;
mtime: number;
};
export type DirListOk = {
ok: true;
path: string;
entries: DirListEntry[];
nextPageToken?: string;
truncated: boolean;
};
export type DirListErrCode =
| "INVALID_PATH"
| "NOT_FOUND"
| "PERMISSION_DENIED"
| "IS_FILE"
| "READ_ERROR";
export type DirListErr = {
ok: false;
code: DirListErrCode;
message: string;
canonicalPath?: string;
};
export type DirListResult = DirListOk | DirListErr;
function clampMaxEntries(input: unknown): number {
if (typeof input !== "number" || !Number.isFinite(input) || input <= 0) {
return DIR_LIST_DEFAULT_MAX_ENTRIES;
}
return Math.min(Math.floor(input), DIR_LIST_HARD_MAX_ENTRIES);
}
function classifyFsError(err: unknown): DirListErrCode {
const code = (err as { code?: string } | null)?.code;
if (code === "ENOENT") {
return "NOT_FOUND";
}
if (code === "EACCES" || code === "EPERM") {
return "PERMISSION_DENIED";
}
return "READ_ERROR";
}
export async function handleDirList(params: DirListParams): Promise<DirListResult> {
const requestedPath = params.path;
if (typeof requestedPath !== "string" || requestedPath.length === 0) {
return { ok: false, code: "INVALID_PATH", message: "path required" };
}
if (requestedPath.includes("\0")) {
return { ok: false, code: "INVALID_PATH", message: "path contains NUL byte" };
}
if (!path.isAbsolute(requestedPath)) {
return { ok: false, code: "INVALID_PATH", message: "path must be absolute" };
}
const maxEntries = clampMaxEntries(params.maxEntries);
const offset =
typeof params.pageToken === "string" && params.pageToken.length > 0
? Math.max(0, parseInt(params.pageToken, 10) || 0)
: 0;
let canonical: string;
try {
canonical = await fs.realpath(requestedPath);
} catch (err) {
const code = classifyFsError(err);
return {
ok: false,
code,
message: code === "NOT_FOUND" ? "path not found" : `realpath failed: ${String(err)}`,
};
}
let stats: Awaited<ReturnType<typeof fs.stat>>;
try {
stats = await fs.stat(canonical);
} catch (err) {
const code = classifyFsError(err);
return { ok: false, code, message: `stat failed: ${String(err)}`, canonicalPath: canonical };
}
if (!stats.isDirectory()) {
return {
ok: false,
code: "IS_FILE",
message: "path is not a directory",
canonicalPath: canonical,
};
}
let names: string[];
try {
names = await fs.readdir(canonical, { encoding: "utf8" });
} catch (err) {
const code = classifyFsError(err);
return {
ok: false,
code,
message: `readdir failed: ${String(err)}`,
canonicalPath: canonical,
};
}
// Sort by name for stable pagination
names.sort((a, b) => a.localeCompare(b));
const total = names.length;
const page = names.slice(offset, offset + maxEntries);
const truncated = offset + maxEntries < total;
const nextPageToken = truncated ? String(offset + maxEntries) : undefined;
const entries: DirListEntry[] = [];
for (const name of page) {
const entryPath = path.join(canonical, name);
let isDir = false;
let size = 0;
let mtime = 0;
try {
const s = await fs.stat(entryPath);
isDir = s.isDirectory();
size = isDir ? 0 : s.size;
mtime = s.mtimeMs;
} catch {
// stat may fail for broken symlinks; keep zeros and treat as file
}
entries.push({
name,
path: entryPath,
size,
mimeType: isDir ? "inode/directory" : mimeFromExtension(name),
isDir,
mtime,
});
}
return {
ok: true,
path: canonical,
entries,
nextPageToken,
truncated,
};
}


@@ -0,0 +1,170 @@
import { spawnSync } from "node:child_process";
import crypto from "node:crypto";
import fs from "node:fs/promises";
import path from "node:path";
import { EXTENSION_MIME } from "../shared/mime.js";
export const FILE_FETCH_HARD_MAX_BYTES = 16 * 1024 * 1024;
export const FILE_FETCH_DEFAULT_MAX_BYTES = 8 * 1024 * 1024;
export type FileFetchParams = {
path?: unknown;
maxBytes?: unknown;
};
export type FileFetchOk = {
ok: true;
path: string;
size: number;
mimeType: string;
base64: string;
sha256: string;
};
export type FileFetchErrCode =
| "INVALID_PATH"
| "NOT_FOUND"
| "PERMISSION_DENIED"
| "IS_DIRECTORY"
| "FILE_TOO_LARGE"
| "PATH_TRAVERSAL"
| "READ_ERROR";
export type FileFetchErr = {
ok: false;
code: FileFetchErrCode;
message: string;
canonicalPath?: string;
};
export type FileFetchResult = FileFetchOk | FileFetchErr;
function detectMimeType(filePath: string): string {
if (process.platform !== "win32") {
try {
const result = spawnSync("file", ["-b", "--mime-type", filePath], {
encoding: "utf-8",
timeout: 2000,
});
const stdout = result.stdout?.trim();
if (result.status === 0 && stdout) {
return stdout;
}
} catch {
// fall through to extension fallback
}
}
const ext = path.extname(filePath).toLowerCase();
return EXTENSION_MIME[ext] ?? "application/octet-stream";
}
function clampMaxBytes(input: unknown): number {
if (typeof input !== "number" || !Number.isFinite(input) || input <= 0) {
return FILE_FETCH_DEFAULT_MAX_BYTES;
}
return Math.min(Math.floor(input), FILE_FETCH_HARD_MAX_BYTES);
}
function classifyFsError(err: unknown): FileFetchErrCode {
const code = (err as { code?: string } | null)?.code;
if (code === "ENOENT") {
return "NOT_FOUND";
}
if (code === "EACCES" || code === "EPERM") {
return "PERMISSION_DENIED";
}
if (code === "EISDIR") {
return "IS_DIRECTORY";
}
return "READ_ERROR";
}
export async function handleFileFetch(params: FileFetchParams): Promise<FileFetchResult> {
const requestedPath = params.path;
if (typeof requestedPath !== "string" || requestedPath.length === 0) {
return { ok: false, code: "INVALID_PATH", message: "path required" };
}
if (requestedPath.includes("\0")) {
return { ok: false, code: "INVALID_PATH", message: "path contains NUL byte" };
}
if (!path.isAbsolute(requestedPath)) {
return { ok: false, code: "INVALID_PATH", message: "path must be absolute" };
}
const maxBytes = clampMaxBytes(params.maxBytes);
let canonical: string;
try {
canonical = await fs.realpath(requestedPath);
} catch (err) {
const code = classifyFsError(err);
return {
ok: false,
code,
message: code === "NOT_FOUND" ? "file not found" : `realpath failed: ${String(err)}`,
};
}
let stats: Awaited<ReturnType<typeof fs.stat>>;
try {
stats = await fs.stat(canonical);
} catch (err) {
const code = classifyFsError(err);
return { ok: false, code, message: `stat failed: ${String(err)}`, canonicalPath: canonical };
}
if (stats.isDirectory()) {
return {
ok: false,
code: "IS_DIRECTORY",
message: "path is a directory",
canonicalPath: canonical,
};
}
if (!stats.isFile()) {
return {
ok: false,
code: "READ_ERROR",
message: "path is not a regular file",
canonicalPath: canonical,
};
}
if (stats.size > maxBytes) {
return {
ok: false,
code: "FILE_TOO_LARGE",
message: `file size ${stats.size} exceeds limit ${maxBytes}`,
canonicalPath: canonical,
};
}
let buffer: Buffer;
try {
buffer = await fs.readFile(canonical);
} catch (err) {
const code = classifyFsError(err);
return { ok: false, code, message: `read failed: ${String(err)}`, canonicalPath: canonical };
}
if (buffer.byteLength > maxBytes) {
return {
ok: false,
code: "FILE_TOO_LARGE",
message: `read ${buffer.byteLength} bytes exceeds limit ${maxBytes}`,
canonicalPath: canonical,
};
}
const sha256 = crypto.createHash("sha256").update(buffer).digest("hex");
const base64 = buffer.toString("base64");
const mimeType = detectMimeType(canonical);
return {
ok: true,
path: canonical,
size: buffer.byteLength,
mimeType,
base64,
sha256,
};
}


@@ -0,0 +1,208 @@
import crypto from "node:crypto";
import fs from "node:fs/promises";
import path from "node:path";
const MAX_CONTENT_BYTES = 16 * 1024 * 1024; // 16 MB
type FileWriteParams = {
path: string;
contentBase64: string;
overwrite: boolean;
createParents: boolean;
expectedSha256?: string;
};
type FileWriteSuccess = {
ok: true;
path: string;
size: number;
sha256: string;
overwritten: boolean;
};
type FileWriteError = {
ok: false;
code: string;
message: string;
canonicalPath?: string;
};
type FileWriteResult = FileWriteSuccess | FileWriteError;
function sha256Hex(buf: Buffer): string {
return crypto.createHash("sha256").update(buf).digest("hex");
}
function err(code: string, message: string, canonicalPath?: string): FileWriteError {
return { ok: false, code, message, ...(canonicalPath ? { canonicalPath } : {}) };
}
export async function handleFileWrite(
params: Partial<FileWriteParams> & Record<string, unknown>,
): Promise<FileWriteResult> {
const rawPath = typeof params?.path === "string" ? params.path : "";
const contentBase64 = typeof params?.contentBase64 === "string" ? params.contentBase64 : "";
const overwrite = params?.overwrite === true;
const createParents = params?.createParents === true;
const expectedSha256 =
typeof params?.expectedSha256 === "string" ? params.expectedSha256 : undefined;
// 1. Validate path: must be absolute, non-empty, no NUL byte
if (!rawPath) {
return err("INVALID_PATH", "path is required");
}
if (rawPath.includes("\0")) {
return err("INVALID_PATH", "path must not contain NUL bytes");
}
if (!path.isAbsolute(rawPath)) {
return err("INVALID_PATH", "path must be absolute");
}
// 2. Decode base64 → Buffer
// Buffer.from never throws on malformed base64; it silently skips
// invalid characters. Detect garbage input by checking for an empty
// decode of a non-empty string (standard base64 decodes fine with or
// without padding either way).
const buf = Buffer.from(contentBase64, "base64");
if (buf.length === 0 && contentBase64.length > 0) {
return err("INVALID_BASE64", "contentBase64 is not valid base64");
}
if (buf.length > MAX_CONTENT_BYTES) {
return err(
"FILE_TOO_LARGE",
`decoded content is ${buf.length} bytes; maximum is ${MAX_CONTENT_BYTES} bytes (16 MB)`,
);
}
// 3. Resolve parent dir
const targetPath = path.normalize(rawPath);
const parentDir = path.dirname(targetPath);
let parentExists = false;
try {
await fs.access(parentDir);
parentExists = true;
} catch {
parentExists = false;
}
if (!parentExists) {
if (!createParents) {
return err("PARENT_NOT_FOUND", `parent directory does not exist: ${parentDir}`);
}
try {
await fs.mkdir(parentDir, { recursive: true });
} catch (mkdirErr) {
const message = mkdirErr instanceof Error ? mkdirErr.message : String(mkdirErr);
return err("WRITE_ERROR", `failed to create parent directories: ${message}`);
}
}
// 4. Refuse to write through symlinks (lstat sees the link itself, not
// its target). A path that's a symlink could escape the operator's
// intended path policy — e.g., an allowed dir could contain a
// symlink pointing at /etc/hosts.
// Otherwise determine overwritten status and reject directories.
let overwritten = false;
try {
const existingLStat = await fs.lstat(targetPath);
if (existingLStat.isSymbolicLink()) {
return err(
"SYMLINK_TARGET_DENIED",
`path is a symlink; refusing to write through it: ${targetPath}`,
);
}
if (existingLStat.isDirectory()) {
return err("IS_DIRECTORY", `path resolves to a directory: ${targetPath}`);
}
if (!overwrite) {
return err(
"EXISTS_NO_OVERWRITE",
`file already exists and overwrite is false: ${targetPath}`,
);
}
overwritten = true;
} catch (statErr: unknown) {
// ENOENT is fine — file does not exist yet
if ((statErr as NodeJS.ErrnoException).code !== "ENOENT") {
const message = statErr instanceof Error ? statErr.message : String(statErr);
if (message.toLowerCase().includes("permission")) {
return err("PERMISSION_DENIED", `permission denied: ${targetPath}`);
}
return err("WRITE_ERROR", `unexpected stat error: ${message}`);
}
}
// 5. Atomic write: write to tmp, then rename
const tmpSuffix = crypto.randomBytes(8).toString("hex");
const tmpPath = `${targetPath}.${tmpSuffix}.tmp`;
try {
await fs.writeFile(tmpPath, buf);
} catch (writeErr) {
const message = writeErr instanceof Error ? writeErr.message : String(writeErr);
// Clean up tmp if possible
await fs.unlink(tmpPath).catch(() => {});
if (message.toLowerCase().includes("permission") || message.toLowerCase().includes("access")) {
return err("PERMISSION_DENIED", `permission denied writing to: ${parentDir}`);
}
return err("WRITE_ERROR", `failed to write file: ${message}`);
}
try {
await fs.rename(tmpPath, targetPath);
} catch (renameErr) {
const message = renameErr instanceof Error ? renameErr.message : String(renameErr);
await fs.unlink(tmpPath).catch(() => {});
if (message.toLowerCase().includes("permission") || message.toLowerCase().includes("access")) {
return err("PERMISSION_DENIED", `permission denied renaming to: ${targetPath}`);
}
return err("WRITE_ERROR", `failed to rename tmp to target: ${message}`);
}
// 6. Compute sha256 from the buffer we just wrote, NOT from a re-read.
// A re-read would race against any concurrent process that overwrote
// the file between rename and read — we'd compute the wrong hash and
// either approve a corrupted file or unlink someone else's data on
// a false mismatch. Buffer-side sha256 is what the caller actually
// asked us to write.
const computedSha256 = sha256Hex(buf);
// 7. Integrity check against the optional expectedSha256 — this is now
// a redundancy check (if expectedSha256 differs from what we hashed
// of the input buffer, the caller mis-encoded contentBase64). The
// file already exists at this point; on mismatch we unlink to avoid
// leaving a file the caller didn't intend.
if (expectedSha256 && expectedSha256.toLowerCase() !== computedSha256) {
await fs.unlink(targetPath).catch(() => {});
return err(
"INTEGRITY_FAILURE",
`sha256 mismatch: expected ${expectedSha256.toLowerCase()}, got ${computedSha256}`,
targetPath,
);
}
const writtenBuf = buf;
// 8. Re-realpath to resolve any symlinks in the final path
let canonicalPath = targetPath;
try {
canonicalPath = await fs.realpath(targetPath);
} catch {
// Best effort; use normalized path as fallback
canonicalPath = targetPath;
}
return {
ok: true,
path: canonicalPath,
size: writtenBuf.length,
sha256: computedSha256,
overwritten,
};
}


@@ -0,0 +1,89 @@
// Approval-flow wrappers around the generic plugin.approval.request /
// plugin.approval.waitDecision gateway methods.
//
// Used by the file-transfer policy gate when ask=on-miss/always: the
// operator sees a modal in their macOS/iOS app with allow-once /
// allow-always / deny.
import { callGatewayTool } from "openclaw/plugin-sdk/agent-harness-runtime";
import type { GatewayCallOptions } from "./params.js";
export type ApprovalDecision = "allow-once" | "allow-always" | "deny";
export type ApprovalOutcome = {
decision: ApprovalDecision | null;
approvalId?: string;
};
const DEFAULT_APPROVAL_TIMEOUT_MS = 120_000;
const PLUGIN_ID = "file-transfer";
/**
* Issue a two-phase plugin.approval.request and wait for the operator's
* decision. Returns null decision if the request was unavailable (e.g.,
* no operator client connected) — caller should then fall back to deny.
*/
export async function requestFileTransferApproval(input: {
gatewayOpts: GatewayCallOptions;
title: string;
description: string;
severity?: "info" | "warning" | "critical";
toolName: string;
toolCallId?: string;
agentId?: string;
sessionKey?: string;
timeoutMs?: number;
}): Promise<ApprovalOutcome> {
const timeoutMs = input.timeoutMs ?? DEFAULT_APPROVAL_TIMEOUT_MS;
type RequestResult = { id?: string; decision?: ApprovalDecision | null };
const requestResult = (await callGatewayTool(
"plugin.approval.request",
{
...input.gatewayOpts,
timeoutMs: timeoutMs + 10_000,
},
{
pluginId: PLUGIN_ID,
title: input.title.slice(0, 80),
description: input.description.slice(0, 256),
severity: input.severity ?? "warning",
toolName: input.toolName,
toolCallId: input.toolCallId,
agentId: input.agentId,
sessionKey: input.sessionKey,
timeoutMs,
twoPhase: true,
},
)) as RequestResult | undefined;
if (!requestResult || requestResult.decision === null) {
// Approval system explicitly declined or no operator available.
return { decision: null };
}
// Two-phase: if the request returned a synchronous decision, use it;
// otherwise wait on the approval id.
if (requestResult.decision) {
return { decision: requestResult.decision, approvalId: requestResult.id };
}
if (!requestResult.id) {
return { decision: null };
}
type WaitResult = { id?: string; decision?: ApprovalDecision | null };
const waitResult = (await callGatewayTool(
"plugin.approval.waitDecision",
{
...input.gatewayOpts,
timeoutMs: timeoutMs + 10_000,
},
{ id: requestResult.id },
)) as WaitResult | undefined;
return {
decision: waitResult?.decision ?? null,
approvalId: requestResult.id,
};
}
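The branching above (synchronous decision, wait on the returned id, or fall back to deny) can be condensed into a pure function for illustration. `nextStep` and `SyncResult` are hypothetical names for this sketch, not part of the plugin surface:

```typescript
// Pure sketch of the two-phase branching in requestFileTransferApproval:
// a request either resolves synchronously, hands back an id to wait on,
// or yields nothing actionable (treated as deny by the caller).
type SyncResult = { id?: string; decision?: "allow-once" | "allow-always" | "deny" | null };

type NextStep =
  | { kind: "use-sync"; decision: string } // request resolved immediately
  | { kind: "wait"; id: string }           // poll plugin.approval.waitDecision
  | { kind: "deny-fallback" };             // no operator / explicit null

function nextStep(r: SyncResult | undefined): NextStep {
  if (!r || r.decision === null) return { kind: "deny-fallback" };
  if (r.decision) return { kind: "use-sync", decision: r.decision };
  if (!r.id) return { kind: "deny-fallback" };
  return { kind: "wait", id: r.id };
}
```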


@@ -0,0 +1,84 @@
// Append-only audit log for file-transfer operations.
//
// Records every decision (allow/deny/error) at the gateway-side tool
// layer. Lands at ~/.openclaw/audit/file-transfer.jsonl. Rotation is
// caller's responsibility — the file grows unbounded.
//
// Log records do NOT include file contents. They DO include canonical
// paths and the sha256 of each payload, so treat the audit file itself
// as sensitive.
import fs from "node:fs/promises";
import os from "node:os";
import path from "node:path";
export type FileTransferAuditOp = "file.fetch" | "dir.list" | "dir.fetch" | "file.write";
export type FileTransferAuditDecision =
| "allowed"
| "allowed:once"
| "allowed:always"
| "denied:no_policy"
| "denied:policy"
| "denied:approval"
| "denied:command_not_allowed"
| "denied:symlink_escape"
| "error";
export type FileTransferAuditRecord = {
timestamp: string;
op: FileTransferAuditOp;
nodeId: string;
nodeDisplayName?: string;
requestedPath: string;
canonicalPath?: string;
decision: FileTransferAuditDecision;
errorCode?: string;
errorMessage?: string;
sizeBytes?: number;
sha256?: string;
durationMs?: number;
// Tying back to the agent that initiated the op
requesterAgentId?: string;
sessionKey?: string;
// Reason text for denials
reason?: string;
};
let auditDirPromise: Promise<string> | null = null;
async function ensureAuditDir(): Promise<string> {
if (auditDirPromise) {
return auditDirPromise;
}
auditDirPromise = (async () => {
const dir = path.join(os.homedir(), ".openclaw", "audit");
await fs.mkdir(dir, { recursive: true, mode: 0o700 });
return dir;
})();
return auditDirPromise;
}
function auditFilePath(dir: string): string {
return path.join(dir, "file-transfer.jsonl");
}
/**
* Append an audit record. Best-effort — failures are logged to stderr and
* never propagated to the caller (the caller's operation is the source of
* truth, not the audit write).
*/
export async function appendFileTransferAudit(
record: Omit<FileTransferAuditRecord, "timestamp">,
): Promise<void> {
try {
const dir = await ensureAuditDir();
const line = `${JSON.stringify({
timestamp: new Date().toISOString(),
...record,
})}\n`;
await fs.appendFile(auditFilePath(dir), line, { mode: 0o600 });
} catch (e) {
process.stderr.write(`[file-transfer:audit] append failed: ${String(e)}\n`);
}
}
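Because the audit log is plain JSONL, a consumer can parse it line by line and skip a torn final line from an interrupted append. A minimal reader sketch — `parseAuditLines` is a hypothetical helper, not something the plugin exports:

```typescript
// Parse an audit .jsonl blob into records, dropping malformed lines
// (e.g. a partial final line from a crashed append).
function parseAuditLines(text: string): Record<string, unknown>[] {
  const records: Record<string, unknown>[] = [];
  for (const line of text.split("\n")) {
    if (!line.trim()) continue;
    try {
      records.push(JSON.parse(line) as Record<string, unknown>);
    } catch {
      // Malformed line: skip rather than fail the whole read.
    }
  }
  return records;
}
```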


@@ -0,0 +1,68 @@
// Shared error code surface across the four file-transfer tools/handlers.
// Every tool returns the same { ok: false, code, message, canonicalPath? }
// shape so the model can reason about errors uniformly.
export type FileTransferErrCode =
// Path-shape errors (caller's fault)
| "INVALID_PATH"
| "INVALID_BASE64"
| "INVALID_PARAMS"
// Filesystem errors (file/dir layer)
| "NOT_FOUND"
| "PERMISSION_DENIED"
| "IS_DIRECTORY"
| "IS_FILE"
| "PARENT_NOT_FOUND"
| "EXISTS_NO_OVERWRITE"
| "READ_ERROR"
| "WRITE_ERROR"
// Size/limit errors
| "FILE_TOO_LARGE"
| "TREE_TOO_LARGE"
// Safety errors
| "PATH_TRAVERSAL"
| "SYMLINK_TARGET_DENIED"
| "INTEGRITY_FAILURE"
// Policy errors (gateway-side)
| "POLICY_DENIED"
| "NO_POLICY";
export type FileTransferErr = {
ok: false;
code: FileTransferErrCode;
message: string;
canonicalPath?: string;
};
export function err(
code: FileTransferErrCode,
message: string,
canonicalPath?: string,
): FileTransferErr {
return { ok: false, code, message, ...(canonicalPath ? { canonicalPath } : {}) };
}
// Translate a node-side fs error to a public error code.
export function classifyFsError(e: unknown): FileTransferErrCode {
const code = (e as { code?: string } | null)?.code;
if (code === "ENOENT") {
return "NOT_FOUND";
}
if (code === "EACCES" || code === "EPERM") {
return "PERMISSION_DENIED";
}
if (code === "EISDIR") {
return "IS_DIRECTORY";
}
return "READ_ERROR";
}
// Convert a node-host error payload to a thrown Error for agent-tool consumption.
// The agent-tool surfaces these as failed tool results uniformly.
export function throwFromNodePayload(operation: string, payload: Record<string, unknown>): never {
const code = typeof payload.code === "string" ? payload.code : "ERROR";
const message = typeof payload.message === "string" ? payload.message : `${operation} failed`;
const canonical =
typeof payload.canonicalPath === "string" ? ` (canonical=${payload.canonicalPath})` : "";
throw new Error(`${operation} ${code}: ${message}${canonical}`);
}
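For illustration, the errno mapping in `classifyFsError` can be exercised against synthetic errno-style objects. This standalone copy mirrors the function above so the example is self-contained; `classify` is a hypothetical local name:

```typescript
type ErrCode = "NOT_FOUND" | "PERMISSION_DENIED" | "IS_DIRECTORY" | "READ_ERROR";

// Mirrors classifyFsError above: map node errno codes onto the public
// error-code surface, defaulting to READ_ERROR for anything unrecognized.
function classify(e: unknown): ErrCode {
  const code = (e as { code?: string } | null)?.code;
  if (code === "ENOENT") return "NOT_FOUND";
  if (code === "EACCES" || code === "EPERM") return "PERMISSION_DENIED";
  if (code === "EISDIR") return "IS_DIRECTORY";
  return "READ_ERROR";
}
```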


@@ -0,0 +1,174 @@
// One-stop gatekeep: evaluate policy, prompt operator if needed, persist
// allow-always to config, audit the outcome. Returns a uniform decision
// the tool can act on.
import { requestFileTransferApproval } from "./approval.js";
import { appendFileTransferAudit, type FileTransferAuditOp } from "./audit.js";
import type { GatewayCallOptions } from "./params.js";
import { evaluateFilePolicy, persistAllowAlways, type FilePolicyKind } from "./policy.js";
export type GatekeepOutcome = { ok: true; maxBytes?: number } | { ok: false; throwMessage: string };
/**
* Single-call entry point used by every tool's execute() before it
* forwards to the node. Handles policy evaluation, optional
* plugin-approval prompt, persistence on allow-always, and the
* pre-flight audit log entry.
*
* Caller is responsible for the post-flight check (re-evaluate against
* canonicalPath returned by the node) — we don't have the canonical
* path here yet.
*/
export async function gatekeep(input: {
op: FileTransferAuditOp;
nodeId: string;
nodeDisplayName?: string;
kind: FilePolicyKind;
path: string;
toolCallId?: string;
agentId?: string;
sessionKey?: string;
gatewayOpts: GatewayCallOptions;
startedAt: number;
/** Operation-friendly label for the approval prompt, e.g. "Read file". */
promptVerb: string;
}): Promise<GatekeepOutcome> {
const decision = evaluateFilePolicy({
nodeId: input.nodeId,
nodeDisplayName: input.nodeDisplayName,
kind: input.kind,
path: input.path,
});
// Silent allow path.
if (decision.ok && decision.reason === "matched-allow") {
return { ok: true, maxBytes: decision.maxBytes };
}
// ask=always: prompt even on a match.
// Or: ask=on-miss + no allow match: prompt.
const shouldAsk =
(decision.ok && decision.reason === "ask-always") ||
(!decision.ok && decision.askable === true);
if (shouldAsk) {
const verb = input.promptVerb;
const subject = input.nodeDisplayName ?? input.nodeId;
const approval = await requestFileTransferApproval({
gatewayOpts: input.gatewayOpts,
title: `${verb}: ${input.path}`,
description: `Allow ${verb.toLowerCase()} on ${subject}\nPath: ${input.path}\nKind: ${input.kind}\n\n"allow-always" appends this exact path to allow${input.kind === "read" ? "Read" : "Write"}Paths.`,
severity: input.kind === "write" ? "warning" : "info",
toolName: input.op,
toolCallId: input.toolCallId,
agentId: input.agentId,
sessionKey: input.sessionKey,
});
if (approval.decision === "deny") {
await appendFileTransferAudit({
op: input.op,
nodeId: input.nodeId,
nodeDisplayName: input.nodeDisplayName,
requestedPath: input.path,
decision: "denied:approval",
reason: "operator denied",
durationMs: Date.now() - input.startedAt,
});
return {
ok: false,
throwMessage: `${input.op} APPROVAL_DENIED: operator denied the prompt`,
};
}
if (approval.decision === "allow-once") {
await appendFileTransferAudit({
op: input.op,
nodeId: input.nodeId,
nodeDisplayName: input.nodeDisplayName,
requestedPath: input.path,
decision: "allowed:once",
durationMs: Date.now() - input.startedAt,
});
return {
ok: true,
maxBytes: decision.ok ? decision.maxBytes : undefined,
};
}
if (approval.decision === "allow-always") {
try {
await persistAllowAlways({
nodeId: input.nodeId,
nodeDisplayName: input.nodeDisplayName,
kind: input.kind,
path: input.path,
});
} catch (e) {
// The approval is still valid for this call — failure to persist
// shouldn't block the operation. Just note it in the audit.
await appendFileTransferAudit({
op: input.op,
nodeId: input.nodeId,
nodeDisplayName: input.nodeDisplayName,
requestedPath: input.path,
decision: "allowed:always",
reason: `persist failed: ${String(e)}`,
durationMs: Date.now() - input.startedAt,
});
return {
ok: true,
maxBytes: decision.ok ? decision.maxBytes : undefined,
};
}
await appendFileTransferAudit({
op: input.op,
nodeId: input.nodeId,
nodeDisplayName: input.nodeDisplayName,
requestedPath: input.path,
decision: "allowed:always",
durationMs: Date.now() - input.startedAt,
});
return {
ok: true,
maxBytes: decision.ok ? decision.maxBytes : undefined,
};
}
// null decision: no operator available, treat as deny.
await appendFileTransferAudit({
op: input.op,
nodeId: input.nodeId,
nodeDisplayName: input.nodeDisplayName,
requestedPath: input.path,
decision: "denied:approval",
reason: "no operator available to approve",
durationMs: Date.now() - input.startedAt,
});
return {
ok: false,
throwMessage: `${input.op} APPROVAL_UNAVAILABLE: no operator client connected to approve the request`,
};
}
// Plain deny path.
if (!decision.ok) {
await appendFileTransferAudit({
op: input.op,
nodeId: input.nodeId,
nodeDisplayName: input.nodeDisplayName,
requestedPath: input.path,
decision: decision.code === "NO_POLICY" ? "denied:no_policy" : "denied:policy",
errorCode: decision.code,
reason: decision.reason,
durationMs: Date.now() - input.startedAt,
});
return {
ok: false,
throwMessage: `${input.op} ${decision.code}: ${decision.reason}`,
};
}
// Shouldn't reach here.
return { ok: true, maxBytes: undefined };
}


@@ -0,0 +1,53 @@
import path from "node:path";
// Single source of truth for extension→MIME mapping. Used by all four
// handlers/tools so adding a new extension lands everywhere at once.
export const EXTENSION_MIME: Record<string, string> = {
".png": "image/png",
".jpg": "image/jpeg",
".jpeg": "image/jpeg",
".webp": "image/webp",
".gif": "image/gif",
".bmp": "image/bmp",
".heic": "image/heic",
".heif": "image/heif",
".pdf": "application/pdf",
".txt": "text/plain",
".log": "text/plain",
".md": "text/markdown",
".json": "application/json",
".csv": "text/csv",
".html": "text/html",
".xml": "application/xml",
".zip": "application/zip",
".tar": "application/x-tar",
".gz": "application/gzip",
};
// MIME types we treat as inline-displayable images for vision-capable models.
// Note: heic/heif are detectable but not all providers can render them, so we
// leave them out of the inline-image set and let them flow as text+saved-path.
export const IMAGE_MIME_INLINE_SET = new Set([
"image/png",
"image/jpeg",
"image/webp",
"image/gif",
]);
// Plain-text MIME types where inlining the content into a text block is more
// useful than a "saved at <path>" stub for small files (under TEXT_INLINE_MAX_BYTES).
export const TEXT_INLINE_MIME_SET = new Set([
"text/plain",
"text/markdown",
"text/csv",
"text/html",
"application/json",
"application/xml",
]);
export const TEXT_INLINE_MAX_BYTES = 8 * 1024;
export function mimeFromExtension(filePath: string): string {
const ext = path.extname(filePath).toLowerCase();
return EXTENSION_MIME[ext] ?? "application/octet-stream";
}
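The lookup is case-insensitive (via `toLowerCase`) and falls back to `application/octet-stream`. A self-contained sketch of the same behavior with a trimmed table; `mimeOf` and `MIME` are illustrative names only:

```typescript
import path from "node:path";

// Trimmed copy of the EXTENSION_MIME lookup above, for illustration.
const MIME: Record<string, string> = {
  ".png": "image/png",
  ".jpg": "image/jpeg",
  ".json": "application/json",
};

function mimeOf(filePath: string): string {
  // Extension matching is case-insensitive: ".JPG" resolves like ".jpg".
  const ext = path.extname(filePath).toLowerCase();
  return MIME[ext] ?? "application/octet-stream";
}
```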


@@ -0,0 +1,62 @@
// Shared param-validation helpers used by all four agent tools.
// Goal: identical validation behavior + identical error shapes everywhere.
export type GatewayCallOptions = {
gatewayUrl?: string;
gatewayToken?: string;
timeoutMs?: number;
};
export function readGatewayCallOptions(params: Record<string, unknown>): GatewayCallOptions {
const opts: GatewayCallOptions = {};
if (typeof params.gatewayUrl === "string" && params.gatewayUrl.trim()) {
opts.gatewayUrl = params.gatewayUrl.trim();
}
if (typeof params.gatewayToken === "string" && params.gatewayToken.trim()) {
opts.gatewayToken = params.gatewayToken.trim();
}
if (typeof params.timeoutMs === "number" && Number.isFinite(params.timeoutMs)) {
opts.timeoutMs = params.timeoutMs;
}
return opts;
}
export function readTrimmedString(params: Record<string, unknown>, key: string): string {
const value = params[key];
return typeof value === "string" ? value.trim() : "";
}
export function readBoolean(
params: Record<string, unknown>,
key: string,
defaultValue = false,
): boolean {
const value = params[key];
if (typeof value === "boolean") {
return value;
}
return defaultValue;
}
export function readClampedInt(params: {
input: Record<string, unknown>;
key: string;
defaultValue: number;
hardMin: number;
hardMax: number;
}): number {
const value = params.input[params.key];
const requested =
typeof value === "number" && Number.isFinite(value) ? Math.floor(value) : params.defaultValue;
return Math.max(params.hardMin, Math.min(requested, params.hardMax));
}
export function humanSize(bytes: number): string {
if (bytes < 1024) {
return `${bytes} B`;
}
if (bytes < 1024 * 1024) {
return `${(bytes / 1024).toFixed(1)} KB`;
}
return `${(bytes / (1024 * 1024)).toFixed(2)} MB`;
}
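The clamp and size-formatting behavior above is easy to sanity-check in isolation. These standalone mirrors (`clampInt`, `human` are local names for the sketch) reproduce the logic of `readClampedInt` and `humanSize`:

```typescript
// Standalone mirror of readClampedInt's clamp: non-numeric input falls
// back to the default, then the result is clamped to [hardMin, hardMax].
function clampInt(value: unknown, def: number, hardMin: number, hardMax: number): number {
  const requested =
    typeof value === "number" && Number.isFinite(value) ? Math.floor(value) : def;
  return Math.max(hardMin, Math.min(requested, hardMax));
}

// Standalone mirror of humanSize: B below 1 KiB, KiB below 1 MiB, else MiB.
function human(bytes: number): string {
  if (bytes < 1024) return `${bytes} B`;
  if (bytes < 1024 * 1024) return `${(bytes / 1024).toFixed(1)} KB`;
  return `${(bytes / (1024 * 1024)).toFixed(2)} MB`;
}
```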


@@ -0,0 +1,269 @@
// Path policy for file-transfer tools.
//
// Default behavior is DENY. The operator must explicitly opt in by adding
// a config block to ~/.openclaw/openclaw.json under
// `gateway.nodes.fileTransfer`. Without a matching block, every file
// operation is rejected before reaching the node.
//
// Schema (informal):
//
// "gateway": {
// "nodes": {
// "fileTransfer": {
// "<nodeId-or-displayName>": {
// "ask": "off" | "on-miss" | "always",
// "allowReadPaths": ["~/Screenshots/**", "/tmp/**"],
// "allowWritePaths": ["~/Downloads/**"],
// "denyPaths": ["**/.ssh/**", "**/.aws/**"],
// "maxBytes": 16777216
// },
// "*": { "ask": "on-miss" }
// }
// }
// }
//
// `ask` modes:
// off — silent: allow if matched, deny if not (today's default)
// on-miss — silent allow if matched; prompt operator if not matched
// always — prompt operator on every call (denyPaths still hard-deny)
//
// `denyPaths` always wins, even in `ask: always`.
// `allow-always` from the prompt appends the path back into allowReadPaths /
// allowWritePaths via mutateConfigFile.
import os from "node:os";
import path from "node:path";
import { minimatch } from "minimatch";
import { getRuntimeConfig, mutateConfigFile } from "openclaw/plugin-sdk/config-runtime";
export type FilePolicyKind = "read" | "write";
export type FilePolicyAskMode = "off" | "on-miss" | "always";
export type FilePolicyDecision =
| { ok: true; reason: "matched-allow"; maxBytes?: number }
| { ok: true; reason: "ask-always"; askMode: FilePolicyAskMode; maxBytes?: number }
| {
ok: false;
code: "NO_POLICY" | "POLICY_DENIED";
reason: string;
askable: boolean;
askMode?: FilePolicyAskMode;
};
type NodeFilePolicyConfig = {
ask?: FilePolicyAskMode;
allowReadPaths?: string[];
allowWritePaths?: string[];
denyPaths?: string[];
maxBytes?: number;
};
type FilePolicyConfig = Record<string, NodeFilePolicyConfig>;
function readFilePolicyConfig(): FilePolicyConfig | null {
const cfg = getRuntimeConfig();
const gateway = (cfg as { gateway?: unknown }).gateway;
if (!gateway || typeof gateway !== "object") {
return null;
}
const nodes = (gateway as { nodes?: unknown }).nodes;
if (!nodes || typeof nodes !== "object") {
return null;
}
const fileTransfer = (nodes as { fileTransfer?: unknown }).fileTransfer;
if (!fileTransfer || typeof fileTransfer !== "object" || Array.isArray(fileTransfer)) {
return null;
}
return fileTransfer as FilePolicyConfig;
}
function expandTilde(p: string): string {
if (p.startsWith("~/") || p === "~") {
return path.join(os.homedir(), p.slice(p === "~" ? 1 : 2));
}
return p;
}
function normalizeGlobs(patterns: string[] | undefined): string[] {
if (!Array.isArray(patterns)) {
return [];
}
return patterns
.filter((p): p is string => typeof p === "string" && p.trim().length > 0)
.map((p) => expandTilde(p.trim()));
}
function matchesAny(target: string, patterns: string[]): boolean {
for (const pattern of patterns) {
if (minimatch(target, pattern, { dot: true })) {
return true;
}
}
return false;
}
function resolveNodePolicy(
config: FilePolicyConfig,
nodeId: string,
nodeDisplayName?: string,
): { key: string; entry: NodeFilePolicyConfig } | null {
const candidates = [nodeId, nodeDisplayName].filter(
(k): k is string => typeof k === "string" && k.length > 0,
);
for (const key of candidates) {
if (config[key]) {
return { key, entry: config[key] };
}
}
if (config["*"]) {
return { key: "*", entry: config["*"] };
}
return null;
}
function normalizeAskMode(value: unknown): FilePolicyAskMode {
if (value === "on-miss" || value === "always" || value === "off") {
return value;
}
return "off";
}
/**
* Evaluate whether (nodeId, kind, path) is permitted.
*
* Resolution order:
* 1. No fileTransfer config or no entry for this node → NO_POLICY (deny,
* not askable — operator hasn't opted in at all).
* 2. denyPaths matches → POLICY_DENIED, not askable (hard deny).
* 3. ask=always → ask-always (prompt every time).
* 4. allowPaths matches → matched-allow (silent allow).
* 5. ask=on-miss → POLICY_DENIED with askable=true.
* 6. ask=off (or unset) → POLICY_DENIED, not askable.
*/
export function evaluateFilePolicy(input: {
nodeId: string;
nodeDisplayName?: string;
kind: FilePolicyKind;
path: string;
}): FilePolicyDecision {
const config = readFilePolicyConfig();
if (!config) {
return {
ok: false,
code: "NO_POLICY",
reason:
"no gateway.nodes.fileTransfer config; file-transfer is deny-by-default until configured",
askable: false,
};
}
const resolved = resolveNodePolicy(config, input.nodeId, input.nodeDisplayName);
if (!resolved) {
return {
ok: false,
code: "NO_POLICY",
reason: `no fileTransfer policy entry for "${input.nodeDisplayName ?? input.nodeId}"; configure gateway.nodes.fileTransfer or "*"`,
askable: false,
};
}
const nodeConfig = resolved.entry;
const askMode = normalizeAskMode(nodeConfig.ask);
// 1. Deny patterns always win.
const denyPatterns = normalizeGlobs(nodeConfig.denyPaths);
if (matchesAny(input.path, denyPatterns)) {
return {
ok: false,
code: "POLICY_DENIED",
reason: "path matches a denyPaths pattern",
askable: false,
askMode,
};
}
const maxBytes =
typeof nodeConfig.maxBytes === "number" && Number.isFinite(nodeConfig.maxBytes)
? Math.max(1, Math.floor(nodeConfig.maxBytes))
: undefined;
// 2. ask=always: prompt every time even if matched.
if (askMode === "always") {
return { ok: true, reason: "ask-always", askMode, maxBytes };
}
// 3. Match against allow list for this kind.
const allowPatterns =
input.kind === "read"
? normalizeGlobs(nodeConfig.allowReadPaths)
: normalizeGlobs(nodeConfig.allowWritePaths);
if (allowPatterns.length > 0 && matchesAny(input.path, allowPatterns)) {
return { ok: true, reason: "matched-allow", maxBytes };
}
// 4. No allow match. Either askable on miss or hard-deny.
if (askMode === "on-miss") {
return {
ok: false,
code: "POLICY_DENIED",
reason: `path does not match any allow${input.kind === "read" ? "Read" : "Write"}Paths pattern`,
askable: true,
askMode,
};
}
return {
ok: false,
code: "POLICY_DENIED",
reason:
allowPatterns.length === 0
? `no allow${input.kind === "read" ? "Read" : "Write"}Paths configured`
: `path does not match any allow${input.kind === "read" ? "Read" : "Write"}Paths pattern`,
askable: false,
askMode,
};
}
/**
* Persist an "allow-always" approval by appending the path to the
* relevant allowReadPaths / allowWritePaths list for the node. Uses
* mutateConfigFile so the change survives gateway restarts.
*
* Inserts under whichever key matched the policy (per-node entry, or
* the "*" wildcard if that's what was hit). If no entry exists yet,
* creates one keyed by nodeDisplayName ?? nodeId.
*/
export async function persistAllowAlways(input: {
nodeId: string;
nodeDisplayName?: string;
kind: FilePolicyKind;
path: string;
}): Promise<void> {
const field = input.kind === "read" ? "allowReadPaths" : "allowWritePaths";
await mutateConfigFile({
afterWrite: { mode: "none", reason: "file-transfer allow-always policy update" },
mutate: (draft) => {
// Cast through unknown — OpenClawConfig type doesn't yet declare
// gateway.nodes.fileTransfer.
const root = draft as unknown as Record<string, unknown>;
const gateway = (root.gateway ??= {}) as Record<string, unknown>;
const nodes = (gateway.nodes ??= {}) as Record<string, unknown>;
const fileTransfer = (nodes.fileTransfer ??= {}) as Record<string, NodeFilePolicyConfig>;
const candidates = [input.nodeId, input.nodeDisplayName, "*"].filter(
(k): k is string => typeof k === "string" && k.length > 0,
);
let key = candidates.find((c) => fileTransfer[c]);
if (!key) {
key = input.nodeDisplayName ?? input.nodeId;
fileTransfer[key] = {};
}
const entry = fileTransfer[key];
const list = Array.isArray(entry[field]) ? (entry[field] as string[]) : [];
if (!list.includes(input.path)) {
list.push(input.path);
}
entry[field] = list;
},
});
}
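The resolution order documented above (deny wins, then ask=always, then allow match, then on-miss, then hard deny) can be condensed into a pure sketch. Glob matching here is deliberately simplified to a prefix check on `"<dir>/**"` patterns — the real implementation uses minimatch with `{ dot: true }` — and `decide`, `Entry`, `Verdict` are hypothetical names:

```typescript
type Entry = { ask?: "off" | "on-miss" | "always"; allow?: string[]; deny?: string[] };
type Verdict = { ok: boolean; askable?: boolean; reason: string };

// Simplified matcher: only supports "<dir>/**" prefix patterns.
const matches = (p: string, pats: string[] = []) =>
  pats.some((g) => g.endsWith("/**") && p.startsWith(g.slice(0, -2)));

function decide(entry: Entry | undefined, p: string): Verdict {
  if (!entry) return { ok: false, reason: "NO_POLICY" };                  // 1. no opt-in
  if (matches(p, entry.deny)) return { ok: false, reason: "deny" };       // 2. deny wins
  if (entry.ask === "always") return { ok: true, reason: "ask-always" };  // 3. prompt anyway
  if (matches(p, entry.allow)) return { ok: true, reason: "matched-allow" }; // 4. silent allow
  if (entry.ask === "on-miss") return { ok: false, askable: true, reason: "miss" }; // 5. prompt
  return { ok: false, reason: "miss" };                                   // 6. hard deny
}
```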


@@ -0,0 +1,361 @@
import { spawn } from "node:child_process";
import crypto from "node:crypto";
import fs from "node:fs/promises";
import path from "node:path";
import {
callGatewayTool,
listNodes,
resolveNodeIdFromList,
type AnyAgentTool,
type NodeListNode,
} from "openclaw/plugin-sdk/agent-harness-runtime";
import { saveMediaBuffer } from "openclaw/plugin-sdk/media-store";
import { Type } from "typebox";
import { appendFileTransferAudit } from "../shared/audit.js";
import { throwFromNodePayload } from "../shared/errors.js";
import { gatekeep } from "../shared/gatekeep.js";
import { IMAGE_MIME_INLINE_SET, mimeFromExtension } from "../shared/mime.js";
import {
humanSize,
readBoolean,
readClampedInt,
readGatewayCallOptions,
readTrimmedString,
} from "../shared/params.js";
import { evaluateFilePolicy } from "../shared/policy.js";
const DIR_FETCH_DEFAULT_MAX_BYTES = 8 * 1024 * 1024;
const DIR_FETCH_HARD_MAX_BYTES = 16 * 1024 * 1024;
const FILE_TRANSFER_SUBDIR = "file-transfer";
// Cap how many local file paths we surface in details.media.mediaUrls.
// Larger trees still land on disk but we don't spam the channel adapter
// with hundreds of attachments.
const MEDIA_URL_CAP = 25;
// Hard timeout for the gateway-side `tar -xzf` unpack process.
const TAR_UNPACK_TIMEOUT_MS = 60_000;
const DirFetchToolSchema = Type.Object({
node: Type.String({
description: "Node id, name, or IP. Resolves the same way as the nodes tool.",
}),
path: Type.String({
description: "Absolute path to the directory on the node to fetch. Canonicalized server-side.",
}),
maxBytes: Type.Optional(
Type.Number({
description:
"Max gzipped tarball bytes to fetch. Default 8 MB, hard ceiling 16 MB (single round-trip).",
}),
),
includeDotfiles: Type.Optional(
Type.Boolean({
description: "Reserved for v2; currently always includes dotfiles (v1 quirk in BSD tar).",
}),
),
gatewayUrl: Type.Optional(Type.String()),
gatewayToken: Type.Optional(Type.String()),
timeoutMs: Type.Optional(Type.Number()),
});
async function computeFileSha256(filePath: string): Promise<string> {
const buf = await fs.readFile(filePath);
return crypto.createHash("sha256").update(buf).digest("hex");
}
type UnpackedFileEntry = {
relPath: string;
size: number;
mimeType: string;
sha256: string;
localPath: string;
};
/**
* Unpack a gzipped tarball into a target directory via `tar -xzf -`. The
* `-P` flag is intentionally omitted so absolute paths in the archive are
* stripped to relative ones and `..` traversal is rejected by tar itself.
* A hard wall-clock timeout caps the unpack at TAR_UNPACK_TIMEOUT_MS to
* avoid hangs on hostile/large archives.
*/
async function unpackTar(tarBuffer: Buffer, destDir: string): Promise<void> {
await fs.mkdir(destDir, { recursive: true, mode: 0o700 });
return new Promise((resolve, reject) => {
const tarBin = process.platform !== "win32" ? "/usr/bin/tar" : "tar";
const child = spawn(
tarBin,
[
"-xzf",
"-",
"-C",
destDir,
// Note: path-escape protection comes from omitting -P above (tar strips
// leading "/" and rejects ".." members); --no-overwrite-dir only avoids
// clobbering metadata of existing directories.
"--no-overwrite-dir",
],
{ stdio: ["pipe", "ignore", "pipe"] },
);
let stderrOut = "";
const watchdog = setTimeout(() => {
try {
child.kill("SIGKILL");
} catch {
/* already gone */
}
reject(new Error(`tar unpack timed out after ${TAR_UNPACK_TIMEOUT_MS}ms`));
}, TAR_UNPACK_TIMEOUT_MS);
child.stderr.on("data", (chunk: Buffer) => {
stderrOut += chunk.toString();
});
child.on("close", (code) => {
clearTimeout(watchdog);
if (code !== 0) {
reject(new Error(`tar unpack exited ${code}: ${stderrOut.slice(0, 300)}`));
return;
}
resolve();
});
child.on("error", (e) => {
clearTimeout(watchdog);
reject(e);
});
child.stdin.end(tarBuffer);
});
}
/**
 * Walk a directory recursively, collecting regular-file entries only
 * (directories are recursed into, not returned). Symlinks are skipped —
 * we don't want to follow links the archive might have carried in.
 */
async function walkDir(
dir: string,
rootDir: string,
): Promise<{ relPath: string; absPath: string }[]> {
const entries = await fs.readdir(dir, { withFileTypes: true });
const results: { relPath: string; absPath: string }[] = [];
for (const entry of entries) {
const absPath = path.join(dir, entry.name);
if (entry.isDirectory()) {
const nested = await walkDir(absPath, rootDir);
results.push(...nested);
} else if (entry.isFile()) {
const relPath = path.relative(rootDir, absPath);
results.push({ relPath, absPath });
}
// Symlinks are intentionally ignored: don't follow them out of destDir.
}
return results;
}
export function createDirFetchTool(): AnyAgentTool {
return {
label: "Directory Fetch",
name: "dir_fetch",
description:
"Retrieve a directory tree from a paired node as a gzipped tarball, unpack it on the gateway, and return a manifest of saved paths. Use to pull source trees, asset folders, or log directories in a single round-trip. The unpacked files live on the GATEWAY (not your local machine); pass localPath into other tools or use file_fetch on individual entries to ship them elsewhere. Rejects trees larger than 16 MB compressed. Requires operator opt-in: gateway.nodes.allowCommands must include 'dir.fetch' AND gateway.nodes.fileTransfer.<node>.allowReadPaths must match the directory path.",
parameters: DirFetchToolSchema,
execute: async (_toolCallId, args) => {
const params = args as Record<string, unknown>;
const node = readTrimmedString(params, "node");
const dirPath = readTrimmedString(params, "path");
if (!node) {
throw new Error("node required");
}
if (!dirPath) {
throw new Error("path required");
}
const maxBytes = readClampedInt({
input: params,
key: "maxBytes",
defaultValue: DIR_FETCH_DEFAULT_MAX_BYTES,
hardMin: 1,
hardMax: DIR_FETCH_HARD_MAX_BYTES,
});
const includeDotfiles = readBoolean(params, "includeDotfiles", false);
const gatewayOpts = readGatewayCallOptions(params);
const nodes: NodeListNode[] = await listNodes(gatewayOpts);
const nodeId = resolveNodeIdFromList(nodes, node, false);
const nodeMeta = nodes.find((n) => n.nodeId === nodeId);
const nodeDisplayName = nodeMeta?.displayName ?? node;
const startedAt = Date.now();
const gate = await gatekeep({
op: "dir.fetch",
nodeId,
nodeDisplayName,
kind: "read",
path: dirPath,
toolCallId: _toolCallId,
gatewayOpts,
startedAt,
promptVerb: "Fetch directory tree",
});
if (!gate.ok) {
throw new Error(gate.throwMessage);
}
const effectiveMaxBytes = gate.maxBytes ? Math.min(maxBytes, gate.maxBytes) : maxBytes;
const raw = await callGatewayTool<{ payload: unknown }>("node.invoke", gatewayOpts, {
nodeId,
command: "dir.fetch",
params: {
path: dirPath,
maxBytes: effectiveMaxBytes,
includeDotfiles,
},
idempotencyKey: crypto.randomUUID(),
});
const payload =
raw?.payload && typeof raw.payload === "object" && !Array.isArray(raw.payload)
? (raw.payload as Record<string, unknown>)
: null;
if (!payload) {
await appendFileTransferAudit({
op: "dir.fetch",
nodeId,
nodeDisplayName,
requestedPath: dirPath,
decision: "error",
errorMessage: "invalid payload",
durationMs: Date.now() - startedAt,
});
throw new Error("invalid dir.fetch payload");
}
if (payload.ok === false) {
await appendFileTransferAudit({
op: "dir.fetch",
nodeId,
nodeDisplayName,
requestedPath: dirPath,
canonicalPath:
typeof payload.canonicalPath === "string" ? payload.canonicalPath : undefined,
decision: "error",
errorCode: typeof payload.code === "string" ? payload.code : undefined,
errorMessage: typeof payload.message === "string" ? payload.message : undefined,
durationMs: Date.now() - startedAt,
});
throwFromNodePayload("dir.fetch", payload);
}
const canonicalPath = typeof payload.path === "string" ? payload.path : "";
const tarBase64 = typeof payload.tarBase64 === "string" ? payload.tarBase64 : "";
const tarBytes = typeof payload.tarBytes === "number" ? payload.tarBytes : -1;
const sha256 = typeof payload.sha256 === "string" ? payload.sha256 : "";
const fileCount = typeof payload.fileCount === "number" ? payload.fileCount : 0;
if (!canonicalPath || !tarBase64 || tarBytes < 0 || !sha256) {
throw new Error("invalid dir.fetch payload (missing fields)");
}
// Post-flight policy on canonicalized path.
if (canonicalPath !== dirPath) {
const postflight = evaluateFilePolicy({
nodeId,
nodeDisplayName,
kind: "read",
path: canonicalPath,
});
if (!postflight.ok) {
await appendFileTransferAudit({
op: "dir.fetch",
nodeId,
nodeDisplayName,
requestedPath: dirPath,
canonicalPath,
decision: "denied:symlink_escape",
errorCode: postflight.code,
reason: postflight.reason,
durationMs: Date.now() - startedAt,
});
throw new Error(
`dir.fetch SYMLINK_TARGET_DENIED: requested path resolved to ${canonicalPath} which is not allowed by policy`,
);
}
}
const tarBuffer = Buffer.from(tarBase64, "base64");
if (tarBuffer.byteLength !== tarBytes) {
throw new Error(
`dir.fetch size mismatch: payload says ${tarBytes} bytes, decoded ${tarBuffer.byteLength}`,
);
}
const localSha256 = crypto.createHash("sha256").update(tarBuffer).digest("hex");
if (localSha256 !== sha256) {
throw new Error("dir.fetch sha256 mismatch (integrity failure)");
}
// Save tarball under the file-transfer subdir (no 2-min TTL).
const savedTar = await saveMediaBuffer(
tarBuffer,
"application/gzip",
FILE_TRANSFER_SUBDIR,
DIR_FETCH_HARD_MAX_BYTES,
);
const tarDir = path.dirname(savedTar.path);
const tarBaseName = path.basename(savedTar.path, path.extname(savedTar.path));
const unpackId = `dir-fetch-${tarBaseName}`;
const rootDir = path.join(tarDir, unpackId);
await unpackTar(tarBuffer, rootDir);
const walked = await walkDir(rootDir, rootDir);
const files: UnpackedFileEntry[] = [];
for (const { relPath, absPath } of walked) {
let size = 0;
try {
const st = await fs.stat(absPath);
size = st.size;
} catch {
continue;
}
const mimeType = mimeFromExtension(relPath);
const fileSha256 = await computeFileSha256(absPath);
files.push({ relPath, size, mimeType, sha256: fileSha256, localPath: absPath });
}
const imageFiles = files.filter((f) => IMAGE_MIME_INLINE_SET.has(f.mimeType));
const nonImageFiles = files.filter((f) => !IMAGE_MIME_INLINE_SET.has(f.mimeType));
const allOrdered = [...imageFiles, ...nonImageFiles];
const droppedFromMedia = Math.max(0, allOrdered.length - MEDIA_URL_CAP);
const mediaUrls = allOrdered.slice(0, MEDIA_URL_CAP).map((f) => f.localPath);
const shortHash = sha256.slice(0, 12);
const mediaNote = droppedFromMedia
? ` (channel attaches first ${MEDIA_URL_CAP}; ${droppedFromMedia} more in details.files)`
: "";
const summaryText = `Fetched ${fileCount} files from ${canonicalPath} (${humanSize(tarBytes)} compressed, sha256:${shortHash}) — saved on the gateway under ${rootDir}/${mediaNote}`;
await appendFileTransferAudit({
op: "dir.fetch",
nodeId,
nodeDisplayName,
requestedPath: dirPath,
canonicalPath,
decision: "allowed",
sizeBytes: tarBytes,
sha256,
durationMs: Date.now() - startedAt,
});
return {
content: [{ type: "text" as const, text: summaryText }],
details: {
path: canonicalPath,
rootDir,
fileCount,
tarBytes,
sha256,
files,
media: {
mediaUrls,
},
},
};
},
};
}
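The 16 MB `DIR_FETCH_HARD_MAX_BYTES` ceiling fits the transport budget described in the commit message: base64 encodes n raw bytes into 4 * ceil(n / 3) characters, so 16 MiB of tarball becomes roughly 21.3 MiB of base64, leaving headroom inside a 25 MiB WebSocket frame for the JSON envelope. A quick check of that arithmetic (constant names here are illustrative):

```typescript
// base64 output size for n raw bytes: 4 chars per 3-byte group, padded.
const base64Size = (n: number) => 4 * Math.ceil(n / 3);

const RAW_CEILING = 16 * 1024 * 1024; // DIR_FETCH_HARD_MAX_BYTES
const FRAME_LIMIT = 25 * 1024 * 1024; // WS frame cap from the commit message

// 16 MiB raw -> ~21.3 MiB of base64, ~3.7 MiB left for the JSON envelope.
const encoded = base64Size(RAW_CEILING);
console.assert(encoded < FRAME_LIMIT);
```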


@@ -0,0 +1,199 @@
import crypto from "node:crypto";
import {
callGatewayTool,
listNodes,
resolveNodeIdFromList,
type AnyAgentTool,
type NodeListNode,
} from "openclaw/plugin-sdk/agent-harness-runtime";
import { Type } from "typebox";
import { appendFileTransferAudit } from "../shared/audit.js";
import { throwFromNodePayload } from "../shared/errors.js";
import { gatekeep } from "../shared/gatekeep.js";
import { readClampedInt, readGatewayCallOptions, readTrimmedString } from "../shared/params.js";
import { evaluateFilePolicy } from "../shared/policy.js";
const DIR_LIST_DEFAULT_MAX_ENTRIES = 200;
const DIR_LIST_HARD_MAX_ENTRIES = 5000;
const DirListToolSchema = Type.Object({
node: Type.String({
description: "Node id, name, or IP. Resolves the same way as the nodes tool.",
}),
path: Type.String({
description: "Absolute path to the directory on the node. Canonicalized server-side.",
}),
pageToken: Type.Optional(
Type.String({
description:
"Pagination token from a previous dir_list call. Omit to start from the beginning.",
}),
),
maxEntries: Type.Optional(
Type.Number({
description: `Max entries per page. Default ${DIR_LIST_DEFAULT_MAX_ENTRIES}, hard ceiling ${DIR_LIST_HARD_MAX_ENTRIES}.`,
}),
),
gatewayUrl: Type.Optional(Type.String()),
gatewayToken: Type.Optional(Type.String()),
timeoutMs: Type.Optional(Type.Number()),
});
export function createDirListTool(): AnyAgentTool {
return {
label: "Directory List",
name: "dir_list",
description:
"Retrieve a structured directory listing from a paired node. Returns file and subdirectory metadata (name, path, size, mimeType, isDir, mtime) without transferring file content. Use this to discover what files exist before fetching them with file_fetch. Pagination is offset-based; pass nextPageToken from the previous result. Requires operator opt-in: gateway.nodes.allowCommands must include 'dir.list' AND gateway.nodes.fileTransfer.<node>.allowReadPaths must match the directory path. Without policy configured, every call is denied.",
parameters: DirListToolSchema,
execute: async (_toolCallId, args) => {
const params = args as Record<string, unknown>;
const node = readTrimmedString(params, "node");
const dirPath = readTrimmedString(params, "path");
if (!node) {
throw new Error("node required");
}
if (!dirPath) {
throw new Error("path required");
}
const maxEntries = readClampedInt({
input: params,
key: "maxEntries",
defaultValue: DIR_LIST_DEFAULT_MAX_ENTRIES,
hardMin: 1,
hardMax: DIR_LIST_HARD_MAX_ENTRIES,
});
const pageToken =
typeof params.pageToken === "string" && params.pageToken.trim()
? params.pageToken.trim()
: undefined;
const gatewayOpts = readGatewayCallOptions(params);
const nodes: NodeListNode[] = await listNodes(gatewayOpts);
const nodeId = resolveNodeIdFromList(nodes, node, false);
const nodeMeta = nodes.find((n) => n.nodeId === nodeId);
const nodeDisplayName = nodeMeta?.displayName ?? node;
const startedAt = Date.now();
const gate = await gatekeep({
op: "dir.list",
nodeId,
nodeDisplayName,
kind: "read",
path: dirPath,
toolCallId: _toolCallId,
gatewayOpts,
startedAt,
promptVerb: "List directory",
});
if (!gate.ok) {
throw new Error(gate.throwMessage);
}
const raw = await callGatewayTool<{ payload: unknown }>("node.invoke", gatewayOpts, {
nodeId,
command: "dir.list",
params: {
path: dirPath,
pageToken,
maxEntries,
},
idempotencyKey: crypto.randomUUID(),
});
const payload =
raw?.payload && typeof raw.payload === "object" && !Array.isArray(raw.payload)
? (raw.payload as Record<string, unknown>)
: null;
if (!payload) {
await appendFileTransferAudit({
op: "dir.list",
nodeId,
nodeDisplayName,
requestedPath: dirPath,
decision: "error",
errorMessage: "invalid payload",
durationMs: Date.now() - startedAt,
});
throw new Error("invalid dir.list payload");
}
if (payload.ok === false) {
await appendFileTransferAudit({
op: "dir.list",
nodeId,
nodeDisplayName,
requestedPath: dirPath,
canonicalPath:
typeof payload.canonicalPath === "string" ? payload.canonicalPath : undefined,
decision: "error",
errorCode: typeof payload.code === "string" ? payload.code : undefined,
errorMessage: typeof payload.message === "string" ? payload.message : undefined,
durationMs: Date.now() - startedAt,
});
throwFromNodePayload("dir.list", payload);
}
const canonicalPath = typeof payload.path === "string" ? payload.path : dirPath;
// Post-flight policy on canonicalized dir.
if (canonicalPath !== dirPath) {
const postflight = evaluateFilePolicy({
nodeId,
nodeDisplayName,
kind: "read",
path: canonicalPath,
});
if (!postflight.ok) {
await appendFileTransferAudit({
op: "dir.list",
nodeId,
nodeDisplayName,
requestedPath: dirPath,
canonicalPath,
decision: "denied:symlink_escape",
errorCode: postflight.code,
reason: postflight.reason,
durationMs: Date.now() - startedAt,
});
throw new Error(
`dir.list SYMLINK_TARGET_DENIED: requested path resolved to ${canonicalPath} which is not allowed by policy`,
);
}
}
const entries = Array.isArray(payload.entries)
? (payload.entries as Array<Record<string, unknown>>)
: [];
const truncated = payload.truncated === true;
const nextPageToken =
typeof payload.nextPageToken === "string" ? payload.nextPageToken : undefined;
const fileCount = entries.filter((e) => !e.isDir).length;
const dirCount = entries.filter((e) => e.isDir).length;
const truncatedNote = truncated ? " (more entries available — pass nextPageToken)" : "";
const summary = `Listed ${canonicalPath}: ${fileCount} file${fileCount !== 1 ? "s" : ""}, ${dirCount} subdir${dirCount !== 1 ? "s" : ""}${truncatedNote}`;
await appendFileTransferAudit({
op: "dir.list",
nodeId,
nodeDisplayName,
requestedPath: dirPath,
canonicalPath,
decision: "allowed",
durationMs: Date.now() - startedAt,
});
return {
content: [{ type: "text" as const, text: summary }],
details: {
path: canonicalPath,
entries,
nextPageToken,
truncated,
},
};
},
};
}
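The tool description says pagination is offset-based: pass the previous result's `nextPageToken` to continue. The node-side token format is not shown in this diff; a minimal sketch of an offset-cursor codec and paging loop, assuming a base64url-encoded JSON offset (all names here are hypothetical):

```typescript
// Hypothetical offset-based page-token codec for a dir.list-style command.
function encodePageToken(offset: number): string {
  return Buffer.from(JSON.stringify({ offset })).toString("base64url");
}

function decodePageToken(token: string | undefined): number {
  if (!token) return 0;
  try {
    const parsed = JSON.parse(Buffer.from(token, "base64url").toString("utf-8"));
    return Number.isInteger(parsed.offset) && parsed.offset >= 0 ? parsed.offset : 0;
  } catch {
    return 0; // malformed token: restart from the beginning
  }
}

// Slice entries at the decoded offset; emit a token only when more remain.
function pageEntries<T>(entries: T[], token: string | undefined, maxEntries: number) {
  const offset = decodePageToken(token);
  const page = entries.slice(offset, offset + maxEntries);
  const nextOffset = offset + page.length;
  return {
    entries: page,
    truncated: nextOffset < entries.length,
    nextPageToken: nextOffset < entries.length ? encodePageToken(nextOffset) : undefined,
  };
}
```

An opaque token (rather than a raw number) leaves room to change the cursor format later without breaking callers.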

@@ -0,0 +1,242 @@
import crypto from "node:crypto";
import {
callGatewayTool,
listNodes,
resolveNodeIdFromList,
type AnyAgentTool,
type NodeListNode,
} from "openclaw/plugin-sdk/agent-harness-runtime";
import { saveMediaBuffer } from "openclaw/plugin-sdk/media-store";
import { Type } from "@sinclair/typebox";
import { appendFileTransferAudit } from "../shared/audit.js";
import { throwFromNodePayload } from "../shared/errors.js";
import { gatekeep } from "../shared/gatekeep.js";
import {
IMAGE_MIME_INLINE_SET,
TEXT_INLINE_MAX_BYTES,
TEXT_INLINE_MIME_SET,
} from "../shared/mime.js";
import { humanSize, readGatewayCallOptions, readTrimmedString } from "../shared/params.js";
import { evaluateFilePolicy } from "../shared/policy.js";
const FILE_FETCH_DEFAULT_MAX_BYTES = 8 * 1024 * 1024;
const FILE_FETCH_HARD_MAX_BYTES = 16 * 1024 * 1024;
// Stash fetched files in a non-TTL subdir so a follow-up tool call within
// the same agent turn can still reference them. The default "inbound"
// subdir gets cleaned every 2 minutes, which has bitten us in iMessage flows.
const FILE_TRANSFER_SUBDIR = "file-transfer";
const FileFetchToolSchema = Type.Object({
node: Type.String({
description: "Node id, name, or IP. Resolves the same way as the nodes tool.",
}),
path: Type.String({
description: "Absolute path to the file on the node. Canonicalized server-side.",
}),
maxBytes: Type.Optional(
Type.Number({
description: "Max bytes to fetch. Default 8 MB, hard ceiling 16 MB (single round-trip).",
}),
),
gatewayUrl: Type.Optional(Type.String()),
gatewayToken: Type.Optional(Type.String()),
timeoutMs: Type.Optional(Type.Number()),
});
export function createFileFetchTool(): AnyAgentTool {
return {
label: "File Fetch",
name: "file_fetch",
description:
"Retrieve a file from a paired node by absolute path. Returns image content blocks for image MIME types, inlines small text files (≤8 KB) as text content, and saves everything else under the gateway media store, returning the saved path for use in later tool calls. Use this for screenshots, photos, receipts, logs, source files. Pair with file_write to copy a file from one node to another (no exec/cp shell-out needed). Requires operator opt-in: gateway.nodes.allowCommands must include 'file.fetch' AND gateway.nodes.fileTransfer.<node>.allowReadPaths must match the path. Without policy configured, every call is denied.",
parameters: FileFetchToolSchema,
execute: async (_toolCallId, args) => {
const params = args as Record<string, unknown>;
const node = readTrimmedString(params, "node");
const filePath = readTrimmedString(params, "path");
if (!node) {
throw new Error("node required");
}
if (!filePath) {
throw new Error("path required");
}
const requestedMax =
typeof params.maxBytes === "number" && Number.isFinite(params.maxBytes)
? Math.floor(params.maxBytes)
: FILE_FETCH_DEFAULT_MAX_BYTES;
const maxBytes = Math.max(1, Math.min(requestedMax, FILE_FETCH_HARD_MAX_BYTES));
const gatewayOpts = readGatewayCallOptions(params);
const nodes: NodeListNode[] = await listNodes(gatewayOpts);
const nodeId = resolveNodeIdFromList(nodes, node, false);
const nodeMeta = nodes.find((n) => n.nodeId === nodeId);
const nodeDisplayName = nodeMeta?.displayName ?? node;
const startedAt = Date.now();
// Gatekeep: evaluate policy + prompt operator if ask=on-miss/always.
// Post-flight policy check below (after node returns canonicalPath)
// catches symlink escapes.
const gate = await gatekeep({
op: "file.fetch",
nodeId,
nodeDisplayName,
kind: "read",
path: filePath,
toolCallId: _toolCallId,
gatewayOpts,
startedAt,
promptVerb: "Read file",
});
if (!gate.ok) {
throw new Error(gate.throwMessage);
}
const effectiveMaxBytes = gate.maxBytes ? Math.min(maxBytes, gate.maxBytes) : maxBytes;
const raw = await callGatewayTool<{ payload: unknown }>("node.invoke", gatewayOpts, {
nodeId,
command: "file.fetch",
params: {
path: filePath,
maxBytes: effectiveMaxBytes,
},
idempotencyKey: crypto.randomUUID(),
});
const payload =
raw?.payload && typeof raw.payload === "object" && !Array.isArray(raw.payload)
? (raw.payload as Record<string, unknown>)
: null;
if (!payload) {
await appendFileTransferAudit({
op: "file.fetch",
nodeId,
nodeDisplayName,
requestedPath: filePath,
decision: "error",
errorMessage: "invalid payload",
durationMs: Date.now() - startedAt,
});
throw new Error("invalid file.fetch payload");
}
if (payload.ok === false) {
await appendFileTransferAudit({
op: "file.fetch",
nodeId,
nodeDisplayName,
requestedPath: filePath,
canonicalPath:
typeof payload.canonicalPath === "string" ? payload.canonicalPath : undefined,
decision: "error",
errorCode: typeof payload.code === "string" ? payload.code : undefined,
errorMessage: typeof payload.message === "string" ? payload.message : undefined,
durationMs: Date.now() - startedAt,
});
throwFromNodePayload("file.fetch", payload);
}
const canonicalPath = typeof payload.path === "string" ? payload.path : "";
const size = typeof payload.size === "number" ? payload.size : -1;
const mimeType = typeof payload.mimeType === "string" ? payload.mimeType : "";
const base64 = typeof payload.base64 === "string" ? payload.base64 : "";
const sha256 = typeof payload.sha256 === "string" ? payload.sha256 : "";
if (!canonicalPath || size < 0 || !mimeType || !base64 || !sha256) {
throw new Error("invalid file.fetch payload (missing fields)");
}
// Post-flight policy check on the canonicalized path. Catches the
// symlink-escape case where the requested path matched policy but
// resolves to something that doesn't.
if (canonicalPath !== filePath) {
const postflight = evaluateFilePolicy({
nodeId,
nodeDisplayName,
kind: "read",
path: canonicalPath,
});
if (!postflight.ok) {
await appendFileTransferAudit({
op: "file.fetch",
nodeId,
nodeDisplayName,
requestedPath: filePath,
canonicalPath,
decision: "denied:symlink_escape",
errorCode: postflight.code,
reason: postflight.reason,
durationMs: Date.now() - startedAt,
});
throw new Error(
`file.fetch SYMLINK_TARGET_DENIED: requested path resolved to ${canonicalPath} which is not allowed by policy`,
);
}
}
const buffer = Buffer.from(base64, "base64");
if (buffer.byteLength !== size) {
throw new Error(
`file.fetch size mismatch: payload says ${size} bytes, decoded ${buffer.byteLength}`,
);
}
const localSha256 = crypto.createHash("sha256").update(buffer).digest("hex");
if (localSha256 !== sha256) {
throw new Error("file.fetch sha256 mismatch (integrity failure)");
}
const saved = await saveMediaBuffer(
buffer,
mimeType,
FILE_TRANSFER_SUBDIR,
FILE_FETCH_HARD_MAX_BYTES,
);
const localPath = saved.path;
const isInlineImage = IMAGE_MIME_INLINE_SET.has(mimeType);
const isInlineText = TEXT_INLINE_MIME_SET.has(mimeType) && size <= TEXT_INLINE_MAX_BYTES;
const content: Array<
{ type: "text"; text: string } | { type: "image"; data: string; mimeType: string }
> = [];
if (isInlineImage) {
content.push({ type: "image", data: base64, mimeType });
} else if (isInlineText) {
const text = buffer.toString("utf-8");
content.push({
type: "text",
text: `Fetched ${canonicalPath} (${humanSize(size)}, ${mimeType}, sha256:${sha256.slice(0, 12)}) saved at ${localPath}\n\n--- contents ---\n${text}`,
});
} else {
const shortHash = sha256.slice(0, 12);
content.push({
type: "text",
text: `Fetched ${canonicalPath} (${humanSize(size)}, ${mimeType}, sha256:${shortHash}) saved at ${localPath}`,
});
}
await appendFileTransferAudit({
op: "file.fetch",
nodeId,
nodeDisplayName,
requestedPath: filePath,
canonicalPath,
decision: "allowed",
sizeBytes: size,
sha256,
durationMs: Date.now() - startedAt,
});
return {
content,
details: {
path: canonicalPath,
size,
mimeType,
sha256,
localPath,
media: {
mediaUrls: [localPath],
},
},
};
},
};
}
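The size and sha256 checks above are the client half of the end-to-end integrity guarantee. Extracted as a standalone sketch (the `verifyFetchedPayload` helper name is hypothetical; the logic mirrors the checks in the tool body):

```typescript
import crypto from "node:crypto";

// Decode the base64 payload, confirm the byte count matches what the node
// reported, and compare a locally computed sha256 against the node's hash.
// Any mismatch means corruption or tampering in transit.
function verifyFetchedPayload(
  base64: string,
  expectedSize: number,
  expectedSha256: string,
): Buffer {
  const buffer = Buffer.from(base64, "base64");
  if (buffer.byteLength !== expectedSize) {
    throw new Error(
      `size mismatch: payload says ${expectedSize} bytes, decoded ${buffer.byteLength}`,
    );
  }
  const localSha256 = crypto.createHash("sha256").update(buffer).digest("hex");
  if (localSha256 !== expectedSha256) {
    throw new Error("sha256 mismatch (integrity failure)");
  }
  return buffer;
}
```

Hashing the decoded bytes (not the base64 string) is what lets the same digest be compared on both ends of the transfer.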

@@ -0,0 +1,227 @@
import crypto from "node:crypto";
import {
callGatewayTool,
listNodes,
resolveNodeIdFromList,
type AnyAgentTool,
type NodeListNode,
} from "openclaw/plugin-sdk/agent-harness-runtime";
import { Type } from "@sinclair/typebox";
import { appendFileTransferAudit } from "../shared/audit.js";
import { throwFromNodePayload } from "../shared/errors.js";
import { gatekeep } from "../shared/gatekeep.js";
import {
humanSize,
readBoolean,
readGatewayCallOptions,
readTrimmedString,
} from "../shared/params.js";
import { evaluateFilePolicy } from "../shared/policy.js";
const FILE_WRITE_SCHEMA = Type.Object({
node: Type.String({ description: "Node id or display name to write the file on." }),
path: Type.String({
description: "Absolute path on the node to write. Canonicalized server-side.",
}),
contentBase64: Type.String({
description: "Base64-encoded bytes to write. Maximum 16 MB after decode.",
}),
mimeType: Type.Optional(
Type.String({
description: "Content type hint. Not validated against the content.",
}),
),
overwrite: Type.Optional(
Type.Boolean({
description: "Allow overwriting an existing file. Default false.",
default: false,
}),
),
createParents: Type.Optional(
Type.Boolean({
description: "Create missing parent directories (mkdir -p). Default false.",
default: false,
}),
),
});
type FileWriteSuccess = {
ok: true;
path: string;
size: number;
sha256: string;
overwritten: boolean;
};
type FileWriteError = {
ok: false;
code: string;
message: string;
canonicalPath?: string;
};
type FileWritePayload = FileWriteSuccess | FileWriteError;
export function createFileWriteTool(): AnyAgentTool {
return {
label: "File Write",
name: "file_write",
description:
"Write file bytes to a paired node by absolute path. Atomic write (temp + rename). Refuses to overwrite by default — pass overwrite=true to replace. Refuses to write through symlink targets (the node will reject if the path resolves to a symlink). Pair with file_fetch to round-trip a file from one node to another: images come back as base64 in the image content block (.data); for other files, re-encode the bytes saved under the gateway media store as base64 and pass them as contentBase64 here. DO NOT use exec/cp/system.run for file copies; this tool IS the sanctioned copy path. Requires operator opt-in: gateway.nodes.allowCommands must include 'file.write' AND gateway.nodes.fileTransfer.<node>.allowWritePaths must match the destination path. Without policy configured, every call is denied.",
parameters: FILE_WRITE_SCHEMA,
async execute(_toolCallId, params) {
const raw = (
params && typeof params === "object" && !Array.isArray(params)
? (params as Record<string, unknown>)
: {}
) as Record<string, unknown>;
const nodeQuery = readTrimmedString(raw, "node");
const filePath = readTrimmedString(raw, "path");
const contentBase64 = typeof raw.contentBase64 === "string" ? raw.contentBase64 : "";
const overwrite = readBoolean(raw, "overwrite", false);
const createParents = readBoolean(raw, "createParents", false);
if (!nodeQuery) {
throw new Error("node required");
}
if (!filePath) {
throw new Error("path required");
}
if (!contentBase64) {
throw new Error("contentBase64 required");
}
// Compute the sha256 of the bytes we're sending so the node can do
// an end-to-end integrity check after writing. This is always
// sender-side computed; ignore any caller-supplied expectedSha256
// to avoid the model passing a wrong hash and triggering an
// unintended unlink.
const buffer = Buffer.from(contentBase64, "base64");
const expectedSha256 = crypto.createHash("sha256").update(buffer).digest("hex");
const gatewayOpts = readGatewayCallOptions(raw);
const nodes: NodeListNode[] = await listNodes(gatewayOpts);
const nodeId = resolveNodeIdFromList(nodes, nodeQuery, false);
const nodeMeta = nodes.find((n) => n.nodeId === nodeId);
const nodeDisplayName = nodeMeta?.displayName ?? nodeQuery;
const startedAt = Date.now();
const gate = await gatekeep({
op: "file.write",
nodeId,
nodeDisplayName,
kind: "write",
path: filePath,
toolCallId: _toolCallId,
gatewayOpts,
startedAt,
promptVerb: "Write file",
});
if (!gate.ok) {
throw new Error(gate.throwMessage);
}
const result = await callGatewayTool<{ payload: unknown }>("node.invoke", gatewayOpts, {
nodeId,
command: "file.write",
params: {
path: filePath,
contentBase64,
overwrite,
createParents,
expectedSha256,
},
idempotencyKey: crypto.randomUUID(),
});
const payload = (result as { payload?: unknown })?.payload;
if (!payload || typeof payload !== "object" || Array.isArray(payload)) {
await appendFileTransferAudit({
op: "file.write",
nodeId,
nodeDisplayName,
requestedPath: filePath,
decision: "error",
errorMessage: "unexpected response from node",
sizeBytes: buffer.byteLength,
durationMs: Date.now() - startedAt,
});
throw new Error("unexpected file.write response from node");
}
const typed = payload as FileWritePayload;
if (!typed.ok) {
await appendFileTransferAudit({
op: "file.write",
nodeId,
nodeDisplayName,
requestedPath: filePath,
canonicalPath: typed.canonicalPath,
decision: "error",
errorCode: typed.code,
errorMessage: typed.message,
sizeBytes: buffer.byteLength,
durationMs: Date.now() - startedAt,
});
throwFromNodePayload("file.write", typed as unknown as Record<string, unknown>);
}
// Post-flight policy on canonicalized path.
if (typed.path !== filePath) {
const postflight = evaluateFilePolicy({
nodeId,
nodeDisplayName,
kind: "write",
path: typed.path,
});
if (!postflight.ok) {
await appendFileTransferAudit({
op: "file.write",
nodeId,
nodeDisplayName,
requestedPath: filePath,
canonicalPath: typed.path,
decision: "denied:symlink_escape",
errorCode: postflight.code,
reason: postflight.reason,
sizeBytes: typed.size,
sha256: typed.sha256,
durationMs: Date.now() - startedAt,
});
// The file is already written. The most we can do here is
// surface the issue loudly. We don't try to unlink because
// (a) the file may legitimately exist there and we just
// didn't have policy for it, and (b) unlinking on policy
// failure adds destructive ambiguity.
throw new Error(
`file.write SYMLINK_TARGET_WARNING: file written but canonical path ${typed.path} is not in this node's allowWritePaths`,
);
}
}
await appendFileTransferAudit({
op: "file.write",
nodeId,
nodeDisplayName,
requestedPath: filePath,
canonicalPath: typed.path,
decision: "allowed",
sizeBytes: typed.size,
sha256: typed.sha256,
durationMs: Date.now() - startedAt,
});
const overwriteNote = typed.overwritten ? " (overwrote existing file)" : "";
return {
content: [
{
type: "text" as const,
text: `Wrote ${typed.path} (${humanSize(typed.size)}, sha256:${typed.sha256.slice(0, 12)})${overwriteNote}`,
},
],
details: typed,
};
},
};
}
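The node-host side of file.write is not shown in this diff, but the description pins down its contract: lstat-based symlink refusal, an overwrite guard, and temp-file + rename atomicity. A minimal sketch under those assumptions (`atomicWrite` is a hypothetical helper, not the actual node implementation):

```typescript
import crypto from "node:crypto";
import fs from "node:fs/promises";
import os from "node:os";
import path from "node:path";

// Hypothetical node-side write: refuse symlinks (lstat, so the link itself
// is inspected, not its target), honor the overwrite guard, then write to a
// temp file and rename so a crash never leaves a partial destination.
async function atomicWrite(
  destPath: string,
  buffer: Buffer,
  opts: { overwrite?: boolean } = {},
): Promise<void> {
  const existing = await fs.lstat(destPath).catch(() => undefined);
  if (existing?.isSymbolicLink()) {
    throw new Error("refusing to write through a symlink");
  }
  if (existing && !opts.overwrite) {
    throw new Error("destination exists (pass overwrite to replace)");
  }
  const tmpPath = path.join(
    path.dirname(destPath),
    `.${path.basename(destPath)}.${crypto.randomUUID()}.tmp`,
  );
  await fs.writeFile(tmpPath, buffer);
  await fs.rename(tmpPath, destPath); // atomic on the same filesystem
}
```

Putting the temp file in the destination's own directory (not os.tmpdir()) matters: rename is only atomic within a single filesystem.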

@@ -27,6 +27,7 @@ export const MEDIA_INVOKE_ACTIONS = {
"camera.clip": "camera_clip",
"photos.latest": "photos_latest",
"screen.record": "screen_record",
"file.fetch": "file_fetch",
} as const;
export type NodeMediaAction = "camera_snap" | "photos_latest" | "camera_clip" | "screen_record";

@@ -138,7 +138,7 @@ export function createNodesTool(options?: {
name: "nodes",
ownerOnly: isOpenClawOwnerOnlyCoreToolName("nodes"),
description:
"Discover and control paired nodes (status/describe/pairing/notify/camera/photos/screen/location/notifications/invoke).",
"Discover and control paired nodes (status/describe/pairing/notify/camera/photos/screen/location/notifications/invoke). For file retrieval, use the dedicated file_fetch tool.",
parameters: NodesToolSchema,
execute: async (_toolCallId, args) => {
const params = args as Record<string, unknown>;

@@ -1,6 +1,10 @@
import type { OpenClawConfig } from "../config/types.openclaw.js";
import {
NODE_BROWSER_PROXY_COMMAND,
NODE_DIR_FETCH_COMMAND,
NODE_DIR_LIST_COMMAND,
NODE_FILE_FETCH_COMMAND,
NODE_FILE_WRITE_COMMAND,
NODE_SYSTEM_NOTIFY_COMMAND,
NODE_SYSTEM_RUN_COMMANDS,
} from "../infra/node-commands.js";
@@ -48,6 +52,17 @@ const MOTION_COMMANDS = ["motion.activity", "motion.pedometer"];
const SMS_DANGEROUS_COMMANDS = ["sms.send", "sms.search"];
// File operations on arbitrary node paths are sensitive — operator must opt
// in via `gateway.nodes.allowCommands`. Writes are more dangerous than reads;
// dir.list leaks information through enumeration; dir.fetch transfers tree
// content. All four are dangerous-by-default.
const FILE_DANGEROUS_COMMANDS = [
NODE_FILE_FETCH_COMMAND,
NODE_DIR_LIST_COMMAND,
NODE_DIR_FETCH_COMMAND,
NODE_FILE_WRITE_COMMAND,
];
// iOS nodes don't implement system.run/which, but they do support notifications.
const IOS_SYSTEM_COMMANDS = [NODE_SYSTEM_NOTIFY_COMMAND];
@@ -72,6 +87,7 @@ export const DEFAULT_DANGEROUS_NODE_COMMANDS = [
...CALENDAR_DANGEROUS_COMMANDS,
...REMINDERS_DANGEROUS_COMMANDS,
...SMS_DANGEROUS_COMMANDS,
...FILE_DANGEROUS_COMMANDS,
];
const PLATFORM_DEFAULTS: Record<string, string[]> = {

@@ -11,3 +11,14 @@ export const NODE_EXEC_APPROVALS_COMMANDS = [
"system.execApprovals.get",
"system.execApprovals.set",
] as const;
export const NODE_FILE_FETCH_COMMAND = "file.fetch";
export const NODE_DIR_LIST_COMMAND = "dir.list";
export const NODE_DIR_FETCH_COMMAND = "dir.fetch";
export const NODE_FILE_WRITE_COMMAND = "file.write";
export const NODE_FILE_COMMANDS = [
NODE_FILE_FETCH_COMMAND,
NODE_DIR_LIST_COMMAND,
NODE_DIR_FETCH_COMMAND,
NODE_FILE_WRITE_COMMAND,
] as const;
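These constants feed the dangerous-by-default policy above: a file command runs only if the operator has listed it in gateway.nodes.allowCommands. A reduced sketch of that gate (the `isFileCommandAllowed` helper is hypothetical; the real check lives in the node command policy layer):

```typescript
// Mirror of the exported command list; in the real module these come from
// node-commands.ts.
const NODE_FILE_COMMANDS = ["file.fetch", "dir.list", "dir.fetch", "file.write"] as const;

// Dangerous-by-default: a file command is denied unless the operator has
// explicitly opted it in via gateway.nodes.allowCommands.
function isFileCommandAllowed(command: string, allowCommands: string[]): boolean {
  const dangerous = (NODE_FILE_COMMANDS as readonly string[]).includes(command);
  return !dangerous || allowCommands.includes(command);
}
```

Note this is only the first of the two gates: per-node fileTransfer path policy is evaluated separately, so enabling the command alone still denies every call.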

@@ -939,7 +939,7 @@ export function collectNodeDangerousAllowCommandFindings(
title: "Dangerous node commands explicitly enabled",
detail:
`gateway.nodes.allowCommands includes: ${dangerousAllowed.join(", ")}. ` +
"These commands can trigger high-impact device actions (camera/screen/contacts/calendar/reminders/SMS).",
"These commands can trigger high-impact device actions or read node files (camera/screen/contacts/calendar/reminders/SMS/file).",
remediation:
"Remove these entries from gateway.nodes.allowCommands (recommended). " +
"If you keep them, treat gateway auth as full operator access and keep gateway exposure local/tailnet-only.",