Win11 + Docker + qwen3.5: Raising a Shrimp Locally

张开发
2026/4/9 13:18:29 · 15 min read

A note on security: for convenience, this setup installs Ollama directly on the host. Strictly speaking, it would be better to run it inside Docker, the way KTransformer is packaged, and manage it through virtualization.

## Installing npm and pnpm

Inside the Ubuntu WSL instance, create an empty directory and install Node.js (and nvm) in the following order:

```shell
$ mkdir nodejs
$ cd nodejs/
$ curl -fsSL https://deb.nodesource.com/setup_20.x | sudo -E bash -
$ sudo apt-get install -y nodejs
$ curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.40.0/install.sh | bash
```

If the installation succeeded, you can check the version numbers:

```shell
$ node --version
v20.20.1
$ npm --version
11.8.0
```

Then use npm to install the pnpm we will need:

```shell
$ npm install -g pnpm
$ pnpm --version
10.31.0
$ pnpm install
```

With that in place, we can start deploying OpenClaw for real.

## Deploying OpenClaw under WSL2

Open an Ubuntu window on Win11 and clone the OpenClaw repository:

```shell
$ git clone https://github.com/openclaw/openclaw.git
$ cd openclaw/
```

Then run the setup script directly:

```shell
$ ./docker-setup.sh
```

In theory, if the environment is configured correctly and the network is healthy, this runs through in one shot. If it does, you can skip the rest of this section. If you hit errors, work through them one by one as follows.

### docker-image error

If you see:

```
ERROR: resolve image config for docker-image://docker.io/docker/dockerfile:1.7
```

delete the first line of the Dockerfile:

```
# syntax=docker/dockerfile:1.7
```

### Network timeouts

If the download step fails because of network problems, add a registry mirror to Docker's `daemon.json`:

```json
{
  "registry-mirrors": ["https://docker.1ms.run"]
}
```

### frozen-lockfile error

If you run into a frozen-lockfile error:

```
ERR_PNPM_OUTDATED_LOCKFILE  Cannot install with "frozen-lockfile" because pnpm-lock.yaml is not up to date with <ROOT>/package.json
```

the npm/pnpm environment is not set up correctly; go back and run the installation commands from the previous section.

### All set

If everything is in order, the Docker build finishes with something like:

```
Starting gateway
[+] Running 1/0
 ✔ Container openclaw-openclaw-gateway-1  Running  0.0s
[+] Creating 1/0
 ✔ Container openclaw-openclaw-gateway-1  Running  0.0s
(node:7) [DEP0040] DeprecationWarning: The `punycode` module is deprecated. Please use a userland alternative instead.
(Use `node --trace-deprecation ...` to show where the warning was created)
Config overwrite: /home/node/.openclaw/openclaw.json (sha256 65de - edbd, backup /home/node/.openclaw/openclaw.json.bak)
Gateway running with host port mapping. Access from tailnet devices via the host's tailnet IP.
```
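Before running the setup script, it can save a round trip to sanity-check the toolchain in one pass. Here is a minimal sketch; the helper functions and minimum major versions are our own convenience (matching the node 20.x / pnpm 10.x used above), not anything OpenClaw itself requires:

```shell
# Pre-flight sketch before ./docker-setup.sh. These helpers are our own,
# not part of OpenClaw; the minimum majors mirror this walkthrough.
version_major() {
  # Strip a leading "v" (node prints v20.20.1) and keep the major component.
  echo "$1" | sed 's/^v//' | cut -d. -f1
}

check_tool() {
  # Usage: check_tool NAME MIN_MAJOR VERSION_STRING
  name="$1"; min="$2"; ver="$3"
  major=$(version_major "$ver")
  if [ "$major" -ge "$min" ]; then
    echo "$name OK ($ver)"
  else
    echo "$name too old ($ver, need >= $min.x)"
  fi
}

# In a live shell you would feed real output, e.g.:
#   check_tool node 20 "$(node --version)"
#   check_tool pnpm 10 "$(pnpm --version)"
check_tool node 20 "v20.20.1"   # → node OK (v20.20.1)
check_tool pnpm 10 "10.31.0"    # → pnpm OK (10.31.0)
```

Running the two real checks before `./docker-setup.sh` catches the stale-toolchain case that otherwise only surfaces later as the frozen-lockfile error.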
The script also prints the config locations and a couple of helper commands:

```
Config:    /home/xxx/.openclaw
Workspace: /home/xxx/.openclaw/workspace
Token:     fbf6
Commands:
  docker compose -f /mnt/k/openclaw/docker-compose.yml logs -f openclaw-gateway
  docker compose -f /mnt/k/openclaw/docker-compose.yml exec openclaw-gateway node dist/index.js health --token 819e
```

At this point the OpenClaw UI will open, but it still shows some errors: we have not configured the local model yet, so the chat window cannot see a local LLM. The next section covers the fix.

## Configuring the local model

With the Docker image started as above, enter the container via `docker attach` or the VSCode Docker extension, and check the OpenClaw version:

```shell
$ openclaw --version
OpenClaw 2026.3.8
```

Then adjust `openclaw.json` to your needs. Below is my configuration; the important part is the local model section:

```json
{
  "meta": {
    "lastTouchedVersion": "2026.3.8",
    "lastTouchedAt": "2026-03-10T07:12:39.442Z"
  },
  "wizard": {
    "lastRunAt": "2026-03-10T07:12:39.438Z",
    "lastRunVersion": "2026.3.8",
    "lastRunCommand": "onboard",
    "lastRunMode": "local"
  },
  "models": {
    "providers": {
      "ollama": {
        "baseUrl": "http://127.0.0.1:xxxxx/v1",
        "apiKey": "ollama-local",
        "api": "openai-completions",
        "models": [
          {
            "id": "your_model",
            "name": "your_model",
            "reasoning": true,
            "input": ["text"],
            "cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 },
            "contextWindow": 32768,
            "maxTokens": 32768
          }
        ]
      }
    }
  },
  "agents": {
    "defaults": {
      "model": { "primary": "ollama/your_model" },
      "models": { "ollama/your_model": {} },
      "workspace": "/home/node/.openclaw/workspace",
      "compaction": { "mode": "safeguard" },
      "maxConcurrent": 4,
      "subagents": { "maxConcurrent": 8 },
      "sandbox": { "mode": "off" }
    }
  },
  "tools": { "profile": "coding" },
  "messages": { "ackReactionScope": "group-mentions" },
  "commands": {
    "native": "auto",
    "nativeSkills": "auto",
    "restart": true,
    "ownerDisplay": "raw"
  },
  "session": { "dmScope": "per-channel-peer" },
  "gateway": {
    "port": 18789,
    "mode": "local",
    "bind": "auto",
    "controlUi": {
      "allowedOrigins": ["http://localhost:xxx", "http://127.0.0.1:xxx"]
    },
    "auth": { "mode": "password", "token": "c9b9", "password": "xxx" },
    "tailscale": { "mode": "off", "resetOnExit": false },
    "nodes": {
      "denyCommands": [
        "camera.snap", "camera.clip", "screen.record", "contacts.add",
        "calendar.add", "reminders.add", "sms.send"
      ]
    }
  }
}
```

After restarting the Docker container, the openclaw container shows up:

```shell
$ docker ps -a
CONTAINER ID   IMAGE            COMMAND                  CREATED        STATUS                  PORTS                     NAMES
xxxx           openclaw:local   "docker-entrypoint.s…"   18 hours ago   Up 18 hours (healthy)   xxx:xxx-xxx-xxx-xxx/tcp   openclaw-openclaw-gateway-1
```

Entering the container again via `docker attach` or the VSCode extension, you can list the models OpenClaw has now loaded:

```shell
$ openclaw models list
OpenClaw 2026.3.8 (unknown) — Like having a senior engineer on call, except I don't bill hourly or sigh audibly.

Model                   Input   Ctx   Local   Auth   Tags
ollama/qwen3.5:latest   text    32k   no      yes    default,configured
```

Once the model is connected there is one more local pairing request to authorize; just follow the prompts:

```shell
$ openclaw devices list
OpenClaw 2026.3.8 (unknown) — Say "stop" and I'll stop—say "ship" and we'll both learn a lesson.

Pending (1)
  Request: e6dc   Device: 177f   Role: operator   IP: 172.18.0.1   Age: just now

Paired (1)
  Device: 250c 5958   Roles: operator   Tokens: operator
  Scopes: operator.read, operator.admin, operator.write, operator.approvals, operator.pairing

$ openclaw devices 177f
OpenClaw 2026.3.8 (unknown) — Making "I'll automate that later" happen now.
```
The approval is confirmed:

```
◇ Approved (177f)
```

Back in the main UI, you can now chat normally. As a simple task, you can ask it to switch the local timezone to Beijing time on its own; it can even set up an Anaconda environment by itself. Once that is done, entering the container shows the installed environments:

```shell
node@xxx:~/.openclaw$ /home/node/.conda/bin/conda env list
# conda environments:
#
# * - active
# - frozen
base           /home/node/.conda
python39_env   /home/node/.conda/envs/python39_env
```

Basic read/write permissions are also available inside the container.

## Summary

This article presented one way to deploy OpenClaw with Docker on Win11, with tokens generated by the open-source qwen3.5 model served locally through Ollama: a zero-cost, relatively safe and controllable setup. That said, the security of both OpenClaw and Ollama still leaves room for improvement, so weigh your own situation and deploy with care.
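As a footnote to the timezone task mentioned above: done by hand rather than by the agent, switching the container's clock to Beijing time is essentially just the `TZ` environment variable. This assumes tzdata is present in the image, which is our assumption, not something the article verifies:

```shell
# Manual equivalent of the "switch to Beijing time" task, run inside the
# container. Relies on tzdata being installed (an assumption on our part).
export TZ=Asia/Shanghai
date +%Z   # typically prints CST (China Standard Time) when tzdata is present
```

For a persistent, shell-independent change, the usual approach is to point `/etc/localtime` at `/usr/share/zoneinfo/Asia/Shanghai`; the agent presumably did something along those lines.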
