OpenClaw Integrates the Lux Model as a Command Center for Fully Autonomous OS-Level Agents
OpenClaw has become a command center for fully autonomous OS-level agents. By integrating Lux, an industry-leading "Computer Use" model, agents can watch the screen directly, click and type on their own, run complex QA tests, and catch bugs in seconds. OpenAGI Labs frames OpenClaw as the bridge and Lux as the brain, a far cry from rigid scripts or basic macros that break the moment a website updates.
OpenClaw x Lux Core Integration
OpenClaw integrates the Lux model as a plugin that provides desktop automation; the project is available on GitHub at https://github.com/agiopen-org/openclaw-oagi. Install it with `openclaw plugins install @oagi/openclaw-computer-use`. After installation, the native dependencies must be built manually for security reasons.
- Installation requires an OAGI API key, provided via the `OAGI_API_KEY` environment variable or the plugin settings.
- On macOS, grant your terminal the "Accessibility" permission (System Settings > Privacy & Security > Accessibility).
- On Linux, install the X11 development headers (`sudo apt install libx11-dev libxtst-dev`).
The plugin depends on robotjs (a C++ native module used for screen capture and input simulation). The OpenClaw installer skips build scripts for security reasons, so you must run the build manually:

```shell
cd ~/.openclaw/extensions/openclaw-computer-use/node_modules/robotjs
npx node-gyp rebuild
```
A successful build prints `gyp info ok`. If the build fails on macOS, install the Xcode Command Line Tools (`xcode-select --install`); on Linux, install the missing build dependencies (`sudo apt install build-essential libx11-dev libxtst-dev`). To verify:

```shell
cd ~/.openclaw/extensions/openclaw-computer-use
node -e "require('robotjs'); console.log('robotjs OK')"
```
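The install checks above can be bundled into a small preflight script. This is a sketch, not part of the plugin: the `check` helper and the `OPENCLAW_HOME` override are assumptions; the paths and conditions come from the install instructions.

```shell
#!/bin/sh
# Preflight check for the Computer Use plugin (sketch; paths assumed
# from the install docs -- adjust OPENCLAW_HOME if yours differs).
PLUGIN_DIR="${OPENCLAW_HOME:-$HOME/.openclaw}/extensions/openclaw-computer-use"

# check LABEL COMMAND -- prints PASS or FAIL based on COMMAND's exit status.
check() {
  if eval "$2" >/dev/null 2>&1; then
    echo "PASS: $1"
  else
    echo "FAIL: $1"
  fi
}

check "OAGI_API_KEY is set" '[ -n "$OAGI_API_KEY" ]'
check "plugin dir exists"   '[ -d "$PLUGIN_DIR" ]'
check "robotjs loads"       '( cd "$PLUGIN_DIR" && node -e "require(\"robotjs\")" )'
```

Run it before starting an agent session; any `FAIL` line points at the corresponding setup step above.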
Lux's Visual Understanding Advantage
Lux is a true "Computer Use" model operating at the visual layer: it visually comprehends complex, completely custom UIs, reasons through workflows, and executes real-world tasks like a human user. Unlike rigid scripts or basic macros, which break when a website updates, Lux does not depend on fixed page structure, so it applies to virtually any scenario: if you can do it on a screen, it can be automated.
Limitless Use Cases
Because Lux operates at the visual layer, its use cases are broad, covering:
- 🐛 Autonomous software QA and visual testing: running complex QA tests and catching bugs in seconds.
- 📊 Cross-app data entry and bulk operations.
- 🛒 E-commerce scraping and management, such as searching Amazon through the OpenClaw TUI in the demo.
- 🛠️ Legacy IT system automation.
OpenAGI Labs has committed to building an open "Computer Use" ecosystem: users can run Lux out of the box or build their own custom autonomous agents on top of it.
Detailed Configuration and Development
The plugin settings are listed below; the defaults are tuned for common needs:
| Key | Default | Description |
|---|---|---|
| `apiKey` | `$OAGI_API_KEY` | OAGI API key |
| `baseUrl` | `https://api.agiopen.org` | API base URL |
| `model` | `lux-actor-1` | Model ID |
| `maxSteps` | `20` | Maximum steps per task |
| `temperature` | `0.5` | Sampling temperature |
| `stepDelay` | `1.0` | Delay between steps (seconds) |
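Assuming the plugin reads its settings from a JSON configuration section (the file location and nesting shown here are assumptions; only the keys and defaults come from the table above), a settings sketch might look like:

```json
{
  "plugins": {
    "openclaw-computer-use": {
      "apiKey": "$OAGI_API_KEY",
      "baseUrl": "https://api.agiopen.org",
      "model": "lux-actor-1",
      "maxSteps": 20,
      "temperature": 0.5,
      "stepDelay": 1.0
    }
  }
}
```

Raising `stepDelay` gives slower UIs time to settle between actions; lowering `maxSteps` bounds how long a runaway task can act on your desktop.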
For development, clone the project (`git clone https://github.com/agiopen-org/openclaw-oagi.git`), run `npm install`, and add the path to `plugins.load.paths` in your OpenClaw settings to load it as a local extension. SDK resources for building your own integrations are at http://developer.agiopen.org/.
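The `plugins.load.paths` entry might be expressed as follows; the key path comes from the text above, but the JSON layout and the example clone location (`~/src/openclaw-oagi`) are assumptions:

```json
{
  "plugins": {
    "load": {
      "paths": ["~/src/openclaw-oagi"]
    }
  }
}
```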
Open Ecosystem Commitment
OpenAGI Labs emphasizes an open ecosystem: whether you use Lux out of the box or build your own agents, the tools are ready, and the integration is MIT-licensed. The integration shows agents evolving from a command center toward a fully autonomous OS layer and addresses long-standing automation pain points, though the native build step and per-platform permission setup remain prerequisites for stable operation.
OpenClaw is now a command center for a fully autonomous OS level agent.
— OpenAGI Labs (@agiopen_org) April 21, 2026
We integrated our industry-leading Computer Use model, Lux, directly into it. Now OpenClaw can see your screen, click and type on its own.
Watch it run a complex QA test and catch bugs in seconds. 👇 pic.twitter.com/AMHQz9lm0l
OpenClaw is the bridge; Lux is the brain.
— OpenAGI Labs (@agiopen_org) April 21, 2026
Unlike rigid scripts or basic macros that break when a website updates, Lux is a true Computer Use model. It visually comprehends complex, completely custom UIs, reasons through the workflow, and executes real-world tasks exactly like a…
Because Lux operates at the visual layer, the use cases are limitless:
— OpenAGI Labs (@agiopen_org) April 21, 2026
🐛 Autonomous software QA & visual testing
📊 Cross-app data entry & bulk operations
🛒 E-commerce scraping & management
🛠️ Legacy IT system automation
If you can do it on a screen, Lux can automate it for…
We are committed to building an Open Ecosystem for Computer Use.
— OpenAGI Labs (@agiopen_org) April 21, 2026
Whether you want to use Lux out of the box or build your own custom autonomous agents, the tools are ready for you.
💻 Check out the Openclaw x Lux integration on GitHub: https://t.co/cHLgXFCGnw
— OpenAGI Labs (@agiopen_org) April 21, 2026
🛠️ Build with the SDK: https://t.co/ey3Kc8jGoi
