Key Takeaways
Install and run OpenClaw on Ubuntu, verify local UI availability, and prepare the environment for Ollama integration.
Install OpenClaw on Ubuntu: Step 2 of Local AI Workflow
In Part 1, we prepared Ollama as the local model runtime. In this article, we install OpenClaw and make sure the local control interface is up and running.
What You Will Have After This
- A compatible Node.js runtime for OpenClaw
- OpenClaw CLI installed and verified
- Local UI available at http://localhost:3000
1) Install Node.js
OpenClaw requires a modern Node.js version (22+ recommended). Note that Ubuntu's default repositories may ship an older release, so check the installed version before proceeding.
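If Node.js is already installed, a quick check tells you whether it meets the recommendation before you touch any packages. A minimal sketch; the 22 threshold mirrors the recommendation above, and `check_node_major` is a helper name invented here:

```shell
# check_node_major VERSION: report whether a `node -v` string meets the 22+ recommendation.
check_node_major() {
  major=${1#v}          # strip leading "v", e.g. v18.19.0 -> 18.19.0
  major=${major%%.*}    # keep the major component only
  if [ "$major" -ge 22 ] 2>/dev/null; then
    echo "OK ($1)"
  else
    echo "too old ($1) - Node 22+ recommended"
  fi
}

# Check the currently installed Node, if any.
check_node_major "$(node -v 2>/dev/null || echo v0)"
```

If the check reports "too old" or Node is missing entirely, install or upgrade as shown next.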
sudo apt update
sudo apt install -y nodejs npm
Check versions:
node -v
npm -v
2) Install OpenClaw
Install globally with npm:
npm install -g openclaw
Verify:
openclaw --version
3) Start OpenClaw
openclaw
Open in browser:
http://localhost:3000
On first access, the browser may ask for a token. This is expected and part of OpenClaw authentication.
3.1 Get and Enter the Token
- Keep the openclaw process running.
- Read the local config file ~/.openclaw/openclaw.json.
- Copy the value of gateway.auth.token and paste it into the browser prompt.
Do not commit or expose this token in screenshots, logs, or public repositories.
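Reading the token by hand is error-prone, so it can also be extracted with a one-liner. A sketch using python3's bundled json module, so no extra tools are needed; the inline sample file stands in for the real ~/.openclaw/openclaw.json, and the token value is made up:

```shell
# Extract gateway.auth.token from an OpenClaw-style JSON config.
# A temporary sample file is used here; point cfg at ~/.openclaw/openclaw.json for real use.
cfg=$(mktemp)
printf '%s' '{"gateway": {"auth": {"token": "example-token-123"}}}' > "$cfg"

token=$(python3 -c 'import json, sys; print(json.load(open(sys.argv[1]))["gateway"]["auth"]["token"])' "$cfg")
echo "$token"   # prints "example-token-123" for this sample config

rm -f "$cfg"
```

The warning above applies here too: avoid echoing the real token into shell history, logs, or CI output.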
If the token is accepted and the console loads, the OpenClaw service is working correctly.
Common Issues
Command Not Found
If openclaw is not found, npm's global bin directory is probably missing from your PATH. Check the output of npm prefix -g and your shell profile configuration.
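One way to check and fix this for the current shell, assuming npm itself is on PATH (`npm prefix -g` prints the global prefix, and its bin subdirectory is where globally installed executables land); `ensure_in_path` is a helper name invented here:

```shell
# ensure_in_path DIR: append DIR to PATH if it is not already there.
ensure_in_path() {
  case ":$PATH:" in
    *":$1:"*) ;;                        # already present, nothing to do
    *) PATH="$PATH:$1"; export PATH ;;
  esac
}

# Add npm's global bin directory so `openclaw` resolves (guarded in case npm is absent).
if command -v npm >/dev/null 2>&1; then
  ensure_in_path "$(npm prefix -g)/bin"
fi
echo "$PATH"
```

This only affects the current session; to make it permanent, add the equivalent export line to your shell profile (e.g. ~/.bashrc).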
Port Conflict
If port 3000 is occupied, free it first or set a different port in your OpenClaw runtime configuration.
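A quick way to see whether port 3000 is taken, using python3's socket module so it works without extra tools (3000 is the default port used throughout this article; `port_free` is a helper name invented here):

```shell
# port_free PORT: print "free" if nothing is bound on 127.0.0.1:PORT, else "in use".
port_free() {
  python3 - "$1" <<'EOF'
import socket, sys

s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
try:
    # If bind succeeds, nothing else holds the port on this address.
    s.bind(("127.0.0.1", int(sys.argv[1])))
    print("free")
except OSError:
    print("in use")
finally:
    s.close()
EOF
}

port_free 3000
```

If the port is in use, `ss -ltnp` (from iproute2) shows which process holds it so you can stop it or pick another port.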
Node Version Too Old
Older Node versions may break installation or runtime dependencies. Upgrade to a supported release (for example via NodeSource packages or nvm) and rerun the installation.
Browser Asks for Token
This is default OpenClaw auth behavior. Check:
- ~/.openclaw/openclaw.json exists and includes gateway.auth.token
- the token is copied completely (no missing characters or extra spaces)
- you are using the token from the current runtime session/config
Wrap-up
Your local OpenClaw runtime is now available on Ubuntu. In Part 3, we connect OpenClaw to local Ollama and run an end-to-end local agent flow.