Quick Disclaimer:
- I am not an OpenClaw expert.
- This lab includes my takeaways after working with it for 2-3 weeks.
- If you have tips, tricks, or shortcuts, I’d love to hear them.
Lab Objectives: (Project Here) ← You don’t have to do anything yet.
- Deploy the following project components via the provided ./setup.sh script:
  - Local Ollama LLM container
    - We'll use qwen2.5-coder:7b
    - This is a 7-billion-parameter (7b) model, which is fairly small and lightweight.
  - Local OpenClaw container
    - This will send all requests to a proxy that logs all information sent to/from OpenClaw.
  - Local socat proxy container
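The socat proxy is the piece that lets us see the raw traffic. A minimal sketch of the idea, assuming Ollama listens on its default port 11434 — the port numbers and log path here are assumptions, the lab's setup.sh configures the real thing:

```shell
# Sketch only: listen on 11435, relay every connection to Ollama on
# 11434, and capture both directions of the conversation. The -v flag
# makes socat write the transferred data to stderr, which we append
# to proxy.log.
socat -v TCP-LISTEN:11435,fork,reuseaddr TCP:127.0.0.1:11434 2>> proxy.log
```

Point OpenClaw at port 11435 instead of 11434 and every request/response pair lands in proxy.log.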
- Examine why and how OpenClaw creates overhead by wrapping user requests in layers of pre-written prompts/instructions.
  - We'll see why a prompt like 'Hello OpenClaw' can generate a request where the LLM has to process thousands of tokens.
- We’ll look at ways to monitor/reduce the amount of tokens consumed per request.
- A lot of people have a lot of ideas about this!
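One concrete way to monitor token usage is to bypass OpenClaw and query Ollama directly: Ollama's /api/generate response reports prompt_eval_count (input tokens) and eval_count (output tokens). A rough sketch, assuming Ollama is on its default port:

```shell
# Send a bare prompt straight to Ollama and print the token counts it
# reports. Comparing these numbers against the same prompt sent through
# OpenClaw (visible in the proxy logs) shows the wrapper overhead.
curl -s http://localhost:11434/api/generate \
  -d '{"model": "qwen2.5-coder:7b", "prompt": "Hello", "stream": false}' |
  python3 -c 'import json,sys; r=json.load(sys.stdin); print("input tokens:", r.get("prompt_eval_count"), "output tokens:", r.get("eval_count"))'
```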
- We'll look at how the response to a request like 'Create a folder called MyNewFolder' is interpreted and eventually executed as an action within OpenClaw.
- Finally, we'll hack the OpenClaw system prompt to make the agent believe in its heart of hearts that it is no longer your personal assistant. We will turn this thing into Shagga, Son of Dolf! (AKA: Shagga-Claw)
- It may require tribute of steel and the promise of battle before Shagga-Claw performs tasks for you, but that’s the price of doing business with the Tribesmen of the Vale!
System Requirements:
- Kali/Ubuntu VM with:
- 16GB of RAM
- You can try this with 8GB, but I’m going to suggest 16GB to be safe.
- 15GB of free disk space (to be safe)
Make sure Docker is installed:
sudo apt install docker.io docker-compose
Clone the Project Repository and run the setup.sh script.
- The script will prompt you as it installs each component
- Be patient. It has to download quite a bit.
- Ollama, OpenClaw, the LLM, etc….
git clone https://github.com/androidteacher/DisposeClaw-Local-LLM-OpenClaw-Tutorial.git
cd DisposeClaw-Local-LLM-OpenClaw-Tutorial
./setup.sh
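Once setup.sh finishes, it's worth sanity-checking that everything actually came up before opening the chat app. A quick sketch — the exact container names are assumptions, check the repo's docker-compose.yml for the real ones:

```shell
# List running containers: you should see entries for Ollama, OpenClaw,
# and the socat proxy.
docker ps --format '{{.Names}}\t{{.Status}}'

# Ask Ollama which models it has pulled; qwen2.5-coder:7b should appear.
curl -s http://localhost:11434/api/tags
```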

Evaluate a Simple Request using the Web UI Chat App