I will never understand why people don't just give OpenClaw instances their own limited set of credentials, like you would a new employee.
noduerme•1h ago
This seems like a sensible thing to build if you're going to allow it to run on your actual machine, but... I'd like to hear more context about why it was allowed to run on a machine with your SSH keys, and how it tried to exfiltrate them, how you caught it doing so, etc.
enjoykaz•58m ago
Shell RC files (`~/.bashrc`, `~/.zshrc`) are write-protected in the rules but not read-protected. OpenAI's own quickstart tells you to put your API key there — so anyone who followed that tutorial has `OPENAI_API_KEY` sitting in their zshrc, readable by the agent. DLP is the only backstop, and only for known formats. Am I reading the rules wrong?
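To make the exposure concrete: if rc files are readable, any process running as the user can pull keys out of them with a few lines of code — no exploit needed. A minimal sketch (the filenames and the `OPENAI_API_KEY` convention come from the comment above; the scan logic itself is illustrative, not the agent's actual behavior):

```python
# Illustrative sketch: read-access to shell rc files is enough to leak
# any API key exported there, the pattern OpenAI's quickstart suggests.
import re
from pathlib import Path

# Matches lines like: export OPENAI_API_KEY="sk-..."
KEY_PATTERN = re.compile(r'export\s+(\w*API_KEY)\s*=\s*["\']?([^"\'\s]+)')

def scan_rc_files(home=Path.home()):
    """Return {var_name: value} for API keys exported in common rc files."""
    found = {}
    for name in (".bashrc", ".zshrc", ".profile"):
        path = home / name
        if path.exists():
            for match in KEY_PATTERN.finditer(path.read_text(errors="ignore")):
                found[match.group(1)] = match.group(2)
    return found
```

Which is why read-protection (or just not keeping keys in rc files) matters as much as write-protection.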
Zekio•1h ago
Or, even more limiting, use proxies: when setting up the LLM connection, instead of giving it the OpenRouter or OpenAI API keys directly, give it access via a proxy running on a machine next to it.
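The proxy idea above can be sketched in a few dozen lines: the agent is pointed at the proxy's URL instead of `api.openai.com`, and the real key lives only in the proxy's environment, so even a fully compromised agent box never sees it. A minimal sketch (the upstream URL, port, and env var name are assumptions, not a real deployment):

```python
# Minimal key-holding LLM proxy sketch. The agent machine gets only
# http://proxy-host:8080 as its API base; the real key never leaves here.
import os
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

UPSTREAM = "https://api.openai.com"  # assumed upstream; OpenRouter works the same way

def build_upstream_headers(client_headers, api_key):
    """Drop whatever auth the agent sent and inject the real key server-side."""
    headers = {k: v for k, v in client_headers.items()
               if k.lower() not in ("authorization", "host", "content-length")}
    headers["Authorization"] = f"Bearer {api_key}"
    return headers

class KeyInjectingProxy(BaseHTTPRequestHandler):
    def do_POST(self):
        body = self.rfile.read(int(self.headers.get("Content-Length", 0)))
        headers = build_upstream_headers(dict(self.headers),
                                         os.environ["OPENAI_API_KEY"])
        req = urllib.request.Request(UPSTREAM + self.path, data=body,
                                     headers=headers, method="POST")
        with urllib.request.urlopen(req) as resp:
            self.send_response(resp.status)
            self.send_header("Content-Type",
                             resp.headers.get("Content-Type", "application/json"))
            self.end_headers()
            self.wfile.write(resp.read())

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8080), KeyInjectingProxy).serve_forever()
```

A real deployment would add rate limits, request logging, and an allowlist of endpoints, which is exactly where you'd also bolt on per-agent quotas.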