OpenCrabs AI Agent is a new open‑source framework that lets you run a fully autonomous AI bot on your own servers. It learns from interactions, fixes its own mistakes, and can be customized for many tasks—from answering questions to automating workflows. In this article we’ll walk through what makes OpenCrabs special, how to set it up, and why it’s a great choice for developers who want full control over their AI agents.
What Is the OpenCrabs AI Agent?
OpenCrabs AI Agent is a self‑hosted, self‑learning, and self‑healing bot. Unlike many cloud‑based assistants that rely on a single provider, OpenCrabs runs locally and can connect to any LLM (Large Language Model) you choose. It is built in Rust and uses a terminal‑based UI (TUI) that shows the conversation, tool calls, and logs in real time.
Key features include:
- Multi‑profile support – Run several isolated bots from one installation, each with its own memory and configuration.
- Self‑healing – If the bot crashes or a provider fails, OpenCrabs automatically recovers and continues where it left off.
- Tool integration – Call external tools (CLI commands, APIs, web browsers) directly from the conversation.
- CLI and TUI – Use the command line or a full‑screen interface to interact with the bot.
- Open‑source – All code is on GitHub, so you can inspect, modify, or extend it.
OpenCrabs is ideal for developers who want to experiment with autonomous agents without paying for cloud usage or giving up data privacy.
Why Choose OpenCrabs Over Other Agent Frameworks?
Many agent frameworks exist, such as LangChain, Agentic, or proprietary solutions from big vendors. OpenCrabs stands out because:
- Full control – You host the bot yourself, so your data never leaves your network.
- Low cost – You only pay for the LLM you run locally (e.g., open‑source models or paid APIs you already own).
- Rapid iteration – The code is modular; you can swap out the LLM provider or add new tools without rewriting the whole system.
- Self‑healing – The bot can recover from crashes or provider outages automatically, reducing downtime.
- Community‑driven – The project is growing fast, with an expanding list of skills and integrations on the ClawHub registry.
If you’re building a chatbot for internal use, a personal assistant, or a research prototype, OpenCrabs gives you the flexibility to tailor the agent exactly to your needs.
Installing OpenCrabs AI Agent
Below is a step‑by‑step guide to get OpenCrabs up and running on a Linux machine. The process is similar on macOS and Windows (via WSL).
1. Install Rust and Cargo
OpenCrabs is written in Rust, so you need the Rust toolchain:
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
source $HOME/.cargo/env
2. Clone the Repository
git clone https://github.com/adolfousier/opencrabs.git
cd opencrabs
3. Build the Binary
cargo build --release
The compiled binary will be in target/release/opencrabs.
4. Configure Your LLM Provider
OpenCrabs supports many providers (OpenAI, Anthropic, Mistral, Gemini, etc.). Create a config.toml file in the project root:
[provider]
name = "openai"
api_key = "YOUR_OPENAI_KEY"
model = "gpt-4o-mini"
[tools]
# Enable the browser tool
browser = true
Replace YOUR_OPENAI_KEY with your actual key. You can also use local models by setting name = "local" and pointing to the model path.
5. Run the Bot
./target/release/opencrabs
You’ll see a TUI that shows the conversation and tool calls. Type /help to see available commands.
6. Create Multiple Profiles
If you want to run several bots simultaneously, use the profile commands:

opencrabs profile create dev
opencrabs profile create prod
Switch between them with:
opencrabs -p dev
Each profile has its own memory, config, and logs.
Using OpenCrabs: A Practical Example
Let’s walk through a real‑world scenario: building a customer support bot that can answer FAQs, fetch ticket status, and open a browser to a knowledge base.
- Enable the Browser Tool – In config.toml, set browser = true. OpenCrabs will use the built‑in browser tool to open URLs and scrape content.
- Add a Ticket API Tool – Create a new tool script in tools/ticket_api.py that calls your ticketing system’s REST API.
- Define a Skill – In the skills directory, add a skill file support_skill.toml that tells the agent how to use the browser and ticket API tools.
- Run the Bot – Start OpenCrabs and ask: “What’s the status of ticket #1234?” The bot will call the ticket API, fetch the status, and if needed, open the knowledge base page to provide additional context.
Because OpenCrabs stores conversation history locally, the bot remembers past interactions and can improve its responses over time.
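The skill file format isn’t shown in this article, so the sketch below is purely illustrative of what a support_skill.toml tying the browser and ticket tools together might look like. The table and key names (skill, tools, instructions) are assumptions, not the real OpenCrabs schema:

```toml
# Hypothetical skill definition -- key names are illustrative,
# not the actual OpenCrabs schema
[skill]
name = "customer_support"
description = "Answer FAQs, look up ticket status, and consult the knowledge base"

# Tools this skill may call: the built-in browser tool and the
# custom script from tools/ticket_api.py
tools = ["browser", "ticket_api"]

[skill.instructions]
text = """
When the user asks about a ticket, call ticket_api with the ticket number.
If more context is needed, open the knowledge base with the browser tool.
"""
```

Check the repository’s skills directory for real examples before writing your own; the point here is only that a skill pairs a description and instructions with a list of permitted tools.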
Self‑Healing and Recovery
One of the most impressive aspects of OpenCrabs is its self‑healing capability. If the bot crashes or a provider becomes unavailable, the following steps happen automatically:
- Crash Recovery – The bot restarts and reloads the last conversation from the SQLite database.
- Provider Health Tracking – OpenCrabs logs success/failure rates for each provider. If a provider fails repeatedly, the bot can switch to a backup provider.
- Session Persistence – Tool call results are saved to the database, so you never lose data even if the bot stops mid‑conversation.
These features reduce maintenance overhead and make OpenCrabs reliable for production use.
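To make provider switching concrete, here is a hypothetical extension of the config.toml shown earlier that declares a backup provider. The fallback table name is an assumption for illustration; consult the project documentation for the actual syntax:

```toml
# Hypothetical fallback configuration -- the [provider.fallback] table is
# illustrative; verify the real key names in the OpenCrabs docs
[provider]
name = "openai"
api_key = "YOUR_OPENAI_KEY"
model = "gpt-4o-mini"

# If the primary provider fails repeatedly, fall back to a local model
[provider.fallback]
name = "local"
model_path = "/models/llama3"
```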
Extending OpenCrabs
OpenCrabs is designed to be modular. You can add new tools, change the LLM provider, or tweak the routing logic with minimal effort.
Adding a New Tool
- Create a script in the tools folder (e.g., weather.py).
- Define the tool’s name, description, and arguments in a TOML file.
- Update the skills configuration to include the new tool.
OpenCrabs will automatically detect the new tool and make it available in conversations.
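As a sketch of step two, a TOML descriptor for the hypothetical weather.py tool might look like the following. Every field name here (tool, command, args) is an assumption made for illustration; the real manifest format may differ:

```toml
# Hypothetical tool descriptor for tools/weather.py -- field names are
# illustrative, not the actual OpenCrabs manifest format
[tool]
name = "weather"
description = "Fetch the current weather for a given city"
command = "python tools/weather.py"

# One table per argument the agent can pass to the script
[[tool.args]]
name = "city"
type = "string"
required = true
```

Whatever the exact schema, the name and description are what the LLM sees when deciding whether to call the tool, so keep them short and unambiguous.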
Switching LLM Providers
If you want to use a local model like Llama 3, edit config.toml:
[provider]
name = "local"
model_path = "/models/llama3"
OpenCrabs will load the model via the local inference engine.
Customizing the UI
The TUI is built with the Ratatui library. If you prefer a web interface, you can fork the project and add a simple HTTP server that forwards messages to the core engine.
Community and Ecosystem
OpenCrabs has a growing community of developers and researchers. The ClawHub registry hosts over 5,000 modular skills that can be imported into your bot. You can also contribute new skills or tools by opening a pull request on GitHub.
The project’s GitHub page is active, with frequent releases and issue discussions. The latest release (0.2.94) added multi‑profile support, profile migration, and improved CLI stream handling.
Future Roadmap
The OpenCrabs team plans to add:
- Graph‑based memory – Better long‑term context handling.
- Advanced debugging tools – Visualize tool call graphs.
- Marketplace integration – Easy import of third‑party skills.
- Enhanced security – Fine‑grained access control for tools.
These updates will make OpenCrabs even more powerful for enterprise deployments.
Conclusion
OpenCrabs AI Agent gives developers a powerful, self‑hosted platform for building autonomous AI bots. With its self‑learning, self‑healing, and multi‑profile capabilities, it’s a great fit for anyone who wants full control over their AI infrastructure. Whether you’re building a personal assistant, a customer support bot, or a research prototype, OpenCrabs offers the flexibility and reliability you need.
If you’re interested in trying OpenCrabs, check out the GitHub repository, explore the ClawHub skills, and start building your own autonomous agent today.