OpenCrabs is a new kind of AI bot that you can run on your own computer or server. It learns from the data you give it, fixes itself when something goes wrong, and can run many different tasks at once. In this article we’ll walk through what OpenCrabs is, why it matters, how it works, and how you can start using it today.
What Is OpenCrabs?
OpenCrabs is a self‑hosted AI agent written in Rust. It is designed to be lightweight, fast, and easy to extend. Unlike cloud‑based assistants that rely on a single provider, OpenCrabs can connect to multiple LLMs (Large Language Models) and run them locally or in the cloud. It also has a built‑in “self‑healing” system that recovers from crashes, corrupted configuration files, or database errors without manual intervention.
Key Features
- Self‑healing – automatically recovers from corrupted config files, database corruption, or provider failures.
- Multi‑profile support – run several isolated bots from one installation, each with its own memory and settings.
- Token‑lock isolation – prevents two bots from using the same API token at the same time.
- CLI and TUI – command‑line interface and terminal UI for quick setup and debugging.
- Extensible tool calls – can call external tools (e.g., Git, curl, custom scripts) and return results to the model.
- Open‑source – all code is on GitHub, so you can audit, modify, or contribute.
OpenCrabs is ideal for developers who want a private, customizable AI assistant that can run on a laptop, a home server, or a cloud VM.
Why OpenCrabs Matters
Many people rely on cloud‑based AI services like OpenAI or Anthropic. Those services are convenient but come with limits: data privacy concerns, cost, and dependency on a single provider. OpenCrabs gives you full control:
- Privacy – all data stays on your machine unless you choose to send it elsewhere.
- Cost control – you can switch between free local models or paid cloud models without changing the bot.
- Reliability – the self‑healing system keeps the bot running even if a provider goes down.
- Extensibility – you can add new tools or integrate with other services (e.g., GitHub, Slack) without waiting for a vendor.
For developers building AI‑powered tools, OpenCrabs offers a solid foundation that can be customized to fit any workflow.
Architecture Overview
Below is a simplified diagram of how OpenCrabs works.
(Image alt text: “OpenCrabs AI agent architecture diagram showing user, CLI, TUI, LLM providers, tool calls, and self‑healing components.”)
- User Interface – CLI or TUI where you type commands or interact with the bot.
- Router – decides which LLM provider to use based on the request.
- LLM Provider – could be local (e.g., llama.cpp) or cloud (e.g., OpenAI, Anthropic).
- Tool Executor – runs external commands and feeds results back to the LLM.
- Self‑Healing Engine – monitors health, recovers from crashes, and restores configuration.
The router is the heart of OpenCrabs. It keeps track of which provider is healthy, which tokens are locked, and which profile is active. When a request comes in, the router forwards it to the chosen provider, collects the response, and sends it back to the user.
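The routing idea described above can be sketched in a few lines of Rust. This is a minimal illustration, not OpenCrabs' actual API: the type and method names (`Router`, `mark_unhealthy`, `route`) are assumptions, and the real router also handles token locks and profiles.

```rust
use std::collections::HashMap;

// Illustrative sketch: the router tracks per-provider health and
// forwards each request to the first provider still marked healthy.
struct Router {
    providers: Vec<String>,
    healthy: HashMap<String, bool>,
}

impl Router {
    fn new(providers: Vec<String>) -> Self {
        // Every provider starts out healthy.
        let healthy = providers.iter().map(|p| (p.clone(), true)).collect();
        Router { providers, healthy }
    }

    // Record a failure so the provider is skipped until it recovers.
    fn mark_unhealthy(&mut self, provider: &str) {
        self.healthy.insert(provider.to_string(), false);
    }

    // Pick the first provider whose health flag is still good.
    fn route(&self) -> Option<&str> {
        self.providers
            .iter()
            .find(|p| *self.healthy.get(*p).unwrap_or(&false))
            .map(|s| s.as_str())
    }
}
```

With two providers configured, marking the first one unhealthy makes the router fall through to the second, which is the failover behavior the article describes.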
Installing OpenCrabs
OpenCrabs is available on GitHub. The installation steps are straightforward:
# Clone the repository
git clone https://github.com/adolfousier/opencrabs.git
cd opencrabs
# Build the binary
cargo build --release
# Run the bot
./target/release/opencrabs
If you prefer a pre‑built binary, check the releases page on GitHub. The binary is available for Linux, macOS, and Windows.
Setting Up a Profile
Profiles let you run multiple bots from the same installation. Each profile has its own config, memory, and sessions.
# Create a new profile named “hermes”
opencrabs profile create hermes
# Switch to the hermes profile
opencrabs -p hermes
# Start the bot
opencrabs
You can export or import profiles as .tar.gz archives, making it easy to share settings between machines.
Self‑Healing in Action
One of the most impressive aspects of OpenCrabs is its self‑healing capability. Here’s how it works:
- Configuration Recovery – If config.toml becomes corrupted, OpenCrabs restores the last‑known‑good snapshot automatically.
- Database Integrity – On startup, the bot runs PRAGMA integrity_check on its SQLite database. If corruption is detected, it notifies you and attempts to recover.
- Provider Health Tracking – The bot keeps a JSON file with success/failure history for each provider. If a provider fails repeatedly, the router stops using it until it recovers.
- Token Locking – PID‑based lock files prevent two profiles from using the same API token, avoiding race conditions.
These features mean that even if a configuration file is accidentally deleted or a network outage occurs, the bot keeps running and recovers automatically.
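The PID‑based token locking from the list above can be sketched as follows. This is a hedged illustration: the lock‑file name, layout, and function names are assumptions, not OpenCrabs' actual on‑disk format, and a production version would also check whether the recorded PID is still alive.

```rust
use std::fs;
use std::path::Path;

// A profile writes its own PID into a lock file next to the token.
// A second profile that finds an existing lock refuses to use that
// token, which prevents two bots from sharing one API token.
fn acquire_token_lock(lock_path: &Path) -> Result<(), String> {
    if lock_path.exists() {
        let holder = fs::read_to_string(lock_path).map_err(|e| e.to_string())?;
        return Err(format!("token already locked by PID {}", holder.trim()));
    }
    fs::write(lock_path, std::process::id().to_string()).map_err(|e| e.to_string())
}

// Remove the lock file when the profile shuts down.
fn release_token_lock(lock_path: &Path) {
    let _ = fs::remove_file(lock_path);
}
```

A second acquisition attempt fails until the first holder releases the lock, which is the race‑condition avoidance the article refers to.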
Extending OpenCrabs
OpenCrabs is designed to be modular. You can add new tools or change the LLM provider without touching the core code.
Adding a Custom Tool
- Create a new Rust file in src/tools/.
- Implement the Tool trait, which requires a name, description, and run method.
- Register the tool in src/main.rs by adding it to the tools vector.
Once registered, the tool will appear in the bot’s help menu and can be called from the chat.
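Based on the three required methods listed above, the Tool trait likely looks roughly like this. The exact signatures in OpenCrabs may differ; the `EchoTool` example is purely illustrative.

```rust
// Sketch of the trait inferred from the steps above: a name, a
// description shown in the help menu, and a run method that takes
// arguments and returns either output or an error string.
pub trait Tool {
    fn name(&self) -> &str;
    fn description(&self) -> &str;
    fn run(&self, args: Vec<String>) -> Result<String, String>;
}

// A toy tool that just echoes its arguments back to the chat.
pub struct EchoTool;

impl Tool for EchoTool {
    fn name(&self) -> &str { "echo" }
    fn description(&self) -> &str { "Echo the given arguments" }
    fn run(&self, args: Vec<String>) -> Result<String, String> {
        Ok(args.join(" "))
    }
}
```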
Switching LLM Providers
OpenCrabs supports many providers out of the box: OpenAI, Anthropic, Gemini, and local models via llama.cpp. To switch providers:
- Edit config.toml and set the provider field.
- Add your API key under the provider section.
- Restart the bot.
The router will automatically detect the new provider and start using it.
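A config.toml along these lines would select a provider. Beyond the `provider` field named in the steps above, the section and key names here are illustrative assumptions; check the project's sample configuration for the exact schema.

```toml
# Choose the active LLM provider (e.g., "openai", "anthropic", or a local model).
provider = "openai"

# Illustrative provider section; the exact key names may differ.
[openai]
api_key = "YOUR_API_KEY_HERE"
```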
Use Cases
OpenCrabs can be used in many scenarios:
- Personal Assistant – schedule meetings, draft emails, or browse the web.
- Developer Tool – run code linting, generate documentation, or manage GitHub issues.
- Customer Support – answer FAQs, route tickets, or provide product information.
- Research Assistant – fetch academic papers, summarize articles, or generate research outlines.
Because it is self‑hosted, you can keep sensitive data local, which is especially useful for regulated industries.
Comparison with Cloud‑Based Bots
| Feature | OpenCrabs | Cloud Bot (e.g., OpenAI) |
|---|---|---|
| Privacy | Full control over data | Data sent to provider |
| Cost | Pay for compute only | Pay per token |
| Reliability | Self‑healing, multi‑provider | Single provider, no self‑healing |
| Extensibility | Add custom tools, local models | Limited to provider APIs |
| Setup | Requires local install | No install needed |
OpenCrabs offers a compelling alternative for teams that need privacy, cost control, and flexibility.
Getting Started with a Sample Project
Let’s walk through a quick example: building a simple “Git Assistant” that can pull the latest changes, run tests, and report results.
- Create a new tool – src/tools/git.rs:
use std::process::Command;

// The Tool trait comes from OpenCrabs' tool system (import path assumed).
use crate::tools::Tool;

pub struct GitTool;

impl Tool for GitTool {
    fn name(&self) -> &str { "git" }
    fn description(&self) -> &str { "Run git commands" }
    fn run(&self, args: Vec<String>) -> Result<String, String> {
        // Spawn git with the supplied arguments and capture its output.
        let output = Command::new("git")
            .args(&args)
            .output()
            .map_err(|e| e.to_string())?;
        // Surface git's stderr if the command failed instead of
        // silently returning an empty stdout.
        if !output.status.success() {
            return Err(String::from_utf8_lossy(&output.stderr).to_string());
        }
        Ok(String::from_utf8_lossy(&output.stdout).to_string())
    }
}
- Register the tool – add GitTool to the tools vector in main.rs.
- Use it – in the chat, type git status or git pull.
- Result – the bot returns the output of the command.
This simple example shows how quickly you can extend OpenCrabs to fit your workflow.
Future Roadmap
OpenCrabs is actively developed. The latest release (0.2.94) added several new features:
- Multi‑profile support – run isolated bots from one installation.
- Profile migration and export/import – share profiles easily.
- Token‑lock isolation – prevent token conflicts.
- Profile‑aware daemon services – run each profile as a separate systemd unit.
Upcoming plans include:
- Web UI – a browser interface for easier interaction.
- Graphical TUI enhancements – better split‑pane support.
- More provider integrations – support for additional LLMs.
- Advanced self‑healing – automatic database repair and backup.
Keep an eye on the GitHub releases page for the next updates.
Conclusion
OpenCrabs AI Agent is a powerful, self‑hosted solution for developers who want a private, reliable, and extensible AI assistant. Its self‑healing architecture, multi‑profile support, and easy tool integration make it a standout choice for teams that need full control over their AI workflows. Whether you’re building a personal assistant, a developer tool, or a customer support bot, OpenCrabs gives you the flexibility to tailor the experience to your exact needs.
If you’re ready to try OpenCrabs, head over to the GitHub repository, clone the repo, and start building your own AI bot today.