OpenCrabs is a new open‑source AI agent that runs directly on your own computer or server. It can learn from the data you give it, recover on its own when something goes wrong, and connect to many other tools. In this article we will explain what it is, why it matters, how to set it up, and how it can help you in real projects. We will also show how it fits into the larger Neura AI ecosystem and what is on the roadmap.

What is OpenCrabs?

OpenCrabs is a single‑binary program written in Rust. It presents a text‑based user interface (TUI) built with Ratatui, and you can run it in a terminal or inside a Docker container. The name “Crabs” comes from the way the agent can move around in different “profiles” and “channels” like a crab on a beach.

Core Features

  • Self‑Hosted – You own the data and the code. No cloud provider stores your conversations.
  • Self‑Learning – The agent can add new skills and remember past interactions.
  • Self‑Healing – If a tool fails or a configuration file gets corrupted, OpenCrabs can recover automatically.
  • Multi‑Profile Support – Run several isolated bots from one installation, each with its own memory and settings.
  • Provider Health Tracking – The agent keeps a history of successes and failures for each external API.

Architecture Overview

OpenCrabs is split into three main parts:

  1. Core Engine – Handles the conversation loop, tool calls, and state persistence.
  2. Channel Layer – Connects to messaging platforms (Telegram, Discord, Slack, WhatsApp, Trello) and to the command‑line interface.
  3. Provider Layer – Manages calls to external LLMs, APIs, and local tools.

The engine stores all messages in a SQLite database. When the agent restarts, it reads the database and continues where it left off. The channel layer uses a simple event system so that each platform can send and receive messages without interfering with the core logic.

Why Self‑Hosted AI Agents Matter

Privacy and Control

Because OpenCrabs runs on your own hardware, you never send sensitive data to a third‑party server. This is important for companies that handle confidential information or for developers who want to keep their experiments private.

Customization and Extensibility

OpenCrabs is built to be extended. You can add new tools, change the way it talks, or plug it into other services. The multi‑profile feature lets you keep separate bots for different teams or projects without installing multiple copies.

Getting Started with OpenCrabs

System Requirements

  • Linux, macOS, or Windows with WSL
  • Rust 1.70+ (for building from source)
  • Docker (optional, for containerized deployment)
  • An API key for an LLM provider (OpenAI, Anthropic, Gemini, etc.)

Installation Steps

  1. Clone the repository

    git clone https://github.com/adolfousier/opencrabs.git
    cd opencrabs
    
  2. Build the binary

    cargo build --release
    
  3. Move the binary to a directory in your PATH

    sudo mv target/release/opencrabs /usr/local/bin/
    
  4. Create a configuration file

    opencrabs init
    

    This will create ~/.opencrabs/config.toml with placeholders for your API keys.

  5. Start the agent

    opencrabs
    

    The TUI will open, showing the conversation history and available commands.
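To give you a sense of what the generated configuration looks like, here is an illustrative skeleton. The section and key names below are assumptions for the sake of example; compare them against the placeholders that `opencrabs init` actually writes before editing:

```toml
# ~/.opencrabs/config.toml — illustrative skeleton; key names are assumptions
[providers.openai]
api_key = "sk-..."          # replace with your real key

[providers.anthropic]
api_key = "..."

[agent]
default_profile = "default" # profile used when none is passed with -p
```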

Running Your First Bot

OpenCrabs can talk to you through the terminal or through a messaging platform. To test it in the terminal:


  1. Type a question, e.g. “What is the weather in Paris?”
  2. The agent will call the weather API, format the answer, and display it.

If you want to use Telegram:

  1. Add the bot token to config.toml.
  2. Run opencrabs -p telegram to start the Telegram channel.
  3. Send a message to the bot from your phone.
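The bot token goes into config.toml before you start the Telegram channel. A minimal fragment might look like the sketch below; the section and key names are assumptions, so match them to the placeholders your generated config uses:

```toml
# Illustrative fragment — section/key names are assumptions
[channels.telegram]
token = "123456:ABC-DEF..."  # the token @BotFather gives you
```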

Advanced Features

Multi‑Profile Support

OpenCrabs can run several profiles at once. Each profile has its own configuration, memory, and channel connections. This is useful when you need separate bots for different departments or for testing new features without affecting the main bot.

opencrabs profile create dev
opencrabs -p dev

Self‑Healing and Recovery

If the configuration file becomes corrupted, OpenCrabs will automatically restore the last good copy. It also keeps a log of provider failures and can retry failed calls. When a tool crashes, the agent will not lose the conversation context; it will simply skip the failed step and continue.

Provider Health Tracking

The /doctor command shows the health of each provider. It lists the number of successful and failed calls, helping you spot problems early. This feature is available on all channels, so you can check it from Telegram, Discord, or the terminal.

CLI Enhancements

The command‑line interface now supports longer context windows and better error handling. If a tool call fails, the agent retries automatically up to three times before giving up. Configurable timeout settings also give slow providers more time to respond instead of aborting the request prematurely.
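If these retry and timeout knobs are exposed through config.toml, the relevant fragment might look like the sketch below. The key names here are illustrative assumptions, not documented settings, so verify them against your own config file:

```toml
# Illustrative — confirm the real key names in your config.toml
[cli]
max_retries = 3            # retry failed tool calls up to three times
request_timeout_secs = 120 # allow slow providers more time to respond
```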

Integrating with Neura AI Ecosystem

OpenCrabs can work together with Neura AI’s router and other apps. The Neura Router is a single API endpoint that can call over 500 AI models. By adding the router to OpenCrabs’ provider list, you can switch between models without changing the code.

[providers]
router = { url = "https://router.meetneura.ai", api_key = "YOUR_KEY" }

Using Neura Router

The router supports advanced routing logic, so you can tell OpenCrabs to use a specific model for certain tasks. For example, use a cheaper model for casual chat and a more powerful one for code generation.
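One way such task‑based routing could be expressed in config.toml is sketched below. The routing table and the model identifiers are assumptions for illustration; only the `[providers]` line mirrors the fragment shown earlier:

```toml
[providers]
router = { url = "https://router.meetneura.ai", api_key = "YOUR_KEY" }

# Hypothetical per-task routing table — key names and model IDs are assumptions
[routing]
chat = "small-fast-model"  # cheap model for casual conversation
code = "large-code-model"  # stronger model for code generation
```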

Connecting to Neura Apps

OpenCrabs can trigger Neura apps like Neura Artifacto or Neura ACE. For instance, after generating a blog post, you can automatically send it to Neura ACE for SEO optimization. This creates a seamless workflow from conversation to published content.

Real‑World Use Cases

Customer Support Automation

A small e‑commerce store can run OpenCrabs on a local server to answer common questions. Because the bot is self‑hosted, the store keeps all customer data on its own network. The bot can also hand off complex issues to a human agent.

Internal Knowledge Base

A software team can use OpenCrabs to answer questions about code, documentation, or deployment. The bot can pull information from internal wikis, GitHub, or Jira, and it can learn new facts from the team’s conversations.

DevOps Monitoring

OpenCrabs can watch logs, run health checks, and alert on anomalies. By connecting to monitoring tools like Prometheus or Grafana, the bot can provide real‑time status updates to Slack or Discord.

Future Roadmap

Upcoming Features

  • Graph‑Based Memory – A new way to store context that makes it easier to retrieve related information.
  • Web Scraping Tool – Let the bot fetch data from websites automatically.
  • Advanced Error Recovery – More granular control over which failures trigger retries.

Community Contributions

OpenCrabs is open source, so developers can add new tools, improve the UI, or create new channel adapters. The community can also share profiles that work well for specific industries.

Conclusion

OpenCrabs AI Agent is a powerful, self‑hosted solution that lets you run an AI assistant on your own hardware. Its self‑learning and self‑healing capabilities make it reliable, while the multi‑profile support gives you flexibility. By integrating with Neura AI’s router and apps, you can build a complete AI workflow that stays private and customizable. Whether you’re a developer, a small business, or a large organization, OpenCrabs offers a solid foundation for building intelligent assistants that stay under your control.