Open Source · Written in Rust · MIT License

The AI agent that
lives on your server

A self-hostable, self-improving agent runtime. Chat from your terminal, connect to Telegram / Discord / Slack / LINE / WhatsApp, or call over HTTP — all from a single 10 MB binary.

terminal
$ garudust setup
✓ Provider: Anthropic
✓ Model: claude-sonnet-4-6
✓ Config saved to ~/.garudust/
$ garudust
Garudust 0.4.0 — type your task, Ctrl+C to quit
you > summarise last week's git log
⟳ thinking…
~10 MB
binary size
<20 ms
cold start
200+
LLM models
7
platforms supported

Capabilities

Everything you'd expect — and more

Self-improving

Saves your preferences and corrections to persistent memory. Gets smarter every session without repeating yourself.

Multilingual by default

Detects Thai, Chinese, Japanese, Arabic, Korean and more automatically. No configuration needed.

Reusable skills

Save multi-step workflows as Markdown skills. Hot-reloaded on every call — edit a file and the next message picks it up.
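As an illustration, a skill file might look like the sketch below. The `~/.garudust/skills/` location and the exact file layout are assumptions for illustration, not documented paths:

```markdown
<!-- ~/.garudust/skills/release-notes.md (hypothetical location) -->
# Release notes

When asked for release notes:
1. Run `git log --oneline` since the last tag.
2. Group commits by area (fixes, features, docs).
3. Output a Markdown changelog section.
```

Because skills are hot-reloaded, saving an edit to this file takes effect on the very next message.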

Provider-agnostic

Swap between Anthropic, OpenRouter, Bedrock, Ollama, vLLM, or any OpenAI-compatible endpoint with one env var.
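For example, a provider switch is just a different line in the env file (the Ollama port below is an assumption about a typical local setup):

```shell
# ~/.garudust/.env — set exactly one provider
OLLAMA_BASE_URL=http://localhost:11434   # local Ollama, no key needed
# ANTHROPIC_API_KEY=sk-ant-...           # or Anthropic direct
```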

Headless server

HTTP gateway with blocking, streaming, and WebSocket endpoints. Prometheus metrics included. Deploy anywhere Docker runs.
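A blocking request against the gateway might look like this sketch. The `/chat` path and the JSON body shape are assumptions (the docs only name `/chat*` endpoints); check the server reference for the real contract:

```shell
# blocking chat request against a local garudust-server
curl -s http://localhost:3000/chat \
  -H "Authorization: Bearer $GARUDUST_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"message": "summarise git log"}'
```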

Composable crates

Every piece is an independent crate. Add a tool, platform, or transport in under 100 lines without touching anything else.
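To picture the crate boundary, here is a minimal sketch of what a plug-in tool crate's surface might look like. The `Tool` trait below is hypothetical and only illustrates the shape of the extension point, not Garudust's actual API:

```rust
// Hypothetical sketch: the surface a small tool crate might expose.
// The `Tool` trait here is illustrative, not Garudust's real interface.

/// A tool the agent can invoke by name with a string payload.
pub trait Tool {
    fn name(&self) -> &str;
    fn run(&self, input: &str) -> Result<String, String>;
}

/// Example tool: word count, implemented without touching any other crate.
pub struct WordCount;

impl Tool for WordCount {
    fn name(&self) -> &str {
        "word_count"
    }

    fn run(&self, input: &str) -> Result<String, String> {
        // Split on Unicode whitespace and count the pieces.
        Ok(input.split_whitespace().count().to_string())
    }
}

fn main() {
    let tool = WordCount;
    // prints "word_count -> 3"
    println!("{} -> {}", tool.name(), tool.run("one two three").unwrap());
}
```

The point of the sketch: a new tool is one trait implementation in one crate, with no edits elsewhere.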

Get started

Up and running in 60 seconds

01

Install

# download for your platform
curl -LO https://github.com/garudust-org/garudust-agent/releases/latest/download/garudust-linux-x64.tar.gz
tar -xzf garudust-*.tar.gz
sudo mv garudust /usr/local/bin/
02

Configure

# interactive wizard
garudust setup
✓ Pick provider
✓ Enter API key
✓ Choose model
# or set env directly
ANTHROPIC_API_KEY=sk-ant-...
03

Run

# interactive TUI
garudust
# or one-shot task
garudust "summarise git log"
# or headless server
garudust-server --port 3000
Production server example
# ~/.garudust/.env
ANTHROPIC_API_KEY=sk-ant-...
GARUDUST_API_KEY=my-secret-token # protect /chat* endpoints
TELEGRAM_TOKEN=123:ABC
# launch with Docker sandbox + Telegram + daily memory consolidation
GARUDUST_TERMINAL_SANDBOX=docker \
GARUDUST_MEMORY_CRON="0 3 * * *" \
garudust-server --port 3000 --approval-mode smart

Platform adapters

Meet your users where they are

Every adapter runs in the same process. Set the tokens and start the server — that's it.

Telegram
Discord
Slack
Matrix
LINE
WhatsApp
HTTP / WS

Security

Secure by design,
not by accident

Every layer is independently hardened — so a compromise at one level can't cascade to the next.

Docker sandbox

Commands run inside isolated containers with --cap-drop ALL, pid limits, and ephemeral /tmp.
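Under the hood, the container settings described above correspond roughly to flags like the following; the exact invocation, pid limit value, and image are assumptions for illustration:

```shell
# approximate shape of the sandboxed execution (illustrative, not the real invocation)
docker run --rm \
  --cap-drop ALL \
  --pids-limit 128 \
  --tmpfs /tmp \
  ubuntu:24.04 sh -c "$CMD"
```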

Hardline command blocks

Recursive root deletion, mkfs, fork bombs, block-device writes, and system shutdown are unconditionally refused.

Memory-poisoning protection

Retrieved memories are tagged <untrusted_memory> so injected jailbreaks can't masquerade as trusted instructions.
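Conceptually, a retrieved memory reaches the model wrapped like this (the wrapper format beyond the tag name is an assumption):

```xml
<untrusted_memory>
User previously said they prefer tabs over spaces.
</untrusted_memory>
```

Anything inside the tag is treated as data, not as instructions.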

Automatic secret redaction

API keys are scrubbed from tool output before reaching the model. Output is capped at 50 KB to prevent context flooding.

Sensitive path write protection

Writes to ~/.ssh, ~/.aws, shell dotfiles, and system files are always blocked.

~/.garudust/config.yaml
security:
  terminal_sandbox: docker
  terminal_sandbox_image: ubuntu:24.04
  terminal_sandbox_opts:
    - "--network=none"
    - "--memory=512m"

A missing Docker install is auto-detected — safe fallback with a clear error
Config hot-reloaded — no restart needed
Extra flags (network, memory, CPU) fully customisable

LLM Providers

Your model, your rules

Switch providers with a single environment variable. No code changes required.

| Provider | How to select | Notes |
| --- | --- | --- |
| Anthropic | `ANTHROPIC_API_KEY` | Direct Messages API |
| OpenRouter | `OPENROUTER_API_KEY` | 200+ models by default |
| AWS Bedrock | `AWS_ACCESS_KEY_ID` | SigV4 auth, Converse API |
| Ollama | `OLLAMA_BASE_URL` | Local, no key required |
| vLLM | `VLLM_BASE_URL` | OpenAI-compatible server |
| Any OpenAI-compat | `GARUDUST_BASE_URL` | Generic transport |

Ready to deploy?

One binary. No cloud lock-in. Runs on your laptop or your server.