BookwormPRO - Self-hosted AI Agent CLI for academic research


The self-improving AI agent built by BookwormPRO Project. It's the only agent with a built-in learning loop — it creates skills from experience, improves them during use, nudges itself to persist knowledge, searches its own past conversations, and builds a deepening model of who you are across sessions. Run it on a $5 VPS, a GPU cluster, or serverless infrastructure that costs nearly nothing when idle. It's not tied to your laptop — talk to it from Telegram while it works on a cloud VM.

Use any model you want — BookwormPRO Portal, OpenRouter (200+ models), NVIDIA NIM (Nemotron), Xiaomi MiMo, z.ai/GLM, Kimi/Moonshot, MiniMax, Hugging Face, OpenAI, or your own endpoint. Switch with bookworm model — no code changes, no lock-in.

A real terminal interface: Full TUI with multiline editing, slash-command autocomplete, conversation history, interrupt-and-redirect, and streaming tool output.
Lives where you do: Telegram, Discord, Slack, WhatsApp, Signal, and CLI — all from a single gateway process. Voice memo transcription, cross-platform conversation continuity.
A closed learning loop: Agent-curated memory with periodic nudges. Autonomous skill creation after complex tasks. Skills self-improve during use. FTS5 session search with LLM summarization for cross-session recall. Honcho dialectic user modeling. Compatible with the agentskills.io open standard.
Scheduled automations: Built-in cron scheduler with delivery to any platform. Daily reports, nightly backups, weekly audits — all in natural language, running unattended.
Delegates and parallelizes: Spawn isolated subagents for parallel workstreams. Write Python scripts that call tools via RPC, collapsing multi-step pipelines into zero-context-cost turns.
Runs anywhere, not just your laptop: Six terminal backends — local, Docker, SSH, Daytona, Singularity, and Modal. Daytona and Modal offer serverless persistence — your agent's environment hibernates when idle and wakes on demand, costing nearly nothing between sessions. Run it on a $5 VPS or a GPU cluster.
Research-ready: Batch trajectory generation, Atropos RL environments, trajectory compression for training the next generation of tool-calling models.
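The "scripts that call tools via RPC" idea above can be sketched as follows. This is a minimal, hypothetical example — the `call_tool` helper, the `TOOLS` registry, and the JSON envelope are assumptions for illustration, not BookwormPRO's actual API. The point is that one script issues several tool calls in a single process, so the agent spends one turn instead of one per step:

```python
import json

# Hypothetical in-process registry standing in for the agent's RPC endpoint.
TOOLS = {
    "read_file": lambda path: f"<contents of {path}>",
    "word_count": lambda text: len(text.split()),
}

def call_tool(name, **kwargs):
    """Send one JSON-RPC-style request to a tool and return its result."""
    request = json.dumps({"tool": name, "args": kwargs})
    payload = json.loads(request)  # round-trip the envelope, as a real RPC would
    return TOOLS[payload["tool"]](**payload["args"])

# A multi-step pipeline collapsed into a single scripted turn:
text = call_tool("read_file", path="notes.md")
count = call_tool("word_count", text=text)
print(count)  # → 3
```

Chaining the calls in one script keeps the intermediate `text` out of the model's context entirely — only the script and its final output occupy a turn.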

Quick Install

curl -fsSL https://raw.githubusercontent.com/huakoh/BookwormPRO/main/scripts/install.sh | bash

Works on Linux, macOS, WSL2, and Android via Termux. The installer handles the platform-specific setup for you.

Android / Termux: The tested manual path is documented in the Termux guide. On Termux, BookwormPRO installs a curated .[termux] extra because the full .[all] extra currently pulls Android-incompatible voice dependencies.

Windows: Native Windows is not supported. Please install WSL2 and run the command above.

After installation:

source ~/.bashrc    # reload shell (or: source ~/.zshrc)
bookworm              # start chatting!

Docker Compose (with host bridge)

Run BookwormPRO in Docker and let it operate on your real Desktop and project files instead of an isolated sandbox.

Linux / macOS / WSL:

git clone https://github.com/huakoh/BookwormPRO.git
cd BookwormPRO
./scripts/setup-host-bridge.sh    # writes .env with detected host paths
docker compose up -d --build

Windows (Docker Desktop):

git clone https://github.com/huakoh/BookwormPRO.git
cd BookwormPRO
.\scripts\setup-host-bridge.ps1
docker compose up -d --build

The setup scripts auto-detect your Desktop and a workspace root and write them into a repo-root .env. The container then mounts them at /host/desktop and /host/workspace and exports BOOKWORMPRO_HOST_BRIDGE=1 so the agent knows it can read, write, and delete real local files. Full background, security model, and scope-restriction options: docs/host-bridge.md.
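As a rough illustration of what the bridge flag means for code running inside the container — the `bridge_roots` helper below is hypothetical; only the BOOKWORMPRO_HOST_BRIDGE variable and the /host/* mount points come from this README:

```python
import os

def bridge_roots():
    """Return the writable host directories, or [] when the bridge is off."""
    if os.environ.get("BOOKWORMPRO_HOST_BRIDGE") != "1":
        return []  # closed-sandbox behavior: no host paths exposed
    return ["/host/desktop", "/host/workspace"]

os.environ["BOOKWORMPRO_HOST_BRIDGE"] = "1"
print(bridge_roots())  # → ['/host/desktop', '/host/workspace']
```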

To disable the bridge entirely, comment the two /host/* volume lines and the BOOKWORMPRO_HOST_BRIDGE env var in docker-compose.yml — the agent reverts to closed-sandbox behavior.
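An illustrative fragment of what that commented-out state might look like — the service name and `${...}` variable names below are assumptions; only the /host/* mount targets and the BOOKWORMPRO_HOST_BRIDGE variable come from this README:

```yaml
services:
  bookwormpro:
    volumes:
      # - ${HOST_DESKTOP}:/host/desktop        # commented out: bridge disabled
      # - ${HOST_WORKSPACE}:/host/workspace
    environment:
      # BOOKWORMPRO_HOST_BRIDGE: "1"
```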


Getting Started

bookworm              # Interactive CLI — start a conversation
bookworm model        # Choose your LLM provider and model
bookworm tools        # Configure which tools are enabled
bookworm config set   # Set individual config values
bookworm gateway      # Start the messaging gateway (Telegram, Discord, etc.)
bookworm setup        # Run the full setup wizard (configures everything at once)
bookworm claw migrate # Migrate settings, memories, and keys from OpenClaw
bookworm update       # Update to the latest version
bookworm doctor       # Diagnose any issues

📖 Full documentation →

CLI vs Messaging Quick Reference

BookwormPRO has two entry points: start the terminal UI with bookworm, or run the gateway and talk to it from Telegram, Discord, Slack, WhatsApp, Signal, or Email. Once you're in a conversation, many slash commands are shared across both interfaces.

| Action | CLI | Messaging platforms |
| --- | --- | --- |
| Start chatting | bookworm | Run bookworm gateway setup + bookworm gateway start, then send the bot a message |
| Start fresh conversation | /new or /reset | /new or /reset |
| Change model | /model [provider:model] | /model [provider:model] |
| Set a personality | /personality [name] | /personality [name] |
| Retry or undo the last turn | /retry, /undo | /retry, /undo |
| Compress context / check usage | /compress, /usage, /insights [--days N] | /compress, /usage, /insights [days] |
| Browse skills | /skills or /<skill-name> | /<skill-name> |
| Interrupt current work | Ctrl+C or send a new message | /stop or send a new message |
| Platform-specific status | /platforms | /status, /sethome |

For the full command lists, see the CLI guide and the Messaging Gateway guide.
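One way the shared slash commands could be wired — purely illustrative; the handler bodies and registry below are assumptions, not BookwormPRO's internals. The idea is that both front ends route "/..." messages through one registry, so /new behaves identically in the TUI and in a Telegram chat:

```python
# Hypothetical shared command registry; both the CLI and the gateway
# would route incoming "/..." messages through the same table.
HANDLERS = {
    "/new": lambda arg: "started a fresh conversation",
    "/reset": lambda arg: "started a fresh conversation",
    "/model": lambda arg: f"switched model to {arg}" if arg else "current model shown",
}

def dispatch(message):
    """Split '/cmd args' and run the matching handler, if any."""
    command, _, arg = message.partition(" ")
    handler = HANDLERS.get(command)
    return handler(arg) if handler else None

print(dispatch("/model openrouter:gpt-4o"))  # → switched model to openrouter:gpt-4o
```

Keeping one table also explains why a platform-specific command like /sethome can exist only on the messaging side: the gateway simply registers extra entries the CLI never does.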


Documentation

All documentation lives at bookwormpro.local/docs:

| Section | What's covered |
| --- | --- |
| Quickstart | Install → setup → first conversation in 2 minutes |
| CLI Usage | Commands, keybindings, personalities, sessions |
| Configuration | Config file, providers, models, all options |
| Messaging Gateway | Telegram, Discord, Slack, WhatsApp, Signal, Home Assistant |
| Security | Command approval, DM pairing, container isolation |
| Tools & Toolsets | 40+ tools, toolset system, terminal backends |
| Skills System | Procedural memory, Skills Hub, creating skills |
| Memory | Persistent memory, user profiles, best practices |
| MCP Integration | Connect any MCP server for extended capabilities |
| Cron Scheduling | Scheduled tasks with platform delivery |
| Context Files | Project context that shapes every conversation |
| Architecture | Project structure, agent loop, key classes |
| Contributing | Development setup, PR process, code style |
| CLI Reference | All commands and flags |
| Environment Variables | Complete env var reference |

Migrating from OpenClaw

If you're coming from OpenClaw, BookwormPRO can automatically import your settings, memories, skills, and API keys.

During first-time setup: The setup wizard (bookworm setup) automatically detects ~/.openclaw and offers to migrate before configuration begins.

Anytime after install:

bookworm claw migrate              # Interactive migration (full preset)
bookworm claw migrate --dry-run    # Preview what would be migrated
bookworm claw migrate --preset user-data   # Migrate without secrets
bookworm claw migrate --overwrite  # Overwrite existing conflicts

What gets imported:

  • SOUL.md — persona file
  • Memories — MEMORY.md and USER.md entries
  • Skills — user-created skills → ~/.bookwormpro/skills/openclaw-imports/
  • Command allowlist — approval patterns
  • Messaging settings — platform configs, allowed users, working directory
  • API keys — allowlisted secrets (Telegram, OpenRouter, OpenAI, Anthropic, ElevenLabs)
  • TTS assets — workspace audio files
  • Workspace instructions — AGENTS.md (with --workspace-target)

See bookworm claw migrate --help for all options, or use the openclaw-migration skill for an interactive agent-guided migration with dry-run previews.
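In spirit, a dry run like the one above amounts to a scan of ~/.openclaw that reports what it finds without copying anything. The `plan_migration` function below is a sketch, not the real migrator; the item names come from the import list above:

```python
from pathlib import Path
import tempfile

# Items the README says the migrator looks for in ~/.openclaw.
KNOWN_ITEMS = ["SOUL.md", "MEMORY.md", "USER.md", "skills"]

def plan_migration(source):
    """Return the importable items found under `source` without copying anything."""
    root = Path(source)
    return [name for name in KNOWN_ITEMS if (root / name).exists()]

# Demo against a throwaway directory standing in for ~/.openclaw:
with tempfile.TemporaryDirectory() as fake_home:
    (Path(fake_home) / "SOUL.md").write_text("persona")
    (Path(fake_home) / "skills").mkdir()
    print(plan_migration(fake_home))  # → ['SOUL.md', 'skills']
```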


Contributing

We welcome contributions! See the Contributing Guide for development setup, code style, and PR process.

Quick start for contributors — clone and go with setup-bookworm.sh:

git clone https://github.com/huakoh/BookwormPRO.git
cd BookwormPRO
./setup-bookworm.sh     # installs uv, creates venv, installs .[all], symlinks ~/.local/bin/bookworm
./bookworm              # auto-detects the venv, no need to `source` first

Manual path (equivalent to the above):

curl -LsSf https://astral.sh/uv/install.sh | sh
uv venv venv --python 3.11
source venv/bin/activate
uv pip install -e ".[all,dev]"
scripts/run_tests.sh

RL Training (optional): The RL/Atropos integration (environments/) ships via the atroposlib and tinker dependencies pulled in by .[all,dev] — no submodule setup required.



License

MIT — see LICENSE.

Built by BookwormPRO Project.

Download files

Download the file for your platform.

Source Distribution

bookwormpro-7.0.0.tar.gz (5.7 MB)


Built Distribution


bookwormpro-7.0.0-py3-none-any.whl (2.6 MB)


File details

Details for the file bookwormpro-7.0.0.tar.gz.

File metadata

  • Download URL: bookwormpro-7.0.0.tar.gz
  • Upload date:
  • Size: 5.7 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.10

File hashes

Hashes for bookwormpro-7.0.0.tar.gz

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 017689606974d2438d2f8fc502ae42b39a9c0b54b2fa7182b7247abdc1e13ec6 |
| MD5 | aec96a5b86f487a27e7fadd375966f1e |
| BLAKE2b-256 | dce97bbc12519b735a2aaaf8b5225685b75618a329d13dffeeaa8010e26ca778 |


File details

Details for the file bookwormpro-7.0.0-py3-none-any.whl.

File metadata

  • Download URL: bookwormpro-7.0.0-py3-none-any.whl
  • Upload date:
  • Size: 2.6 MB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.10

File hashes

Hashes for bookwormpro-7.0.0-py3-none-any.whl

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | c20e68ac174a0457550fe2aeb1d26caf738b94b76b728e65fc7e1aa7c25efe7f |
| MD5 | 02eacc3d43e09d9c4fa304ab7556d478 |
| BLAKE2b-256 | e9103fa0463b572c0ede45fd0f5e7fdf1e004f7295b437f9787cc1a59b441429 |

