A production-grade coding agent built in Rust. One engine drives three surfaces — terminal, editor, pocket — sharing the same sessions, tools, and permissions.
A streaming TUI built for the way coding agents actually work — not a chat window with a code block. Token, tool, and reasoning streams render live; plan mode gates risky work behind explicit approval; sessions persist, fork, and resume.
Slash palette
A built-in command palette — /plan, /resume, /compact, /mcp,
/memory, /skills, /permissions, /agents, and more. Type /
to fuzzy-find; custom commands from ~/.lime/commands/ show up inline.
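The fuzzy-find behind the palette can be pictured as ordered-subsequence matching: a query matches a command when its characters appear in the command in order, not necessarily adjacent. A minimal sketch (the palette's real matcher and ranking are not specified here):

```python
def fuzzy(query, candidates):
    """Keep candidates whose names contain the query's characters in order."""
    def matches(q, s):
        it = iter(s)
        # `c in it` consumes the iterator, so later chars must appear later.
        return all(c in it for c in q)
    return [c for c in candidates if matches(query, c.lstrip("/"))]

hits = fuzzy("pm", ["/plan", "/permissions", "/compact", "/mcp"])
```

Here only `/permissions` survives: it is the only name containing a `p` followed later by an `m`.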
Plan mode
Two-stage execution. The model writes a plan, you approve it, then it
runs — surviving context compaction. Toggle with /plan or
Shift+Tab.
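The two-stage gate amounts to a small state machine: a plan cannot reach execution without an explicit approval event. A sketch of that invariant (state and event names are illustrative, not the agent's actual internals):

```python
from enum import Enum, auto

class Mode(Enum):
    PLANNING = auto()
    AWAITING_APPROVAL = auto()
    EXECUTING = auto()

def advance(mode, event):
    # The only path into EXECUTING is an explicit "approve" event;
    # a rejection sends the model back to planning.
    transitions = {
        (Mode.PLANNING, "plan_ready"): Mode.AWAITING_APPROVAL,
        (Mode.AWAITING_APPROVAL, "approve"): Mode.EXECUTING,
        (Mode.AWAITING_APPROVAL, "reject"): Mode.PLANNING,
    }
    return transitions.get((mode, event), mode)
```

Unknown events leave the mode unchanged, which is what lets the approved plan survive context compaction: the gate's state lives outside the conversation history.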
Permission prompts
Inline approval before any write, exec, or network tool. Five modes from read-only to YOLO; per-tool allow/deny lists; exec rules; hooks that fire on every decision.
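Per-tool allow/deny lists compose with the five modes in the usual way: an explicit rule decides, and anything unlisted falls through to the mode's default. A sketch, assuming deny-wins precedence (the actual precedence rules are the product's, not shown here):

```python
def check_tool(tool, allow, deny):
    """Return True/False for an explicit rule, or None to defer to the mode."""
    if tool in deny:   # deny rules win over allow rules
        return False
    if tool in allow:
        return True
    return None        # unlisted: fall through to the mode default, or prompt
```

The `None` case is where the inline approval prompt (and any decision hooks) would fire.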
Sessions & forks
Every turn is persisted. Resume by ID with /resume, fork to try
an alternative without losing the original, jump back to any earlier
point with Ctrl+D.
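Because every turn is persisted, a fork is cheap: a new session id that starts from a copy of the parent's turn list, after which the two histories diverge independently. A minimal sketch (the real store and id scheme are assumptions):

```python
import uuid

def fork_session(sessions, session_id):
    """Fork: new id, copy of the parent's turns so far; parent keeps going."""
    parent = sessions[session_id]
    child_id = str(uuid.uuid4())
    sessions[child_id] = {"turns": list(parent["turns"]), "parent": session_id}
    return child_id
```

Appending to the fork leaves the original untouched, which is what lets you try an alternative without losing the first attempt.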
Auto-compaction
Sessions never crash on context limits. Older turns get summarized; tool outputs that still matter are preserved; plan and reasoning state are kept whole.
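The shape of compaction is: fold the oldest turns into one summary turn until the history fits the budget, keeping the newest turns verbatim. A sketch under that reading (token accounting and the summarizer are placeholders, not the agent's actual heuristics):

```python
def compact(turns, budget, summarize):
    """Fold oldest turns into a single summary until total tokens <= budget."""
    turns = list(turns)                       # don't mutate the caller's history
    total = sum(t["tokens"] for t in turns)
    folded = []
    while turns and total > budget:
        old = turns.pop(0)                    # oldest turn first
        total -= old["tokens"]
        folded.append(old)
    head = [summarize(folded)] if folded else []
    return head + turns
```

The interesting work the feature describes, deciding which tool outputs "still matter" and carrying plan state whole, would live inside `summarize`.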
Bash & LSP
The agent runs commands through a sandbox (srt) and reads
diagnostics through a real LSP client — not regex-grep-and-pray.
The TUI is the main event, but the same Rust core drives an editor sidebar and a phone-friendly companion. Start a turn in the terminal, answer a permission prompt from your phone, review the diff in VS Code — same sessions, same tools, same permissions.
Terminal CLI
A streaming TUI for the shell, plus non-interactive modes for scripts and CI pipelines. Output schemas, session forks, full bridge access.
VS Code extension
Chat sidebar, inline editor actions, SCM tracking, Copilot Chat participant, MCP bridge — all driving the same agent loop.
Web companion (PWA)
Drive a session on your laptop from a phone or any browser, over a local WebSocket or a Cloudflare tunnel. Installable.
Bridge protocol
JSON-RPC 2.0 / NDJSON over stdio or WebSocket. Same protocol, same capabilities, same permission model on every surface.
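Over stdio, each bridge message is one JSON-RPC 2.0 object per line (NDJSON). A sketch of framing a request; the method name and params here are hypothetical, the real bridge schema may differ:

```python
import json

def frame(method, params, req_id):
    """Serialize one JSON-RPC 2.0 request as a single NDJSON line."""
    msg = {"jsonrpc": "2.0", "id": req_id, "method": method, "params": params}
    return json.dumps(msg) + "\n"

# Hypothetical method and params, for illustration only.
line = frame("session/resume", {"sessionId": "abc123"}, 1)
```

Newline-delimited framing is what makes the same protocol trivially portable between a child process's stdio and a WebSocket, where each frame carries one object.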
First-class streaming for every major provider, Model Context Protocol across four transports, and a sandbox runtime that keeps exec local unless a rule says otherwise.
OpenAI
Responses API with WebSocket streaming, encrypted reasoning echo-back for ZDR, prompt caching, parallel tool calls, MCP server tools.
Anthropic
Messages API with adaptive thinking and tiered prompt caching (5m and 1h prefix TTLs).
Gemini & OpenAI-compat
Google’s Generative Language API with thinking mode, plus any OpenAI-compatible endpoint — Groq, Ollama, vLLM, your own gateway.
MCP everywhere
stdio, SSE, HTTP, WebSocket. Browse the official registry from
/mcp, install in two clicks, OAuth handled, telemetry redacted.
Sandbox runtime
srt isolates exec under namespace / network / filesystem
boundaries. Local by default, never broader than the rule allows.
Hooks & skills
Run shell hooks on every decision, ship skills as portable bundles, install plugins from the marketplace.
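One common shape for decision hooks is to exec the hook and hand it the decision as JSON on stdin. A sketch under that assumption; the actual payload schema and invocation contract are the product's, not shown here:

```python
import json
import subprocess

def fire_hook(cmd, decision):
    """Run a shell hook, passing the decision as JSON on stdin (assumed contract)."""
    return subprocess.run(
        cmd,
        input=json.dumps(decision),
        capture_output=True,
        text=True,
    )
```

A hook that merely echoes its stdin (here, `cat`) shows the round trip: the payload comes back intact, so a real hook could log it, veto it, or forward it anywhere.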