Built in Rust · v0.1

The agent that
lives where you code.

A production-grade coding agent built in Rust. One engine drives three surfaces — terminal, editor, pocket — sharing the same sessions, tools, and permissions.

Inside a session

A streaming TUI built for the way coding agents actually work — not a chat window with a code block. Token, tool, and reasoning streams render live; plan mode gates risky work behind explicit approval; sessions persist, fork, and resume.

Slash palette

A built-in command palette — /plan, /resume, /compact, /mcp, /memory, /skills, /permissions, /agents, and more. Type / to fuzzy-find; custom commands from ~/.lime/commands/ show up inline.
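As a sketch of how a custom command might look — the file name, format, and front-matter here are assumptions for illustration, not the documented schema:

```markdown
<!-- ~/.lime/commands/review.md — hypothetical custom command file -->
<!-- Would appear in the palette as /review -->
Review the staged diff for correctness and style.
Flag missing tests, unhandled errors, and leftover TODOs.
```

The idea: drop a file in `~/.lime/commands/`, and its name becomes a fuzzy-findable entry alongside the built-ins.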

Plan mode

Two-stage execution. The model writes a plan, you approve it, then it runs — surviving context compaction. Toggle with /plan or Shift+Tab.

Permission prompts

Inline approval before any write, exec, or network tool. Five modes from read-only to YOLO; per-tool allow/deny lists; exec rules; hooks that fire on every decision.
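A hedged sketch of what a per-tool allow/deny configuration could look like — the file path, key names, and values below are illustrative assumptions, not the actual config schema:

```toml
# hypothetical ~/.lime/permissions.toml — names are illustrative only
mode = "ask"                      # somewhere between read-only and YOLO

[tools]
allow = ["read_file", "lsp_diagnostics"]
deny  = ["web_fetch"]

[exec]
allow = ["cargo test", "git status"]   # exec rules: pre-approved commands

[hooks]
on_decision = "~/.lime/hooks/log-decision.sh"  # fires on every approve/deny
```

The point of the layout: writes, exec, and network each pass through the same gate, and hooks observe every decision regardless of mode.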

Sessions & forks

Every turn is persisted. Resume by ID with /resume, fork to try an alternative without losing the original, jump back to any earlier point with Ctrl+D.

Auto-compaction

Sessions never crash on context limits. Older turns get summarized, tool outputs that still matter are preserved, plan and reasoning state are kept whole.

Bash & LSP

The agent runs commands through a sandbox (srt) and reads diagnostics through a real LSP client — not regex-grep-and-pray.

Three surfaces. One engine.

The TUI is the main event, but the same Rust core drives an editor sidebar and a phone-friendly companion. Start a turn in the terminal, answer a permission prompt from your phone, review the diff in VS Code — same sessions, same tools, same permissions.

Terminal CLI

A streaming TUI for the shell, plus non-interactive modes for scripts and CI pipelines. Output schemas, session forks, full bridge access.

VS Code extension

Chat sidebar, inline editor actions, SCM tracking, Copilot Chat participant, MCP bridge — all driving the same agent loop.

Web companion (PWA)

Drive a session on your laptop from a phone or any browser, over a local WebSocket or a Cloudflare tunnel. Installable.

Bridge protocol

JSON-RPC 2.0 / NDJSON over stdio or WebSocket. Same protocol, same capabilities, same permission model on every surface.
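To illustrate the wire format — the method names and fields below are invented for the sketch, not the actual protocol — each NDJSON frame is one JSON-RPC 2.0 message per line:

```json
{"jsonrpc":"2.0","id":1,"method":"session/prompt","params":{"sessionId":"abc123","text":"fix the failing test"}}
{"jsonrpc":"2.0","method":"session/tokenDelta","params":{"sessionId":"abc123","delta":"Looking at the test…"}}
{"jsonrpc":"2.0","id":1,"result":{"stopReason":"end_turn"}}
```

Per JSON-RPC 2.0, requests carry an `id` and get a matching `result`, while streaming updates arrive as id-less notifications — which is what lets the same frames flow over stdio or a WebSocket unchanged.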

Models, tools, and the world outside

First-class streaming for every major provider, Model Context Protocol across four transports, and a sandbox runtime that keeps exec local unless a rule says otherwise.

OpenAI

Responses API with WebSocket streaming, encrypted reasoning echo-back for ZDR, prompt caching, parallel tool calls, MCP server tools.

Anthropic

Messages API with adaptive thinking, tiered prompt caching (1h prefix · 5m tail), service-tier parity, pause-turn handling.

Gemini & OpenAI-compat

Google’s Generative Language API with thinking mode, plus any OpenAI-compatible endpoint — Groq, Ollama, vLLM, your own gateway.

MCP everywhere

stdio, SSE, HTTP, WebSocket. Browse the official registry from /mcp, install in two clicks, OAuth handled, telemetry redacted.

Sandbox runtime

srt isolates exec under namespace / network / filesystem boundaries. Local by default, never broader than the rule allows.

Hooks & skills

Run shell hooks on every decision, ship skills as portable bundles, install plugins from the marketplace.

Architecture

Three surfaces — Terminal CLI, VS Code extension, and Web PWA — speak JSON-RPC 2.0 to a single Rust engine that owns the agent loop, sessions, tools, providers, and sandbox.

• Terminal CLI — streaming TUI · scripts · CI
• VS Code extension — sidebar · inline · SCM · MCP
• Web companion (PWA) — phone · browser · tunnel
• Transport — JSON-RPC 2.0 · NDJSON over stdio / WebSocket

lime engine · Rust

• CLI — REPL / TUI; Bridge — remote control
• Commands — slash commands · custom commands · aliases
• Runtime — agent loop · sessions & compaction · permissions & sandbox · MCP · memory
• Tools — file ops · patches · Bash · LSP · web fetch · notebooks
• Plugins — loader · manifests · hooks · skills · marketplace
• Crates — lime-core (shared types) · lime-config (layered config) · openai (Responses API) · anthropic (Messages) · gemini (Gen-Lang) · openai-compat (chat completions) · telemetry (usage / cost) · srt (sandbox runtime)

Pick your path