Overview
Today's agent tools are opaque. Once a session ends, its context is lost: the decisions, the errors, the lessons learned.
Tapes records every request and response between your agent and model providers. It acts as a proxy server that captures and stores conversation history, allowing you to inspect, search, and verify what happened. Learn more about why transparent telemetry changes everything.
We are working to establish standards within Agent Trace, an open specification.
Install
Install the latest version with a single command:
curl -fsSL https://download.tapes.dev/install | bash

Run
Start both the proxy and API server together:
tapes serve

By default, this targets Ollama on localhost:11434. The proxy runs on :8080, the API on :8081. Add --sqlite "./tapes.db" for persistent storage.
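Conceptually, the proxy sits between your client and the upstream model: every request is forwarded unchanged, and the request/response pair is recorded on the way through. A minimal sketch of that record-and-forward loop (illustrative only; the function and field names here are not Tapes internals):

```python
import time

def make_recording_proxy(upstream, store):
    """Wrap an upstream callable so every request/response pair is recorded."""
    def proxy(request):
        response = upstream(request)  # forward the request unchanged
        store.append({
            "ts": time.time(),        # when the exchange happened
            "request": request,
            "response": response,
        })
        return response
    return proxy

# Stub standing in for an upstream model server such as Ollama.
def fake_upstream(request):
    return {"model": request["model"],
            "message": {"role": "assistant", "content": "hi"}}

store = []
chat = make_recording_proxy(fake_upstream, store)
reply = chat({"model": "gemma3",
              "messages": [{"role": "user", "content": "hello"}]})
```

The client sees exactly the upstream's reply; the store accumulates the full exchange history that later commands can inspect.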
Use
Start an interactive chat session, search conversations, check out previous conversation states, or explore sessions in Deck:
# Start a chat session
tapes chat --model gemma3
# Search conversation turns
tapes search "What's the weather like in New York?"
# Open the Deck TUI
tapes deck --since 720h
# Checkout a previous conversation state for context check-pointing and retry
tapes checkout abc123xyz987
tapes chat

All commands interact through the proxy, which forwards requests to your upstream LLM and stores conversation history.
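Search works because every turn has already been captured by the proxy. The idea behind `tapes search` can be sketched as a case-insensitive scan over stored messages (a conceptual sketch, not the actual query engine):

```python
def search_turns(turns, query):
    """Return stored turns whose message text contains the query."""
    q = query.lower()
    hits = []
    for turn in turns:
        texts = [m["content"] for m in turn.get("messages", [])]
        if any(q in t.lower() for t in texts):
            hits.append(turn)
    return hits

# Two recorded turns; only the first mentions New York.
turns = [
    {"messages": [{"role": "user",
                   "content": "What's the weather like in New York?"}]},
    {"messages": [{"role": "user", "content": "Write a haiku"}]},
]
matches = search_turns(turns, "new york")
```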
Play
Explore the TUI with demo data and get a feel for the Deck workflow.
# Seed demo data and open Deck
tapes deck --demo

Save
The quick start above stores sessions in memory. To persist data, initialize a project:
tapes init

This creates a .tapes/ directory with a SQLite database. See Config for more options.
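Persistence amounts to writing each captured turn into that SQLite database instead of memory. A rough sketch of what such storage could look like (the actual .tapes/ schema may differ; this uses an in-memory database for illustration):

```python
import json
import sqlite3
import time

# The real tool writes inside .tapes/; :memory: keeps this sketch self-contained.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE IF NOT EXISTS turns (
        id       INTEGER PRIMARY KEY,
        ts       REAL,
        request  TEXT,
        response TEXT
    )
""")

def save_turn(conn, request, response):
    """Persist one request/response pair as JSON blobs."""
    conn.execute(
        "INSERT INTO turns (ts, request, response) VALUES (?, ?, ?)",
        (time.time(), json.dumps(request), json.dumps(response)),
    )
    conn.commit()

save_turn(conn, {"model": "gemma3"}, {"content": "hi"})
rows = conn.execute("SELECT request FROM turns").fetchall()
```

Because turns survive process restarts, commands like search and checkout can operate on any past session, not just the current one.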