About
Letta AI, originally known as MemGPT, is a cutting-edge platform for building stateful AI agents that remember everything, learn continuously, and improve themselves over time. At its core is Letta Code, a memory-first coding agent designed to replace stateless AI sessions with persistent, evolving agent experiences. Unlike traditional AI assistants that start fresh with each conversation, Letta agents maintain persistent memory across sessions, and background memory subagents automatically refine prompts, context, and skills with each interaction, making your agent smarter over time.

Letta gives users full ownership of and transparency into their agent's memory: through the visual "Memory Palace" interface you can view and edit what your agent knows, and you can port memory and conversation histories between different model providers. This model-agnostic approach means you are never locked into a single AI provider.

The platform supports multiple deployment modes: a desktop app for macOS, a terminal CLI (installable via npm), and an SDK for programmatic integration. Developers can connect their own API keys and existing coding plans, making Letta Code free to use with your preferred LLM backend. Letta is ideal for developers, power users, and teams who want deeply personalized AI agents that grow with them, whether for coding assistance, automation, or building custom agent-powered applications.
Key Features
- Persistent Agent Memory: Agents retain memory and context across all sessions instead of resetting each conversation, enabling truly personalized and continuous experiences.
- Background Memory Subagents: Automated subagents run in the background to improve prompts, refine context, and build new skills over time based on your usage.
- Memory Palace Visualization: Transparently view and edit your agent's memory, context, and learned behaviors through an intuitive Memory Palace interface.
- Model-Agnostic Memory Portability: Port your agent's memory and conversation histories between different LLM providers without losing any history or context.
- Multi-Environment Deployment: Run agents locally via macOS desktop app, terminal CLI, or remotely using letta server — and move them across machines with full memory intact.
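The deployment modes above can be sketched as a pair of commands. This is a minimal sketch: the `letta` command comes from the npm CLI mentioned above, while the Docker image name and port for self-hosting a server are assumptions that may differ in your setup.

```shell
# Local mode: run the agent in your terminal
# (assumes the Letta Code CLI is already installed via npm)
letta

# Remote mode: self-host a Letta server for agents you can reach
# from other machines (image name and port are assumptions --
# check Letta's docs for the current values)
docker run -p 8283:8283 letta/letta
```

Because an agent's memory lives with the agent rather than the session, the same agent can be moved between these environments without losing context.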
Use Cases
- Developers building personalized coding assistants that remember project context, past decisions, and personal coding preferences across long-term work.
- Teams deploying persistent AI agents for internal tooling that continuously improve their responses based on organizational knowledge.
- Researchers and power users who want fine-grained control over AI memory and context, including the ability to audit and edit stored information.
- Software engineers who want to leverage their existing LLM API subscriptions to power a stateful local AI coding agent without additional costs.
- AI application builders using the Letta SDK to embed memory-aware, stateful agents into their own products and workflows.
Pros
- True Persistent Memory: Unlike most AI tools, Letta maintains genuine long-term memory across sessions, making agents dramatically more useful over time.
- Bring Your Own API Keys: Connect your existing LLM API keys and coding plans to use Letta Code for free, avoiding vendor lock-in.
- Full Memory Transparency: Users can inspect, modify, and migrate their agent's memory — providing a level of control rarely seen in AI agent platforms.
- Flexible Deployment Options: Available as a desktop app, CLI tool, and SDK, making it accessible across a wide range of developer workflows.
Cons
- Early-Stage Desktop App: The macOS desktop app was listed as 'coming very soon' at launch, requiring users to start with the CLI in the interim.
- Requires Node.js Setup: CLI installation requires Node.js 18+ and familiarity with terminal environments, which may be a barrier for non-developer users.
- macOS-First: Desktop app support is initially limited to macOS, with Windows and Linux users needing to rely on the CLI or web interface.
Frequently Asked Questions
What is Letta AI, and how does it relate to MemGPT?
Letta AI is the evolution of the open-source MemGPT project. It expands on MemGPT's memory-management concepts to provide a full platform for building stateful, memory-first AI agents with a polished UI, desktop app, and cloud infrastructure.
Is Letta Code free to use?
Yes. Letta Code can be used for free by connecting your own API keys from providers like OpenAI or Anthropic, or by linking existing coding plans. Letta also offers its own hosted plans with additional features.
How do I install Letta Code?
You can install Letta Code via the CLI using npm: run `npm i -g @letta-ai/letta-code` (requires Node.js 18+), then start it with the `letta` command. A macOS desktop app is also available.
Can I switch LLM providers without losing my agent's memory?
Yes. Letta is model-agnostic and allows you to port your agent's memory and conversation history between different LLM providers, so your agent's knowledge persists even when you switch models.
What is the Memory Palace?
The Memory Palace is a visual interface in Letta that lets you view, inspect, and modify your agent's stored memories, context windows, and learned skills, giving you full transparency and control over what your agent knows.
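In copy-pasteable form, the install steps described above look like this (package name taken from the answer; the version check is a standard way to confirm the Node.js 18+ requirement):

```shell
# Confirm Node.js 18 or newer is installed
node --version

# Install the Letta Code CLI globally
npm i -g @letta-ai/letta-code

# Start the agent from your project directory
letta
```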
