About
AIChat is a feature-rich, open-source CLI tool designed to make large language models accessible and productive from the terminal. It supports a wide array of LLM providers — including OpenAI, Anthropic Claude, Google Gemini, Ollama, and Groq — giving developers the flexibility to switch between models without changing their workflow.

Key capabilities include a Shell Assistant that converts natural language into executable shell commands, CMD & REPL modes for one-shot and interactive queries, and Retrieval-Augmented Generation (RAG) for grounding responses in local documents. AIChat also supports defining and running AI Tools and Agents, enabling complex multi-step automation workflows from the command line. Installation is straightforward across platforms via cargo, Homebrew, Pacman, Scoop, or pre-built binaries for macOS, Linux, Windows, and Android (Termux), and its extensive YAML-based configuration makes it highly customizable for power users.

AIChat is ideal for developers, DevOps engineers, and CLI power users who want to integrate AI into their daily workflows, automate shell tasks with natural language, query local knowledge bases, or experiment with multiple LLM providers in a unified interface. With over 9,600 GitHub stars, it has established itself as one of the most capable open-source AI CLI tools available.
Key Features
- Multi-Provider LLM Support: Connect to OpenAI, Anthropic Claude, Google Gemini, Ollama, Groq, and more from a single unified CLI interface.
- Shell Assistant: Convert natural language instructions into shell commands, making terminal automation intuitive and accessible.
- Retrieval-Augmented Generation (RAG): Index and query local documents to ground LLM responses in your own data and knowledge bases.
- AI Tools & Agents: Define and execute multi-step AI agents that can use tools and perform complex automated workflows directly from the CLI.
- CMD & REPL Modes: Use AIChat in one-shot command mode for scripting or in an interactive REPL for conversational sessions.
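The two modes differ mainly in how the binary is invoked. A minimal sketch, assuming `aichat` is installed and a provider is configured:

```shell
# One-shot (CMD) mode: pass the prompt as an argument; the answer is
# printed to stdout, which makes it easy to use in scripts.
aichat "Explain the difference between a symlink and a hard link"

# REPL mode: run with no arguments to start an interactive session.
aichat
```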
Use Cases
- Developers automating shell tasks using natural language to generate and execute terminal commands.
- DevOps engineers querying local runbooks and documentation via RAG to get contextual answers without leaving the terminal.
- AI researchers and power users experimenting with and comparing multiple LLM providers in a unified CLI environment.
- Developers building and testing multi-step AI agents and tool-use workflows from the command line.
- Technical teams embedding LLM capabilities into scripts and pipelines using AIChat's CMD mode.
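Because CMD mode reads stdin and writes the answer to stdout, it composes with ordinary Unix pipelines. A sketch of the scripting use case above (assumes `aichat` is installed and a provider is configured):

```shell
# Pipe command output into AIChat as context for the prompt.
git diff --staged | aichat "Write a one-line commit message for this diff"

# Capture the answer in a variable like any other CLI tool.
summary=$(cat error.log | aichat "Summarize the root cause in one sentence")
echo "$summary"
```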
Pros
- Broad Provider Compatibility: Supports many LLM providers out of the box, letting users switch between models without workflow disruption.
- Fully Open Source: Licensed under Apache-2.0 and MIT, with an active community and nearly 10k GitHub stars ensuring ongoing development.
- Cross-Platform: Works on macOS, Linux, Windows, and even Android via Termux, with multiple installation methods for convenience.
- Advanced RAG & Agent Support: Goes beyond simple chat by enabling local document retrieval and autonomous agent workflows uncommon in CLI tools.
Cons
- Command-Line Only: Lacks a graphical interface, which may be a barrier for users who are not comfortable working in the terminal.
- Setup Complexity: Configuring multiple providers, RAG pipelines, and agents via YAML files requires technical knowledge and effort.
- No Built-in UI for History/Management: Managing conversation history and agent configurations is done through config files rather than a visual dashboard.
Frequently Asked Questions
Which LLM providers does AIChat support?
AIChat supports a wide range of providers including OpenAI, Anthropic Claude, Google Gemini, Ollama (for local models), Groq, and many more, configurable via its YAML config file.
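An illustrative sketch of what a multi-provider config might look like (exact client type names and fields vary by AIChat version; consult the project's config examples):

```yaml
# ~/.config/aichat/config.yaml (illustrative sketch)
model: openai:gpt-4o-mini        # default model as <client>:<model>
clients:
  - type: openai
    api_key: <your-openai-key>
  - type: claude
    api_key: <your-anthropic-key>
  - type: ollama
    api_base: http://localhost:11434
```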
How do I install AIChat?
AIChat can be installed via Rust's cargo (`cargo install aichat`), Homebrew (`brew install aichat`), Pacman, Windows Scoop, Android Termux, or by downloading pre-built binaries from GitHub Releases.
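For example, pick whichever of the package-manager routes matches your platform:

```shell
cargo install aichat     # any platform with a Rust toolchain
brew install aichat      # macOS / Linux via Homebrew
pacman -S aichat         # Arch Linux
scoop install aichat     # Windows via Scoop
```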
How does the Shell Assistant work?
The Shell Assistant lets you describe what you want to do in natural language, and AIChat translates it into the appropriate shell command, helping you work faster without memorizing complex syntax.
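A quick sketch of the workflow, using AIChat's `-e`/`--execute` flag (the exact confirmation behavior may vary by version):

```shell
# Describe the task; AIChat proposes a shell command and lets you
# review it before anything runs.
aichat -e "find all files larger than 100MB under the current directory"
```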
Can I run models locally with AIChat?
Yes. AIChat integrates with Ollama, which allows you to run open-source models locally without sending data to external APIs.
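A sketch of what this might look like, assuming an Ollama client is configured in `config.yaml` and the model has already been pulled locally (e.g. with `ollama pull llama3.1`):

```shell
# Select the local model explicitly with -m as <client>:<model>.
aichat -m ollama:llama3.1 "Summarize the SOLID principles"
```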
Is AIChat free to use?
AIChat itself is free and open-source. However, using cloud-based LLM providers like OpenAI or Anthropic Claude requires API keys and may incur costs depending on the provider's pricing.
