AMNI-CODE

AGENTIC AI CODING ASSISTANT • MULTI-PROVIDER • AUTONOMOUS TOOL USE


AI CODING WORKFLOW

A lightweight, agentic AI coding assistant that lives in a single binary. Connects to any LLM provider, explores your codebase autonomously, and executes multi-step tasks with built-in tools — no extensions, no plugins, no setup.

🚀

AGENTIC TOOL LOOP

Up to 15 autonomous iterations per request. The AI reads files, writes code, runs commands, and verifies — without waiting for you.
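The loop described above can be sketched as follows. This is an illustrative reconstruction, not Amni-Code's actual internals: the names (`StepResult`, `call_model`, `execute_tool`) and the stubbed model behavior are assumptions; the real agent would call the configured LLM provider and dispatch to the built-in tools.

```rust
// Minimal sketch of an agentic tool loop capped at 15 iterations.
// All names here are illustrative stand-ins, not Amni-Code's real API.

const MAX_ITERATIONS: usize = 15;

enum StepResult {
    ToolCall(String),    // the model asked to invoke a tool
    FinalAnswer(String), // the model produced a reply for the user
}

// Stub model call: a real implementation would hit the configured provider.
fn call_model(prompt: &str) -> StepResult {
    if prompt.contains("[tool:read_file]") {
        StepResult::FinalAnswer("done".to_string())
    } else {
        StepResult::ToolCall("read_file".to_string())
    }
}

// Stub tool execution: a real implementation would dispatch on the tool name.
fn execute_tool(_tool: &str) -> String {
    "fn main() { ... }".to_string()
}

fn run_agent(mut prompt: String) -> String {
    for _ in 0..MAX_ITERATIONS {
        match call_model(&prompt) {
            StepResult::FinalAnswer(text) => return text,
            StepResult::ToolCall(tool) => {
                // Feed the tool's output back into the model's context
                // so the next iteration can reason over it.
                let output = execute_tool(&tool);
                prompt.push_str(&format!("\n[tool:{tool}] {output}"));
            }
        }
    }
    "iteration limit reached".to_string()
}

fn main() {
    println!("{}", run_agent("explain main.rs".to_string()));
}
```

The iteration cap is what keeps the loop bounded: the agent either finishes with an answer or stops after 15 tool calls.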

🔌

MULTI-PROVIDER

Switch between xAI Grok, OpenAI GPT, Anthropic Claude, Ollama, or any OpenAI-compatible local server from one settings panel.

📁

DEEP FILE SYSTEM

Read, write, edit, search, and list files across your entire project. The agent resolves relative paths from your working directory automatically.

💻

INTEGRATED TERMINAL

The agent runs shell commands in your project directory — builds, tests, git, package managers — and feeds the output back into its reasoning loop.
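A run-command tool of this shape is straightforward to sketch with the standard library. This is a hedged approximation of the semantics described above, not the shipped implementation; on Windows the real tool would presumably shell out via `cmd /C` rather than `sh -c`.

```rust
use std::process::Command;

// Sketch: execute a shell command in the project directory and capture
// both stdout and stderr for the agent's next reasoning step.
fn run_command(dir: &str, cmd: &str) -> String {
    let output = Command::new("sh")
        .arg("-c")
        .arg(cmd)
        .current_dir(dir) // resolve relative paths from the working directory
        .output()
        .expect("failed to spawn shell");
    format!(
        "{}{}",
        String::from_utf8_lossy(&output.stdout),
        String::from_utf8_lossy(&output.stderr)
    )
}

fn main() {
    print!("{}", run_command(".", "echo build ok"));
}
```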

🔎

LOCAL MODEL DISCOVERY

Auto-detects GGUF and SafeTensors model files on disk and imports them into Ollama on first use. No manual model setup required.
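The discovery step amounts to scanning a directory for weight files by extension. The sketch below shows that first half only (the Ollama import would follow); the function name and shape are assumptions, not Amni-Code's code.

```rust
use std::fs;
use std::path::{Path, PathBuf};

// Sketch of local model discovery: collect files whose extension marks
// them as GGUF or SafeTensors weights.
fn discover_models(dir: &Path) -> Vec<PathBuf> {
    let mut found = Vec::new();
    if let Ok(entries) = fs::read_dir(dir) {
        for entry in entries.flatten() {
            let path = entry.path();
            match path.extension().and_then(|e| e.to_str()) {
                Some("gguf") | Some("safetensors") => found.push(path),
                _ => {} // skip everything that isn't a model file
            }
        }
    }
    found
}

fn main() {
    for model in discover_models(Path::new(".")) {
        println!("{}", model.display());
    }
}
```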

🎨

DIFF VIEWER & HISTORY

Built-in diff highlighting shows exactly what the agent changed. Persistent chat history across sessions so you never lose context.

BUILT-IN AGENT TOOLS

The AI agent has direct access to six tools it can invoke autonomously during each conversation turn.

TOOL            DESCRIPTION
read_file       Read a file’s full contents from the project directory
write_file      Create or overwrite a file with new content
edit_file       Surgically replace a specific string inside a file
run_command     Execute any shell command and capture stdout/stderr
list_directory  List all files and folders in a given path
search_files    Search for text across files using grep/findstr
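The "surgically replace a specific string" behavior of edit_file is worth making concrete. A common safety rule for this kind of tool is to apply the edit only when the target string matches exactly once, so the agent cannot clobber an unintended occurrence. The sketch below assumes that rule; the function name and error handling are illustrative, not the actual implementation.

```rust
// Sketch of single-match string replacement for an edit_file-style tool.
// Rejects the edit when the target is missing or ambiguous.
fn edit_contents(contents: &str, old: &str, new: &str) -> Result<String, String> {
    match contents.matches(old).count() {
        0 => Err("target string not found".to_string()),
        1 => Ok(contents.replacen(old, new, 1)),
        n => Err(format!("target string is ambiguous ({n} matches)")),
    }
}

fn main() {
    let src = "let retries = 3;";
    println!("{}", edit_contents(src, "3", "5").unwrap());
}
```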

SPECIFICATIONS

PARAMETER        DETAILS
Type             Agentic AI Coding Assistant
Language         Rust (Axum + WebView2)
Binary           Single executable — no runtimes, no dependencies
Platforms        Windows, macOS, Linux
Providers        xAI Grok, OpenAI, Anthropic Claude, Ollama, Custom Local
Agent Loop       Up to 15 autonomous iterations per request
Tools            read_file, write_file, edit_file, run_command, list_directory, search_files
Model Discovery  Auto-detect GGUF & SafeTensors • Auto-import into Ollama
Themes           Light & Dark mode with persistence
Chat History     Persistent sessions stored locally
Config Storage   ~/.amni/config.json • survives restarts
License          CC BY-NC 4.0
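The config schema is not documented here, but given the settings in the table above (provider, model, theme, working directory), a ~/.amni/config.json might plausibly look like the following. Every field name and value below is a guess for illustration, not the actual format.

```json
{
  "provider": "ollama",
  "model": "llama3",
  "theme": "dark",
  "working_directory": "/home/user/projects/demo"
}
```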

HOW IT WORKS

1. CONNECT

Pick a provider and model from the settings panel. For local models, point to your GGUF directory — Amni-Code discovers and imports them automatically.

2. EXPLORE

The agent reads your project structure, entry points, and config files to build context before responding. No copy-pasting required.

3. ACT

Multi-step tool calls — reading code, writing fixes, running builds, searching for patterns — executed autonomously across up to 15 iterations.

4. VERIFY

The agent runs your tests and build commands, reads the output, and continues iterating until the task is complete or verified passing.

GET AMNI-CODE

A single Rust binary with a native window. No runtimes, no Docker, zero dependencies.

Works with xAI Grok, OpenAI, Anthropic Claude, Ollama, or any OpenAI-compatible server • Windows, macOS, Linux

VIEW ON GITHUB