System info, file operations, git workflows, shell execution, web requests, Docker status, and SSH -- all available out of the box.
Ollama for local models, GitHub Models, OpenAI, and Anthropic. Switch providers with a single flag or set a default.
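For example, provider and model can be chosen per invocation with the `-p`/`--provider` and `-m`/`--model` flags; the provider and model names below are illustrative, so substitute whatever you have installed or configured:

```shell
# Run against a local Ollama model (model name is illustrative)
emprise --provider ollama --model llama3

# One-off run against a hosted provider using the short flags
emprise -p anthropic -m claude-sonnet
```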
Save conversations by name, resume them later, list and search your history. Pick up right where you left off.
Confirmation prompts, path sandboxing, command allow/deny lists, and more. Every destructive action requires approval.
Extend emprise with custom tool plugins or connect to any Model Context Protocol server for additional capabilities.
Properly signed and notarized for macOS. No Gatekeeper warnings. Native Apple Silicon binary for maximum performance.
Slash commands:

  /help           -- show all commands
  /model          -- switch LLM backend
  /save <name>    -- save conversation
  /resume <name>  -- resume conversation
  /list           -- list saved conversations
  /forget         -- clear conversation history
  /system <msg>   -- set system prompt
  /plugin <path>  -- load a plugin
  /mcp <url>      -- connect MCP server
  /cost           -- show token usage and cost
  /quit           -- exit emprise

CLI flags:

  -m, --model     -- select model
  -p, --provider  -- LLM provider
  -r, --resume    -- resume conversation
  -s, --system    -- system prompt
  -y, --yes       -- auto-approve tools
  --no-tools      -- disable all tools
  --safe          -- read-only mode
  --verbose       -- debug output
  --version       -- print version
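Combining a few of the flags above, a cautious session might look like this; the conversation name is illustrative:

```shell
# Resume a saved conversation in read-only mode with debug output,
# so no tool can modify the filesystem while you inspect what it does
emprise --resume refactor-session --safe --verbose
```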