Ask LLM
AI-to-AI collaboration via MCP
Bridge Claude with Gemini, Codex, and Ollama. Multi-provider code review, brainstorming, and automated hooks — as a Claude Code plugin or standalone MCP servers.
Run in your terminal:
# Project scope (current project only)
claude mcp add ask-llm -- npx -y ask-llm-mcp
# User scope (all projects)
claude mcp add --scope user ask-llm -- npx -y ask-llm-mcp

Or install as a plugin (adds slash commands like /multi-review, /brainstorm, /compare, plus reviewer subagents and a pre-commit hook):
/plugin marketplace add Lykhoyda/ask-llm
/plugin install ask-llm@ask-llm-plugins

Get started in minutes. Pick your client, add the MCP server, and you're live. Requires Node.js v20+ and the relevant provider CLI authenticated.
Ask your AI assistant: "Use Gemini ping to test the connection." Got "Pong!" back? You're ready. See How to Ask for usage examples.
Works with 40+ MCP-compatible clients. No prompt hacks. Your primary LLM transparently delegates research, reviews, or brainstorming to other providers.
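For clients other than Claude Code, most MCP-compatible clients accept a JSON configuration with an mcpServers entry. A minimal sketch (the exact config file location and top-level key can vary by client, so check your client's MCP documentation):

```json
{
  "mcpServers": {
    "ask-llm": {
      "command": "npx",
      "args": ["-y", "ask-llm-mcp"]
    }
  }
}
```

This mirrors what `claude mcp add ask-llm -- npx -y ask-llm-mcp` registers: the client spawns the server via npx on demand.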
Continue conversations across multiple calls. Review code, then follow up for fixes. See Multi-Turn Sessions for details.