
Ask LLM: AI-to-AI collaboration via MCP

Bridge Claude with Gemini, Codex, and Ollama. Multi-provider code review, brainstorming, and automated hooks — as a Claude Code plugin or standalone MCP servers.


Installation

Run in your terminal:

bash
# Project scope (current project only)
claude mcp add ask-llm -- npx -y ask-llm-mcp

# User scope (all projects)
claude mcp add --scope user ask-llm -- npx -y ask-llm-mcp

Or install as a plugin by running these commands inside a Claude Code session (adds slash commands like /multi-review, /brainstorm, and /compare, plus reviewer subagents and a pre-commit hook):

bash
/plugin marketplace add Lykhoyda/ask-llm
/plugin install ask-llm@ask-llm-plugins

Quick Setup

Get started in minutes: pick your client, add the MCP server, and you're live. Requires Node.js v20+ and an authenticated CLI for each provider you want to use.
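Before adding the server, it can help to confirm your Node.js version meets the v20+ requirement noted above. A minimal check (plain POSIX shell, no extra tooling assumed):

```shell
# Print the installed Node.js version and check the major number is at least 20
ver=$(node --version)        # e.g. v20.11.0
major=${ver#v}               # strip the leading "v"
major=${major%%.*}           # keep only the major version number
if [ "$major" -ge 20 ]; then
  echo "Node.js $ver is new enough"
else
  echo "Node.js $ver is too old; v20+ required"
fi
```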

Verify

Ask your AI assistant: "Use Gemini ping to test the connection." If you get Pong! back, you're ready. See How to Ask for usage examples.

Standard MCP

Works with 40+ MCP-compatible clients. No prompt hacks. Your primary LLM transparently delegates research, reviews, or brainstorming to other providers.
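For clients other than Claude Code, many MCP hosts are configured with a JSON server entry. A minimal sketch, assuming the common `mcpServers` convention (the exact key and config file location vary by client, so check your client's docs):

```json
{
  "mcpServers": {
    "ask-llm": {
      "command": "npx",
      "args": ["-y", "ask-llm-mcp"]
    }
  }
}
```

This mirrors the `claude mcp add` command above: the host launches `npx -y ask-llm-mcp` and talks to it over stdio.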

Multi-Turn Sessions

Continue conversations across multiple calls. Review code, then follow up for fixes. See Multi-Turn Sessions for details.


Released under the MIT License.