<img src="/docs/assets/images/rubyllm-mcp-logo-text.svg" alt="RubyLLM" height="120" width="250">
**Aiming to make using MCPs with RubyLLM and Ruby as easy as possible.**
This project is a Ruby client for the [Model Context Protocol (MCP)](https://modelcontextprotocol.io/), designed to work seamlessly with [RubyLLM](https://github.com/crmne/ruby_llm). This gem enables Ruby applications to connect to MCP servers and use their tools, resources and prompts as part of LLM conversations.
For a more detailed guide, see the [RubyLLM::MCP docs](https://rubyllm-mcp.com/).
Currently provides full support for MCP protocol versions up to `2025-06-18`.
<div class="badge-container">
<a href="https://badge.fury.io/rb/ruby_llm-mcp"><img src="https://badge.fury.io/rb/ruby_llm-mcp.svg" alt="Gem Version" /></a>
<a href="https://rubygems.org/gems/ruby_llm-mcp"><img alt="Gem Downloads" src="https://img.shields.io/gem/dt/ruby_llm-mcp"></a>
</div>
## RubyLLM::MCP Features
- 🔌 **Multiple Transport Types**: Streamable HTTP, STDIO, and legacy SSE transports
- 🛠️ **Tool Integration**: Automatically converts MCP tools into RubyLLM-compatible tools
- 📄 **Resource Management**: Access and include MCP resources (files, data) and resource templates in conversations
- 🎯 **Prompt Integration**: Use predefined MCP prompts with arguments for consistent interactions
- 🎛️ **Client Features**: Support for sampling, roots, and elicitation
- 🎨 **Enhanced Chat Interface**: Extended RubyLLM chat methods for seamless MCP integration
- 🔄 **Multiple Client Management**: Create and manage multiple MCP clients simultaneously for different servers and purposes (see the sketch after this list)
- 📚 **Simple API**: Easy-to-use interface that integrates seamlessly with RubyLLM
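As an example of the multiple-client workflow, here is a minimal sketch that combines tools from two servers in a single chat. The server names, paths, and URLs are placeholders; the connection options are explained under Basic Setup below.

```ruby
require 'ruby_llm/mcp'

# Connect to two different MCP servers
# (placeholder names, paths, and URLs)
files_client = RubyLLM::MCP.client(
  name: "file-server",
  transport_type: :stdio,
  config: { command: "node", args: ["path/to/file-server.js"] }
)

search_client = RubyLLM::MCP.client(
  name: "search-server",
  transport_type: :streamable,
  config: { url: "http://localhost:8080/mcp" }
)

# Combine tools from both servers in one chat session
chat = RubyLLM.chat(model: "gpt-4")
chat.with_tools(*files_client.tools, *search_client.tools)
```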
## Installation
```bash
bundle add ruby_llm-mcp
```
Or add this line to your application's Gemfile:
```ruby
gem 'ruby_llm-mcp'
```
And then execute:
```bash
bundle install
```
Or install it yourself as:
```bash
gem install ruby_llm-mcp
```
## Usage
### Basic Setup
First, configure your RubyLLM client and create an MCP connection:
```ruby
require 'ruby_llm/mcp'
# Configure RubyLLM
RubyLLM.configure do |config|
config.openai_api_key = "your-api-key"
end
# Connect to an MCP server via SSE
client = RubyLLM::MCP.client(
name: "my-mcp-server",
transport_type: :sse,
config: {
url: "http://localhost:9292/mcp/sse"
}
)
# Or connect via stdio
client = RubyLLM::MCP.client(
name: "my-mcp-server",
transport_type: :stdio,
config: {
command: "node",
args: ["path/to/mcp-server.js"],
env: { "NODE_ENV" => "production" }
}
)
# Or connect via streamable HTTP
client = RubyLLM::MCP.client(
name: "my-mcp-server",
transport_type: :streamable,
config: {
url: "http://localhost:8080/mcp",
headers: { "Authorization" => "Bearer your-token" }
}
)
```
### Using MCP Tools with RubyLLM
```ruby
# Get available tools from the MCP server
tools = client.tools
puts "Available tools:"
tools.each do |tool|
puts "- #{tool.name}: #{tool.description}"
end
# Create a chat session with MCP tools
chat = RubyLLM.chat(model: "gpt-4")
chat.with_tools(*client.tools)
# Ask a question that will use the MCP tools
response = chat.ask("Can you help me search for recent files in my project?")
puts response
```
### Manual Tool Execution
You can also execute MCP tools directly:
```ruby
# Fetch a specific tool by name
tool = client.tool("search_files")
# Execute the tool with its parameters
result = tool.execute(
  parameters: {
    query: "*.rb",
    directory: "/path/to/search"
  }
)
puts result
```
### Working with Resources
MCP servers can provide access to resources, structured data that can be included in conversations. Resources come in two types: normal resources and resource templates.
#### Normal Resources
```ruby
# Get available resources from the MCP server
resources = client.resources
puts "Available resources:"
resources.each do |resource|
puts "- #{resource.name}: #{resource.description}"
end
# Access a specific resource by name
file_resource = client.resource("project_readme")
content = file_resource.content
puts "Resource content: #{content}"
# Include a resource in a chat conversation for reference with an LLM
chat = RubyLLM.chat(model: "gpt-4")
chat.with_resource(file_resource)
# Or add a resource directly to the conversation
file_resource.include(chat)
response = chat.ask("Can you summarize this README file?")
puts response
```
#### Resource Templates
Resource templates are parameterized resources that can be dynamically configured:
```ruby
# Get available resource templates
templates = client.resource_templates
log_template = client.resource_template("application_logs")
# Use a template with parameters
chat = RubyLLM.chat(model: "gpt-4")
chat.with_resource_template(log_template, arguments: {
date: "2024-01-15",
level: "error"
})
response = chat.ask("What errors occurred on this date?")
puts response
# You can also get templated content directly
content = log_template.to_content(arguments: {
date: "2024-01-15",
level: "error"
})
puts content
```
### Working with Prompts
MCP servers can provide predefined prompts that can be used in conversations:
```ruby
# Get available prompts from the MCP server
prompts = client.prompts
puts "Available prompts:"
prompts.each do |prompt|
puts "- #{prompt.name}: #{prompt.description}"
prompt.arguments.each do |arg|
puts " - #{arg.name}: #{arg.description} (required: #{arg.required})"
end
end
# Use a prompt in a conversation
greeting_prompt = client.prompt("daily_greeting")
chat = RubyLLM.chat(model: "gpt-4")
# Method 1: Ask prompt directly
response = chat.ask_prompt(greeting_prompt, arguments: { name: "Alice", time: "morning" })
puts response
# Method 2: Add prompt to chat and then ask
chat.with_prompt(greeting_prompt, arguments: { name: "Alice", time: "morning" })
response = chat.ask("Continue with the greeting")
```
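These pieces compose: a single chat can carry tools, resources, and prompts at once. A short sketch, reusing the placeholder tool, resource, and prompt names from the examples above:

```ruby
# Combine tools, a resource, and a prompt in one conversation
# (placeholder names from the examples above)
chat = RubyLLM.chat(model: "gpt-4")

chat.with_tools(*client.tools)
chat.with_resource(client.resource("project_readme"))
chat.with_prompt(client.prompt("daily_greeting"), arguments: { name: "Alice", time: "morning" })

response = chat.ask("Greet me, then summarize the README. Use any tools you need.")
puts response
```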
## Development
After checking out the repo, run `bundle` to install dependencies. Then run `bundle exec rake` to run the tests. Tests currently use `bun` to run test MCP servers. You can also run `bin/console` for an interactive prompt that will allow you to experiment.
There are also examples you can run to verify the gem is working as expected:
```bash
bundle exec ruby examples/tools/local_mcp.rb
```
## Contributing
We welcome contributions! Bug reports and pull requests are welcome on GitHub at https://github.com/patvice/ruby_llm-mcp.
## License
Released under the MIT License.