Agents can discover & use tools hosted on a Model Context Protocol (MCP) server

Can I build an MCP server using the Python MCP SDK and use that server as a tool in CrewAI? I tried, but it didn’t work. I want to know whether CrewAI currently supports using an MCP server as a source of tools.

I also reviewed the enterprise-mcp-server, which supports deployment and works with Claude Desktop, but since I’m on Ubuntu, Claude Desktop isn’t available to me.

That’s why I tried using the Python MCP SDK, but it didn’t work either.
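
For reference, the kind of server I’m talking about is a minimal STDIO server built with the official Python MCP SDK, roughly along these lines (the add tool is just a placeholder):

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("demo-server")

# A trivial tool, just so the client has something to discover and call.
@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two numbers."""
    return a + b

if __name__ == "__main__":
    mcp.run()  # uses the STDIO transport by default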

So, my questions are:

  1. Can I use a custom MCP server (built using the Python SDK) as a tool in CrewAI?
  2. If yes, how can I properly integrate it?
  3. Is CrewAI actively working on adding support for MCP server integration?

Hi @Dhruvin_5179 - Thank you for posting to our community and welcome. To answer your questions, we’re currently working on adding MCP servers as tools in CrewAI OSS. As you mentioned, the enterprise version does already let you use your deployed crew as a server. Stay tuned for more on this.


Hey Tony, any update on the timeline for this? Thanks


@achris7,

It looks like MCP support for tools is already available in the latest stable version of crewai-tools (0.42.0):

pip install --upgrade "crewai-tools[mcp]"

Then:

from mcp import StdioServerParameters
from crewai_tools import MCPServerAdapter
import os

# For an STDIO based MCP server:
serverparams = StdioServerParameters(
    command="uvx",
    args=["--quiet", "pubmedmcp@0.1.3"],
    env={"UV_PYTHON": "3.12", **os.environ},
)

# etc...
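
From there, the adapter is used as a context manager that connects to the server and exposes its tools as regular CrewAI tools. A rough sketch of the rest, reusing the serverparams defined above:

with MCPServerAdapter(serverparams) as tools:
    # The tools discovered on the MCP server are now ordinary CrewAI tools.
    print(f"Available MCP tools: {[tool.name for tool in tools]}")
    # ...pass `tools` to your agents here, while the context is open.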

It would be great to have some docs or a video on how to set this up, as I’m super excited about it and want to use it.


Hey @Tony_Wood, how’s it going?

Actually, it already exists. It’s just that, once again, things aren’t exactly crystal clear. :sweat_smile:

The MCP server tools use the MCPAdapt library under the hood, which seems pretty good, by the way.

Here you can read more about the CrewAI integration, and here you’ll find an example (it actually works, I promise!). Just don’t forget to set up your LLM, if needed, before running the example, okay?

I figure we’ll get some good tutorials on this soon enough. In the meantime, you should probably start getting familiar with it, because this whole MCP topic really seems to be blowing up right now!
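
To give a feel for the shape of it, wiring MCP tools into a normal crew looks something like this (the role, goal, and task text are just placeholders, and you’ll need your own LLM configured):

import os

from crewai import Agent, Task, Crew
from crewai_tools import MCPServerAdapter
from mcp import StdioServerParameters

# Same STDIO server as in the earlier snippet.
server_params = StdioServerParameters(
    command="uvx",
    args=["--quiet", "pubmedmcp@0.1.3"],
    env={"UV_PYTHON": "3.12", **os.environ},
)

with MCPServerAdapter(server_params) as tools:
    researcher = Agent(
        role="Research Assistant",  # placeholder
        goal="Answer questions using the tools exposed by the MCP server",
        backstory="You rely on the tools you are given to look things up.",
        tools=tools,
        verbose=True,
    )
    task = Task(
        description="Find recent publications about CRISPR.",  # placeholder
        expected_output="A short summary of what was found.",
        agent=researcher,
    )
    crew = Crew(agents=[researcher], tasks=[task], verbose=True)
    result = crew.kickoff()
    print(result)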


Hey @Max_Moura, thanks for this… I’ll add it to my research, and I’m excited to see what the community creates.


A tutorial and docs are coming soon for this. At the moment we don’t push MCP hard because it’s still too early to recommend it for production use.
