Explain It Like I’m Five: What the Heck Is an MCP Server?

Unless you’ve recently spent your free time nose-deep in GitHub repos or interrogating ChatGPT like it’s a barista who got your coffee order wrong, you may not be familiar with MCP servers. That’s okay. Two weeks ago, I wasn’t either.

But thanks to a casual, “Hey, can you connect Notion to our project via an MCP server?” from a teammate (and my relentless need to avoid looking clueless), I dove headfirst into the rabbit hole. What I discovered is something potentially revolutionary for AI—and pretty dang cool for devs.

So what is an MCP server? Buckle up, friends. I’m gonna break it down in explain-it-like-I’m-five terms.

Step One: Understanding the Problem

Large Language Models (LLMs)—like ChatGPT or Claude—are great, but on their own, they’re basically really enthusiastic know-it-alls who haven’t read the news since 2023. Ask them about the weather in Austin today, and they’ll politely shrug. That’s because they’re trained on static data. No real-time updates. No live integrations. Just vibes.

To fix that, engineers introduced tools—special plugins that let LLMs access external information. Like the weather. Or a calendar. Or your inbox (with your permission, of course). But these tools come with a huge limitation: they’re bound to the specific client or interface you’re using.

So ChatGPT has its set. Claude has another. There’s no plug-and-play. Just silos.
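To make the silo problem concrete, here's a simplified sketch of how the same "get weather" capability has to be described twice, once per vendor. The field names follow the OpenAI and Anthropic tool-definition formats as of this writing, but the schemas are trimmed for illustration, so check each vendor's docs for the current shape:

```python
# Illustrative only: simplified tool schemas for the SAME capability,
# written once per vendor. Shapes follow the OpenAI and Anthropic APIs
# at the time of writing, trimmed down for readability.

openai_style = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}

anthropic_style = {
    "name": "get_weather",
    "description": "Get current weather for a city",
    "input_schema": {
        "type": "object",
        "properties": {"city": {"type": "string"}},
        "required": ["city"],
    },
}

# Same capability, two incompatible envelopes. That's the silo.
```

Neither definition is usable by the other client, so every integration gets written (at least) twice.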

Step Two: Enter the Hero—MCP Servers

On November 25, 2024, Anthropic dropped the Model Context Protocol (MCP), and suddenly, things got interesting.

Think of MCP as the USB-C of AI. It’s a standardized interface that lets any supported LLM interact with external tools, data sources, or APIs. Like a universal port. You build an MCP server once, and any client that supports MCP can use it.

Just like you don’t care what brand your USB-C charger is as long as it works—MCP makes it possible to treat tools the same way. Plug. Play. Done.

Okay, But What Does That Actually Mean?

Before MCP, if I wanted my LLM to interact with Gmail, I’d have to define every single interaction as its own tool:

  • Send email
  • Create draft
  • Get message
  • Delete email
  • [Insert 12 more verbs here]

Each one required setup. Each one needed configuration.
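Here's a sketch of that pre-MCP grind (tool names and fields are hypothetical, and real Gmail integrations need more parameters): every verb becomes its own hand-written schema.

```python
# Hypothetical sketch: pre-MCP, each action is a separate, hand-configured
# tool definition. Names and fields are invented for illustration.

def make_tool(name, description, properties, required):
    """Build one simplified function-calling tool schema."""
    return {
        "name": name,
        "description": description,
        "parameters": {
            "type": "object",
            "properties": properties,
            "required": required,
        },
    }

gmail_tools = [
    make_tool("send_email", "Send an email",
              {"to": {"type": "string"}, "body": {"type": "string"}},
              ["to", "body"]),
    make_tool("create_draft", "Create a draft",
              {"body": {"type": "string"}}, ["body"]),
    make_tool("get_message", "Fetch one message",
              {"message_id": {"type": "string"}}, ["message_id"]),
    make_tool("delete_email", "Delete a message",
              {"message_id": {"type": "string"}}, ["message_id"]),
    # ...and a dozen more verbs, each wired up by hand.
]
```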

After MCP? You point your client at one server (often installed straight from a GitHub repo), and suddenly the LLM can see a whole menu of available actions. In one neat package.

I connected the Notion MCP server this way—just pasted in the repo address, and boom. There was a gorgeous list of available API calls, pre-configured, no duct tape required.
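Under the hood, that "menu" is the server's answer to a `tools/list` request. Here's a trimmed, hypothetical version of what a Notion-style server might return (the JSON-RPC envelope follows the MCP spec; the tool names are invented for illustration):

```python
import json

# Hypothetical, trimmed example of an MCP tools/list response.
# Real servers send this over JSON-RPC 2.0; tool names here are invented.
tools_list_response = json.loads("""
{
  "jsonrpc": "2.0",
  "id": 2,
  "result": {
    "tools": [
      {
        "name": "search_pages",
        "description": "Search pages in the workspace",
        "inputSchema": {
          "type": "object",
          "properties": {"query": {"type": "string"}},
          "required": ["query"]
        }
      },
      {
        "name": "create_page",
        "description": "Create a new page",
        "inputSchema": {
          "type": "object",
          "properties": {"title": {"type": "string"}},
          "required": ["title"]
        }
      }
    ]
  }
}
""")

# The "menu" the client shows the LLM:
menu = [tool["name"] for tool in tools_list_response["result"]["tools"]]
```

The server describes its own tools, so the client never has to hand-configure them.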

The (Very Cool) Client-Server Dance

MCP uses a client-server architecture.

Here’s how it works under the hood:

  • The MCP Host is the AI application the LLM runs inside (Claude Desktop, ChatGPT, an IDE, etc.).
  • The MCP Client is part of the host application, maintaining a connection with an MCP server.
  • The MCP Server is where the action happens—it provides the tools, context, and prompts.
  • The LLM picks up what it needs, when it needs it, and runs with it.
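That dance maps onto a small set of JSON-RPC 2.0 messages. Here's a hedged sketch of the opening exchange from the client's side (message shapes follow the MCP spec but are simplified; real messages carry fuller capability and client-info fields):

```python
import json

# Simplified sketch of the MCP handshake, client -> server.
# Shapes follow the MCP spec but are trimmed for readability.

# 1. The client opens the session and proposes a protocol version.
initialize = {"jsonrpc": "2.0", "id": 1, "method": "initialize",
              "params": {"protocolVersion": "2024-11-05",
                         "capabilities": {},
                         "clientInfo": {"name": "demo-client", "version": "0.1"}}}

# 2. After the server replies, the client confirms it's ready.
initialized = {"jsonrpc": "2.0", "method": "notifications/initialized"}

# 3. Then it can fetch the tool menu and invoke a tool.
list_tools = {"jsonrpc": "2.0", "id": 2, "method": "tools/list"}
call_tool = {"jsonrpc": "2.0", "id": 3, "method": "tools/call",
             "params": {"name": "get_weather",
                        "arguments": {"city": "Austin"}}}

# Each message goes over the wire as one JSON object.
wire = [json.dumps(m) for m in (initialize, initialized, list_tools, call_tool)]
```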


Unlike traditional REST APIs, which are stateless, MCP relies on a persistent, stateful connection between the AI agent and the tool it talks to. So yes, there are some security implications (more on that in the resources I’ll post). But the potential for interoperability and dynamic AI workflows? Off the charts.
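One way to feel the difference from stateless REST: an MCP server remembers the session. Here's a toy dispatcher (my own sketch, not real SDK code) that refuses tool calls until the handshake has happened, which is something a stateless endpoint couldn't even check:

```python
# Toy illustration of MCP's statefulness (NOT real SDK code): the server
# tracks per-connection state, so the order of messages matters.
class ToyMcpSession:
    def __init__(self):
        self.initialized = False  # per-connection state

    def handle(self, msg):
        method = msg.get("method")
        if method == "initialize":
            self.initialized = True
            return {"jsonrpc": "2.0", "id": msg["id"],
                    "result": {"protocolVersion": "2024-11-05"}}
        if not self.initialized:
            # Stateless REST has no "session" to guard like this.
            return {"jsonrpc": "2.0", "id": msg.get("id"),
                    "error": {"code": -32002,
                              "message": "Session not initialized"}}
        if method == "tools/list":
            return {"jsonrpc": "2.0", "id": msg["id"],
                    "result": {"tools": [{"name": "get_weather"}]}}
        return {"jsonrpc": "2.0", "id": msg.get("id"),
                "error": {"code": -32601, "message": "Method not found"}}

session = ToyMcpSession()
early = session.handle({"jsonrpc": "2.0", "id": 1, "method": "tools/list"})
session.handle({"jsonrpc": "2.0", "id": 2, "method": "initialize"})
later = session.handle({"jsonrpc": "2.0", "id": 3, "method": "tools/list"})
```

The same `tools/list` request fails before the handshake (`early` carries an error) and succeeds after it (`later` carries the tool list), which is exactly the kind of session state REST avoids.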

Why It Matters

If you’re like me—new-ish to this whole AI-meets-dev-tools world—you might be asking, “Why should I care?”

Here’s why:

  1. Fewer one-off tools
    With MCP, we can stop reinventing the wheel for every single integration.
  2. Faster experimentation
    Pull down an MCP server from GitHub and start playing. No need to build everything yourself.
  3. Scalable infrastructure
    Standardization means your tools and workflows can scale with way less friction.
  4. Actual collaboration
    This helps democratize AI integrations. Less gatekeeping. More innovation.

The TL;DR

MCP Servers are the USB-C ports of the AI world. They standardize how tools are connected to language models, making it easier (and faster) to bring live data, APIs, and external context into the conversation.

And yes, they’re still new. The docs can be fuzzy. The GitHub projects vary wildly in quality. And when I asked Claude how to connect an MCP server, it thought I meant Minecraft. So be prepared for some trial and error.

But if you’re curious? Start simple. Try a weather server. Or Gmail. Or Notion (just know their API calls have as many underscores as a snake in a sweater). And if you ever feel overwhelmed, remember: it’s okay to ask questions. Even if your first one is, “What even is an MCP server?”

Developed from P2’s April Lightning Talks.