A Small Business, A Big Dream
Meet Sarah, owner of Bean Blossom, a cozy coffee shop. She excels at making lattes but struggles to connect her AI to tools like inventory and sales apps, a costly and time-consuming task. The Model Context Protocol (MCP), launched by Anthropic in November 2024, simplifies this with a universal standard. Let’s explore how it helps Sarah.
What is MCP? The Universal Adapter for AI
Connecting AI to apps was once like using mismatched phone chargers — custom code for each tool, slow and expensive. MCP acts as a universal USB-C charger, standardizing connections for any AI and app. Created by Anthropic and open-source, it helps Sarah’s AI work seamlessly.
The Old Way: A World of Chaos
Before MCP, Sarah’s developers wrote custom code to link her AI to apps, reworking it for every change or update. This was costly and risky. MCP provides a single, reliable connection method.
How MCP Works: The Nuts and Bolts
MCP uses a client-server system powered by JSON-RPC 2.0, a lightweight way for apps to send messages back and forth. Let’s break it down using Sarah’s coffee shop as our example.
The Key Players in MCP
MCP has three main components that work together like a dream team:
- MCP Host: This is the app where you interact with your AI, like a desktop version of Claude or a coding tool like Cursor. For Sarah, the host is her coffee shop’s management app, where she chats with her AI about inventory, drinks, or sales.
- MCP Client: Think of the client as a messenger. It connects the host to a server, passing requests (like “Check inventory”) and responses (like “You have 10 bags of beans”) back and forth. Each client handles one specific connection.
- MCP Server: The server is a small program that provides data or tools. Sarah might have one server for her inventory database, another for a weather API, and a third for her sales records. Servers share three things:
- Resources: Data like inventory lists, sales logs, or weather forecasts, defined with a clear structure (schema) so the AI knows what to expect.
- Tools: Actions like calculating profits, sending orders, or generating reports.
- Prompts: User-controlled instructions that guide the AI, like “Suggest a drink using [ingredients] for [weather condition].” These are created by users to tell the AI exactly how to use tools or data.
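Putting these pieces together, here’s a rough Python sketch of what Sarah’s inventory server might advertise when a client asks what it offers. It’s only an illustrative data shape, not a real server, and the names check_inventory, ingredient_list, and suggest_drink are hypothetical:
# Illustrative only: the rough shape of what an MCP server advertises.
# Tool, resource, and prompt names below are hypothetical examples.
inventory_server_offerings = {
    "tools": [
        {"name": "check_inventory", "description": "Return current stock levels"}
    ],
    "resources": [
        {"uri": "inventory://ingredient_list", "name": "ingredient_list"}
    ],
    "prompts": [
        {
            "name": "suggest_drink",
            "description": "Suggest a drink using [ingredients] for [weather condition]",
            "arguments": [{"name": "ingredients"}, {"name": "weather_condition"}],
        }
    ],
}

print([tool["name"] for tool in inventory_server_offerings["tools"]])  # ['check_inventory']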

Prompts: Your Control Center
Prompts are a cornerstone of MCP, and they’re fully controlled by you, the user. Think of them as clear, reusable instructions you give your AI to make it do exactly what you want.
For example, in Sarah’s shop, her prompt “Suggest a drink using [ingredients] for [weather condition]” might yield “vanilla iced latte” on a sunny day. A more detailed prompt, “Check low-stock ingredients and suggest a drink using [low-stock ingredients] for [weather condition], ensuring sufficient stock remains,” helps her manage inventory, for example by using up hazelnut syrup (2 oz, enough for 5 drinks).
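Behind the scenes, the host fills in those placeholders with real values and asks the server for the finished prompt. Here’s a minimal sketch of what that request might look like as a JSON-RPC message; the prompt name suggest_drink and the argument names are assumptions for illustration:
import json

# Hypothetical prompts/get request: the host fills Sarah's placeholders
# with real values before handing the finished prompt to the AI.
get_prompt_request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "prompts/get",
    "params": {
        "name": "suggest_drink",  # assumed prompt name
        "arguments": {
            "ingredients": "hazelnut syrup (2 oz)",  # filled from inventory data
            "weather_condition": "75°F and sunny",   # filled from weather data
        },
    },
}

print(json.dumps(get_prompt_request, indent=2))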
Sampling: Letting Servers Tap into AI Power
MCP’s sampling feature lets servers ask the AI (through the client) to generate creative or intelligent responses, like text or ideas, without needing direct access to the AI model. It’s like letting Sarah’s inventory server say, “Hey, AI, come up with a catchy name for a drink using these ingredients!” The client controls the AI, ensuring security and privacy, while the server gets the AI’s output. This makes the AI even smarter, enabling what’s called “agentic behaviors” — where it acts more like a proactive assistant.
Here’s how sampling works in Sarah’s coffee shop:
- The Need: Sarah wants her AI to create a unique drink name for a special using low-stock hazelnut syrup to attract customers.
- The Process: The inventory server sends a sampling request to the client, asking the AI to generate a response. The client reviews the request, decides which AI model to use (based on preferences like speed or intelligence), and sends the AI the request. After the AI generates a response, the client reviews it (with Sarah’s approval) and sends it back to the server.
- Human-in-the-Loop: Sarah stays in control, approving or tweaking the request and response to ensure the AI’s output fits her brand (e.g., no wacky names like “Nutty Sludge Surprise”).
For example, suppose Sarah’s inventory server wants to name a drink. It sends a request like this (simplified from the MCP docs):
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "sampling/createMessage",
  "params": {
    "messages": [
      {
        "role": "user",
        "content": {
          "type": "text",
          "text": "Suggest a catchy drink name using hazelnut syrup."
        }
      }
    ],
    "systemPrompt": "You are a creative barista naming drinks for a cozy coffee shop.",
    "includeContext": "thisServer",
    "maxTokens": 50
  }
}
- What It Means: The server asks the AI to suggest a drink name (“messages”), sets the AI’s role as a creative barista (“systemPrompt”), uses inventory data like hazelnut syrup (“includeContext”), and limits the response to 50 tokens (short and snappy).
- The Result: The AI might respond, “Hazelnut Harmony Latte.” Sarah reviews it, loves it, and the client sends it back to the server, which updates the inventory system to track the new special.
- Why It’s Cool: Sampling lets Sarah’s servers tap into the AI’s creativity without needing their own AI access, keeping everything secure and under her control.
Sampling makes Sarah’s AI more dynamic, like a brainstorming partner that suggests ideas while staying within her guidelines. It’s perfect for tasks like generating marketing slogans, analyzing customer feedback, or even drafting social media posts for Bean Blossom.
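For completeness, once Sarah approves the AI’s suggestion, the reply the client hands back to the server has roughly this shape. The model name is a placeholder here; in practice the client picks the model, not the server:
# Rough shape of a sampling/createMessage result, as the server receives it.
# The model name is a placeholder; the client decides which model actually ran.
sampling_result = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "role": "assistant",
        "content": {"type": "text", "text": "Hazelnut Harmony Latte"},
        "model": "example-model",
        "stopReason": "endTurn",
    },
}

print(sampling_result["result"]["content"]["text"])  # Hazelnut Harmony Latte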
The Connection Lifecycle: Getting Started
When Sarah’s app wants to use MCP, here’s how the connection kicks off, using her drink suggestion prompt as an example:
- Initialize Request: The client sends a message to the server, saying, “Let’s talk using JSON-RPC 2.0. What can you do?”
- Initialize Response: The server replies, “I’m ready! I use JSON-RPC 2.0, and I can provide inventory data, weather forecasts, and a user-controlled prompt like ‘Suggest a drink using [ingredients] for [weather condition].’”
- Share Abilities: The server sends a detailed list of its tools (e.g., “check_inventory”), resources (e.g., “ingredient_list”, “weather_data”), prompts (e.g., “Suggest a drink using [ingredients] for [weather condition]”), and sampling capabilities (e.g., “request creative names”).
- Start Working: The client and server exchange messages. Sarah’s AI can ask for data (like ingredients or weather), use tools, follow her prompt to suggest a drink, or use sampling to generate creative ideas, and the server responds instantly.
This process is fast, secure, and standardized, so it works whether Sarah’s connecting to a local database or a cloud-based API. The user-controlled prompt and sampling ensure the AI delivers exactly what Sarah wants, like a drink tailored to her shop’s inventory and the day’s weather or a catchy new menu item.
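To make the handshake concrete, here’s a rough sketch of the opening initialize exchange as JSON-RPC messages. The protocol version string, app name, and server name are illustrative values, not requirements:
# Simplified initialize handshake; names and version string are illustrative.
initialize_request = {
    "jsonrpc": "2.0",
    "id": 0,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",
        "capabilities": {"sampling": {}},  # the client offers sampling support
        "clientInfo": {"name": "bean-blossom-app", "version": "1.0.0"},
    },
}

initialize_response = {
    "jsonrpc": "2.0",
    "id": 0,
    "result": {
        "protocolVersion": "2024-11-05",
        "capabilities": {"tools": {}, "resources": {}, "prompts": {}},  # what the server offers
        "serverInfo": {"name": "inventory-server", "version": "1.0.0"},
    },
}
After this exchange, the client confirms it’s ready with a short “initialized” notification, and regular requests like prompts/get or tools/call can begin.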
Sarah’s AI in Action: A Day at Bean Blossom
Let’s see MCP at work in Sarah’s coffee shop. It’s a warm Monday morning, and Sarah wants her AI to suggest a new drink special based on the weather and her current inventory, name it creatively, and calculate the profit margin. She also wants to promote low-stock ingredients to clear space for a big restock next week. Here’s how MCP makes it happen:
- Sarah opens her management app (the host) and asks, “What’s a good drink for today, ideally using low-stock ingredients, with a catchy name, and how profitable would it be?”
- The client sends this request to three servers:
- A weather server checks the local forecast and returns a resource: “It’s 75°F and sunny.”
- An inventory server checks her stock and returns a resource: “You have vanilla syrup (10 oz), coconut milk (8 oz), and hazelnut syrup (2 oz, low stock, enough for 5 drinks).”
- A customer feedback server pulls data from her loyalty program, returning a resource: “Customers love nutty flavors this month.”
The process then continues:
- The inventory server provides a user-controlled prompt Sarah created: “Check low-stock ingredients and suggest a drink using [low-stock ingredients] for [weather condition], ensuring sufficient stock remains.”
- The AI sees that hazelnut syrup is low but has enough for a few drinks. Combining the weather data, inventory, and customer feedback, and following Sarah’s prompt, it suggests, “How about a hazelnut iced coffee? It’s perfect for a sunny day, uses your low-stock hazelnut syrup, and matches customers’ love for nutty flavors.” The AI confirms there’s enough syrup for 5 drinks, so Sarah won’t run out immediately.
- For the drink name, the inventory server uses sampling to request a creative name. It sends a sampling request (like the example above) to the client, asking the AI to generate a name like “Hazelnut Harmony Latte.” Sarah reviews and approves the name, ensuring it fits Bean Blossom’s vibe.
- For the profit margin, the client sends a request to a calculator server with a tool called calculate_profit. Sarah’s prompt says, “Calculate profit for [drink] using [cost] and [price].” The server uses data (e.g., cost: $1.20, price: $4.50) to return, “Profit margin is 73%.”
- The client sends the drink suggestion, creative name, and profit margin back to Sarah’s app. Sarah loves the “Hazelnut Harmony Latte” — it’s a great way to use up the low-stock syrup, promote a limited-time special, and capitalize on customer preferences. She adds it to the menu, knowing she has enough stock for the day.
Note: Why focus on low-stock ingredients like hazelnut syrup? Sarah’s goal is to clear out small quantities of ingredients before her big restock.
This entire process takes seconds, securely and seamlessly, because MCP standardizes the connections. Sarah didn’t need to spend a fortune on software development or wait for custom integrations; her AI just works. Beyond drink suggestions, MCP lets her AI tackle other tasks, like predicting sales for the week by analyzing trends from the sales server.
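For the technically curious, that profit calculation travels as an ordinary JSON-RPC tool call. Here’s a rough sketch; the tool name calculate_profit and its argument names are illustrative, and the margin is simply (price - cost) / price:
# Hypothetical tools/call request for the profit calculation.
profit_call = {
    "jsonrpc": "2.0",
    "id": 7,
    "method": "tools/call",
    "params": {
        "name": "calculate_profit",  # assumed tool name
        "arguments": {"drink": "Hazelnut Harmony Latte", "cost": 1.20, "price": 4.50},
    },
}

# The server's math: (4.50 - 1.20) / 4.50 is about 0.73, i.e. a 73% margin.
cost = profit_call["params"]["arguments"]["cost"]
price = profit_call["params"]["arguments"]["price"]
print(f"Profit margin is {round((price - cost) / price * 100)}%")  # Profit margin is 73%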
MCP’s Superpowers: Why It’s a Game-Changer
MCP enhances AI with:
- Statefulness: Remembers context across a conversation, so if Sarah keeps asking about cold drinks, later suggestions take that into account.
- Interoperability: Works with any MCP-compatible AI or app, making it easy to switch models or tools.
- Agent-Centric Design: Lets the AI make smart choices within the boundaries of Sarah’s prompts, like naming drinks.
Key Advantages of MCP
- Easy Tool Swaps: Sarah can switch weather servers (e.g., Weather Underground to OpenWeatherMap) without major changes.
- Resilience to API Changes: Handles weather server updates without client/host fixes.
- Token Trimming and Caching: Limits responses (e.g., 50 tokens) and caches data for speed during rushes.
Local vs. Remote: MCP’s Flexibility
MCP supports two ways to communicate, giving you options based on your setup:
Local Communication (Stdio Transport):
- How it works: Uses your computer’s input/output to send messages, like a direct line between Sarah’s app and her inventory server.
- Why it’s great: Fast, secure, and doesn’t need the internet. Perfect for Sarah’s shop, where her sales data lives on a local computer.
- Example: Sarah’s AI checks her inventory file on her laptop to see how many espresso beans are left, ensuring sensitive data stays offline.
Remote Communication (SSE/HTTP Transport):
- How it works: Uses HTTP POST and Server-Sent Events (SSE) to connect over the internet, like linking to a cloud-based API.
- Why it’s great: Flexible for connecting to online tools, like a supplier’s ordering system or a weather service.
- Example: Sarah’s AI pulls customer feedback from an online loyalty program to suggest new menu items or connects to a supplier’s API to check prices.
For Sarah, local communication keeps her sensitive data safe on her shop’s computer, which is crucial for protecting customer information and sales records. When she needs to connect to an online supplier or a cloud-based analytics tool, remote communication makes it possible. MCP’s ability to handle both gives her the best of both worlds — security for local data and flexibility for online tools. This dual approach also means her AI can work even if the internet goes down, keeping her shop running smoothly.
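As a rough, non-official sketch of the difference, the same JSON-RPC messages either travel over a local process’s stdin/stdout or over HTTP. The server script name and URL below are placeholders, not real endpoints:
import json
import subprocess
import urllib.request

PING = json.dumps({"jsonrpc": "2.0", "id": 1, "method": "ping"})

def ping_local_server() -> None:
    # Local (Stdio): launch the server as a subprocess and exchange
    # newline-delimited JSON-RPC messages over its stdin/stdout.
    proc = subprocess.Popen(
        ["python", "inventory_server.py"],  # hypothetical local server script
        stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True,
    )
    proc.stdin.write(PING + "\n")
    proc.stdin.flush()
    print(proc.stdout.readline())  # the server's JSON-RPC reply

def ping_remote_server() -> None:
    # Remote (SSE/HTTP): POST JSON-RPC messages to the server's HTTP endpoint;
    # replies and server updates arrive over a Server-Sent Events stream.
    req = urllib.request.Request(
        "https://supplier.example.com/mcp",  # hypothetical endpoint
        data=PING.encode(), headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        print(resp.read().decode())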
A Hands-On Example: The Calculator Server
To get a clearer picture of MCP, let’s explore a calculator example from the MCP documentation. Think of this as a simplified version of Sarah’s profit calculator, but instead of dollars, we’re adding numbers.
Scenario: Building a Calculator Server
Suppose Sarah’s AI needs to calculate profit margins or track sales totals. An MCP server can provide a calculator with tools like add, subtract, and divide, plus a resource to log past calculations. Here’s how it works:
- Start the Connection: Sarah’s app (the host) connects to the calculator server via a client using JSON-RPC 2.0 (either locally via Stdio or remotely via SSE).
- Share Abilities: The server says, “I have tools: add, subtract, and divide. I also have a resource called calculation_log and a user-controlled prompt like ‘Do [operation] on [numbers].’”
- Use the Tool: Sarah’s AI asks, “Add $50 and $30.” The client sends this to the server, which calculates $50 + $30 = $80 and responds, “$80.”
- Check Resources: Sarah’s AI asks, “What calculations have I done?” The server returns the calculation_log: “Added 50 and 30, got 80.”
- Close the Connection: When Sarah’s done, the client and server disconnect cleanly.
Here’s a simplified Python snippet for the calculator server. It uses a generic JSON-RPC library to keep the example short (a production MCP server would typically use an official MCP SDK), but it shows how MCP makes it easy to define tools and resources:
# MCP-style calculator server example (simplified, using a generic JSON-RPC library)
from jsonrpcserver import method, Result, Success, dispatch

# Store calculation history
log = []

# Tool: Add two numbers
@method
def add(a: float, b: float) -> Result:
    result = a + b
    log.append(f"Added {a} and {b}, got {result}")
    return Success(result)

# Tool: Subtract two numbers
@method
def subtract(a: float, b: float) -> Result:
    result = a - b
    log.append(f"Subtracted {b} from {a}, got {result}")
    return Success(result)

# Resource: Calculation log
@method
def calculation_log() -> Result:
    return Success(log)

# User-controlled prompt (example)
@method
def math_prompt(operation: str, numbers: list) -> Result:
    return Success(f"Do {operation} on {numbers}")

if __name__ == "__main__":
    # Example request: Add 50 and 30 (dispatch expects a JSON string)
    request = '{"jsonrpc": "2.0", "method": "add", "params": [50, 30], "id": 1}'
    print(dispatch(request))
This code sets up a server that adds or subtracts numbers, keeps a log, and supports a user-controlled prompt. For Sarah, a similar server could calculate profits, track daily sales, or even estimate ingredient costs for a new drink. MCP ensures the connection is fast, secure, and easy to set up, whether the server is running on her shop’s computer or in the cloud.
Why This Matters
The calculator example shows how MCP standardizes communication. Whether it’s adding numbers or checking coffee stock, MCP makes it simple to:
- Define tools, resources, and user-controlled prompts.
- Share them with any AI or app.
- Keep everything secure and efficient.
This simplicity means Sarah can add new tools — like a server to track customer loyalty points or analyze social media trends — without starting from scratch. Developers benefit too, as they can build reusable servers that work with any MCP-compatible AI, saving time and effort.
MCP vs. Google’s A2A: Two Pieces of the Puzzle
You might have heard of Agent2Agent (A2A), a protocol Google launched in April 2025 to help AI agents collaborate across systems. How does it compare to MCP? Think of them as complementary tools, each solving a different part of the AI equation.
MCP:
- Focus: Connects one AI to tools and data, like giving Sarah’s AI access to her inventory or a weather API.
- Use Case: Sarah’s AI uses MCP to check stock, calculate profits, or suggest drinks based on user-controlled prompts and sampling.
- Strength: Simplifies integration with apps and data, making AI more capable.
A2A:
- Focus: Helps multiple AI agents work together, like a team of experts collaborating on a project.
- Use Case: Sarah’s AI uses A2A to talk to a supplier’s AI for ordering beans or a marketing AI for planning a promotion.
- Strength: Enables AI-to-AI teamwork, supporting complex tasks like planning or negotiations.
Why MCP Matters: From Coffee Shops to Global Impact
MCP isn’t just a tool for developers — it’s a shift in how we use AI. Here’s why it’s a big deal for everyone:
- For Businesses: Small shops like Sarah’s can use AI to automate tasks, analyze data, and delight customers without spending a fortune on custom integrations. MCP lowers the barrier, making AI practical for small budgets.
- For Developers: MCP saves time by standardizing connections, letting you focus on building innovative features instead of wrestling with APIs. A single MCP server can work with multiple AIs, reducing redundant work.
- For Everyday Users: MCP makes AI more practical, whether you’re managing a business, organizing your life, or exploring new ideas. It’s about making AI a tool for everyone, not just tech giants.
- For the Future: As more companies adopt MCP, it could become the backbone of AI-powered apps, from smart homes to global enterprises.
Challenges and Solutions: Making MCP Work for Everyone
Here are a few potential challenges with MCP and how it tackles them:
- Learning Curve: MCP’s JSON-RPC 2.0 setup and sampling might feel daunting for new developers. Solution: The MCP docs offer clear tutorials, and the calculator example above is a great starting point. Community resources, like Nimrita Koul’s Medium tutorial, also help bridge the gap.
- Adoption: MCP needs apps and AI models to support it to reach its full potential, and sampling isn’t yet available in all clients like Claude Desktop. Solution: Anthropic’s open-source approach encourages adoption, and early integrations (like Claude) are building momentum. As more developers see the time savings, adoption is likely to grow.
- Security: Connecting AI to sensitive data and enabling sampling raises concerns. Solution: MCP uses secure transports (Stdio for local, HTTPS for remote), lets servers set strict boundaries on what AI can access, and includes human-in-the-loop controls for sampling to keep Sarah in charge of creative outputs.
The Future of MCP: A Connected, Smarter World
MCP could significantly enhance Bean Blossom in the coming years. Imagine Sarah’s AI connecting to her customer loyalty program to offer personalized drink recommendations based on past purchases — like suggesting a hazelnut latte to a regular who loves nutty flavors. It might also integrate with local event calendars to craft themed drinks for community festivals, boosting foot traffic. Beyond Sarah’s shop, MCP’s open-source nature means developers worldwide can create new tools and plugins, such as advanced analytics for sales forecasting or automated marketing campaigns. This could spark a wave of community-driven innovation, enabling small businesses to collaborate on shared MCP servers — like a group of local cafes sharing a regional supplier API to negotiate better rates. As MCP evolves, it promises a future where AI seamlessly enhances daily operations, making businesses like Sarah’s more connected, efficient, and customer-focused.
References: