Based on a tutorial by IBM Technology
If you’re working with AI applications, you’ve probably wondered how to connect your language models to external data and services effectively. The traditional approach using APIs works, but it can be complex and inconsistent.
IBM Technology recently released an excellent explainer comparing the new Model Context Protocol (MCP) with traditional APIs. I’m summarizing their insights here to help you understand these two approaches and decide which might work better for your AI projects.
Quick Navigation
- What is MCP? The USB-C Analogy (00:00-02:30)
- MCP Architecture and Components (02:31-04:00)
- MCP Capabilities and Primitives (04:01-06:15)
- Understanding APIs and REST (06:16-08:30)
- Key Similarities Between MCP and APIs (08:31-09:45)
- Fundamental Differences (09:46-12:30)
- How MCP and APIs Work Together (12:31-end)
What is MCP? The USB-C Analogy
The Model Context Protocol (MCP) is an open standard introduced by Anthropic in late 2024. Think of it as a USB-C port for your AI applications.
Just like your laptop’s USB-C ports can connect to monitors, external drives, and power supplies from different manufacturers using the same standard, MCP standardizes connections between AI applications, LLMs, and external data sources.
Key Points:
- MCP standardizes how applications provide context to LLMs
- It works like USB-C – one standard for multiple connections
- Eliminates the need for custom integrations for each data source
- Already making significant impact in the AI development community
My Take:
The USB-C analogy is brilliant because it immediately clarifies MCP’s value proposition. If you’ve ever struggled with multiple proprietary connectors, you’ll appreciate why standardization matters for AI integrations.
MCP Architecture and Components
MCP follows a client-server architecture with specific components that work together seamlessly.
Architecture Components:
- MCP Host: Runs multiple MCP clients
- MCP Clients: Each opens a one-to-one JSON-RPC 2.0 session with an MCP server using the MCP protocol
- MCP Servers: Expose capabilities like database access, code repositories, email servers
- JSON-RPC 2.0: The underlying wire protocol (a minimal handshake sketch follows below)
In the USB-C analogy, the laptop represents the MCP host, the USB-C connection is the MCP protocol, and your peripherals (monitor, drive, power supply) are the MCP servers.
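To make the wire format concrete, here's a rough sketch of the JSON-RPC 2.0 handshake an MCP client performs when it opens a session. The envelope fields (jsonrpc, id, method, params) come straight from JSON-RPC 2.0 and the initialize method is MCP's handshake, but treat the specific capability values and the client/server names as illustrative placeholders rather than a complete exchange.

```python
import json

# Request an MCP client sends to open a session (JSON-RPC 2.0 envelope).
# "initialize" is MCP's handshake method; the capabilities and clientInfo
# values below are illustrative placeholders.
initialize_request = {
    "jsonrpc": "2.0",      # required JSON-RPC 2.0 version marker
    "id": 1,               # request id, echoed back in the response
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",
        "capabilities": {},  # what this client supports
        "clientInfo": {"name": "demo-client", "version": "0.1.0"},
    },
}

# The server replies with a result carrying the same id, advertising
# which primitives (tools, resources, prompts) it exposes.
initialize_response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "protocolVersion": "2024-11-05",
        "capabilities": {"tools": {}, "resources": {}, "prompts": {}},
        "serverInfo": {"name": "demo-server", "version": "0.1.0"},
    },
}

print(json.dumps(initialize_request, indent=2))
print(json.dumps(initialize_response, indent=2))
```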
My Take:
This architecture is particularly elegant because it separates concerns cleanly. Your AI application doesn’t need to know the specifics of each external service – it just needs to speak MCP.
MCP Capabilities and Primitives
MCP addresses two main needs of LLM applications: providing contextual data and enabling tool usage. It does this through three key primitives.
The Three MCP Primitives:
- Tools: Discrete actions or functions the AI can call (e.g., get_weather, create_event)
- Resources: Read-only data items like text files, database schemas, file contents
- Prompt Templates: Predefined templates providing suggested prompts
The powerful feature here is runtime discovery. AI agents can query an MCP server to discover available primitives and invoke them uniformly. Every MCP server publishes machine-readable catalogs (tools/list, resources/list, prompts/list).
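To show what those primitives look like in code, here's a minimal server sketch using the FastMCP helper from the official `mcp` Python SDK (my own choice of SDK – the video doesn't show code). The weather tool, config resource, and summarize prompt are hypothetical examples; each decorator maps a function onto one of the three primitives so clients can find it through tools/list, resources/list, or prompts/list.

```python
from mcp.server.fastmcp import FastMCP  # helper from the official MCP Python SDK

mcp = FastMCP("demo-server")

# Tool: a discrete action the model can call (hypothetical example).
@mcp.tool()
def get_weather(city: str) -> str:
    """Return a (fake) weather report for a city."""
    return f"Sunny, 22 degrees C in {city}"

# Resource: read-only contextual data addressed by a URI.
@mcp.resource("config://app")
def app_config() -> str:
    """Expose application settings as a readable resource."""
    return "theme=dark\nlanguage=en"

# Prompt template: a reusable, parameterized prompt the client can request.
@mcp.prompt()
def summarize(text: str) -> str:
    """Suggest a summarization prompt for the given text."""
    return f"Please summarize the following text:\n\n{text}"

if __name__ == "__main__":
    mcp.run()  # serve over stdio so an MCP host can connect
```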
My Take:
Runtime discovery is a game-changer. Instead of hardcoding integrations, your AI agent can dynamically discover and use new functionality. This makes AI applications much more flexible and maintainable.
Understanding APIs and REST
APIs (Application Programming Interfaces) define rules for how systems request information or services from each other. They act as abstraction layers between requesting applications and the services they need.
REST API Characteristics:
- Communicates over HTTP using standard methods (GET, POST, PUT, DELETE)
- Uses endpoints like GET /books/123 or POST /loans
- Returns data typically in JSON format
- Most ubiquitous API style (the “web default”)
Example endpoints for a simple book-lending service:
- GET /books/123 – Fetch book details
- POST /loans – Borrow a book
- PUT /books/123 – Update book information
- DELETE /books/123 – Remove a book
Many commercial LLMs are offered over REST APIs – you send a JSON prompt and get a JSON completion back. AI agents often use REST APIs for web searches or interacting with company services.
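For comparison, here's what a plain REST interaction looks like in Python using the widely used `requests` library against the book-lending endpoints above. The base URL and the JSON fields are made up for illustration:

```python
import requests

BASE_URL = "https://api.example.com"  # hypothetical book-lending service

# GET /books/123 – fetch one book's details; the service returns JSON
response = requests.get(f"{BASE_URL}/books/123", timeout=10)
response.raise_for_status()
book = response.json()
print(book.get("title"))

# POST /loans – borrow a book by sending a JSON body
loan = requests.post(
    f"{BASE_URL}/loans",
    json={"book_id": 123, "member_id": 42},
    timeout=10,
)
loan.raise_for_status()
print(loan.json())
```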
Key Similarities Between MCP and APIs
Despite their differences, MCP and APIs share several fundamental characteristics that make them both valuable for system integration.
Shared Characteristics:
- Client-Server Architecture: Both use request-response patterns
- Abstraction Layer: Hide implementation details from clients
- Simplified Integration: Let developers wire systems together efficiently
- Standardized Communication: Follow defined protocols for interaction
My Take:
These similarities explain why developers familiar with APIs can quickly understand MCP concepts. The foundational principles remain the same – it's the implementation and optimization that differ.
Fundamental Differences
While similar in structure, MCP and APIs differ significantly in purpose, discovery mechanisms, and standardization approaches.
Purpose-Built vs General Purpose:
- MCP: Explicitly designed for LLM applications and AI agents
- APIs: General-purpose, not created specifically for AI/LLMs
- MCP bakes in AI-friendly assumptions and patterns
Dynamic Discovery:
- MCP: Clients can ask servers “What can you do?” at runtime
- APIs: Typically require developer updates when endpoints change
- AI agents can pick up new MCP features automatically (see the client-side sketch after these lists)
Standardization:
- MCP: Every server speaks the same protocol and patterns
- APIs: Each API has unique endpoints, parameters, and authentication
- One AI agent can use multiple MCP servers without custom adapters
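Here's a rough client-side sketch of that dynamic discovery, again assuming the official `mcp` Python SDK; demo_server.py is a placeholder for any MCP server, such as the FastMCP example earlier. The client asks the server what it offers instead of relying on hardcoded endpoint knowledge:

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Launch an MCP server as a subprocess; "demo_server.py" is a placeholder.
server_params = StdioServerParameters(command="python", args=["demo_server.py"])

async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()  # JSON-RPC 2.0 handshake

            # Runtime discovery: ask the server "what can you do?"
            tools = await session.list_tools()
            for tool in tools.tools:
                print(tool.name, "-", tool.description)

asyncio.run(main())
```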
My Take:
The “build once, integrate many” principle of MCP is particularly compelling for AI development where you often need to connect to multiple data sources and services.
How MCP and APIs Work Together
Here’s the surprising twist: MCP and APIs aren’t competitors – they’re complementary layers in the AI stack.
Many MCP servers actually use traditional APIs under the hood. An MCP server often acts as a wrapper around existing APIs, translating between MCP format and the underlying service’s native interface.
Real-World Example:
- The MCP GitHub server exposes tools like repository/list
- Internally, it translates each tool call into requests against GitHub's REST API (a minimal sketch of this wrapping pattern follows this example)
- Provides AI-friendly interface on top of existing API infrastructure
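As a sketch of that wrapping pattern, here's a hypothetical MCP tool that fronts GitHub's public REST API, using FastMCP and `requests`. The tool name list_repositories is my own illustration, not the actual GitHub MCP server's interface:

```python
import requests
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("github-wrapper")

# Hypothetical wrapper tool: the MCP-facing name is ours; internally it
# translates the call into a request against GitHub's public REST API.
@mcp.tool()
def list_repositories(username: str) -> list[str]:
    """List public repository names for a GitHub user."""
    resp = requests.get(
        f"https://api.github.com/users/{username}/repos",
        headers={"Accept": "application/vnd.github+json"},
        timeout=10,
    )
    resp.raise_for_status()
    return [repo["full_name"] for repo in resp.json()]

if __name__ == "__main__":
    mcp.run()  # expose the wrapper to any MCP host over stdio
```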
Available MCP Servers:
- File systems
- Google Maps
- Docker
- Spotify
- Growing list of enterprise data sources
My Take:
This layered approach is brilliant – you get the benefits of MCP standardization without losing access to the vast ecosystem of existing APIs. It’s evolution, not revolution.