MCP Servers: The New USB for AI

Twenty years ago, every device had proprietary connectors. Your camera had one. Your phone had another. Your printer had yet another. Then USB came along and said: we're done with this.

The Model Context Protocol is doing the same thing for AI. And if you're not paying attention, you should be.

What MCP Actually Does

An MCP server is a standardized way for an AI to connect to anything outside itself — your database, your Slack, your files, your spreadsheets, your calendar. Instead of each AI vendor building custom integrations for every tool, you build one MCP server that works with any AI client.
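Concretely, MCP is a JSON-RPC 2.0 protocol: clients call methods like `tools/list` and `tools/call`, and servers return structured results. Here's a toy sketch of what a tool call looks like on the wire — the tool name `query_pipeline` and the handler are my own stand-ins, not a real server:

```python
import json

# A client-side tools/call request, shaped like MCP's JSON-RPC messages.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "query_pipeline",              # hypothetical tool name
        "arguments": {"stage": "negotiation"},
    },
}

def handle(req: dict) -> dict:
    """Toy server-side dispatch: run the named tool and wrap the result
    in an MCP-style response. Real servers use an SDK for this."""
    tool = req["params"]["name"]
    args = req["params"]["arguments"]
    result_text = f"ran {tool} with {args}"
    return {
        "jsonrpc": "2.0",
        "id": req["id"],
        "result": {"content": [{"type": "text", "text": result_text}]},
    }

# Round-trip through JSON, as it would travel between client and server.
response = handle(json.loads(json.dumps(request)))
print(response["result"]["content"][0]["text"])
```

The point isn't the plumbing; it's that any client that speaks this protocol can call any server that speaks it.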

That's the connective tissue.

Think about it: Claude needs access to your Salesforce. Before MCP, Anthropic either builds that integration, or your team does custom work, or you use some middleware layer. With MCP, you build one Salesforce connector and it works with Claude, with your custom agents, with the next AI model that comes out. The connection is tool-agnostic.

That's not a small thing. That's a fundamental architectural shift.

Why This Matters Right Now

We're in the age of horizontal AI deployment. Nobody's betting on one model anymore. You use Claude for reasoning, you use a smaller model for speed, you use a specialized model for specific domains. Your workflows need to talk to all of them.

But your data and systems don't change. Your Salesforce pipeline is still Salesforce. Your customer list is still in your database. Your operational context is still scattered across the same tools it's always been.

MCP solves the binding problem. It says: build the data connection once, make it model-agnostic, and plug it in wherever you need it.

What I'm Actually Doing With MCP

I've built three MCP servers in the last month. Real ones, in production:

1. Custom Dashboard MCP

My health dashboard is a static HTML file. Using an MCP server, I can tell Claude: "read my latest weight, glucose, and energy scores, then suggest this week's workout focus." Claude doesn't need to know where the data lives or how it's stored. It just queries the MCP and gets structured responses. When I update the dashboard format, I update the MCP schema. Claude keeps working.
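The tool handler behind this is tiny. A sketch of its shape — the field names and values here are invented for illustration; the real server parses my HTML dashboard rather than holding data inline:

```python
# Stand-in for data parsed out of the dashboard's HTML.
LATEST = {"weight_kg": 78.4, "glucose_mg_dl": 92, "energy_score": 7}

def get_health_metrics() -> dict:
    """Tool handler: return the latest metrics as structured data.
    The client never learns where or how the data is stored, so the
    storage format can change as long as this schema stays stable."""
    return {"schema_version": 2, "metrics": LATEST}

snapshot = get_health_metrics()
print(snapshot["metrics"]["energy_score"])  # → 7
```

When the dashboard format changes, only the code behind `get_health_metrics` changes; the schema the model sees stays put.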

2. Sales Pipeline MCP

I built an MCP that talks to a sales database. Now any AI I use can query pipeline health, forecast deals, or flag stuck opportunities. Before, each tool needed custom authentication and parsing. Now: one MCP, infinite clients.
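"Flag stuck opportunities" is the kind of tool this server exposes. A minimal sketch with hypothetical deal rows standing in for the database query:

```python
from datetime import date

# Hypothetical pipeline rows; the real server queries the database.
DEALS = [
    {"name": "Acme", "stage": "negotiation", "last_touch": date(2025, 1, 5)},
    {"name": "Globex", "stage": "proposal", "last_touch": date(2025, 3, 1)},
]

def flag_stuck(deals: list, today: date, max_idle_days: int = 30) -> list:
    """Tool handler: return deals with no activity in max_idle_days."""
    return [d["name"] for d in deals
            if (today - d["last_touch"]).days > max_idle_days]

print(flag_stuck(DEALS, today=date(2025, 3, 10)))  # → ['Acme']
```

Authentication and parsing live inside the server, once, instead of being rebuilt per client.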

3. Knowledge Base MCP

My internal docs live in Markdown. Instead of pushing raw files to every AI, I built an MCP that indexes them, enables semantic search, and returns contextual snippets. When I add a new document, the MCP picks it up automatically. My prompts don't change.
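The search tool's interface is simple even though the internals aren't. Here's the shape of it, with naive keyword scoring standing in for the embedding-based semantic search the real server does (doc names and contents are invented):

```python
# Toy index: filename -> snippet. The real server indexes Markdown files
# and scores them with embeddings, not substring matches.
DOCS = {
    "onboarding.md": "How to onboard a new customer step by step",
    "billing.md": "Billing cycles, invoices, and refund policy",
}

def search(query: str, top_k: int = 1) -> list:
    """Tool handler: return the best-matching snippets for a query."""
    terms = query.lower().split()
    scored = sorted(
        DOCS.items(),
        key=lambda kv: -sum(t in kv[1].lower() for t in terms),
    )
    return [text for _, text in scored[:top_k]]

print(search("refund invoices"))
```

Adding a document means adding an entry to the index; the prompt side never changes.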

This is what scales.

The Real Power: Decoupling

Here's the insight that matters: MCP servers decouple the data layer from the model layer.

Before MCP, this was coupled: every model-tool pair needed its own integration. Claude to Salesforce was one piece of custom code; a different model to the same database was another; swap models and you rewrote the plumbing.

With MCP, it's decoupled: each data source gets one MCP server, and any MCP-capable client can plug into it. Swap models, keep the connectors.

That's architectural freedom. That's not marketing. That's real.
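One way to quantify the freedom (my framing, not anything from the MCP docs): with a handful of models and a handful of data sources, point-to-point integrations grow multiplicatively, while shared connectors grow additively.

```python
def integrations(models: int, sources: int, decoupled: bool) -> int:
    """Count the integration pieces you maintain. Coupled: one custom
    integration per model-source pair. Decoupled: one MCP server per
    source plus one MCP-capable client per model."""
    return models + sources if decoupled else models * sources

print(integrations(4, 5, decoupled=False))  # → 20 pieces to maintain
print(integrations(4, 5, decoupled=True))   # → 9 pieces to maintain
```

The gap widens with every model you add, which is exactly the horizontal-deployment world we're in.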

The Current Friction Points

MCP isn't perfect yet. There are three annoying gaps:

Discovery

How do you find existing MCP servers? There's no npm-like registry that everyone uses yet: registries exist, but none is the default place most people look. That'll change, but right now you're often building from scratch instead of standing on shoulders.

Standardization of Patterns

MCP defines the protocol, but not best practices for common patterns. Should a database MCP return SQL errors directly, or abstract them? What should timeouts look like? These conventions will get figured out eventually, but early adopters are in the wild west.
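Here's the abstraction question in miniature. My own servers translate raw driver errors into a stable, model-friendly shape (modeled on MCP's tool-result format with its `isError` flag) — but that's my choice, not something the spec mandates:

```python
def run_query(sql: str) -> dict:
    """Tool handler that never leaks raw driver errors to the model."""
    try:
        # Stand-in failure; a real handler would execute sql here.
        raise RuntimeError("connection reset by peer")
    except RuntimeError as exc:
        # Wrap the error in a stable, structured shape the model can
        # reason about, instead of a raw traceback string.
        return {
            "isError": True,
            "content": [{"type": "text",
                         "text": f"query failed: {exc}. Retry or simplify."}],
        }

result = run_query("SELECT * FROM deals")
print(result["isError"])  # → True
```

Whether every database MCP should do this, or pass errors through verbatim, is exactly the kind of pattern the ecosystem hasn't settled yet.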

Client Support

Not every AI client supports MCP yet. Claude does. Some others do. But if you're building with a model whose client doesn't have MCP support built in, you're out of luck. That list is shrinking fast, but it's still a constraint.

What This Enables

When MCP matures — and it will — the implications are huge: you build a connector once and it works with every model; you switch models without rewriting a single integration; and the ecosystem accumulates shared, off-the-shelf connectors instead of bespoke glue code.

The Honest Take

MCP is a protocol that solves a real problem. It won't be the only protocol (there will be competition and fragmentation). But it's moving the industry in the right direction: towards decoupling, composability, and portability.

If you're building with AI and you're not thinking about MCP, you should be. Not because it's trendy. Because in five years, your competitors will have built single connectors that work everywhere, while you're still bolting custom integrations onto single vendors.

It's USB for AI. Like USB, it'll take a few years for the ecosystem to mature. But once it does, you never go back to proprietary cables.