Stop Making LLM Wrappers

    Kent C. Dodds

    Are you building an AI chatbot for your website? Stop right there. I've got news for you: that's not what users want.

    Think about it. When Tony Stark needs something done, does he switch between different AI assistants for each task? Of course not. He has Jarvis, a single AI that can do it all. The same goes for the Star Trek crew with their ship's computer. That's the future we're heading towards, and I suggest we prepare for it.

    The problem with AI islands

    Right now, we're creating isolated AI experiences. Each website or app has its own AI assistant, forcing users to constantly switch contexts. It's like having a different personal assistant for every room in your house. Unnecessary overhead.

    But here's the thing: users don't want this fragmentation.

    They want a single, powerful AI assistant that can interact with any website or app.

    And that's where MCP comes in.

    MCP (Model Context Protocol) allows a user's preferred AI assistant to interact directly with your website or app. No more context switching, no more isolated experiences. No more glue code to integrate different systems.

    With MCP, your website becomes a set of tools that any AI assistant can use. It's like giving Jarvis access to your app's control panel. The user stays in their comfort zone, using the AI they're familiar with, while still being able to interact with your specific services.
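To make "a set of tools" concrete, here is a minimal sketch of the two core JSON-RPC messages an MCP server handles: `tools/list` (discovery) and `tools/call` (invocation). In a real app you would use an official MCP SDK (such as `@modelcontextprotocol/sdk` for TypeScript) rather than hand-rolling this; the `get_order_status` tool and its handler below are hypothetical, purely for illustration.

```typescript
// Minimal illustration of the MCP tool-calling message flow (JSON-RPC 2.0).
// Real servers should use the official SDK; this only shows the shapes.

type JsonRpcRequest = { jsonrpc: "2.0"; id: number; method: string; params?: any };
type JsonRpcResponse = {
  jsonrpc: "2.0";
  id: number;
  result?: any;
  error?: { code: number; message: string };
};

// Hypothetical tool: an AI assistant could call this to check an order.
const tools = {
  get_order_status: {
    description: "Look up the status of an order by its ID",
    inputSchema: {
      type: "object",
      properties: { orderId: { type: "string" } },
      required: ["orderId"],
    },
    handler: ({ orderId }: { orderId: string }) => `Order ${orderId} has shipped.`,
  },
};

// Dispatch the two core tool methods defined by the MCP spec.
function handle(req: JsonRpcRequest): JsonRpcResponse {
  switch (req.method) {
    case "tools/list":
      return {
        jsonrpc: "2.0",
        id: req.id,
        result: {
          tools: Object.entries(tools).map(([name, t]) => ({
            name,
            description: t.description,
            inputSchema: t.inputSchema,
          })),
        },
      };
    case "tools/call": {
      const tool = tools[req.params.name as keyof typeof tools];
      if (!tool) {
        return { jsonrpc: "2.0", id: req.id, error: { code: -32602, message: "Unknown tool" } };
      }
      const text = tool.handler(req.params.arguments);
      return { jsonrpc: "2.0", id: req.id, result: { content: [{ type: "text", text }] } };
    }
    default:
      return { jsonrpc: "2.0", id: req.id, error: { code: -32601, message: "Method not found" } };
  }
}
```

An assistant first sends `tools/list` to discover what your site can do, then `tools/call` to actually do it. The SDKs handle this plumbing for you, along with transports and capability negotiation; the point is that your site's job becomes describing and executing tools, not rendering a chat UI.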

    What this means for web developers

    Now, I know what you're thinking. "But I've already invested time in wrapping an LLM for my site!" I get it. But here's the harsh truth: users aren't going to want that in the long run.

    Instead of creating yet another chat interface that has no context about the user's preferences or goals, we should focus on building robust MCP servers and providing clear tools and APIs for the AI assistant the user already uses, the one that already has the necessary context about them.

    It's a shift in mindset, but it's crucial. We're moving from "How can I add AI to my site?" to "How can I make my site accessible to AI assistants?"

    What do we do right now?

    Unfortunately, as of April 2025, AI assistants are still really bad at MCP. The client experience is rough, and MCP server discovery is immature. I believe this will change very fast, but if you need to deliver great value to your users right now, there's something you can do to serve them today while preparing for the inevitable future.

    If you must wrap an LLM and add LLM chat to your application, at least build an MCP server that your LLM wrapper uses. That way, when the AI assistants get better, users can jump on and use your MCP server right away with their preferred client.
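One way to structure this, sketched below with hypothetical names (`siteTools`, `runLlm`, `chat`): keep a single tool registry, expose it to external assistants via your MCP server, and have your in-app chat's LLM loop execute tool calls through that same registry. When users move to their own assistants, the tools are already there.

```typescript
// Hypothetical shared tool registry: both your MCP server and your
// in-app chat wrapper execute tools through this one map, so the
// capabilities stay identical whichever front door the user comes in.
const siteTools: Record<string, (args: any) => string> = {
  get_order_status: ({ orderId }) => `Order ${orderId} has shipped.`,
};

// Stand-in for a real LLM call; a real implementation would send the
// tool definitions to the model and parse its tool-call response.
function runLlm(userMessage: string): { tool: string; args: any } {
  return { tool: "get_order_status", args: { orderId: userMessage.trim() } };
}

// The in-app chat wrapper: the LLM decides which tool to call, but the
// call itself goes through the same registry your MCP server exposes.
function chat(userMessage: string): string {
  const { tool, args } = runLlm(userMessage);
  return siteTools[tool](args);
}
```

The design choice here is that the chat UI is a thin, disposable layer: when assistants catch up, you delete `chat` and keep the registry.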

    The future is cohesive, not isolated

    To be clear, I'm not saying there's no place for backend AI processing or specialized agents that interact with AI models directly. That stuff is great for behind-the-scenes magic. But when it comes to user interaction, we need to think of the best user experience.

    By embracing MCP and focusing on providing tools rather than wrappers, we're setting ourselves up for the future. A future where users have seamless AI interactions across the entire web, not just within our little corners of it.

    So, let's stop building islands and start building bridges. Your users (and their AI assistants) will thank you for it.
