Rethinking APIs and Discoverability
A significant portion of the discussion revolves around the perceived novelty and necessity of the Model Context Protocol (MCP). Many users question whether MCP truly introduces new concepts or if it's essentially a repackaging of existing API patterns, albeit with a specific focus on LLM interaction. The core debate centers on whether MCP's "discoverability" feature, particularly the list-tools endpoint, offers a substantial improvement over existing standards.
- "But the important thing in the picture isn't MCP - it is the power of the models themselves." (roenxi)
- "But part of me can't help wondering: isn't this the idea of APIs in general? Replace MCP with REST and does that really change anything in the article? Or even an Operating System API? POSIX, anyone? Programs? Unix pipes?" (jadar)
- "OpenAPI doesn't have a baked in discoverability mechanism. It isn't compatible with LLMs out of the box. It is a lower level abstraction." (doug_durham)
- "OpenAPI is not a protocol, but a standard for describing APIs. It is the list-tools for REST APIs." (opliko)
- "The main difference between MCP and REST is list-tools. REST APIs have 5 or 6 ways of doing that, including "read it from our docs site", HATEOAS, OAS running on an endpoint as part of the API. MCP has a single way of listing endpoints." (decaso)
- "But the way I see it, AI agents created incentives for interoperability. Who needs an API when everyone is job secure via being a slow desktop user? Well, your new personal assistant who charges by the Watt hour NEEDS it. Like when the CEO will personally drive to get pizzas for that hackathon because that's practically free labor, so does everyone want everything connected." (sshine)
- "MCP is basically just a wrapper with a couple of opinions. Who cares. It's probably better to be more opinionated about what exactly you put into MCP, rather than just exposing your hundreds of existing endpoints." (anon7000)
- "The only thing novel about MCP is requiring the schema is provided as part of the protocol. Like, sure it's convenient that the shape of the requests/response wrappers are all the same, that certainly helps for management using libraries that can wrap dynamic types in static types, but everyone was already doing that with APIs already we just didn't agree on what that envelope's shape should be. BUT, with the requirement that schema be provided with the protocol, and the carrot of AI models seamlessly consuming it, that was enough of an impetus." (caust1c)
- "MCP is not REST. In your comparison, it's more that MCP is a protocol for discovering REST endpoints at runtime and letting users configure what REST endpoints should be used at runtime." (Jonovono)
- "MCP makes [discoverability] central and maybe that makes all the difference." (spenczar5)
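The list-tools mechanism these quotes debate is, concretely, a JSON-RPC method. A minimal sketch of the request/response shapes (the `get_forecast` tool is a made-up example, not something from the discussion):

```python
import json

# JSON-RPC request an MCP client sends to discover a server's tools.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/list",
}

# Sketch of a response: each tool carries a name, a human-readable
# description, and a JSON Schema for its inputs.
response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "tools": [
            {
                "name": "get_forecast",
                "description": "Get the weather forecast for a city.",
                "inputSchema": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            }
        ]
    },
}

# A client can render these descriptions into the model's prompt and let
# it pick a tool by name -- the single discovery path decaso contrasts
# with REST's many conventions.
tool_names = [t["name"] for t in response["result"]["tools"]]
print(json.dumps(tool_names))  # ["get_forecast"]
```

This is the whole of the "discoverability" being argued over: one well-known method name returning schemas, rather than docs sites, HATEOAS links, or an OpenAPI document served from a conventional path.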
XML vs. JSON for LLM Interaction
A technical debate emerges regarding the suitability of JSON versus XML for LLM communication, with one user suggesting XML might offer advantages due to its more explicit structure and distinct syntax for tags and values.
- "XML actually works better with LLMs than JSON." (layer8)
- "Presumably because XML tags give better context. You have closing tags, and each array element has its own tags. The tag syntax is different from the value syntax, whereas in JSON both labels and string values use the same syntax. JSON strings are delimited by the same character ("), whereas XML uses two different characters (>…<). Non-string values in JSON have more variations in their delimitation than values in XML." (layer8)
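layer8's delimiter argument is easy to see side by side. A small sketch using only the Python standard library to serialize the same record both ways:

```python
import json
import xml.etree.ElementTree as ET

record = {"name": "Ada", "tags": ["math", "computing"]}

# JSON: keys and string values share the same delimiter (").
as_json = json.dumps(record)

# XML: tag syntax (<...>) is distinct from value text, closing tags
# repeat the element name, and each list element gets its own tags --
# the redundancy layer8 suggests gives an LLM more context.
root = ET.Element("person")
ET.SubElement(root, "name").text = "Ada"
tags = ET.SubElement(root, "tags")
for t in record["tags"]:
    ET.SubElement(tags, "tag").text = t
as_xml = ET.tostring(root, encoding="unicode")

print(as_json)
print(as_xml)
```

The XML form is more verbose, which is exactly the trade being claimed: extra tokens spent on structural redundancy that a model can lean on.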
Historical Parallels and Cyclical Trends
Several participants draw parallels between MCP and earlier technological trends and standards, suggesting that the current excitement around MCP might be a recurring pattern in the evolution of software and interoperability. This section highlights a sense of dĂŠjĂ vu and caution born from past experiences with hyped technologies that ultimately failed to live up to expectations or were superseded by simpler solutions.
- "when I read about MCP the first time and saw that it requires a "tools/list" API reminded me of COM/DCOM/ActiveX from Microsoft, it had things like QueryInterface and IDispatch. And I'm sure that wasn't the first time someone came up with dynamic runtime discovery of APIs a server offers. Interestingly, ActiveX was quite the security nightmare for very similar reasons actually, and we had to deal with infamous "DLL Hell". So, history repeats itself." (kerng)
- "So... is this OpenAPI then?" (nikolayasdf123)
- "Basically, yes. But with much more enthusiasm!" (lobsterthief)
- "It's articles like this that tell you we're close to peak hype. There's nothing revolutionary about a text encoding plus a schema. SOAP could do this 20 years ago." (quotemstr)
- "The problem isn't technical — the APIs were shut down because consumer tech is governed by ads, which are not part of APIs (or would be trivial to remove). You have surely noticed that APIs are alive and well in enterprise, why? Because they have customers who pay money, and API access does not generally break their revenue stream (although even there some are skittish)." (klabb3)
- "Remember Web 2.0? Remember the semantic web? Remember folksonomies? Mash-ups? The end of information silos? The democratizing power of HTTP APIs? Anyone? Anyone?" (bitwize)
- "No. What they are saying is best said with a quote from Battlestar Galactica: 'All of this has happened before, and all of this will happen again.' 'It' here being the boom and inevitable bust of interop and open API access between products, vendors and so on. As a millennial, my flame of hope was lit during the API explosion of Web 2.0. If you're older, your dreams were probably crushed already by something earlier. If you're younger, and you're genuinely excited about MCP for the potential explosion in interop, hit me up for a bulk discount on napkins." (klabb3)
- "The problem many LLMs have is that they're not very good at following step-by-step instructions. So when the goal is to offer a capability, the LLM's job is to provide the human-readable description of the capability and the machine-readable schema. But if you change the schema and introduce a new tool at the end of the day you're still going to have a LLM hallucinating and providing you with a response that doesn't perform the task." (anonymous)
- "It's not 100% but it's close enough for a lot of usecases now and going to change a lot of ways we build apps going forward" (mtkd)
- "The "this time it's different" energy is because assuming a human can interact with the system, and that vision models can drive a gui, who cares if there's an actual API, just have the AI interact with the system as if it was coming in as a human." (fragmede)
- "The problem of using a naive LLM to parse and execute based on arbitrary documentation is significant. The documentation is for humans and for humans only. So in that sense, you need a tool like MCP to make this work." (anonymous)
- "The incentives that existed 20 years ago for mashups are the same ones that will exist for LLM agents now. The corporations that are incentivized to offer interoperability (those whose business model is built around integration, like Salesforce, or companies that sell tools to developers) will continue to do so. Those that are not, will continue to resist." (anonymous)
- "MCP seems like a more "in-between" step until the AI models get better. I imagine in 2 years, instead of using an MCP, we will point to the tool's documentation or OpenAPI, and the AI can ingest the whole context without the middle layer." (jampa)
- "The future of web development doesn't necessarily imply we're going to get rid of APIs, but that we're going to move away from them gradually. This is a good thing, but it is also a lost opportunity." (anonymous)
- "And then, there are "architecture astronaut"s dreaming of an entire internet of MCP speaking devices - an "internet of agents" if you will. That is now requiring a separate DNS, SMTP, BGP etc. for that internet." (bwfan123)
- "Maybe it's that the protocol is more universal than before, and they're opening things up more due to the current trends (AI/LLM vs web 2.0 i.e. creating site mashups for users)? If it follows the same trend then after a while it will become enshittified as well." (drivers99)
- "The problem with MCP as the "everything API" is that you can't really take the "AI" part out of it. MCP tools are not guaranteed to communicate in structured formats! Instead of getting an HTTP 401 you will get a natural language string like "You cannot access this content because the author hasn't shared it with you." That's not useful without the presence of a NL-capable component in your system. It's not parseable!" (potatolicious)
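potatolicious's parseability point can be made concrete. A sketch contrasting a machine-readable REST error with a tool result that only carries prose (field names follow the MCP tool-result shape; the handling logic is illustrative, not from any real client):

```python
# A REST client can branch on a machine-readable status code:
rest_error = {"status": 401, "body": "Unauthorized"}

if rest_error["status"] == 401:
    action = "refresh-credentials"  # deterministic handling

# An MCP tool result may only flag that an error occurred and carry
# natural-language text -- there is no guaranteed error taxonomy to
# branch on, so a non-LLM consumer is reduced to string matching.
mcp_error = {
    "isError": True,
    "content": [{
        "type": "text",
        "text": "You cannot access this content because the "
                "author hasn't shared it with you.",
    }],
}

# Without an NL-capable component in the loop, the best a plain
# program can do is guess from keywords:
looks_like_auth = "access" in mcp_error["content"][0]["text"]
print(action, looks_like_auth)
```

The asymmetry is the argument: the first branch is reliable automation, the second is a heuristic that only an LLM (or a human) can interpret robustly.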
The Role of LLMs in the MCP Ecosystem
The discussion highlights the integral role of LLMs in realizing the potential of MCP, acting as the primary "users" of these protocols. The ability of LLMs to interpret human-readable descriptions and dynamically select and parameterize tools is seen as a key enabler, though concerns remain about the reliability and control of this interaction.
- "The way I see it, the key word here is "programmed". Sure, you read the links from responses and eliminated the need to hardcode API routes in the system, but what would happen if a new link is created or old link is unexpectedly removed? Unless an app somehow presents to the user all available actions generated from those links, it would have to be modified every time to take advantage of newly added links. It would also need a rigorous existence checking for every used link, otherwise the system would break if a link is suddenly removed. You could argue that it would not happen, but now it's just regular old API coupling with backward compatibility concerns." (renerick)
- "MCP is for technical users." (iLoveOncall)
- "They are structured in a way that machine program could parse and use. I don't believe it requires human-in-the-loop, although that is of course possible." (NomDePlum)
- "The main benefit is not that it made interoperability fashionable, or that it makes things easy to interconnect. It is the LLM itself, if it knows how to wield tools. It's like you build a backend and the front-end is not your job anymore, AI does it. In my experience Claude and Gemini can take over tool use and all we need to do is tell them the goal. This is huge, we always had to specify the steps to achieve anything on a computer before. Writing a fixed program to deal with dynamic process is hard, while a LLM can adapt on the fly." (visarga)
- "The LLM uses MCP to learn what tools the server makes available; the LLM decides when to use them." (asteroidburger)
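The host loop these quotes describe can be sketched in a few lines. Everything here is hypothetical scaffolding: `call_model` is a stand-in for whatever LLM API the host uses, and the tool is a stub, not a real MCP server.

```python
# Stub tool registry, as a host might build it from a tools/list reply.
tools = {
    "read_file": lambda path: f"<contents of {path}>",  # stub handler
}

def call_model(prompt):
    # Stand-in for a real LLM call: pretend the model decided to
    # invoke read_file. A real model returns this choice itself.
    return {"tool": "read_file", "arguments": {"path": "notes.txt"}}

def run_turn(user_goal):
    # The host only supplies the goal and the available tool names;
    # the model decides when (and whether) to use a tool.
    prompt = f"Goal: {user_goal}\nTools: {', '.join(tools)}"
    decision = call_model(prompt)
    if decision.get("tool") in tools:
        # The host executes the call the model chose.
        return tools[decision["tool"]](**decision["arguments"])
    return decision.get("text", "")

result = run_turn("summarize my notes")
print(result)
```

The division of labor is the point asteroidburger makes: MCP tells the model what exists, but the decision to call a tool, and with what arguments, sits with the LLM.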
Vendor Lock-in and the Future of Interoperability
A recurring concern is whether MCP will usher in a new era of genuine interoperability or simply provide another avenue for vendors to lock users into their ecosystems. The sentiment is that while the current trend favors openness, past experiences suggest that commercial interests could lead to restricted access and walled gardens, even with new protocols.
- "The AI Agent wave made interoperability hype, and vendor lock-in old-fashioned." (phh)
- "The future of interoperability is that vendors will still be trying to lock us into their walled gardens, and they'll be using these MCP servers to do it." (anonymous)
- "I don't think that's happened at all. I think some interoperability will be here to stay - but those are overwhelmingly the products where interoperability was already the norm. The enterprise SaaS that your company is paying for will support their MCP servers. But they also probably already support various other plugin interfaces. And they're not doing this because of hype or new-fangledness, but because their incentives are aligned with interoperability. If their SaaS plugs into [some other thing] it increases their sales. In fact the lowering of integration effort is all upside for them. Where this is going to run into a brick wall (and I'd argue: already has to some degree) is that closed platforms that aren't incentivized to be interoperable still won't be. I don't think we've really moved the needle on that yet." (potatolicious)
The Simplification of AI Agent Interaction
Some users believe MCP offers a valuable simplification for interacting with tools and data, particularly for AI agents. The idea is to abstract away the complexities of individual APIs, providing a standardized and more accessible interface.
- "MCP is almost literally just a JSON schema and a "yo, this stuff exists" for AI. It is great to have it standardised and we're all very thankful not to be using XML but there just isn't that much there." (roenxi)
- "The MCP is a good idea, but there is no real innovation there. The core idea is that a machine can understand the context that it is currently in." (anonymous)
- "The value proposition of MCP is that it makes it possible for AI agents to interface with the world around them in a standardized way. It is not the protocols themselves, but the ability of the LLM to wield them." (anonymous)
- "MCP is a desktop-first API so auth mostly stops being an issue. Most importantly, if you need to change anything you can, because the LLM will just figure it out, so you're way less product constrained." (mike_hearn)
Hype vs. Practical Value and Adoption
A recurring sentiment is skepticism regarding the current hype surrounding MCP, with some users feeling that the excitement outpaces tangible benefits or widespread practical adoption. There are comparisons to previous tech trends like blockchain and low-code, suggesting a potential for inflated expectations.
- "I don't want to sound like a skeptic, but I see way more people talking about how awesome MCP is rather than people building cool things with it. Reminds me of blockchain hype." (jampa)
- "My Judgement is a bit fogged. But if I get asked about building AI into our apps just one more time I am absolutely going to drop my job and switch careers." (moooo99)
- "Cynically, the more I look at the AI projects as an outsider, the more I think AI could fail in enterprises largely because of the same reason low code did. Organizations are made of people and people are messy, as a result the data is often equally messy." (moooo99)
- "MCP is a fad, it's not long term tech." (kasey_junk)
- "I'm still baffled no software vendor has already come up with a subscription to access the API via MCP. I mean obviously paid API access is nothing new, but "paid MCP access for our enterprise users" is surely in the pipeline everywhere, after which the openness will die down." (iLoveOncall)
- "In all seriousness though, I think HN has a larger-than-average amount of readers who've worked or studied around semantic web stuff." (1dom)
- "MCP is supposed to grant "agency" (whatever that means), not merely expose curated data and functionality. In practice, the distinction is little more than the difference between different HTTP verbs, but I think there is a real difference in what people are intending to enable when creating an MCP server vs. standard APIs." (TimTheTinker)
- "There is a long tail of applications that are not currently scriptable or have a public API. The kind that every so often make you think "if only I could automate this instead of clicking through this exact same dialog 25 times" Before, "add a public API to this comic reader/music player/home accounting software/CD archive manager/etc." would be a niche feature to benefit 1% of users. Now more people will expect to hook up their AI assistant of choice, so the feature can be prioritized." (tveita)
- "This collision of trends is what feels genuinely new and exciting to me. Conversational/voice AI tech now dropping + the current LLMs + MCP/tools/functions to mix in vendor APIs and private data/services etc. really feels like a new frontier. It's not 100% but it's close enough for a lot of usecases now and going to change a lot of ways we build apps going forward." (mtkd)
- "I don't understand what you mean. It (the main benefit?) is the LLM itself, if it knows how to wield tools. LLMs and their ability to use tools are not a benefit or feature that arose from MCP. There has been tool usage/support with various protocols and conventions way before MCP. MCP doesn't have any novel aspects that are making it successful. It's relatively simple and easy to understand (for humans), and luck was on Anthropic's side. So people were able to quickly write many kinds of MCP servers and it exploded in popularity. Interoperability and interconnecting tools, APIs, and models across providers are the main benefits of MCP, driven by its wide-scale adoption." (HumanOstrich)
- "MCP itself doesn't require the use of the LLM. There are other concepts, but for this use, Tools are key. A Tool is an operation, like a search. Have a look at the Filesystem example MCP server - https://github.com/modelcontextprotocol/servers/blob/main/src/filesystem/index.ts. It has a collection of Tools - read_file, read_multiple_files, write_file, etc." (asteroidburger)
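The filesystem server's shape can be sketched without the LLM at all, which is asteroidburger's point: on the server side a Tool is just a named handler behind a dispatch step. This is an illustration in Python, not the linked TypeScript implementation; the tool names mirror the ones quoted above.

```python
import os
import tempfile

# Handlers mirroring the filesystem server's read_file/write_file tools.
def read_file(path):
    with open(path) as f:
        return f.read()

def write_file(path, contents):
    with open(path, "w") as f:
        f.write(contents)
    return f"wrote {len(contents)} bytes"

TOOLS = {"read_file": read_file, "write_file": write_file}

def call_tool(name, arguments):
    # The protocol layer's job is routing: a tool-call request names a
    # tool and supplies arguments matching its declared schema. Any
    # caller -- LLM or plain program -- can drive this.
    if name not in TOOLS:
        raise ValueError(f"unknown tool: {name}")
    return TOOLS[name](**arguments)

# Round-trip through both tools in a temp directory.
path = os.path.join(tempfile.mkdtemp(), "demo.txt")
call_tool("write_file", {"path": path, "contents": "hello"})
print(call_tool("read_file", {"path": path}))  # hello
```

Nothing in the dispatch requires a model or a human in the loop; the LLM is one possible client of the same interface.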