Highlights:
- The Model Context Protocol (MCP) is available under an open-source license, and Anthropic reports that several tech companies have already adopted the software.
- To use MCP, developers must integrate it into both their LLM-powered application and the remote system with which the application will interact.
Artificial intelligence startup Anthropic PBC has launched a toolkit designed to integrate large language models with external systems.
The Model Context Protocol (MCP) is now available as open-source software. Anthropic reports that several tech companies have already adopted it.
Companies can enhance the functionality of their LLMs by connecting them to external systems. For instance, an electronics manufacturer could enable an LLM to handle customer support inquiries by granting it access to a repository of troubleshooting guides. AI models can also interact with external applications in other ways, such as modifying their data.
Integrating an LLM with an external system often requires writing extensive custom code. Anthropic's new protocol aims to simplify this process by offering prebuilt components for connecting LLMs to external systems, reducing the need for developers to start from scratch.
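To illustrate how little glue code such a connector can require, here is a minimal sketch of an MCP server built with the official MCP Python SDK's FastMCP helper. The server name, tool name and hard-coded guide data are illustrative placeholders, not part of Anthropic's announcement.

```python
# Minimal MCP server sketch using the official Python SDK (package: mcp).
# Names and the troubleshooting-guide data are illustrative placeholders.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("troubleshooting-guides")

# Hypothetical knowledge base standing in for a real document repository.
GUIDES = {
    "router-x1": "1. Power-cycle the unit. 2. Check for firmware updates.",
    "camera-z9": "1. Reseat the battery. 2. Reset network settings.",
}

@mcp.tool()
def lookup_guide(product_id: str) -> str:
    """Return the troubleshooting guide for a given product ID."""
    return GUIDES.get(product_id, "No guide found for that product.")

if __name__ == "__main__":
    # Serve the tool over the default stdio transport so an MCP client
    # (such as Claude Desktop) can launch and query this process.
    mcp.run()
```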
Anthropic claims that MCP allows software teams to connect an LLM to an external system in under an hour.
The process is further streamlined by Claude Desktop, an application that facilitates access to the company’s Claude LLMs and automates portions of the manual effort required.
To use MCP, developers need to integrate it into both their LLM-powered application and the remote system it will connect to. Connections are then established through a three-step handshake: the application sends an initialization request to the MCP-enabled remote system, the system replies with a message describing its capabilities, and the application completes the connection with an automated confirmation.
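Based on the published MCP specification, that handshake corresponds to three messages, sketched below as plain Python dictionaries; the version string, names and capability contents are illustrative rather than taken from Anthropic's announcement.

```python
# The three-step MCP connection handshake. Field values are illustrative.

# Step 1: the application (client) asks the remote system to initialize.
initialize_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "1.0.0"},
    },
}

# Step 2: the MCP-enabled system answers with its own capabilities.
initialize_response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "protocolVersion": "2024-11-05",
        "capabilities": {"tools": {}},
        "serverInfo": {"name": "example-server", "version": "1.0.0"},
    },
}

# Step 3: the application confirms, completing the connection.
initialized_notification = {
    "jsonrpc": "2.0",
    "method": "notifications/initialized",
}
```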
MCP transmits data using the JSON-RPC 2.0 protocol, which encodes messages in the JSON format, a structure well suited to exchanging data between different systems.
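Every subsequent exchange uses the same JSON-RPC 2.0 framing. As a hedged illustration, the sketch below pairs a request to invoke a server-side tool with its reply; the tool name and arguments are hypothetical.

```python
import json

# A JSON-RPC 2.0 request asking an MCP server to run a (hypothetical) tool.
tool_call_request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "lookup_guide",
        "arguments": {"product_id": "router-x1"},
    },
}

# The matching response; the "id" field ties it back to the request above.
tool_call_response = {
    "jsonrpc": "2.0",
    "id": 2,
    "result": {
        "content": [
            {"type": "text", "text": "1. Power-cycle the unit. 2. Check for firmware updates."}
        ]
    },
}

# On the wire, each message is simply serialized to JSON text.
print(json.dumps(tool_call_request))
```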
Granting AI applications access to data from remote systems is just one of the use cases supported by MCP. Anthropic also notes that the protocol allows LLMs to interact with cloud-based tools. For instance, a company could link an AI programming assistant to a cloud-hosted development environment, allowing it to test the code it generates.
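As a rough sketch of that kind of tool-style integration, reusing the same illustrative FastMCP helper as above, a server could expose a tool that runs a project's test suite on the assistant's behalf. The test command, timeout and directory handling here are assumptions, not Anthropic's reference design.

```python
# Sketch of a tool-style MCP integration: letting an assistant run tests.
# Assumes pytest is installed in the target environment; adapt as needed.
import subprocess
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("dev-environment")

@mcp.tool()
def run_tests(project_dir: str = ".") -> str:
    """Run the project's test suite and return the combined output."""
    result = subprocess.run(
        ["pytest", "-q"],
        cwd=project_dir,
        capture_output=True,
        text=True,
        timeout=300,
    )
    return result.stdout + result.stderr

if __name__ == "__main__":
    mcp.run()
```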
MCP also includes a feature called sampling, which enables an MCP-enabled server to ask the connected AI application to run inference tasks on its behalf. Anthropic states that developers can configure this feature so that users can review these requests before they are executed.
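In the specification, sampling takes the form of a "sampling/createMessage" request that travels in the opposite direction, from the server back to the AI application, which can surface it to the user for approval before forwarding it to the model. The message below is an illustrative sketch of that shape; its contents are not from Anthropic's announcement.

```python
# A sampling request: the MCP server asks the client's AI application to
# generate a completion. The prompt text and token limit are illustrative;
# the client application can show this to the user before acting on it.
sampling_request = {
    "jsonrpc": "2.0",
    "id": 3,
    "method": "sampling/createMessage",
    "params": {
        "messages": [
            {
                "role": "user",
                "content": {"type": "text", "text": "Summarize the latest test results."},
            }
        ],
        "maxTokens": 200,
    },
}
```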
MCP has already gained traction among early adopters. Block Inc. and Apollo Inc., the startup behind a widely recognized sales platform, have implemented the protocol in several of their systems. According to Anthropic, several venture-backed developer tooling companies are also actively working on their own implementations.