The latest DeAI working group session featured a live demonstration of the Model Context Protocol (MCP) running directly on Internet Computer Protocol (ICP) canisters. It marked a tangible step towards enabling AI agents to communicate in a decentralised environment using on-chain infrastructure.
Presented on 15 May 2025, the session featured contributions from Baolong Tan, who shared a prototype implementation of MCP within the ICP ecosystem. MCP, designed to facilitate context sharing between AI models, is now capable of running on smart contracts hosted by the Internet Computer. The working example demonstrated that this kind of inter-agent protocol can be embedded into existing decentralised systems, paving the way for more complex AI collaboration without relying on centralised intermediaries.
The session also included discussion around developing an MCP software development kit (SDK), which would offer developers a consistent and more accessible way to build with the protocol on ICP. An SDK would help standardise how MCP operates across different projects, reduce technical friction, and potentially expand its adoption. Developers working with AI agents would benefit from having a framework that simplifies integration while maintaining the protocol’s contextual richness.
However, technical limitations of the ICP infrastructure were also highlighted. One significant issue is ICP's 2MB HTTP call limit, which constrains applications that handle large or continuous streams of data. MCP's design relies on streaming to maintain coherent communication between agents, and this limit presents a hurdle for applications that need higher throughput or uninterrupted data transfer.
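One common workaround for a per-call size ceiling is chunking: the sender splits a large payload into pieces that each fit under the limit, and the receiver reassembles them by index. The sketch below illustrates the idea in plain Rust; the constant and function names are hypothetical and not part of any official MCP or ICP SDK.

```rust
// Illustrative sketch only: splitting a large serialized MCP message into
// chunks small enough to fit under ICP's ~2 MB HTTP call limit.
// MAX_CHUNK_BYTES, chunk_payload, and reassemble are assumed names for
// this example, not real SDK APIs.

const MAX_CHUNK_BYTES: usize = 2 * 1024 * 1024; // assumed per-call ceiling

/// Split a serialized payload into call-sized chunks, each tagged with an
/// index so the receiving canister can reassemble them in order.
fn chunk_payload(payload: &[u8]) -> Vec<(usize, Vec<u8>)> {
    payload
        .chunks(MAX_CHUNK_BYTES)
        .enumerate()
        .map(|(i, c)| (i, c.to_vec()))
        .collect()
}

/// Reassemble chunks (assumed already sorted by index) into the original payload.
fn reassemble(chunks: &[(usize, Vec<u8>)]) -> Vec<u8> {
    chunks.iter().flat_map(|(_, c)| c.iter().copied()).collect()
}
```

Chunking trades latency for feasibility: each piece is a separate call, so a stream becomes a sequence of round trips rather than a continuous flow, which is part of why the limit matters for MCP's streaming model.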
MCP itself brings its own challenges. It manages nuanced context exchanges across multiple AI models, aiming to preserve continuity over time. This design requires persistent memory, reliable state handling, and a robust communication channel. ICP canisters were not initially built for such use cases, which means additional design considerations are required to accommodate MCP’s demands. While the prototype functions for demonstration purposes, scaling it for more intensive scenarios will need further engineering effort.
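The persistent-context requirement can be pictured as a per-session store that accumulates ordered entries over time. The minimal sketch below uses an in-memory map purely for illustration; in a real canister this state would need to survive upgrades (for example via stable memory), and the `ContextStore` type and its methods are assumptions for this example, not part of MCP or the ICP SDK.

```rust
use std::collections::HashMap;

// Hypothetical sketch of the kind of per-session context store MCP needs.
// A plain HashMap stands in here; a production canister would persist this
// state across upgrades rather than keeping it only in heap memory.

#[derive(Default)]
struct ContextStore {
    // session id -> ordered context entries, oldest first
    sessions: HashMap<String, Vec<String>>,
}

impl ContextStore {
    /// Append a context entry to a session, creating the session if new.
    fn append(&mut self, session: &str, entry: &str) {
        self.sessions
            .entry(session.to_string())
            .or_default()
            .push(entry.to_string());
    }

    /// Return a session's accumulated context, oldest first.
    fn history(&self, session: &str) -> &[String] {
        self.sessions
            .get(session)
            .map(|v| v.as_slice())
            .unwrap_or(&[])
    }
}
```

Even this toy version shows why canisters need extra design work for MCP: the store grows without bound unless entries are pruned, and every read or write is an on-chain state operation rather than a cheap in-process lookup.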
Participants in the session also discussed Google’s Agent-to-Agent (A2A) Protocol. This competing standard, developed to support structured communication between AI agents, was viewed as a potential counterpart or rival to MCP. A2A benefits from the backing of a large corporate entity and could offer easier integration paths for developers already working within existing AI ecosystems. From a technical perspective, integrating A2A into ICP was seen as possible, but it remains unclear whether this would be the right direction for decentralised projects.
Future development plans were also on the agenda. The group considered scheduling themed collaborative coding sessions, informally referred to as “Vibe Coding,” and confirmed that next week’s presentation would feature Icarus, who will explore hardware acceleration for AI within ICP. Hardware support is often an overlooked aspect of decentralised infrastructure but could become crucial as AI workloads grow heavier and more complex. Improved performance at the chip level may help resolve bottlenecks like the ones currently faced by MCP on ICP.
The DeAI working group continues to operate with a high degree of transparency. Rather than unveiling finished products, the sessions offer an ongoing view into the development process. Developers, researchers, and community members are invited to observe, contribute, and test ideas in real time. This approach encourages a wider range of feedback and ensures that technical challenges are addressed as they emerge, not after the fact.
Integrating MCP into ICP represents a broader attempt to rethink how decentralised AI systems should communicate. Protocols like MCP are trying to build a foundation where models can interact with one another, maintain memory across sessions, and do so securely within a decentralised environment. Although the challenges are many, the willingness of developers to test, iterate, and adapt sustains the momentum.
There are still questions about how far MCP can scale within ICP’s current structure. Persistent context and high-frequency interaction between agents are not areas where blockchains have traditionally excelled. The success of MCP within ICP may depend on improvements to ICP’s underlying architecture or continued refinements in the way MCP handles streaming and memory.
The idea of AI agents interacting without central brokers appeals to many in the decentralised tech space. By embedding those interactions into canisters and building them on transparent, publicly auditable infrastructure, developers are looking to create systems where machine intelligence can operate with minimal reliance on external services. It’s an ambitious goal, and early progress like the MCP prototype shows that at least part of the vision is within reach.
Next week’s session, with a hardware focus, could provide new angles on the conversation. If hardware acceleration can reduce latency or increase the data-handling capacity of canisters, then previously difficult features of MCP might become more feasible. That, in turn, could make decentralised AI communication faster, cheaper, and more stable—three things any protocol developer wants.
For now, the DeAI group is focused on small, measurable steps. The MCP prototype is working, a potential SDK is in discussion, and the community continues to suggest themes for future sessions. The path forward includes continued testing, performance evaluation, and likely some negotiation between protocol limitations and design ambitions.
The working group’s open approach continues to set it apart. Rather than delivering fully formed products behind closed doors, it builds in the open and invites feedback at every stage. This openness makes each session more valuable, allowing the broader developer and AI community to influence how decentralised AI infrastructure evolves.
With MCP now showing signs of viability on ICP and new hardware-focused work on the horizon, the groundwork is being laid for a more communicative, interoperable future for AI models. There’s no single solution yet, but these sessions offer a window into how that future might be built—one prototype, protocol tweak, and community discussion at a time.