DFINITY explores decentralised AI with onchain verification push

DFINITY researchers are testing a new approach that could allow powerful artificial intelligence models to run directly on the Internet Computer network, with a focus on verifiable results and lower infrastructure costs. Early discussions shared by founder Dominic Williams suggest the work centres on enabling GPU-based nodes to handle large language models while maintaining competitive performance against traditional cloud providers.

The proposal points to models of up to 70GB being able to run efficiently on relatively low-cost hardware, including devices that could operate outside conventional data centres. That marks a shift from earlier Internet Computer node setups, which often required specialised machines costing tens of thousands of dollars and were typically hosted in controlled environments.

At the centre of the idea is “onchain inference”, where AI outputs can be verified through decentralised algorithms rather than relying on trust in hardware. Current systems often depend on secure enclaves or trusted execution environments to guarantee privacy and integrity. DFINITY’s approach aims to reduce that reliance by leaning more heavily on cryptographic methods, though the network still uses hardware protections such as SEV-SNP.

Supporters argue this could widen access to advanced AI systems. Comments from community members suggest that if the model works at scale, it may lower barriers to entry, reduce costs, and limit the concentration of AI capabilities within a handful of large technology firms. The possibility of running capable models on distributed nodes, including at-home setups, has been framed as a step towards broader participation in AI infrastructure.

At the same time, questions remain around trade-offs. One concern raised by observers is whether verifiability adds enough practical value to offset potential increases in latency and cost. DFINITY's response indicates that verification levels could be adjustable: a default mode would rely on mechanisms such as speculative verification and slashing (penalising the staked tokens of nodes caught returning incorrect results) to keep nodes accountable with minimal overhead, while higher levels of certainty would increase both computation time and cost.
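The economics of the default mode described above can be sketched as follows. This is an illustrative toy model only: `Node`, `run_inference`, the audit rate, and the penalty fraction are invented for the example and do not reflect DFINITY's actual parameters. The idea is that outputs are accepted immediately, a random fraction is re-checked, and a node caught cheating loses part of its stake, making dishonesty unprofitable in expectation.

```python
import random
from dataclasses import dataclass

@dataclass
class Node:
    stake: float          # tokens the node stands to lose if caught cheating
    honest: bool = True   # simulation flag, not visible to the protocol

def run_inference(node: Node, prompt: str) -> str:
    # Hypothetical stand-in for a real model call.
    return f"answer({prompt})" if node.honest else "garbage"

def speculative_verify(node: Node, prompt: str, reference: str,
                       audit_rate: float = 0.1, penalty: float = 0.5) -> str:
    """Accept the node's output immediately; re-check a random sample
    of responses and slash the node's stake on a mismatch."""
    output = run_inference(node, prompt)
    if random.random() < audit_rate:      # occasional spot-check audit
        if output != reference:           # compare against a recomputed answer
            node.stake *= (1 - penalty)   # slashing makes cheating costly
    return output
```

In this framing, the `audit_rate` knob is what makes the verification level "adjustable": near zero it adds almost no overhead, while raising it toward full replication buys certainty at the cost of extra computation.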

Performance remains a key point in the discussion. According to Williams, the current focus sits within what he describes as a competitive range for open-weight models, where both speed and cost can match or even undercut traditional cloud-based inference. He suggests that verifiability could become an added benefit rather than a burden if efficiency holds at scale.

The longer-term direction may involve more specialised hardware. While the initial rollout is expected to rely on commodity GPUs, there are indications that application-specific chips could eventually play a role as the technology matures.

The broader context reflects a growing push across the industry to decentralise aspects of AI development and deployment. Competing projects are exploring similar ideas, often combining hardware-based security with distributed compute networks. Whether DFINITY’s model can deliver consistent performance while maintaining verifiability will likely shape how widely it is adopted.

For now, the concept remains in development, with many of its claims yet to be tested in large-scale, real-world conditions. Even so, the idea of running advanced AI in a decentralised, verifiable way is drawing attention, particularly as concerns about control, cost, and transparency continue to follow the rapid expansion of artificial intelligence.


Dear Reader,

Ledger Life is an independent platform dedicated to covering the Internet Computer (ICP) ecosystem and beyond. We focus on real stories, builder updates, project launches, and the quiet innovations that often get missed.

We’re not backed by sponsors. We rely on readers like you.

If you find value in what we publish—whether it’s deep dives into dApps, explainers on decentralised tech, or just keeping track of what’s moving in Web3—please consider making a donation. It helps us cover costs, stay consistent, and remain truly independent.

Your support goes a long way.

🧠 ICP Principal: ins6i-d53ug-zxmgh-qvum3-r3pvl-ufcvu-bdyon-ovzdy-d26k3-lgq2v-3qe

🧾 ICP Address: f8deb966878f8b83204b251d5d799e0345ea72b8e62e8cf9da8d8830e1b3b05f

Every contribution helps keep the lights on, the stories flowing, and the crypto clutter out.

Thank you for reading, sharing, and being part of this experiment in decentralised media.
—Team Ledger Life
