
Oisy 1.4.6: Smarter Sends, Cleaner Swaps


The latest update from OISY Wallet, version 1.4.6, brings in a set of features designed to streamline daily crypto transactions without weighing down users with clutter or confusion. It’s less of a facelift and more of a polish job—one that focuses on fine-tuning the experience while expanding its reach across chains.

Support for Polygon is now baked in, which means native POL, USDC, and USDT are all good to go. This makes OISY even more versatile, stepping further into the multi-chain future. For users juggling stablecoins or hopping between chains, this update knocks down a few more walls.

One of the more subtle but meaningful upgrades is the smart send flow. OISY now remembers recently used addresses and flags anything unfamiliar. It’s a small addition that carries a lot of weight, especially in a space where a mistyped address can send assets into a digital void. By surfacing recently interacted-with wallets and giving a clear heads-up when something’s new, OISY adds a layer of reassurance without introducing friction.
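
Under the hood, a check like that can be as simple as comparing the destination against a saved list. Here is a minimal sketch of the idea, with hypothetical helper names rather than OISY’s actual code:

```typescript
// Hypothetical sketch of an address-familiarity check, not OISY's actual code.
type KnownAddress = { address: string; lastUsed: number };

// A destination is "unfamiliar" if this wallet has never sent to it before.
function isUnfamiliar(destination: string, recent: KnownAddress[]): boolean {
  const normalised = destination.trim().toLowerCase();
  return !recent.some((entry) => entry.address.toLowerCase() === normalised);
}

const recentAddresses: KnownAddress[] = [
  { address: "0xAbC123", lastUsed: Date.now() - 86_400_000 }, // used yesterday
];

if (isUnfamiliar("0xDeF456", recentAddresses)) {
  console.warn("You haven't sent to this address before. Double-check it.");
}
```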

The ICPSwap integration adds some automatic muscle to every transaction. Users get the best available swap offers by default, but the option to make manual adjustments remains. It’s not trying to be a black box; it’s just doing the legwork for you upfront, while keeping the driver’s seat unlocked.

On the interface side, the activity page has finally caught up with expectations. It now scrolls cleanly and loads progressively—something that sounds basic, but for power users flipping through transaction history, makes a noticeable difference. Combined with crisper fee displays and cleaned-up filters, the experience is more coherent. Buttons behave, data loads without tantrums, and filters feel like they were placed with intention.

Another new guest at the party is XTC, which now enjoys full token support. And for users working with euro-denominated assets, there’s vEUR, accompanied by a subtle but fresh Euro-stars background. It’s the kind of visual touch that doesn’t overwhelm the interface, but quietly signals that the token’s profile has grown.

There’s also better visibility for pending conversions, specifically for ‘ck’ tokens. These now show up in the activity log even before they fully land, and their values are accurately reflected. For anyone juggling conversions and watching balances across canisters, this saves some guesswork. It’s also one more way OISY tries to reduce the number of “wait, where did that go?” moments.

The wallet’s UI tweaks aren’t dramatic, but they’re deliberate. Fee displays are clearer, and filters are more responsive. There’s also the kind of bug fixing that doesn’t make headlines but keeps everything from wobbling under pressure.

The 1.4.6 update isn’t about changing direction. It’s about sticking to the path and making sure the pavement is smooth. The wallet continues to do what it set out to do—supporting a wide set of tokens across multiple chains, giving users control without overwhelming them, and keeping the day-to-day operations tidy.

There’s an obvious interest in making the app friendly for repeat users. The new address memory feature is one example of that, helping reduce errors and speed up transactions. The improved activity page is another. It’s an interface made with the understanding that crypto isn’t just something people check once in a while; it’s something they use like email or banking.

Polygon support brings in a large and active base, while still preserving OISY’s commitment to other networks. The ICPSwap integration is a good indicator of how OISY is quietly looking for ways to make smart decisions in the background, without being pushy.

These kinds of updates reflect a developer team that’s more interested in quality-of-life improvements than flash. That means focusing on the smaller frictions—slow loads, clunky filters, missing tokens—and smoothing them out. It’s work that rarely gets fanfare, but adds up to a better product over time.

For those keeping score, the combination of new token support (XTC, vEUR), enhanced transaction previews, and deeper chain compatibility ticks off several frequently requested features. None of them is flashy on its own, but together they show a wallet that’s listening closely to what users need.

The addition of visual cues like the Euro-stars backdrop for vEUR hints at a willingness to keep the interface personal and recognisable. The same goes for context-aware warnings during sends. These aren’t features that will go on marketing posters, but they’re the kind that users remember when things go wrong—or, more accurately, when they don’t.

Crypto wallets walk a fine line between control and convenience. Too much automation, and users feel boxed in. Too little, and it feels like a command-line interface wrapped in colours. OISY’s recent update doesn’t tip too far in either direction. It gives more options to those who want them, without demanding technical gymnastics. It’s also a quiet nudge that OISY is growing without trying to become a bloated control panel.

The pending conversion display for ck tokens fills a gap that previously led to uncertainty. By acknowledging what’s incoming—even if it’s not fully processed yet—OISY reduces mental overhead. The user doesn’t have to keep tabs in a separate tracker or refresh obsessively. It’s visible, it’s valued, and it behaves predictably.

With the current crypto wallet scene becoming increasingly noisy and fragmented, OISY’s update feels more like someone quietly reorganising your messy desk while you’re at lunch. You come back, and everything’s where it should be—with some new tools thrown in.

The update also shows how far things have come from the early days when wallets were rigid, feature-sparse, and indifferent to usability. Now, there’s an expectation that these tools behave like polished apps rather than experimental side-projects. OISY is clearly chasing that standard, and hitting more marks with each release.

There’s no bombast in version 1.4.6, but there is confidence. Confidence in the way new features are introduced, how they’re tucked into the interface without yelling for attention, and how they build on each other instead of cluttering things up.

Crypto users want wallets that don’t need a manual. They want clarity without compromise. This update brings a little more of that. It makes everyday tasks faster, decisions easier, and mistakes rarer.

As other products continue to chase new buzzwords and try to reinvent the wheel, OISY is doing the less glamorous but far more important work: making the wheel turn more smoothly. One feature at a time.

CycleOps Takes the Guesswork Out of Canister Launches


CycleOps has just flipped the switch on a new way to launch and manage Internet Computer canisters, and it comes with a sleek dashboard, an intuitive flow, and an absolute refusal to rely on complicated command-line guesswork. Developers can now spin up smart contracts in seconds without touching a CLI. The days of double-checking command syntax, hunting through docs for flags, or dealing with errors that are vaguely threatening and entirely unhelpful may be on the way out. CycleOps is giving developers the digital equivalent of a big red launch button.

It’s the kind of move that makes sense if you’ve ever worked with canisters on the Internet Computer. Creating and deploying them isn’t impossible through command-line tools, but it can be tedious, error-prone, and off-putting for developers who simply want to ship fast and iterate freely. CycleOps steps into that gap, streamlining the process into three easy steps: pick a subnet, set a cycle balance, and assign controllers. After that, you’re good to go. A new canister is live—no friction, no fuss.
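
Those three inputs map naturally onto a single launch request. As a rough sketch (the field names are illustrative, not the actual CycleOps API), the shape is something like this:

```typescript
// Illustrative shape of a canister launch request mirroring the three-step
// flow. Field names are hypothetical, not the actual CycleOps API.
interface CanisterLaunchRequest {
  subnet: string;        // step 1: pick a subnet, e.g. "application" or "fiduciary"
  initialCycles: bigint; // step 2: set a starting cycle balance
  controllers: string[]; // step 3: assign controller principals
}

const request: CanisterLaunchRequest = {
  subnet: "fiduciary",
  initialCycles: 3_000_000_000_000n, // 3T cycles
  controllers: ["alice-principal", "ci-deployer-principal"], // placeholders
};

console.log(`Launching on ${request.subnet} with ${request.initialCycles} cycles`);
```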

Behind this simplicity is a clear goal: to take the repetitive overhead out of launching and managing smart contracts. By offering pre-defined subnet options—whether you’re aiming for low-latency execution, larger storage, fiduciary compliance, or compatibility with a specific fleet—you get to align your canister’s environment with its actual workload. It’s a small touch, but one that shows an awareness of the diverse needs across decentralised applications.

Once the canister is launched, the dashboard becomes your control room. Everything that used to be buried in CLI commands is now available with clicks: start or stop a canister on demand, edit who controls it, change operational thresholds, take snapshots, and restore state. If something goes wrong, live log tailing is built right in—no more jumping between terminals or logging tools.

The core idea behind CycleOps is about putting control back into the hands of developers, without making them wrestle with infrastructure. This isn’t a full-stack platform trying to do everything. Instead, it’s laser-focused on making canister lifecycle management fast, easy, and error-resistant. Whether you’re deploying a brand new app or just need to push a small upgrade to a microservice, the process is now significantly less painful.

What used to be a potential stumbling block for teams new to the Internet Computer ecosystem could now be one of its selling points. It removes that early developer friction, especially for those coming from traditional cloud services where dashboards and automation are the norm. A UI-first approach makes the Internet Computer ecosystem more accessible to devs who might not want to dive into command lines just to get started.

CycleOps also fits nicely into team workflows. By allowing multiple controllers to be added at the point of launch, collaborative development is better supported from the beginning. There’s no need to update configs manually after the fact. Everything that matters is frontloaded into the process, reducing post-launch overhead and keeping developers focused on building features instead of wrangling tools.

There’s something refreshingly direct about the way the CycleOps team has structured the dashboard. Instead of building a tool that tries to predict what users might need, they’ve simply made the fundamentals more approachable. The interface doesn’t overcomplicate things. There’s no unnecessary abstraction or branding flourish—it’s just practical, clean, and geared for people who want to get things done.

That attention to practical use cases extends to features like state snapshots and log tailing. These might sound minor, but for teams running canisters in production, they can be a lifeline. Rolling back to a known state after a failed update or bug is far easier with snapshots. Live logs can catch issues before users even notice them. All of this adds up to a smoother devops experience that’s likely to be appreciated across both small indie teams and larger organisations building on the Internet Computer.

The name “CycleOps” feels more apt now than ever—it’s truly about operations management through the lens of cycle economics. Since canisters require cycles to function, setting the initial balance at the launch stage avoids a common pain point: running out of cycles too soon or over-provisioning. Developers can make informed decisions right from the beginning, optimising both performance and cost.
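
The arithmetic behind that decision is straightforward, since the network pegs 1 XDR to one trillion cycles. A back-of-envelope estimate, using an assumed ICP/XDR rate, might look like this:

```typescript
// Back-of-envelope cycles estimate using the network's 1 XDR = 1T cycles peg.
// The ICP/XDR rate below is illustrative only; the live rate comes from the CMC.
const CYCLES_PER_XDR = 1_000_000_000_000n; // 1 trillion cycles per XDR

function estimateCycles(icpAmountE8s: bigint, xdrPerIcpPermyriad: bigint): bigint {
  // icpAmountE8s: ICP in e8s (1 ICP = 10^8 e8s)
  // xdrPerIcpPermyriad: ICP/XDR rate scaled by 10^4 (e.g. 3.5 XDR -> 35_000n)
  return (icpAmountE8s * xdrPerIcpPermyriad * CYCLES_PER_XDR) / (100_000_000n * 10_000n);
}

// 10 ICP at an assumed rate of 3.5 XDR per ICP yields roughly 35T cycles.
console.log(estimateCycles(1_000_000_000n, 35_000n)); // 35000000000000n
```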

Although it’s still early days for the dashboard, the approach is promising. There’s plenty of scope for future improvements—automated alerts, tighter integration with CI/CD pipelines, or expanded role-based access—but what’s here already addresses a core set of needs in a straightforward way. It doesn’t try to be everything; it just tries to be useful.

There’s a quiet confidence in that approach. While other platforms aim to be flashy or overly complex in their offering, CycleOps keeps things grounded. It knows its audience and delivers tools that respect their time. By hiding the rough edges of canister deployment and giving back visibility and control through a clean interface, it makes the whole process feel a lot more human.

That matters more than it might seem. Developers are at their most productive when they’re not distracted by tools. They need things to work without surprises, errors, or unnecessary hurdles. CycleOps delivers on that front by removing friction without removing control. It simplifies, but it doesn’t dumb down.

The growing interest in Internet Computer as a platform for building decentralised apps means this kind of tooling couldn’t come at a better time. Developers are hungry for tools that let them iterate quickly, deploy confidently, and recover from failure gracefully. CycleOps checks those boxes in a way that feels intuitive and developer-first.

As the platform matures, having more tools like CycleOps will be crucial for keeping the barrier to entry low while maintaining the high standards of decentralisation and performance that the Internet Computer ecosystem is aiming for. This dashboard isn’t just a quality-of-life upgrade—it’s a key step in making serious development on ICP accessible to more people.

CycleOps is now live and ready for action. If you’re building on the Internet Computer and have ever found yourself copy-pasting commands from docs, wrestling with dfx, or manually updating configs after launch, this dashboard is probably what you’ve been waiting for. It’s the kind of upgrade that makes things smoother without adding noise, and it’s likely to become a go-to tool for developers serious about scaling on-chain.

With its no-fuss launch process, built-in state controls, and live log visibility, CycleOps delivers a smarter way to manage canisters. One that’s less about ceremony, and more about shipping. And if it helps you avoid yet another late-night debugging session, that’s a win worth logging.

BitGo Adds Institutional Custody for ICP

BitGo has added support for Internet Computer Protocol (ICP), offering regulated custody and cold wallet self-custody for the network’s native token. It’s a move that pulls ICP into the fold of serious infrastructure designed for institutional-grade security. Without needing to rewire anything too complex, this step adds legitimacy and technical depth to ICP’s ambition of operating as a top-tier Layer 1.

This partnership matters because BitGo isn’t in the business of casual additions. Its role in the crypto industry is strongly tied to institutions that require a higher standard of protection, compliance, and long-term reliability. For Internet Computer, whose architecture and goals stretch far beyond conventional blockchain activity, this new availability through BitGo signals that big players now have a formal, compliant route to engage with the token.

The current support includes both regulated custody and cold wallet self-custody, a useful dual offering that lets institutional clients decide between third-party protection or managing keys themselves. Regulated custody implies a structure that ticks the legal and compliance boxes across multiple jurisdictions. Cold wallet self-custody, on the other hand, allows entities to retain control while still using BitGo’s tooling to safeguard digital assets offline — away from online threats or phishing attempts.

BitGo has been in the business of digital asset security since 2013, and its clientele often includes exchanges, asset managers, and crypto-native funds that can’t afford to take chances with custody. That BitGo now sees ICP as a worthy addition says something about the project’s maturation. Whether it’s the work of the DFINITY Foundation, the rapid development of decentralised applications on the network, or the growing market interest in Internet Computer’s novel smart contract model, this moment didn’t arrive without background effort.

The Internet Computer itself is an interesting outlier in a field packed with similar offerings. Rather than only handling smart contracts in the traditional sense, it allows developers to build complete Web3 applications directly on-chain. These applications are known as canisters, and they interact via secure APIs without relying on traditional web infrastructure. Unlike projects whose dApps must link to off-chain servers or rely on cloud storage, the Internet Computer pushes for a full-stack approach. Everything lives on-chain — code, data, logic — and executes at speed without passing through external nodes.

This is where secure custody becomes more than a passive vaulting service. It’s about enabling participants to interact with on-chain applications, participate in governance, or manage treasuries with higher operational comfort. While custody support itself doesn’t affect how the protocol runs, it opens new doors for how people engage with it. Institutional users can now hold and manage ICP tokens through familiar, battle-tested infrastructure, reducing the barrier to entry.

BitGo’s track record across other major cryptocurrencies shows a similar pattern — custody first, then support for staking, delegation, or native governance actions. That same expansion is expected here. BitGo has already hinted that staking and dApp token support for ICP are on the way, which would allow users to participate more actively in the network while keeping assets within the secure framework they already trust.

That matters when you’re dealing with tokens used for governance, node operation, or network expansion. Internet Computer’s native governance model, known as the Network Nervous System (NNS), allows ICP holders to vote on everything from code upgrades to parameter changes. Once staking is live on BitGo, participants will likely have an easier route into that governance process, making the network’s future less dependent on niche technical expertise.

It also creates space for projects building on ICP to feel more secure when onboarding partners or funding contributors. If your token lives within the Internet Computer universe and you’re working with an investor that requires safe handling of assets, BitGo’s arrival might make that conversation easier. Institutional custody acts as a quiet but powerful validator — not just of the asset’s security, but of its relevance in broader conversations.

One of the unique angles here is how this all plays into Internet Computer’s philosophy of decentralisation with usability. A frequent challenge for decentralised networks is matching the convenience of centralised tools without sacrificing autonomy. While BitGo’s involvement introduces a layer of centralisation through custody services, it balances this by offering cold wallet self-custody, letting users opt out of external control if they wish.

The announcement comes at a time when institutions are carefully picking their blockchain partners, looking past hype and diving into stability, throughput, and ecosystem health. By backing ICP at this stage, BitGo places a bet on its future — one that includes developers, investors, and regulators all being part of the same conversation. It’s not about short-term speculation but sustained involvement.

BitGo doesn’t typically publicise integrations unless it sees potential longevity. Its backend systems need to be robust enough to support a variety of clients, some of whom move large volumes or operate under intense scrutiny. To support ICP, BitGo has likely completed a thorough internal process involving compliance reviews, technical integration, risk assessment, and policy updates.

The fact that staking and support for dApp tokens are already on the roadmap suggests this isn’t a surface-level integration. These features could further open the door to decentralised finance (DeFi) activity on Internet Computer, making it more practical for funds to interact with liquidity pools, stablecoins, and governance tokens without having to exit their preferred custody platform.

There’s also a reputational win at play. Internet Computer has had to navigate a tough narrative since its launch, with critics questioning its architecture, decentralisation, and early token dynamics. Each move like this — custody support, exchange listings, developer tools — adds clarity to its actual use cases and distances the project from its more controversial launch days.

What’s also noteworthy is how this integration reinforces the idea of multi-chain institutions. Most serious investors aren’t betting everything on one network. They require compatibility, consistency, and comfort across several ecosystems. Adding ICP to a portfolio is easier when it lives beside Ethereum, Bitcoin, Solana, or Polkadot in the same secure interface. BitGo becomes the neutral ground where assets co-exist, while developers and asset managers choose where to build or participate.

There’s no marketing gimmick in this update. It’s a backend improvement that carries weight because of who’s involved and what it enables. Custody might not always feel exciting, but it’s a foundational requirement for trust. In an industry where confidence can evaporate in an hour, having names like BitGo backing token custody adds gravity to the conversation.

For developers working on ICP dApps, this also hints at future tooling improvements. Once BitGo supports dApp tokens, project teams can move faster, secure treasury funds more reliably, and potentially tap into new capital sources that would’ve been unavailable without this kind of custodial support.

So while this news may not make headlines across mainstream outlets, it marks another brick in the wall for Internet Computer’s infrastructure. The network has steadily built developer engagement, tooling, governance structures, and now institutional entry points. Custody is just one layer — but it’s an essential one. With BitGo on board, ICP sits a little closer to the industry’s centre of gravity.

Juno Just Made Campaign Tracking Effortless

Juno Analytics just got sharper. Campaign tracking has officially landed, and now developers can tell whether traffic is coming from tweets, newsletters, ads, or a random Reddit post. No added setup, no extra cost, and certainly no cookies. With UTM support rolled into the dashboard, this privacy-first analytics tool is shaping up to be that quiet workhorse that delivers without raising eyebrows—or red flags.

It’s a bit of a paradox: seeing more without watching anyone. That’s what Juno is now doing. The platform’s Analytics feature has always kept user privacy front and centre, and the latest update sticks to the same values. There are no cookies involved, no persistent identifiers, and definitely no sneaky tracking across devices. Everything is done anonymously, with each visit generating a fresh random ID. And thanks to its design, developers get the numbers they need without having to explain cookie pop-ups to confused users.

That tiny 3KB JavaScript snippet is doing quite a bit of heavy lifting. It slots into any UI without fuss and doesn’t interfere with how an app loads or performs. So if your conversion rates were doing just fine before you added analytics, they’ll continue to do just fine after. It’s built lean and fast, and a footprint that small is hard to argue with. This isn’t a clunky add-on—it’s more like a lightweight layer that blends right into your application’s daily rhythm.
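
Wiring it up follows Juno’s documented one-call pattern (worth double-checking against the current docs); the IDs below are placeholders:

```typescript
// Minimal setup sketch based on Juno's analytics library; IDs are placeholders.
import { initOrbiter } from "@junobuild/analytics";

// Fire once on app start; page views are then collected anonymously.
initOrbiter({
  satelliteId: "aaaaa-bbbbb-ccccc-ddddd-cai", // your app's satellite (placeholder)
  orbiterId: "eeeee-fffff-ggggg-hhhhh-cai",   // the analytics Orbiter (placeholder)
});
```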

With campaign tracking in the picture, things get more focused. Developers can now attach standard UTM parameters to links and actually see what’s working. Ads, emails, and social media posts can be measured not just by hunches but by hard data. Which posts get people to show up? Which ones just generate likes without action? Juno’s dashboard has the answers, and they’re not hidden behind layers of vague graphs or locked behind a premium tier.
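
The parameters themselves are the standard UTM trio, so tagging a link needs nothing Juno-specific:

```typescript
// Tagging an outbound link with standard UTM parameters so the dashboard
// can attribute the resulting visits to a specific campaign.
const url = new URL("https://example.com/launch");
url.searchParams.set("utm_source", "newsletter");
url.searchParams.set("utm_medium", "email");
url.searchParams.set("utm_campaign", "spring-launch");

console.log(url.toString());
// https://example.com/launch?utm_source=newsletter&utm_medium=email&utm_campaign=spring-launch
```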

Juno also brings performance metrics to the table. Web Vitals—those signals Google and Chrome care about—are baked in. That means the same tool that tells you where your traffic is coming from also lets you measure how fast your interface is responding to that traffic. It’s a two-birds-one-stone kind of approach, and it helps developers keep things snappy without installing yet another plugin.

Open source from the start, Juno doesn’t require blind trust. The code is public, the logic is auditable, and there’s no mystery meat in how it works. Developers can poke around under the hood or fork it if they like. And because it doesn’t sell user data or send it off to someone else’s server farm, Juno sits well with projects that actually care about transparency.

A quirky little twist lies in where Juno stores your data—on the blockchain. By default, there’s no fixed geolocation, but if you’d prefer your data stay in Europe, you can pick a European subnet. That flexibility helps if you’ve got GDPR compliance on your mind or clients who ask where exactly the logs go. And for those extra cautious about data ownership, there’s a big green tick: all analytics data belongs to you. You can delete it. Reset it. Move it. Nothing is locked in a black box.

Juno’s future isn’t shy either. There are plans to gradually evolve into a decentralised organisation, guided by community members and contributors. That vision might not affect your dashboard today, but it sets a direction that’s refreshingly different from the big-name analytics tools that harvest more than they should.

Deploying the system involves creating something called an Orbiter, which collects and organises the analytics. You’ll need ICP to create one, but after that, a single Orbiter can handle data from multiple satellites. The structure may sound cosmic, but in practice it’s pretty straightforward. An Orbiter stores page views, custom tracking events, and performance data. It keeps everything timestamped and organised by session—each of which is a clean slate thanks to a new random ID each time.

There’s a ceiling though. Orbiters currently top out at 500 GB of stored data. That’s enough for most use cases, but it’s worth noting for larger operations planning to pull in years of data from multiple channels.

There’s also a bit of fine print. Because of how the system avoids tracking the origin of HTTP requests—by design—anyone can technically send data to your Orbiter if they mimic the expected format. This creates a small risk of polluted data or unexpected costs. That’s not unique to Juno, but the platform is upfront about it. Their advice: don’t overfill your smart contract with cycles. Keep a lean wallet, enable monitoring, and know that all analytics data, regardless of where it’s from, should be taken with a pinch of salt.

Campaign tracking makes things easier to analyse, but interpretation still requires judgement. That might sound like a disclaimer, but it’s more of a gentle reminder that numbers need context. Just because one ad drove more users doesn’t mean it converted better. Just because a link got clicks doesn’t mean it found the right audience. Juno gives you the map, but where you go from there is still up to you.

With this update, Juno Analytics has taken a step that feels both modest and meaningful. It didn’t add complexity or flashy dashboards with spinning rings. It added clarity. For developers who need to know what’s happening without compromising on user trust, that clarity counts.

The usual suspects in analytics have been relying on third-party cookies and persistent IDs for so long that privacy started to feel like a trade-off. Juno flipped that story. You don’t need to know who someone is to understand how they interact with your dapp or site. You just need honest metrics, a clean dashboard, and a system that doesn’t get in the way.

Campaign tracking brings sharper visibility to efforts that often live in the dark. Whether you’re running a weekend newsletter or launching a splashy product announcement on X, now you can actually tell what worked—right there inside Juno. It doesn’t require a tutorial. It just works. And when things just work, people tend to stick around.

That’s probably the best thing Juno’s offering here. Not big claims. Just small, thoughtful improvements that respect both the builder and the user. No cookies. No selling data. No weird detours. Just clear, fast, anonymous analytics. And now, with campaign tracking in the mix, a little more power to those quietly doing the work.

Notify No More: ICP Prepares to Retire Old Minting Route


The Internet Computer is drawing the curtains on one of its longest-running backend flows. Developers who rely on the ledger_notify method for minting cycles are now being asked to shift to a more efficient and robust alternative. The legacy path, once the standard for creating or topping up canisters using ICP tokens, is now marked for deprecation. This marks a technical turning point for many within the ecosystem, particularly those operating at the infrastructure level.

The existing process works in two steps: an initial ICP transfer to a subaccount tied to the Cycles Minting Canister (CMC), followed by a separate notify() call to the Ledger. The memo in the transaction determines whether it’s a request to create a canister or top up an existing one. Once this second step is triggered, the Ledger performs a chain of checks and forwards the transaction details to the CMC, which then mints cycles based on the ICP/XDR rate and completes the operation.
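
In code terms, the legacy flow looks roughly like the sketch below. The actor wrappers are hypothetical stand-ins rather than the literal Ledger candid interface, though the memo convention itself is documented:

```typescript
// Conceptual sketch of the legacy two-step flow. The wrappers below are
// hypothetical stand-ins, not the literal Ledger candid interface.
declare const ledger: {
  transfer(args: { to: Uint8Array; amountE8s: bigint; memo: bigint }): Promise<bigint>;
  notify(args: { blockIndex: bigint; toCanister: string }): Promise<void>;
};
declare function cmcAccountFor(canisterId: string): Uint8Array;

const TOP_UP_MEMO = 0x50555054n; // "TPUP": the memo tells the CMC what to do

async function legacyTopUp(targetCanisterId: string, cmcId: string): Promise<void> {
  // Step 1: transfer ICP to the CMC subaccount tied to the target canister.
  const blockIndex = await ledger.transfer({
    to: cmcAccountFor(targetCanisterId),
    amountE8s: 100_000_000n, // 1 ICP
    memo: TOP_UP_MEMO,
  });
  // Step 2: a separate call asking the Ledger to notify the CMC.
  await ledger.notify({ blockIndex, toCanister: cmcId });
}
```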

While this model has served the community well, it carries a few serious drawbacks. The coupling between the Ledger and downstream canister calls means the Ledger has to wait for responses from the CMC—or even from other subnets. If those downstream systems hang or become unresponsive, the Ledger is left holding open call contexts, which interferes with upgrades and puts the entire flow at risk. This issue isn’t theoretical; it has already created challenges for the ecosystem’s upgrade agility.

There’s also the matter of deduplication. To prevent replay attacks or accidental double usage, the Ledger only keeps track of past notify() calls for a 24-hour window. If a notification doesn’t come through successfully within that short timeframe, the underlying ICP tokens can’t be re-used or recovered unless a manual fix is applied. It’s not hard to imagine how this could affect developers working across time zones or with limited automation.

Even worse, if the response from a notify call is lost—for example, due to frontend failure or poor network conditions—there’s no recovery path. The Ledger doesn’t store the result, so users can’t later find out what happened to their canister creation or top-up. In short, the system has too many single points of failure and too little room for error.

That’s where the new cmc_notify flow comes in. The process kicks off in the same way—transferring ICP tokens to a designated CMC subaccount—but now skips the Ledger as an intermediary in the notification step. Instead, users directly interact with the CMC via one of two new endpoints: notify_create_canister() for new canisters, and notify_top_up_canister() for replenishing existing ones.
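
The same top-up through the new path drops the Ledger from the notification step entirely. Again, a hedged sketch with a hypothetical wrapper around the CMC interface:

```typescript
// The same top-up via the new path: no Ledger in the notification step.
// The wrapper is a hypothetical stand-in for the CMC's candid interface.
declare const cmc: {
  notify_top_up_canister(args: {
    block_index: bigint;  // index of the ICP transfer made in step one
    canister_id: string;  // canister that receives the minted cycles
  }): Promise<bigint>;    // cycles minted
};

async function topUp(blockIndex: bigint, targetCanisterId: string): Promise<bigint> {
  // Retrying with the same block index is safe: the CMC deduplicates and
  // returns the original result instead of minting twice.
  return cmc.notify_top_up_canister({
    block_index: blockIndex,
    canister_id: targetCanisterId,
  });
}
```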

This change alters the architecture in meaningful ways. Since the Ledger is no longer involved in downstream calls, its role becomes cleaner, reducing upgrade risks and improving overall reliability. The CMC, now in charge of its own deduplication, retains records for up to a million transactions, well beyond the previous 24-hour limit. If something goes wrong—like a lost network response—the user can simply try again with the same transaction index and get the original result.

The shift also brings more power to developers. When creating a new canister, they can now specify the subnet type or even pick a specific subnet. They can also define initial settings right from the start. This gives teams much-needed control during deployment, especially in production environments.

Switching to the new system is simple, technically speaking. Developers currently using ledger_notify just need to swap out the notify call to point to one of the two new CMC methods. No other part of the ICP transfer process needs to change. That said, developers should act quickly—there’s a firm removal date on the horizon. The notify() endpoint is scheduled to be permanently disabled at the end of June. After that, any integration still relying on the legacy path will stop working.

To ease this transition, the ICP team is taking a proactive approach. The Ledger’s notify() endpoint now logs the principal of each caller. This way, the team can try to identify and reach out directly to anyone still relying on the old flow. It’s an unusual but necessary move, driven by the goal of reducing friction and avoiding disruption for developers who might otherwise miss the announcement.

Those still using ledger_notify are being asked to confirm their status and provide an estimated migration timeline. If there are any blockers—be it a lack of engineering time, dependency on third-party libraries, or just confusion about how to proceed—the ICP team is open to support. Several developers have already begun the migration, and early feedback indicates that the process is straightforward and the benefits are real.

While this might seem like a low-level infrastructure change, it has broader implications for the stability and scalability of the Internet Computer. By removing dependencies between key canisters and allowing more graceful handling of delays or outages, the network is making itself more robust. It’s the kind of detail that won’t make headlines outside developer circles, but within the community, it’s a major quality-of-life improvement.

As decentralised systems continue to mature, these types of incremental updates—replacing tightly-coupled flows with more modular ones—play a big part in long-term resilience. The success of the cmc_notify transition will likely shape how future improvements to ICP infrastructure are rolled out. For now, the focus is on getting everyone off the old flow before it vanishes for good.

Anyone using ledger_notify and yet to begin the migration is encouraged to flag it publicly or contact the team. With only a few weeks left before the legacy route is removed entirely, preparation is key. The change is already live and ready for use, and those who’ve switched say it’s a clear upgrade in terms of control and reliability.

Developers who take the time now to transition will likely face fewer headaches in the months to come. And for a platform as complex and interconnected as the Internet Computer, fewer headaches means more time building—and less time untangling backend flows that no longer fit the way the network runs.

Murph AI Plugs into ICP, Wants You to Earn While You Train

Murph AI is looking to shake up the AI space on Web3 by making machine learning infrastructure feel less like an exclusive club and more like a public park. The team behind the project is building what they describe as a modular, scalable AI network on the Internet Computer Protocol (ICP), with an emphasis on usability and incentives for contributors.

Instead of simply providing machine learning services or tools for developers, Murph AI is aiming to give users the chance to contribute to AI workflows and earn for doing so. At the heart of the offering is what they call an “AI-driven infrastructure” and a “using-AI-to-earn” model that fits neatly into ICP’s decentralised architecture.

The project’s aim is to demystify and decentralise how AI systems are trained and deployed. Murph AI says it is designing a network where compute tasks can be broken down into modules, distributed, and scaled up or down depending on what’s needed—whether it’s a casual user contributing GPU power or a developer looking to run a large model. The infrastructure relies on ICP’s ability to run backend logic on-chain without needing traditional cloud providers, which helps to keep things decentralised and, in theory, more open.

One of Murph AI’s main selling points is the way it connects contribution to reward. By allowing users to take part in the training and fine-tuning of AI models, the system is structured to compensate those users in return. The team hasn’t publicly laid out the full tokenomics yet, but the pitch focuses on creating a sustainable incentive loop, where people benefit directly from the growth and use of the AI they help power.

The “using-AI-to-earn” phrase is catchy, but what it describes is a process that blends the mechanical with the economic. Users might run prompts, feed in data, validate outputs, or provide computing power, and each action is logged, tracked and credited. Over time, this could make Murph AI more community-run than the usual AI infrastructure offerings, which tend to be closed-source, centrally hosted, and dominated by major cloud providers.

Integration with ICP is a key part of how this system is expected to work. Rather than offloading compute or data storage to separate systems, Murph AI keeps everything within the Internet Computer’s ecosystem. That’s not just about branding—it’s also about how decentralised computation is coordinated. The canisters (smart contracts) on ICP allow for persistent data and logic, meaning that training histories, user contributions, and inference records can all live on-chain. This ties into the transparency angle: everything is verifiable and auditable, including where compute happened and what it produced.

For those familiar with AI tooling but not ICP, the move might take some adjusting. ICP’s model is quite different from Ethereum or Solana in how it structures computation. Where others often push AI workloads off-chain to maintain speed, Murph AI is betting that ICP’s native compute abilities are strong enough to keep everything under one roof. This approach is more attractive to users who value openness and who prefer to avoid dependencies on platforms like AWS or Google Cloud.

The “modular” part of Murph AI’s message refers to how the system handles workflows. Instead of one giant model doing everything, the idea is to have smaller components that handle specific tasks. These can then be pieced together into pipelines, reused, or improved without needing to rebuild the whole system. This modularity could make the network more adaptable and let it evolve faster. It also lowers the entry barrier for developers or contributors who want to work on a single part of a larger system.

One of the other goals of Murph AI is to create a shared layer where AI applications can be deployed and monetised without relying on Web2 services. The team talks about enabling new kinds of Web3 apps that are more intelligent, from chatbots and recommendation engines to smart automation tools. But unlike traditional software where developers shoulder all the cost, here the resource load is distributed and contributors are rewarded, creating what they hope will be a self-sustaining cycle.

There are plenty of buzzwords floating around AI and crypto at the moment, but Murph AI is trying to build something grounded in working infrastructure and real user engagement. That means focusing less on hype and more on actually running models in a decentralised way, letting people earn for helping out, and making the tech easier to integrate into everyday apps.

Of course, it’s still early days. Like many projects that blend AI with blockchain, Murph AI has more ambition than track record at the moment. But the focus on rewarding contributors directly and using ICP’s native tools gives it a clearer framework than some AI-in-Web3 projects that have struggled to move beyond demo stage. By tying rewards to actual inputs and outputs, and keeping the whole thing on-chain, the project hopes to avoid the pitfalls that come with outsourcing core infrastructure.

The Telegram channel for Murph AI is now open, and users are being encouraged to join, connect their wallets, and follow updates. The official website also features an overview of the concept and its roadmap, though some details—especially around token distribution and governance—are still to come.

Murph AI is stepping into a fast-moving space, but its bet is on turning the infrastructure itself into the playground. It’s not just about flashy AI demos or trading on the buzz of machine learning—it’s about building a system where the community gets a stake in the underlying tools that power it all.

The idea that AI infrastructure can be decentralised, modular and rewarding isn’t new, but Murph AI’s take leans on ICP’s strengths and puts contributors front and centre. If it works as intended, it could help shift more AI development into open networks where people can participate, earn, and build smarter systems together—without needing to rent time from tech giants. Whether that’s enough to make it stand out long-term will depend on how smoothly the system runs and how attractive the rewards actually are.

For now, the message is clear: if you’ve got compute, data, or curiosity to spare, there’s a new machine learning project waiting for you on ICP—and it might even pay you for it.

Moon or Doom? Telegram Game Lets You Guess ICP’s Price and Win Points


ICP Racer has quietly rolled onto Telegram, inviting crypto users to test their price instincts in a game that’s part prediction, part fast-paced fun. Built on the Internet Computer Protocol (ICP), this new mini app brings price guessing into your chat window — and throws in a points system, a multiplier for streaks, and a referral boost to keep things moving.

At its core, ICP Racer asks a simple question: will the price of ICP go up or down? Players tap “Moon” for up or “Doom” for down, and if they’re right, they collect 10 points. Get on a roll — six wins in a row — and the game starts multiplying those points. The idea is clean, quick, and designed to fit into a Telegram interaction without extra logins, downloads, or distractions.

But it’s more than a single-token game. The team behind it is already expanding prediction options to include a broader range of tokens within the ICP ecosystem. That means more opportunities to guess, win, and build up a running score. The expansion helps keep the experience dynamic while showcasing the wider ICP landscape in the process.

There’s also a social hook. A referral system lets users earn bonus points by inviting others to join the game. It’s a setup that nudges organic growth while adding a layer of competition — since points aren’t just about guessing, they’re also about outreach. The more people a user brings in, the better their own scoreboard starts to look.

The process to start playing is deliberately frictionless. Users open the Telegram bot, connect their ICP wallet, and start guessing. Each prediction triggers a five-second price roll. After that, the result flashes on the screen and points are either awarded or the streak breaks. It’s simple enough for first-time players to grasp immediately, and snappy enough to keep them interested.

What helps this stand out is that it’s grounded in ICP’s infrastructure. That means the platform benefits from the protocol’s speed, security, and transparency. Each prediction is logged, and results can be independently verified. While guessing games aren’t new in crypto, many are centralised or opaque. With ICP Racer, there’s a sense of visibility in how everything operates.

This use of ICP for gameplay also adds another layer of real-world application to the protocol, beyond its more typical technical and infrastructure deployments. Here, the tech is being used to support an accessible, game-like experience that’s intended to boost engagement with the wider ICP community. The simplicity of “Moon or Doom” makes it instantly understandable — and that’s a feature, not a limitation.

The scoring system encourages users to return, especially with the multiplier mechanic kicking in after six correct calls in a row. Streak bonuses like this are a staple in casual gaming, but they’re relatively fresh in the crypto prediction space. Instead of introducing complicated odds or confusing tokenomics, ICP Racer sticks with clarity — make the right call, rack up the streak, win more.
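
One plausible reading of those rules, with the caveat that the exact multiplier curve isn’t published (the 2x figure below is an assumption):

```typescript
// A plausible reading of the scoring rules; the 2x multiplier after six
// straight wins is an assumption, since exact figures aren't published.
function pointsForWin(streak: number): number {
  const base = 10;                        // 10 points per correct call
  const multiplier = streak >= 6 ? 2 : 1; // assumed boost once the streak hits six
  return base * multiplier;
}

let streak = 0;
let score = 0;
for (const correct of [true, true, true, true, true, true, true, false]) {
  if (correct) {
    streak += 1;
    score += pointsForWin(streak);
  } else {
    streak = 0; // one wrong call resets the streak
  }
}
console.log(score); // 90: five wins at 10 points, two at the assumed 2x, then a miss
```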

Tasks and challenges also feed into the point economy. Players can earn even more by completing assigned tasks, giving them another path to climb the leaderboard. Combined with referrals, these tasks make the system feel open-ended, not just a one-shot game. It’s designed to grow with its users and keep offering something new over time.

At a time when many crypto apps lean toward either full-blown exchanges or abstract metaverse ventures, ICP Racer threads the needle between simplicity and strategy. It invites Telegram users to jump in with minimal setup and start engaging right away. That immediacy lowers the entry barrier, especially for people who are curious about crypto but not yet active traders.

This kind of low-stakes, prediction-based interaction also gives users a way to interact with markets without committing actual funds. Instead of risking tokens, players risk streaks. Instead of watching candles on a chart, they watch a five-second price roll. It makes market movement feel like a quick decision rather than a research project.

What makes it tick is the seamless blend of casual interaction and on-chain trust. The ICP integration ensures that predictions and results aren’t just guesses behind a curtain. There’s a framework ensuring fairness, and a mechanism backing each outcome. It’s not gambling in the traditional sense — it’s point-based engagement with verifiable outcomes.

ICP Racer isn’t shouting from the rooftops, but it is quietly creating a use case that fits naturally into the existing habits of Telegram users. That’s no small thing. Crypto communities already live on Telegram, share predictions, argue about prices, and circulate links. Bringing a game into that space — one that works directly inside the app — taps into existing behaviour instead of trying to create a new one.

As more users discover the mini app, there’s potential for it to evolve beyond predictions. The framework could support leaderboard competitions, seasonal events, or themed challenges around other tokens or events in the ICP world. For now, though, it’s keeping things simple — tap, guess, wait, and see.

That balance of approachability and on-chain infrastructure makes ICP Racer feel more focused than many casual crypto games. It doesn’t require users to learn a new metaverse map or adopt a new token. It just asks them to take a chance — and rewards them for getting it right.

At a time when many in the crypto space are looking for clearer applications and more interactive experiences, ICP Racer might not be flashy, but it’s refreshingly accessible. It strips things down to the basics: price, timing, and instinct. With Telegram integration, referral bonuses, and secure predictions underpinned by the Internet Computer, the project adds a lightweight but structured way to keep users engaged.

And with summer just around the corner, guessing whether ICP will Moon or Doom might become a new kind of habit for chat-happy crypto users. The points are digital, the risks are low, and the taps are easy. Whether you’re an accurate guesser or just in it for the fun, ICP Racer is giving users a reason to come back and check again. And again.

SEV Sneaks In: ICP’s Encryption Overhaul Takes Shape


Quietly but confidently, the Internet Computer Protocol (ICP) is getting a serious security boost. Behind the scenes, engineers are shaping up support for AMD’s Secure Encrypted Virtualization (SEV), marking a bold move in how ICP handles replica upgrades and state encryption. The work is technical, time-consuming, and anything but cosmetic.

SEV is designed to protect the confidentiality of data in use, offering encryption at the virtual machine level. For ICP, adopting SEV means anchoring a node’s encryption keys to the very code it’s running. That makes keys tamper-resistant and unrecoverable outside their intended environment — a smart choice, but one that comes with some hefty structural consequences.

The team has confirmed that early tests of SEV support are scheduled to begin this summer. While that may sound straightforward, it’s hiding a complex web of engineering challenges. Rolling out SEV on ICP calls for a deep rethink of the upgrade and release machinery that sits under the hood of the protocol.

One of the biggest shifts comes in how encryption keys are generated and protected. Under this new approach, each replica’s keys are derived from a cryptographic hash of the running code. These keys don’t leave the virtual machine, and crucially, the hash changes with every release. That means an upgraded replica won’t be able to read the encrypted state left behind by an older one — unless the system finds a way to securely hand off those keys between versions.
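
Conceptually, the binding works like a key derivation seeded by a hash of the running image: change the code and the key changes with it. The sketch below illustrates the idea only; real SEV key derivation happens inside AMD’s firmware, not in application code.

```typescript
// Conceptual illustration only: binding a sealing key to a measurement of
// the running image. Real SEV key derivation happens in AMD firmware.
import { createHash, hkdfSync } from "node:crypto";

function deriveSealingKey(replicaImage: Buffer, salt: Buffer): Buffer {
  // The "measurement" is a hash of the exact code being run.
  const measurement = createHash("sha256").update(replicaImage).digest();
  // Key material is bound to that hash: change the code, lose the key.
  return Buffer.from(hkdfSync("sha256", measurement, salt, "state-sealing", 32));
}

// A new release produces a new measurement, hence a different key, which is
// why upgrades need a secure hand-off of keys between versions.
```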

To make that handoff safe and community-approved, the upgrade process itself is being overhauled. The new method will have to enable trusted key-sharing between consecutive replica versions, all while ensuring that no unauthorised or tampered replicas get access to the encrypted state. That’s a major shift from the current system and has knock-on effects across several moving parts.

The process of releasing new code will also need to adapt. Each release must now have its SEV hash calculated and stored, so the NNS Registry — the Internet Computer’s source of truth for node configuration — needs an upgrade of its own. It will have to record and distribute these hashes alongside the usual information, so that nodes know which versions are approved and which are off-limits.

Booting the operating system becomes more complicated, too. The bootloader — responsible for starting up the node and loading the correct software — must now verify that the OS matches the expected hash before anything else can run. That means redesigning the OS boot process from the ground up, with a strong focus on cryptographic assurance.

Beyond the internal mechanics, the way nodes interact is also changing. With SEV in place, nodes will be able to check if their peers are running SEV-enabled replicas — and if they’re not, the connection stops there. By default, data won’t be shared with nodes that don’t meet the new standard, cutting off any unapproved or potentially insecure participants from the network. That raises the bar for trust and integrity, but it also adds complexity to connection and verification protocols.

This isn’t a cosmetic polish or a mild performance tweak. The switch to SEV touches nearly every part of the infrastructure, and getting it right demands careful planning, testing, and coordination. Stability is a top priority, and no one wants to introduce security at the cost of uptime or user data. To that end, the team is taking a methodical approach, building out the necessary support structures and keeping a close eye on performance and safety throughout the rollout.

The upside is a system that can promise higher security guarantees, tighter control over upgrades, and an architecture that’s more resistant to tampering. It also sets the stage for a more fine-grained trust model across the network, where replicas can make smart decisions about who to trust — and who to avoid.

This move may not be loud or flashy, but it reflects a deliberate step towards strengthening the Internet Computer’s foundations. It’s a technical challenge with very real consequences, and one that will shape how secure, reliable, and flexible the network is in the years to come.

More updates are expected as testing begins in earnest. For now, SEV support remains a work-in-progress — a carefully engineered shift in how the Internet Computer handles itself behind the scenes.

Cryptography Kudos: DFINITY’s CTO Takes Top Honour


DFINITY’s Chief Technology Officer, Jan Camenisch, has received the 2025 PKC Test-of-Time Award alongside fellow cryptographers Markulf Kohlweiss and Claudio Soriente. This prestigious recognition honours research in public key cryptography that has stood the test of time, influencing both theoretical foundations and practical applications.

The award, decided by a panel of experts, acknowledges the trio’s sustained contribution to advancing cryptographic methods, a vital component for secure digital communication and blockchain technologies. Their work has become fundamental within the field, helping shape how secure systems are designed and maintained today.

Jan Camenisch’s role at DFINITY highlights the company’s strong position in cryptographic innovation, underlining their commitment to pushing the boundaries of security in blockchain development. The team’s achievements reinforce their reputation as leaders in the evolving landscape of decentralised technology.

The recognition is a testament to the long-term impact that rigorous research can have, providing tools and frameworks that continue to underpin secure digital interactions worldwide. It shines a spotlight on the individuals whose work quietly but powerfully supports advancements in blockchain and cryptography.

This accolade adds to DFINITY’s growing list of successes, reinforcing the organisation’s influence across the technical and academic communities. Jan, Markulf, and Claudio’s pioneering efforts demonstrate how foundational research remains relevant as technology evolves, inspiring future innovation.

The award-winning research continues to inspire cryptographers and blockchain developers alike, serving as a benchmark for quality and durability in the field. It also reflects a broader commitment to secure digital ecosystems, an essential factor for the growth and trustworthiness of decentralised networks.

Celebrations within the blockchain space mark this achievement as a proud moment, recognising how cryptography forms the backbone of trust and security in digital finance and beyond. The ongoing work by DFINITY and its collaborators is expected to drive further breakthroughs, maintaining momentum in a rapidly changing sector.

By honouring Jan Camenisch and his colleagues, the PKC community not only acknowledges past excellence but also encourages ongoing dedication to research that shapes the future of secure communication. Their work underpins much of today’s cryptographic infrastructure, benefiting users and developers around the globe.

This recognition cements DFINITY’s position at the intersection of academic excellence and practical innovation, showing how sustained effort in cryptography fuels progress across blockchain technology. It provides a reminder of the critical role researchers play behind the scenes in securing digital systems and advancing the field.

With this award, DFINITY’s CTO joins a select group of cryptographers whose contributions have become essential references, inspiring confidence and innovation in the cryptographic community and beyond. It’s a milestone that celebrates enduring impact and opens the door to future achievements in secure digital technology.

KongSwapX Gets a Backend Boost and Picks Up Speed

KongSwapX has quietly rolled out a significant backend upgrade, and it’s already making waves across the decentralised exchange (DEX) scene. At the heart of this update is a sharper focus on ICRC-3 tokens, including newer arrivals like $TAGGR and assets tied to SNS projects. With these improvements, the platform aims to make trading faster, leaner, and more responsive—an upgrade that’s likely to matter as traffic and token variety continue to grow.

ICRC-3 tokens bring a layer of flexibility and granularity that earlier standards didn’t fully cover. KongSwapX’s backend revamp means these tokens can now be processed with a level of fluency that wasn’t previously possible. While the protocol specifics may read like alphabet soup to the average onlooker, to developers and active users, it signals a tightening of gears behind the scenes. The platform isn’t just making room for ICRC-3; it’s tuning itself to run better with them.

SNS-linked projects are another part of this refreshed toolkit. As community-owned projects under the Service Nervous System model pick up momentum, having DEX infrastructure that can keep up is essential. KongSwapX has essentially prepared the stage for tokens that are more dynamic, reflecting both governance shifts and evolving user patterns. Compatibility was always a check box, but now performance gets its spotlight.

The upgrade is powered by Internet Computer Protocol ($ICP), which serves as the digital engine under the bonnet. Unlike other ecosystems that rely on multiple chains, layers, or bridges, the ICP environment offers a more streamlined path for integrations. KongSwapX is betting on this advantage, pushing its performance while avoiding the sprawl that can come with traditional Web3 architecture.

Where earlier exchanges built around token swaps and liquidity pools had to compromise between convenience and decentralisation, this upgrade tries to cut that middle ground entirely. The aim is to handle more complex operations, such as advanced metadata or permission schemes tied to ICRC-3, without dragging down speed. Users aren’t expected to notice a dramatic change in interface just yet, but the background code has shifted enough to anticipate heavier demand and a broader set of use cases.

Performance testing was a large part of the work leading up to this rollout. The backend now processes data and transactions more efficiently, allowing it to deliver on-chain responses faster even under higher pressure. That’s particularly important for platforms like KongSwapX, which need to process requests in real-time across varied devices, wallets, and applications. With ICRC-3 token functionality more tightly integrated, the new codebase is less likely to choke on token structures that include extra fields, extended use permissions, or richer metadata.

The move is part of a wider trend in the DEX world, where simply supporting a token standard isn’t enough—supporting it with speed and stability is what draws developers and liquidity providers. With more SNS tokens and niche projects entering the market, the ability to handle those smart contracts without manual tweaks or clunky updates matters a lot more than it did a year ago. KongSwapX’s engineering team appears to have focused on getting ahead of that curve.

It’s not just about new tokens finding a place to trade. The update also lays the groundwork for smarter integration with dApps that rely on flexible token mechanics. If you’re building a tool that uses ICRC-3 assets for voting, staking, or tiered access, you need a DEX that won’t stumble over how those tokens are defined. KongSwapX is positioning itself as that reliable middle layer—a sort of interpreter between token logic and trader action.

The $TAGGR token stands out as an early test case. As one of the more prominent tokens using ICRC-3 features, it demands real-time recognition and fast pairing. Following the upgrade, it’s likely to see more volume moving across KongSwapX thanks to the smoother backend processes. That kind of throughput matters both for price discovery and for encouraging new communities to explore decentralised trading options.

One of the more technical yet vital improvements in this backend shift is how it handles data payloads. ICRC-3 tokens can carry extra information, including condition-based triggers or unique identity flags. Earlier versions of DEX engines weren’t always designed to interpret or respect these additional fields properly. Now, with the backend architecture refactored to match the latest token patterns, errors and mismatches should significantly drop.
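
In practice that means a backend has to tolerate open-ended metadata rather than hard-coding fields. An illustrative shape, with hypothetical names rather than the ICRC-3 wire format:

```typescript
// Illustrative metadata shape; hypothetical names, not the ICRC-3 wire format.
type MetadataValue = { Text: string } | { Nat: bigint } | { Blob: Uint8Array };

interface ExtendedTokenRecord {
  symbol: string;
  decimals: number;
  metadata: Array<[string, MetadataValue]>; // open-ended key/value entries
}

// A backend that iterates entries instead of hard-coding fields will not
// choke when a token ships keys it has never seen before.
function findEntry(token: ExtendedTokenRecord, key: string): MetadataValue | undefined {
  const hit = token.metadata.find(([k]) => k === key);
  return hit ? hit[1] : undefined;
}
```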

There’s also the broader ICP community to consider. As the token economy within the Internet Computer grows, having stable, low-latency trading venues that fully understand the local standards becomes more critical. KongSwapX is not just upgrading for itself—it’s upgrading as part of a network that is leaning into decentralisation while trying to keep pace with user expectations. The DEX is essentially saying, “we’ve got the power and the tools, now bring your projects here.”

With the integration work done and stability benchmarks reportedly exceeded, the next challenge is adoption. Backend upgrades aren’t flashy, but they often unlock features that ripple outward into user experience. Expect the frontend team to gradually introduce visual cues, minor UI tweaks, and smarter analytics tools that draw from the new token data they’re now able to handle more fluently.

ICRC-3 support is a step towards broader interoperability too. Projects working across ICP and beyond can now design with more assurance that their tokens will function as expected once they hit KongSwapX. While bridges and cross-chain communication are still ongoing topics across Web3, having strong single-chain support is a necessary start. A robust internal standard often becomes the blueprint for outward-facing features.

There’s more coming, though, and this time it’s wrapped in chain abstraction. KongSwapX has announced that it will begin using $ICP’s chain fusion technology to explore what many see as the future of blockchain: seamless interaction between different chains without added complexity. The goal here is to move beyond compatibility and towards composability—where multi-chain DeFi doesn’t feel like a patchwork but behaves as if it’s all built under one protocol roof.

Chain abstraction changes how DEXs can work. It allows for DeFi platforms to interact with assets from multiple chains while hiding the complexity from users. Instead of switching networks, bridging assets, or using different wallets for different protocols, users would interact with KongSwapX in the same way they always have. Behind the curtain, though, the protocol would be working across several chains, thanks to ICP’s chain fusion.

For builders and power users, this means reduced friction and increased creative possibilities. Multi-chain DeFi won’t need to rely on external bridges or connectors that create bottlenecks. Frictionless, as KongSwapX describes it, is exactly what users expect from financial tools—especially when moving quickly is part of the strategy.

The ability to manage these complexities while offering a smooth front-end experience is part of what positions KongSwapX well for the next wave of DeFi innovation. It’s not trying to rebuild Web3 from scratch—it’s making the infrastructure more modular, more reliable, and more aligned with where the developer ecosystem is heading.

The industry has seen many DEXs scale quickly and crumble just as fast under architectural limits or neglected backend work. KongSwapX seems to be taking the slower, sturdier approach—build first, scale next. There are fewer fireworks in this method, but the platform’s goal of handling everything from simple swaps to advanced asset governance without delay is ambitious in a good way. The $ICP engine underneath is working harder, and that’s the sort of thing developers like to hear.

Future plans weren’t part of this announcement, but the engineering focus hints at more SDKs and toolkits coming soon. If KongSwapX opens up more of its integration points, we could see a broader set of dApps layering on top of its token routing and trade execution logic. That would place it closer to infrastructure status than just a market venue, particularly as ICRC-3 adoption increases.

As Web3 projects look for DEX partners that won’t buckle under new standards, KongSwapX’s timing feels right. A lot of attention is still on flashy NFTs and DePIN plays, but quietly, infrastructure is having its turn. With this backend upgrade live and chain abstraction in the works, KongSwapX is ready to carry a bit more of the load. Whether users notice a difference or just enjoy fewer hiccups is beside the point—what matters is that the pipes are cleaner, the logic sharper, and the road ahead far more connected.