Apple silicon chip technology is being integrated into io.net, a recently launched decentralized physical infrastructure network (DePIN), for machine learning (ML) and artificial intelligence (AI) applications. To support ML and AI computation, io.net has built a decentralized network on Solana that sources graphics processing unit (GPU) compute from geographically dispersed data centers, cryptocurrency miners and decentralized storage providers.
The company, which recently announced a partnership with Render Network, unveiled its beta platform at the Solana Breakpoint conference in Amsterdam in November 2023. With its latest update, io.net says it is the first cloud service to offer Apple silicon chip clustering for machine learning, letting engineers cluster Apple chips from anywhere in the world for ML and AI computing.
As Cointelegraph has previously reported in depth, io.net provides inexpensive GPU compute for AI and ML applications. The platform uses Solana’s blockchain to facilitate payments to GPU and central processing unit (CPU) compute providers.
Tory Green, chief operating officer of io.net, says Solana’s infrastructure is purpose-built to handle the volume of transactions and inferences io.net will enable. The infrastructure aggregates GPU processing power into clusters, with the hardware handling hundreds of inferences and the associated microtransactions. With the update, io.net users can now contribute processing power from a wide range of Apple silicon chips: the M1, M1 Max, M1 Pro and M1 Ultra; the M2, M2 Max, M2 Pro and M2 Ultra; and the M3, M3 Max and M3 Pro.
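io.net’s worker onboarding flow is not described here, but purely as an illustration of the contributor side, the sketch below reads the chip identifier on a macOS machine so it could report which of the supported M-series models it offers; the reporting step itself is hypothetical.

```python
import subprocess

# On Apple silicon Macs, the CPU brand string names the chip generation,
# e.g. "Apple M1 Max" or "Apple M3 Pro".
chip = subprocess.run(
    ["sysctl", "-n", "machdep.cpu.brand_string"],
    capture_output=True, text=True, check=True,
).stdout.strip()

# A real io.net worker would register this with the network; printing it
# here simply stands in for that (hypothetical) reporting step.
print(f"Detected Apple silicon chip: {chip}")
```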
According to io.net, the 128-gigabyte unified memory architecture of Apple’s M3 chips exceeds the capacity of Nvidia’s flagship A100-80GB graphics cards. Io.net also notes that Apple’s M3 chips feature an enhanced neural engine that is 60% faster than the M1 series.
The chips’ unified memory architecture also makes them well suited to model inference, the process of running real-time data through a trained AI model to generate predictions or solve problems. According to io.net founder Ahmad Shadid, adding Apple chip compatibility could help hardware supply keep pace with the growing demand for AI and ML compute:
“This is a massive step forward in democratizing access to powerful computing resources, and paves the way for millions of Apple users to earn rewards for contributing to the AI revolution.”
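io.net has not published the exact inference stack it runs on these chips; purely as a sketch of the kind of workload involved, the example below runs a small placeholder model on an Apple silicon GPU through PyTorch’s Metal (MPS) backend, which keeps tensors in the chip’s unified memory. The model and input data here are stand-ins, not io.net code.

```python
import torch
import torch.nn as nn

# Use Apple silicon's GPU via the Metal Performance Shaders (MPS) backend
# when available; fall back to CPU otherwise.
device = torch.device("mps" if torch.backends.mps.is_available() else "cpu")

# A small stand-in model; a real deployment would load a trained network.
model = nn.Sequential(
    nn.Linear(128, 256),
    nn.ReLU(),
    nn.Linear(256, 10),
).to(device).eval()

# Simulated real-time input batch living in the chip's unified memory.
batch = torch.randn(32, 128, device=device)

with torch.no_grad():  # inference only, no gradient tracking
    predictions = model(batch).argmax(dim=1)

print(predictions.cpu().tolist())
```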
With the added hardware support, millions of Apple device users can now contribute chips and processing power to AI and ML use cases.