io.net Partners With AI Startup WOMBO To Enhance Computing Power For Its Machine Learning Models

AI avatar app WOMBO has partnered with io.net, a decentralized physical infrastructure network (DePIN), to power its machine learning (ML) models using decentralized Apple silicon chips. The strategic partnership will give the WOMBO ecosystem, including the WOMBO app, Dream, and WOMBO Me, additional compute capacity for its products through io.net’s decentralized GPU compute network.

The partnership follows io.net’s move to become the first cloud service provider to support Apple silicon chip clustering for ML applications. Announced in February, the move was aimed at offering millions of Apple users and ML engineers cheaper and more accessible GPU compute options.

As one of the pioneering partners, the generative AI startup will harness Apple silicon chip clusters to run its ML models.

“We are excited about partnering with io.net to help bring unused computing power and put it to use in groundbreaking AI applications – together, our teams have the potential to put a serious dent in the GPU supply shortage,” said WOMBO CEO Ben-Zion Benkhin.

io.net’s goal to minimize operational costs

Cloud computing ranks as one of the largest operational costs for AI and ML companies such as WOMBO. Demand in the cloud computing space grows by the day while the supply of hardware remains limited, making data storage and processing more expensive and less efficient, which means higher costs and longer timeframes for companies.

The partnership with io.net will address these issues for WOMBO, a spokesperson stated. The DePIN platform aggregates decentralized, geographically distributed GPUs, allowing companies to deploy clusters on demand at a fraction of the cost.

io.net offers some of the best-performing GPUs on the market, with over 100,000 nodes across its network. The network lets ML engineers deploy both Ray and Kubernetes clusters with thousands of GPUs rapidly and cost-effectively (a rough sketch of what a Ray workload might look like follows below). With over 200 million application downloads, WOMBO expects the partnership to “supercharge its growth” as operational costs are minimized.
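For readers unfamiliar with Ray, the snippet below is a minimal, generic sketch of how an ML engineer might fan work out across GPUs once a cluster has been provisioned. It uses only standard Ray APIs rather than any io.net-specific SDK, and the avatar-rendering task is purely illustrative.

```python
import ray

# Connect to an already-provisioned Ray cluster; "auto" picks up
# the running cluster's address from the environment.
ray.init(address="auto")

@ray.remote(num_gpus=1)
def render_avatar(prompt: str) -> str:
    # Placeholder for a GPU-bound generative step; a real workload
    # would load a model onto the allocated GPU and run inference.
    return f"rendered: {prompt}"

# Fan the work out across whatever GPUs the cluster exposes.
futures = [render_avatar.remote(f"avatar {i}") for i in range(8)]
print(ray.get(futures))
```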

WOMBO taps into Apple’s silicon chips

As mentioned above, io.net introduced a development in February that allows hundreds of millions of Apple users worldwide to contribute unused Apple chip compute resources for AI/ML use cases. WOMBO is the latest platform to leverage these capabilities and will use the pooled compute to power its ML models.

The initiative combines the Neural Engine on Apple’s chips with io.net’s mega-clustering capabilities to tap into consumer devices at scale for AI workloads.

On the partnership and WOMBO’s use of Apple silicon chips, Benkhin added:

“With over 74 million people across more than 180 countries at its peak, WOMBO is an example of a consumer AI application at scale.” 

Since its launch in December, io.net has amassed over 520,000 GPUs and CPUs, representing an infrastructure value of over $2 billion.

Disclaimer: This article is provided for informational purposes only. It is not offered or intended to be used as legal, tax, investment, financial, or other advice.
