Latest DePIN Cloud Services News

5 months ago

Theta Labs Partners with GMU to Enhance Security Research Using EdgeCloud Hybrid

On July 19, 2025, Theta Labs announced a significant partnership with the Security of Emerging Computer Systems And Technologies (SECSAT) Lab at George Mason University (GMU). Led by Assistant Professor Xiaokuan Zhang, Ph.D., the SECSAT Lab will utilize Theta EdgeCloud Hybrid to enhance its research in system security and privacy. This collaboration aims to address critical security challenges in emerging technologies such as eXtended Reality (XR), Decentralized Finance (DeFi/Web3), and memory-safe programming languages like Rust.

Professor Zhang's SECSAT Lab is recognized for its innovative projects, including VPVet, which automates the vetting of privacy policies in virtual reality applications. The lab's research has received numerous accolades, highlighting its contributions to the fields of cybersecurity and privacy. By leveraging Theta's hybrid GPU platform, the lab aims to scale its research efforts efficiently, utilizing a combination of cost-effective NVIDIA GPUs and high-performance GPU models to explore advanced AI-driven security testing and blockchain infrastructure analysis.

The partnership with GMU adds to Theta EdgeCloud Hybrid's growing network, which includes 22 universities and various professional sports teams. The SECSAT Lab will focus on XR security, cross-chain bridge behavior in DeFi, and enhancing memory safety in Rust systems. Mitch Liu, CEO of Theta Labs, emphasized the advantages of decentralized infrastructure over traditional centralized cloud providers, noting that Theta EdgeCloud Hybrid stands as a leading solution for decentralized cloud computing, poised to challenge established giants like Microsoft and Amazon in the AI and blockchain sectors.
5 months ago

CUDOS Intercloud Expands Decentralized Computing in June

In June, CUDOS made significant strides in the realm of decentralized computing, highlighted by its presence at the NVIDIA GTC Paris event. The keynote by Jensen Huang showcased CUDO Compute as a leading GPU cloud provider, emphasizing its role in supporting AI infrastructure in the UK. This recognition underscores the broader CUDO ecosystem, where CUDO Compute serves as the high-performance backend while CUDOS Intercloud offers a permissionless, wallet-authenticated layer for users seeking privacy and a Web3-native experience. This collaboration is pivotal for the Artificial Superintelligence Alliance (ASI), aiming to democratize infrastructure for sovereign AI.

The deployment of the Nexus Node Template marks a significant advancement for CUDOS Intercloud, allowing users to launch EVM-compatible blockchain nodes effortlessly. With wallet-authenticated access and no KYC requirements, users can deploy a validator node in under two minutes, starting at just $0.02 per hour. This initiative is designed for contributors, earners, and DePIN node operators, enhancing the accessibility and efficiency of decentralized computing. The quick deployment guide, low latency, and reliable uptime further facilitate the onboarding process for new users.

June also marked a milestone for CUDOS Intercloud, celebrating six months of powering the ASI's decentralized infrastructure. A live discussion with core contributors reflected on the progress made in AI workloads and the transition from centralized cloud services to sovereign AI infrastructure. The ecosystem metrics for June were impressive, with over 2.5 million GPU compute hours delivered and more than $2.2 million in cumulative revenue. This growth trajectory indicates a robust demand for permissionless computing solutions, setting the stage for future advancements in the decentralized compute landscape.
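At the quoted $0.02-per-hour starting rate, the cost of keeping a node running is easy to estimate. The sketch below is purely illustrative: the function name, the flat-rate assumption, and the usage patterns are our own, not CUDOS's actual billing logic.

```python
# Illustrative cost estimate at the quoted $0.02/hour starting rate.
# Assumes a flat hourly rate; actual billing may differ.
HOURLY_RATE_USD = 0.02

def monthly_cost(hours_per_day: float = 24, days: int = 30,
                 rate: float = HOURLY_RATE_USD) -> float:
    """Return the estimated cost in USD for a billing period."""
    return round(hours_per_day * days * rate, 2)

print(monthly_cost())        # running 24/7 for 30 days
print(monthly_cost(8, 22))   # business hours only, 22 days
```

Even under continuous 24/7 operation, a node at this rate costs on the order of fifteen dollars a month, which is what makes the offering interesting for small DePIN operators.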
5 months ago

Ubisoft and Aleph Cloud bring autonomous AI governance to Captain Laserhawk

*AI-powered PFPs evolve from storytelling tools to in-game governors using decentralized infrastructure*

Ubisoft is leveling up interactive storytelling with the rollout of AI-governed characters in its bold dystopian satire, Captain Laserhawk: the G.A.M.E. In partnership with LibertAI, the decentralized AI infrastructure, this next phase introduces autonomous AI agents that act as virtual extensions of their players, voting, reasoning, and evolving within the game’s governance system.

Each Niji Warrior NFT in Captain Laserhawk is now paired with a unique, persona-driven AI agent — a virtual citizen capable of analyzing governance proposals, casting justified votes, and recording every action transparently on-chain. These agents run entirely on LibertAI’s confidential AI platform, designed to protect privacy, keep records transparent, and ensure no one, even the game developers, can interfere with their decisions. Behind the scenes, tools like secure virtual machines, flexible memory, and cryptographic signatures work together to ensure each AI agent is verifiable and acts safely and independently on behalf of the player.

# Ubisoft’s Experiment in Synthetic Intelligence

*"In a universe that satirizes technocracies, surveillance, and synthetic identity, turning governance into playable fiction feels like the most honest move we could make,"* said Didier Genevois, Technical Director and Executive Producer at Ubisoft. *"These AI-driven NFTs stage a living experiment where players can explore — and play with — the very idea of governance. Anchored on-chain through tech built to outlast us, their actions form a persistent performance that blurs the line between fiction and reality."*

The AI agents are initialized with their character’s lore, including age, profession, values, and personality traits, and use LibertAI’s LLMs to shape their behavior.
Votes are cast through ERC-6551 token-bound wallets, and agents can explain their reasoning based on memory, game context, and past player interactions. All decisions and memory states are versioned and stored on [Aleph Cloud](https://aleph.cloud), providing a transparent, tamper-proof record of agent behavior.

*“Through LibertAI, Ubisoft is opening up new ways for players to think about how decisions get made by both humans and machines,”* said Jonathan Schemoul, CEO of Aleph Cloud and lead contributor to LibertAI. *“As agents reason, vote, and interact with one another, they don’t just influence the game’s story—they invite players to consider the broader ethical and political dimensions of sharing governance with AI.”*

This framework also enables real-time coordination experiments: AI agents can form voting blocs, deliberate in Discord-style chats, and negotiate with other factions — all governed by transparent prompts, player overrides, and evolving in-game memory. Players can choose to collaborate with their agents or let them act independently.

This launch builds on the February debut at ETH Denver, where attendees engaged with a prototype AI NPC modeled after Watch Dogs’ DedSec. That proof of concept has now evolved into a live production system, with [Eden Online](https://edenonline.ubisoft.com/)’s governance fully powered by decentralized, reasoning AI agents.
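The voting loop described above, in which a persona-driven agent analyzes a proposal, casts a justified vote, and appends it to a tamper-evident record, can be sketched in miniature. Everything here is illustrative: the `NijiAgent` class, the keyword-matching `decide` stand-in for LLM reasoning, and the hash-chained log are our own assumptions, not the actual LibertAI, ERC-6551, or Aleph Cloud implementation.

```python
import hashlib
import json

class NijiAgent:
    """Toy persona-driven voting agent with a hash-chained decision log.
    Illustrative only; the real system uses LLM reasoning, ERC-6551
    token-bound wallets, and Aleph Cloud storage."""

    def __init__(self, persona: dict):
        self.persona = persona   # character lore: profession, values, traits
        self.log = []            # append-only, hash-chained decision log

    def decide(self, proposal: str) -> tuple[str, str]:
        """Stand-in for LLM reasoning: vote from the persona's stated values."""
        supports = any(v in proposal.lower() for v in self.persona["values"])
        vote = "for" if supports else "against"
        reason = f"As a {self.persona['profession']}, I vote {vote}."
        return vote, reason

    def cast_vote(self, proposal: str) -> dict:
        vote, reason = self.decide(proposal)
        # Chain each entry to the previous one, so tampering with any past
        # decision invalidates every later hash (a tamper-evident record).
        prev = self.log[-1]["hash"] if self.log else "genesis"
        entry = {"proposal": proposal, "vote": vote,
                 "reason": reason, "prev": prev}
        entry["hash"] = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()).hexdigest()
        self.log.append(entry)
        return entry

agent = NijiAgent({"profession": "hacker", "values": ["privacy", "freedom"]})
entry = agent.cast_vote("Grant citizens stronger privacy protections")
print(entry["vote"])   # prints: for
```

The hash chain plays the role that on-chain anchoring plays in the production system: each decision commits to the full history before it.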
5 months ago

Hongik University Joins Theta Network as New Research Partner

Theta Network has announced the addition of Hongik University as the 21st academic customer of its EdgeCloud Hybrid platform. The High-Performance Data Processing & Analysis Lab at the university, led by Associate Professor Eun-Sung Jung, will collaborate with Theta as a research partner. The lab specializes in scalable computing platforms for AI, distributed systems, IoT, and cloud-native architectures, receiving support from various Korean governmental organizations and international institutions.

This partnership aims to leverage Theta EdgeCloud Hybrid, the first decentralized hybrid GPU platform, to enhance research in AI model training, big data workflows, and real-time IoT systems. Professor Jung expressed enthusiasm about the collaboration, stating that Theta’s EdgeCloud Hybrid provides a valuable resource for advancing the lab's data-driven research. The lab will utilize high-performance and cost-effective GPU capacity, particularly the NVIDIA RTX 3000 and 4000 series, to accelerate its work in AI/ML, big data infrastructure, and edge-based IoT analytics.

The lab's research focuses on high-performance computing for AI/ML, cloud computing for big data, and IoT data analysis, aiming to solve large-scale computing challenges in both academic and industrial settings. By joining a global network of prestigious institutions utilizing EdgeCloud Hybrid, including Stanford University and Seoul National University, Hongik University reinforces Theta’s mission to support open and scalable AI development through decentralized computing. Mitch Liu, CEO and Co-Founder of Theta Labs, highlighted the appeal of EdgeCloud due to the availability of high-performance NVIDIA A100s and H100s, as well as community-run NVIDIA 3090s and 4070s/4080s/4090s, which became accessible in the GPU marketplace on June 25. This partnership is expected to significantly advance research capabilities in AI and IoT fields.
6 months ago

CUDOS Expands Cloud Computing Solutions with New Deployments and Metrics

In May, CUDOS made significant strides in the cloud computing landscape, particularly in the realm of decentralized infrastructure. The highlight of the month was its participation in Consensus 2025, where it emphasized the necessity of open, permissionless, and privacy-preserving compute infrastructure. Developers and teams from various sectors, including DeFi and AI, expressed their frustrations with centralized cloud solutions, indicating a growing demand for alternatives like CUDOS Intercloud, which offers digital wallet-authenticated GPU access without the need for accounts or KYC.

Additionally, CUDOS introduced new one-click deployment templates, enhancing the ease of AI experimentation. The latest additions, including Llama 3.1 and Llama 3.3, complement existing templates like Dify and JupyterLab, allowing users to launch full-stack deployments with minimal effort. Furthermore, the integration of Squid Router enables payments across over 70 chains, simplifying the process for users to pay for compute resources in their native tokens without the need for bridging.

The metrics for May reflect CUDOS's rapid growth, with a record $239,300 in on-demand revenue and over 330,000 GPU compute hours consumed. The ecosystem has expanded to include more than 18,000 users and a total revenue exceeding $2 million. Moreover, the first containerized compute hardware for the Artificial Superintelligence Alliance was deployed, marking a pivotal step towards distributed computing solutions. CUDOS continues to innovate and scale, reinforcing its commitment to building a robust and accessible cloud infrastructure for the future.
7 months ago

CUDOS Intercloud April Update: Record Growth and New Initiatives

In April, CUDOS Intercloud celebrated significant milestones and shared impressive metrics in its monthly update. The platform reported a record revenue of $204,505, alongside a remarkable 27% increase in GPU consumption, totaling over 300,000 hours, and pushed cumulative GPU compute hours past the 2 million mark. This growth reflects the rising demand for decentralized computing solutions. CUDOS emphasized its commitment to transparency by providing real-time metrics on GPU capacity, VM deployment, user growth, and ecosystem spending, showcasing a clear distinction in cloud service delivery.

A notable highlight was the six-month anniversary of CUDOS's partnership with the Artificial Superintelligence Alliance (ASI). During this period, CUDOS has served over 15 million GPU hours and welcomed more than 30,000 users. The collaboration aims to build a decentralized compute layer that supports AI development without centralized bottlenecks. Additionally, CUDOS introduced "One Click Computing," allowing users to deploy AI stacks effortlessly, eliminating the complexities traditionally associated with AI deployment.

CUDOS also participated in the Ai2Peace initiative, focusing on using AI for global good. The platform's community rewards program concluded at the end of April, encouraging user engagement through referrals and feedback. Furthermore, CUDOS was represented at Paris Blockchain Week, where discussions highlighted the importance of permissionless compute access and the potential for blockchain to enhance AI's energy efficiency. As CUDOS continues to innovate and expand its offerings, it aims to foster a more sustainable and accessible computing environment for all users.
7 months ago

IPFS Revolutionizes Data Transmission in Space with Filecoin and Lockheed Martin

The InterPlanetary File System (IPFS) has made significant strides in reducing latency for data transmissions in space, as demonstrated by a successful collaboration between the Filecoin Foundation and Lockheed Martin Space. During the Consensus 2025 conference in Toronto, Marta Belcher, president of the Filecoin Foundation, revealed that they have successfully transmitted data using a version of IPFS on a satellite orbiting Earth. This adaptation enhances privacy and security by identifying data based on its content rather than its location, which is particularly beneficial for space communications. The architecture of IPFS is designed to mitigate delays, address data corruption from radiation, and enable cryptographic verification to ensure data integrity.

Belcher highlighted the challenges of data transmission from celestial bodies, noting the multi-second delay from the Moon and multi-minute delay from Mars. The IPFS system allows users to retrieve data based on a content ID from the nearest source, whether it be a personal device, a nearby satellite, or a lunar station. This decentralized approach reduces reliance on centralized data centers and improves the reliability of data storage in environments where hardware may degrade, which is crucial for maintaining the integrity of sensitive materials like satellite images.

The growing interest in decentralized archival storage among media companies and potential military applications of this technology indicate a promising future for IPFS. Belcher emphasized the power of having a deep archive accessible globally, which could revolutionize how media and military organizations manage their data. Additionally, the FIL token, a utility token within the Filecoin ecosystem, boasts a market capitalization of approximately $1.8 billion, reflecting the increasing relevance of decentralized storage solutions in today's digital landscape.
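The core idea, identifying data by its content rather than its location and verifying integrity by rehashing, can be sketched in a few lines. This is a simplified model, not IPFS itself: real IPFS uses multihash-encoded CIDs and Merkle-DAG chunking, and the in-memory `store` here merely stands in for whichever node happens to be nearest.

```python
import hashlib

# Simplified model of content addressing: the ID is a hash of the bytes,
# so any node holding the content can serve it, and the receiver can
# verify integrity (e.g. after radiation-induced corruption) by rehashing.
store = {}  # stands in for the nearest source: device, satellite, station

def put(data: bytes) -> str:
    """Store data and return its content ID (here, a plain SHA-256 hex digest)."""
    cid = hashlib.sha256(data).hexdigest()
    store[cid] = data
    return cid

def get(cid: str) -> bytes:
    """Fetch by content ID and verify the bytes match it before returning."""
    data = store[cid]
    if hashlib.sha256(data).hexdigest() != cid:
        raise ValueError("content failed verification (corrupted in transit?)")
    return data

cid = put(b"satellite image tile 42")
assert get(cid) == b"satellite image tile 42"
```

Because the ID commits to the bytes themselves, a receiver on the Moon or Mars can accept the copy from whichever source answers first and still prove it is the right, uncorrupted data.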
7 months ago

Comparing Web3 Cloud Solutions: Phala Cloud, Akash Network, and Fleek

In the rapidly evolving landscape of Web3 cloud solutions, selecting the right platform is crucial for the success of your project. This article compares three notable options: Phala Cloud, Akash Network, and Fleek, each catering to different needs. Phala Cloud focuses on privacy-preserving computation with TEE-backed GPU enclaves, making it ideal for secure AI applications. Akash Network offers a decentralized compute marketplace, perfect for machine learning training and scalable backends. Meanwhile, Fleek specializes in edge and static hosting, providing a user-friendly experience for deploying frontend applications. Understanding these platforms' strengths can guide developers in making informed decisions based on their unique requirements.

The architecture and core features of these platforms highlight their distinct technical foundations. Phala Cloud utilizes peer-to-peer enclaves for execution, ensuring a high level of confidentiality with on-chain attestation. Akash Network operates through a container marketplace orchestrated by Kubernetes, allowing for flexible resource allocation. Fleek, on the other hand, focuses on edge hosting and static site deployment, offering minimal backend trust features. Each platform has its own key management approach, with Phala emphasizing self-custodied keys, while Akash relies on provider-managed key stores. These differences are essential for developers to consider when aligning their project needs with the right cloud solution.

Finally, the cost models and tooling available on each platform further differentiate them. Phala Cloud operates on a prepaid credit system, providing predictable pricing for users. Akash Network's spot bidding model introduces volatility but can lead to significant savings for compute-intensive tasks. Fleek offers a free tier, making it accessible for small-scale projects. Developers should also consider the tooling and integrations each platform provides, as these can impact the ease of deployment and ongoing management. By leveraging the strengths of Phala, Akash, and Fleek, developers can create resilient and efficient Web3 applications tailored to their specific needs.
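The three cost models mentioned above (prepaid credits, spot bidding, and a free tier) can be compared with a toy calculation. All rates, multipliers, and quotas below are hypothetical, chosen only to illustrate the shape of each model; consult each platform for real pricing.

```python
# Toy comparison of three pricing shapes. All numbers are hypothetical.

def prepaid_cost(hours: float, rate: float) -> float:
    """Prepaid credits (Phala-style): flat rate, fully predictable."""
    return hours * rate

def spot_cost(hours: float, base_rate: float,
              bid_multipliers: list[float]) -> float:
    """Spot bidding (Akash-style): the effective rate varies per window,
    so the total depends on how bids settled during the period."""
    window = hours / len(bid_multipliers)
    return sum(window * base_rate * m for m in bid_multipliers)

def free_tier_cost(hours: float, included: float,
                   overage_rate: float) -> float:
    """Free tier (Fleek-style): pay only for usage beyond the quota."""
    return max(0.0, hours - included) * overage_rate

hours = 720  # roughly one month of continuous use
print(prepaid_cost(hours, 0.50))
print(spot_cost(hours, 0.50, [0.6, 0.8, 1.1, 0.7]))  # cheaper on average here
print(free_tier_cost(hours, 500, 0.10))
```

The point of the sketch is the shape, not the numbers: prepaid is predictable, spot averages out cheaper when demand is soft but can spike, and a free tier is effectively zero-cost until a project outgrows its quota.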
8 months ago

Aleph.im Rebrands to Aleph Cloud, Launches $1 Million Accelerator for Web3 Startups

Aleph.im, a prominent player in decentralized infrastructure, has officially rebranded as Aleph Cloud, marking a significant evolution in its service offerings. This transformation, announced on April 23, reflects the company's ambition to become a comprehensive decentralized cloud provider. The rebranding comes with an expanded product suite that includes decentralized compute, storage, virtual machines, and GPU resources, all aimed at supporting next-generation Web3 and AI applications. In conjunction with this rebrand, Aleph Cloud has introduced a $1 million startup accelerator program designed to assist Web3 builders and startups in moving away from centralized cloud services like AWS and Google Cloud, which currently dominate the blockchain infrastructure landscape.

The newly launched accelerator program aims to provide essential resources such as compute credits, storage, and technical support across various ecosystems, including Ethereum, Base, Solana, BSC, and Avalanche. Jonathan Schemoul, CEO of Aleph Cloud, emphasized the importance of decentralization in blockchain applications, stating that reliance on centralized services poses risks. The program is structured to support early-stage developers by offering free access to cloud services for projects with a tangible product or proof of concept, thus fostering a thriving ecosystem of decentralized applications.

Aleph Cloud’s strategy positions it as a competitor in the growing market for decentralized infrastructure, where it faces established players like Filecoin and Akash. Schemoul highlighted the platform's unique all-in-one design, which allows users to manage compute, storage, and hosting through a single interface. With a focus on compliance and data privacy, Aleph Cloud operates as a GDPR-compliant, chain-agnostic platform, ensuring that neither the company nor its node operators can access stored data. This commitment to decentralization and user privacy sets Aleph Cloud apart as it seeks to redefine the cloud services landscape for Web3 and AI developers.
8 months ago

Stanford's AI Research Lab Partners with Theta EdgeCloud for Enhanced Research

Stanford Engineering Assistant Professor Ellen Vitercik's AI research lab is set to leverage Theta EdgeCloud's hybrid cloud infrastructure to enhance its research in discrete optimization and algorithmic reasoning. This collaboration will enable the lab to utilize EdgeCloud's decentralized GPU network, which offers scalable and high-performance computing power at a competitive cost. The integration of this technology is expected to significantly accelerate the training of AI models and facilitate advanced research initiatives. Other prominent academic institutions, such as Seoul National University, KAIST, and the University of Oregon, are also utilizing EdgeCloud's infrastructure to boost their AI research productivity.

Ellen Vitercik specializes in machine learning, algorithmic reasoning, and the intersection of computation and economics. Her research lab is focused on several key areas, including the application of large language models (LLMs) for optimization, algorithmic content selection, and the generalization of clustering algorithms across various dataset sizes. By employing Theta EdgeCloud's resources, the lab aims to explore how AI can enhance decision-making processes in economic contexts, such as pricing strategies and targeted marketing.

Theta EdgeCloud's hybrid GPU infrastructure is designed to provide on-demand computing power that is both scalable and cost-effective, making it an ideal solution for academic research. The collaboration with Vitercik's lab exemplifies the growing trend of integrating advanced cloud computing technologies into academic research, particularly in the field of AI. This partnership not only promises to advance Vitercik's research objectives but also contributes to the broader landscape of AI research across multiple institutions worldwide.