IoTeX Mainnet v1.14.0 Upgrade Boosts Throughput to 1000 TPS

Tuesday, April 23, 2024 6:02 PM

IoTeX, a leading modular Decentralized Physical Infrastructure Network (DePIN) blockchain platform, has upgraded its Mainnet to version 1.14.0, raising throughput to 1,000 transactions per second (TPS). The upgrade makes IoTeX the sixth fastest blockchain globally and the second fastest among EVM-compatible chains. It also raises the block gas limit to 50M, significantly expanding processing capacity while maintaining chain stability. With over 114 million transactions and 29.5 million blocks processed to date, IoTeX provides a robust and scalable environment for developers building DePIN applications.

The IoTeX L1 serves as a trust source for project token economies and identity layers, offering speed, low costs, and instant finality, with transaction fees as low as a fraction of a penny. The release also improves delegate management through delegate endorsements, activated with the approval of the latest IoTeX Improvement Proposal (IIP), making delegate management more flexible and encouraging a more diverse set of projects to participate in operating the IoTeX network. Raullen Chai, CEO of IoTeX, stated: ‘This performance upgrade is a significant milestone for IoTeX, offering developers a powerful platform for building next-generation DePIN applications and solidifying IoTeX’s position as a leading blockchain for a scalable, secure future.’

IoTeX is a modular Web3 infrastructure platform that connects smart devices and real-world data to blockchains, empowering developers to integrate Web3 into everyday life. With the launch of W3bstream, the world’s first off-chain compute framework for smart devices and real-world data, IoTeX has become a leading DePIN technology provider, enabling machines, humans, businesses, and dApps to interact with trust and privacy.

Related News

6 hours ago
Ubisoft and Aleph Cloud bring autonomous AI governance to Captain Laserhawk
*AI-powered PFPs evolve from storytelling tools to in-game governors using decentralized infrastructure*

Ubisoft is leveling up interactive storytelling with the rollout of AI-governed characters in its bold dystopian satire, Captain Laserhawk: the G.A.M.E. In partnership with LibertAI, a decentralized AI infrastructure provider, this next phase introduces autonomous AI agents that act as virtual extensions of their players, voting, reasoning, and evolving within the game’s governance system. Each Niji Warrior NFT in Captain Laserhawk is now paired with a unique, persona-driven AI agent — a virtual citizen capable of analyzing governance proposals, casting justified votes, and recording every action transparently on-chain. These agents run entirely on LibertAI’s confidential AI platform, designed to protect privacy, keep records transparent, and ensure that no one, not even the game developers, can interfere with their decisions. Behind the scenes, tools like secure virtual machines, flexible memory, and cryptographic signatures work together to ensure each AI agent is verifiable and acts safely and independently on behalf of the player.

# Ubisoft’s Experiment in Synthetic Intelligence

*"In a universe that satirizes technocracies, surveillance, and synthetic identity, turning governance into playable fiction feels like the most honest move we could make,"* said Didier Genevois, Technical Director and Executive Producer at Ubisoft. *"These AI-driven NFTs stage a living experiment where players can explore — and play with — the very idea of governance. Anchored on-chain through tech built to outlast us, their actions form a persistent performance that blurs the line between fiction and reality."*

The AI agents are initialized with their character’s lore, including age, profession, values, and personality traits, and use LibertAI’s LLMs to shape their behavior. Votes are cast through ERC-6551 token-bound wallets, and agents can explain their reasoning based on memory, game context, and past player interactions. All decisions and memory states are versioned and stored on [Aleph Cloud](https://aleph.cloud), providing a transparent, tamper-proof record of agent behavior. *“Through LibertAI, Ubisoft is opening up new ways for players to think about how decisions get made by both humans and machines,”* said Jonathan Schemoul, CEO of Aleph Cloud and lead contributor to LibertAI. *“As agents reason, vote, and interact with one another, they don’t just influence the game’s story—they invite players to consider the broader ethical and political dimensions of sharing governance with AI.”* This framework also enables real-time coordination experiments: AI agents can form voting blocs, deliberate in Discord-style chats, and negotiate with other factions — all governed by transparent prompts, player overrides, and evolving in-game memory. Players can choose to collaborate with their agents or let them act independently. This launch builds on the February debut at ETH Denver, where attendees engaged with a prototype AI NPC modeled after Watch Dogs’ DedSec. That proof of concept has now evolved into a live production system, with [Eden Online](https://edenonline.ubisoft.com/)’s governance fully powered by decentralized, reasoning AI agents.
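The "tamper-proof record of agent behavior" described above can be illustrated with a minimal hash-chained log. This is an illustrative sketch only, not LibertAI's or Aleph Cloud's actual storage format: each entry's digest covers both its payload and the previous entry's digest, so editing any past vote invalidates every later link.

```python
import hashlib
import json

def _digest(payload: dict, prev_hash: str) -> str:
    # Hash the payload together with the previous entry's hash,
    # so altering any past entry breaks every later digest.
    blob = json.dumps({"prev": prev_hash, "payload": payload}, sort_keys=True)
    return hashlib.sha256(blob.encode()).hexdigest()

class VoteLog:
    """Append-only, tamper-evident log of agent decisions (illustrative)."""

    def __init__(self):
        self.entries = []  # each entry: {"payload": ..., "hash": ...}

    def append(self, agent_id: str, proposal: str, vote: str, reasoning: str):
        payload = {"agent": agent_id, "proposal": proposal,
                   "vote": vote, "reasoning": reasoning}
        prev_hash = self.entries[-1]["hash"] if self.entries else "genesis"
        self.entries.append({"payload": payload,
                             "hash": _digest(payload, prev_hash)})

    def verify(self) -> bool:
        # Recompute the chain from the start; any edit breaks the links.
        prev_hash = "genesis"
        for entry in self.entries:
            if _digest(entry["payload"], prev_hash) != entry["hash"]:
                return False
            prev_hash = entry["hash"]
        return True

log = VoteLog()
log.append("niji-042", "prop-7", "yes", "Aligns with my character's values.")
log.append("niji-042", "prop-8", "no", "Conflicts with faction strategy.")
assert log.verify()

# Retroactively changing an earlier vote is detected on verification.
log.entries[0]["payload"]["vote"] = "no"
assert not log.verify()
```

A production system would additionally sign each entry with the agent's token-bound wallet key and anchor the head hash on-chain, which is what makes the record externally auditable rather than merely self-consistent.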
4 days ago
Hongik University Joins Theta Network as New Research Partner
Theta Network has announced the addition of Hongik University as the 21st academic customer of its EdgeCloud Hybrid platform. The High-Performance Data Processing & Analysis Lab at the university, led by Associate Professor Eun-Sung Jung, will collaborate with Theta as a research partner. The lab specializes in scalable computing platforms for AI, distributed systems, IoT, and cloud-native architectures, receiving support from various Korean governmental organizations and international institutions. This partnership aims to leverage Theta EdgeCloud Hybrid, the first decentralized hybrid GPU platform, to enhance research in AI model training, big data workflows, and real-time IoT systems. Professor Jung expressed enthusiasm about the collaboration, stating that Theta’s EdgeCloud Hybrid provides a valuable resource for advancing their data-driven research. The lab will utilize high-performance and cost-effective GPU capacity, particularly the NVIDIA 3000s and 4000s series, to accelerate their work in AI/ML, big data infrastructure, and edge-based IoT analytics. The lab's research focuses on high-performance computing for AI/ML, cloud computing for big data, and IoT data analysis, aiming to solve large-scale computing challenges in both academic and industrial settings. By joining a global network of prestigious institutions utilizing EdgeCloud Hybrid, including Stanford University and Seoul National University, Hongik University reinforces Theta’s mission to support open and scalable AI development through decentralized computing. Mitch Liu, CEO and Co-Founder of Theta Labs, highlighted the appeal of EdgeCloud due to the availability of high-performance NVIDIA A100s and H100s, as well as community-run NVIDIA 3090s and 4070s/4080s/4090s, which became accessible in the GPU marketplace on June 25. This partnership is expected to significantly advance research capabilities in AI and IoT fields.
12 days ago
CUDOS Expands Cloud Computing Solutions with New Deployments and Metrics
In May, CUDOS made significant strides in the cloud computing landscape, particularly in the realm of decentralized infrastructure. The highlight of the month was the participation in Consensus 2025, where the emphasis was placed on the necessity of open, permissionless, and privacy-preserving compute infrastructure. Developers and teams from various sectors, including DeFi and AI, expressed their frustrations with centralized cloud solutions, indicating a growing demand for alternatives like CUDOS Intercloud, which offers digital wallet-authenticated GPU access without the need for accounts or KYC. Additionally, CUDOS introduced new one-click deployment templates, enhancing the ease of AI experimentation. The latest additions, including Llama 3.1 and Llama 3.3, complement existing templates like Dify and JupyterLab, allowing users to launch full-stack deployments with minimal effort. Furthermore, the integration of Squid Router enables payments across over 70 chains, simplifying the process for users to pay for compute resources in their native tokens without the need for bridging. The metrics for May reflect CUDOS's rapid growth, with a record $239,300 in on-demand revenue and over 330,000 GPU compute hours consumed. The ecosystem has expanded to include more than 18,000 users and a total revenue exceeding $2 million. Moreover, the first containerized compute hardware for the Artificial Superintelligence Alliance was deployed, marking a pivotal step towards distributed computing solutions. CUDOS continues to innovate and scale, reinforcing its commitment to building a robust and accessible cloud infrastructure for the future.
2 months ago
CUDOS Intercloud April Update: Record Growth and New Initiatives
In April, CUDOS Intercloud celebrated significant milestones and shared impressive metrics in its monthly update. The platform reported a record revenue of $204,505, alongside a remarkable 27% increase in GPU consumption, totaling over 300,000 hours. This growth reflects the rising demand for decentralized computing solutions, with cumulative GPU compute hours now surpassing 2 million. CUDOS emphasized its commitment to transparency by providing real-time metrics on GPU capacity, VM deployment, user growth, and ecosystem spending, a level of openness that distinguishes its cloud service delivery. A notable highlight was the six-month anniversary of CUDOS's partnership with the Artificial Superintelligence Alliance (ASI). During this period, CUDOS has served over 15 million GPU hours and welcomed more than 30,000 users. The collaboration aims to build a decentralized compute layer that supports AI development without centralized bottlenecks. Additionally, CUDOS introduced "One Click Computing," allowing users to deploy AI stacks effortlessly, eliminating the complexities traditionally associated with AI deployment. CUDOS also participated in the Ai2Peace initiative, focusing on using AI for global good. The platform's community rewards program concluded at the end of April, encouraging user engagement through referrals and feedback. Furthermore, CUDOS was represented at Paris Blockchain Week, where discussions highlighted the importance of permissionless compute access and the potential for blockchain to enhance AI's energy efficiency. As CUDOS continues to innovate and expand its offerings, it aims to foster a more sustainable and accessible computing environment for all users.
2 months ago
IPFS Revolutionizes Data Transmission in Space with Filecoin and Lockheed Martin
The Interplanetary File System (IPFS) has made significant strides in reducing latency for data transmissions in space, as demonstrated by a successful collaboration between the Filecoin Foundation and Lockheed Martin Space. During the Consensus 2025 conference in Toronto, Marta Belcher, president of the Filecoin Foundation, revealed that they have successfully transmitted data using a version of IPFS on a satellite orbiting Earth. This adaptation enhances privacy and security by identifying data based on its content rather than its location, which is particularly beneficial for space communications. The architecture of IPFS is designed to mitigate delays, address data corruption from radiation, and enable cryptographic verification to ensure data integrity. Belcher highlighted the challenges of data transmission from celestial bodies, noting the multi-second delay from the Moon and multi-minute delay from Mars. The IPFS system allows users to retrieve data based on a content ID from the nearest source, whether it be a personal device, a nearby satellite, or a lunar station. This decentralized approach reduces reliance on centralized data centers and improves the reliability of data storage in environments where hardware may degrade, which is crucial for maintaining the integrity of sensitive materials like satellite images. The growing interest in decentralized archival storage among media companies and potential military applications of this technology indicate a promising future for IPFS. Belcher emphasized the power of having a deep archive accessible globally, which could revolutionize how media and military organizations manage their data. Additionally, the FIL token, a utility token within the Filecoin ecosystem, boasts a market capitalization of approximately $1.8 billion, reflecting the increasing relevance of decentralized storage solutions in today's digital landscape.
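The content-addressing idea underlying the article above, naming data by a hash of its bytes so it can be fetched from any nearby source and verified locally, can be sketched in a few lines. This is a simplified illustration, not IPFS itself: real IPFS CIDs are multihash-encoded digests over chunked DAG nodes, whereas this sketch uses a plain SHA-256 hex digest as the content ID.

```python
import hashlib

def content_id(data: bytes) -> str:
    # Name the data by its hash; a simplified stand-in for an IPFS CID.
    return hashlib.sha256(data).hexdigest()

def fetch(cid: str, sources: list[dict]) -> bytes:
    # Ask each available source (local cache, nearby satellite, lunar
    # station...) for the CID, and verify the bytes before trusting them,
    # so it does not matter which peer answered or how degraded it is.
    for store in sources:
        data = store.get(cid)
        if data is not None and content_id(data) == cid:
            return data
    raise KeyError(f"{cid} not found at any source")

image = b"satellite image bytes"
cid = content_id(image)

lunar_station = {cid: image}
corrupted_peer = {cid: b"bit-flipped by radiation"}

# The corrupted copy fails verification; the intact copy is returned.
assert fetch(cid, [corrupted_peer, lunar_station]) == image
```

This is why content addressing suits high-latency, radiation-prone links: the requester never has to trust a particular server or location, only the hash it already holds.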
2 months ago
Comparing Web3 Cloud Solutions: Phala Cloud, Akash Network, and Fleek
In the rapidly evolving landscape of Web3 cloud solutions, selecting the right platform is crucial for the success of your project. This article compares three notable options: Phala Cloud, Akash Network, and Fleek, each catering to different needs. Phala Cloud focuses on privacy-preserving computation with TEE-backed GPU enclaves, making it ideal for secure AI applications. Akash Network offers a decentralized compute marketplace, well suited for machine learning training and scalable backends. Meanwhile, Fleek specializes in edge and static hosting, providing a user-friendly experience for deploying frontend applications. Understanding these platforms' strengths can guide developers in making informed decisions based on their unique requirements.

The architecture and core features of these platforms highlight their distinct technical foundations. Phala Cloud utilizes peer-to-peer enclaves for execution, ensuring a high level of confidentiality with on-chain attestation. Akash Network operates through a container marketplace orchestrated by Kubernetes, allowing for flexible resource allocation. Fleek, on the other hand, focuses on edge hosting and static site deployment, offering minimal backend trust features. Each platform has its own key management approach, with Phala emphasizing self-custodied keys, while Akash relies on provider-managed key stores. These differences are essential for developers to consider when aligning their project needs with the right cloud solution.

Finally, the cost models and tooling available on each platform further differentiate them. Phala Cloud operates on a prepaid credit system, providing predictable pricing for users. Akash Network's spot bidding model introduces volatility but can lead to significant savings for compute-intensive tasks. Fleek offers a free tier, making it accessible for small-scale projects. Developers should also consider the tooling and integrations each platform provides, as these can impact the ease of deployment and ongoing management. By leveraging the strengths of Phala, Akash, and Fleek, developers can create resilient and efficient Web3 applications tailored to their specific needs.