
Centralized LLMs Will Crash By 2030

by ccadm


The business model behind today's predominant AI Large Language Models (LLMs) is going to crash by 2030 or before. It will not be for dystopian reasons, intellectual property infringement, or lack of usage. It will be because of their huge success, combined with an unsustainable, centralized, closed-source model that cannot deliver the fundamental structural needs for mass adoption. When the main LLMs crash, decentralized, open-source, and open-weight AI systems will take their place.

So far, the debate around AI has centered primarily on serious and necessary human-rights concerns, such as AI taking over jobs, data protection and privacy, and even ideological questions about the ethics of humanity's over-reliance on this new technology. Yet behind these flashy topics, which tap into our fears about the future of the human species, lies a more "boring" but pressing reality: LLMs are grappling with severe infrastructure bottlenecks, particularly in energy consumption, data storage, computing power, and cooling, all challenges that remain largely unresolved and threaten the scalability of the entire ecosystem.

AI is estimated to reach about 1.2 billion users by 2031. Already, AI consumes 230% more energy than Bitcoin does. The impact will be particularly profound in the locations where data centers are built. In Ireland, for example, data centers account for more than 20% of the country's electricity usage. On the storage side, AI is projected to require 200 zettabytes (equivalent to roughly 200 billion 1TB hard drives) to train models and retain users' information. Among other constraints, cooling systems consume up to 40% of a data center's total energy. From end to end, strain on infrastructure is the trademark of AI. Specifically, electricity demand from AI-optimized data centers is projected to more than quadruple by 2030.
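The drive-count equivalence above is simple unit arithmetic, and can be sanity-checked in a few lines (a sketch; the 200-zettabyte projection itself is the article's figure, assumed here as given):

```python
# Convert the cited 200 ZB storage projection into 1TB hard drives.
# Using decimal (SI) units: 1 zettabyte = 10**21 bytes, 1 TB = 10**12 bytes.
BYTES_PER_ZETTABYTE = 10**21
BYTES_PER_TB_DRIVE = 10**12

projected_storage_zb = 200  # the article's projected AI storage demand

drives_needed = projected_storage_zb * BYTES_PER_ZETTABYTE // BYTES_PER_TB_DRIVE
print(f"{drives_needed:,} one-terabyte drives")  # 200,000,000,000, i.e. 200 billion
```

The same ratio holds in binary (ZiB/TiB) units, since the 10**9 factor between zetta and tera is what drives the result.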

We have already entered the next stage of the Internet as the “Internet of Value” and ownership of data and digital assets, powered by the emerging triumvirate of AI, the metaverse, and blockchain. These “value systems” are poised to unlock unique 21st-Century business models that transform humanity’s relationship to technology and wealth creation forever. 

Despite being in the midst of the Internet of Value's emergence, we have overlooked the need to deploy suitable alternatives that address the gaping need for infrastructure efficiency. This is our only hope for preventing an otherwise inevitable LLM crash. If we don't act now, the repercussions of an AI-infrastructure failure will resemble the impact of the dot-com bubble burst of 2000.

Any attempt to build the Internet of Value that insists on a centralized concentration of power, within similar business models, holds an intrinsic contradiction. The pending question is: who will end up paying the mammoth price of growth that these systems require?

A good example is the "Stargate Project," which is seeking half a trillion US dollars of investment to deploy 20 hyperscale data centers. That decision will probably be enough to save one company in the short term, but in the long run a few questions arise: Who will pay back the funds being invested? Will that company become a public service? How much will consumers have to pay to monetize trillions of dollars? And this is just a starting point, since the current investment won't be enough; another couple of trillion will have to come in for LLM AI to thrive. So, how can LLMs avoid an otherwise inevitable crash by 2030? The answer is "sharing the profit." That simple approach represents a true paradigm change from the current monopolistic business model. And it's possible.

As DeepSeek has already demonstrated, breakthroughs often happen when human ingenuity is challenged. Within the Web3 arsenal, Decentralized Physical Infrastructure Network (DePIN) projects have already shown mature solutions for energy consumption, storage, and computing power. Projects like Filecoin (decentralized storage), Render Network (decentralized GPU compute), and DIMO (decentralized vehicle and mobility data infrastructure) offer real-world alternatives to centralized, resource-intensive AI infrastructure.

The AI organizations that win will be those already building more resilient approaches and investing in decentralized business models and solutions to support their AI offerings. Technology is developing fast, and individual devices like PCs, DePIN-dedicated GPUs, and mobile phones will soon all be harnessed to enhance data-processing power, storage, and AI model training.

The writing is on the wall: train people to run the nodes and support the new decentralized internet. This way we can avoid the crash while democratizing the internet, with new revenue streams for the benefit of all.


