
Congress Should Start Planning for a Potential AI Crash Now, New Report Says

Trillions of dollars of capital investment in AI infrastructure are propping up the U.S. economy, yet revenues from AI are not growing at a rate commensurate with this investment. This overreliance on AI investment, coupled with the opaque and complex financial engineering involved, means that the cost of a market correction could resemble 2008: an economy-wide crash with unpredictable and systemic consequences. If such a crash comes to pass, Congress would likely consider legislation to reform the market. Instead of scrambling to identify the right policies during an economic crash, Congress should start seriously planning for a potential AI crash now, according to a new report from Asad Ramzanali, VPA Director of AI and Technology Policy.

Ramzanali offers a set of proposals for post-AI crash reforms. These include:

  • First, Congress should curtail the financial engineering (circular equity investments, opaque debt, and distortive government subsidies) that may be the proximate cause of a crash, and the government should prosecute any related fraud and illegal activity. All of the major chipmakers, hyperscalers, and AI model makers are entangled in a web of vendors owning equity in their customers, a novel form of circular equity in which a business problem at one firm could cascade through the industry. Congress can prohibit this form of financing directly. On top of that, the widespread use of private credit-backed special purpose vehicles (SPVs), asset-backed securities, highly leveraged “neocloud” companies, and state-level tax loopholes obscures how much debt is involved and how exposed the public is to AI infrastructure. Congress should prohibit large debt financings that do not disclose their true sources of capital, end undisclosed debt-shifting, and require disclosure of all data center deals. Congress should also end the race to the bottom among state and local governments offering ever-larger tax breaks for data center construction. Finally, almost every previous bubble and crisis involved fraud that led to significant federal criminal prosecutions and prison time for those who defrauded the public; the 2008 crisis is the famous exception, and Congress should not let an AI crash repeat it.
  • Congress should turn data centers that become stranded assets into a public cloud and sustain AI research and development (R&D) for public purposes. In a financial crash, SPVs and neoclouds could reach various stages of insolvency. Congress can authorize agencies to purchase these stranded assets and build a “public cloud” that provides a public option for computing infrastructure. Separately, much of the R&D into uses of AI with a public purpose, such as drug discovery, disaster planning, and energy sustainability, is funded by companies that might cut research budgets during a crisis. Congress should sustain that research funding.
  • Congress should protect workers by expanding unemployment insurance, creating a digital Works Progress Administration (WPA), and limiting worker surveillance. Significant job losses could be both a cause and an effect of an AI crash, and companies are already engaging in “AI-washing,” cutting headcount in anticipation of AI reducing the need for labor (or blaming AI for layoffs they had already planned). As in prior crises, Congress should expand unemployment insurance and relax work requirements on other social safety net programs. A bolder response to mass unemployment would be a mass employment program, such as a digital counterpart to the WPA created after the Great Depression. Finally, during a crash, AI-enabled worker surveillance is likely to increase as a means of extracting productivity. Congress should limit surveillance practices that leave workers worse off.
  • Congress should reform AI markets by establishing a Glass-Steagall for AI, utility-style regulations for digital utilities, a new regulatory agency, and a ban on surveillance-based business models. Following the Great Depression, Congress passed Glass-Steagall, which separated commercial from investment banking; its repeal in the late 1990s set the stage for the 2008 financial crisis. Today, fervent demand for AI software is driving investment in AI infrastructure (chips, data centers), but that investment is distorted because the same few companies often own or invest in both sides of the market, driving overinvestment in infrastructure. Congress should structurally separate algorithms (AI models and software) from data centers and hardware. Congress should also impose utility-style regulations on the modern digital utilities required for AI innovation (e.g., foundation models, cloud computing, and chip design). These market rules would require a new digital regulator to administer. Finally, a financial crash would incentivize a further search for extractive business models. Just as the collapse of the dot-com bubble hastened surveillance advertising, an AI crash may hasten surveillance-based pricing and wage-setting (and further entrench surveillance ads). Congress should ban those practices directly as part of a broader privacy regime.

Some argue that today’s AI investment is not a bubble at all, but there are enough warning signs for policymakers to be concerned. Instead of waiting for this much-anticipated crisis to arrive, Congress should take the opportunity to plan for it now.

Read the full paper, or learn more about the proposal on VPA’s Substack.
