Recent media reports and commentators have raised concerns that current economic conditions may reflect excessive valuations of investments in AI technologies. Examples include:
- Opinion | Warning: Our Stock Market Is Looking Like a Bubble (NYTimes, 15 October 2025)
- Why experts are concerned there is a stock market AI bubble (The Hill, 14 October 2025)
- IMF and Bank of England join growing chorus warning of an AI bubble (CNBC, 9 October 2025)
- This Is How the AI Bubble Bursts (Yale Insights, 8 October 2025)
- Gartner Says Agentic AI Supply Exceeds Demand, Market Correction Looms (Gartner, 7 October 2025)
According to Gartner:
"The impending agentic AI market correction is distinct from speculative bubbles fueled by systemic financial engineering, fraud or policy. At this point, the underlying product, agentic AI, is sound, and the current market correction, where markets rationalize and consolidate, is a regular part of the product life cycle.
"However, a 'speculative bubble' could still form if investment becomes detached from agentic AI's intrinsic potential to deliver tangible and commensurate economic value." [cleaned up]
If the market for AI-related products and services proves to be overvalued or unsustainable, a correction or collapse—commonly referred to as an "AI bubble"—could have a significant impact on public companies, depending on the level of their investments in or dependence on AI.
Some of the risks could include:
Market Valuation Risk
Stock prices and investor interest may be artificially elevated due to perceived alignment with AI trends. A reversal in sentiment could lead to a sharp decline in market valuations, with investors questioning projections of future earnings or rates of return. In particular, AI-related stocks are reported to account for 75% of recent market gains, creating significant concentration risk.
Capital Allocation Risk
Some companies have made, and continue to make, substantial investments in AI infrastructure (especially data centers), personnel, and partnerships. For example, AI capital expenditure is reported to have overtaken consumer spending as the main driver of economic growth so far in 2025, accounting for 1.1% of GDP growth, with data centers representing almost 2% of GDP this year. If anticipated returns from these investments do not materialize, those companies may incur significant losses or impairments, requiring write-downs or restructurings.
Technology Disruption
Public company risk factors often address the threats posed by rapid technological change, frequent new product and service introductions, and evolving industry standards. AI-related companies are likely no different, as two Yale professors recently noted:
"For AI, further innovation in semiconductor chip design or major advances in quantum computing, as hundreds of billions of dollars in data center infrastructure are being deployed, would immediately leave much of that investment useless in the medium to long term. That is not to say that the spare "compute" will not be needed in the future, but just like the fiber-optic cable infrastructure, it could be years before those data center investments start generating a return for their backers." [cleaned up]
Operational Disruption
A downturn in the AI sector could lead to reduced demand for AI-related service or product offerings, delays in product development, or loss of strategic partners. For example, in response to a recent report suggesting that AI models may not be as sophisticated as assumed, one commentator explained:
"AI researchers have long worried that the impressive benchmarking results [of AI models] may be due to data contamination, where the AI training data contains the answers to the problems used in benchmarking. It's like giving a student the answers to a test before they take the exam. That would lead to exaggerations in the models' abilities to learn and generalize."
Further, trade disputes with China and Russia relating to access to rare earth minerals or critical supplies of noble gases could affect the supply of advanced semiconductor chips needed for AI supercomputers and data centers.
Regulatory and Legal Risk
Increased scrutiny of AI technologies by regulators may result in new compliance obligations, litigation exposure, or reputational harm, particularly if public sentiment turns against AI adoption. As the Yale professors wrote:
"It is not difficult to imagine a major, publicly available AI model going rogue and inflicting significant damage to financial markets or national security systems. Such an action would require a national moratorium on comparable AI models until the damage is contained and the risk mitigated."
Additionally, public companies may face litigation risk as plaintiffs' attorneys or regulators scrutinize disclosures for overly promotional language or allegedly misleading projections. They may also face pressure from activists to change management or strategy.
Further, a number of states, as well as other governmental authorities, are considering or adopting AI regulations relating to safety, deepfakes, transparency, and intellectual property, as well as sector-specific rules. Although a structured regulatory framework may benefit AI development in some respects, new rules could also have a dampening effect on growth and innovation.
Talent and Resource Constraints
A contraction in the AI market may lead to displacement of talent, as engineers and scientists pivot to other sectors or leave the field altogether; reduced access to compute and open-source tools; and increased competition for limited resources.
Contagion Risk
Even if a company's core business is not directly dependent on AI, broader market volatility stemming from an AI bubble burst could negatively impact investor confidence, access to capital, and overall economic conditions. The risk may be heightened because a small number of major AI companies represent a substantial percentage of the total S&P 500 valuation.
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.