Power, buildings, and silicon: the real AI alpha
In the mid-1990s, the world was captivated by the information revolution. While the public focused on the first web browsers, institutional capital made a different move, quietly backing the telecom operators and utility firms laying fiber-optic cables and erecting cell towers. As in the old gold-rush analogy, they were investing in picks and shovels, not in the mines.
In 2026, the AI gold rush is peaking. Once again, most investors are chasing "wrappers": superficial software layers built on top of other people's models. The smarter ones look deeper, moving away from the apps and toward the physical bottlenecks of the digital age: energy and semiconductors.
The Wrapper Trap
Most AI projects today are flashy packaging around a large language model (LLM). These companies typically sell seat-based subscriptions for functionality that can be easily replicated, or simply bypassed, by the model providers themselves.
As a strategic investment, these wrappers are precarious. They lack a moat: how any particular model talks does not matter, because the real value of AI lies in the raw compute and power required to let it think. And as the shift from simple "chat" AI to "reasoning" AI increases energy costs by as much as 4,300%, efficient architecture becomes a prerequisite for profit. In this light, software wrappers are economically fragile: without owning the underlying hardware architecture or energy source, their margins will be consumed by the soaring cost of compute.
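A quick back-of-envelope sketch in Python makes the margin pressure concrete. The 44x multiplier follows directly from the 4,300% increase cited above; the subscription price and baseline compute cost per seat are purely hypothetical figures chosen for illustration, not data from any real company.

```python
# Back-of-envelope: how a 4,300% increase in compute cost
# (a 44x multiplier) erodes a wrapper's gross margin.
# Revenue and baseline cost figures below are hypothetical.

revenue_per_seat = 20.00    # hypothetical monthly subscription price ($)
chat_compute_cost = 0.50    # hypothetical monthly compute cost per seat ($)

reasoning_multiplier = 1 + 43.00   # +4,300% means 44x the chat-era cost
reasoning_compute_cost = chat_compute_cost * reasoning_multiplier

chat_margin = (revenue_per_seat - chat_compute_cost) / revenue_per_seat
reasoning_margin = (revenue_per_seat - reasoning_compute_cost) / revenue_per_seat

print(f"chat-era gross margin:      {chat_margin:.0%}")       # 98%
print(f"reasoning-era gross margin: {reasoning_margin:.0%}")  # -10%
```

Under these assumptions, the same $20 seat flips from a 98% gross margin to losing money outright, which is the fragility described above.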
If you are looking for the "Nvidia of tomorrow," you will find it in the companies trying to crack Nvidia's monopoly by building more efficient silicon, and in the infrastructure firms laying the physical rails for the new era of digitization.
The $1 Trillion in Silicon
By 2027, the global semiconductor industry is expected to cross the $1 trillion mark. This growth is top-heavy, driven almost entirely by data center demand for logic and memory chips.
The market has entered a phase of extreme specialization. Nvidia maintains a roughly 90% share of the GPU market used for model training, but the industry is bifurcating as it moves into the inference phase, where AI is actually used.
Custom-designed chips (ASICs) now represent the high-efficiency alternative to general-purpose GPUs. A single Nvidia GPU currently costs nearly five times as much as an average custom AI chip, and hyperscalers like Google and Meta are deploying specialized ASICs to cut their inference costs. Broadcom has emerged as the leader here, controlling roughly 70% of the AI-ASIC market.
AI performance is limited by data transfer speeds, which has transformed the memory market from a cyclical commodity into a bottleneck. High-bandwidth memory (HBM) is expected to grow into a $68 billion market by the end of 2026, driving record earnings for manufacturers like Micron Technology and SK Hynix as they redirect production away from consumer electronics and toward data centers.
Data Centers as Strategic Assets
In the cloud era, data centers were generic real estate. Now they are the factories of the digital economy.
Building the shell is not enough. The winners integrate compute, cooling, and connectivity at hyperscale. Amazon (AWS) and Microsoft (Azure) are locked in an unprecedented infrastructure arms race, with Google doubling its AI-related capital expenditure in 2026 (from $85 billion in 2025). Altogether, Big Tech plans to spend over $600 billion on AI infrastructure this year.
Perhaps most telling is Oracle's pivot. By focusing on high-performance clusters, Oracle has transformed itself from a legacy software firm into a critical AI infrastructure provider. The transition highlights a new market rule: in a world of infinite data, the physical building that can keep a GPU cluster cool and powered is the ultimate moat.
Energy Clusters
The primary limit on AI growth comes down to something simple: electricity. The world is running out of the cheap, reliable power required to run massive data centers. According to MIT Technology Review, the share of US electricity consumed by data centers may nearly triple between 2024 and 2028, from 4.4% today to 12%.
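The growth rate implied by that projection is worth spelling out. The Python sketch below uses only the figures cited above (4.4% in 2024, 12% in 2028); the compound-growth framing is an illustrative assumption, since the actual trajectory is unlikely to be smooth.

```python
# Implied compound annual growth in the data-center share of US
# electricity, from 4.4% (2024) to a projected 12% (2028).

share_2024 = 4.4   # percent of US electricity, 2024
share_2028 = 12.0  # projected percent, 2028
years = 2028 - 2024

multiplier = share_2028 / share_2024
cagr = multiplier ** (1 / years) - 1   # compound annual growth of the share

print(f"overall multiplier: {multiplier:.2f}x")         # 2.73x
print(f"implied annual growth of the share: {cagr:.1%}")  # 28.5%
```

A sustained ~28% annual increase in grid share is the arithmetic behind the scramble for dedicated power described next.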
We are seeing a migration of private capital into energy clusters: geographies where companies can generate low-cost power and build data centers directly at the source. This has triggered a nuclear renaissance in private equity; consider the recent $1.6 billion deal between Microsoft and Constellation Energy to restart the Three Mile Island reactor.
The winners of this cycle will be the firms that own the server farms and the energy providers that fuel them. Without enough fuel, AI remains a laboratory experiment.
Private Markets of “Shovels”
The capital requirements for this hardware shift are astronomical. To build the necessary pipes, initiatives like the Stargate Project plan expenditures of $500 billion, a sum larger than the inflation-adjusted cost of the Apollo program. The infrastructure companies building these energy clusters and next-generation chips are rarely found on public exchanges during their high-growth phase; by the time they IPO, the infrastructure alpha has already been harvested.
Public data cannot help you here. The infrastructure of major AI providers remains hidden behind proprietary corporate layers, and the only way to identify the winners is through private-network signals that reveal where the physical capital is actually being deployed.
This is where the private secondary market becomes essential. As institutional interest shifts from Web3 protocols to AI infrastructure, the secondary market is the only venue for acquiring stakes in the private firms that control these mission-critical assets.
The Takeaway
If history is a guide, software will eventually become a commodity. The price of intelligence will drop toward zero as models become interchangeable, but the price of a kilowatt-hour and a high-end chip will not.
The alpha of the next decade belongs to those who invest in the pipes and the power. We are moving from a software-first world back to a hardware-first world.
Oleg Ivanov, COO & Co-Founder, SecondLane