The real estate to energy asset play for data center development
Why your next land deal is a grid interconnection


The demand for data centers continues to exceed even lofty expectations, driven by the exponential growth of AI and cloud computing. For real estate investors and developers looking at this booming sector, the opportunity is immense — but so is the learning curve.
Such a significant expansion comes at a cost: The U.S. electric utility system is facing record congestion due to the massive scale of new power demand, and power is growing harder and costlier to secure.
This has rendered the traditional approach of simply searching for suitable land obsolete. As Megan Young, PVcase VP of Client & Strategic Initiatives, explained at the recent Blueprint real estate conference, data center siting now looks far more like developing a major energy project than closing a traditional land deal.
Navigating the path to power and interconnection has become both the true source of value and the determining risk for data center projects.

Speaker panel at the Blueprint real estate conference. From left to right: Amy Polvado (Faciimax); Christopher Abramo (Mod42 LLC); Adam Kramer (Panattoni); Megan Young (PVcase).
From utility customer to grid partner
Traditional CRE vs. gigawatt-scale load
In traditional CRE development, the local utility is generally responsible for serving any new load (say, a new housing development or commercial complex). This includes covering necessary network upgrades to keep power moving smoothly through the system.
Utilities and grid operators plan for these system upgrades (building new high-voltage transmission lines, installing higher-rated equipment, building new substations, etc.) through standard transmission planning processes. Once the necessary improvements are approved, utilities recoup the costs across the entire customer base via electricity rates.
For a gigawatt-scale data center, the dynamic is entirely different. Such a facility draws roughly as much power as a large city like Denver, and connecting a load of that size without major upgrades would destabilize the grid and trigger cascading infrastructure failures across the system.
The 'you break it, you buy it' reality
Consequently, the grid infrastructure upgrades needed to keep the system stable and reliable in the face of these new loads are significantly larger and more expensive. This has raised concerns among consumers that the cost of serving these massive loads will drive up everyone's electricity rates.
In response, grid operators are rapidly evolving their large load interconnection policies, and many regions are now shifting towards a 'you break it, you buy it' mindset. In other words, the developer may be responsible for all necessary grid network upgrades across the entire system that their project triggers.
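To see why this shift matters so much, consider a rough, purely illustrative comparison (all figures below are assumptions, not numbers from the panel): an upgrade bill that is barely noticeable when spread across a utility's rate base becomes a defining line item when assigned to a single project.

```python
# Purely illustrative arithmetic: the same network upgrade bill, socialized across
# ratepayers vs. assigned to one developer. All inputs are assumptions.
UPGRADE_COST_USD = 200_000_000   # hypothetical transmission upgrade cost
RATEPAYERS = 1_500_000           # hypothetical customers in the service territory
RECOVERY_YEARS = 30              # hypothetical amortization horizon (ignores financing costs)

per_customer_monthly = UPGRADE_COST_USD / RATEPAYERS / (RECOVERY_YEARS * 12)
print(f"Socialized via rates: ~${per_customer_monthly:.2f} per customer per month")
print(f"Assigned to one project: ${UPGRADE_COST_USD / 1e6:.0f}M added to the development budget")
```

Under these assumptions the socialized cost works out to well under a dollar per customer per month, while the developer-assigned version lands the full amount on one project's pro forma.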
In October 2025, the DOE, through the Secretary of Energy, directed FERC via an Advance Notice of Proposed Rulemaking (ANOPR) to reevaluate its guidance on large load interconnection, specifically instructing the regulator to consider making large load developers “responsible for 100% of the network upgrades that they are assigned through the interconnection studies.”
The consequences of relying on traditional due diligence in this context can be dramatic:
We have seen initial upgrade estimates balloon from $20 million to over $200 million once the developer receives a full interconnection study from the utility.
Project lead times for power can stretch from the familiar 3-5 years to a decade or more.
Thus, finding the land and hoping to obtain the power is not enough. Site selection and screening must begin with the grid.
The three pillars of modern site selection
Securing interconnection is the biggest bottleneck. However, to future-proof their investments, data center developers and investors must evaluate three foundational elements — power, connectivity, and land — in an integrated manner.
With all the easy sites gone, it is crucial to evaluate each component and the tradeoffs between them in order to identify the best remaining sites.
1. Power first: beyond simple utility connection
For investors intrigued by the space, the takeaway is stark: power isn't just a utility connection; it is a foundational risk to the entire investment. You cannot select a piece of land and then "hope" you can get the power. Site selection must begin with a thorough examination of the grid itself, utilizing high-quality interconnection studies to assess load capacity (power) and potential upgrade costs before considering land.

Navigating this is incredibly complex. It requires modeling raw utility ‘base case’ data, working within each grid operator's technical methodologies, and synthesizing more than a dozen other datasets into complex grid model simulations. The output of these studies is not inherently tied to a real-world point of interconnection, so you must manually geolocate and map the study results to the actual substations and lines they correspond to.
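As a minimal illustration of that mapping step, the sketch below joins hypothetical interconnection-study results (keyed by bus name) to a hypothetical substation table with coordinates. The file names, columns, and normalization rules are assumptions; real studies typically demand much messier fuzzy matching and manual review.

```python
# Minimal sketch: geolocating interconnection-study buses against a substation table.
# File names, columns, and normalization rules are hypothetical.
import csv
import re

def normalize(name: str) -> str:
    """Crude name normalization so study bus names can match substation names."""
    name = re.sub(r"\b(SUB|SUBSTATION|STATION|\d+KV|\d+)\b", " ", name.upper())
    return re.sub(r"[^A-Z]+", " ", name).strip()

def load_rows(path: str) -> list[dict]:
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

# substations.csv: name, latitude, longitude
substations = {normalize(r["name"]): r for r in load_rows("substations.csv")}

mapped, unmatched = [], []
# study_results.csv: bus_name, available_mw, upgrade_cost_usd
for row in load_rows("study_results.csv"):
    sub = substations.get(normalize(row["bus_name"]))
    if sub:
        mapped.append({**row, "lat": sub["latitude"], "lon": sub["longitude"]})
    else:
        unmatched.append(row["bus_name"])  # flag for fuzzy matching or manual review

print(f"Mapped {len(mapped)} buses; {len(unmatched)} still need manual geolocation.")
```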
Real estate investors looking to enter the data center industry should familiarize themselves with the interconnection process and seek out platforms that integrate this power-first approach with fiber and land analysis. Relying on siloed data or low-fidelity capacity estimates is a fast track to stranded capital in this new energy-first landscape.
Beyond initial interconnection, the operational viability of a project hinges on ongoing electricity pricing. Wholesale electricity prices (Locational Marginal Prices, or LMPs) can vary by a factor of four within a single region. Since electricity is the largest ongoing cost for a data center, a poor pricing decision can cost your project hundreds of millions of dollars over its lifespan.
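A back-of-the-envelope sensitivity check makes the point concrete. All numbers below are illustrative assumptions (a 100 MW average load, a $15/MWh spread between two nodes, a 20-year life), not figures from the article:

```python
# Back-of-envelope sketch: lifetime energy cost sensitivity to nodal price (LMP).
# All inputs are illustrative assumptions.
LOAD_MW = 100           # average facility load
HOURS_PER_YEAR = 8_760
LIFESPAN_YEARS = 20
LMP_SITE_A = 45.0       # $/MWh, hypothetical average nodal price
LMP_SITE_B = 60.0       # $/MWh, hypothetical average at a pricier node nearby

def lifetime_energy_cost(load_mw: float, lmp_usd_mwh: float, years: int) -> float:
    """Total wholesale energy spend over the project lifespan, in USD."""
    return load_mw * HOURS_PER_YEAR * lmp_usd_mwh * years

delta = (lifetime_energy_cost(LOAD_MW, LMP_SITE_B, LIFESPAN_YEARS)
         - lifetime_energy_cost(LOAD_MW, LMP_SITE_A, LIFESPAN_YEARS))
print(f"Siting at the pricier node costs an extra ${delta / 1e6:,.0f}M over {LIFESPAN_YEARS} years.")
```

Even this modest spread, far below the fourfold variation mentioned above, swings lifetime energy spend by roughly a quarter of a billion dollars.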
2. Connectivity: the data highway
A data center’s purpose is to process and move data. This means scrutinizing proximity to high-capacity fiber optic cable routes and ensuring redundancy with multiple providers. Extending fiber to a remote site is possible, but the cost and timeline can impact project economics.

You need a holistic, data-driven tool to screen connectivity while evaluating tradeoffs around land suitability and power.
3. Land: the new developability
The criteria for a viable campus are now hyper-specific. Modern data centers require massive, developable acreage (some sites now exceed 1,000 acres).

But developers coming from traditional CRE must also focus on two often overlooked components:
Natural hazards: Analyzing flood, fire, and earthquake risks to ensure the asset’s long-term reliability and resiliency.
Zoning & permitting: Navigating local requirements driven by the unique size, noise, and massive power draw, which are far more complex than typical commercial developments.
The path forward
Developers traditionally analyze these three factors in silos, often relying on one-off consulting studies that are slow, expensive, and rarely cover fiber, power markets, and land together. This fragmented approach is the single largest bottleneck in the industry, wasting time and stranding capital on non-viable sites.
To succeed in this new asset class, real estate developers must adopt a repeatable, data-driven approach to strategically identify suitable sites from the outset. The most critical step new investors can take is adopting a platform that integrates a power-first approach with scalable power markets, fiber, and land analysis.
You need to simulate the utility's own analysis, modeling load capacity and upgrade costs, and evaluate trade-offs across power, connectivity, and land before investing in expensive due diligence and, ultimately, site control.
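One minimal way to operationalize this is a power-gated, weighted screen across candidate sites. Everything below (the site list, scores, thresholds, and weights) is hypothetical and only meant to show the shape of the workflow, not any specific platform's methodology:

```python
# Minimal sketch of a power-first, integrated site screen.
# Sites, thresholds, and weights are hypothetical.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Site:
    name: str
    available_capacity_mw: float   # headroom from interconnection screening
    est_upgrade_cost_musd: float   # estimated network upgrades, $M
    avg_lmp_usd_mwh: float         # historical average nodal price
    fiber_distance_km: float       # distance to nearest long-haul fiber route
    hazard_score: float            # 0 (low risk) to 1 (high risk)

REQUIRED_MW = 500  # hypothetical campus requirement

def composite_score(site: Site) -> Optional[float]:
    """Lower is better; returns None if the site fails the power gate."""
    if site.available_capacity_mw < REQUIRED_MW:
        return None  # power-first: good land and fiber cannot fix missing capacity
    return (0.5 * site.est_upgrade_cost_musd / 100
            + 0.3 * site.avg_lmp_usd_mwh / 50
            + 0.1 * site.fiber_distance_km / 20
            + 0.1 * site.hazard_score)

candidates = [
    Site("A", 800, 180, 42, 3, 0.2),
    Site("B", 350, 40, 35, 1, 0.1),   # cheap upgrades, but fails the capacity gate
    Site("C", 900, 90, 55, 12, 0.4),
]
ranked = sorted(
    ((composite_score(s), s) for s in candidates if composite_score(s) is not None),
    key=lambda pair: pair[0],
)
for score, site in ranked:
    print(f"Site {site.name}: composite score {score:.2f}")
```

In practice the weights would come from your own cost model, and the capacity, upgrade, and price inputs from the grid simulations and market data described above; the point is that the power gate runs first and the remaining criteria are traded off explicitly.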
The most profitable developers today aren't those who move the quickest; they're the ones who invest in a high-quality, integrated, scalable approach to screen out the bad sites fast.


Try PVcase Prospect now
Factor in grid offtake capacity, network upgrade costs, historical electricity prices, fiber route access, pipelines, zoning, environmental constraints, and more – all in one powerful platform!
