  • AI's physical supply chain, including chips and data centers, is where the early revenue is.
  • Data centers, crucial for AI, face high demand, with prices doubling and low vacancy rates.
  • DataBank CEO says customers are pre-leasing forthcoming buildings two years in advance.

Artificial intelligence may seem intangible, but it has an immense physical supply chain behind it, and at least for right now, that's where the money is.

The chips, designed by Nvidia, AMD, and others and largely manufactured by Taiwan Semiconductor Manufacturing Company, have myriad parts and suppliers. The chips need to be organized and connected by another slew of components. All of that tech then needs to go into a fit-for-purpose building. There's an AI-driven boom taking place at every level of the chain, including the big concrete box it all goes into.

Though a data center often resembles a simple box, it is far from simple. Cooling and power management for these structures, which are measured in megawatts rather than square feet, are key and represent a whole other category of technological innovation.

Though Amazon, Google, and Microsoft hold much of the cloud capacity, new entrants have been a regular occurrence since the rise of AI cloud computing.

DataBank is one of a handful of players positioned to see trends across Big Tech since the 19-year-old firm leases out space at more than 65 data centers in 27 markets. The company has new competition as Blackstone and industrial real estate giant Prologis have followed the smell of revenue into the space.

But DataBank CEO Raul Martynek has been through boom cycles before.

In the early aughts, during the dot-com boom, Martynek ran a firm at the forefront of dedicated internet service. Today, he's yet again building the rails on which new technology runs — an eyewitness to the biggest players jostling for cloud supremacy with years of contracts into the future.

Martynek spoke to BI about AI computing power demand, hardware issues, and the yet-unanswered question of return on AI investment.

This Q&A has been edited for clarity and length.

BI: How have you experienced the demand for AI data centers over the last 18 months?

We're a data center developer, so we build the physical buildings that all this equipment goes into. The sector has been around for about 25 years, since the aftermath of the dot-com bubble, and it has been a steady grower.

In the last 10 years, the most important driver of that growth has been the public cloud. The public clouds didn't exist in 2010 and now represent hundreds of billions of dollars in revenue. Hyperscalers build a lot of their own data centers, but they also outsource a lot of that effort to companies like DataBank.

ChatGPT happened in November 2022, and then 2023 showed up and, actually, the first quarter and a half was relatively slow. We were wondering what was going on. Then, in the middle of 2023, there was a tsunami of demand.

The hyperscalers and other important technology companies started accelerating their AI initiatives. Every time Nvidia sells a GPU, it has to go into a data center, so there's a direct relationship between the number of units they ship and the need for data center capacity.

Really, it's been about 12 months where we've seen sustained demand outpacing supply. A year ago data center inventory was tight — it's not like people overbuild in this space, because it's expensive — but there was capacity. Over the last 12 months, pricing has doubled and supply has shrunk to where vacancy rates are in the low single digits.

DataBank CEO Raul Martynek

BI: Data centers are planned years in advance, which means you're constantly thinking about the future. Roughly what year is it inside your head most of the time?

I think it's a split-brain thing. I'm thinking about this year because we have existing inventory and we're focused on hitting our budget for this year. But at the same time, we have to think 24 to 36 months ahead.

As an example, we're developing over 800 megawatts of data center capacity. It will come online in 2025, 2026, 2027, and part of 2028, so we kind of know what we're going to be doing for the next three and a half years.

It all starts with the land.

We acquired the land 12 or 18 months ago, and we have spent a year designing, permitting, zoning, and negotiating the power agreement with the power company. To give you an example, one of our new developments is 95 acres in Atlanta. It's a 120-megawatt data center campus. We're just starting to push dirt on that, and by next year the first building will start coming out of the ground, and then by the second quarter of 2026 we'll turn up customers.

In our business, we call that a pre-lease, and pre-leasing was pretty common in the couple of years before ChatGPT, but the earliest someone would pre-lease was 12 months out. Because of the demand that AI has caused and the scarcity of data center capacity, that window has increased to 24 months.

DataBank is building its fourth data center in the Atlanta area, responding to AI demand.
DataBank's next Atlanta data center will use 120 megawatts of power and occupy 95 acres of land.

BI: News recently broke that some Nvidia chip models may be delayed or canceled. How much do fluctuations in the AI hardware market matter to your planning, and how disruptive is this to your customers?

It doesn't matter. We're obviously acutely aware of it because, with these developments, we can put our foot on the gas in terms of how quickly we bring capacity online, and we can also elongate it a little bit. So we want to moderate that based on what we're hearing from customers.

I think customers might be in some ways annoyed but in other ways relieved, because there have been a lot of complaints that Nvidia is trying to bring out a new architecture every year. It doesn't allow the people buying these chips to sweat them and get their full useful life out of them.

But when we build, let's say, a 40-megawatt data center, it has 10 four-megawatt data halls: 10 little data centers within the building itself. If we were building a data center without a pre-lease, we would only outfit the first four megawatts of it. So you stagger your capital.

BI: Do you think about ROI when it comes to AI when you're making data center deals?

That's a fantastic topic, especially for me, because I have been in the sector since the mid-'90s and saw the dot-com crash. I think everyone realized that the internet was real; it was just that the business models at that time didn't create earnings, and that's why things went off a cliff. You could argue that the same dynamic could occur with GenAI.

We're definitely tracking that, in terms of whether we believe our customers are getting ROI on their investments.

It's still too early, but this technology is really revolutionary. It's probably one of the most transformative technologies since the internet, frankly, or maybe mobile phones. We're going to look back at it in seven years and say, 'Oh my gosh, that was so primitive, right?'

Some companies are able to demonstrate ROI, but it's certainly not uniform right now. That's to be expected with the introduction of such a transformative technology.

We're looking at the customer's credit quality, for sure. Just like any landlord, we want to make sure that our tenants are going to be viable for the term of that lease, right? The good news is this technology is so sophisticated and so expensive that you need to be relatively financially stable and have the technical ability to use it.

We're seeing larger enterprises, meaning hyperscalers and large technology companies, adopt these GPU deployments, and I think we'll start to see a lot more of that next year.

If there's a company that just got started four months ago, has gotten an allocation of a couple of thousand GPUs, and wants to take five megawatts of data center space, we're not going to do that. We'll let someone else support them. We're staying close to very creditworthy tenants, or, if they are earlier-stage companies, ones that are well funded.

