Data Management

Data centers aren’t crunching their own sustainability numbers

A recent survey indicates that many data centers are failing to track key environmental data.

The writing is on the wall: heat waves have taken out data centers in the UK, wildfires threaten them in California, and operations are drawing huge amounts of water in drought-stricken areas. Yet many data centers aren't preparing for a future where legislative responses to climate change affect their bottom line.

According to the Uptime Institute’s 2022 Global Data Center Survey, most operators aren’t tracking key sustainability metrics. While most track power use (85%) and energy efficiency metrics such as power usage effectiveness, or PUE (73%), just 35% track server utilization. Only 28% track e-waste or their equipment life cycles, 39% track water use, and 37% track IT/data center carbon emissions.
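For readers unfamiliar with the metric, PUE is simply total facility energy divided by the energy delivered to IT equipment. A minimal sketch (the function name and figures below are illustrative, not from the survey):

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power usage effectiveness: total facility energy over IT energy.

    A PUE of 1.0 would mean every kilowatt-hour reaches IT equipment;
    real facilities land above that due to cooling, lighting, and power
    distribution losses, so lower values are better.
    """
    if it_equipment_kwh <= 0:
        raise ValueError("IT equipment energy must be positive")
    return total_facility_kwh / it_equipment_kwh

# Example: 1,800 kWh drawn by the facility, 1,200 kWh reaching IT gear.
print(pue(1800, 1200))  # 1.5
```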

Tracking carbon contributions is far from simple: it requires accounting not only for carbon generated on-site or via electricity consumption, but also for the embodied carbon of facilities and equipment, as well as the downstream use of products by consumers.
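The categories described above map loosely onto the GHG Protocol's emissions scopes (on-site fuel, purchased electricity, and upstream/downstream impacts). A hypothetical sketch of totaling such an inventory, with illustrative figures:

```python
from dataclasses import dataclass

@dataclass
class CarbonInventory:
    # All figures in metric tons of CO2-equivalent; values are illustrative.
    onsite_fuel: float               # scope 1: generators, refrigerant leaks
    purchased_electricity: float     # scope 2: grid power consumption
    embodied_and_downstream: float   # scope 3: equipment, facilities, product use

    def total(self) -> float:
        """Sum all scopes into one facility-level figure."""
        return (self.onsite_fuel
                + self.purchased_electricity
                + self.embodied_and_downstream)

inv = CarbonInventory(onsite_fuel=120.0,
                      purchased_electricity=950.0,
                      embodied_and_downstream=400.0)
print(inv.total())  # 1470.0
```

Scope 2 is usually the easiest to measure (it comes off the utility bill); scope 3 is the category the survey suggests most operators aren't tracking.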

Meanwhile, other metrics like power use directly affect companies’ bottom lines, Andy Lawrence, Uptime Institute Intelligence’s executive director of research, told IT Brew. The lack of attention in other areas is setting operators up for a rude awakening when legislation requiring environmental reporting arrives, he added; indeed, 63% of survey respondents expect such legislation within the next five years.

Still catching up. Lawrence explained that when PUE first became a widely used metric in the late 2000s, the results were “an embarrassment” to many companies and triggered an effort to catch up. He added that many of the easiest ways to improve (that is, lower) PUE have already been exhausted throughout the industry, and further gains will be difficult to achieve for existing data centers.

Alternative tech like direct liquid cooling offers that possibility, but adoption rates remain quite low, Lawrence said.

“The ecosystem is still young,” Lawrence told IT Brew. “So, a lot of data center operators feel uncomfortable about committing to a technology that’s really only supplied by one or two small companies. And there’s a fear of introducing complexity into the data center that wasn’t there with some of the simpler air-cooled systems.”

Tate Cantrell, chief technology officer at Verne Global, told IT Brew that Iceland’s nearly 100% renewable power and relatively cool climate were key factors in the company’s decision to build a campus near Keflavík in 2012, when running applications at such remote distances wasn’t conventional.

“Our prospect was its much lower cost,” Cantrell said. “Even back in 2012, it was a fifth of the cost for the power in Iceland versus London or Frankfurt. And now [with] the energy crisis, one order of magnitude difference is where you’re at. So, an investment in energy security really is an investment for the future.”

“And we were able to do some really clever things with regard to how we cooled our data centers, which has allowed us to scale tremendously and support what has become a lot of really high performance compute applications,” Cantrell added.

A changing (regulatory) climate. According to Lawrence, while the data center business has made significant progress on sustainability and resiliency, operators “really don’t fully understand” that they are in for what he predicts will be a tsunami of new regulatory requirements around climate change. Many executives have made commitments, such as net zero by 2050, without thinking through how to achieve those targets, he said, leaving them unprepared for a rapid change in the regulatory climate.

“There’s going to be pressure, legislative pressure, to do carbon reporting, to track emissions accurately, to track water, to adopt net-zero targets, to adopt science-based targets, to prove to their customers that they are doing that, to source renewable energy, to track the amount of carbon density in [those] power sources,” Lawrence said. “So, it’s going to be a very, very difficult thing for people to start doing at scale when they haven’t done it before.”

Additional blind spots include risk assessments that are updated too infrequently to keep pace with extreme weather events, as well as IT equipment that is run at low utilization and inefficiently, Lawrence told IT Brew. While cloud computing tends to be very efficient, he added that the carbon footprint of running workloads there is not yet fully understood.

Cantrell said optimization and location flexibility will be critical, especially as advanced applications like machine learning models continue to require ever-growing amounts of data and become more computationally expensive.

“That kind of infrastructure, it’s important to the future of humanity,” Cantrell said. “But it’s a risk to the future of humanity, if we’re not thoughtful about how we deploy those.”—TM

Do you work in IT or have information about your IT department you want to share? Email [email protected] or DM @thetomzone on Twitter. Want to go encrypted? Ask Tom for his Signal.

Top insights for IT pros

From cybersecurity and big data to cloud computing, IT Brew covers the latest trends shaping business tech in our 4x weekly newsletter, virtual events with industry experts, and digital guides.