Why organizations are choosing to repatriate data from data centers
How hard could it really be?
• 3 min read
Even as the biggest names in tech build data centers as fast as possible, some IT professionals are looking to repatriate their data from these massive facilities. What’s driving this trend?
Repatriating data, or bringing it back on-premises, is gaining traction among IT professionals, according to Cloudian’s 2026 research report. Of the 212 senior IT decision makers surveyed, 75% told Cloudian that they had moved workloads from the cloud (and its accompanying data centers) to on-prem in the past 24 months.
Michael Gale, chief marketing officer at AI and data company EDB, told IT Brew that professionals may repatriate data out of a desire to control it as much as possible, especially in the context of training AI and deploying agents.
“If you want to use AI and data, you’ve got to be secure and compliant, they’ve got to be next to each other,” Gale said. “If we end up having maybe up to 300 million agents working in US enterprises, those agents need to be secure and compliant, and the only thing that feeds them is data…There’s a huge demand to repatriate data back into a closeness to the actual business control center. We were comfortable putting it all in the cloud because it all looked the same, but data doesn’t all look the same.”
Keep an AI on that. Andy Stone, CTO for the Americas at Pure Storage, told IT Brew that part of the desire to repatriate data stems from security concerns.
While some enterprises may employ a big-box LLM for their AI needs, others may want a model that can answer company-specific questions, and organizations don’t want to share proprietary information with a non-specialized model.
“Where you’re integrating very confidential information, you want those guardrails in place, you want that containerization in your environment where you can specify the exact parameters that you want on those data sets, to ensure that only certain data is allowed to follow in and out and certain results are given,” Stone said.
Hey, can you give me that back? But pulling data out of a data center isn’t easy—Stone said it requires a lot of architecting and planning, including managing the applications consuming and producing data.
Some companies that host other organizations’ data, like Microsoft (with Azure), charge an egress fee when data leaves their platform.
“They’re saying, as long as your data lives here, we’re cool; you want to take your data out, we’re going to charge you on the back end,” Stone said. “In your data center, you don’t have that, you’re not going to pay an egress charge. It’s a benefit you derive, but the move itself takes time, a lot of planning and effort, and it’s certainly not easy in most cases.”
And why not? If an enterprise doesn’t want to pay the costs of managing and storing its own data locally, then repatriation might not be for it. Stone also noted that smaller companies might not want to host their own servers for a website.
“It really depends on your business and the business model,” Stone said. “Now, where you have high security needs, having it on-prem gives you that better control, but you also need to take into account things like agility.”
About the author
Caroline Nihill
Caroline Nihill is a reporter for IT Brew who primarily covers cybersecurity and the way that IT teams operate within market trends and challenges.