
Who owns AI governance (and is it IT)?

Industry pros weigh in on who should be accountable for the responsible deployment of AI.


If a company’s chatbot starts hallucinating, deleting databases, or revealing passwords, who on staff is going to own up to these AI aberrations?

During an IT Brew live event in August titled “Ready or Not: Governing AI in a Fragmented Regulatory Landscape,” an IT Brew attendee asked:

Who should ultimately own AI compliance? Is it IT, legal, or another business unit?

Many orgs are likely asking a version of this question as they seek clarity on AI policy and who should lead that effort:

  • While CIOs and CTOs have often been the leads of IT initiatives, some companies want a chief AI officer for today’s tech deployments. A May 2025 report from AWS, which polled 3,739 senior IT decision-makers in nine countries, found that 60% of organizations (both large and small) have appointed a dedicated AI executive.
  • A Gallup poll of US employees, conducted in Q2 2025, revealed only 22% of surveyed enterprise AI users said their company has communicated a clear plan or strategy for implementing the technology.

Guru Sethupathy, founder and CEO of AI-governance platform FairNow, told attendees during IT Brew’s August 28 presentation that AI governance is a “shared responsibility.” “Ultimately, you still want someone who’s a decision-maker, because you don’t want to get lost in a kind of paralysis by analysis,” he added.

Sethupathy and other industry pros who spoke with IT Brew agreed on the shared responsibility of AI governance, but had slightly different views on who should be involved and who gets to make the big decisions.

The decider. AI governance refers to the policies, processes, regulations, and ethical guidelines considered as the technology is designed and deployed. The discipline typically examines an AI tool’s impact on factors such as data privacy, transparency, and fairness.

Sethupathy sees “key stakeholders” like business, IT, compliance, and legal teams involved in the process of responsible AI deployment, but he stressed the importance of having that “ultimate decision-maker” chosen by the company.


“The question then becomes who’s ultimately responsible for calculating the trade-off between value and risk,” he said during the event. “Each company does that in a different way.”

Nathan Olson, senior manager of digital solutions at advisory, tax, and assurance firm Baker Tilly, agrees that AI governance should be a “shared responsibility” and a “collaborative model.” In the midsize orgs that he counsels, Olson sees:

  • IT pros examining model behavior, data pipelines, and infrastructure
  • Legal teams overseeing alignment with regulations
  • Business units defining specific use cases and ethical boundaries

But governance “ownership,” he said, falls to the group, which he recommends should meet regularly to approve or review existing use cases for efficacy, risk, and compliance.

“It’s important to have a clearly defined and maybe an AI governance charter of what responsibilities fall to what teams specifically,” he said.

Rohan Sen, a principal in PwC’s cyber data and tech risk practice focusing on data risk and responsible AI, sees the importance of cross-disciplinary collaboration with business leaders in IT, legal, cybersecurity, and privacy—those who own a particular risk domain in a company.

IT has an especially important AI governance role to play, surfacing various technology choices (like one model provider over another) and communicating their impact, he said.

Ultimately, Sen recommends having a single authoritative role—maybe a chief compliance officer, chief risk officer, or chief privacy officer, he said—to “break ties” and make the ultimate decisions once risk players in a committee provide their input.

“Otherwise, what you have is a whole bunch of different viewpoints, and nobody to be a referee or an emcee to say, ‘Okay, having heard all of these different viewpoints and considerations, what should we actually do?’” Sen concluded.
