
When everybody’s using ChatGPT, how do you keep your data safe?

IT pros try to answer the question without saying, “You can’t.”


Any professor who holds office hours knows: College kids have lots of questions.

Patty Patria, chief information officer at Babson College, knows, too, that students may now save their quick queries for a chatbot. Her team, she said, currently has visibility into the many AI tools being used on the school’s network.

Babson IT employs Microsoft Defender for Cloud Apps to view AI-related activity, like suspicious prompts and data leaks. Other options for companies that want to monitor LLM usage include CASB products, which apply access controls and detect data-compromising SaaS usage.
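To make the CASB-style discovery concrete, here is a minimal, purely illustrative sketch of how shadow-AI monitoring works at the network level: scan web-proxy log entries for traffic to known GenAI endpoints and tally which users contacted which services. The log format, field names, and domain watchlist are all assumptions for the example, not the actual mechanics of Defender for Cloud Apps or any specific CASB product.

```python
from collections import defaultdict

# Hypothetical watchlist of GenAI service domains an IT team might track.
GENAI_DOMAINS = {
    "chatgpt.com",
    "chat.openai.com",
    "chat.deepseek.com",
    "copilot.microsoft.com",
}

def find_genai_usage(log_rows):
    """Return {user: set of GenAI hosts contacted} from proxy-log rows.

    Each row is assumed to be a dict with 'user' and 'dest_host' keys
    (a stand-in for whatever schema the real proxy emits).
    """
    usage = defaultdict(set)
    for row in log_rows:
        host = row["dest_host"].lower()
        if host in GENAI_DOMAINS:
            usage[row["user"]].add(host)
    return dict(usage)

# Toy log: only the GenAI destinations should be flagged.
rows = [
    {"user": "alice", "dest_host": "chatgpt.com"},
    {"user": "alice", "dest_host": "example.com"},
    {"user": "bob", "dest_host": "chat.deepseek.com"},
]
print(find_genai_usage(rows))
# → {'alice': {'chatgpt.com'}, 'bob': {'chat.deepseek.com'}}
```

Real CASB tooling goes much further (TLS inspection, prompt-level DLP, access policies), which is exactly the granularity gap Patria describes: knowing *that* a user hit ChatGPT is easy; knowing *what data* went into the prompt is the hard part.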

“This is becoming a much more challenging thing,” Patria told us, “to get down to the granularity of data: What data is going into ChatGPT versus DeepSeek versus Copilot?”

Patria shared ways to ensure data security when everybody’s hooked on ChatGPT.

Stat! US students and workers have increasingly turned to ChatGPT since its introduction in November 2022. The use cases, like the query data powering them, are all over the place.

  • A Pew Research Center study, released in January 2025, found that the proportion of teens using the tech for their school work rose to 26%—an increase from 13% in 2023.
  • Over 40% of office employees deploy GenAI tools like ChatGPT at work in 2025—and nearly one-third keep it a secret, a recent report from Ivanti concluded.
  • More users are turning to chatbots for self-improvement. The number one use for LLMs, according to recent research from Harvard Business Review, was “therapy/companionship.”

Close it up. Patria’s concern is that any data sent to the open, public version of ChatGPT can be used to train the models. Babson made the choice to ban DeepSeek for staff members (not faculty and students), Patria told us, and faculty and staff handling institutional data must use a closed model like the Copilot tool.

Top insights for IT pros

From cybersecurity and big data to cloud computing, IT Brew covers the latest trends shaping business tech in our 4x weekly newsletter, virtual events with industry experts, and digital guides.

“The closed versions don’t leverage your prompt in training their models. You’re ensured that only you have access to your data and nobody else has access to your data, which, from a security and privacy perspective, is really the most important thing,” Patria said.

According to May 2023 Bloomberg reporting, Samsung Electronics banned employee use of GenAI tools like ChatGPT following the discovery of sensitive code being uploaded to the platform. (Some Wall Street banks, Bloomberg wrote later, also cracked down on chatbot usage.)

That’s acceptable. Many frameworks help organizations govern their AI deployments, including the NIST AI Risk Management Framework, MITRE’s ATLAS knowledge base, OWASP’s Top 10 list of GenAI risks, and the ISO/IEC 42001 standard for AI management.

Scott Laliberte, managing director and CISO advisory at Protiviti, helps clients map their policies to those kinds of standards. During the process, he often finds that organizations have data-specific acceptable-usage terms in place to protect intellectual property and prevent compromise.

“But a lot of times, it’s reminding the users that acceptable use also applies to these new tools that they’re out there engaging with,” he said.

Patria has also had to educate users about the AI tools out there.

“We train our faculty, staff, and students to tell them if you’re using any confidential, proprietary data, you either need to use what we’re providing to you, or if it’s a student or faculty and they want to use their own proprietary data, use something that you’re paying for that’s closed versus a free tool,” Patria said.
