A recent privacy issue involving ChatGPT has brought attention to how easily personal or confidential content can end up in Google search results. The issue began with OpenAI’s “Share” feature, which allowed users to generate public links of their conversations. Although users had to actively opt in and tick a box to make those links searchable, many didn’t realise what that meant for privacy.
Typing a simple search like site:chatgpt.com/share followed by a keyword could uncover chats containing personal names, job titles, and company details. One shared link listed a senior consultant from Deloitte, including their age and job description, openly visible on the web. Christopher Penn from TrustInsights.ai explained on LinkedIn that anything shared on a public link can be indexed by Google if it’s placed somewhere Google can access.
The problem wasn’t the feature itself, but how people used it. OpenAI said the feature was an experiment, but the fallout showed how quickly even well-intentioned tools can go wrong. According to VentureBeat, people shared chats ranging from home advice to health concerns, often without realising they had made them available to anyone with a search bar.
Why Did OpenAI Stop The Feature?
OpenAI responded fast after people criticised this on social media. Within hours, the company removed the option to make shared ChatGPT conversations searchable. They admitted in a post on X that although users had to check a box to activate this feature, the guardrails weren’t strong enough to stop accidental sharing of sensitive content.
The platform’s design expected users to understand the risks of public sharing. But that expectation turned out to be unrealistic. As one security expert pointed out, it only took a few clicks to publish chats that could include private health questions or company strategies. For many, the privacy risk was not immediately obvious.
The speed of OpenAI’s decision shows how AI companies are still working out how to build tools that feel easy to use but are also safe. Even though the share feature was created to let people swap ideas and insights, the backlash showed that privacy concerns outweigh the benefits if the controls are too loose.
More from News
- Project Europe Reveals Six Startups to be Included in the Organisation’s First Cohort
- Comp AI Secures $2.6M Pre-Seed to Disrupt SOC 2 Market
- Cybersecurity Challenges During Disasters: How Can Startups Help?
- Experts Share: How Will The Rise Of AI Call Centres Impact Entry Level Jobs?
- Air Traffic Fault Causes Delays Across UK, Here’s Why
- Google Signs EU AI Code Of Practice, Here’s What That Means
- Saudi Arabia Invests $6 Billion In Syria: Why Should This Investment Strategy Matter to UK and European Founders?
- London Landlords Hit Affordability Wall as Northern Rents Surge Ahead
What Are The Ridks If Sharing AI Chats?
This issue hit a nerve because many people use AI tools like ChatGPT for tasks that involve confidential or sensitive content. Marketers, researchers, and consultants often ask ChatGPT to help with outlines, pitches, or internal documents. Sharing those chats, intentionally or not, could leak company names, research ideas, or client strategies.
Christopher Penn called it a “data leak waiting to happen,” and said the situation was a goldmine for competitors. It’s easy to imagine a rival firm stumbling upon outlines for an upcoming pitch, just by searching the right words. Olaf Kopp, co-founder at Aufgesang GmbH, said that although only a few thousand chats were indexed, the damage could already be done.
There’s also the technical side to consider. Kopp warned that interacting with public ChatGPT chats could open users up to prompt injection attacks.. a method where someone manipulates an AI’s behaviour through a carefully crafted prompt. He advised users to avoid clicking on shared links they don’t trust, and to delete any old ones using OpenAI’s settings.
On X, they said: “We just removed a feature from [@ChatGPTapp] that allowed users to make their conversations discoverable by search engines, such as Google. This was a short-lived experiment to help people discover useful conversations. This feature required users to opt-in, first by picking a chat to share, then by clicking a checkbox for it to be shared with search engines (see below).
“Ultimately we think this feature introduced too many opportunities for folks to accidentally share things they didn’t intend to, so we’re removing the option. We’re also working to remove indexed content from the relevant search engines. This change is rolling out to all users through tomorrow morning.
“Security and privacy are paramount for us, and we’ll keep working to maximally reflect that in our products and features.“