AI Chats Aren’t Private: How Your Conversations Could Be Used Against You

Winston kept his back turned to the telescreen. It was safer; though, as he well knew, even a back can be revealing.
— George Orwell, 1984

I was running through my day in bed and fell asleep wondering, “What haven’t I read in ChatGPT’s Terms and Conditions?” The question niggled at me in an ugly way as I laid my head down next to my sleeping child. My concern stems from the lack of regulation and transparency surrounding the AI chats we are all having, or soon will be, for those who haven’t participated yet. So many of us are vulnerable, and knowing how this technology will shape our future society matters. We must be vigilant in understanding the risks as well as the potential rewards.


I asked ChatGPT this question, and it offered one very concerning example:

“Here’s how the privacy issues in OpenAI’s terms could potentially cause problems for an average user:

Misinterpretation of Stored Data

   •   Example: A user asks sensitive or emotional questions (e.g., about legal advice, mental health, or financial struggles).

   •   Risk: If such data were to be shared or accessed (e.g., by legal subpoena or during an audit), it could be misconstrued or used against the user without proper context.”


Below is the full list of risks from this specific conversation, many of which carry serious weight. I encourage you to read through them and consider the implications for yourself and your family. As someone who has witnessed multiple disparaging, harmful, and emotionally traumatizing court cases, I consider the risk of litigation alone a huge concern for the average user. Read both lists, and perhaps continue doing further research.

“OpenAI, which powers ChatGPT, prioritizes privacy and security, but there are a few points in the terms and conditions that might concern some users:

1. Data Collection:

      •   OpenAI may collect and store your interactions to improve the model, though efforts are made to anonymize data. Sensitive or personal information shared during interactions could potentially be included in this data.

2. Use of Inputs:

      •   Information you provide may be reviewed by OpenAI employees or contractors to ensure safety, develop new features, or improve system performance. While they aim to protect user privacy, this could involve humans seeing your input.

3. Data Retention:

      •   OpenAI retains conversations for a certain period. Even though data is anonymized, there’s a risk that patterns or unique inputs could inadvertently reveal personal information.

4. Sharing with Third Parties:

      •   While OpenAI doesn’t sell personal information, it may share data with third-party service providers, especially for hosting, analytics, and technical support. This could increase the exposure risk if these parties have vulnerabilities.

5. User Responsibility:

      •   Users are advised not to share sensitive personal information in conversations. However, if someone unknowingly or accidentally does, it could be stored temporarily.

6. Anonymization is Not Foolproof:

      •   Even though OpenAI works to anonymize data, sophisticated analysis could potentially de-anonymize some information in rare cases, depending on the nature of the shared data.

Here’s how the privacy issues in OpenAI’s terms could potentially cause problems for an average user:

1. Accidental Sharing of Sensitive Information

   •   Example: A user shares their full name, address, or medical history during a conversation.

   •   Risk: Although OpenAI anonymizes data, the conversation might still be reviewed by humans or used to improve the model. If the anonymization process fails or the data is improperly handled, sensitive details could be exposed or linked to the user.

2. Patterns Leading to Identification

   •   Example: A user regularly discusses niche topics, mentions their city, job, or hobbies, and provides enough unique details across multiple conversations.

   •   Risk: These patterns could unintentionally allow someone analyzing the data to identify the user, especially if the user is part of a smaller community or group.

3. Data Breaches at Third Parties

   •   Example: A third-party vendor used by OpenAI for hosting or analytics experiences a security breach.

   •   Risk: If conversations or metadata (like IP addresses) are exposed, malicious actors might exploit this information for phishing, fraud, or harassment.

4. Misuse by Employees or Contractors

   •   Example: An employee or contractor reviewing conversations for quality assurance or feature development recognizes personal details about a user.

   •   Risk: While OpenAI has policies in place, human error or malicious intent could lead to misuse of sensitive data (e.g., sharing it externally or using it for personal gain).

5. Misinterpretation of Stored Data

   •   Example: A user asks sensitive or emotional questions (e.g., about legal advice, mental health, or financial struggles).

   •   Risk: If such data were to be shared or accessed (e.g., by legal subpoena or during an audit), it could be misconstrued or used against the user without proper context.

6. De-anonymization with External Data

   •   Example: A user shares seemingly harmless details (e.g., “I live in a small town and run the only bakery here”).

   •   Risk: Even anonymized data could be linked back to a specific individual when combined with publicly available information or data from other sources.

7. Data Retention Leading to Long-Term Risks

   •   Example: Years later, a user who shared personal details during a chat becomes a public figure or involved in a legal case.

   •   Risk: Data retained by OpenAI, even if anonymized, could resurface or be subpoenaed, potentially harming the user’s reputation or legal standing.

Takeaway: To minimize risks, avoid sharing sensitive or identifiable information. Treat conversations with AI as public data to some extent and stay mindful of what could happen if the content were exposed.”
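The de-anonymization risk in point 6 is easy to underestimate. For the technically curious, here is a toy sketch (with entirely made-up data) of how just a few harmless-sounding details can single one person out of an "anonymous" crowd:

```python
# Toy illustration of de-anonymization: every record below is "anonymous"
# (no names), yet combining a couple of quasi-identifiers can narrow a
# whole population down to one person. All data here is fabricated.
users = [
    {"id": 1, "town": "small town", "job": "bakery owner", "hobby": "chess"},
    {"id": 2, "town": "big city",   "job": "bakery owner", "hobby": "chess"},
    {"id": 3, "town": "small town", "job": "teacher",      "hobby": "chess"},
    {"id": 4, "town": "big city",   "job": "teacher",      "hobby": "running"},
]

# "I live in a small town and run the only bakery here" -- two details suffice:
matches = [u for u in users
           if u["town"] == "small town" and u["job"] == "bakery owner"]

# Only one record fits both details, so the "anonymous" user is identified.
print(len(matches))
```

Real-world re-identification studies work the same way, just with far larger datasets and more attributes (ZIP code, birth date, occupation, and so on).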


“It was terribly dangerous to let your thoughts wander when you were in any public place or within range of a telescreen. The smallest thing could give you away. A nervous tic, an unconscious look of anxiety, a habit of muttering to yourself – anything that carried with it the suggestion of abnormality, of having something to hide. In any case, to wear an improper expression on your face (to look incredulous when a victory was announced, for example) was itself a punishable offence.”

– George Orwell

Some information on deleting your data from ChatGPT and/or opting out of model training (for what it’s worth):

Deleting Conversations:

  • Individual Conversations: You can delete specific chats by selecting the conversation and choosing the delete option. This removes the conversation from your chat history. (PCMag)

  • All Conversations: To delete all your chats, navigate to the settings and select the option to clear all conversations. This action will remove all chat history from your account. (Digital Trends)

Data Retention and Model Training:

While deleting conversations removes them from your visible history, OpenAI may retain anonymized data for model training purposes. To prevent your data from being used in this way, you can disable chat history and model training in the settings. (Wired)

Account Deletion:

If you wish to ensure all your data is removed, you can delete your OpenAI account. This action will permanently delete all associated data, including chat histories. Be aware that this process is irreversible and will also remove access to other OpenAI services. (TechWiser)

Considerations:

  • Data Usage: Even after deleting conversations, anonymized data may still be used for improving AI models unless you opt out.

  • Privacy: Avoid sharing sensitive personal information in chats, as data handling practices can evolve.

In summary, while you can delete conversations from your ChatGPT history, some data may still be retained and used by OpenAI unless you take additional steps to opt out or delete your account.
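A similar opt-out exists outside the ChatGPT app itself. For technically inclined readers: developers who call OpenAI's API directly can ask that an exchange not be retained. The sketch below is only an illustration and just builds the request body that would be sent; the `store` field and the model name are assumptions based on OpenAI's published API reference, so verify both against the current documentation before relying on them:

```python
import json

# Hypothetical sketch of a request body for OpenAI's chat completions
# endpoint. The "store" field (assumed from OpenAI's API reference) asks
# that the exchange not be retained; the model name is just an example.
payload = {
    "model": "gpt-4o-mini",  # example model name, not a recommendation
    "store": False,          # request that this exchange not be stored
    "messages": [
        {"role": "user", "content": "A question with no personal details."}
    ],
}

# The JSON body that would be POSTed to the API (no network call is made here):
print(json.dumps(payload, indent=2))
```

Even with such flags, the safest habit remains the one above: don't put sensitive details into the conversation in the first place.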


Turning off model-training data collection in settings

I hope I have shared information you weren’t previously aware of about the risks of using AI platforms like ChatGPT. Your freedom, your privacy, and your right to a bright future unmarred by security breaches and litigation are important.
