
The 6 things you should (absolutely) not tell ChatGPT

ChatGPT has become a reflex for millions of internet users. Ask a question, request a summary, generate some text: in a few seconds the tool delivers a structured and often very convincing response. But behind this apparent efficiency lies a more complex reality: what you hand over is not always consequence-free. Identity, health, banking information, professional documents… As the Wall Street Journal points out, some data that seems harmless should never pass through an artificial intelligence interface, however powerful it may be.

Key takeaways:

  • Never share usernames, passwords, or bank details in ChatGPT.
  • Medical results and health data must remain confidential.
  • Precise personal information can be enough to identify you.
  • Internal company documents are not protected in this interface.
  • ChatGPT is not suitable for personal or psychological confessions.

1. Never share your usernames or passwords

This is one of the most dangerous reflexes: entering a password, a login code, or an API key as if you were writing to technical support. ChatGPT is not a secure service and must never receive this kind of information.

The data you share can be used to train the model, or read by human teams as part of audits. Even with privacy options enabled, no processing is guaranteed to be completely confidential.
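For developers who route text to a model programmatically, one practical safeguard is a pre-flight check that blocks a prompt when it looks like it contains a credential. The sketch below is purely illustrative and not a feature of any provider: the patterns (an “sk-” prefixed token, a “password:” pair, an AWS-style key ID) are assumptions to adapt to the secret formats you actually use.

```python
import re

# Illustrative patterns for common secret shapes; adapt them to your own formats.
SECRET_PATTERNS = [
    re.compile(r"sk-[A-Za-z0-9]{20,}"),        # "sk-" prefixed API token
    re.compile(r"(?i)password\s*[:=]\s*\S+"),  # "password: hunter2"
    re.compile(r"AKIA[0-9A-Z]{16}"),           # AWS access key ID shape
]

def contains_secret(prompt: str) -> bool:
    """Return True if the prompt appears to contain a credential."""
    return any(pattern.search(prompt) for pattern in SECRET_PATTERNS)

prompt = "Can you debug this request? My key is sk-abcdefghijklmnopqrstuv."
if contains_secret(prompt):
    raise ValueError("Refusing to send: the prompt appears to contain a secret.")
```

A check like this catches only the obvious cases; the safer habit remains never pasting credentials into a prompt in the first place.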

2. Never provide your banking information

Whether it’s a card number, bank account details (IBAN), or a security code, this information has no place in a conversation with an artificial intelligence. The model does not encrypt exchanges end-to-end and does not comply with current standards for securing financial data.

Even a seemingly innocuous request (“can you proofread this email to my bank?”) can expose sensitive information. It is safer to treat this interface as a public space.

3. Do not enter your medical test results

It can be tempting to ask AI for a quick opinion on a blood test or an explanation of complex medical terms. But under no circumstances does it replace a healthcare professional: it does not know your medical file or your personal context, and it cannot guarantee the accuracy of the information it provides.

Beyond the ethical aspect, sharing health data is particularly sensitive from a legal standpoint. This information falls under the category of sensitive data under the GDPR, and transmitting it to a general-purpose AI poses a real compliance problem.

4. Do not share precise personal information

First name, last name, address, phone number, a relative’s name: details that, taken together, are enough to identify an individual. Even if the tool does not retain memory between sessions, what you type can be used to train future versions of the model.

Caution is necessary, especially during frequent or prolonged conversations: each detail may seem harmless in isolation but becomes meaningful in a broader context. The basic rule is therefore never to share identifiers such as social security numbers, identity card numbers, or passport details.
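If you nevertheless want an AI to work on a document, one pragmatic habit is to strip direct identifiers before pasting the text. The snippet below is a minimal sketch, not a complete anonymizer: the two patterns (an email address and a French-style phone number) are assumptions, and regexes will always miss context-dependent identifiers such as names or addresses.

```python
import re

# Minimal redaction pass covering two easy-to-match identifiers.
# Real anonymization requires far more than regexes (names, addresses, context).
REDACTIONS = [
    (re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"), "[EMAIL]"),
    (re.compile(r"\b0\d(?:[ .-]?\d{2}){4}\b"), "[PHONE]"),  # French-style number
]

def redact(text: str) -> str:
    """Replace matched identifiers with neutral placeholders."""
    for pattern, placeholder in REDACTIONS:
        text = pattern.sub(placeholder, text)
    return text

print(redact("Contact Jean Dupont at jean.dupont@example.com or 06 12 34 56 78."))
# -> Contact Jean Dupont at [EMAIL] or [PHONE].
```

Note that the name “Jean Dupont” survives the pass untouched, which is exactly why automated redaction should complement, not replace, caution about what you paste.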

5. Avoid submitting your company's internal data

Many users integrate ChatGPT into their work environment: drafting notes, summarizing meetings, preparing presentations… But by sharing internal documents they sometimes expose confidential information without realizing it.

Business objectives, contracts, HR files, strategic roadmaps… none of this should ever travel through a public interface: no confidentiality agreement binds the AI provider to your company.

6. Do not share intimate thoughts or sensitive personal situations

Some users spontaneously confide very personal matters: emotional struggles, painful family episodes, psychological distress. But the tool was not designed to handle this type of content: it has no clinical competence, and the responses it gives may be inappropriate or even dangerous.

Even if the conversation feels reassuring, it rests solely on statistical prediction. When support is truly needed, talking to a human remains the only appropriate response.

A useful tool, but take precautions

ChatGPT has become a daily assistant for many users. But this ease of use can also hide very real risks: the tool is neither confidential, nor a medical service, nor bound by professional secrecy in the legal sense.

Before sending sensitive information, ask yourself one simple question: would you accept a third party reading this data? If the answer is no, refrain. Artificial intelligence, despite its apparent neutrality, guarantees neither confidentiality nor erasure.

The article "The 6 things you should (absolutely) not tell ChatGPT" was originally published on the Abondance website.