PC.net

Don't Share PII with LLMs

September 2024 — Tip of the Month

ChatGPT and other generative AI platforms are incredibly helpful, but remember to consider your privacy when using them. These services are powered by models that may retain and reuse the information you provide. Therefore, be cautious about sharing personal information with any AI-based chatbot or platform.

Sharing personal information with AI services poses privacy and security risks. Once data is shared, it could be stored or used in ways you don't expect. In the wrong hands, your personal information could be used for malicious purposes such as identity theft or fraud. Sharing personal details with AI services can also lead to unintentional data exposure. Even if the service is trustworthy, there's always the risk of a data breach that could compromise the information you've shared. I've lost track of how many "data security incident" notices I've received in the past several years. 🙄

The data you submit to an AI engine is often stored and used for training and improving the service. Even if the conversation seems private, your information could be used to enhance the AI's capabilities in the future. This includes details like your name, address, phone number, etc. It's also wise not to share confidential business information like undisclosed financial data or trade secrets with generative AI services. Before uploading source code, make sure the service provides clear safeguards that keep your data private and secure.
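One practical way to follow this advice is to scrub obvious PII from text before it ever leaves your machine. Below is a minimal sketch in Python; the patterns and placeholder names are my own illustrative choices (not tied to any particular AI service), and real PII detection would need far more than three regular expressions.

```python
import re

# Illustrative patterns only: real-world PII detection is much harder.
PII_PATTERNS = [
    (re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"), "[EMAIL]"),          # email addresses
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "[SSN]"),               # U.S. Social Security numbers
    (re.compile(r"\b(?:\+1[-.\s])?\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"), "[PHONE]"),  # U.S. phone numbers
]

def redact_pii(text: str) -> str:
    """Replace common PII patterns with placeholders before
    submitting the text to a chatbot or API."""
    for pattern, placeholder in PII_PATTERNS:
        text = pattern.sub(placeholder, text)
    return text

print(redact_pii("Email me at jane.doe@example.com or call 555-123-4567."))
# Email me at [EMAIL] or call [PHONE].
```

A filter like this is a safety net, not a substitute for judgment: it won't catch names, addresses, or trade secrets, so the safest habit is still to leave sensitive details out of your prompts entirely.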

In summary, don't share PII with LLMs. That is, don't share personally identifiable information with the large language models behind ChatGPT, Google Gemini, and other services. The input you provide may be stored and used without your knowledge.

- Per Christensson
