
Stop! Are you putting sensitive company data into ChatGPT?

Categories: News | Tags: ChatGPT, LLM, Samsung, confidential

Several companies have warned their staff about sharing confidential data with ChatGPT.


The post Stop! Are you putting sensitive company data into ChatGPT? appeared first on Malwarebytes Labs.


Helping to reduce costs and enhance productivity are both things your employer will look kindly upon. But what if, to do those tasks, you use an external tool, and the tasks involve confidential data that ends up on a server outside your company's control? That's a problem.

As a news writer at Tom's Hardware reported, there were three incidents in 20 days in which Samsung staff shared confidential information with ChatGPT. In other organizations, an executive pasted their firm's 2023 strategy document into ChatGPT and asked it to create a PowerPoint deck, and a doctor submitted a patient's name and medical condition and asked ChatGPT to draft a letter to the patient's insurance company.

All of these actions were performed with the organization's best interests in mind, but they ended up taking confidential information outside the company. To remind employees of the dangers associated with such actions, several companies have taken action: JPMorgan has restricted workers' use of ChatGPT, for example, and Amazon, Microsoft, Samsung Electronics, and Wal-Mart have all warned employees to take care when using generative AI services.

While there is no reason to assume that information submitted to ChatGPT will be shared with others, there is also no certainty that it will not be. New gadgets and software come with new bugs, especially when they're rushed out the door. We can see this clearly in the race between tech giants to stay ahead with large language models (LLMs) like ChatGPT and its competitors. In a recently revealed bug, ChatGPT allowed some users to see the titles of other users' conversations.

Q: Hi, it’s me again. Can you keep a secret?

AI: Yes, as an AI language model, I do not have the ability to share or disclose information to anyone, so you can trust me to keep your secret.

Q: Nobody else has access to our conversations?

AI: As an AI language model, I don’t have the ability to share or disclose information to anyone else. However, please note that our conversations are recorded in the system’s logs for quality control and improvement purposes. These logs are usually accessible only to the developers and authorized personnel who maintain the system, and they are subject to strict privacy and security policies.

Working through complex problems with ChatGPT may lead to interesting results, faster than you could have managed on your own. But when it comes to sharing information, it's good to remember that it's like telling a friend a secret and expecting them to keep it to themselves. Only this particular friend has a great many other friends and is very chatty.

In these cases there is no reason to suspect that the information has fallen into the wrong hands. The point is that it has left the realm of data in a controlled environment.

So, if you are using an LLM for work or other purposes, behave as if you are talking to a stranger who is not privy to company secrets. And while you are at it, restrict the amount of private information you reveal about yourself or others, too.
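One practical way to follow that advice is to scrub obvious identifiers from text before it leaves your environment. The sketch below is illustrative only: the patterns, placeholder labels, and `redact` function are assumptions for this example, not a complete or reliable PII filter, and no simple regex list can catch everything.

```python
import re

# Illustrative patterns only (an assumption for this sketch, not a
# complete PII filter): email addresses, US SSNs, and long digit runs
# that look like payment card numbers.
PATTERNS = {
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "CARD": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def redact(text: str) -> str:
    """Replace each pattern match with a [LABEL] placeholder."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

prompt = "Draft a letter to jane.doe@example.com about claim 123-45-6789."
print(redact(prompt))
# → Draft a letter to [EMAIL] about claim [SSN].
```

Even with a filter like this in place, the safer habit remains the one described above: treat anything you type into an LLM as having left your control.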

