On the surface, ChatGPT might look like a useful tool for a range of work tasks. But before you ask the chatbot to summarize important notes or check your work for errors, it's worth remembering that anything you share with it can be used to train the system and may even surface in its responses to other users. That's something several Samsung employees probably should have kept in mind before they reportedly shared confidential information with the chatbot.
Shortly after Samsung’s semiconductor division began allowing engineers to use ChatGPT, employees leaked confidential information to it on at least three occasions (as discovered by ). One employee reportedly asked the chatbot to check sensitive database source code for errors, another asked it to optimize code, and a third fed a recording of an internal meeting into ChatGPT and asked it to generate minutes.
After learning of the security lapses, Samsung reportedly tried to limit the scope of future incidents by restricting the length of employees’ ChatGPT prompts to one kilobyte, or 1,024 characters of text. The company is also said to be investigating the three employees in question and building its own chatbot to prevent similar incidents. Engadget reached out to Samsung for comment.
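To make the reported restriction concrete, here is a minimal, purely illustrative sketch of how a prompt-length cap like the one described might be enforced before a request ever leaves a corporate network. The constant, function name, and logic are assumptions for illustration only, not details of Samsung’s actual system or of any chatbot API.

```python
# Illustrative sketch only: a guard a company gateway might apply before
# forwarding an employee's prompt to an external chatbot service.
# The 1 KB cap mirrors the limit described above; everything else here
# is hypothetical.

MAX_PROMPT_BYTES = 1024  # one kilobyte, roughly 1,024 ASCII characters


def is_prompt_allowed(prompt: str) -> bool:
    """Return True only if the UTF-8 encoded prompt fits within the cap."""
    return len(prompt.encode("utf-8")) <= MAX_PROMPT_BYTES


if __name__ == "__main__":
    sample = "Summarize these meeting notes: ..."
    if is_prompt_allowed(sample):
        print("Prompt within limit; it would be forwarded.")
    else:
        print("Prompt too long; it would be blocked before leaving the network.")
```

A cap like this limits how much text can leak in a single prompt, but it is a blunt instrument: short prompts can still contain highly sensitive material, which is presumably why Samsung is also said to be building an in-house alternative.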
ChatGPT’s data policy states that prompts are used to train its models unless users explicitly opt out. The chatbot’s owner, OpenAI, urges users not to share confidential information with ChatGPT in conversations because “you can’t remove certain prompts from your history.” The only way to remove personally identifiable information from ChatGPT is to delete your account.
The Samsung story is another example of why it’s worth exercising caution with chatbots, as with all online activity. You never know where your data will end up.