Three Samsung employees reportedly leaked sensitive data to ChatGPT
Kris Holt

On the surface, ChatGPT might seem like a tool that can come in handy for an array of work tasks. But before you ask the chatbot to summarize important memos or check your work for errors, it’s worth remembering that anything you share with ChatGPT could be used to train the system, and might even pop up in its responses to other users. That’s something several Samsung employees probably should have been aware of before they reportedly shared confidential information with the chatbot.

Soon after Samsung’s semiconductor division started allowing engineers to use ChatGPT, workers leaked secret info to it on at least three occasions, according to The Economist Korea (as spotted by Mashable). One employee reportedly asked the chatbot to check sensitive database source code for errors, another asked it to optimize code, and a third fed a recorded meeting into ChatGPT and asked it to generate minutes.

Reports suggest that, after learning about the security slip-ups, Samsung attempted to limit the extent of future faux pas by restricting the length of employees’ ChatGPT prompts to a kilobyte, or 1024 characters of text. The company is also said to be investigating the three employees in question and building its own chatbot to prevent similar mishaps. Engadget has contacted Samsung for comment.
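To make the reported restriction concrete, here is a minimal Python sketch of the kind of client-side guardrail a company could put in front of a chatbot: it rejects any prompt over a 1,024-byte cap before the text leaves the building. The function and error names are hypothetical; this illustrates the approach, not Samsung’s actual tooling.

```python
# Hypothetical guardrail: cap prompt size at 1 KB before anything is
# sent to an external chatbot. Illustrative only; not Samsung's code.

MAX_PROMPT_BYTES = 1024  # the one-kilobyte limit reported by The Economist Korea


class PromptTooLongError(ValueError):
    """Raised when a prompt exceeds the company-imposed size limit."""


def enforce_prompt_limit(prompt: str, limit: int = MAX_PROMPT_BYTES) -> str:
    """Return the prompt unchanged if it fits under the cap, else raise."""
    size = len(prompt.encode("utf-8"))  # measure bytes, not characters
    if size > limit:
        raise PromptTooLongError(
            f"Prompt is {size} bytes; the limit is {limit} bytes."
        )
    return prompt


if __name__ == "__main__":
    try:
        # A long paste (say, a meeting transcript) trips the guard.
        enforce_prompt_limit("Summarize this meeting transcript... " * 100)
    except PromptTooLongError as err:
        print(f"Blocked before sending: {err}")
```

A size cap like this is a blunt instrument: it makes it harder to paste an entire source file or transcript, but it can’t tell sensitive text from harmless text, which is presumably part of why Samsung is also said to be building its own in-house chatbot.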

ChatGPT’s data policy states that, unless users explicitly opt out, it uses their prompts to train its models. OpenAI, the chatbot’s owner, urges users not to share sensitive information in conversations with ChatGPT, as it is “not able to delete specific prompts from your history.” The only way to get rid of personally identifying information on ChatGPT is to delete your account, a process that can take up to four weeks.

The Samsung saga is another example of why it’s worth exercising caution when using chatbots, as you perhaps should with all your online activity. You never truly know where your data will end up.
