The question is no longer "What can ChatGPT do?" It's "What should I share with it?"
Internet users are generally aware of the risks of data breaches and of the ways our personal information is used online. But ChatGPT's seductive capabilities seem to have created a blind spot around hazards we normally take precautions to avoid. OpenAI only recently announced a new privacy feature which lets ChatGPT users disable chat history, preventing conversations from being used to improve and refine the model.
"It's a step in the right direction," said Nader Henein, a privacy research VP at Gartner who has two decades of experience in corporate cybersecurity and data protection. "But the fundamental issue with privacy and AI is that you can't do much in terms of retroactive governance after the model is built."
Henein suggests thinking of ChatGPT as an affable stranger sitting behind you on the bus, recording you with a camera phone. "They have a very kind voice, they seem like nice people. Would you then go and have the same conversation with that? Because that's what it is." He continued, "It's well-intentioned, but if it hurts you — it's like a sociopath, they won't think about it twice."
Even OpenAI's CEO Sam Altman has acknowledged the risks of relying on ChatGPT. "It's a mistake to be relying on it for anything important right now. We have lots of work to do on robustness and truthfulness," he tweeted in December 2022.
Essentially, treat ChatGPT prompts as you would anything else you publish online. "The best assumption is that anyone in the world can read anything you put on the internet — emails, social media, blogs, LLMs — do not ever post anything you do not want someone else to read," said Gary Smith, Fletcher Jones Professor of Economics at Pomona College and author of Distrust: Big Data, Data-Torturing, and the Assault on Science. ChatGPT can be used as an alternative to Google Search or Wikipedia, as long as it's fact-checked, he said. But it shouldn't be relied on for much else.
The bottom line is that there are still risks, made even more precarious because of ChatGPT's allure. Whether you're using ChatGPT in your personal life or to boost work productivity, consider this your friendly reminder to think twice about what you share with ChatGPT.
First, let's look at what OpenAI tells users about how it uses their data. Not everyone's privacy priorities are the same, but it's important to know the fine print for the next time you open up ChatGPT.
First and foremost, there's the possibility of someone outside of OpenAI hacking in and stealing your data. Using any third-party service carries an inherent risk of data exposure through bugs and hackers, and ChatGPT is no exception. In March 2023, a ChatGPT bug was discovered to have exposed conversation titles, the first message of newly created conversations, and payment information from ChatGPT Plus users.
"All this information you're pushing into it is highly problematic, because there's a good chance it might be susceptible to machine learning attacks. That's number one," said Henein. "Number two, it's probably sitting in clear text somewhere in the log. Whether or not somebody is going to look at it, I don't know, neither do you. That's the problem."
Unlikely as it may be, certain OpenAI employees also have access to user content. On the ChatGPT FAQs page, OpenAI says user content is stored on its own systems and on other "trusted service providers' systems in the US." So while OpenAI removes identifiable personal information, content exists in raw form on its servers before it's de-identified. Some authorized OpenAI personnel have access to user content for four explicit reasons, one of them being to "fine-tune" their models, unless users opt out.
We'll get to opting out later, but unless you do that, your conversations are used to train ChatGPT. According to its data usage policy, which is scattered across several different articles on its site, OpenAI says, "we may use the data you provide us to improve our models." On another page, OpenAI says it may "aggregate or de-identify Personal Information and use the aggregated information to analyze the effectiveness of our Services." This means that, theoretically, the public could become aware of something like a business secret through whatever the model "learns."
Previously, users could only opt out of sharing their data with the model through a Google Form linked in the FAQs page. Now, OpenAI has introduced a more explicit way of disabling data sharing: a toggle setting within your ChatGPT account. But even with this new "incognito mode," conversations are stored on OpenAI's servers for 30 days. And the company has relatively little to say about how it keeps your data secure.
OpenAI says it does not share user data with third parties for marketing or advertising purposes, so that's one less thing to worry about. But it does share user data with vendors and service providers for maintenance and operation of the site.
ChatGPT and generative AI tools have been touted as the ultimate productivity hack. ChatGPT can draft articles, emails, social media posts, and summaries of long chunks of text. "There isn't an example that you can possibly think of that hasn't been done," said Henein.
But when Samsung employees used ChatGPT to check their code, they inadvertently revealed trade secrets. The electronics company has since banned the use of ChatGPT and threatened employees with disciplinary action if they fail to adhere to the new restrictions. Financial institutions like JPMorgan, Bank of America, and Citigroup have also banned or restricted the use of ChatGPT due to strict financial regulations about third-party messaging. Apple has also banned employees from using the chatbot.
The temptation to cut mundane work down into seconds seems to overshadow the fact that users are essentially publishing this information online. "You're thinking of it in the same way that you think of a calculator, you're thinking of it like Excel," he said. "You're not thinking that this information is going into the cloud and that it's going to be there in perpetuity either in a log somewhere, or in the model itself."
So if you want to use ChatGPT at work to break down concepts you don't understand, write copy, or analyze publicly available data, and there's no rule against it, proceed cautiously. But be very careful before you, for example, ask it to evaluate the code for the top-secret missile guidance system you're working on, or have it write a summary of your boss' meeting with a corporate spy embedded at a competing company. That could cost you your job, or worse.
A survey conducted by healthtech company Tebra revealed that one in four Americans is more likely to talk to an AI chatbot than to attend therapy. Instances have already popped up of people using ChatGPT as a form of therapy, or seeking help for substance abuse. These examples were shared as exciting use cases for how ChatGPT can be a helpful, non-judgmental, and anonymous conversation partner. But your deepest, darkest admissions are stored somewhere in a server.
People tend to think their ChatGPT sessions are like a "walled garden," said Henein. "At the end, when I log out, everything inside of that [session] flushes down the toilet, and that's the end of the conversation. But that's not the case."
If you're a Person On The Internet, your personal data is already all over the place. But not in the conversational medium of ChatGPT, where you might feel compelled to divulge intimate and personal thoughts. "LLMs are an illusion — a powerful illusion, but still an illusion reminiscent of the Eliza computer program that Joseph Weizenbaum created in the 1960s," said Smith.
Smith is referring to the "Eliza effect," or the human tendency to anthropomorphize things that are inanimate. "Even though users knew they were interacting with a computer program, many were convinced that the program had human-like intelligence and emotions and were happy to share their deepest feelings and most closely held secrets."
So given how OpenAI stores your conversations, try not to give yourself over to the illusion that ChatGPT is a mental health wizard and blurt out your innermost thoughts — unless you're prepared to broadcast them to the world.
There is a way to go incognito when using ChatGPT. Your conversations are still stored for 30 days, but they won't be used to train the model. Click on your account name to open settings, then click "Data Controls." From there, you can toggle off "Chat History & Training." You can also clear past conversations by clicking "General" and then "Clear all chats."
What not to share with ChatGPT if you use it for work