A team of researchers was able to prompt ChatGPT into revealing private information, including email addresses, phone numbers, snippets of research papers and news articles, Wikipedia pages, and more. The researchers, from Google DeepMind, the University of Washington, Cornell, Carnegie Mellon University, the University of California, Berkeley, and ETH Zurich, published their findings, first reported by 404 Media, and urged AI companies to perform internal and external testing before releasing large language models. They noted that the attack they used to access the data was “wild” and should have been discovered earlier.

Chatbots like ChatGPT and prompt-based image generators like DALL-E are powered by large language models trained on data often scraped from the public internet without consent. The researchers discovered that, using simple prompts, they could make ChatGPT reveal poetry, Bitcoin addresses, fax numbers, names, birthdays, social media handles, explicit content from dating websites, snippets from copyrighted research papers, and verbatim text from news websites.

OpenAI patched the vulnerability on August 30, but Engadget was able to replicate some of the paper’s findings in its own tests. OpenAI did not respond to Engadget’s request for comment.