OpenAI is storing deleted ChatGPT conversations as part of its NYT lawsuit


OpenAI says it’s forced to store deleted ChatGPT conversations “indefinitely” due to a court order issued as part of The New York Times’ copyright lawsuit against it. In a post on Thursday, OpenAI chief operating officer Brad Lightcap says the company is appealing the court’s decision, which he calls an “overreach” that “abandons long-standing privacy norms and weakens privacy protections.”

Last month, a court ordered OpenAI to preserve “all output log data that would otherwise be deleted,” even if a user requests the deletion of a chat or if privacy laws require OpenAI to delete data. OpenAI’s policies state that when a user deletes a chat, it retains it for 30 days before permanently deleting it. The company must now put a pause on this policy until the court says otherwise.

OpenAI says the court order will affect Free, Pro, Plus, and Team ChatGPT users. It won’t affect ChatGPT Enterprise or ChatGPT Edu customers, or businesses that have a zero data retention agreement. OpenAI adds that the data won’t be public, and “only a small, audited OpenAI legal and security team” will be able to access the stored information for legal purposes.

The Times sued OpenAI and Microsoft for copyright infringement in 2023, accusing the companies of “copying and using millions” of the newspaper’s articles to train their AI models. The publication argues that saving user data could help preserve evidence to support its case.

“We think this was an inappropriate request that sets a bad precedent,” OpenAI CEO Sam Altman said in a post on X. “We will fight any demand that compromises our users’ privacy; this is a core principle.” The New York Times declined to comment.
