
>NYTimes has produced credible evidence that OpenAI is simply stealing and republishing their content

They shouldn't have any rights to data after it's released.

>That's a question they fundamentally cannot answer without these chat logs.

They are causing more damage than anything ChatGPT could have caused to the NYT. Privacy needs to be held higher than corporate privilege.

>Think about it this way. Let's say this were a book store selling illegal copies of books.

Think of it this way: no book should be illegal.

>They can't know how often that's happened and OpenAI has an obvious incentive to simply say "Oh that never happened".

NYT glazers do more to uphold OpenAI as a privacy-respecting platform than OpenAI has ever done.

>If this never happens then the amount will be low.

Should be zero, plus compensation to the affected OpenAI users from NYT.

>The user has no right to privacy.

And this needs to be remedied immediately.

>The same as how any internet service can be (and have been) compelled to produce private messages.

And this needs to be remedied immediately.



I get that you're mad, and rightly so over an invasion of your privacy, but the NYT would be foolish to use any of your data for anything other than this lawsuit, or to not delete it afterwards as requested.

They can't use this data against any individual, even if that individual explicitly asked, "How do I hack the NYT?"

The only potential issue is them finding something juicy in someone's chat that they could publish as a story, and then claiming they found out about it through other means (such as a confidential informant), but that's not likely an issue for the average punter to be concerned about.


>The only potential issue is them finding something juicy in someone's chat that they could publish as a story, and then claiming they found out about it through other means (such as a confidential informant)

Which is concerning since this is a news organization that's getting the data.

Let's say they do find some juicy detail and use it, then what? Nothing. It's not like you can ever fix a privacy violation. Nobody involved would get a serious punishment, like prison time, either.


>Let's say they do find some juicy detail and use it, then what? Nothing. It's not like you can ever fix a privacy violation. Nobody involved would get a serious punishment, like prison time, either.

There are no privacy violations. OpenAI already told the court they anonymized it. What they say in court and what they say in the blog are different, and so many people here are (unfortunately) falling for it!


There's no such thing as truly anonymized data. "Anonymized" data can still be used to identify someone, as we've seen on numerous occasions.
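
To make the linkage-attack point concrete, here's a toy sketch (every name, field, and record below is made up for illustration, nothing from the actual case): stripping direct identifiers doesn't do much when quasi-identifiers like ZIP code, birth year, and sex can be joined against public records.

    # Toy linkage attack (hypothetical data): join "anonymized" chat
    # records with public records on quasi-identifiers.
    anonymized_chats = [
        {"zip": "10027", "birth_year": 1985, "sex": "F", "topic": "how to leak documents"},
        {"zip": "94105", "birth_year": 1990, "sex": "M", "topic": "tax questions"},
    ]
    public_records = [  # e.g. voter rolls, property records
        {"name": "Alice Example", "zip": "10027", "birth_year": 1985, "sex": "F"},
        {"name": "Bob Example",   "zip": "94105", "birth_year": 1990, "sex": "M"},
    ]

    def reidentify(chats, records):
        """Return (name, topic) pairs where the quasi-identifiers match exactly one person."""
        hits = []
        for chat in chats:
            matches = [r for r in records
                       if (r["zip"], r["birth_year"], r["sex"])
                       == (chat["zip"], chat["birth_year"], chat["sex"])]
            if len(matches) == 1:  # a unique match is a re-identification
                hits.append((matches[0]["name"], chat["topic"]))
        return hits

    print(reidentify(anonymized_chats, public_records))
    # [('Alice Example', 'how to leak documents'), ('Bob Example', 'tax questions')]

This is basically Sweeney's well-known result: ZIP code, sex, and full date of birth alone were enough to uniquely identify roughly 87% of the US population.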


Read the ToS next time



