And even worse:

   Conversations that have been reviewed or annotated by human reviewers (and related data like your language, device type, location info, or feedback) are not deleted when you delete your Gemini Apps activity because they are kept separately and are not connected to your Google Account. Instead, they are retained for up to three years.
Emphasis on "retained for up to three years" even if you delete it!!


Well, they can't delete a user's reviewed Gemini conversations because they don't know which user a particular conversation came from.

This seems better, not worse, than keeping the user-conversation mapping so that the user may delete their conversations.
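
A minimal, purely hypothetical sketch of why that is (the names and structure below are invented for illustration, not taken from anything Google has published): if the account identifier is stripped before a conversation is copied out for human review, there is no key left to honor a later per-user deletion request against the review copy.

    # Hypothetical de-identified review storage; nothing here reflects
    # Google's actual pipeline, and all names are made up.
    import uuid

    user_store = {}    # account_id -> conversations (deletable by the user)
    review_store = {}  # random review_id -> conversation text, no account_id kept

    def log_conversation(account_id: str, text: str) -> None:
        user_store.setdefault(account_id, []).append(text)

    def send_for_review(text: str) -> str:
        # The review copy gets a fresh random key; the account_id is never stored.
        review_id = str(uuid.uuid4())
        review_store[review_id] = text
        return review_id

    def delete_user_activity(account_id: str) -> None:
        # Deletes everything linked to the account...
        user_store.pop(account_id, None)
        # ...but the review copies can't be located: nothing maps the
        # account_id back to a review_id, so they persist until expiry.

    log_conversation("user-123", "hello Gemini")
    send_for_review("hello Gemini")
    delete_user_activity("user-123")
    print(len(review_store))  # 1 -- the review copy survives the deletion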


How does this compare to OpenAI's and Anthropic's user data retention policies?


If I'm not wrong, ChatGPT clearly states that it no longer uses user data for training by default.

Also, maybe some services are doing "machine learning" training with user data, but this is the first time I've seen a recent LLM service say that it may feed your data to human reviewers at will.


They seem to use it as long as the chat history is enabled, similar to Gemini. https://help.openai.com/en/articles/7792795-how-do-i-turn-of...


I believe this is out of date. There's a very explicit opt-in/out slider for permitting training on conversations, which doesn't seem to affect conversation history retention.



