
If you’re a user of Anthropic’s popular AI assistant, Claude, there’s a new update you need to be aware of. The company has made some big changes to Claude AI’s consumer privacy policy. Now, you have to decide whether your chats will be used to train future AI models.
Previously, Anthropic had a policy of not using consumer chats for training purposes. But that’s all changing. The firm now wants to use your conversations and coding sessions to improve its AI. If you agree, your data will be retained for up to five years. For current users, the default setting is already switched “on” in a new pop-up window, so you’ll have to actively switch it off if you don’t want to participate.
You have a deadline to opt out of Anthropic’s new privacy policy for Claude AI
The deadline to make your choice is September 28. If you choose not to accept the new policies, your chats will continue to be deleted after 30 days.
So, why is this happening? Anthropic’s official reason is that your data will “help us improve model safety” and lead to a more accurate and better-performing Claude for everyone. By using real-world data, the company can make the AI better at everything from reasoning and analysis to writing code.
While that all sounds great, the full truth is likely a little simpler: in the race to build the best AI, every major company needs a vast amount of high-quality data to stay competitive. Accessing millions of real-life conversations from its own users is a goldmine for Anthropic. The company might need such a move to continue competing against rivals like OpenAI and Google.
This move mirrors a broader industry trend. OpenAI, which similarly exempts its enterprise customers from its consumer data policies, has faced legal battles over its own data retention practices.
Many could miss this setting
The most concerning part of this change is how easy it is for users to miss it. New users will be prompted with the choice during sign-up. Meanwhile, existing users will see a pop-up with a large “Accept” button and a much smaller toggle switch for data sharing that is automatically enabled. This design has raised concerns that users might quickly click through without realizing they are agreeing to a major change in their data’s usage.
The bottom line is simple: if you use Claude, it’s a good idea to check your privacy settings before the September 28 deadline. That quick check will make sure your data is handled exactly the way you want.
The post Your Claude Chats Are Now Training AI, Unless You Take Action appeared first on Android Headlines.