Claude will start using your chats to train itself beginning September 28, 2025, under a major policy update from Anthropic. The AI company announced that conversations and coding sessions from its consumer-facing products will be used to improve Claude’s reasoning, safety, and coding abilities. The change comes with an opt-out option for users who do not want their chats used for training.
Why Claude Will Start Using Your Chats to Train Itself
Anthropic, the creator of Claude AI, aims to make its models smarter, safer, and more useful. The company says that training on real-world user interactions helps Claude recognize patterns, handle harmful content more reliably, and improve its natural language understanding.
This approach is similar to what other AI companies like OpenAI and Google have already adopted, where user conversations help refine AI performance.
How It Works
- Starting September 28, new users will see a data usage toggle during sign-up.
- Existing users will see a pop-up prompt with the data-use option pre-checked (allowing training by default).
- Unless you actively opt out before the deadline, your chats will be used for training.
- For users who opt in, data retention will increase from 30 days to five years.
Who Is Affected?
This policy affects users of:
- Claude Free, Pro, and Max
- Claude Code (the AI coding assistant)
It does not apply to:
- Claude for Work
- Claude for Education
- Claude for Government
- API access through Amazon Bedrock or Google Cloud Vertex AI
Privacy and Security Measures
Anthropic has stressed that:
- No data will be sold to third parties.
- Automated tools will filter sensitive content.
- Users can adjust privacy settings anytime, but data already used for training cannot be removed.
Despite these safeguards, some privacy experts warn that even anonymized data could still be re-identified in rare cases.
Why It Matters for Users
- Improved Claude performance → Better reasoning, coding, and safety.
- User choice → You can opt out if you value privacy.
- Industry trend → This aligns Anthropic with AI leaders like OpenAI and Google.
- Data responsibility → Users must take action if they don’t want their chats used.
How to Opt Out
- Open Settings in the Claude app or on the website.
- Go to the Privacy (data sharing) section.
- Turn off the toggle that allows your chats to be used for model improvement.
- Confirm your choice.
Conclusion
Claude will start using your chats to train itself on September 28, 2025. While this update promises smarter and safer AI, it also raises real privacy concerns. Users who value confidentiality should take proactive steps to opt out before the deadline.