Microsoft 365 Copilot now sends data to Anthropic by default. Here's how to think about it.
From 7 January 2026, Anthropic's Claude is enabled by default across Microsoft 365 Copilot. What that means for your data, your DPA register, and your governance posture.
If you use Microsoft 365 Copilot, your tenant gains a new data subprocessor on 7 January 2026: Anthropic. From that date, Microsoft enables Claude models by default across many Copilot experiences under the existing Product Terms and Data Protection Addendum. The official Microsoft Learn page is here: Connect Microsoft 365 Copilot to AI subprocessors.
The opt-out is the bit that matters most. If you do nothing, the integration is on. Your prompts, and the Microsoft 365 data Copilot retrieves to answer them, can be sent to Anthropic in the course of fulfilling requests.
For most businesses this is a governance event: worth at least one structured conversation between your IT team, your data-protection lead, and whoever signs off on cross-border data transfers.
Why this matters
Anthropic is an external model provider, distinct from Microsoft. When Claude handles a Copilot prompt, the data needed to answer it leaves Microsoft’s processing boundary and enters Anthropic’s. That changes the trust model in a few ways:
- You now have a second data processor in scope. Any record you maintain of who processes your data needs to include Anthropic.
- It can be a cross-border transfer. Anthropic’s processing locations and Microsoft’s aren’t the same. If you’re subject to data-sovereignty rules or to the Australian Privacy Principles around overseas disclosure, treat this transfer like the cross-border event it is, not like an in-tenant Microsoft feature.
- Your existing DLP and Purview rules may not have been written with this subprocessor in mind. The classification, redaction and policy work that kept regulated content away from external models needs to be reviewed against Anthropic’s processing too.
None of this is to say Anthropic is the wrong choice as a model provider. Claude is a strong model. The point is that “default on” doesn’t mean the legal and governance work is already done.
The trade-off Microsoft is offering
Disabling Anthropic isn’t free. Several Copilot features lean on Claude specifically:
- The Researcher agent.
- Custom Copilot Studio agents.
- The Office “Frontier” experiences.
Turn Anthropic off and you get reduced functionality from the Copilot licences you’ve already paid for. That’s the real trade on the table: bleeding-edge capability now, or governance-controlled capability later.
For most regulated businesses (legal, finance, health, anyone holding personal information at scale), the right play is to disable by default, run the legal and data-governance review, then re-enable selectively where the business case supports the additional processor.
For unregulated businesses with low-sensitivity data, the right play might be to leave it on and use the features. Make sure that’s an actual decision and not a default.
What we’re advising clients
A short checklist for the next two weeks:
- Inventory your Copilot tenant: who has Copilot licences, what they use them for, and what data those workflows touch. A quick licence-inventory sketch follows this list.
- Toggle Anthropic off in the Microsoft 365 admin center while you work through the rest of this list. The default is reversible.
- Review your DPA register. Add Anthropic (see the example register entry after this list). If your DPA register doesn’t exist yet, this is the moment to start one.
- Run a Purview and DLP review against the workflows that would route through Copilot. Make sure your sensitivity labels actually catch the content you’d want held back from any external model.
- Make a written decision about which user groups can use Anthropic-backed Copilot features and which can’t. Document the reasoning and the data classes you’re protecting.
- Re-enable selectively only where the workflow genuinely benefits and the data class is appropriate.
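On the inventory step, here’s a minimal sketch of pulling the licence side from Microsoft Graph with plain REST calls. It assumes an Entra app registration with the User.Read.All application permission and an access token you’ve already acquired through your usual auth flow; the COPILOT substring match is an assumption about SKU naming, so confirm it against what your own tenant actually returns.

```python
# Minimal sketch: list users holding a Copilot licence via Microsoft Graph.
# Assumes an access token with User.Read.All; TOKEN is a placeholder.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
TOKEN = "<access-token-from-your-auth-flow>"
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

# 1. Find Copilot SKUs in the tenant. The substring match is an assumption
#    about SKU naming -- check the skuPartNumber values you actually get back.
skus = requests.get(f"{GRAPH}/subscribedSkus", headers=HEADERS).json()["value"]
copilot_skus = [s for s in skus if "COPILOT" in s["skuPartNumber"].upper()]
for sku in copilot_skus:
    print(sku["skuPartNumber"], sku["skuId"], "assigned:", sku["consumedUnits"])

# 2. List the users assigned each Copilot SKU, following Graph's paging links.
for sku in copilot_skus:
    url = (
        f"{GRAPH}/users"
        f"?$filter=assignedLicenses/any(x:x/skuId eq {sku['skuId']})"
        f"&$select=displayName,userPrincipalName"
    )
    while url:
        page = requests.get(url, headers=HEADERS).json()
        for user in page.get("value", []):
            print(user["userPrincipalName"])
        url = page.get("@odata.nextLink")
```

That gives you the who. The what-they-use-it-for half of the inventory is interviews and usage reports, not an API call.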
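And on the register step, a sketch of the fields we’d capture for the Anthropic entry. The field names are our suggestion rather than any statutory format, and the values marked for confirmation are exactly that: placeholders until you’ve verified them against Microsoft’s subprocessor documentation.

```python
# Sketch of a subprocessor register entry. Field names are suggestions --
# map them onto whatever structure your existing register uses.
from dataclasses import dataclass

@dataclass
class SubprocessorEntry:
    name: str
    engaged_via: str              # the processor that engaged them
    purpose: str
    data_categories: list[str]    # what can actually reach them
    processing_locations: str     # confirm with provider documentation
    transfer_mechanism: str       # the contractual safeguard relied on
    status: str                   # enabled / disabled / selectively enabled
    review_date: str

anthropic = SubprocessorEntry(
    name="Anthropic",
    engaged_via="Microsoft (Microsoft 365 Copilot)",
    purpose="Fulfilling Copilot requests routed to Claude models",
    data_categories=["Prompts", "Microsoft 365 content retrieved to answer them"],
    processing_locations="To be confirmed",   # placeholder, not verified
    transfer_mechanism="Per Microsoft Product Terms and DPA -- verify",
    status="Disabled pending review",
    review_date="2026-01-07",                 # the default-on date
)
```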
If you’re a managed compliance client, your account lead will bring this up at the next review anyway. If you’d like a hand thinking it through before then, get in touch.