Introduction
In today’s AI-driven world, tools like Microsoft Copilot are transforming how employees access and use information. But with this power comes a new challenge: ensuring that sensitive data isn’t accidentally exposed or processed by AI. Not every document should be summarized, analyzed, or surfaced by an intelligent assistant.
Think of payroll files, HR records, or confidential contracts: in many organizations, these must remain off-limits. Each company has different requirements and levels of sensitivity, so the approach may vary. In this blog, we’ll show one practical method that uses Microsoft Purview to label sensitive content and keep Copilot from processing it.
By combining sensitivity labels with Data Loss Prevention (DLP) policies, organizations can mark certain content as Copilot‑restricted and prevent Copilot from processing it.
Why Is This Important?
Sensitive information lives everywhere: in SharePoint sites, OneDrive folders, and Teams chats. Without guardrails, AI could surface content that was never meant to be shared. A labeling and DLP strategy ensures:
- Controlled AI access: Copilot respects sensitivity labels and won’t process restricted files.
- Regulatory compliance: Helps meet GDPR, HIPAA, and other data protection requirements.
- Reduced risk of leaks: Prevents accidental exposure of confidential data through AI prompts.
- User awareness: Policy tips educate employees in real time when they try to use restricted content.
Scenario
Copilot is a powerful tool for summarizing HR policies and streamlining documentation. But it should not be used to process or access individual salary spreadsheets or personal records stored in Microsoft 365. These files contain sensitive employee data, and their confidentiality must be protected.
To prevent this, the company implements a policy that:
- Detects when content is labeled “Highly Confidential – No Copilot Processing.”
- Blocks Copilot from retrieving or summarizing that content.
- Notifies the user that the file is protected.
- Alerts administrators when someone attempts to use restricted data with Copilot.
Objectives
In this walkthrough, we will:
- Create a sensitivity label for AI‑restricted content.
- Apply the label automatically or manually to sensitive files.
- Create a DLP policy that blocks Copilot from processing labeled content.
- Test the setup with a Copilot agent.
Step 1: Create a Sensitivity Label
This label acts as a digital shield, telling Copilot and other AI tools to stay away.
- Go to the Microsoft Purview Compliance Portal.
- Navigate to Information Protection > Labels.
- Click + Create a label.
- Name it: Highly Confidential – No Copilot Processing.
- Add a description: “Content with this label cannot be processed by Copilot.”
- Define the scope to files and other data assets.
- Configure content marking (e.g., header: “Copilot Processing Prohibited”) for visibility.
- Save and create the label; you’ll publish it to the relevant users and groups in Step 2.
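If you plan to automate anything against this label later (bulk labeling, auditing), you’ll need its GUID. The snippet below is a minimal sketch, not an official sample: it uses MSAL to acquire an app-only Microsoft Graph token and lists the tenant’s sensitivity labels. The beta endpoint path and the required application permission are assumptions on my part; confirm both against the current Graph reference before relying on them.

```python
# Minimal sketch: look up the GUID of the "Highly Confidential – No Copilot Processing"
# label via Microsoft Graph. The beta endpoint path and required permission
# (e.g. InformationProtectionPolicy.Read.All) are assumptions - verify in the Graph docs.
import msal
import requests

TENANT_ID = "<your-tenant-id>"          # placeholder
CLIENT_ID = "<your-app-client-id>"      # placeholder
CLIENT_SECRET = "<your-app-secret>"     # placeholder

# Acquire an app-only token for Microsoft Graph.
app = msal.ConfidentialClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
    client_credential=CLIENT_SECRET,
)
token = app.acquire_token_for_client(scopes=["https://graph.microsoft.com/.default"])

# Assumed beta endpoint for tenant-wide sensitivity labels.
resp = requests.get(
    "https://graph.microsoft.com/beta/security/informationProtection/sensitivityLabels",
    headers={"Authorization": f"Bearer {token['access_token']}"},
)
resp.raise_for_status()

for label in resp.json().get("value", []):
    if "No Copilot Processing" in label.get("name", ""):
        print("Label GUID:", label["id"])
```

Keep the GUID handy; the later snippets refer to it as LABEL_ID.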
Step 2: Publish the label
You can apply the label either automatically or manually. In this scenario, we’ll use manual labeling.
- In the Purview Compliance Portal, go to Information Protection > Policies > Label publishing policies > Publish label.
- Select the sensitivity label: Highly Confidential – No Copilot Processing.
- Publish it to the required users and groups.
- Give your policy a clear name, e.g., No Copilot Processing Policy.
- Train users to apply the label manually using the Sensitivity button in Word, Excel, or PowerPoint.
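Manual labeling works well for new documents, but if you already have a folder full of salary files you may want to label them in bulk. The sketch below assumes the Microsoft Graph assignSensitivityLabel action on drive items is available and licensed in your tenant (it is a metered API), that the app has Files.ReadWrite.All application permission, and that the drive and item IDs are placeholders you supply; treat all of that as assumptions to verify.

```python
# Minimal sketch: apply the sensitivity label to one file in SharePoint/OneDrive
# through Graph's assignSensitivityLabel action. IDs below are hypothetical placeholders.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
ACCESS_TOKEN = "<token acquired as in the earlier snippet>"
DRIVE_ID = "<drive-id>"     # placeholder
ITEM_ID = "<item-id>"       # placeholder
LABEL_ID = "<label-guid>"   # GUID of "Highly Confidential – No Copilot Processing"

resp = requests.post(
    f"{GRAPH}/drives/{DRIVE_ID}/items/{ITEM_ID}/assignSensitivityLabel",
    headers={
        "Authorization": f"Bearer {ACCESS_TOKEN}",
        "Content-Type": "application/json",
    },
    json={
        "sensitivityLabelId": LABEL_ID,
        "assignmentMethod": "standard",
        "justificationText": "Contains salary data - restricted from Copilot",
    },
)
# The action is asynchronous: Graph typically answers 202 Accepted and applies the label later.
print(resp.status_code, resp.headers.get("Location"))
```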
Step 3: Create a DLP Policy
This policy blocks Copilot from accessing labeled content.
- In Purview, go to Data Loss Prevention > Policies.
- Click + Create policy.
- Choose Custom policy.
- Apply it to Microsoft 365 Copilot.
- Create a rule:
- Condition: Content contains > Sensitivity Labels > Highly Confidential – No Copilot Processing.
- Action: Restrict Copilot from processing the content.
- Enable alerts to notify administrators of violations.
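If you’d rather watch those administrator alerts from a script than from the portal, the following sketch polls the Microsoft Graph security alerts endpoint. Whether Purview DLP alerts surface there, and the exact serviceSource value they carry, depend on your tenant’s Defender/Purview configuration; treat both as assumptions and confirm them with a real alert. The app needs a permission such as SecurityAlert.Read.All.

```python
# Minimal sketch: list recent security alerts and pick out DLP-related ones.
import requests

ACCESS_TOKEN = "<token acquired as in the earlier snippet>"

resp = requests.get(
    "https://graph.microsoft.com/v1.0/security/alerts_v2",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    params={"$top": "25"},
)
resp.raise_for_status()

for alert in resp.json().get("value", []):
    source = str(alert.get("serviceSource", ""))
    # The exact serviceSource value for Purview DLP alerts is an assumption;
    # inspect an alert in your own tenant to confirm it before filtering on it.
    if "datalossprevention" in source.lower():
        print(alert.get("createdDateTime"), alert.get("title"))
```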
Step 4: Test and Monitor
Make sure your setup works as expected.
- Create a Word file with salary content.
- Apply the label manually (if auto‑labeling is not enabled).
- Ask Copilot to summarize or retrieve it.
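Before prompting Copilot, it can help to confirm that the test file really carries the label. Below is a minimal sketch using Graph’s extractSensitivityLabels action; the endpoint, the Files.Read.All permission, and the placeholder IDs are assumptions to verify for your tenant.

```python
# Minimal sketch: read back the sensitivity label applied to the test file.
import requests

ACCESS_TOKEN = "<token acquired as in the earlier snippet>"
DRIVE_ID = "<drive-id>"   # placeholder
ITEM_ID = "<item-id>"     # placeholder

resp = requests.post(
    f"https://graph.microsoft.com/v1.0/drives/{DRIVE_ID}/items/{ITEM_ID}/extractSensitivityLabels",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
)
resp.raise_for_status()

for label in resp.json().get("labels", []):
    print("Label on file:", label.get("sensitivityLabelId"), label.get("assignmentMethod"))
```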
See It in Action: Sensitivity Label Restricting Copilot
Expected result:
Copilot blocks access and displays a policy tip.
Conclusion
By combining sensitivity labels with DLP policies, you can ensure that your most sensitive files remain off-limits to AI assistants, in this case Microsoft 365 Copilot. This approach not only protects confidential data but also builds trust in how AI is used across your organization.