Copilot reviews users' private emails, bypassing protection—Microsoft has acknowledged this as a mistake.

Microsoft has discovered a bug in Copilot that summarizes confidential emails

According to the company’s internal notice, a bug appeared in Microsoft 365 Copilot at the end of January, causing the AI assistant to automatically summarize the content of users’ emails. In doing so, it bypassed the data loss prevention (DLP) policies that organizations use to protect sensitive information.

What exactly went wrong
* The issue, tracked as CW1226326, was logged on January 21 and affects the chat function in Copilot’s Work tab.
* As a result, Copilot incorrectly processed emails in the “Sent” and “Drafts” folders, including messages marked as confidential, which are specifically restricted from automated tools.
* Microsoft stated: *“Emails marked ‘confidential’ are incorrectly processed by Microsoft 365 Copilot Chat.”* The company also emphasized that the AI summarizes emails even when a confidentiality label or DLP policy is in place.
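The intended behavior that the bug bypassed can be sketched abstractly: an automated tool should filter out emails carrying a restrictive sensitivity label before processing them. The sketch below is a purely hypothetical illustration of that filtering step; the function and field names are invented for this example and do not reflect Microsoft's actual implementation.

```python
# Hypothetical illustration of a DLP-style pre-filter: exclude emails
# with a restrictive sensitivity label from automated summarization.
# All names here are illustrative, not Microsoft's real code.

def emails_eligible_for_summary(emails, restricted_labels=frozenset({"Confidential"})):
    """Return only emails whose sensitivity label permits automated processing."""
    return [e for e in emails if e.get("sensitivity_label") not in restricted_labels]

mailbox = [
    {"subject": "Q1 roadmap", "sensitivity_label": "Confidential"},
    {"subject": "Lunch plans", "sensitivity_label": None},
]

# The bug described above effectively skipped this filtering step,
# so labeled emails were summarized anyway.
print([e["subject"] for e in emails_eligible_for_summary(mailbox)])
```

In this sketch only the unlabeled message would reach the summarizer; the reported bug amounts to labeled messages passing through as well.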

How Microsoft is responding
* Later, the company confirmed that the cause was a coding error and began working on a patch in early February.
* As of mid‑week, Microsoft reported that it continues to monitor the rollout of the fix and is contacting affected users to assess update quality.

What remains unknown
Microsoft has not yet specified when the bug will be fully resolved or disclosed how many users or organizations were impacted. The report notes that as the investigation proceeds, the scope of the issue may change.
