Microsoft says Copilot was summarizing confidential emails without permission
A bug in Microsoft 365 Copilot has been causing the AI assistant to summarize emails that were explicitly labeled as confidential, according to a report from Bleeping Computer. The Copilot security bug reportedly bypassed organizations' data loss prevention (DLP) policies, which are used to protect sensitive information.
The bug specifically affected Copilot Chat. According to Microsoft's documentation, it caused emails with a confidential label to be "incorrectly processed by Microsoft 365 Copilot chat."
For context...