A Microsoft 365 Copilot bug allowed the AI assistant to read and summarize paying customers' confidential emails, bypassing Data Loss Prevention (DLP) policies designed to protect sensitive information. Microsoft has downplayed the issue in official communications, stating that the summaries of the confidential emails were not exposed to anyone who did not already have access to the messages. The company says it has addressed the issue and that it "did not provide anyone access to information they weren't already ...
Microsoft expands DLP controls to prevent Copilot from processing confidential Office files across local devices, SharePoint, and OneDrive.
Microsoft's (MSFT) AI chatbot service Copilot has become key to its AI strategy, but the effort to build it up as a ChatGPT alternative has proven difficult, the Wall Street Journal reported.
Windows 11 was born in enshittification, but the rise of Copilot and Microsoft's AI ambitions have only made matters worse.
If Copilot fails to process the file, the document may not load at all. In some cases, Copilot reportedly references ...