Microsoft Blames Bug For Copilot Exposing Confidential Emails In Summaries
According to Microsoft, "A code issue is allowing items in the sent items and draft folders to be picked up by Copilot (Chat) even though confidential labels are set in place, and this functionality is unintended." It's a bad look for Microsoft, especially in the eyes of those who are staunchly opposed to AI integration at the OS level, or who are already aware of Copilot exploits used to steal data.

Even for fans of Copilot, it's disappointing to see this supposedly cutting-edge technology fail to adhere to basic security policies. Fortunately, the details that have emerged suggest only a limited number of Microsoft 365 users are affected, and the underlying functionality has only existed since September of last year, limiting the potential real-world impact of the problem. The bug itself was only reported on January 21st and tracked as CW1226324, and if Microsoft's fix, which began rolling out in early February, definitively resolves the issue, it went from disclosure to remediation in a matter of weeks.
But per the BleepingComputer coverage, there is no finalized timeline for full remediation, nor any disclosure of how many users or organizations were impacted. Because of that, there is still a chance the issue is ongoing. As Copilot becomes more deeply rooted in Windows and other Microsoft products, it is critical that Microsoft iron out bugs and security problems like this one. Significant investments have been made in Copilot and other AI services, but if users turn against the technology, widespread rejection could see Copilot go the way of Cortana.