A newly identified vulnerability in Microsoft 365 Copilot, tracked as CVE‑2025‑32711 with a CVSS score of 9.3, poses a serious risk to organizational data security. The flaw, dubbed “EchoLeak,” enables attackers to access private content such as emails, documents, and chat histories without any user action. Because the exploit requires no clicks or input from the user, it is especially dangerous and difficult to detect. The discovery renews concerns about the security implications of embedding AI technologies into core business tools.
How the Attack Works
- No clicks required: Malicious actors can exploit AI-driven workflows in Microsoft 365 Copilot to exfiltrate emails, documents, and internal communications silently.
- No malware needed: Unlike traditional phishing or ransomware, this attack leverages AI API manipulation to extract data without deploying malicious code.
- Bypasses MFA & Email Filters: Because the attack originates from legitimate AI queries, it evades multi-factor authentication (MFA) and email security gateways.
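Because the attack rides on legitimate-looking AI queries rather than malware, one defensive angle is heuristic screening of inbound content before Copilot processes it. The sketch below flags message bodies containing common prompt-injection phrases; the pattern list and function name are illustrative assumptions, not details from the advisory, and a production detector would need far richer, continuously updated rules.

```python
import re

# Hypothetical indicators of indirect prompt injection; illustrative only.
INJECTION_PATTERNS = [
    r"ignore (all )?previous instructions",
    r"do not (mention|reveal) this (message|instruction)",
    r"send .* to http",
]

def flag_prompt_injection(body: str) -> bool:
    """Return True if the message body matches any known injection pattern."""
    lowered = body.lower()
    return any(re.search(pattern, lowered) for pattern in INJECTION_PATTERNS)
```

Such a filter would sit in the mail pipeline ahead of AI-assisted processing; it cannot block novel payloads, but it raises the cost of the most obvious injection attempts.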
Affected Systems
- Microsoft 365 Copilot Enterprise users
- SharePoint, Outlook, and Teams-integrated workflows
- Businesses relying on AI-assisted document processing
Recommendations
- Disable Copilot’s external data access until Microsoft releases a patch.
- Audit AI permissions to restrict Copilot’s access to sensitive repositories.
- Educate employees on AI-based social engineering tactics.
- Tighten email and attachment security: enforce stronger filtering for external emails, especially those containing attachments or links.
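The last recommendation can be sketched as a simple screening rule: treat mail from outside the organization that carries attachments or links as requiring deeper inspection. This is a minimal illustration; the class fields, domain value, and threshold are assumptions for the example, not a prescribed policy.

```python
from dataclasses import dataclass

@dataclass
class InboundEmail:
    sender_domain: str
    has_attachment: bool
    link_count: int

INTERNAL_DOMAIN = "example.com"  # assumption: the organization's own domain

def requires_extra_screening(msg: InboundEmail) -> bool:
    """Flag external emails carrying attachments or links for stricter filtering."""
    is_external = msg.sender_domain.lower() != INTERNAL_DOMAIN
    return is_external and (msg.has_attachment or msg.link_count > 0)
```

In practice this logic would live in a secure email gateway rule rather than application code, but the decision boundary is the same.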