Microsoft AI Researchers Unintentionally Expose 38TB of Sensitive Data, Including Passwords and Secret Keys
In 2023, cybersecurity remains a major worry for individuals and businesses alike, and even the largest technology companies are not immune to breaches. Recently, approximately 38TB of sensitive data was unintentionally exposed online. Notably, despite the data protection measures companies put in place against external threats, this particular incident reportedly originated with Microsoft's own employees. Let’s delve into the details of what occurred.
Big data leak
In a blog post, Microsoft confirmed that researchers at cloud security company Wiz had discovered that members of Microsoft’s AI division accidentally exposed 38TB of data while contributing to a GitHub repository where open-source AI models were developed. Microsoft has emphasized that no customer data was exposed, no other internal services were compromised, and no customer action is required. However, the exposure included more than 30,000 internal Microsoft Teams messages, secret keys, passwords for Microsoft services, and other data.
How did the leak happen?
According to Wiz’s Coordinated Vulnerability Disclosure (CVD) report, the breach occurred when a Microsoft employee accidentally shared a blob storage URL while contributing to the public GitHub repository for developing open-source AI models. The URL contained a token generated with a Microsoft Azure feature called Shared Access Signature (SAS) for an internal storage account. “Like other secrets, SAS credentials should be properly created and managed,” Microsoft said.
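For context, a SAS token is a signed query string appended to a storage URL; anyone who holds the full URL gets whatever access the token encodes. Below is a minimal sketch of how a properly scoped, short-lived, read-only SAS might be generated with the azure-storage-blob Python SDK; the account, container, and blob names are placeholders, not the ones involved in this incident.

from datetime import datetime, timedelta, timezone

from azure.storage.blob import BlobSasPermissions, generate_blob_sas

# A SAS URL has the form:
#   https://<account>.blob.core.windows.net/<container>/<blob>?<sas-token>
# The token below is scoped to a single blob, read-only, and expires
# after one hour.
sas_token = generate_blob_sas(
    account_name="exampleaccount",   # placeholder account
    container_name="models",         # placeholder container
    blob_name="weights.bin",         # placeholder blob
    account_key="<account-key>",
    permission=BlobSasPermissions(read=True),                # read-only
    expiry=datetime.now(timezone.utc) + timedelta(hours=1),  # short-lived
)
url = f"https://exampleaccount.blob.core.windows.net/models/weights.bin?{sas_token}"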
While SAS links are usually scoped to a limited set of files, this link was configured to expose the entire storage account. It also granted “full control” permissions rather than read-only access, meaning anyone with the URL could modify or delete the account’s contents. The internal storage accidentally exposed through the blob URL contained backups of the workstation profiles of two former Microsoft employees, including their passwords, as well as thousands of Teams messages exchanged with colleagues.
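To illustrate the difference in scope, here is a sketch of what an account-level SAS with write and delete rights looks like when generated with the same SDK. This is only an illustration of the class of token Wiz described, not a reconstruction of the actual leaked token; all names and the expiry are made up.

from datetime import datetime, timedelta, timezone

from azure.storage.blob import (
    AccountSasPermissions,
    ResourceTypes,
    generate_account_sas,
)

# Unlike a blob-scoped token, an account SAS covers every container and
# blob in the storage account. Combined with write/delete permissions
# and a far-future expiry, a leaked token of this kind effectively hands
# out "full control" of the whole account.
overly_broad_token = generate_account_sas(
    account_name="exampleaccount",   # placeholder account
    account_key="<account-key>",
    resource_types=ResourceTypes(service=True, container=True, object=True),
    permission=AccountSasPermissions(
        read=True, write=True, delete=True, list=True
    ),
    expiry=datetime.now(timezone.utc) + timedelta(days=365 * 10),  # far too long
)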
The Wiz research team accessed the account using the SAS token, and this massive security issue was then reported to the Microsoft Security Response Center (MSRC). After that, all external access to the storage account was revoked.
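One detail worth noting: a SAS token signed with an account key cannot be revoked on its own; invalidating it requires rotating the key that signed it. The sketch below shows one way such a rotation could be performed with the azure-mgmt-storage Python SDK. It is an assumption about the general remediation technique, not a description of MSRC’s actual steps, and all identifiers are placeholders.

from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient

# Regenerating the storage account key invalidates every SAS token that
# was signed with it, cutting off external access in one step.
client = StorageManagementClient(DefaultAzureCredential(), "<subscription-id>")
client.storage_accounts.regenerate_key(
    "<resource-group>",        # placeholder resource group
    "<storage-account-name>",  # placeholder account
    {"key_name": "key1"},      # rotate the primary key
)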
Microsoft said: “Further investigation was then conducted to understand the potential impact to our customers and/or business continuity. Our investigation concluded that this exposure did not pose a risk to customers.”