Microsoft AI researchers expose 38TB of sensitive data via SAS tokens
2023-09-18 18:34 by Daniela
Microsoft artificial intelligence (AI) researchers accidentally exposed 38 terabytes of sensitive data, including private keys and passwords, according to Wiz.
The error occurred when a Microsoft employee published open-source training data to a company GitHub repository that provides open-source code and AI models for image recognition from Microsoft's AI research division. Users were instructed to download the data from a link that was misconfigured and instead allowed access to 38TB of internal data, including 30,000 internal Microsoft Teams messages from 359 Microsoft employees, passwords to Microsoft services, and secret keys.
Microsoft Azure Storage allows customers to share their data with others using Shared Access Signature (SAS) tokens. However, Azure provides no easy way to manage these tokens, and they are difficult to track. A Microsoft researcher accidentally published the SAS token for their Azure Storage account, which led to this huge leak. It is highly recommended to avoid using SAS tokens for external sharing, as they can easily go unnoticed and expose sensitive data.
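To see why a leaked SAS token is so dangerous, it helps to look at how one is built. The sketch below is a simplified illustration in Python using only the standard library: the token's parameters (permissions, expiry) are signed with HMAC-SHA256 using the storage account key, so anyone who obtains the token string gets exactly the access those parameters describe until it expires. The function name and the string-to-sign here are illustrative assumptions, not Azure's exact format, which includes additional fields such as service, resource type, and allowed protocol.

```python
import base64
import hashlib
import hmac
import urllib.parse
from datetime import datetime, timedelta, timezone

def make_sas_token(account_name: str, account_key_b64: str,
                   permissions: str = "r", valid_hours: int = 1) -> str:
    """Build a simplified SAS-style query string (illustrative only).

    A real Azure SAS signs a longer, precisely specified string-to-sign;
    this sketch keeps only the core idea: the parameters are bound to the
    account key via HMAC-SHA256, so the token alone grants access.
    """
    expiry = (datetime.now(timezone.utc)
              + timedelta(hours=valid_hours)).strftime("%Y-%m-%dT%H:%M:%SZ")
    # Simplified string-to-sign -- NOT Azure's exact wire format.
    string_to_sign = "\n".join([account_name, permissions, expiry])
    key = base64.b64decode(account_key_b64)
    sig = base64.b64encode(
        hmac.new(key, string_to_sign.encode("utf-8"), hashlib.sha256).digest()
    ).decode("utf-8")
    # The query string IS the credential: whoever holds it has the access.
    return urllib.parse.urlencode({"sp": permissions, "se": expiry, "sig": sig})
```

Note that nothing in the token ties it to a specific user: possession is authorization. A token minted with broad permissions and a far-future expiry, as in this incident, stays valid until the account key itself is rotated or the token is revoked server-side.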
Wiz said it shared its findings with Microsoft on June 22, and Microsoft revoked the SAS token two days later, on June 24. Microsoft said it completed its investigation into potential organizational impact on August 16.
Read more -here-