Cloud security company Wiz disclosed that Microsoft's AI research division leaked 38TB of confidential data when it published an open-source AI training model to a GitHub repository in July 2020. The confidential data included passwords, private keys, and more than 30,000 Microsoft ...
While developing an open-source training model for artificial intelligence (AI), Microsoft researchers mistakenly published a URL pointing to the company's account on the object storage service Azure Blob Storage to the source-code sharing service GitHub. The exposed information included internal Microsoft ...
Microsoft indicated on Monday that it had revoked an overly permissioned Shared Access Signature (SAS) token, said to have exposed "38TB" of internal Microsoft data. The action comes after ...
Cloud security company Wiz has announced that 38TB of confidential data was leaked when Microsoft's AI research department published an open source AI learning model to a GitHub repository in July ...
Microsoft’s AI research team accidentally exposed 38 terabytes of private data through a Shared Access Signature (SAS) link it published on a GitHub repository, according to a report by Wiz research ...
The Microsoft AI research division accidentally leaked dozens of terabytes of sensitive data starting in July 2020 while contributing open-source AI learning models to a public GitHub repository.
White Hat Hackers Discover Microsoft Leak of 38TB of Internal Data Via Azure Storage. The Microsoft leak, which stemmed from AI researchers sharing open-source training data on ...
An overly permissive file-sharing link allowed public access to a massive 38TB storage bucket containing private Microsoft data, leaving a variety of development secrets — including passwords, Teams ...
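The root cause described above is a SAS token whose query string granted far broader permissions and a far longer lifetime than needed. As a minimal illustration, the sketch below audits a SAS URL using only the standard Azure SAS query parameters (`sp` for permissions, `se` for expiry); the URL, host name, and the 7-day threshold are illustrative assumptions, not values from the incident.

```python
from datetime import datetime, timedelta, timezone
from urllib.parse import urlparse, parse_qs

def audit_sas_url(url: str, max_days: int = 7) -> list[str]:
    """Flag risky settings in an Azure SAS URL.

    Inspects the standard SAS query parameters:
      sp = granted permissions (r=read, w=write, d=delete, c=create, l=list)
      se = expiry time (ISO 8601, UTC)
    Returns a list of human-readable findings; empty means nothing flagged.
    """
    params = parse_qs(urlparse(url).query)
    findings = []

    # Flag anything beyond read/list access.
    perms = params.get("sp", [""])[0]
    for flag, name in (("w", "write"), ("d", "delete"), ("c", "create")):
        if flag in perms:
            findings.append(f"grants {name} access")

    # Flag expiry dates far in the future (or effectively unbounded).
    expiry_raw = params.get("se", [""])[0]
    if expiry_raw:
        expiry = datetime.fromisoformat(expiry_raw.replace("Z", "+00:00"))
        if expiry - datetime.now(timezone.utc) > timedelta(days=max_days):
            findings.append(f"expiry {expiry_raw} is more than {max_days} days out")
    return findings

# Hypothetical URL mimicking the reported misconfiguration:
# full read/write/delete/list permissions with a decades-long expiry.
risky = "https://example.blob.core.windows.net/models?sp=rwdl&se=2051-10-01T00:00:00Z&sig=REDACTED"
for finding in audit_sas_url(risky):
    print(finding)
```

A read-only link (`sp=r`) with a short-lived `se` would produce no findings, which is the shape of token the incident response guidance points toward.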