The headline is misleading. It says that Microsoft will provide the key if asked, but the linked statement to Forbes says Microsoft will provide the key if it receives a valid legal order.
These have different meanings. Microsoft is legally entitled to refuse a request from law enforcement, and subject to criminal penalties if it refuses a valid legal order.
It does illustrate a significant vulnerability in that Microsoft has access to user keys by default. The public cannot be sure that Microsoft employees or criminals are unable to access those keys.
Nah, you’re just not reading carefully. You must parse everything about this stuff carefully as the words are always crafted. It’s usually more productive to read with a goal to understand what isn’t said as opposed to what is said.
They said “legal order”, which includes a variety of things ranging from administrative subpoenas to judicial warrants. Generally they say warrant if that was used.
A “request” is “Hi Microsoft man, would you please bypass your process and give me customer data?” That doesn’t happen unless it’s for performative purposes. (Like when the FBI was crying about the San Bernardino shooter’s iPhone) Casual asks are problematic for police because it’s difficult to use that information in court.
What exactly was requested sounds fishy as the article states that Microsoft only gets 20 a year, and is responsive to 9 or fewer requests. Apple seems to get more and typically is more responsive. (https://www.apple.com/legal/transparency/us.html)
The other weird thing is that the Microsoft spokesman named in the Forbes article is an external crisis communications consultant. Why use an external guy, firewalled from the business, for what is a normal business process?
> Microsoft is legally entitled to refuse a request from law enforcement, and subject to criminal penalties if it refuses a valid legal order.
This is a problem, because Microsoft operates in a lot of jurisdictions, but one of them always wants to be the exception and claims it has jurisdiction over all the others. Not that I personally think it is wise for the other jurisdictions to trust Microsoft, but if MS wants to secure its operations in those other jurisdictions, it needs to separate itself from that outlier.
Note that they say "legal order", not, specifically, "warrant". Now remember that government agencies have internal memos instructing them that no warrants are needed for them to do things like bypass the 4th Amendment: stop citizens, detain citizens, "arrest" citizens, etc.
Exactly. The discussion should center on the fact that Microsoft's shift was a contingency, not a technical necessity. It cannot have escaped them that their design choices create a legal point of entry for data requests that they are then obligated to fulfill, which would not have been the case with proper end-to-end encryption; in that case they would have told authorities that they simply cannot fulfill these requests.
Crucially, the headline says Microsoft will provide the key if asked by the FBI, which implies a state entity with legal power that extends beyond a typical person's assumptions of "rule of law" and "due process," let alone ethics.
Yes, "asked" versus "ordered" is meaningfully misleading, especially in this context.
There is reasonable suspicion, some might argue evidence, that Microsoft voluntarily cooperated with the U.S. Intelligence Community without being compelled by a court order, the most famous instances being leaked in the Snowden disclosures.
To be fair to Microsoft, here's their updated statement (emphasis mine):
"Microsoft confirmed to Forbes that it does provide BitLocker recovery keys if it receives a valid legal order. “While key recovery offers convenience, it also carries a risk of unwanted access, so Microsoft believes customers are in the best position to decide... how to manage their keys,” said Microsoft spokesperson Charles Chamberlayne."
You’ve overly simplified the degree to which a company must accept a court order without pushback.
First, they are capable of fulfilling the request in the first place, which means their approach to encryption is inherently flawed. Second, companies can very much push back on such requests, with many examples of that working, but they need to make the attempt.
I don't think it's reasonable to expect businesses to spend money fighting court orders for customer data, especially if the orders are more or less reasonable.
They do seem to be reasonable in the case that brought about this reporting, with substantial evidence that the suspects committed fraud and that evidence is on the devices in question.
"Never" means the specifics are irrelevant; you're making the sad argument on both the worst possible case and the best one.
So why should customers entrust their data to the company? It’s a transactional relationship and the less you do the less reason someone has to pay you.
Further, our legal system is adversarial; it assumes someone is going to defend you. Without that, there's effectively zero protection for individuals.
People shouldn't entrust highly sensitive data to third parties who aren't highly motivated to protect it. That means different things in different situations, but if you're likely to be investigated by the FBI, don't give Microsoft the encryption keys to your laptop.
As many, many people have pointed out -- many people don't know that their drives are encrypted or know that these protections exist. You're also assuming that the FBI doesn't investigate just random people. "I'm not doing anything bad, why should I worry?"
You're making a lot of assumptions about how people use their computers and their understanding of their own devices, and it's banal to build an argument around what someone should or should not have done in the face of how reality actually works.
I am not assuming the FBI doesn't investigate random people. I am, however, assuming that the FBI does not randomly seize computers and obtain court orders demanding encryption keys for them from Microsoft. Unless Microsoft is lying, that happens about 20 times a year.
One of the privacy protections is simply that it's a lot of work to go through that process. The FBI wouldn't have the resources to do it to everyone it's merely curious about even if it had the authority, which it doesn't because warrants require probable cause.
I believe that it's generally acceptable that when law enforcement has probable cause for a search warrant, third parties grant them what access they reasonably can. I also believe people who actually want to protect their privacy and security should learn fundamentals like whoever has the key can unlock it and if nobody has the key, it's gone forever. If I was building a consumer product, I'd have to care quite a bit about the fact that many people won't do that, but I'm not so I don't.
Heh, I subpoenaed Microsoft once as part of some FOIA litigation I did against the White House OMB back in 2017. They, in no uncertain terms, denied it. We were seeking documentation.
I realize it's not a court order, but just want to add to the stack that there are examples of them being requested to provide something within the public's interest in a legal context (a FOIA lawsuit) where their counsel pushed back by saying no.
I would guess that the FBI never asks Microsoft for encryption keys without a valid legal order because it knows Microsoft will demand one, and because the FBI rarely has possession of suspect devices without a warrant to search for them and obtain their contents.
It could be a bigger obstacle for other agencies. CBP can hold a device carried by someone crossing the border without judicial oversight. ICE is in the midst of a hiring surge and from what I've read lately, has an abbreviated screening and training process likely not matching the rigor of the FBI. Local law enforcement agencies vary greatly.
It’s immensely misleading. At least with a valid legal order we are still living by rule of law. With the recent actions I can’t say ICE is acting by rule of law.
The broader context is that Windows defaults to making your data legally accessible through Microsoft. The entire Windows platform and OneDrive default to this insecurity.
In light of fascism coming to Democratic cities, and anyone documenting it being registered as a domestic terrorist... well, that's pretty f'n insecure by default.
That's a distinction without a difference. Microsoft should structure Windows such that they're unable to comply with such an order, however legal. There are practical cryptographic ways to do it: Microsoft just doesn't want to. Shame on them.
> The headline is misleading. It says that Microsoft will provide the key if asked, but the linked statement to Forbes says Microsoft will provide the key if it receives a valid legal order.
This is an odd thing to split hairs over, IMO. Warrants, subpoenas, or just asking nicely: whatever bar you want to set is a secondary concern. The main issue is that they can and will hand the keys to LEOs at all.
If you don’t like the behavior of a company voluntarily doing something, your problem is with that company. If you don’t like a company complying with the law, your problem is with the law. It is unreasonable to expect anyone or any company to break the law or violate a court order to protect you.
If you don’t trust the institutions issuing those court orders, that is an entirely reasonable stance but it should be addressed at its root cause using our democratic process, however rapidly eroding that process may seem to be.
The fourth amendment protects against warrantless search and seizure, it is not carte blanche to fill up your hard drive with child porn and expect Microsoft to fall on their swords to protect you.
> The fourth amendment protects against warrantless search and seizure, it is not carte blanche to fill up your hard drive with child porn and expect Microsoft to fall on their swords to protect you.
I was understanding and felt your points had validity until you threw out this gross, emotionally manipulative, horrible misrepresentation of my stance.
I appreciate the sentiment and do think most people should know not to trust Microsoft by this point, but I do think we have to be a little careful not to steer too hard into caveat emptor and forget who the perpetrators are in the first place.
I hate MS as much as anyone else, but I don't have a problem with them doing this. Legally they have to comply if they have evidence in a legal action. Maybe they are at fault for not solely relying on the TPM, or not giving users informed consent about using the cloud, but I cannot fault them for not going to battle for civil liberties when they can't even implement notepad without screwing it up.
The latter is not news; it's the way it has been for quite some time, not just for IT providers but for businesses in general.
If you are running any kind of service, you should learn how warrants work in the country you are hosting in; if your service grows, eventually you will have to comply with an order.
If you want anything else, you will have to design your system such that you can't even see the data, a la Telegram. And even then, you will get into pretty murky waters.
Microsoft is legally entitled to refuse absent a warrant, but generally all it takes is a phone call from the FBI to get big tech to cough up any authenticating info they actually have.
Beyond the crypto-architecture debate, I don't really understand how anyone could imagine a world where MS could just refuse such a request. How exactly would we draft laws to this effect: "the authorities can subpoena any piece of evidence, except when complying with such a request might break the contractual obligations of a third party towards the suspect"?
Do we really, really, fully understand the implications of allowing for private contracts that can trump criminal law?
They could just ask before uploading your encryption key to the cloud.
Instead they force people to use a Microsoft Account to set up their Windows, and they store the key without explicit consent.
That's a crypto-architecture design choice: MS opted for the user-friendly key-escrow option instead of the more secure strong local key, which requires a competent user setting a strong password, saving recovery codes, understanding the disastrous implications of a key loss, etc.
Given the abilities of the median MS client, the better choice is not obvious at all, while "protecting from a nation-state adversary" was definitely not one of the goals.
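To make the tradeoff concrete, here's a rough Python sketch of the two designs. All names are hypothetical, and the XOR wrap stands in for a real authenticated cipher such as AES-GCM or AES-KW; this is an illustration of the key-slot structure, not a usable implementation:

```python
import hashlib
import os
import secrets

def kek(secret: bytes, salt: bytes) -> bytes:
    # Stretch a secret into a 32-byte key-encryption key (KEK).
    # Real systems tune scrypt/Argon2 parameters far more carefully.
    return hashlib.scrypt(secret, salt=salt, n=2**14, r=8, p=1,
                          maxmem=2**26, dklen=32)

def wrap(disk_key: bytes, secret: bytes):
    # Illustrative wrap: XOR against a freshly derived KEK.
    # Production code would use an authenticated cipher instead.
    salt = os.urandom(16)
    return salt, bytes(a ^ b for a, b in zip(disk_key, kek(secret, salt)))

def unwrap(slot, secret: bytes) -> bytes:
    salt, blob = slot
    return bytes(a ^ b for a, b in zip(blob, kek(secret, salt)))

disk_key = secrets.token_bytes(32)  # random FDE key, fixed for the disk's life

# "Strong local key" design: the only slot is wrapped by the user's password.
# Lose the password and the disk is gone forever; no third party can help.
local_slot = wrap(disk_key, b"user password")

# Key-escrow design: a second slot wrapped under a key only the vendor holds.
# The vendor can now rescue a forgetful user -- and satisfy a legal order.
vendor_key = secrets.token_bytes(32)
escrow_slot = wrap(disk_key, vendor_key)

assert unwrap(local_slot, b"user password") == disk_key
assert unwrap(escrow_slot, vendor_key) == disk_key
```

The escrow slot is what makes both outcomes possible at once: whoever holds `vendor_key` can always recover the disk key, whether the motive is a forgotten password or a court order.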
While you're right, they also went out of their way to prevent competent users from using local accounts and/or not upload their BitLocker keys.
I could understand if the default is an online account + automatic key upload, but only if you add an opt-out option to it. It might not even be visible by default, like, idk, hide it somewhere so that you can be sure that the median MS user won't see it and won't think about it. But just fully refusing to allow your users to decide against uploading the encryption key to your servers is evil, straight up.
I really doubt those motives are "evil." They're in the business of selling and supporting an OS. Most people couldn't safeguard a 10-byte password on their own, they're not going to have a solution for saving their encryption key that keeps it safer than it'd be with Microsoft, and that goes for both criminals (or people otherwise facing law enforcement scrutiny) and normal grandmas who just want to not have all their pictures and recipes lost.
Before recently, normal people who get arrested and have their computer seized were 100% guaranteed that the cops could read their hard drive and society didn't fall apart. Today, the chances the cops can figure out how to read a given hard drive is probably a bit less. If someone needs better security against the actual government (and I'm hoping that person is a super cool brave journalist and not a terrorist), they should be handling their own encryption at the application layer and keeping their keys safe on their own, and probably using Linux.
The OOBE (out of box experience) uploads the key by default (it tells you it’s doing it, but it’s a bit challenging to figure out how to avoid it) but any other setup method specifically asks where to back up your key, and you can choose not to. The way to avoid enrollment is to enable Bitlocker later than OOBE.
I really think that enabling BitLocker with an escrowed key during OOBE is the right choice, the protection to risk balance for a “normal” user is good. Power users who are worried about government compulsion can still set up their system to be more hardened.
The last time I installed Windows, BitLocker was enabled automatically and the key was uploaded without my consent.
Yes, you can opt out of it while manually activating BitLocker, but I find it infuriating that there's no such choice in the system installation process. It's stupid that after installation a user is supposed to re-encrypt their system drive if they don't want this.
If they honestly informed customers about the tradeoff between security and convenience they'd certainly have far fewer customers. Instead they lead people to believe that they can get that convenience for free.
> tradeoff between security and convenience they'd certainly have far fewer customers
What? Most people, thinking through the tradeoff, would 100% not choose to be in charge of safeguarding their own key, because they're more worried about losing everything on their PC, than they are about going to jail. Because most people aren't planning on doing crime. Yes, I know people can be wrongly accused and stuff, but overall most people aren't thinking of that as their main worry.
If you tell people, "I'll take care of safeguarding your key for you," it sounds like you're just doing them a favor.
It would be more honest to say, "I can hold on to a copy of your key and automatically unlock your data when we think you need it opened," but that would make it too obvious that they might do so without your permission.
It makes sense if you consider the possibility of a secret deal between the government and a giant corporation. The deal is that people's data is never secure.
The alternative is just not having FDE on by default, it really isn't "require utterly clueless non-technical users to go through complicated opt-in procedure for backups to avoid losing all their data when they forget their password".
And AFAICT, they do ask, even if the flow is clearly designed to get the user to back up their keys online.
No: encryption keys should never be uploaded to someone else's computer unencrypted. The OOBE should give users a choice among: no FDE; FDE with a warning that they must not forget their password; or FDE where Microsoft has their key, can recover their disk, and can be compelled to share the key with law enforcement. By giving the user the three options with their consequences, you empower the user to address their threat model as they see fit. There is no good default choice here; the trade-offs are too varied.
Always on FDE with online backups is a perfectly reasonable default. The OOBE does offer the users the choice to not back up their key online, even if it's displayed less prominently.
>By giving the user the three options with consequences you empower the user to address their threat model how they see fit.
Making it too easy for uneducated users to make poor choices is terrible software design.
> The alternative is just not having FDE on by default
Yes, it would be. So, the current way, 99% of people benefit from knowing their data is secure when very common thefts occur, and 1% of people get the same outcome as if their disk were unencrypted: when they're arrested and their computers are seized, the cops get their crime secrets. What's wrong with that?
Disagree. If the path is shrouded behind key presses and commands which are unpublished by MS (and in some instances routes that have been closed), it may as well not exist.
I'm going to shoot you unless you say the magic word, and technically I'm not even forcing you into it; you could have said the magic word and gotten out of it!! What's the magic word? Not telling!
Anyway, Microsoft and any software developer can be compelled to do practically anything; you don't want to be blocked in some jurisdictions (much less the US), and the managers do not want to go to jail to protect a terrorist, especially if nobody is going to know that they helped.
Some even go so far as to push an update that exfiltrates data from a device (and some even do so on their own initiative).
And even if you are not legally compelled, money or influence can go a long way. For example, the fact that HTTPS communications were decipherable by the NSA for almost 20 years, or, whoops, no contract with the DoD ("not safe enough"...).
Once the data is in the hands of the intelligence services, from a procedural perspective they can choose what to do next (e.g. formalize the data collection through physical seizure of the device, or do nothing and look for a juicier target).
It's not in anyone's interest to prevent such collection agreements with governments. It's just PRISM v2.
So it seems normal that Microsoft gives up the keys, the same way Cloudflare may give up information about you and others. They don't want to have their lives ruined for you.
Microsoft killed local accounts in Windows 11 and made this the default path for users: your private encryption keys are sent to Microsoft in a form that requires no other keys to read. This is a failure, and it doesn't happen on systems like LUKS. I understand Microsoft wants to look nice and unlock disks when people forget their passwords, but doing so allows anyone to exploit this. Windows systems and data are more vulnerable because of this tradeoff they made.
> How exactly would we draft laws to this effect, "the authorities can subpoena for any piece of evidence, except when complying to such a request might break the contractual obligations of a third party towards the suspect"?
Perhaps in this case they should be required to get a warrant rather than a subpoena?
Encrypt the BL key with the user's password? I mean there are a lot of technical solutions besides "we're gonna keep the BL keys in the clear and readily available for anyone".
For something as widely adopted as Windows, the only sensible alternative is to not encrypt the disk by default.
The default behavior will never, ever be to "encrypt the disk with a key and encrypt the key with the user's password." It just doesn't work in real life: you'd have thousands of users losing access to their disks every week.
It works for macOS: the FileVault key is encrypted with the user's password. The login screen is shown early in the boot process, so that FileVault can decrypt the data and continue booting. It has worked fine for about a decade. No TPM nonsense required. IMO, a TPM-based key only makes sense for unattended systems such as servers.
While this is true, why even bother turning on encryption and making it harder on disk data recovery services in that case?
Inform, and Empower with real choices. Make it easy for end users to select an alternate key backup method. Some potential alternatives: Allow their bank to offer such a service. Allow friends and family to self host such a service. Etc.
This is a bit tricky, as it couples the user's password with the disk encryption key. If a user changes the password, they would then need to change the encryption key, or remember the previous (possibly compromised) password. A better option is to force the user to record a complex hash, but that's never going to be user-friendly for the average computer user.
Basically, we need better education about the issue, but as this is the case with almost every contentious issue in the world right now, I can't imagine this particular issue will bubble to the top of the awareness heap.
The system handles these changes for the user automatically. The disk key is encrypted with the user's password; when the user changes the password, the system completes a disk-key rollover automatically, decrypting the key with the old password and then re-encrypting it with the new one.
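That rollover can be sketched in a few lines of Python. Names are hypothetical, and the XOR wrap stands in for a real authenticated cipher; the point is only that the disk key itself never changes, so a password change touches a small wrapped blob rather than the whole disk:

```python
import hashlib
import os
import secrets

def kek(password: str, salt: bytes) -> bytes:
    # Password-based KDF; real systems tune these cost parameters carefully.
    return hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1,
                          maxmem=2**26, dklen=32)

def wrap(disk_key: bytes, password: str):
    # Illustrative wrap: XOR with a freshly derived KEK (one-time use).
    # Production code would use AES-GCM or AES-KW instead.
    salt = os.urandom(16)
    return salt, bytes(a ^ b for a, b in zip(disk_key, kek(password, salt)))

def unwrap(salt: bytes, blob: bytes, password: str) -> bytes:
    return bytes(a ^ b for a, b in zip(blob, kek(password, salt)))

def change_password(salt, blob, old_pw, new_pw):
    # The disk key never changes, so no sectors are re-encrypted:
    # only the small wrapped blob is rewritten under the new password.
    return wrap(unwrap(salt, blob, old_pw), new_pw)

disk_key = secrets.token_bytes(32)
salt, blob = wrap(disk_key, "old password")
salt, blob = change_password(salt, blob, "old password", "new password")
assert unwrap(salt, blob, "new password") == disk_key
```

Because only the wrapped blob changes, the password change is instant, and the data on disk stays encrypted under the same key throughout.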
I thought this was what happened. Clearly not :( That’s the idea with services like 1Password (which I suppose is ultimately doing the same thing) - you need both the key held on the device and the password.
I suppose this all falls apart when the PC unlock password is your MS account password, since the MS account can reset the local password. In macOS / Linux, if you reset the login password, you lose the keychain.
At this point, end-to-end encryption is a solved problem when password managers exist. Not doing it means either Microsoft doesn't care enough, or is actually interested in keeping it this way.
I wouldn't call the problem "solved" just because of password managers.
Password managers shift the paradigm and the risk factors. In terms of MFA, a password in your manager is now "something you have" rather than "something you know". The only password I know nowadays is my sign-in password that unlocks the password manager's vault. So the passwords to my bank, my health care, my video games are no longer "in my fingers" or in my head anymore, they're unknown to me!
So vault management becomes the issue rather than password management. If passwords are now "something you have" then it becomes possible to lose them. For example, if my home burns down and I show up in a public library with nothing but the clothes on my back, how do I sign into my online accounts? If the passwords were in my fingers, I could do this. But if they require my smartphone to be operational and charged and having network access, and also require passwords I don't know anymore, I'm really screwed at that library. It'd be nearly impossible for me to sign back in.
So in the days of MFA and password managers, now we need to manage the vaults, whether they're in the cloud or in local storage, and we also need to print out recovery codes on paper and store them securely somewhere physical that we can access them after a catastrophe. This is an increase in complexity.
So I contend that password managers, and their cousins the nearly-ubiquitous passkeys, are the main driving factor in people forgetting their passwords and forgetting how to sign in without relying on an app to do it for them. And that is a decrease in opsec for consumers.
Sure, that's valid; they do need to comply with legal orders. But they don't need to store BitLocker keys in the first place; they only need to turn over data they actually have.
I don't think that many people here are naive enough to believe that any business would fight the government for the sake of its customers. I think most of us are simply appalled by this blatantly malicious behavior. I'm not buying all these "but what if the user is an illiterate, senile 90-year-old with ADHD, huh?" attempts to rationalize it away. It's the equivalent of the guy who installed your door keeping a copy of your keys by unspoken default: "what if your toddler locks himself out, huh?"
I know the police can just break down my door, but that doesn't mean I should be ok with some random asshole having my keys.
This is being reported on because it seems newsworthy and a departure from the norm.
Apple also categorically says they refuse such requests.
It's a private device. With private data. Device and data owned by the owner.
Using sleight of hand and wordplay to coax a password into a shared cloud just indicates that the cloud is someone else's computer, and that you are putting the keys to your world and your data, insecurely, on someone else's computer.
Should windows users assume their computer is now a hostile and hacked device, or one that can be easily hacked and backdoored without their knowledge to their data?
The San Bernardino incident is a very different issue, where Apple refused to use its own private key to sign a tool that would have unlocked any iPhone. There is absolutely no comparison between Apple's and MS's conduct here, because the architectures of the respective systems are so different (but of course, that's a choice each company made).
Should Apple find itself with a comparable decryption key in its possession, it would have little option but to comply and hand it over.
> Apple refused to use its own private key to sign a tool that would have unlocked any iPhone.
This is a misrepresentation of what actually happened: the FBI even argued that they would accept a tool locked to the specific device in question so as to alleviate this concern.
This is still forced labor/creative work/engineering work/speech and not okay, but it was not a "master key."
Firstly, Apple does not refuse such requests. In fact, it was very widely publicized in the past couple of weeks that Apple has removed Advanced Data Protection for users in the UK. So while US users still enjoy Advanced Data Protection from Apple, UK users do not.
It is entirely possible that Apple's Advanced Data Protection feature gets legally removed in the US as well, if the regime decides to target it. I suspect one of two reasons why they have not: either the US has an additional agreement with Apple behind the scenes somewhere, or the US regime has not yet felt that this was an important enough thing to go after.
There is precedent in the removal: Apple has shown they'll do it if asked/forced. What makes you think they wouldn't do the same thing in the US if Trump threatened to ban iPhone shipments from China until Apple complied?
The options for people to manage this stuff themselves are extremely painful for the average user, for many reasons laid out in this thread. The same goes for things like PGP keys: managing the keys, uploading to key servers, using specialized mail clients, plugging in and unplugging the physical key, and handling key rotation, escrow, and revocation. Understanding the deep logic behind it requires someone with technical expertise in this particular solution to guide people. It's far beyond what the average end user is ever going to do.
That was before Tim Cook presented Donald Trump with a gold and glass plaque along with a Mac Pro.
We live in far different times these days. I have no doubt in my mind that Apple is complying 100% with every LE request coming their way (not only because of the above gesture, but because it's actually the law).
> don't really understand how could anyone imagine a world where MS could just refuse such a request
By simply not having the ability to do so.
Of course Microsoft should comply with the law, expecting anything else is ridiculous. But they themselves made sure that they had the ability to produce the requested information.
Right, Microsoft has the ability to recover the key, because average people lose their encryption keys and will blame Microsoft if they can't unlock their computers and access their files. BitLocker protects you from someone stealing your computer to get at your files; that's it. It's no good in a corporate setting or if you're worried about governments spying on you.
I'm honestly not entirely convinced that disk encryption should be enabled by default. How much of a problem were stolen personal laptops, really? Corporate machines, sure, but leave the master key with the IT department.
> Do we really, really, fully understand the implications of allowing for private contracts that can trump criminal law?
...it's not that at all. We don't want private contracts to enshrine the same imbalances of power; we want those imbalances rendered irrelevant.
We hope against hope that people who have strength, money, reputation, legal teams, etc., will be as steadfast in asserting basic rights as people who have none of those things.
We don't regard the FBI as a legitimate institution of the rule of law, but a criminal enterprise and decades-long experiment in concentration of power. The constitution does not suppose an FBI, but it does suppose that 'no warrant shall issue but upon probable cause... particularly describing the place to be searched, and the persons or things to be seized' (emphasis mine). Obviously a search of the complete digital footprint and history of a person is not 'particular' in any plain meaning of that word.
...and we just don't regard the state as having an important function in the internet age. So all of its whining and tantrums and pepper spray and prison cells are just childish clinging to a power structure that is no longer desirable.
I think legally the issue was adjudicated by analogy to a closed safe: while the exact contents of the safe are unknown beforehand, it is reasonable that it will contain evidence, documents, money, weapons, etc. that are relevant, so if a warrant can be issued in that case compelling a locksmith to open it, then by analogy it can be issued against an encrypted device.
Without doubt, this analogy surely breaks down as society changes to become more digital - what about a Google Glass type of device that records my entire life, or the glasses of all the people detected around me? What about the device where I uploaded my consciousness - can law enforcement simply probe around my mind and find direct evidence of my guilt? Any written constitution is just a snapshot of a social contract at a particular historical time and technological development point, so it cannot serve as the ultimate source of truth regarding individual rights - the contract is renegotiated constantly through political means.
My question was more general: how could we draft that new social contract for the current age? How could we maintain a balance where the encrypted device of a suspected child predator and murderer is left encrypted, despite the fact that some 3rd party has the key, because we agreed that is the correct way to balance freedoms and law enforcement? It just doesn't sound stable in a democracy, where the rules of that social contract can change; it would contradict the moral intuitions of the vast majority.
> so if a warrant can be issued in that case compelling a locksmith to open it, then by analogy it can be issued against an encrypted device.
But it isn't a warrant, it's a subpoena. Also, the locksmith isn't the one compelled to open it; if the government wants someone to do that they have to pay them.
> Any written constitution is just a snapshot of a social contract at a particular historical time and technological development point, so it cannot serve as the ultimate source of truth regarding individual rights - the contract is renegotiated constantly through political means.
The Fourth Amendment was enacted in 1791. A process to change it exists, implying that the people could change it if they wanted to, but sometimes they get it pretty right to begin with. And then who are these asshats craving access to everyone's "papers and effects" without a warrant?
Actual freedom starts with freedom of thought which requires spaces that you can truly believe are safe. The push for the surveillance world is rapidly eroding the places someone can not only be safe to think but feel safe to think in. The 'feel safe' is deeply important here. The arguments of 'if you have nothing to hide' do not make anyone feel safe, they do the opposite and they chill free thought.
The second, very clear, argument is that the state can't be trusted in the long run. Period. Maybe you love your elected officials today but tomorrow they could be actively out to harm you. Every tool we allow the state to use needs to be viewed with this level of extreme skepticism and even very clear benefits need to be debated vigorously.
Encryption, and technologies like it, may allow hiding criminal activity but they also provide people a sense of security to think freely and stave off political power grabs. We recognize the fundamental right to free speech and give great latitude to it even when it is harmful and hateful, we need to recognize the fundamental right to free thought and recognize that encryption and similar tools are critical to it.
Exactly! I agree about feeling free to think is important. I am a legal immigrant here on the green card, and I was randomly looking at my iCloud photos, and there were two of them where I was wearing a 2024 elections t-shirt of the losing side. The t-shirt was given to me as a gag gift, and I just had taken a picture of it to show it to the sender for giggles.
Now looking at this old image. I had second thoughts. What if on the border crossing some officer sees a t-shirt and doesn't agree with it? Maybe I should delete the image. And it's not the first time I want to go post something online, but I've stopped myself. What if it comes back and bites me? Even though it might be an innocuous tweet, nothing egregious, but I just don't want to engage. And this is how freedom goes. This feels as bad as it was growing up in the Soviet Union.
I’m not trying to defend Microsoft, but I think people are being a bit dramatic. It's a fairly reasonable default setting for average users who simply want their data protected from theft. On the other hand, users should be able to opt out from the outset, and above all, without having to fiddle with the manage-bde CLI or group policy settings.
With Intel Panther Lake (I'm not sure about AMD), BitLocker will be entirely hardware-accelerated using dedicated SoC engines – which is a huge improvement and addresses many commonly known Full Disk Encryption vulnerabilities. However, in my opinion some changes still need to be made, particularly for machines without hardware acceleration support:
- Let users opt out of storing recovery keys online during setup.
- Let users choose between TPM or password based FDE during setup and let them switch between those options without forcing them to deal with group policies and the CLI.
- Change the KDF to a memory-hard KDF - this is important for both password- and PIN-protected FDE. It's 2026 - we shouldn't be spamming SHA-256 anymore.
- Remove the 20-character limit from PIN protectors and make them alphanumeric by default. Windows 11 requires TPM 2.0 anyway, so there's no point in enforcing a 20-character limit.
- Enable TPM parameter encryption for the same reasons outlined above.
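On the KDF point above: a memory-hard function makes each PIN or passphrase guess expensive in RAM as well as CPU time, which is exactly what defeats GPU/ASIC brute forcing. A minimal sketch using the standard library's `hashlib.scrypt` - the parameters here are illustrative defaults, not anything BitLocker actually uses:

```python
import hashlib
import os

# Random per-volume salt; a real FDE scheme would store this in the
# volume metadata alongside the KDF parameters.
salt = os.urandom(16)

# scrypt with n=2**15, r=8 requires ~32 MiB of memory per derivation,
# which is what makes massively parallel guessing expensive.
key = hashlib.scrypt(
    b"my pre-boot passphrase",
    salt=salt,
    n=2**15, r=8, p=1,
    maxmem=2**26,  # allow up to 64 MiB
    dklen=32,      # 256-bit volume key
)
assert len(key) == 32
```

The same passphrase and salt always derive the same key, so the derived key can unwrap the actual volume encryption key at boot.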
> If you don’t think Intel put back doors into that then I fear for the future.
If that’s what you’re worried about, you shouldn’t be using computers at all. I can pretty much guarantee that Linux will adopt SoC based hardware acceleration because the benefits – both in performance and security – outweigh the theoretical risks.
When someone is arrested, the police can get a warrant to enter your house, right?
There they can collect evidence regarding the case.
Digital protections should exist, but should they exist beyond what is available in the physical world? If so, why?
I think the wording of this is far too lenient and I understand the controversy of "if asked" vs "valid legal order", neither of which strictly say "subpoena", and of course, the controversy of how laws are interpreted/ignored in one country in particularly (yes, I'm looking at you USA).
Should there be a middle ground? Or should we always consider anything that is digital off-limits?
Crazier question: what’s wrong with a well-intentioned surveillance state? Preventing crime is a noble goal, and sometimes I just don’t think some vague notion of privacy is more important than that.
I sometimes feel that the tech community would find the above opinion far more outlandish than the general population would.
tl;dw: A well-intentioned surveillance state may, in fact, love the beings they are surveilling. They may fall in love so deeply, that they want to become like us. I know it's a revolutionary concept.
I don't understand this; it's actually baffling. Why was the question asked to begin with, let alone a whole post made about it? If they have a legal request from a law enforcement agency of any country they operate in, they either comply or see executives in prison.
Is how bitlocker works not well known perhaps? I don't think it's a secret. The whole schtick is that you get to manage windows computers in a corporate fleet remotely, that includes being able to lock-out or unlock volumes. The only other way to do that would be for the person using the device to store the keys somewhere locally, but the whole point is you don't trust the people using the computers, they're employees. If they get fired, or if they lose the laptop, them being the only people who can unlock the bitlocker volume is a very bad situation. Even that aside, the logistics of people switching laptops, help desk getting a laptop and needing to access the volume and similar scenarios have to be addressed. Nothing about this and how bitlocker works is new.
Even in the safer political climates of pre-2025, you're still looking at prosecution if you resist a lawful order. You can fight gag-orders, or the legality of a request, but without a court order to countermand the feds request, you have to comply.
Microsoft would do the same in China, Europe, the Middle East, etc. The FBI isn't special.
Sure, I don't disagree, but that isn't what this discussion is about. It's about a lawful, publicized request. With Microsoft, the feds don't need any leverage: they can just use a FISA order and force them to keep it secret. Their leverage is federal prison.
If you are not typing in a passphrase or plugging in a device containing a key to unlock your disk then the secret exists somewhere else. Chances are that secret is available to others. The root issue here is that the user is not being made clearly aware of where the secret is stored and what third party(s) have access to it or reasonably might be able to get access to it.
These sorts of things should be very unsurprising to the people who depend on them...
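On Windows you can at least audit this yourself: the built-in manage-bde tool lists every key protector on a volume, and a "Numerical Password" (recovery key) protector on a machine signed into a Microsoft account has typically been escrowed to that account. A hedged sketch (elevated prompt, Windows only):

```bat
:: List all BitLocker key protectors for the C: volume
:: (TPM, Numerical Password / recovery key, etc.)
manage-bde -protectors -get C:

:: Show overall encryption status for the volume
manage-bde -status C:
```

If a recovery-password protector shows up and you never printed or saved one yourself, it is worth checking your Microsoft account's device page for an escrowed copy.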
Due to Third Party Doctrine, Microsoft doesn't even NEED a "legal order." It's merely a courtesy which they could change at any time.
Based on the sheer number of third parties we're required to use for our day to day lives, that is ridiculous and Third Party Doctrine should be eliminated.
Sure. You voluntarily use windows. You could use something else or nothing so you chose to use it. You are not compelled to use it by law. You are just strongly compelled by a small carrot and a large stick. The same applies to a smart phone BTW.
The default setting is a good mix of protecting people from the trouble they’re far more likely to run into (someone steals their laptop) while still allowing them back in if they forget their password. The previous default setting was no encryption at all which is worse in every case.
> Every bad day for microsoft is yet another glorious day for linux.
Nah. If that were the case, Linux would dominate personal computer statistics. The reality is that most mainstream users just don't care. But, of course, that won't stop us.
I would also argue that _what_ personal computing means to most people has also evolved, even with younger generations. My Gen Z nephew was flabbergasted the other day when he learned I use my Documents, Videos, Desktop folders, etc. He literally asked, "What is the Documents folder even for?". To most people, stuff is just magically somewhere (the cloud), and when they get a new machine they just expect it all to be there and work. I feel like these cryptography and legality discussions here on Hacker News always miss the mark because we overestimate how much most people care. Speaking of younger generations, I also get the feeling that for them there isn't such a thing as "digital sovereignty" or "ownership", at least not by the same definitions we Gen X and older millennials internalized.
Across the generations, there are always a few groups for whom cryptographic ownership really matters, such as journalists, protesters, and so on. Here on HN I feel like we tend to over-generalize these use cases to everybody, and then we are surprised when most people don't actually care.
As my family's tech support department, I switched them over to Linux long ago. For the last decade, my elderly parents have used Linux laptops and much preferred the stability.
As stated, you can generate backup keys, but you can also associate more than one hardware token with your account, which is what I do. I keep a separate YubiKey in a lockbox off-site as a break-glass option.
They have a recovery sheet you can print. If you lose your key, you can use the recovery information on that piece of paper to regain access. You put the recovery information in a safe place.
That is also exactly why people like myself are so against passkeys: there is no offline recovery.
End-to-end usually means only the data's owner (aka the customer) holds the keys needed. The term most used across password managers and similar tools is "zero knowledge encryption", where only you know the password to a vault, needed to decrypt it.
There's a "data encryption key", encrypted with a hash derived from your username + master password, and that data encryption key is used locally to decrypt the items in your vault. Even if everything is stored remotely, unless the provider gets your raw master password (usually, a hash of it is used as the "password" for authentication), your information is totally safe.
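The scheme described above can be sketched in a few lines. Everything here is illustrative - the names, the username-as-salt choice, and especially the XOR "wrapping" (real vaults use an AEAD such as AES-GCM and a random salt) - but it shows why the server only ever sees a wrapped blob:

```python
import hashlib
import secrets

def derive_kek(username: str, master_password: str, iterations: int = 600_000) -> bytes:
    # Derive a key-encryption-key (KEK) from the master password.
    # Using a hash of the username as salt is illustrative only.
    salt = hashlib.sha256(username.encode()).digest()
    return hashlib.pbkdf2_hmac("sha256", master_password.encode(), salt, iterations)

def xor_wrap(dek: bytes, kek: bytes) -> bytes:
    # Toy XOR wrap/unwrap (symmetric); real vaults use authenticated encryption.
    return bytes(a ^ b for a, b in zip(dek, kek))

dek = secrets.token_bytes(32)                  # random data-encryption key (DEK)
kek = derive_kek("alice@example.com", "correct horse battery staple")
wrapped = xor_wrap(dek, kek)                   # this blob is what the server stores
assert xor_wrap(wrapped, kek) == dek           # the same KEK, derived locally, unwraps it
```

The provider stores `wrapped` but never sees `kek` or the raw master password, so it cannot recover `dek` - which is the property a recovery-key escrow deliberately gives up.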
A whole other topic is communications, but we're talking decryption keys here
If tech companies implemented real, e2e encryption for all user data, there would be a huge outcry, as the most notable effect would be lots of people losing access to their data irrevocably.
I'm all for criticizing tech companies but it's pointless to demand the impossible.
Just say "we are storing your keys on our servers so you won't lose them" and follow that with either "do you trust us" or even "we will share this key with law enforcement if compelled". Would be fine. Let people make these decisions.
Besides, BitLocker keys are really quite hard to lose.
is it just me or would "Microsoft refuses to comply with a legal search warrant" be an actual, surprising news story? like of course MSFT is going to hand over to authorities whatever they ask for if there's a warrant, imagine if they didn't (hint: not good for business. their customers are governments and large institutions, a reputation for "going rogue" would damage their brand quite a bit)
For a long time, if you used full disk encryption, the encryption key never left your machine. If you forgot your password, the data was gone - tough luck, should have made a backup. That's still how it works on Linux.
Pretty surprising they'd back up the disk encryption secrets to the cloud at all, IMHO, let alone that they'd back it up in plaintext.
That's why full disk encryption was always a no-go for approximately all computer users, and recommending it to someone not highly versed in technology was borderline malicious.
"Tough luck, should have made a backup" is higher responsibility than securing anything in meatspace, including your passport or government ID. In the real world, there is always a recovery path. Security aficionados pushing non-recoverable traps on people are plain disconnected from reality.
Microsoft has the right approach here with Bitlocker defaults. It's not merely about UX - it's about not setting up traps and footguns that could easily cause harm to people.
Google Authenticator used to be disconnected from reality like this. Users were asking how to copy the codes to another phone, and they said "you can't, WAI, should add the other phone as a second auth method on every site." Like how people say you shouldn't copy SSH privkeys. I figured out an undocumented way to do it on iPhone by taking an encrypted iTunes backup though.
Eventually they yielded on this, but their later updates had other usability traps. Because Google Auth was the household name for TOTP apps, this maybe ruined TOTP's reputation early-on.
> should add the other phone as a second auth method on every site.
That's the problem right there. Migrating my phone recently (without having broken/bricked the previous one, which is somehow even worse wrt. transferring 2FA these days than getting new phone after old one breaks!), I discovered that most sites I used did not allow more than one authenticator app. If I try to add new phone as second-factor auth method, the website deletes the entry for the old phone.
I had hoped the average person would have a baseline understanding of how computers work by now. Baseline includes things like the difference between a web browser and a search engine, "the cloud" is someone else's computer, and encrypted means gone if you lose the password/key.
I am sad that this now appears unlikely. I suspect it may even be lower for people in their 20s today than a decade ago.
> Baseline includes things like the difference between a web browser and a search engine, "the cloud" is someone else's computer, and encrypted means gone if you lose the password/key.
One of these things is not like the other...
That's why I'm stressing the comparison to e.g. government documents: nothing in meatspace requires regular people to show anywhere near as much conscientiousness as handling encryption keys.
Or: many people probably know, in the abstract, that "encrypted means gone if you lose the key", much like many people know slipping up while working on a HV line will kill you. Doesn't mean we should require everyone to play with them.
> Security aficionados pushing non-recoverable traps on people are plain disconnected from reality.
To be fair, if you inadvertently get locked out of your Google account "tough luck, should have used a different provider" and Gmail is a household name so ...
Less snarky, I think that there's absolutely nothing wrong with key escrow (either as a recovery avenue or otherwise) so long as it's opt in and the tradeoffs are made abundantly clear up front. Unfortunately that doesn't seem to be the route MS went.
Google will lock you out of an account even if you remember your password. This happened to me, when Google decided to use the recovery email address for 2FA, locking me out of my primary account. And the exact same change was made to my recovery account, at the same time. As for the recovery email of my recovery emails address, it was with a company that hadn't existed for over a decade, and no longer existed.
As long as the automated flow works everything is great. But if the music stops can you get in touch with a human to fix it? That applies not just to auth but pretty much all of their stuff. Plenty of horror stories have made it to the HN front page over the years.
I've had to get in touch with a human before for account recovery, it worked. Horror stories, idk. I hear horror stories about every single business I interact with, but then don't experience it myself.
Well, for a consumer notebook or mobile device, the threat model typically envisions a thief grabbing it from a coffeehouse or hotel room. So your key needs to be safeguarded from the opportunist who possesses your hardware illegally.
Linux can be fairly well-secured against state-level threat actors, but honestly, if your adversary is your own nation-state, then no amount of security is going to protect you!
For Microsoft and the other consumer-OS vendors, it is typically a bad user-experience for any user, particularly a paying subscriber, to lose access to their account and their cloud apps. There are many ways to try and cajole the naïve user into storing their recovery key somewhere safe, but the best way is to just do it for them.
A recovery key stored in the user's own cloud account is going to be secure from the typical threats that consumers will face. I, for one, am thankful that there is peace of mind both from the on-device encryption, as well as the straightforward disaster recovery methods.
The problem is mass-surveillance and dragnets. Obviously if the state wants to go after you no laws will protect you. As we've seen they can even illegally collect evidence and then do a parallel construction to "launder" the evidence.
But OneDrive is essentially a mass-surveillance tool. It's a way to load the contents of every single person's computer into Palantir or similar tools and ask, say, "give me a list of everyone who harbors anti-ICE sentiments."
By the way my windows computer nags me incessantly about "setting up backups" with no obvious way to turn off the nags, only a "remind me later" button. I assume at some point the option to not have backups will go away.
I agree that "cloud storage" paradigms are a sea change from the status quo of the old days. My father has a file cabinet at home and keys on his keychain, wherein he stores all his important paperwork. There is no way anyone's getting in there except by entering his home and physically intruding on those drawers. Dad would at least notice the search and seizure, right?
What is just as crazy as cloud storage, is how you "go paperless" with all your service providers. Such as health care, utility bills, banks, etc. They don't print a paper statement and send it to your snail mail box anymore. They produce a PDF and store it in their cloud storage and then you need to go get it when you want/need it.
The typical consumer may never go get their paperwork from the provider's cloud. It is as if they said "Hey this document's in our warehouse! You need to drive across town, prove your identity, and look at it while you're here! ...You may not be permitted to take it with you, either!"
So I've been rather diligent and proactive about going to get my "paperless documents" from the various providers, and storing them in my own cloud storage, because, well, at least it's somewhere I can access it. I care a lot more about paying my medical bills, and accounting for my annual taxes, than someone noticing that I harbor anti-jew sentiment. I mean, I think they already figured that part out.
> But OneDrive is essentially a mass-surveillance tool.
There are plenty of people who post clear positions on multiple social networks. I personally doubt that OneDrive files will provide much more information for most people compared to what's already out there (including mobile phone location, credit card transactions, streaming-service logs, etc.).
Where I do see danger is individual abuse. If someone "in power" wants one guy to have issues, they could check his OneDrive for something.
Best is to make people aware of how it works and let them figure it out. There are so many options (local only, encrypted cloud storage, etc.) I doubt there is an ideal solution for everything.
Full-disk encryption is the opposite of pointless, my dude! The notebook-thief cannot access my data! That is the entire point!
No, I cannot recover the data from an HDD or SSD that I don't possess. But neither can the thief. The thief cannot access the keys in my cloud. Isn't that the point?
If a thief steals a notebook that isn't encrypted at all, then they can go into the storage, even forensically, and extract all my data! Nobody needs a "key" or credentials to do that! That was the status quo for decades in personal computing--and even enterprise computing. I've had "friends" give me "decommissioned" computers that still had data on their HDD from some corporation. And it would've been readable if I had tried.
The thief may have stolen a valuable piece of kit, but now all she has is hardware. Not my data. Not to mention, if your key was in a cloud backup, isn't most of your important data in the cloud, as well? Hopefully the only thing you lost with your device are the OS system files, and your documents are safely synced??
That's a reductionist view. Apple, at least, based a big portion of their image on privacy and encryption. If a company does that and is then proven otherwise, it does a tremendous damage to the brand and stock value and is something shareholders would absolutely sue the board and CEO for. Things like these happened many times in the past.
Nobody today cares about their encryption; their main sales pitch now is convenience and luxury. They still need to comply with the law, which they do, in the US or China. Nothing reductionist about stating a fact.
And I agree with that, too. This whole discussion made me realize they pivoted their PR. They probably had to because everyone wants the AI and there's no AI with privacy, at least not with the current processing power of portable devices.
A Proton-like model makes this very simple: full cooperation and handover, and virtually nothing to be extracted from the data. Size is one piece of metadata; IP connection points and maybe the date of first use and when data changes occurred are others...
I'm all for law enforcement, but that job has to be old-school Proof of Work bound and not using blanket data collection and automated speeding ticket mailer.
But I guess it's not done more because the free data can't be analyzed and sold.
It's already established that your disk encryption keys are in the Microsoft cloud whether you want them there or not. It's just a small step from there to your local government having the key too. Some governments claim to respect the privacy of their citizens, but there are always exceptions. Most governments likely have direct access to the keys, and don't even need to make the request.
No surprises here. There were people out there warning this would happen sooner or later and urging people to stop using Microsoft products, but of course, as usual, nobody cared.
The origin of this is a Forbes article[0] where the quote is: "Microsoft confirmed to Forbes that it does provide BitLocker recovery keys if it receives a valid legal order."
Not surprising. The whole Win11 feels like a spy-tool for the government. Just that "recall" anti-feature nobody needs - except for those who want to sniff and spy after people.
The headline is slightly misleading. Microsoft can only provide the key if you are using a Microsoft Account which automatically escrows the BitLocker recovery key to OneDrive.
If you use a Local Account (which requires bypassing the OOBE internet check during setup) or explicitly disable key backup, the key never leaves your machine. The issue isn't the encryption algorithm; it's the convenience defaults.
If you have advanced data protection enabled, Apple claims:
“No one else can access your end-to-end encrypted data — not even Apple — and this data remains secure even in the case of a data breach in the cloud.”
I don't know if the problem is on my end, but your link goes to a 20-page document. If this is not a mistake, you should quote the actual section and text you are referring to.
> For users that have enabled Advanced Data Protection, iCloud stores content for email, contacts, and calendars that the customer has elected to maintain in the account while the customer’s account remains active. This data may be provided, as it exists in the customer’s account, in response to a search warrant issued upon a showing of probable cause, or customer consent.
> Apple does not receive or retain encryption keys for customer’s end-to-end encrypted data. Advanced Data Protection uses end-to-end encryption, and Apple cannot decrypt certain iCloud content, including Photos, iCloud Drive, Backup, Notes, and Safari Bookmarks
>>Do you think Tim Cook gave that gold bar to Trump for nothing?
Not in the US - thanks for this hint: I googled it! Wow, they both do bribery (offering and accepting) in front of a recording camera in a government building!
Yes, I know this sounds conspiratorial, but I think the whole Liquid Ass thing was a rush to put some other software in Apple products to appease the Trump admin.
For example, it is new in Tahoe that they store your FileVault encryption key in your iCloud Keychain without telling you.
But iCloud Keychain is end-to-end encrypted using device-specific keys, so Apple cannot read items in your iCloud Keychain (modulo adding their own key as a device key, rolling out a backdoor, etc. but that applies to all proprietary software).
My conspiracy theory about Liquid Ass is that their hardware for the past 5 years was so good that they needed to make people finally upgrade. My Air M1 16GB worked absolutely fine until it slowed down immensely on macOS 26.
Last time I onboarded a Mac (a few months ago), it would very explicitly ask if you want to enable support for remote FileVault unlocking.
That said, they could also roll out a small patch to a specific device to extract the keys. When you really want to be safe (and since you can be a called a 'left extremist' for moving your car out of the way, that now includes a lot of people), probably use Linux with LUKS.
Sure, but not every company makes it as difficult as possible to set up a new encrypted computer without uploading a copy of your encryption key to their servers.
Except on Apple you're not coerced (near enough forced?) into using an account password managed by the vendor. Until MS themselves publish, for home users, how to set up without an MS account, I'm considering it forced.
iCloud login is still optional on macOS. Can't download stuff from the App Store and I think some continuity things require iCloud, but otherwise pretty solid.
This issue aside, if anyone else has the keys, what value are they in the end? Has Microsoft ever refused to unlock someone's PC, stating that they could not technically do so? Isn't storing keys like this akin to storing passwords in clear text?
it is perhaps mildly surprising that they have access to user encryption keys, but anyone surprised, over 20 years post-Patriot Act, that an American corporation is willing to cooperate with American federal law enforcement has maybe not been paying attention.
The major OS vendors (Apple, Google, MS) are complicit in data turnover and have been for over ten years now. It has been reported multiple times, so I'm struggling to see the angle being projected here. This feels like click harvesting aimed at the HN "Microsoft bad" crowd.
The segment of the population that is the target of political vindictiveness from the FBI seems to have changed somewhat with this administration so it makes sense to remind people of the vulnerabilities from time to time.
This was a decade ago, before the big tech went to brown nose Trump on live TV. We live in different reality nowadays. Apple doesn't even market their encryption and safety anymore, like they did on massive billboards all over the world.
Sure, but these are all mere statements. You don't know if they fully back that until there's a public standoff with law enforcement/administration and there weren't any in recent years. Yet at the same time it's hard to believe there were no attempts from that government to decrypt some devices they needed. So the fact we hear nothing about it is also an information to me. Sure, this is all speculation, but all things considered...
Besides, they fully comply with Chinese requirements, so...
iCloud Keychain is end-to-end encrypted, even without the Advanced Data Protection setting. https://support.apple.com/en-us/102651 Not something they can turn over to the feds.
And if you don't want iCloud Keychain, you are still given the choice to encrypt and print the backup key.
They fully comply with Chinese requirements if you subscribe to iCloud in China, and they do this quite transparently. They do not, notably, say they don't share anything with China and then go ahead and do it anyway.
Unless Apple is straight up lying about their technology and encryption methods used to secure iCloud and their hardware, the issue of a public standoff is moot, because Apple couldn't help them if they wanted to. And while perhaps it's possible that Apple would lie to consumers to please US law enforcement, it's a bit of a stretch to say that because there haven't been any high-profile cases where law enforcement tries to force Apple to give up what they don't have, that this must be evidence that they're in cahoots.
Apple has since confirmed in a statement provided to Ars that the US federal government “prohibited” the company “from sharing any information,” but now that Wyden has outed the feds, Apple has updated its transparency reporting and will “detail these kinds of requests” in a separate section on push notifications in its next report.
Who knows what else they're hiding, if we only found out about this scheme in 2023.
Technically it is possible to configure BitLocker using a passphrase instead of a TPM. It is not easy though; it is configured via GPO. However, it is not the local account password - it is a separate passphrase which you need to provide early in the boot process, similar to LUKS on Linux systems. It works on Windows computers without a TPM; I'm not sure whether it's supported on systems that actually have a TPM available.
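For reference, the PowerShell route is somewhat less painful than raw manage-bde. A hedged sketch - for OS volumes this still requires the "Require additional authentication at startup" group policy with "Allow BitLocker without a compatible TPM" enabled first:

```powershell
# Hedged sketch: encrypt C: with a pre-boot password protector instead of the TPM.
# Only works on the OS volume once the GPO mentioned above is enabled.
$pw = Read-Host -AsSecureString "Pre-boot passphrase"
Enable-BitLocker -MountPoint "C:" -PasswordProtector -Password $pw
```

Note that this password protector is separate from any Windows account password, matching the LUKS-style early-boot prompt described above.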
I do find it quite interesting how people support this idea (because they got a warrant), but are vehemently against the idea of backdooring encryption.
Everybody should have access to your hard drive, not just the FBI, so please do not encrypt your hard-drive.
If you encrypt your drive and upload the key to Microsoft, you are engaging in anti-competitive behavior since you give them access to your data, but not also to the local thief.
Just don't encrypt your drive if you can't be bothered to secure your key. Encryption neutrality.
Why does Microsoft store users' encryption keys on its servers? Key recovery is convenient, but in my opinion there should be an opt-out option, so that MS is not involved in storing the key in its datacenters at all.
Title should read "Microsoft confirms it will give the FBI your Windows PC data encryption key if court-ordered to do so".
Just because the article is click bait doesn't mean the HN entry needs to be, too.
Sure, the fact that MS has your keys at all is no less problematic for it, but the article clearly explains that MS will do this if legally ordered to do so. Not "when the FBI asks for it".
Which is how things work: when the courts order you to do something, you either do that thing, or you are yourself violating the law.
The problem is not that they will give up the key (the government can force them; this is expected), but that they even have the key in the first place. I bet this is done without proper consent, or with a choice like "yes" vs. "maybe later"...
Apple will do this too. Your laptop encryption key is stored in your keychain (without telling you!). All that's needed is a warrant for your iCloud account and they also have access to your laptop.
It's most software. Cryptography is user-unfriendly. The mechanisms used to make it user friendly sacrifice security.
There's a saying that goes "not your keys not your crypto" but this really extends to everything. If you don't control the keys something else does behind the scenes. A six digit PIN you use to unlock your phone or messaging app doesn't have enough entropy to be secure, even to derive a key-encryption-key.
If you pass a four digit PIN through a KDF with a hardness of ~5 seconds to derive a key, then you can brute force all 10,000 possible PINs in ~14 hours. After ~7 hours you would have a 50% chance of guessing correctly. A six digit PIN would take significantly longer, but most software uses a hardness nowhere near 5 seconds.
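The arithmetic above can be sketched in a few lines. This is only a back-of-the-envelope estimator; the 5-second hardness figure comes from the comment itself, and real KDF costs vary widely by hardware and configuration.

```python
def brute_force_hours(pin_digits: int, seconds_per_guess: float) -> tuple[float, float]:
    """Return (worst-case hours, 50%-chance hours) to brute-force a numeric PIN
    when each guess costs `seconds_per_guess` of KDF work."""
    keyspace = 10 ** pin_digits
    worst = keyspace * seconds_per_guess / 3600  # try every PIN
    return worst, worst / 2                      # expected time to a 50% hit

worst, median = brute_force_hours(4, 5.0)
# 10,000 PINs * 5 s = 50,000 s: roughly 13.9 hours worst case, ~6.9 hours median
print(f"4-digit PIN: {worst:.1f} h worst case, {median:.1f} h median")
```

A six digit PIN multiplies both figures by 100, which is why entropy, not just KDF hardness, matters.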
> A six digit PIN you use to unlock your phone or messaging app doesn't have enough entropy to be secure
The PIN is not usually used for cryptography, it's used to authorize the TEE (secure enclave) to do it for you. It's usually difficult or impractical to get the keys from the TEE.
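The TEE gating described above can be modeled as a toy class. This is purely illustrative (the escalation policy and attempt limit are made up, not any vendor's actual behavior): the secret never leaves the object, and each failure escalates a mandatory delay, so even a low-entropy PIN cannot be brute-forced quickly from outside.

```python
class ThrottledPinVerifier:
    """Toy model of TEE-style PIN gating (hypothetical policy, for illustration)."""

    def __init__(self, pin: str, max_attempts: int = 10):
        self._pin = pin
        self._failures = 0
        self._max_attempts = max_attempts

    def required_delay(self) -> float:
        """Seconds the caller must wait before the next attempt (doubles per failure)."""
        return 0.0 if self._failures == 0 else float(2 ** (self._failures - 1))

    def verify(self, guess: str) -> bool:
        if self._failures >= self._max_attempts:
            # Real enclaves may destroy the wrapped key material at this point
            raise RuntimeError("attempt limit reached: key material destroyed")
        if guess == self._pin:
            self._failures = 0
            return True
        self._failures += 1
        return False
```

The point is that the attacker's cost is set by the enclave's policy, not by the PIN's entropy, which is why extracting keys from the TEE itself is the hard part.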
Take it a step further, even - "End-to-End-Encryption" is complete security theater if the user doesn't control either end.
We joke and say that maybe Microsoft could engineer a safer architecture, but they can also ship an OTA update changing the code ad-hoc. If the FBI demands cooperation from Microsoft, can they really afford to say "no" to the feds? The architecture was busted from the ground-up for the sort of cryptographic expectations most people have.
You can (and should) watch all of https://www.youtube.com/watch?v=BLGFriOKz6U&t=1993s for the details about how iCloud is protected by HSMs and rate limits to understand why you’re wrong, but especially the time-linked section… instead of spreading FUD about something you know nothing about.
Which is really galling when you consider how many Windows 11 users have inadvertently been locked out of their own bought-and-paid-for computers thanks to BitLocker.
I think you need to rethink your position.
> Microsoft confirmed to Forbes that it does provide BitLocker recovery keys if it receives a valid legal order.
I suspect the FBI part was added editorially since this specific legal order came from the FBI.
There is reasonable suspicion, some might argue evidence, that Microsoft voluntarily cooperated with U.S. Intelligence Community without being compelled by a court order, the most famous instances being leaked in the Snowden disclosures.
To be fair to Microsoft, here's their updated statement (emphasis mine):
"Microsoft confirmed to Forbes that it does provide BitLocker recovery keys if it receives a valid legal order. “While key recovery offers convenience, it also carries a risk of unwanted access, so Microsoft believes customers are in the best position to decide... how to manage their keys,” said Microsoft spokesperson Charles Chamberlayne."
First, they are capable of fulfilling the request in the first place, which means their approach to encryption is inherently flawed. Second, companies can very much push back on such requests, with many examples of that working, but they need to make the attempt.
They do seem to be reasonable in the case that brought about this reporting, with substantial evidence that the suspects committed fraud and that evidence is on the devices in question.
So why should customers entrust their data to the company? It’s a transactional relationship and the less you do the less reason someone has to pay you.
Further, our legal system is adversarial; it assumes someone is going to defend you. Without that there's effectively zero protection for individuals.
You're making a lot of assumptions about how people use their computers, their understanding of their own devices, and the banality of building argumentation around what someone should have done or should not have done in the face of how reality works.
One of the privacy protections is simply that it's a lot of work to go through that process. The FBI wouldn't have the resources to do it to everyone it's merely curious about even if it had the authority, which it doesn't because warrants require probable cause.
I believe that it's generally acceptable that when law enforcement has probable cause for a search warrant, third parties grant them what access they reasonably can. I also believe people who actually want to protect their privacy and security should learn fundamentals like whoever has the key can unlock it and if nobody has the key, it's gone forever. If I was building a consumer product, I'd have to care quite a bit about the fact that many people won't do that, but I'm not so I don't.
I realize it's not a court order, but just want to add to the stack that there are examples of them being requested to provide something within the public's interest in a legal context (a FOIA lawsuit) where their counsel pushed back by saying no.
It could be a bigger obstacle for other agencies. CBP can hold a device carried by someone crossing the border without judicial oversight. ICE is in the midst of a hiring surge and from what I've read lately, has an abbreviated screening and training process likely not matching the rigor of the FBI. Local law enforcement agencies vary greatly.
I keep seeing mentions in the news of FBI agents resigning suddenly.
Having said that I won’t go back to Windows.
In light of fascism coming to Democratic cities, and anyone documenting it becoming a registered domestic terrorist... well, that's pretty f'n insecure by default.
This is an odd thing to split hairs over, IMO. Warrants or subpoenas or just asking nicely, whatever bar you want to set, is a secondary concern. The main issue is that they can and will hand the keys to LEOs at all.
If you don’t trust the institutions issuing those court orders, that is an entirely reasonable stance but it should be addressed at its root cause using our democratic process, however rapidly eroding that process may seem to be.
The fourth amendment protects against warrantless search and seizure, it is not carte blanche to fill up your hard drive with child porn and expect Microsoft to fall on their swords to protect you.
I was understanding and felt your points had validity until you threw out this gross, emotionally manipulative, horrible misrepresentation of my stance.
If you are running any kind of service, you should learn how warrants work in the country you are hosting in; if your service grows, eventually you will have to comply with an order.
If you want anything else, you will have to design your system such that you can't even see the data, a la Telegram. And even then, you will get into pretty murky waters.
Do we really, really, fully understand the implications of allowing for private contracts that can trump criminal law?
Given the abilities of the median MS client, the better choice is not obvious at all, while "protecting from a nation-state adversary" was definitely not one of the goals.
I could understand if the default is an online account + automatic key upload, but only if you add an opt-out option to it. It might not even be visible by default, like, idk, hide it somewhere so that you can be sure that the median MS user won't see it and won't think about it. But just fully refusing to allow your users to decide against uploading the encryption key to your servers is evil, straight up.
Before recently, normal people who get arrested and have their computer seized were 100% guaranteed that the cops could read their hard drive and society didn't fall apart. Today, the chances the cops can figure out how to read a given hard drive is probably a bit less. If someone needs better security against the actual government (and I'm hoping that person is a super cool brave journalist and not a terrorist), they should be handling their own encryption at the application layer and keeping their keys safe on their own, and probably using Linux.
I really think that enabling BitLocker with an escrowed key during OOBE is the right choice, the protection to risk balance for a “normal” user is good. Power users who are worried about government compulsion can still set up their system to be more hardened.
Yes, you can opt out of it while manually activating BitLocker, but I find it infuriating that there's no such choice during the system installation process. It's stupid that after installation a user is supposed to re-encrypt their system drive if they don't want this.
If they honestly informed customers about the tradeoff between security and convenience they'd certainly have far fewer customers. Instead they lead people to believe that they can get that convenience for free.
The obvious better choice is transparency.
What? Most people, thinking through the tradeoff, would 100% not choose to be in charge of safeguarding their own key, because they're more worried about losing everything on their PC, than they are about going to jail. Because most people aren't planning on doing crime. Yes, I know people can be wrongly accused and stuff, but overall most people aren't thinking of that as their main worry.
If you tell people, "I'll take care of safeguarding your key for you," it sounds like you're just doing them a favor.
It would be more honest to say, "I can hold on to a copy of your key and automatically unlock your data when we think you need it opened," but that would make it too obvious that they might do so without your permission.
The MSFT marketing folks seem to have opted for the less transparent one, just in case.
Protecting from specifically the nation state that hosts and regulates Microsoft and its biggest clients, probably not.
This story is just yet another confirmation of what used to be the "the americans have bugged most computers in the world" conspiracy theory.
I hope Microsoft wakes up to the changes in the way America is being viewed these days, because they stand to lose a lot of business if they don't.
It's a nightmare actually.
And AFAICT, they do ask, even if the flow is clearly designed to get the user to back up their keys online.
Of course this feature comes at the cost of no longer being able to have low level control over your device, but this isn't a binary choice.
Yes, phones just try to back up all of your data online.
>By giving the user the three options with consequences you empower the user to address their threat model how they see fit.
Making it too easy for uneducated users to make poor choices is terrible software design.
yes, it would be. So, the current way, 99% of people are benefitting from knowing their data is secure when very common thefts occur, and 1% of people have the same outcome as if their disk was unencrypted: When they're arrested and their computers seized, the cops have their crime secrets. What's wrong?
That defies the definition of "forced". Forced means no option. You can disagree all you want -- but at a technical level, you're incorrect.
Some even go that far that they push an update that exfiltrates data from a device (and some even do on their own initiative).
And even if you are not legally compelled. Money or influence can go a long way. For example, the fact that HTTPS communications were decipherable by the NSA for almost 20 years, or, whoops, no contract with DoD ("not safe enough"...)
Once the data is in the hands of the intelligence services, from a procedure perspective they can choose what to do next (e.g. to officialize this data collection through physical collection of the device, or do nothing and try to find a more juicy target).
It's not in the interest of anyone to prevent such collection agreement with governments. It's just Prism v2.
So it seems normal that Microsoft gives up the keys, the same as Cloudflare may give information about you and others. They don't want to have their lives ruined for you.
Perhaps in this case they should be required to get a warrant rather than a subpoena?
The default behavior will never ever be to "encrypt the disk with a key and encrypt that key with the user's password." It just doesn't work in real life. You'll have thousands of users who lose access to their disks every week.
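For reference, the scheme being dismissed is the standard DEK/KEK ("key wrapping") pattern. Here is a minimal stdlib-only sketch; real systems (BitLocker, LUKS) use an authenticated cipher such as AES key wrap for the wrapping step, and the XOR below stands in for that only to keep the example dependency-free.

```python
import os
import hashlib

def wrap_dek(dek: bytes, password: str, salt: bytes) -> bytes:
    # Derive a key-encryption key (KEK) from the password with a memory-hard KDF
    kek = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1,
                         dklen=len(dek))
    # Illustration only: real implementations use AES-KW/AES-GCM here, not XOR
    return bytes(a ^ b for a, b in zip(dek, kek))

def unwrap_dek(wrapped: bytes, password: str, salt: bytes) -> bytes:
    return wrap_dek(wrapped, password, salt)  # XOR is its own inverse

salt = os.urandom(16)
dek = os.urandom(32)                       # the random key that actually encrypts the disk
wrapped = wrap_dek(dek, "hunter2", salt)   # only this wrapped blob is stored
assert unwrap_dek(wrapped, "hunter2", salt) == dek
```

One upside the comment glosses over: because only the wrapped blob depends on the password, changing the password means re-wrapping 32 bytes, not re-encrypting the disk. The downside is exactly as stated: lose the password, lose the disk, unless a recovery key is escrowed somewhere.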
Inform, and Empower with real choices. Make it easy for end users to select an alternate key backup method. Some potential alternatives: Allow their bank to offer such a service. Allow friends and family to self host such a service. Etc.
Basically, we need better education about the issue, but as this is the case with almost every contentious issue in the world right now, I can't imagine this particular issue will bubble to the top of the awareness heap.
I suppose this all falls apart when the PC unlock password is your MS account password, since the MS account can reset the local password. On macOS / Linux, if you reset the login password, you lose the keychain.
Password managers shift the paradigm and the risk factors. In terms of MFA, a password in your manager is now "something you have" rather than "something you know". The only password I know nowadays is my sign-in password that unlocks the password manager's vault. So the passwords to my bank, my health care, my video games are no longer "in my fingers" or in my head anymore, they're unknown to me!
So vault management becomes the issue rather than password management. If passwords are now "something you have" then it becomes possible to lose them. For example, if my home burns down and I show up in a public library with nothing but the clothes on my back, how do I sign into my online accounts? If the passwords were in my fingers, I could do this. But if they require my smartphone to be operational and charged and having network access, and also require passwords I don't know anymore, I'm really screwed at that library. It'd be nearly impossible for me to sign back in.
So in the days of MFA and password managers, now we need to manage the vaults, whether they're in the cloud or in local storage, and we also need to print out recovery codes on paper and store them securely somewhere physical that we can access them after a catastrophe. This is an increase in complexity.
So I contend that password managers, and their cousins the nearly-ubiquitous passkeys, are the main driving factor in people's forgetting their passwords and forgetting how to sign-in now, without relying on an app to do it for them. And that is a decrease in opsec for consumers.
(Separately, if you can get access to a computer I'm sure you can get access to a phone charger.)
I know the police can just break down my door, but that doesn't mean I should be ok with some random asshole having my keys.
This is being reported on because it seems newsworthy and a departure from the norm.
Apple also categorically says they refuse such requests.
It's a private device. With private data. Device and data owned by the owner.
Using sleight of hand and words to coax a password into a shared cloud and beyond just seems to indicate the cloud is someone else's computer, and you are putting the keys to your world and your data insecurely in someone else's computer.
Should windows users assume their computer is now a hostile and hacked device, or one that can be easily hacked and backdoored without their knowledge to their data?
Should Apple find itself with a comparable decryption key in its possession, it would have little option but to comply and hand it over.
This is a misrepresentation of what actually happened: the FBI even argued that they would accept a tool locked to the specific device in question so as to alleviate this concern.
This is still forced labor/creative work/engineering work/speech and not okay, but it was not a "master key."
It is entirely possible that Apple's Advanced Data Protection feature is removed legally by the US as well, if the regime decides they want to target it. I suspect there are two possible reasons why they have not: either the US has an additional agreement with Apple behind the scenes somewhere, or the US regime has not yet felt that this was an important enough thing to go after.
There is precedent in the removal, Apple has shown they'll do the removal if asked/forced. What makes you think they wouldn't do the same thing in the US if Trump threatened to ban iPhone shipments from China until Apple complied?
The options for people to manage this stuff themselves are extremely painful for the average user for many reasons laid out in this thread. But the same goes for things like PGP keys. Managing PGP keys, uploading to key servers, using specialized mail clients, plugging in and unplugging the physical key, managing key rotation, key escrow, and key revocation. And understanding the deep logic behind it actually requires a person with technical expertise in this particular solution to guide people. It's far beyond what the average end user is ever going to do.
We live in far different times these days. I have no doubt in my mind that Apple is complying 100% with every LE request coming their way (not only because of the above gesture, but because it's actually the law)
American presidents are not dictators. The system has checks and balances and the courts decide. It doesn’t matter who the president is.
By simply not having the ability to do so.
Of course Microsoft should comply with the law, expecting anything else is ridiculous. But they themselves made sure that they had the ability to produce the requested information.
I'm honestly not entirely convinced that disk encryption should be enabled by default. How much of a problem were stolen personal laptops, really? For corporate machines, sure, but leave the master key with the IT department.
...it's not that at all. We don't want private contracts to enshrine the same imbalances of power; we want those imbalances rendered irrelevant.
We hope against hope that people who have strength, money, reputation, legal teams, etc., will be as steadfast in asserting basic rights as people who have none of those things.
We don't regard the FBI as a legitimate institution of the rule of law, but a criminal enterprise and decades-long experiment in concentration of power. The constitution does not suppose an FBI, but it does suppose that 'no warrant shall issue but upon probable cause... particularly describing the place to be searched, and the persons or things to be seized' (emphasis mine). Obviously a search of the complete digital footprint and history of a person is not 'particular' in any plain meaning of that word.
...and we just don't regard the state as having an important function in the internet age. So all of its whining and tantrums and pepper spray and prison cells are just childish clinging to a power structure that is no longer desirable.
Without doubt, this analogy surely breaks down as society changes to become more digital - what about a Google Glass type of device that records my entire life, or the glasses of all people detected around me? what about the device where I uploaded my conscience, can law enforcement simply probe around my mind and find direct evidence of my guilt? Any written constitution is just a snapshot of a social contract at a particular historical time and technological development point, so it cannot serve as the ultimate source of truth regarding individual rights - the contract is renegotiated constantly through political means.
My question was more general: how could we draft that new social contract to the current age, how could we maintain the balance where the encrypted device of a suspected child predator and murderer is left encrypted, despite the fact that some 3rd party has the key, because we agreed that is the correct way to balance freedoms and law enforcement? It just doesn't sound stable in a democracy, where the rules of that social contract can change, it would contradict the moral intuitions of the vast majority.
But it isn't a warrant, it's a subpoena. Also, the locksmith isn't the one compelled to open it; if the government wants someone to do that they have to pay them.
> Any written constitution is just a snapshot of a social contract at a particular historical time and technological development point, so it cannot serve as the ultimate source of truth regarding individual rights - the contract is renegotiated constantly through political means.
The Fourth Amendment was enacted in 1791. A process to change it exists, implying that the people could change it if they wanted to, but sometimes they get it pretty right to begin with. And then who are these asshats craving access to everyone's "papers and effects" without a warrant?
The second, very clear, argument is that the state can't be trusted in the long run. Period. Maybe you love your elected officials today but tomorrow they could be actively out to harm you. Every tool we allow the state to use needs to be viewed with this level of extreme skepticism and even very clear benefits need to be debated vigorously.
Encryption, and technologies like it, may allow hiding criminal activity but they also provide people a sense of security to think freely and stave off political power grabs. We recognize the fundamental right to free speech and give great latitude to it even when it is harmful and hateful, we need to recognize the fundamental right to free thought and recognize that encryption and similar tools are critical to it.
With Intel Panther Lake (I'm not sure about AMD), Bitlocker will be entirely hardware-accelerated using dedicated SoC engines – which is a huge improvement and addresses many commonly known Full Disk Encryption vulnerabilities. However, in my opinion some changes still need to be made, particularly for machines without hardware acceleration support:
- Let users opt out of storing recovery keys online during setup.
- Let users choose between TPM or password based FDE during setup and let them switch between those options without forcing them to deal with group policies and the CLI.
- Change the KDF to a memory-hard KDF - this is important for both password and PIN protected FDE. It's 2026 - we shouldn't be spamming SHA256 anymore.
- Remove the 20-char limit from PIN protectors and make them alphanumeric by default. Windows 11 requires TPM 2.0 anyway, so there's no point in enforcing a 20-char limit.
- Enable TPM parameter encryption for the same reasons outlined above.
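On the KDF point in the list above, the contrast is easy to demonstrate with the stdlib. Parameters below are illustrative examples, not BitLocker's actual settings.

```python
import hashlib

pw, salt = b"correct horse", b"\x00" * 16

# Iterated SHA-256 (PBKDF2): raising the iteration count adds time but
# stays cheap to parallelize on GPUs/ASICs
k1 = hashlib.pbkdf2_hmac("sha256", pw, salt, iterations=600_000, dklen=32)

# Memory-hard KDF (scrypt): n=2**15, r=8 forces roughly 32 MiB of RAM per
# guess, which is what actually hurts massively parallel cracking hardware
k2 = hashlib.scrypt(pw, salt=salt, n=2**15, r=8, p=1,
                    maxmem=64 * 1024 ** 2, dklen=32)

print(len(k1), len(k2))  # both yield 32-byte keys; the cost profile is what differs
```

Argon2id would be the more modern pick, but it isn't in the standard library, so scrypt makes the cleaner self-contained illustration.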
If that’s what you’re worried about, you shouldn’t be using computers at all. I can pretty much guarantee that Linux will adopt SoC based hardware acceleration because the benefits – both in performance and security – outweigh the theoretical risks.
Bryan Cantrill is trying to end this nonsense, but we shall see whether he ends up being a lone voice or not.
And if it's not there, a patch is pretty easy to write.
It's not like there's no source code ;)
When someone is arrested, the police can get a subpoena to enter your house, right?
There they can collect evidence regarding the case.
Digital protections should exist, but should they exist beyond what is available in the physical world? If so, why?
I think the wording of this is far too lenient, and I understand the controversy of "if asked" vs. "valid legal order", neither of which strictly says "subpoena", and of course the controversy of how laws are interpreted or ignored in one country in particular (yes, I'm looking at you, USA).
Should there be a middle ground? Or should we always consider anything that is digital off-limits?
That's a warrant. A subpoena is an order to appear in court.
Crazier question: what’s wrong with a well-intentioned surveillance state? Preventing crime is a noble goal, and sometimes I just don’t think some vague notion of privacy is more important than that.
I sometimes feel that the tech community would find the above opinion far more outlandish than the general population would.
https://en.wikipedia.org/wiki/Wings_of_Desire
tl;dw: A well-intentioned surveillance state may, in fact, love the beings they are surveilling. They may fall in love so deeply, that they want to become like us. I know it's a revolutionary concept.
Is how BitLocker works not well known, perhaps? I don't think it's a secret. The whole schtick is that you get to manage Windows computers in a corporate fleet remotely, and that includes being able to lock out or unlock volumes. The only other way to do that would be for the person using the device to store the keys somewhere locally, but the whole point is you don't trust the people using the computers; they're employees. If they get fired, or if they lose the laptop, them being the only people who can unlock the BitLocker volume is a very bad situation. Even that aside, the logistics of people switching laptops, help desk getting a laptop and needing to access the volume, and similar scenarios have to be addressed. Nothing about this and how BitLocker works is new.
Even in the safer political climates of pre-2025, you're still looking at prosecution if you resist a lawful order. You can fight gag orders, or the legality of a request, but without a court order to countermand the feds' request, you have to comply.
Microsoft would do the same in China, Europe, the Middle East, etc. The FBI isn't special.
One would presume US agencies have leverage to access global data.
These sorts of things should be very unsurprising to the people who depend on them...
Based on the sheer number of third parties we're required to use for our day to day lives, that is ridiculous and Third Party Doctrine should be eliminated.
Ref: https://en.wikipedia.org/wiki/Third-party_doctrine
Is it the case with BitLocker? The voluntary part.
Article and facts are “…if served with a valid legal order compelling it”
∴ Headline is clickbait.
I’d much rather they require a warrant than just give it to any enforcement agency that sends them an email asking. The former is what I expect.
The default setting is a good mix of protecting people from the trouble they’re far more likely to run into (someone steals their laptop) while still allowing them back in if they forget their password. The previous default setting was no encryption at all which is worse in every case.
There were questions about their motivation at the time. There still are questions.
https://ubuntu.com/download/desktop
https://archlinux.org/
https://www.kali.org/get-kali/#kali-platforms
https://fedoraproject.org/
Every bad day for microsoft is yet another glorious day for linux.
Nah. If that were the case, Linux would dominate personal computer statistics. The reality is that most mainstream users just don't care. But, of course, that won't stop us.
Across the generations, there are always a few groups for whom cryptographic ownership really matters, such as journalists, protesters, and so on. Here on HN I feel like we tend to over-generalize these use cases to everybody, and then we are surprised when most people don't actually care.
http://www.slackware.com/
http://slackware.osuosl.org/slackware64-current/ChangeLog.tx...
Given that the US government is happy to execute US citizens and invade other countries, that basically means everyone.
That is also exactly why people like myself are so against passkeys: there is no offline recovery.
Who holds/controls the keys on both ends?
There's a "data encryption key", encrypted with a hash derived of your username+master password, and that data encryption key is used locally to decrypt the items of your vault. Even if everything is stored remotely, unless the provider got your raw master password (usually, a hash of that is used as the "password" for authentication), your information is totally safe.
A whole other topic is communications, but we're talking decryption keys here
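The split described two comments up can be sketched concretely. This is modeled loosely on how vault-style password managers work; the labels, iteration counts, and function names are illustrative assumptions, not any specific product's scheme. The key property: the master password yields two independent values, so the server can authenticate you without ever being able to decrypt your vault.

```python
import hashlib

def derive(master_password: str, email: str) -> tuple[bytes, bytes]:
    # Stretch the password, salted with the account identifier
    master_key = hashlib.pbkdf2_hmac("sha256", master_password.encode(),
                                     email.encode(), iterations=600_000)
    # Key that decrypts the vault locally; never sent to the server
    enc_key = hashlib.pbkdf2_hmac("sha256", master_key, b"enc", iterations=1)
    # Value sent to the server as the login "password"
    auth_hash = hashlib.pbkdf2_hmac("sha256", master_key, b"auth", iterations=1)
    return enc_key, auth_hash

enc_key, auth_hash = derive("hunter2", "user@example.com")
assert enc_key != auth_hash  # the server sees auth_hash only
```

Because the two derivations use different domain-separation labels, knowing `auth_hash` reveals nothing useful about `enc_key`, which is what lets the provider hold your encrypted vault without holding your keys.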
I'm all for criticizing tech companies but it's pointless to demand the impossible.
Besides, BitLocker keys are really quite hard to lose.
Pretty surprising they'd back up the disk encryption secrets to the cloud at all, IMHO, let alone that they'd back it up in plaintext.
"Tough luck, should have made a backup" is higher responsibility than securing anything in meatspace, including your passport or government ID. In the real world, there is always a recovery path. Security aficionados pushing non-recoverable traps on people are plain disconnected from reality.
Microsoft has the right approach here with Bitlocker defaults. It's not merely about UX - it's about not setting up traps and footguns that could easily cause harm to people.
Eventually they yielded on this, but their later updates had other usability traps. Because Google Auth was the household name for TOTP apps, this maybe ruined TOTP's reputation early-on.
That's the problem right there. Migrating my phone recently (without having broken/bricked the previous one, which is somehow even worse for transferring 2FA these days than getting a new phone after the old one breaks!), I discovered that most sites I used did not allow more than one authenticator app. If I try to add a new phone as a second-factor auth method, the website deletes the entry for the old phone.
I am sad that this now appears unlikely. I suspect it may even be lower for people in their 20s today than a decade ago.
One of these things is not like the other...
That's why I'm stressing the comparison to e.g. government documents: nothing in meatspace requires regular people to show anywhere near as much conscientiousness as handling encryption keys.
Or: many people probably know, in the abstract, that "encrypted means gone if you lose the key", much like many people know slipping up while working on a HV line will kill you. Doesn't mean we should require everyone to play with them.
To be fair, if you inadvertently get locked out of your Google account "tough luck, should have used a different provider" and Gmail is a household name so ...
Less snarky, I think that there's absolutely nothing wrong with key escrow (either as a recovery avenue or otherwise) so long as it's opt in and the tradeoffs are made abundantly clear up front. Unfortunately that doesn't seem to be the route MS went.
Apple manages a recovery path for users without storing the key in plain text. Must have something to do with those "security aficionados."
Linux can be fairly well-secured against state-level threat actors, but honestly, if your adversary is your own nation-state, then no amount of security is going to protect you!
For Microsoft and the other consumer-OS vendors, it is typically a bad user experience for any user, particularly a paying subscriber, to lose access to their account and their cloud apps. There are many ways to try to cajole the naïve user into storing their recovery key somewhere safe, but the best way is to just do it for them.
A recovery key stored in the user's own cloud account is going to be secure from the typical threats that consumers will face. I, for one, am thankful that there is peace of mind both from the on-device encryption, as well as the straightforward disaster recovery methods.
But OneDrive is essentially a mass-surveillance tool. It's a way to load the contents of every single person's computer into Palantir or similar tools and ask, for instance, for "a list of everyone who harbors anti-ICE sentiments."
By the way, my Windows computer nags me incessantly about "setting up backups" with no obvious way to turn off the nags, only a "remind me later" button. I assume at some point the option to not have backups will go away.
What is just as crazy as cloud storage is how you "go paperless" with all your service providers, such as health care, utility bills, banks, etc. They don't print a paper statement and send it to your snail mail box anymore. They produce a PDF and store it in their cloud storage, and then you need to go get it when you want/need it.
The typical consumer may never go get their paperwork from the provider's cloud. It is as if they said "Hey this document's in our warehouse! You need to drive across town, prove your identity, and look at it while you're here! ...You may not be permitted to take it with you, either!"
So I've been rather diligent and proactive about going to get my "paperless documents" from the various providers, and storing them in my own cloud storage, because, well, at least it's somewhere I can access it. I care a lot more about paying my medical bills, and accounting for my annual taxes, than someone noticing that I harbor anti-jew sentiment. I mean, I think they already figured that part out.
There are plenty of people who post clear positions on multiple social networks. I personally doubt that OneDrive files will provide much more information for most people compared to what's already out there (including mobile phone location, credit card transactions, streaming service logs, etc.).
Where I think the danger lies is individual abuse. If someone "in power" wants one guy to have issues, they could check his OneDrive for something.
Best is to make people aware of how it works and let them figure it out. There are so many options (local only, encrypted cloud storage, etc.) I doubt there is an ideal solution for everything.
...in which case having a cloud backup of the full disk encryption key is pointless, because you don't have access to the disk any more.
Full-disk encryption is the opposite of pointless, my dude! The notebook-thief cannot access my data! That is the entire point!
No, I cannot recover the data from an HDD or SSD that I don't possess. But neither can the thief. The thief cannot access the keys in my cloud. Isn't that the point?
If a thief steals a notebook that isn't encrypted at all, then they can go into the storage, even forensically, and extract all my data! Nobody needs a "key" or credentials to do that! That was the status quo for decades in personal computing--and even enterprise computing. I've had "friends" give me "decommissioned" computers that still had data on their HDD from some corporation. And it would've been readable if I had tried.
The thief may have stolen a valuable piece of kit, but now all she has is hardware. Not my data. Not to mention, if your key was in a cloud backup, isn't most of your important data in the cloud, as well? Hopefully the only thing you lost with your device are the OS system files, and your documents are safely synced??
This isn't that simple.
But I guess it's not done more because the free data can't be analyzed and sold.
This blurring of fact is what drives clickbait.
The origin of this is a Forbes article[0] where the quote is: "Microsoft confirmed to Forbes that it does provide BitLocker recovery keys if it receives a valid legal order."
[0] https://www.forbes.com/sites/thomasbrewster/2026/01/22/micro...
If you use a Local Account (which requires bypassing the OOBE internet check during setup) or explicitly disable key backup, the key never leaves the TPM. The issue isn't the encryption algorithm; it's the convenience defaults.
https://support.apple.com/en-us/102651
The following information may be available from iCloud if a user has enabled Advanced Data Protection for iCloud:
https://www.apple.com/legal/privacy/law-enforcement-guidelin...
Do you think Tim Cook gave that gold bar to Trump for nothing?
Don't know if the problem is on my end, but your link goes to a 20-page document. If this is not a mistake, you should quote the actual section and text you are referring to.
> Apple does not receive or retain encryption keys for customer’s end-to-end encrypted data. Advanced Data Protection uses end-to-end encryption, and Apple cannot decrypt certain iCloud content, including Photos, iCloud Drive, Backup, Notes, and Safari Bookmarks
Not in the US. THANKS for this hint: I googled it! Wow!!! They both do bribery (offering & accepting) in front of a recording camera in a government building!!
Really "impressive" :-X
For example, it is new in Tahoe that they store your FileVault encryption key in your iCloud Keychain without telling you.
https://sixcolors.com/post/2025/09/filevault-on-macos-tahoe-...
iCloud is much more secure than most people realize because most people don’t take the 30 minutes to learn how it is architected.
You can (and should) watch https://www.youtube.com/watch?v=BLGFriOKz6U&t=1993s for all the details about how iCloud is protected, but especially the time-linked section. :)
That said, they could also roll out a small patch to a specific device to extract the keys. When you really want to be safe (and since you can be a called a 'left extremist' for moving your car out of the way, that now includes a lot of people), probably use Linux with LUKS.
Apple provides an optional encryption level (ADP) where they don't have a copy of your encryption key.
When Apple doesn't have the encryption key, they can't decrypt your data, so they can't provide a copy of the decrypted data in response to a warrant.
They explain the trade off during device setup: If Apple doesn't have a copy of the key, they can't help you if you should lose your copy of the key.
That's a Microsoft thing.
People also forget how they have kind of always played ball with similar governments.
Lockdown mode: https://support.apple.com/en-us/105120
Advanced Data Protection for iCloud: https://support.apple.com/en-us/108756
Besides, they fully comply with Chinese requirements, so...
PS. Others report FileVault keys are also being backed up to iCloud since September, and they didn't tell anyone: https://sixcolors.com/post/2025/09/filevault-on-macos-tahoe-...
And if you don't want iCloud Keychain, you are still given the choice to encrypt and print the backup key.
Unless Apple is straight up lying about their technology and encryption methods used to secure iCloud and their hardware, the issue of a public standoff is moot, because Apple couldn't help them if they wanted to. And while perhaps it's possible that Apple would lie to consumers to please US law enforcement, it's a bit of a stretch to say that because there haven't been any high-profile cases where law enforcement tries to force Apple to give up what they don't have, that this must be evidence that they're in cahoots.
Which, to be clear, is perfectly possible. Apple has denied the existence of a deliberately backdoored system at least once before: https://arstechnica.com/tech-policy/2023/12/apple-admits-to-...
Who knows what else they're hiding, if we only found out about this scheme in 2023. How is this any different?
If you encrypt your drive and upload the key to Microsoft, you are engaging in anti-competitive behavior, since you give them access to your data but not also to the local thief.
Just don't encrypt your drive if you can't be bothered to secure your key. Encryption-neutrality.
Just because the article is click bait doesn't mean the HN entry needs to be, too.
Sure, the fact that MS has your keys at all is no less problematic for it, but the article clearly explains that MS will do this if legally ordered to do so. Not "when the FBI asks for it".
Which is how things work: when the courts order you to do something, you either do that thing, or you are yourself violating the law.
Still crap, but the headline is intentionally inaccurate clickbait.
https://sixcolors.com/post/2025/09/filevault-on-macos-tahoe-no-longer-uses-icloud-to-store-its-recovery-key/
Probably not if one is not using Apple cloud on their laptops.
> stored in your keychain (without telling you!)
How to verify that? Any commands/tools/guides?
There's a saying that goes "not your keys, not your crypto," but this really extends to everything. If you don't control the keys, someone else does behind the scenes. A six-digit PIN you use to unlock your phone or messaging app doesn't have enough entropy to be secure, even to derive a key-encryption-key.
If you derive a key from a four-digit PIN with a KDF tuned to a hardness of ~5 seconds per guess, you can brute-force all 10,000 possible PINs in ~14 hours; after ~7 hours you would have a 50% chance of having guessed correctly. A six-digit PIN takes 100× longer, but most software uses a hardness nowhere near 5 seconds.
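The arithmetic above can be sketched in a few lines. This is just back-of-the-envelope math; the 5-second-per-guess figure is the hypothetical KDF hardness from the comment, not a property of any specific KDF.

```python
# Back-of-the-envelope offline brute-force cost for a PIN-derived key,
# assuming the attacker can run the KDF at one guess per
# `seconds_per_guess` (illustrative number, not any real KDF setting).

def brute_force_hours(digits: int, seconds_per_guess: float) -> float:
    """Worst-case time (hours) to try every PIN of the given length."""
    keyspace = 10 ** digits  # numeric PINs: 10^digits possibilities
    return keyspace * seconds_per_guess / 3600

worst = brute_force_hours(4, 5.0)   # 4-digit PIN, ~5 s per guess
median = worst / 2                  # 50% chance after half the keyspace

print(f"4-digit PIN: {worst:.1f} h worst case, {median:.1f} h median")
print(f"6-digit PIN: {brute_force_hours(6, 5.0) / 24:.0f} days worst case")
```

A four-digit PIN comes out to roughly 13.9 hours worst case (about 7 hours for even odds), and six digits multiplies that by 100, to roughly 58 days.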
The PIN is not usually used for cryptography, it's used to authorize the TEE (secure enclave) to do it for you. It's usually difficult or impractical to get the keys from the TEE.
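To illustrate why a low-entropy PIN can still hold up in practice: the enclave, not the OS, holds the real key, and it enforces an attempt counter, so the attacker can't guess at KDF speed. This is a deliberately simplified toy model, not how any real secure enclave (Apple SEP, Android StrongBox, TPM) is implemented; the attempt limit and class names are made up for illustration.

```python
# Toy model of TEE-gated key release: the "enclave" only unwraps the
# disk key after a correct PIN, and locks itself after too many
# failures. Real hardware adds escalating delays and anti-replay state.
import hashlib
from typing import Optional

class ToyEnclave:
    MAX_ATTEMPTS = 10  # illustrative; real devices vary

    def __init__(self, pin: str, wrapped_secret: bytes):
        self._pin_hash = hashlib.sha256(pin.encode()).digest()
        self._secret = wrapped_secret
        self._failures = 0

    def unwrap_key(self, guess: str) -> Optional[bytes]:
        if self._failures >= self.MAX_ATTEMPTS:
            raise RuntimeError("enclave locked: too many failed attempts")
        if hashlib.sha256(guess.encode()).digest() == self._pin_hash:
            self._failures = 0          # reset counter on success
            return self._secret
        self._failures += 1             # each wrong guess burns an attempt
        return None

enclave = ToyEnclave("1234", b"disk-encryption-key")
assert enclave.unwrap_key("0000") is None                    # wrong PIN
assert enclave.unwrap_key("1234") == b"disk-encryption-key"  # correct PIN
```

The point is that the 10,000-PIN keyspace never matters: the attacker gets a handful of online attempts against the enclave, not an offline run through the whole space.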
We joke and say that maybe Microsoft could engineer a safer architecture, but they can also ship an OTA update changing the code ad hoc. If the FBI demands cooperation from Microsoft, can they really afford to say "no" to the feds? The architecture was busted from the ground up for the sort of cryptographic expectations most people have.
You can (and should) watch all of https://www.youtube.com/watch?v=BLGFriOKz6U&t=1993s for the details about how iCloud is protected by HSMs and rate limits to understand why you’re wrong, but especially the time-linked section… instead of spreading FUD about something you know nothing about.
Where's the source code? Who audits this system?
> Microsoft confirms it will give the FBI your Windows PC data encryption key if asked
> Microsoft says it will hand those over to the FBI if requested via legal order
Microsoft complying with legal orders is not news. But why hire actual journalists when you can just lie in your headlines and still get clicks?
Edit: Nevermind.