I have always thought biometrics on phones are just another way for so-called "tech" companies to perform data collection, ultimately to be used for commercial purposes, or for any purposes deemed appropriate by the companies or their business partners.
The companies are secretive, so who knows what they are up to that we don't know about. What we do know is that these companies do not tell the whole truth when explaining their publicly visible conduct, including their data collection practices.
For example, a so-called "tech" company might claim it needs a user's phone number for "security" purposes, while the data actually serves other purposes that the user might find objectionable if they knew about them. (This has actually happened.)
The mobile phone has become a computer that the user cannot truly control. Companies can remotely install and run code on these computers at any time, for any reason.^1 If the user stores data on the phone, the company tries to get the user to sync it to the company's computers.
If the companies make promises, e.g., about "privacy", those promises are unlikely to be enforceable. It is difficult, if not impossible, to verify that such promises are kept, or to discover that they have been breached. And when the promises are broken, there is no adequate remedy. It's too late.
1. This unfettered access can be blocked, but a culture has emerged around actively doing the opposite. That the so-called "tech" companies are the primary beneficiaries is surely a fortuitous coincidence.
Anything else is insecure in principle, and getting less secure in practice, as the acquisition, collation, sharing, and leveraging of unpermissioned information becomes cheaper, easier, and more profitable by the day.
Cryptography provides a long menu of ways entities can exchange information and interact, without sharing information that is not functionally relevant.
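One deliberately simple item from that menu is a hash commitment: a party binds itself to a value now and reveals it later, without the counterparty learning anything else in the meantime. A hypothetical sketch in Python (not any particular product's protocol):

```python
import hashlib
import secrets

def commit(value: bytes) -> tuple[bytes, bytes]:
    """Bind to a value without revealing it; later, disclose (value, nonce) to open."""
    nonce = secrets.token_bytes(32)  # random blinding factor so the value can't be guessed
    digest = hashlib.sha256(nonce + value).digest()
    return digest, nonce

def verify(digest: bytes, value: bytes, nonce: bytes) -> bool:
    """Check that the opened (value, nonce) matches the earlier commitment."""
    return hashlib.sha256(nonce + value).digest() == digest

# One party commits to a bid without disclosing it...
digest, nonce = commit(b"bid:42")
# ...and later proves exactly what it was -- and nothing more.
assert verify(digest, b"bid:42", nonce)
assert not verify(digest, b"bid:43", nonce)
```

Only the functionally relevant fact (the bid, once opened) is ever shared; the verifier learns nothing during the interval, which is the general shape of the capabilities being described.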
Making those capabilities the basis for digital inter-entity trade is the only way we will get real privacy and stop the massive predatory surveillance-manipulation-for-hire economy from continuing to metastasize, with AI driving the value and opportunity of its leverage against us ever upward.
Strict laws might have been a practical solution a couple of decades ago, when information-based services began hyperscaling the surveillance-manipulation economy. They wouldn't be a bad thing now. But those laws seem unlikely, so the technical solution is the only path forward.
I don't think people really absorb how much of the economy's value is parasitically skimmed off by the two-sided, centralized surveillance-manipulation business model, from both consumers and ad buyers/producers. The colossal revenues of Google and Facebook, to start. And how effectively that is incentivizing and funding continued growth in addictive, manipulative, and (through pervasiveness) dominant "personalized" content, which will make things much worse.
GrapheneOS has a nice feature where you can use both the fingerprint and a short passcode to avoid having to type out your longer/more valuable password all the time. Seems like a good solution to the problem.
Also, IIRC iPhones have a feature where, if you appear to be under duress, they will refuse to unlock and disable Face ID. Is this true?
> Also, IIRC iPhones have a feature where, if you appear to be under duress, they will refuse to unlock and disable Face ID. Is this true?
heh it would suck to be beaten with a wrench to unlock your phone and, finally, to make it stop you relent but then the phone is like "nope, sorry. if you're gonna be dumb you gotta be tough".
If you’re worried about wrench attacks then you’re already in a situation where encryption won’t help you. They may beat you anyway if they don’t find what they’re looking for on the phone, or they may just kill you for being a nuisance to power.
Graphene also has a kind of workaround to add fingerprint duress:
>GrapheneOS improves the security of the fingerprint unlock feature by only permitting 5 total attempts rather than implementing a 30 second delay between every 5 failed attempts with a total of 20 attempts. This doesn't just reduce the number of potential attempts but also makes it easy to disable fingerprint unlock by intentionally failing to unlock 5 times with a different finger.
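The behavior in that quote can be sketched as a small state machine. This is an illustration of the described policy, not GrapheneOS source code:

```python
class FingerprintGate:
    """Illustrative sketch: allow 5 total fingerprint attempts, then require the PIN."""
    MAX_ATTEMPTS = 5

    def __init__(self) -> None:
        self.failed = 0
        self.fingerprint_enabled = True

    def try_fingerprint(self, matched: bool) -> str:
        if not self.fingerprint_enabled:
            return "pin_required"
        if matched:
            self.failed = 0
            return "unlocked"
        self.failed += 1
        if self.failed >= self.MAX_ATTEMPTS:
            # Deliberately failing 5 times with the wrong finger disables
            # fingerprint unlock entirely, as the quoted docs describe.
            self.fingerprint_enabled = False
            return "pin_required"
        return "retry"
```

Five intentional mismatches with a different finger flips the gate to `pin_required`, which is the duress trick the quote alludes to.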
The first phone I used with Graphene was a Pixel 4XL. It didn't come with a fingerprint sensor. If I remember correctly, the longest lockout period was still really short, like 5 mins or something. It was rather annoying to constantly have to put in your unlock code when you wanted to use or check something on the phone.
Loved Graphene, and the Pixel worked flawlessly, but man, that unlock thing drove me nuts more than a few times.
Though with all the devices GrapheneOS supports, there are really only two fingers you can plausibly use with the device: your thumbs, usually the one on your dominant hand. Using anything else is quite awkward.
> Also, IIRC iPhones have a feature where, if you appear to be under duress, they will refuse to unlock and disable Face ID. Is this true?
Sort of: if you hold the buttons on both sides of the phone for about three seconds, it will bring up the Power Off/SOS screen. You do not need to interact with that screen, just display it. Easy-peasy, you can do it with the phone in your pocket. Once that screen is displayed, it requires a passcode to unlock the phone. The courts have determined that the passcode is protected by the 5th Amendment, but biometrics are not.
It would be useful imho if an option was available for the phone to automatically enter this mode if separated for more than X seconds from a paired watch or airtag, or with sufficient vibration/acceleration (throw or stomp it). Similar adversarial defense as the phone rebooting after three days [1]. Perhaps part of Advanced Data Protection.
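That policy could be sketched as a simple predicate. Everything here is hypothetical — no such public API or Advanced Data Protection option exists; the thresholds are invented for illustration:

```python
def should_lock_down(seconds_since_watch_seen: float,
                     peak_accel_g: float,
                     max_separation_s: float = 30.0,
                     accel_threshold_g: float = 8.0) -> bool:
    """Hypothetical policy: drop to passcode-only (no biometrics) if the paired
    watch/airtag has been out of range too long, or if a violent acceleration
    (throw, stomp) is detected."""
    return (seconds_since_watch_seen > max_separation_s
            or peak_accel_g >= accel_threshold_g)
```

E.g., a minute of separation or a hard impact would both trip the lockdown, while normal use would not.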
Not legal advice. Having a trusted contact remotely wipe the device is also a potential option with appropriate iCloud creds and a message passed [2], assuming the device is not powered down or kept in a physical location blocking internet/cellular channels.
Given that my Apple Watch throws alerts when I leave a device behind (“mikestew’s iPhone was left behind at $PLACE”), it would be just one more step to flip that “no biometrics” bit. I’m assuming that those APIs are not available to 3rd party devs, so I can’t write my own.
I don't think any rational discussion about privacy can be had without first describing exactly what your definition of "privacy" is in this specific context, AND you must define a threat model. Otherwise we can't know if the vendor is even relevant to what they care about.
Privacy from what? From a determined government and court system? Nothing is going to keep you private from that. From your peers and family? Apple and Google keep you private in that regard. As for the world of privacy in between those extremes: it depends.
> From a determined government and court system? Nothing is going to keep you private from that
While there's always https://xkcd.com/538/, there are not currently quantum computers that can factor 4k RSA keys, so the court can order whatever it wants; unless they have a way past that (which may involve variations of xkcd 538), they ain't getting shit out of a properly configured digital safe. (Construction of said safe is left as an exercise to the reader.)
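Some back-of-envelope numbers on why the human-chosen secret, not the RSA key, is the weak point in practice (the guess rate is an illustrative assumption; real secure elements throttle far below it):

```python
import math

pin_space = 10 ** 6                     # a 6-digit PIN
pin_bits = math.log2(pin_space)         # ~19.9 bits of entropy
key_bits = 128                          # a random symmetric key, e.g. AES-128

# Without secure-element throttling, even a slow 10 guesses/second
# exhausts the entire PIN space in just over a day.
worst_case_days = pin_space / 10 / 86400

print(f"PIN entropy: {pin_bits:.1f} bits vs key: {key_bits} bits")
print(f"Worst case at 10 guesses/s: {worst_case_days:.2f} days")
```

Which is why the hardware rate limiting (and lockouts) discussed elsewhere in this thread matters so much more than the cipher strength.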
Most of us (reporters included) aren't protecting anything with our lives, not just because of a survival instinct, but because what we're protecting isn't actually worth that much.
For the relative handful who are custodians of that sort of data, history suggests a smaller minority than they'd like to admit have a readily achievable breaking point. The true believers who are left then are a minority that's hardly impossible to track and subvert through attacks that don't involve decryption on a device.
The point of that XKCD wasn't to be THE SINGULAR EXAMPLE, it's sort of a Zen Koan for people who only think in terms of technical risks and solutions.
It's not quite settled whether the FBI can compel you to decrypt data, for now. If this becomes widespread enough, they might try to get SCOTUS to decide it, which may or may not end privacy once and for all.
A better strategy would be to configure multiple profiles, and when they ask you to unlock your phone, you use the PIN that unlocks the boring one.
We just need a UX which makes it impossible to know how many profiles a phone has configured. Not some kind of sneaky hidden mode that you can be labeled a terrorist for having enabled, just that's how it works--you have to know a profile exists in order to log into it.
Of course it's not going to stand up to forensic scrutiny, but that's not what the feature is about anyhow.
For an organization, a better strategy is to never store anything of value on the phone, and have a remote server in a safe place. The phone acts as a thin client to access server. The key in turn is easy to hide in a plausibly-deniable way or simply memorized. The server can also revoke the key, rendering it useless even if it is revealed at a later date.
This is famously used by Uber to protect their systems from the French police, for instance.
Without exception, biometrics should be in addition to a password, never the only method. Just because it's constantly sold as a convenience alternative doesn't make it right.
Anyone in journalism should know not to use biometrics. I use it, but I know how to quickly disable it. If using fingerprint, you can always offer up the wrong digit; a few failures should make it fall back to the PIN.
So all an adversary/the police need to do is watch you unlock your phone once to know which finger to use? Trivial considering how often we unlock our phones and how many cameras exist.
Something that could come in handy: You can put iPhones into passcode mode by holding down a volume button + the lock button (the poweroff/emergency mode sequence), and then cancelling.
My understanding is that this and similar techniques don't get you back into the before first unlock (BFU) state. To do that as far as I know you have to shut down the device. Otherwise--even if locked--your phone will be in the after first unlock (AFU) state. I believe that in the AFU state considerably more of the system is decrypted and accessible than in the much more limited BFU state.
Maybe someone with more knowledge can chime in here.
This is true, but there's Automatic Restart, which will reboot the phone to get it back into the BFU state:
> Automatic Restart is a security mechanism in iOS 18.1, iPadOS 18.1, and later that leverages the Secure Enclave to monitor device unlock events. If a device remains locked for a prolonged period, it automatically restarts, transitioning from an After First Unlock state to a Before First Unlock state. During the restart, the device purges sensitive security keys and transient data from memory.
Probably a much better idea to just go ahead and hit shutdown if you're on that screen anyway, since many phones are more susceptible to gear like GrayKey or Cellebrite if they have ever been unlocked since the last power-on.
I wish phones supported continuous re-authentication. Like an in-screen fingerprint reader that authenticates every single touch (even better if you could also use it to assign different actions to different fingers), or to have FaceID immediately lock the phone if someone other than the owner is using it.
The constitution has been interpreted to allow the police to force your finger onto an inkpad for fingerprints. That decision was extended to allow the police to force your finger onto a biometric reader.
The 5th Amendment has been (so far) interpreted to only limit things that require conscious thought, such as remembering a password and speaking it or typing it.
I don't know about that exactly, but my understanding was that this is similar in justification to compelling a person to be fingerprinted or give a DNA sample. To me there does seem to be a fairly major difference between forcing someone to disclose information held in their mind and forcing them to provide a biometric. The former seems equivalent to compelling testimony against oneself. I have a hard time seeing the latter as compelling testimony against oneself, especially if giving fingerprints or DNA isn't.
Part of it is that compelled information can be problematic, in that circumstances can arise where the information simply isn't obtainable.
Extreme example, imagine a stroke or head injury causing memory loss.
OTOH DNA/Face/Fingerprints, usually can't be 'forgotten'.
It shouldn't be different. But law enforcement wants access, and everyone who could rein them in seems to also want them to have access. Honestly, it's surprising they haven't yet argued that people can be compelled to give up their password by whatever means necessary.
Could you get charged with destroying evidence if, when asked for a password, you provided the duress password, wiping the device?
You technically followed orders and didn't even touch the device.
Yes, that would be "spoliation of evidence" and probably "obstruction of justice". Also, I believe duress passwords are only a "thing" on GrapheneOS, not iOS or stock Android.
Don't use biometrics; a PIN has been shown to have more 5th Amendment protection.
Have your phone automatically reboot at a regular time every day. When your phone reboots, many of the exploits that can get into your phone are locked out, because they rely on reading active memory.
Unless one has been ordered to preserve evidence already for a pending court case... proving that someone knew said information was valuable as evidence, and willfully destroyed it knowing so, might be extremely difficult.
There are very specific rules for proving destruction of evidence. For a criminal case, the burden of proof in the US at least is "beyond a reasonable doubt", so someone would likely have to prove that you knowingly destroyed valuable evidence before you'd get in big trouble. And if you haven't already been served with something saying you need to preserve evidence, they might not have any claim to information they had no idea existed beforehand, especially if you don't talk.
Believe this is bad legal advice. They would only need to prove you destroyed information with intent to impede an investigation/case. They would not need to prove that something incriminating or material was destroyed.
What you seem to be referring to would be obstruction, whereas the entire parent thread was specifically discussing destruction of evidence. Fair to point out that there are other offenses that could be charged, but misleading to imply it’s the same thing.
> They would only need to prove you destroyed information with intent to impede an investigation/case
Which requires them to prove they know that device likely contains relevant information. Just being party to a court case doesn't mean you're forbidden from deleting anything ever again... like I said there are very specific rules for evidence, and one cannot begin to claim something relevant is destroyed if you can't even show that you had any idea what might have been destroyed in the first place.
It mostly hinges on your intent, i.e. what they can argue is your understanding of the information you destroyed, not theirs. It unfortunately can be far-reaching, including into the past.
You're right that in normal circumstances you can routinely delete records for data hygiene, to save money, as part of a phone repair, and so on, unless you've been court ordered otherwise.
Samsung phones have the Secure Folder, where you can use a different, more secure password. If you encrypt the data, it is very secure, as the actual encryption key is stored in a secure element.
I've been genuinely depressed about how fast the country is descending into strong man rule while half the country cheers it on. Which I think is their point, they want their political opponents to suffer at all costs.
> The warrant included a few stipulations limiting law enforcement personnel. Investigators were not authorized to ask Natanson details about what kind of biometric authentication she may have used on her devices.
The warrant said they couldn't demand she do those things, not that they couldn't ask.
Functionally there's very little distinction - a question asked by a law enforcement officer during a search and seizure will inevitably be understood as a demand, no matter how it's worded. (And doubly so when it's in the context of e.g. choosing which of the person's fingers to grab and press to their phone.) I'm surprised that the warrant even acknowledged the possibility of a "voluntary" disclosure.
All this biometric talk in the world and it’s rarely made convenient for the user like this.
It was likely almost as fast as a physical keyboard smartphone for instant entry into an app.
Cut to my phone failing to recognize the fingerprint whenever it feels like it, or maybe because the humidity is 0.5% off the ideal value.
sigh
https://arstechnica.com/tech-policy/2023/12/suspects-can-ref...
[1] New Apple security feature reboots iPhones after 3 days, researchers confirm - https://news.ycombinator.com/item?id=42143265 - November 2024 (215 comments)
[2] Erase a device in Find Devices on iCloud.com - https://support.apple.com/guide/icloud/erase-a-device-mmfc0e...
The duress password feature is also useful. Entering it will completely wipe the phone and reset it to factory.
https://en.wikipedia.org/wiki/Uber_Files#Kill_switch
https://help.apple.com/pdf/security/en_US/apple-platform-sec...
> [...] inactivity reboot triggers exactly after 3 days (72 hours). [...]
https://naehrdine.blogspot.com/2024/11/reverse-engineering-i...
GrapheneOS also has this (https://grapheneos.org/features#auto-reboot) with a default of 18 hours.
Maybe one could try to force restart (https://support.apple.com/en-gb/guide/iphone/iph8903c3ee6/io...) to quickly get to BFU. But I could imagine that it'd be hard to remember and then execute the right steps in a stressful situation.
If I don't click those 5 presses fast enough, it instead opens Apple Cash or whatever it's called.
I'm assuming that in a stressful situation it'd be much more consistent to hold down power and volume rather than clicking quickly.
The 5th Amendment has been (so far) interpreted to only limit things that require conscious thought, such as remembering a password and speaking it or typing it.
And unlike a witness, you can legally lie and mislead officers.
A solution that can seem like plausible deniability could be interesting.
My impression is deliberately doing this would be illegal. It would have to be convincingly deniable somehow.
Is there a way to do that?
You'd also have to rely on this unnamed other to force that particular finger, rather than the others...
E.g., if one had a "dead man's switch" phone that required a passkey every x minutes, and each entry set the next threshold...
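That dead-man's-switch idea can be sketched with a monotonic timer. This is a hypothetical illustration, not a real phone API:

```python
import time

class DeadMansSwitch:
    """Hypothetical sketch: require a check-in (e.g. passkey entry) every
    `interval` seconds; if the deadline passes, the device should lock down."""

    def __init__(self, interval: float) -> None:
        self.interval = interval
        self.deadline = time.monotonic() + interval

    def check_in(self) -> None:
        # Each successful passkey entry pushes the next threshold out.
        self.deadline = time.monotonic() + self.interval

    def expired(self) -> bool:
        return time.monotonic() >= self.deadline
```

On expiry, a real implementation would disable biometrics or reboot into the BFU state, so an adversary holding the phone without the passkey only has it usable until the next threshold.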
> The warrant included a few stipulations limiting law enforcement personnel. Investigators were not authorized to ask Natanson details about what kind of biometric authentication she may have used on her devices.
The warrant said they couldn't demand she do those things, not that they couldn't ask.
Makes me question the rest of the reporting.
Why do you think it's appropriate to talk to people like this?