I get my gas and electricity from Scottish Power. Recently a rival company, Ovo Energy, made a clerical error and sent me a bill, leading to a dispute. The front line of defence against this kind of dispute is that the bills give the serial numbers of the meters. The bill from Scottish Power gives the same meter serial numbers that are embossed on the front of my meters, and is therefore valid. The bill from Ovo Energy gives different serial numbers and is therefore in error.
Picture, though, the internal processes in Ovo Energy. A second clerk is tasked with attending to the problem. He has a choice: he can change the address to agree with the meter serial numbers, correcting the error, or he can change the meter serial numbers to those for my address, compounding the error.
Since the meter serial numbers are confidential, known only to me and Scottish Power, Ovo Energy does not have the second option; they do not know the serial numbers (which are long, like a credit card number, not just 1, 2, 3, ...). Thus the clerical error gets corrected, or just left, but not compounded.
My guess is that confidential information (such as meter serial numbers, credit card numbers, and account numbers) is the front line of defence against both clerical error and fraud based on impersonation. It is a rather weak defence, but it is lightweight, and seems to be how much of billing and billing disputes work.
We all have lots to hide: the confidential information that the system needs us to keep confidential to stop clerical errors from compounding.
A telco guy comes in at some point in the past and takes a picture of your meters while you're not around. Privacy fucked. But what about obscuring stuff like that behind something tamper-proof (emphasis on proof), like glitter?
This is a valid story and I’m sorry to hear that you went through this. However, it’s a strawman for the current argument from the blog post, which is that living life in the open and acting normal is setting things up for failure, and I don’t believe that it is.
Having nothing to hide is fine. Nothing to hide and doing nothing wrong is least likely to cause trouble.
The blog post’s argument that someone would be more likely to get watched if they start hiding after not hiding is not valid. ALL encrypted and unencrypted communication is a valid target for analysis, but ANY encrypted traffic is obviously more of a concern, just like one person walking into a store brandishing a gun is as alarming as 5 brandishing guns, and it doesn’t matter whether they used to not carry guns into the store.
To anyone who says "I have nothing to hide" I respond with "Unfortunately, you are not the one who gets to decide whether what you have is worth hiding."
(I think I first might have come across this beautifully succinct and unfortunately very true counter in a Reddit AMA with Edward Snowden way back when, but I might be misremembering.)
I was thinking a little about it lately. Not the saying itself, but the positioning toward the general public. The annoying reality is that most of the things I consider important enough to voice discontent over (and maybe even suppress for convenience's sake) are not always easy to 'present'. Note that it is not always easy here either, but we do, by design, give one another a charitable read.
Hell, look at me: I care, and I accepted some of it as the price to pay for peace at home.
I feel that I have nothing to hide, but I do my darnedest to ensure that it costs a maximal amount of time and effort to find that out.
If a random stranger (law enforcement or otherwise) wants to know shit about me, then I'm immediately creeped out, and the first thing I want to do is make (online) stalking me as difficult as possible.
As it should be for everyone.
Edited to add:
One thing I can tell you from experience: law enforcement only look for things that will confirm their suspicions. They do not look for counter evidence, no matter how obvious it is or how easy it is to find - even within government records to which they would already have access.
As such, beware what trail you leave: if it suits the right (wrong) agenda, it will be used to point in the worst possible direction.
"I have nothing to hide" only makes sense if privacy and disclosure are treated as a binary. In reality, both exist on a spectrum: privacy is controlled disclosure, shaped by what is shared, with whom, at what level of detail, and under what power asymmetry.
Large surveillance systems inevitably build baselines. They don't just detect crimes; they detect patterns and anomalies relative to whatever becomes "normal".
The problem with "nothing to hide" is that it defaults to maximal disclosure. Data is persistent, aggregatable, and reinterpretable as norms and regimes change. The data doesn't.
This isn't purely individual. Your disclosures can expose others through contact graphs and inference, regardless of intent. And it doesn't matter whether the collector is the state or a company; aggregation and reuse work the same way.
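The baseline-and-anomaly mechanism described above can be sketched in a few lines. This is a toy illustration only, not any real surveillance system; the message counts and the simple z-score detector are made-up assumptions for demonstration:

```python
# Toy sketch: an anomaly detector only needs a baseline of "normal"
# behavior; anything rare relative to that baseline scores high,
# whether or not it is wrong. All numbers are hypothetical.
from statistics import mean, stdev

# Hypothetical daily counts of encrypted messages sent by one user:
# years of near-total transparency, then a first day on an encrypted app.
history = [0, 0, 0, 0, 0, 0, 1, 0, 0, 0]
today = 40

mu, sigma = mean(history), stdev(history)
z = (today - mu) / sigma  # standard deviations away from "normal"
print(round(z, 1))        # over a hundred sigma: the new habit IS the anomaly
```

The point is structural: the detector never asks whether encryption is bad, only whether it deviates from the disclosed baseline, which is exactly what a history of maximal disclosure supplies.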
> And then comes the part they can't (or won't) fathom. The context shifts. The political winds change. The Overton window slams shut on a belief they once held. A book they read is declared subversive. A group they donated to is re-classified as extremist. A joke they told is now evidence of a thoughtcrime.
There are at least some people who would respond by (still) saying "I have nothing to hide." They are proud of their moral choices and confident in their convictions. Arrest them if you dare.
I wonder if the author still has contempt for them?
I think the author is trying to say that in today's world we face a sort of moral/societal vacuum of privacy. However we try to remain private, whether by hiding in plain sight as an open book or by some digital means, it's basically futile or will eventually be broken.
My spin, as a recovering perfectionist, is when you've done everything you can to be "innocent" and the political or whatever wind changes, the pit of despair is a real and devastating thing. When this happens, sometimes the decisions that are made are desperate.
In many moral frameworks, inconsistency isn't the only wrong someone can commit. The argument constructed in this article is essentially utilitarian, making the claim that the mechanisms of surveillance and privacy make this behavior harmful to others, regardless of their intentions or internal sense of morality. In fact, the author doesn't mention hating these people at all, although I suppose that's not a completely unreasonable thing to infer. From the perspective of this argument, all that is missing is the harm the "deviancy signal" would do to the individual themselves, though under the oppressive regime proposed they would perhaps take a greater risk by openly deviating.
> In fact, the author doesn't mention hating these people at all
The article opens with:
>> There's a special kind of contempt I reserve for the person who says, "I have nothing to hide."
Which isn't literally saying "I hate them" but I'm not sure how else to interpret "a special kind of contempt." Regardless, I've edited my original post.
It is very interesting, in our polarized times, what people read into a statement, and if they interpret it charitably or in the worst possible way. Like you, I find contempt and hate very different.
The author clarifies a couple sentences later that the contempt they feel is "the cold, hard anger you hold for a collaborator" - "collaborator" apparently meaning something like the very bad WWII kind of collaborator, rather than the benign artistic co-author kind. So, despite the implicit acknowledgement that there are multiple types of contempt, this particular contempt does sound fairly close to hatred.
He's running for governor of California. He's apparently having trouble getting 6,000 signatures or $5000 to get on the ballot, so he's probably not a serious candidate.
> He's running for governor of California. He's apparently having trouble getting 6,000 signatures or $5000 to get on the ballot, so he's probably not a serious candidate.
The popular, well funded politicians haven't exactly served their constituents well in the privacy domain...
Ok, as a privacy enthusiast: some people just don't get it, and they never will. What you need to do is not discuss anything you don't want broadcast with them... ever. I'm even thinking of using Edge or Chrome so I look like a normie in a sea of normies. I mean, I really don't have anything to hide, but I don't want anyone to know that.
Doesn't mean you or anyone else has a right to know what it is and it doesn't mean I'm doing anything wrong either. We should all admit that we have lots we want to hide from other people and that's just fine and normal.
Speak softly and carry a big stick. I'm a fairly private person. I'd also end my life in defense of freedom and autonomy... I read posts like this and I can't imagine what an extremist like this is trying to protect. To continue living a life in shadow? Well, when the time comes that this giant "machine" the governments all poorly maintain and utilize finally awakens and results in concentration camps... you'd better forget all you knew about dumb technology and hope you have a big stick.
To think, "no presence" = no problems. If I were a dumb machine, I just might decide to pick up all citizens with birth certs that are also internet ghosts. What was the point then?
There's a special kind of contempt I reserve for the person who says, "I have nothing to hide." It's not the gentle pity you'd have for the naive. It's the cold, hard anger you hold for a collaborator. Because these people aren't just surrendering their own liberty. They're instead actively forging the chains for the rest of us. They are a threat, and I think it's time they were told so.
Their argument is a "pathology of the present tense," a failure of imagination so profound it borders on a moral crime. What they fail to understand is that by living as an open book, they are creating the most dangerous weapon imaginable: a baseline of "normalcy." They are steadily creating a data profile for the State's machine, teaching its algorithms what a "good, transparent citizen" looks like. Every unencrypted text, every thoughtless search, every location-tagged post is another brick in the wall of their own cage.
And then comes the part they can't (or won't) fathom. The context shifts. The political winds change. The Overton window slams shut on a belief they once held. A book they read is declared subversive. A group they donated to is re-classified as extremist. A joke they told is now evidence of a thoughtcrime. Suddenly, for the first time, they have something to hide.
So they reach for the tools of privacy. They download the encrypted messenger. They fire up the VPN. They start to cover their tracks.
And in that single act, they trigger the Deviancy Signal.
Their first attempt at privacy, set against their own self-created history of total transparency, is a screaming alarm to the grown surveillance machine. It's the poker player with a perfect tell, or the nocturnal animal suddenly walking in daylight. Their very attempt to become private is the most public and suspicious act they could possibly commit. They have not built an effective shield; they have painted a target on their own back. By the time they need privacy, their own history has made seeking it an admission of guilt.
But the damage doesn't end with your own self-incrimination. It radiates outward, undoing the careful work of everyone around you. Think of your friend who has practiced perfect operational security, who has spent years building a private life to ensure they have no baseline for the state to analyze. They are a ghost in the machine. Then they talk to you. Your unshielded phone becomes the listening device they never consented to. You take their disciplined effort to stay invisible and you shout it into a government microphone, tying their identity to yours in a permanent, searchable log. You don't just contrast with their diligence; you actively dismantle it.
On a societal scale, this inaction becomes a collective betrayal. The power of the Deviancy Signal is directly proportional to the number of people who live transparently. Every person who refuses to practice privacy adds another gallon of clean, clear water to the state's pool, making any ripple of dissent ... any deviation ... starkly visible. This is not a passive choice. By refusing to help create a chaotic, noisy baseline of universal privacy, you are actively making the system more effective. You are failing to do your part to make the baseline all deviant, and in doing so, you make us all more vulnerable.
There is only one way to disarm this weapon: we must destroy its premise. We must obliterate the baseline. The task is not merely to hide, but to make privacy the default, to make encryption a reflex, to make anonymity a universal right. We must create so much noise that a signal is impossible to find. Our collective goal must be to make a "normal" profile so rare that the watchers have nothing to compare us to. We must all become deviations.
Hannah Arendt, in her most famous television interview after the Eichmann trial, said that what shocked her in 1933, when the Nazis came to power, wasn't that the Nazis came to power; it was how many of the people she perceived as friends and allies, especially in the intellectual space, would convince themselves with fantastical theories about why Hitler being in power might actually be a good thing.
The truth is that many people are cowards and even more people are just small-minded. The problem with the "I have nothing to hide" excuse is that it shows that this person has no concept of how their small personal ideas would affect the world once rolled out at scale. Not only that, but it shows that they haven't understood how power is organized in their democratic societies.
Let's say we have nothing to hide and we give up our rights to the people we, as the voters, are meant to hold accountable. Even with a benevolent government filled with honest actors, large-scale surveillance has the problem that there will be false positives. Surveil 360 million people each day with a totally utopian accuracy of 99% and you still get 3.6 million false positives a day. In reality these processes are much less accurate, and the people who use those powers are much less benevolent and honest; e.g. it is not uncommon for police to look up their exes, spouses, people they personally hate, etc.
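The arithmetic above is easy to check. A minimal sketch in Python, using only the comment's own hypothetical numbers (360 million people surveilled, a "utopian" 99% accuracy), not real figures:

```python
# Base-rate arithmetic from the comment above; all inputs are the
# comment's hypothetical assumptions, not real-world statistics.
population = 360_000_000       # people surveilled per day
accuracy = 0.99                # assumed "totally utopian" accuracy
false_positive_rate = 1 - accuracy

false_positives_per_day = round(population * false_positive_rate)
print(f"{false_positives_per_day:,}")  # 3,600,000 wrongly flagged, every day
```

Even a one-percentage-point error rate, applied at population scale every single day, produces millions of innocent people flagged; that is the whole base-rate objection in one multiplication.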
The biggest problem, however, is that we have a division of power in democracies for a reason, and voters giving up power in order to hand more to a government is bad, actually. Democracies need to be designed in such a way that they can survive one or two governments that try to abolish the system. Giving them a "spy on everybody" power, coupled with, say, secret courts, is a good way to risk that form of government for generations, who then have to fight bloody civil wars and revolutions to get the power back.
You’re correct on all things in principle, but hiding one’s subversive thoughts (or what may be catalogued by any given regime as subversive) only plays into the regime’s hands, because it atomizes any chance of real resistance. It is much more valuable to bring the fight out into the open, yes, while still playing by the regime’s rules in a way, because that keeps open the possibility of a community of resistance. Being able to hide stuff only generates conspirators, but keeping resistance out in the open (even if heavily camouflaged) has better chances of eventually toppling any given regime, because people power consists in numbers, and you can’t have numbers if the default setting is “hide behind a VPN”.
What really matters is judiciary due process and the legitimacy of a government.
Companies are the ones gathering data, it's not the government doing it.
Before the internet, governments already had data on their citizens.
The internet makes it more difficult for the government to catch criminals and fraudsters.
If you live in Russia or China or under Trump's administration, there are good reasons to hide.
If you live in a country where freedoms and due process are respected, there is no point in hiding, UNLESS you can really argue that due process and freedoms are eroding, but that's a different debate.
> If you live in a country where freedoms and due process are respected, there is no point in hiding, UNLESS you can really argue that due process and freedoms are eroding, but that's a different debate.
This assumes usage of collected data stays the same forever. But regime changes do happen, and once the data has been allowed to be collected, you have no power. I think Trumpland was once considered a state where freedoms and due process were respected.
For example, data about your period, if you're female...
Considering the reckless lawlessness of the current regime of "the shining beacon of democracy", I wonder if they could retroactively convict "murderers of unborn babies" and find them by trawling through online health data, looking for gaps in women's period tracking.
It was once considered that, but it was never actually that. Ever since it was founded it was locking up or killing the people with different skin colours, over and over and over and over and over!
Once your data is out there it's too late. If the hypothetical country you live in - where freedom & due process is respected - suddenly has an authoritarian change of direction, you're done.
It's my understanding that, organically or under external influence, many democratic countries in the EU are at emerging risk of going full fascist. I see that in France, where Le Pen & friends don't hide the fact that they'd write a new constitution.
Richest guy in the world has vowed to use his propaganda power to make this happen for the sake of cancelling the EU, fun times
Governments have always collected "data*", and used it for power/control; any company doing so is actually hoping that the government likes the stink of their particular shit. But now, as forever, false positives/negatives will undermine and destroy a career.
The true "nothing to hide" good honest decent folks will almost certainly leave a convoluted trail of their meandering through life that can and will lead anywhere. But as the author points out, that other relentlessly "perfect" demographic is likely to be dangerous competition, rather than a general danger, and so will be assessed as to potential threat/uses, almost like a job interview... potentially useful idiots.
We are stopping corruption here, so only corrupt people could oppose such a decision, and they should be immediately investigated.
Already there friend.
I'm well aware of the possible and even unavoidable consequences of the current trajectory.
But this is a conscious decision to try to shape the norm so that the current dystopian zillionaire future would not happen fully.
My reasoning is most likely the humanly typical post-hoc rationalization and strategic reasoning, but I like to think the good old MLK quote fits it:
"In the end, we will remember not the words of our enemies, but the silence of our friends"