Privacy is necessary for an open society in the electronic age. Privacy is not secrecy. A private matter is something one doesn’t want the whole world to know, but a secret matter is something one doesn’t want anybody to know. Privacy is the power to selectively reveal oneself to the world.
— Eric Hughes, “A Cypherpunk’s Manifesto,” 1993
Vladimir Putin’s not-so-secret police wiretapped a strategy meeting between Russian LGBT activists and Western NGOs in St. Petersburg last month — then played the tapes on TV, as proof of a conspiracy. That’s no surprise. What’s surprising is that the Western NGOs didn’t expect it. “Soviet-like surveillance” (to quote the indignant condemnations) is nothing new in Russia. The Soviet security establishment didn’t ever curl up and die. The only innovation is that recently, instead of using the recordings for blackmail or prosecution, the regime hands them over to pet media for a public smear campaign. But everyone knows that tactic already; during the 2011 anti-Putin protests, “grainy videos and audio recordings” were “leaked to Kremlin-friendly tabloids by security and law-enforcement agencies,” in “a concerted Kremlin effort to discredit and divide their opponents.” The organizers really should have seen this coming.
The truth is, those of us who work on sexual rights internationally don’t always take our own issues seriously. We assume evil politicians don’t truly fear us — that they’re merely manipulative or opportunistic, using homophobia, whorephobia, or misogyny as trumped-up distractions from “real” concerns. We don’t grasp our own power, or get that governments may see these issues as the real ones: that states could spend massive resources on repressing sexual dissidence with the same anxious fervor they devote to crushing separatism or stifling political dissent. Persuaded of our unimportance, we deprecate the actual dangers. But if that ever was justified, it isn’t today. The Obama administration’s broad and occasionally unhelpful ardor in playing tribune for LGBT groups worldwide, for instance, feeds fears that these minuscule movements are actually agents of alien geopolitics, hives of foreign subversion. And the US government’s own success in violating anybody’s and everybody’s privacy only encourages imitation, and revenge.
Everyone should worry about privacy. And you especially need to worry if either your work or your life contradicts society or law. You may run an NGO, or you may be an individual activist in a small town. You may be a queer checking Grindr in a country where gay sex is illegal; you may be a sex worker using Gmail to hook up with clients. You need to think about how you can protect your communications from prying ears and eyes — whether parents, roommates, or police.
Technologies are available. Yet most people don’t use them. There are three broad reasons for reluctance:
a) They’re slow. Secure browsers like Tor are a little lumbering; encrypting e-mails is a hassle. All I can say is, it’s less of a hassle than getting your group closed down, or winding up in jail.
b) Come on, why would they come after me? See above. They may already be after you. But even if the cops haven’t noticed you yet, there are plenty of accidental ways to attract attention. Suppose, earnest HIV activist, that your laptop’s stolen — and when the police recover it, they discover that illegal porn video you downloaded. Suppose, mild-mannered sex worker, that one of the clients you’ve been e-mailing works for Human Rights Watch — and is constantly watched and spied on in your country. There’s no lack of ways you can fall afoul of surveillance.
c) Transparency is a virtue. On principle, a lot of human rights activists don’t try to hide from state surveillance, because, they say, they have nothing to hide. This is noble, but not workable. You may not have secrets, but people who trust you do. Members of your organization, people who come to you for help, may expect confidentiality — and may feel betrayed if you don’t safeguard what they share. The landlord who rents to you, the guy who sleeps with you, the cleaning lady who scrubs the kitchen, could all get swept up in any scandal — smeared, shamed, or hauled into court. You have a responsibility to protect those around you and those who depend on you.
What follows are some steps to protect your electronic privacy, arranged roughly from the simplest to the most complex. I don’t claim to be an expert — the resources are gleaned from my own reading and use. If you have suggestions, or if you see something that won’t work, tell me in the comments or through email. Privacy is like safer sex. There’s no absolute safety, only relative protection. Everybody has to gauge their own levels of acceptable risk. Keeping abreast of changing technologies for both surveillance and safeguarding is vital. The best way to protect your information is to be informed.
Things you can do:
1. Clear your browser’s history. Browsers store copies of the web pages you visit in an area called the cache. Moreover, many pages automatically deposit a little turd of information called a cookie on your computer, which lets them recognize you when you return. Both these allow anybody with access to your computer to reconstruct what you’ve been viewing. I know dozens of people whose families or bosses have uncovered their sexual orientation simply by checking the browser history.
If you use a computer that anybody else might share, whether at home, at work, or in an internet cafe, you should clear the browser history regularly, preferably after each use. It’s not perfect — ultra-skilled geeks could still figure out what you’ve been doing — but it frustrates most intruders. Good guides to how to do this, for the most common browsers, can be found here, and here, and here.
2. Realize that Facebook is not your friend. Facebook causes too many headaches to count. But this one is really serious.
Go to the search bar and just type in: “Gays in [your country]” — you know, as if you were looking for a group, or a page describing the local scene. What you’ll get will be quite different:
There’s a parable here about identity construction in the digital age. Facebook automatically takes the button that asks you what gender you’re interested in — one that a lot of people click in fun, or assume to refer to friendship rather than sex — and translates it into being “gay” or not. More ominously, though: The results you’ll get won’t be limited to friends, or friends of friends. You’ll get a list of every man who’s “interested in men” in [your country] and who didn’t bother to make that particular part of their profile private. It’s convenient if you’re gay, and looking for an alternative to Grindr. It’s also convenient if you’re a policeman, and homosexual sex is illegal in [your country], and you’re looking for a way to track down or entrap the guilty and throw them in jail.
This is all the upshot of Facebook’s new “Graph Search,” a terrifying new feature that puts security on a bonfire and lights a match. It allows you to mine the deep structure of the site — to pluck information out of profiles that, as profiles, are invisible to you. It’s a “semantic search” (unlike old-style Google); it doesn’t just take the words you enter literally, it tries to infer what you mean — hence the leap between “interested in men” and “gay.” It’s nasty and clever and it doesn’t give a fuck about your safety.
It’s called “Graph Search” because semantic search builds “a graph of information for the user that pulls insights from different formats to create an over-arching viewpoint related to the original query” … blah, blah. More simply: Facebook employs the little bits of data — “likes” and “interested ins” — from all those profiles to map out commonalities between its customers. But this isn’t really done “for the user,” though it’s sold to you as a way to share lovingly with your loved ones and learn lovely things about everyone. It’s done for Facebook and its advertiser-clients, to divvy up users by their desires and assemble a picture of diversified markets open for advertising and exploitation.
There’s a whole Tumblr blog highlighting the information, from eccentric to creepy, that Graph Search can turn up. You can look for “Employers of people who like racism”; you can root out “Mothers of Catholics from Italy who like Durex condoms.” But folks whose private lives put them in danger won’t laugh. “Graph Search” makes state repression easy. Human rights advocates ought to give Facebook hell. The search unearths, for instance, 258,285 results for “Men who are interested in men in Iran.” Somehow this has failed to elicit any objections from the usual obsessives over the Islamic Republic (they’re all on Facebook right now, busy searching for “Men in London who like men and like to read press releases”). But if an enterprising religious policeman in Tehran figures out how Graph Search can further the torture business, Facebook will have blood on its hands.
What can you do? The only way to remove yourself from Graph Search is to make sure that each item of information on your profile is marked “private.” To repeat: the universal privacy setting that could sequester your whole profile is gone now. You’ve got to do this step by step:
a) Go to each item in the “About” section of your profile, and if there’s anything you don’t want strangers to see, either delete or change it, or make sure the privacy setting is limited to “Friends.”
b) Check on every photo you’re tagged in. If you didn’t post the picture, its visibility depends solely on the privacy settings of the person it belongs to. If you don’t want it seen or searched, ever, you’ll have to remove the tag.
c) You can review all the comments you’ve made on Facebook by going to your Activity Log — sort it by Comments (look on the left side). If you’ve commented on somebody else’s photos or timelines, you can’t change the privacy settings — but if you don’t want the comment seen, you can delete it.
d) You can still change the privacy settings globally for all the old posts on your timeline. Click the gear icon at the upper right of your screen; select Privacy Settings. Under “Who can see my stuff?” you’ll find the option to “Limit the audience for posts you’ve shared with friends of friends or Public.” That’ll let you make them private at one fell swoop. Another option there allows you to review all your past posts if you want to decide on them one-by-one.
There’s a good overview of these methods here.
3. Use Tor. Tor is a downloadable bundle of software that includes its own browser. When you use the browser to access the Internet, the information you receive or send bounces through a global network of thousands of relays — thousands of other computers — and is encrypted over and over. All the encryption makes it very hard to intercept the data in transit; the rerouting makes it almost impossible to find its origin. All this means that unfriendly eyes can’t detect your location, or trace your posts or visits or messages back to you.
The chart shows how. Ordinarily, if Alice up there sends someone an email or accesses a web page, those on the other end can find out the Internet address she’s using. However, if she uses Tor, the recipient (“Bob” down below, or any watchers on Bob’s end) can only see the address of that last relay, or proxy, in the extended network: not Alice’s own.
Tor (the name stands for The Onion Router, representing the layers of protection that an intruder would have to peel away) was developed by the US military, and the State Department still funds its nonprofit promoters as a way of supporting what America otherwise opposes, Internet freedom. But it’s so independent and impenetrable that (according to national security documents Edward Snowden leaked) even the US government is intimidated; they call it “the king of high-secure, low-latency Internet anonymity.” It’s open-source, meaning a team of elves is always at work to fix any vulnerabilities. Like most open-source projects, it has a cooperative and collective spirit. In fact, you can volunteer your own computer to serve as one of the relay points — though I don’t recommend this, because if the system ever is cracked, you could conceivably be held liable for anything illegal other users might send through your terminal.
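If you like seeing the trick in miniature, here is a toy Python sketch of how onion-style layering works — with a fake XOR “cipher” standing in for real encryption (it is for illustration only, and the relay keys and message are invented for the example):

```python
import hashlib
from itertools import cycle

def toy_cipher(data: bytes, key: bytes) -> bytes:
    # Toy XOR "cipher" -- applying it twice with the same key
    # undoes it. Real onion routing uses real ciphers; this is
    # only to show the layering.
    stream = hashlib.sha256(key).digest()
    return bytes(b ^ k for b, k in zip(data, cycle(stream)))

# One key per relay in the circuit (hypothetical keys).
relay_keys = [b"relay-1-key", b"relay-2-key", b"relay-3-key"]
message = b"hello Bob"

# Alice wraps the message in one layer per relay, innermost first.
packet = message
for key in reversed(relay_keys):
    packet = toy_cipher(packet, key)

# Each relay peels away exactly one layer as the packet travels;
# only after the last ("exit") relay does the plaintext emerge.
for key in relay_keys:
    packet = toy_cipher(packet, key)

assert packet == message
```

No single relay ever holds the whole picture: the first knows who Alice is but not what she said; the last knows the message’s destination but not its origin.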
There are three main limitations:

a) Tor is not fast. All those relays slow things down. Moreover, Tor blocks plugins like Flash, Quicktime, and RealPlayer, because they can bug up the browser and reveal your real address. You need a special fix to get it to play YouTube videos.

b) Obviously, it won’t conceal your identity if you log into e-mail or any other service. It’ll just hide what Internet address you’re writing from.

c) If your government knows where you are to begin with, it could still find ways to get at your computer and any information you’re sending from it. Similarly, Tor can’t protect what’s on the computer or server at the other end, the one you’re communicating with. Only the transmissions in between are encrypted and secure. Look at that chart again: Tor doesn’t encrypt the last stage of traffic, between the “exit node” (the last relay point) and the target server. If you want to be more secure, you need to use so-called “end to end” encryption such as PGP (below), which encodes your messages from the point you create them until the intended receiver reads them.
Nonetheless, Tor remains a crucial tool if you want to browse the Internet anonymously. Download it free here.
4. Encrypt your hard drive. You should protect yourself on your own end by keeping all or part of your computer encrypted. Anybody unauthorized who tries to open it — a hacker, a policeman, a thief — won’t be able to read the information you store in encrypted files. The data can only be made readable with a “key” — that is, by entering a code that activates decryption. So the main thing is: never give away (or forget) your key.
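For the curious, here is the principle in a toy Python sketch: a passphrase is stretched into a key, and the key scrambles the data. The throwaway XOR stream cipher below stands in for the real AES that actual disk-encryption tools use — do not use it to protect anything; the passphrase and file contents are invented for the example:

```python
import hashlib
import secrets

def derive_key(passphrase: str, salt: bytes) -> bytes:
    # Stretch a passphrase into a 32-byte key (PBKDF2 is in
    # Python's standard library). The salt thwarts precomputed
    # guessing tables; the iteration count slows brute force.
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 200_000)

def keystream_xor(data: bytes, key: bytes) -> bytes:
    # Toy stream cipher: hash the key with a counter to make a
    # keystream, XOR it with the data. Illustration only.
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(b ^ k for b, k in zip(data, out))

salt = secrets.token_bytes(16)
key = derive_key("correct horse battery staple", salt)
plaintext = b"contents of client-list.xls"
ciphertext = keystream_xor(plaintext, key)

# Without the passphrase (hence the key), the bytes are gibberish.
# With it, decryption is the same operation run again.
assert keystream_xor(ciphertext, key) == plaintext
```

The point to notice: everything hangs on the key, and the key hangs on the passphrase — which is why “never give away (or forget) your key” is the whole game.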
No encryption system is perfect. Governments — particularly the well-resourced and intrusive ones, like the US, China, or Israel — are always looking for ways around the codes. The US National Security Agency spent billions on what it called “an aggressive, multipronged effort to break widely used Internet encryption technologies.” This included $250 million a year bribing corporations — sorry; I mean “actively engag[ing] the U.S. and foreign IT industries to covertly influence and/or overtly leverage their commercial products’ designs” to make them “exploitable.” Paying them, that is, to put holes in the stuff they sell. A quarter of a billion buys a lot of cooperation. Microsoft, for one, now has a policy of providing “intelligence agencies with information about bugs in its popular software before it publicly releases a fix.”
The lesson: Don’t waste money buying “proprietary,” corporate encryption systems. You have no way of knowing whether they’ve obligingly built a back door into their ramparts for US spies to pry. (And you don’t know whether the US has shared those Trojan portals with your government, if it’s an American ally. Or, if it’s not, perhaps your local spies have managed to copy US anti-cryptography shortcuts: Americans seem better at stealing others’ secrets than concealing their own.) Paradoxically, open-source software is safer precisely because its code is out there on the net for anyone to see. If a government tried to insert malware or sneak in a weakness, somebody probably would notice. And it “is in a constant state of development by experts all over the world” — meaning that a lot of beautiful minds are fixing and fine-tuning it all the time.
Here is a helpful list of five trusted file encryption tools. Many experts recommend TrueCrypt, which works with Windows, Mac, and Linux, and is free. (Reportedly, Edward Snowden used it to smuggle information on his hard drive.) It can encrypt files, folders, or whole drives. It can hide encrypted volumes for additional security. It does “real-time encryption,” meaning it decrypts and encrypts material as you work. This simplifies things for you; true, it can slow your computer’s speed somewhat, but not much — “the performance penalty is quite acceptable,” one independent review found. You can download TrueCrypt here.
IMPORTANT NOTE (November 2016): TrueCrypt is no longer considered secure. Many recommend Veracrypt, which works quite similarly to TrueCrypt, as the best alternative. You can read about other alternative encryption tools here and here.
Meanwhile, you should also use an encryption program on your phone. There are programs to encrypt not only saved content on your phone, but chat, texts, and voice calls, so that outsiders can’t intercept them. You can get these encryption programs for free:
- If you have an iPhone, download Signal here.
- If you have an Android phone, download Boxcryptor here. It’s also available for Windows computers here.
- Or, if you have an Android phone, you can download TextSecure to protect your texting, and RedPhone to protect your voice calls.
5. Encrypt your emails. Email encryption is like riding a bicycle. It’s difficult to explain it to those who haven’t tried it, without making the doer sound either superhumanly agile or insane. (“Mounted upon the high saddle, commence revolving your legs in circular and rhythmic motion, an agitation that simultaneously ensures the balance of the inch-wide wheels and propels the mechanism forward ….”) Describing it is way harder than doing it. Bear with me, and try not to be too terrified, while I try.
First, background and basics. The standard form of email encryption is named “Pretty Good Privacy,” or PGP. Phil Zimmermann invented it in the 1990s. Cryptography, he wrote, is “about the power relationship between a government and its people. It is about the right to privacy, freedom of speech, freedom of political association, freedom of the press, freedom from unreasonable search and seizure, freedom to be left alone.” The anti-war and anti-nuke movements were his particular passion, and he intended the tools for them. “PGP” has since been trademarked by a company selling a proprietary variant, but there’s a range of free, open-source versions; one, called GnuPG or GPG, is available here, and others are at the International PGP home page.
E-mail encryption relies on a sender and receiver sharing tools that let them both encrypt messages and decode them.
These tools are called “keys.” When you install the program, you’ll be asked to set up two keys — strings of characters that perform certain tasks. You will have a public key, and a secret key. Anybody can use the former, but the latter will carry a password so that only you can activate it. You must share the public key with your interlocutors — anyone who wants to send you an encrypted message needs to have your public key first, because that’s what will encrypt it for them. And you’ll need that person’s public key to write her in return. People who have PGP on their computers can communicate easily as long as they have each other’s public keys.
So let’s say Faisal wants to send you a note. Faisal will use your public key, which you’ve given him, to encrypt the message in a code that’s readable to you alone. Though your “public key” performed the coding, the message is far from public: that key is cyber-twinned with your secret key, so that only your secret key can decode what it says. You’ll reply using Faisal’s public key, in a message he can only decode with his secret key. You can also apply your secret key to “sign” that message digitally, so Faisal will know it’s authentically from you; it’s like a seal on an old-fashioned letter, showing that nothing’s been tampered with in transit.
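For the mathematically curious, the whole public-key idea can be compressed into a toy Python sketch: textbook RSA with absurdly small primes (real keys run to hundreds of digits; this is purely to show how a public key encrypts what only the matching secret key can decrypt):

```python
# Tiny textbook RSA -- insecure, for illustration only.
p, q = 61, 53
n = p * q                  # the modulus, part of both keys
phi = (p - 1) * (q - 1)
e = 17                     # public exponent: (e, n) is the public key
d = pow(e, -1, phi)        # secret exponent: (d, n) is the secret key

message = 65                        # a number standing in for "hello"
ciphertext = pow(message, e, n)     # Faisal encrypts with YOUR public key
decrypted = pow(ciphertext, d, n)   # only your secret key reverses it
assert decrypted == message

# "Signing" runs the keys the other way: you encode with your
# secret key, and anyone holding your public key can check that
# the message really came from you and wasn't tampered with.
signature = pow(message, d, n)
assert pow(signature, e, n) == message
```

The math guarantees the pairing: what one key of the pair locks, only its twin unlocks. That is the “cyber-twinning” at work.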
Several things make all this extra cumbersome.
a) You can only communicate with people who have both the software and your public key. So you’re obviously not going to encrypt all your e-mail communications — just the sensitive ones with folks who share your line of work. Some commercial “key authorities” compile online directories of users’ public keys, like phone books. Rather than relying on those, though, you’ll probably form circles of colleagues and co-conspirators who share each other’s public keys — “web of trust” is one term for this, a phrase that manages to combine Zen touchy-feeliness with faint paranoia.
b) You can only use PGP on the computers where you have it installed. If you get an encrypted message on your phone, you won’t be able to read it till you’re sitting at the computer that has your secret key. If you’re travelling and left your laptop behind, you’re screwed.
Email encryption is complicated, though once you and your correspondents get used to it, things will seem more natural and routine. Its advantage is that it safeguards information through the whole process of transmission — end to end, unlike the partial protection Tor offers. You can find more detailed descriptions of how to use it here and here.
6. Go off the record. IMPORTANT NOTE: NOVEMBER 2016: Since this was first published several other new encrypted-chat programs have appeared. I like Cryptocat: it’s easy to use, and lets you have encrypted chats with other Cryptocat users that are immediately deleted and hence leave no digital record anywhere. All you have to do is ask the person you want to chat with to download Cryptocat too. Still, the explanations below remain valid.
Millions of people worldwide used to entrust Skype with their long-distance intimacies and secrets. We now know, though, that the corporation has routinely handed over recorded conversations to the US and Chinese governments.
Off the Record (OTR) is a safer alternative. It’s a system, somewhat similar to PGP, for encrypting instant messaging over most of the major chat networks. Yet it’s much less cumbersome than PGP, and lets you communicate quickly in real time. Do not confuse OTR with the “off the record” feature in Google’s own instant messaging service, which is only as secure as Google itself — that is, not very; US state security, after all, has figured out how to trawl data from the giant corporation’s communications links. OTR encryption is really off the record, and offers you important protections.
To use OTR, you’ll need to download and install an instant-messaging client: either Pidgin or Adium. Pidgin is a free program that lets you chat with friends over the Google, MSN, Yahoo!, Jabber, and AIM networks. Adium is very similar, but specifically made for Mac. Adium has OTR built in. For Pidgin, you just have to add a special OTR encryption plugin.
From there on, it’s quite simple. All that’s required is that the person you want to chat with also have Pidgin or Adium, with OTR activated. OTR does two things for you: It encrypts the conversation, and it also lets you verify your messaging partner’s identity. (This verification formerly required exchanging a “fingerprint,” a trimmed-down version of PGP’s public keys, but recent versions of OTR simply let you use a previously-agreed-on secret word.) OTR encrypts your messages almost automatically: the two sets of software swap the necessary codes and mumbo-jumbo pretty much without either of you humans noticing.
OTR has one additional advantage that PGP e-mail doesn’t. For each chat session, the software creates a unique encryption key, then “forgets” it once the chat is over. This means that if your OTR account is compromised — if, for instance, somebody steals your computer with your chat program on it — nobody can recover and decrypt any past conversation. Effectively, those fleeting words are gone forever. This is called “forward secrecy,” and it bestows the peace of mind that forgetfulness fosters. (In PGP, by contrast, someone who obtains your private key could decode every single encrypted e-mail you’ve saved.)
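Here is forward secrecy in miniature — a toy Python sketch of ephemeral Diffie-Hellman key agreement, the kind of mechanism OTR relies on (the numbers here are laughably small by real standards, and the function names are mine, not OTR’s):

```python
import hashlib
import secrets

# Public parameters both sides agree on (toy-sized for illustration).
P = 2**61 - 1    # a prime modulus
G = 2            # a generator

def fresh_session_key() -> bytes:
    # Each chat session starts with brand-new random secrets...
    a = secrets.randbelow(P - 2) + 1       # Alice's ephemeral secret
    b = secrets.randbelow(P - 2) + 1       # Bob's ephemeral secret
    A = pow(G, a, P)                       # what Alice sends Bob
    B = pow(G, b, P)                       # what Bob sends Alice
    shared_a = pow(B, a, P)                # Alice computes the secret
    shared_b = pow(A, b, P)                # Bob computes the same one
    assert shared_a == shared_b
    # ...and the secrets are thrown away when the chat ends.
    return hashlib.sha256(str(shared_a).encode()).digest()

# Two sessions, two unrelated keys: compromising today's key
# reveals nothing about yesterday's conversation.
assert fresh_session_key() != fresh_session_key()
```

Because neither side ever stores the ephemeral secrets, there is nothing left afterward for a thief or a subpoena to recover — the forgetfulness is the protection.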
We must defend our own privacy if we expect to have any. We must come together and create systems which allow anonymous transactions to take place. People have been defending their own privacy for centuries with whispers, darkness, envelopes, closed doors, secret handshakes, and couriers. The technologies of the past did not allow for strong privacy, but electronic technologies do.
— Eric Hughes, “A Cypherpunk’s Manifesto,” 1993
In the early 1990s, I taught for two years in Romania. The apartment I lived in had been the American lecturer’s residence since the mid-1960s; microphones riddled it, so many that at night I thought I could hear the wiretaps faintly clicking like sickly crickets, and I got an electric shock when I touched one particularly wired stretch of wall. The last Fulbright professor who’d served before the Revolution told me how he and his wife decided, in the cold November of 1989, to host a Thanksgiving dinner for their Romanian colleagues. It took them days to find a starved excuse for a turkey; then they faced the dilemma of making stuffing, when no vegetables graced the market at all. They’d spent a day in the kitchen debating the difficulty, till someone knocked at the door. A little man hunched outside, bundled against the wind. Springing into speech, he hinted that some colleagues — well, cousins, who intimately attended to matters about the flat, had phoned him regarding a problem here that, perhaps for a fee, needed fixing. He gestured vaguely at a tall-antennaed car parked (as it was always parked) down the road. “I understand,” he said, “that you are discussing how to stuff a bird. I can help. I am a licensed taxidermist …”
It was funny, and not funny. When I lived there the city still roiled with ethnic hate and nationalist hysteria. As a gay man and a human rights activist, who visited prisons on most off days, I was an object of exceptional interest. The secret police called in a friend of mine, and interrogated him about every syllable of our conversation the night before in my living room. They warned him I would recruit him into “a spy ring of Hungarians, Jews, and homosexuals undermining the Romanian nation.” I went to the United States for a couple of months that summer. Showering in the cramped bathroom in my father’s house, I started talking idly to myself, then stopped in terror: Had I repeated a secret? What if they were listening? The surge of relief when I realized there were no ears around was as if a dam burst behind my tensed muscles. I realized the constant and intolerable pressure I’d lived under for a year, always watched, always overheard.
The same year I settled in Romania, 1992, a few radical computer geeks in San Francisco started a mailing list that eventually grew into the Cypherpunk movement. Loathing of state surveillance drew them together, and a belief that technology could forge tools to resist. Their ideology was a remarkable faith that code should be public and knowledge shared so that people could stay private and intimacy stay intact:
Cypherpunks write code. We know that someone has to write software to defend privacy, and since we can’t get privacy unless we all do, we’re going to write it. We publish our code so that our fellow Cypherpunks may practice and play with it. Our code is free for all to use, worldwide. … We know that software can’t be destroyed and that a widely dispersed system can’t be shut down.
Cypherpunks deplore regulations on cryptography, for encryption is fundamentally a private act. The act of encryption, in fact, removes information from the public realm. Even laws against cryptography reach only so far as a nation’s border and the arm of its violence. Cryptography will ineluctably spread over the whole globe, and with it the anonymous transactions systems that it makes possible.
There’s a lot of our world in that manifesto.
Electronic technologies “allow for strong privacy.” But they also destroy it, at least when states and corporations wield them. I used to feel innocently sure in the US that the listening ears weren’t there; I wouldn’t feel it now. That watchfulness, inculcated in the bone, is the condition we inhabit; that no-man’s land is where we live.
The struggle between computer and computer, to see and not to be seen, is the new arms race and Cold War. Unless you want to drop out, turn Unabomber and settle in a cabin with the wires all cut, paranoiacally interrogating and torturing your carrier pigeons, you have to take a side. Choosing the technologies of privacy is about as close as you can come to choosing freedom. Yet it means living walled in by technology’s protections. The tension won’t go away.