Do You Need OPSEC if You Have Nothing to Hide?

[dropcap]W[/dropcap]e talk a lot about OPSEC and PERSEC, as well as how you should be communicating with and protecting your group—or yourself.

One of the biggest objections we hear about OPSEC or see posted by people on social media is that OPSEC is unnecessary because “we have nothing to hide.” This article will answer that, and is the first in a series where we’ll explore those objections in detail.

OPSEC and Chess

The Cryptosphere has a fantastic explanation of why you do have something to hide. All of you. And you very well SHOULD. To paraphrase for the folks who don’t spend their days dealing with game theory:

Imagine you’re playing chess. You see the whole board, you see all the pieces, and every possible move and rule is available to you. People involved with game theory call that “perfect information”: having “the same information to determine all of the possible games (all combinations of legal moves) as would be available at the end of the game.” When you’re playing chess, all possible moves are right there. The other player isn’t hiding the board, they’re not hiding their pieces, and they’re not suddenly changing the rules (hence the phrase “above board”). Chess is chess. Now, this would be a situation of “perfect information” except for one problem.

Most humans don’t possess the cognitive processing paths allowing them to treat chess as a game of perfect information. We’re simply not primed or trained to see all those possible moves from all sides.

Why do you think it was such a big deal when Garry Kasparov beat Deep Blue, IBM’s chess supercomputer, in 1996 (and lost the 1997 rematch)? Because computers can actually process perfect information. Your brain usually doesn’t have the capacity to use it, even when it’s available to you.

To break it down further, tic-tac-toe is another game the article mentions. With tic-tac-toe’s 9-square board, you could use a decision tree to map out every possible move by both players throughout the game. You could literally have a blueprint for how to win, because in any given board configuration you would know every move your opponent could possibly make. That is what it means to have perfect information.
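To make that concrete, here is a minimal sketch (in Python, as an illustration only) of exhaustively walking tic-tac-toe’s entire decision tree. Because the game is small enough for a computer to enumerate every legal sequence of moves, a player with this tree truly has perfect information: nothing the opponent can do is a surprise.

```python
def winner(board):
    """Return 'X' or 'O' if someone has three in a row, else None."""
    lines = [(0, 1, 2), (3, 4, 5), (6, 7, 8),   # rows
             (0, 3, 6), (1, 4, 7), (2, 5, 8),   # columns
             (0, 4, 8), (2, 4, 6)]              # diagonals
    for a, b, c in lines:
        if board[a] and board[a] == board[b] == board[c]:
            return board[a]
    return None

def count_games(board, player):
    """Recursively enumerate every complete legal game from this position."""
    if winner(board) or all(board):          # game over: win or draw
        return 1
    total = 0
    for i in range(9):
        if not board[i]:
            board[i] = player                # try this move...
            total += count_games(board, 'O' if player == 'X' else 'X')
            board[i] = None                  # ...then undo it
    return total

print(count_games([None] * 9, 'X'))  # 255168 possible games
```

That the full tree (255,168 complete games) can be walked in well under a second is exactly why tic-tac-toe is a solved game, and why humans, who cannot hold a comparable tree for chess in their heads, fall short of perfect information even when the board hides nothing.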

How does this apply to you? Now imagine playing chess when your opponent decides midway through the game that the rules have changed. He hides his pieces, then suddenly has extra ones. Then you realize you don’t even know how many pieces he’s playing with. He’s hiding half the board, and he changes which half he’s hiding at any given time. How well could you play?

Guess what? That’s exactly the kind of chess game you’re playing right now, whether you like it or not.

The Game is Stacked Against You

To say “I have nothing to hide” honestly, I’d have to possess perfect information in the context of making that decision: perfect information not only about every past move leading up to the decision, but about every future move after it. It assumes that all “pieces” are above the board and that I know all the rules of this game. And that’s demonstrably incorrect.

Let’s take the assets and programs of the National Security Agency as some of our game pieces. For them to be above the board, we’d need the government to be both honest and accountable about them. Instead, NSA Director Keith Alexander has repeatedly lied to the public about every aspect possible. So has Director of National Intelligence James Clapper. They’ve lied to us as individual players and to Congress as what we might call a Superplayer: about buildings, assets, programs, collected materials; everything we’d need to get a good idea, let alone a complete idea, of the pieces on the playing board.

Now, that’s just the pieces. Let’s look at the board you’re playing on.

In order to play chess you’ve got to abide by certain rules, but there’s a trade-off: the rules are all made plain beforehand. You’re not going to get midway through the game and then be challenged about the legality of your opening move, either due to a rule that was hidden from you or due to a new interpretation of an old rule. But in the game model we’re dealing with here, government in general and intelligence agencies in particular have established exactly this possibility. As one example: the very court opinions and administration interpretations of the Patriot Act allowing the government to order telecommunications companies to collect and provide massive amounts of data on US citizens are secret.

…once you seemingly violate a rule that you’re not aware of, or once the administration alters its interpretation of the rule to make you a violator, they can now go back through every communication within their grasp and piece it together in any way they desire in order to make you appear guilty as sin. [emphasis added]

Without you knowing, at any step of the process.

What’s It All Mean?

This all adds up to a very simple bottom line. By saying “I have nothing to hide,” you are making very dangerous and false assumptions.

You’re assuming that both players in the chess game (you and the government) agree on the rules, and that those rules won’t change. We have seen plenty to know that we are all most definitely NOT in agreement about the rules, and that those rules change at the opponent’s whim, or even after the fact.

You’re assuming that both players know how many pieces are on the table. We also know this is false; your opponent has pieces you aren’t aware of, many of which are deployed against you and others like you. They swap those pieces out at will, upgrade them when possible, and even stack their pieces in ways that violate whatever rules they previously agreed to.

You’re assuming that both players are playing openly. Obviously this is false as well. While you’re playing “openly” and claiming you’re pure as the driven snow, your opponent is playing the game at a whole other level, a level you don’t see. When it moves pieces, you don’t know. When it changes the rules, you don’t know. When it adds a host of new pieces, or even has one of its pieces pretend to be one of yours, you don’t know.

You’re assuming that your moves only affect your own game. If you truly believe this, then you are the worst kind of security risk: the person who thinks he can act however he wants without affecting anyone else. Your moves affect every game being played around you; your opponent can play many, many games simultaneously, and has no problem taking strategy or information from your game and using it to beat another player. Your arrogance, lack of understanding, and refusal to comprehend the “game” can and will get someone else killed or arrested.

“I have nothing to hide” means you’re playing an asymmetric information game exactly the way the other players want you to: poorly. Out of some mythical principle you’ve chosen to tie both hands behind your back in order to play a game that the intelligence agencies won’t even tell you the rules to. This is a game you will lose every time, because not only do the other players have more information than you, they also have just about all the power in the situation. Remember: strategy in asymmetric games is dictated by the power imbalance between the players. Relinquishing both your power and your information is not a strategy; it’s suicide.

The thing about suicide is that it affects everyone around you. It’s not a solo activity. So next time you shrug your shoulders on Facebook and turn your nose up at protecting your own information and that of your group, remember this: you don’t have perfect information, and this game isn’t being played fairly. If you want to play that way anyhow, then others will pay the price for your actions.

6 Things You Should Never Do With a Burner Phone

I get a lot of questions about burner phones. What kind to buy, how to buy, where to buy. The problem is, people go buy them and then use them improperly—completely defeating the purpose.

There is most definitely a right and a wrong way to use a burner phone. We’ll talk about 6 things that you should never, ever, under any circumstances do with your burner. In fact, if you have one and you have EVER done any of these things, you can assume that anything you talked about or did while it was in your possession is already known by your adversary.

1. Buy your burner phone anywhere you normally are.

This one doesn’t necessarily deal with usage, but it’s necessary to mention. If your idea of tradecraft is going to the Wal-Mart 5 miles from your house instead of the Target that’s 2 miles from your house, then please slap yourself for me. Don’t buy it near your work, your home, don’t buy it at the gas station you normally go to, the quickie mart where you get your smokes at 10pm, or anywhere else you ever go to. In fact, it’s also a good idea to not go in your own car. Don’t do anything you normally do, don’t stop anywhere you normally stop, and whatever you do, don’t take your regular phone with you. Have a cover story just in case. Always have a cover.

2. Put all your contacts in your burner.

It might seem like common sense, but you’d be surprised at how many people go out of their way to purchase one “correctly,” and then immediately put their new phone side by side with their old one so they can put all their contacts in it. Or even worse, they simply log into their cloud account and download their contacts backup. I should not have to explain how beyond moronic this is. Burner phones are not for chatting people up. They’re for coordination, passing short bursts of time-sensitive information, etc. In other words, you use them if you have to, and only to speak to another burner phone.

3. Install all your regular apps.

Pay very close attention to these words from Grugq:

Just 4 apps are enough to reidentify users 95% of the time. A complete list of installed apps is unique for 99%.

Your burner phone is not your personal phone. Say that out loud to yourself until you understand it. Your burner has one purpose, and one purpose only. Don’t install Wickr on it and sign in with your regular username. Don’t install Candy Crush on it because that’s how you kill time with your regular phone. Don’t install that one app you can’t do without. Your burner is not your personal phone.

Read the rest at Patrick Henry Society. When you’re done, take a look at the Groundrod Primer class coming up. You need it.

LastPass Unsafe: Easy Attack Gives Access to Anyone

Do you use LastPass? Might want to rethink that.

Unless, of course, you don’t mind someone getting all of your passwords.

Judging by how people are with their digital security, what are the odds that your LastPass master password is the same password as a whole bunch of your other stuff…?

 

Signal vs. Wickr: How Secure is Your Secure Messaging App?

Bottom line: Facebook doesn’t cut it; in fact, if you’re still using Facebook to coordinate, recruit, and communicate about your activities (stop doing roll calls!), then you’re a liability to your contacts–there’s no two ways about it. You need secure messaging. No excuses.

Some of you have a secure messaging app you use—but is it secure? The Electronic Frontier Foundation released a Secure Messaging Scorecard that will tell you, and we’ll flesh those ratings out with information from other experts. Let’s see how two of the more prominent apps stack up.

Secure Messaging Criteria

EFF uses a list of criteria to grade each application on a simple yes/no basis: these are the features a secure messenger should have; does the app have them or not? Some of these criteria include whether your password or identifying details are stored on the provider’s servers, and whether the provider itself can access your messages. Even a full green light doesn’t mean the app is completely government-proof, but it gives you a good idea of whether you’ll at least make them work for it, and whether the company is on the right track in terms of its goals and capability.
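A yes/no scorecard like EFF’s is easy to reason about precisely because it reduces to a checklist. As a rough sketch (the criteria names below are an illustrative subset, not EFF’s exact wording), you can model it as a table of booleans, with Wickr’s two missing items taken from the discussion in this article:

```python
# Hypothetical subset of scorecard-style criteria, graded yes/no per app.
criteria = [
    "encrypted in transit",
    "provider cannot read messages",
    "code open to independent review",
    "security design properly documented",
    "recent independent audit",
]

apps = {
    # Wickr: mostly green, but fails the two openness criteria noted above.
    "Wickr": {c: True for c in criteria} | {
        "code open to independent review": False,
        "security design properly documented": False,
    },
    # Signal: green across the board on this illustrative list.
    "Signal": {c: True for c in criteria},
}

for name, scores in apps.items():
    passed = sum(scores.values())
    verdict = "all green" if passed == len(criteria) else f"{passed}/{len(criteria)}"
    print(f"{name}: {verdict}")
```

The point isn’t the code; it’s that a binary checklist gives you a fast, repeatable way to compare apps before you trust one with anything sensitive.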

Wickr

Perhaps one of the most popular apps used by those in the movement, Wickr claims that their level of security is better than any other app on the market. It’s free to boot, which makes it highly attractive to many. It has a mostly green light from EFF, but the problem is that Wickr is missing two critical components:

  • Its code is not open for independent review and audit.
  • The security design is not properly documented; i.e., public.

One of the most important parts of the security process is making each app’s code available for other coders and security researchers to audit. It’s a self-imposed accountability system that allows the community to verify quality and confirm that apps actually do what they claim. In addition, developers typically release a white paper or other technical document explaining in detail how their encryption works, again for accountability and transparency. If the system’s encryption is solid, it doesn’t matter that every single line of code is publicly available; this is a cornerstone of modern cryptography known as Kerckhoffs’s principle, which holds that a system should remain secure even if everything about it except the key is public. Audits like these have caught both backdoors and coding errors, resulting in a better product. When you’re talking about life-and-death communications, you need the most secure app available, and audits help achieve that through public disclosure of both the encryption design and the code itself. The keys are what stay private.
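Here is a minimal sketch of that principle in action, using Python’s standard-library HMAC (message authentication rather than encryption, chosen only because it needs no third-party packages). The algorithm, HMAC-SHA256, is completely public and heavily audited; the only secret is the key:

```python
import hashlib
import hmac
import secrets

# The algorithm (HMAC-SHA256) is fully public; security rests on the key alone.
key = secrets.token_bytes(32)          # the ONLY thing that must stay private
msg = b"meet at the usual place"

# Anyone can read this code, yet without the key they cannot forge a valid tag.
tag = hmac.new(key, msg, hashlib.sha256).digest()

# An attacker guessing a key produces a tag that will not verify.
attacker_tag = hmac.new(secrets.token_bytes(32), msg, hashlib.sha256).digest()
print(hmac.compare_digest(tag, attacker_tag))  # False: wrong key, wrong tag
```

That is exactly why open code plus public design documents cost a well-built system nothing: publishing the blueprint doesn’t publish the key.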

Wickr, however, has not released its code (refusing to even consider it), and that’s caused an interesting debate in the security community. Security researcher Brian Krebs puts Wickr in a group of apps “that use encryption the government says it can’t crack,” but others aren’t so sure. This video explains some of the reasons why you should perhaps think twice before trusting your secure information to Wickr. The video was made in 2014, so it would be a good idea to check some of the documents he discusses to see whether any of these issues have changed. (I can tell you from experience that his first issue, Wickr storing your password on their server after claiming they do not, has not been rectified yet. Also check out his other videos, especially the one regarding your contacts.)

Several other security researchers have also voiced concerns regarding Wickr’s lack of open source accountability.

 

“We have a kind of a maxim in our field, in cryptography, which is that the systems should be open,” says Matthew Green, a cryptography researcher and professor at Johns Hopkins University Information Security Institute. […] For Green, that means “if you don’t know how a system works, you kind of have to assume that it’s untrustworthy.” He adds that this is not about being an open source activist. But Wickr, he says, doesn’t even have white papers on its website explaining how the system works…”From my perspective I don’t think the company should be telling us, ‘Trust us, it’s safe,’ ‘Trust us, it’s encrypted,’ or ‘Trust us, it’s audited,'” says Nadim Kobeissi, a cryptographer and founder of encrypted browser-based chat service Cryptocat. “We should be able to verify ourselves.”

Others believe that Wickr’s refusal to make their code open to independent audit is just fine. Dan Kaminsky, a security guru, has said he personally audited Wickr’s code and it’s secure. However, Matthew Green sums it up thusly:

Should I use this to fight my oppressive regime? Yes, as long your fight consists of sending naughty self-portraits to your comrades-at-arms. Otherwise, probably not.

It’s each individual’s choice whether to use Wickr, and Kaminsky’s admonition that “nothing is 100% secure” is a fair one. I use Wickr myself, but not exclusively, and not for anything critical.

Signal

Another increasingly popular app is Signal (formerly RedPhone and TextSecure). Offering both texting and secure calling, the EFF gives Signal a green light across the board. It has all of the encryption features of Wickr, and also has open source code and documented encryption processes. Matthew Green says that it “does not retain a cache of secrets from connection to connection.” The Intercept also endorses Signal, with the caveat that any app you install is only as secure as the device you install it on. Other endorsers include Bruce Schneier, Edward Snowden, and Laura Poitras (for whatever that may be worth to you personally).

Like Wickr, Signal also has a desktop version. And, since it’s tied to the device, it doesn’t save your password on a server like Wickr does. From Signal’s website:

The Axolotl ratchet in Signal is the most advanced cryptographic ratchet available. Axolotl ensures that new AES keys are used for every single message, and it provides Signal with both forward secrecy and future secrecy properties. The Signal protocol also features enhanced deniability properties that improve on those provided by OTR, except unlike OTR all of these features work well in an asynchronous mobile environment.
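To illustrate what “new AES keys for every single message” and “forward secrecy” mean in practice, here is a deliberately simplified sketch of a symmetric KDF chain, the kind of construction Signal’s ratchet (Axolotl, now called the Double Ratchet) builds on. This is not the real protocol, which also mixes in fresh Diffie-Hellman outputs; it only shows the one-way chain idea:

```python
import hashlib
import hmac

def ratchet_step(chain_key: bytes):
    """Derive a fresh per-message key, then advance the chain key.

    Simplified KDF chain in the spirit of Signal's ratchet: two HMAC
    calls with distinct labels so the message key and the next chain
    key cannot be derived from one another.
    """
    message_key = hmac.new(chain_key, b"\x01", hashlib.sha256).digest()
    next_chain = hmac.new(chain_key, b"\x02", hashlib.sha256).digest()
    return message_key, next_chain

# Start the chain from an initial shared secret (hypothetical value;
# in Signal this comes from the key-agreement handshake).
chain = hashlib.sha256(b"shared secret from key agreement").digest()

keys = []
for _ in range(3):
    mk, chain = ratchet_step(chain)
    keys.append(mk)

# Every message gets its own unique key.
print(len(set(keys)) == len(keys))  # True
```

Because the chain only moves forward through a one-way function, compromising today’s chain key reveals nothing about yesterday’s message keys; that is the forward-secrecy property the quote above refers to.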

For those who would like to audit Signal’s code themselves, you can find that here.

Conclusion

What you choose to use and trust is a personal decision. Nothing is completely secure all of the time; anything critical should be kept to face-to-face meetings. In addition, all standard OPSEC rules should apply. (For a real-world case of security fails and how that ended, read this story.) For those who claim that “we aren’t doing anything illegal,” keep in mind that we have reached a point where that determination is made on a case-by-case basis, and the odds are not in your favor. I also daresay that there are quite a few people recently put in jail who, if they’re smart, are rethinking a lot of their OPSEC and security strategies. Besides, as world-renowned information security researcher the Grugq points out, “OPSEC is prophylactic, you might not need it now, but when you do, you can’t activate it retroactively.”

I’ll do a future article on other apps such as Silent Circle, Telegram, Zello, and more. In the meantime, sit down and decide what your critical information is. Do some basic threat analysis. Next, do some research on the above programs and decide what you can afford to compromise in terms of security. For many users of secure chat, it’s a life or death decision. Keep that in mind.

Above all, take the time to research and learn. You don’t have to be a computer wizard, but you do need to learn the basics of encryption and how to protect yourself. There’s an excellent beginner primer here (add this blog to your daily reads). For those who prefer a classroom setting, we have the Groundrod Primer class coming up in a few weeks. We highly recommend you check out both.

Whatever you do, for the love of Pete, stop using Facebook as a coordination, networking and recruiting tool.

SHTF Intelligence Course

Sam Culper of Forward Observer Magazine is teaching a SHTF Intelligence course in Spokane, WA in mid-March. If you haven’t taken this class yet, you need to–and if you’re one of the folks who have been asking us for an intelligence class in the Spokane area, we’ll simply point you in Sam’s direction for this one. He’s one of the best out there for this particular topic; he literally wrote the book on it. Here’s a taste of what you’ll be learning:

– threat identification
– threat analysis
– understanding the threat environment and you
– understanding the community security mission
– community security strategies
– the Intelligence Cycle
– how to gather intelligence information (specific for your locale)
– how to analyze incoming intelligence information
– how to set up a community intelligence section
– the fundamental tasks and responsibilities of the intelligence section
– Intelligence Preparation of the Battlefield & Community
– Area Assessments

There’s a lot more information on the FOMag website. If you want to know what’s really going on around you, if you truly want to understand the threats we face, and if you want to learn how to effectively deal with community intelligence then you need this course. Don’t wait until SHTF to care about this stuff—you need to understand SHTF Intelligence NOW.

As an added bonus, you can help out TOWR’s mission as well by attending! Sam has agreed to donate to TOWR for any students who we send to his course. So, go learn some critical skills AND let him know we sent you, so you can help us bring you more classes as well, such as the Groundrod Primer class in just a few weeks!