Bottom line: Facebook doesn’t cut it; in fact, if you’re still using Facebook to coordinate, recruit, and communicate about your activities (stop doing roll calls!), then you’re a liability to your contacts–there’s no two ways about it. You need secure messaging. No excuses.
Some of you have a secure messaging app you use—but is it secure? The Electronic Frontier Foundation released a Secure Messaging Scorecard that will tell you, and we’ll flesh those ratings out with information from other experts. Let’s see how two of the more prominent apps stack up.
Secure Messaging Criteria
EFF grades each application on a simple yes/no basis against a list of criteria: these are the features a secure app should have; does it have them or not? Some of these criteria include whether your password or identifying details are stored on the provider's servers, and whether the provider itself can access your messages. Even a full green light doesn't mean the app is completely government-proof, but it gives you a good idea of whether you'll at least make them work for it, and whether the company is on the right track in terms of its goals and capabilities.
Wickr

Perhaps one of the most popular apps used by those in the movement, Wickr claims that its level of security is better than that of any other app on the market. It's free to boot, which makes it highly attractive to many. It gets a mostly green scorecard from EFF, but Wickr is missing two critical components:
- Its code is not open for independent review and audit.
- Its security design is not properly documented, i.e., made public.
One of the most important parts of the security process is ensuring that each app's code is available for other coders and security researchers to audit. It's a self-imposed accountability system that lets the community ensure quality and verify that apps do what they claim to do. In addition, developers typically release a white paper or other technical document explaining in detail how their encryption process works; again, for accountability and transparency. If the system's encryption design is solid, it doesn't matter if every single line of code is publicly available. Audits like these have caught both backdoors and coding errors, resulting in a better product. When you're talking about life-and-death communications, you need the most secure app available. Audits help achieve that through public disclosure of both the encryption design and the code itself. The keys are what stay private.
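The idea that publishing the algorithm costs you nothing as long as the keys stay secret is known as Kerckhoffs's principle. A minimal sketch in Python (a toy one-time pad for illustration only, not any real app's cipher): the XOR algorithm below is fully public, and the security rests entirely on the secret key.

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Encrypt or decrypt by XOR-ing each byte with a same-length secret key.

    The algorithm is completely public; with a truly random, never-reused
    key, the ciphertext reveals nothing without that key.
    """
    if len(key) != len(data):
        raise ValueError("one-time pad key must match message length")
    return bytes(d ^ k for d, k in zip(data, key))

message = b"meet at the usual place"
key = secrets.token_bytes(len(message))   # the only secret in the system
ciphertext = xor_cipher(message, key)

assert ciphertext != message                      # unreadable without the key
assert xor_cipher(ciphertext, key) == message     # same operation decrypts
```

Publishing `xor_cipher` for audit changes nothing about its security; only leaking `key` would. That is exactly the distinction the audit argument rests on.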
Wickr, however, has not released its code (refusing even to consider it), and that has caused an interesting debate in the security community. Security researcher Brian Krebs puts Wickr in a group of apps "that use encryption the government says it can't crack," but others aren't so sure. This video explains some of the reasons why you should perhaps think twice before trusting your secure information to Wickr. The video was made in 2014, so it would be a good idea to check the documents he discusses to see whether any of these issues have changed. (I can tell you from experience that his first issue, Wickr storing your password on its server after claiming it does not, has still not been fixed. Also, check out his other videos, especially the one regarding your contacts.)
Several other security researchers have also voiced concerns regarding Wickr’s lack of open source accountability.
“We have a kind of a maxim in our field, in cryptography, which is that the systems should be open,” says Matthew Green, a cryptography researcher and professor at Johns Hopkins University Information Security Institute. […] For Green, that means “if you don’t know how a system works, you kind of have to assume that it’s untrustworthy.” He adds that this is not about being an open source activist. But Wickr, he says, doesn’t even have white papers on its website explaining how the system works…”From my perspective I don’t think the company should be telling us, ‘Trust us, it’s safe,’ ‘Trust us, it’s encrypted,’ or ‘Trust us, it’s audited,'” says Nadim Kobeissi, a cryptographer and founder of encrypted browser-based chat service Cryptocat. “We should be able to verify ourselves.”
Others believe that Wickr's refusal to open its code to independent audit is just fine. Dan Kaminsky, a security guru, has said he personally audited Wickr's code and found it secure. However, Matthew Green sums it up this way:
Should I use this to fight my oppressive regime? Yes, as long as your fight consists of sending naughty self-portraits to your comrades-at-arms. Otherwise, probably not.
It's each individual's choice whether to use Wickr, and Kaminsky's admonition that "nothing is 100% secure" is a fair one. I use Wickr myself, but not exclusively, and not for anything critical.
Signal

Another increasingly popular app is Signal (formerly RedPhone and TextSecure). It offers both secure texting and secure calling, and the EFF gives it a green light across the board. It has all of the encryption features of Wickr, plus open source code and documented encryption processes. Matthew Green notes that it "does not retain a cache of secrets from connection to connection." The Intercept also endorses Signal, with the caveat that any app you install is only as secure as the device you install it on. Other endorsers include Bruce Schneier, Edward Snowden, and Laura Poitras (for whatever that may be worth to you personally).
Like Wickr, Signal also has a desktop version. And, since it’s tied to the device, it doesn’t save your password on a server like Wickr does. From Signal’s website:
The Axolotl ratchet in Signal is the most advanced cryptographic ratchet available. Axolotl ensures that new AES keys are used for every single message, and it provides Signal with both forward secrecy and future secrecy properties. The Signal protocol also features enhanced deniability properties that improve on those provided by OTR, except unlike OTR all of these features work well in an asynchronous mobile environment.
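To make the "new key for every message" idea concrete, here is a minimal symmetric-key ratchet sketch in Python. This illustrates the general concept only, and is not Signal's actual Axolotl/Double Ratchet implementation: each message gets a fresh key derived from a chain key, and the chain key is then advanced through a one-way function, so a key captured today cannot unlock messages already sent (forward secrecy).

```python
import hmac
import hashlib

def ratchet_step(chain_key: bytes) -> tuple[bytes, bytes]:
    """Derive a per-message key, then advance the chain key one-way.

    Because HMAC-SHA256 is one-way, learning the current chain key does
    not let an attacker run the ratchet backwards to earlier keys.
    """
    message_key = hmac.new(chain_key, b"message", hashlib.sha256).digest()
    next_chain = hmac.new(chain_key, b"chain", hashlib.sha256).digest()
    return message_key, next_chain

# Start the chain from a shared secret (in a real protocol this would
# come from an authenticated key agreement, not a hardcoded string).
chain = hashlib.sha256(b"shared secret from key agreement").digest()

keys = []
for _ in range(3):
    message_key, chain = ratchet_step(chain)
    keys.append(message_key)

assert len(set(keys)) == 3  # every message encrypted under a distinct key
```

The real protocol layers a Diffie-Hellman ratchet on top of this symmetric chain (which is where the "future secrecy" in the quote comes from), but the forward-secrecy property is visible even in this toy version.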
For those who would like to audit Signal’s code themselves, you can find that here.
What you choose to use and trust is a personal decision. Nothing is completely secure all of the time; anything critical should be kept to face-to-face meetings. In addition, all standard OPSEC rules apply. (For a real-world case of security failures and how it ended, read this story.) For those who claim that "we aren't doing anything illegal," keep in mind that we have reached a point where that determination is made on a case-by-case basis, and the odds are not in your favor. I also daresay that quite a few people recently put in jail are, if they're smart, rethinking a lot of their OPSEC and security strategies. Besides, as world-renowned information security researcher The Grugq points out, "OPSEC is prophylactic, you might not need it now, but when you do, you can't activate it retroactively."
I’ll do a future article on other apps such as Silent Circle, Telegram, Zello, and more. In the meantime, sit down and decide what your critical information is. Do some basic threat analysis. Next, do some research on the above programs and decide what you can afford to compromise in terms of security. For many users of secure chat, it’s a life or death decision. Keep that in mind.
Above all, take the time to research and learn. You don’t have to be a computer wizard, but you do need to learn the basics of encryption and how to protect yourself. There’s an excellent beginner primer here (add this blog to your daily reads). For those who prefer a classroom setting, we have the Groundrod Primer class coming up in a few weeks. We highly recommend you check out both.
Whatever you do, for the love of Pete, stop using Facebook as a coordination, networking and recruiting tool.