[liberationtech] My CPJ blog: Lessons from the Cryptocat debate
Griffin Boyce
griffinboyce at gmail.com
Tue Sep 11 14:16:16 PDT 2012
Hey all,
Personal feelings aside, there seem to be some problems with the article.
For one, much of the criticism focused on using a web browser alone
(without a plugin or standalone application). Now that it's served entirely
via plugin, those criticisms are no longer on the table.
That some people criticize Cryptocat for using SSL seems very
disingenuous, as we rely on SSL to help secure everything from bank
accounts to email. On top of that, Cryptocat now uses OTR, which is the
gold standard in real-time communication encryption. It's only slight
hyperbole to say that everyone uses OTR-encrypted chat, though usually
coupled with Pidgin or Adium. The much-lauded TextSecure also used OTR for
its encryption process. Chat is not dramatically more secure when it's
done from a stand-alone application.
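To be concrete about what "relying on SSL" buys: the point is not blind trust, but that mainstream TLS stacks verify certificates and hostnames by default. A minimal Python sketch (my own illustration, not part of Cryptocat's code):

```python
import ssl

# Python's standard-library TLS context ships with certificate
# verification and hostname checking enabled by default -- part of
# why SSL/TLS is the accepted baseline for banking and email rather
# than a weakness to single out.
context = ssl.create_default_context()

print(context.verify_mode == ssl.CERT_REQUIRED)  # certificate checks on
print(context.check_hostname)                    # hostname checks on
```

Both checks print True out of the box; an application has to go out of its way to disable them.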
If it seems like I'm nit-picking, it's because there are a lot of nits to
pick. And it's not just with this piece or with pieces about cryptocat.
This is a consistent problem with tech journalism as a whole, where apps
are created, audited, patched, audited again, and reworked, making
fact-checking more difficult. If criticisms A, B, C, and D apply to version
1.0, but only criticism D still applies in version 1.5, then writing about all
four as if they were active concerns is not only incorrect, but is a very
sensitive topic for developers.
And frankly, the use of the past tense to describe Cryptocat seems a bit...
harsh (even if that wasn't your intention). It's still a solid app, still
under development, and still used by a lot of very passionate users.
Best,
Griffin Boyce
--
"I believe that usability is a security concern; systems that do
not pay close attention to the human interaction factors involved
risk failing to provide security by failing to attract users."
~Len Sassaman
PGP Key etc: https://www.noisebridge.net/wiki/User:Fontaine
On Tue, Sep 11, 2012 at 1:07 PM, <frank at journalistsecurity.net> wrote:
> Hi everybody,
>
> Below is my CPJ blog on the Cryptocat debate. It makes some of the same
> points that I already made here a few weeks ago. And please know that my
> intent is to help work toward a solution in terms of bridging invention and
> usability. I know there are different views, and I have already heard some.
> Please feel free to respond. (If you wish, you may copy me at
> frank at journalistsecurity.net to avoid me missing your note among others.)
>
> Thank you! Best, Frank
>
>
> http://www.cpj.org/security/2012/09/in-cryptocat-lessons-for-technologists-and-journal.php
>
>
> *In Cryptocat, lessons for technologists and journalists*
> By Frank Smyth, Senior Adviser for Journalist Security <http://www.cpj.org/blog/author/frank-smyth>
> *Alhamdulillah! *Finally, a technologist designed a security tool that
> everyone could use. A Lebanese-born, Montreal-based computer scientist,
> college student, and activist named Nadim Kobeissi had developed a
> cryptography tool, Cryptocat <https://crypto.cat/>, for the Internet that
> seemed as easy to use as Facebook Chat but was presumably far more secure.
> Encrypted communications are hardly a new idea. Technologists wary of
> government surveillance have been designing free encryption software since the
> early 1990s <http://www.pgpi.org/doc/overview/>. Of course, no tool is
> completely safe, and much depends on the capabilities of the eavesdropper.
> But for decades digital safety tools have been so hard to use that few
> human rights defenders and even fewer journalists (my best guess is one in
> 100) employ them.
> Activist technologists often complain that journalists and human rights
> defenders are either too lazy or too foolish to use digital safety tools
> consistently when they are operating in hostile environments. Journalists
> and many human rights activists, for their part, complain that digital
> safety tools are too difficult or time-consuming to operate, and, even if
> one tried to learn them, they often don't work as expected.
> Cryptocat promised <http://www.wired.com/threatlevel/2012/07/crypto-cat-encryption-for-all/all> to finally bridge these two distinct cultures. Kobeissi was
> profiled <http://www.nytimes.com/2012/04/18/nyregion/nadim-kobeissi-creator-of-a-secure-chat-program-has-freedom-in-mind.html> in
> *The New York Times*; *Forbes* <http://www.forbes.com/sites/jonmatonis/2012/07/19/5-essential-privacy-tools-for-the-next-crypto-war/> and especially
> *Wired* <http://www.wired.com/threatlevel/2012/07/crypto-cat-encryption-for-all/all> each praised the tool. But Cryptocat's sheen faded fast. Within three
> months of winning a prize associated with *The Wall Street Journal* <http://datatransparency.wsj.com/>,
> Cryptocat ended up like a cat caught in a storm--wet, dirty, and a little
> worse for wear. Analyst Christopher Soghoian--who wrote a *Times* op-ed last
> fall <http://www.nytimes.com/2011/10/27/opinion/without-computer-security-sources-secrets-arent-safe-with-journalists.html> saying that journalists must learn digital safety skills to protect
> sources--blogged that Cryptocat had far too many structural flaws <http://paranoia.dubfire.net/2012/07/tech-journalists-stop-hyping-unproven.html?utm_source=Contextly&utm_medium=RelatedLinks&utm_campaign=AroundWeb> for safe use in a repressive environment.
> An expert writing in *Wired* agreed. Responding to another *Wired* piece
> just weeks before, Patrick Ball said the prior author's admiration of
> Cryptocat was "inaccurate, misleading and potentially dangerous <http://www.wired.com/threatlevel/2012/08/wired_opinion_patrick_ball/2/>."
> Ball is one of the developers, at the Silicon Valley-based nonprofit
> Benetech <http://www.benetech.org/>, of
> Martus <http://www.benetech.org/human_rights/martus.shtml>, an encrypted
> database used by groups to secure information like witness testimony of
> human rights abuses.
> But unlike Martus, which uses its own software, Cryptocat is a "host-based
> security" application that relies on servers that users must log in to. And
> this kind of application makes Cryptocat potentially vulnerable <http://www.wired.com/threatlevel/2012/08/wired_opinion_patrick_ball/all/> to manipulation through theft of login information--as everyone, including
> Kobeissi, now seems to agree.
> So we are back to where we started, to a degree. Other, older digital
> safety tools are "a little harder to use, but their security is real," Ball
> added in *Wired*. Yet, in the real world, from Mexico <http://www.cpj.org/blog/2011/09/mexican-murder-may-mark-grim-watershed-for-social.php> to
> Ethiopia <http://www.cpj.org/2012/07/ethiopia-sentences-eskinder-six-others-on-terror-c.php>,
> from Syria <http://www.cpj.org/security/2012/05/dont-get-your-sources-in-syria-killed.php> to
> Bahrain <http://www.cpj.org/2012/09/bahrain-should-scrap-life-sentence-of-blogger-alsi.php>,
> how many human rights activists, journalists, and others actually use them?
> "The tools are just too hard to learn. They take too long to learn. And no
> one's going to learn them," a journalist for a major U.S. news organization
> recently told me.
> Who will help bridge the gap? Information-freedom technologists clearly
> don't build free, open-source tools to get rich. They're motivated by the
> recognition one gets from building an exciting, important new tool. (Kind
> of like journalists breaking a story.) Training people in the use of
> security tools or making those tools easier to use doesn't bring the same
> sort of credit.
> Or financial support. Donors--in good part, U.S. government agencies <http://www.fas.org/sgp/crs/row/R41120.pdf>--tend
> to back the development of new tools rather than ongoing usability training
> and development. But in doing so, technologists and donors are avoiding a
> crucial question: Why aren't more people using security tools? These
> days--20 years into what we now know as the Internet--usability testing is
> key to every successful commercial online venture. Yet it is rarely
> practiced in the Internet freedom community.
> That may be changing. The anti-censorship circumvention tool Tor has grown
> progressively easier to use, and donors and technologists are now working
> to make it easier and faster still. Other tools, like Pretty Good Privacy <http://www.pgpi.org/> or its slightly improved German
> alternative <http://www.gnupg.org/>, still seem needlessly difficult to
> operate. Partly because the emphasis is on open technology built by
> volunteers, users are rarely if ever shown how to get back on track if
> they make a mistake or reach a dead end. This would be nearly inconceivable
> today with any commercial application designed to help users purchase a
> service or product.
> Which brings us back to Cryptocat, the ever-so-easy tool that was not as
> secure as it was once thought to be. For a time, the online debate among
> technologists degenerated into the kind of vitriol <http://www.wired.com/threatlevel/2012/08/security-researchers/all/> one might expect to hear among, say, U.S. presidential campaigns. But
> wounds have since healed and some critics are now working with Kobeissi to
> help clean up and secure Cryptocat.
> Life and death, prison and torture remain real outcomes <http://www.cpj.org/reports/2011/12/journalist-imprisonments-jump-worldwide-and-iran-i.php> for many users, and, as Ball noted in
> *Wired*, there are no security shortcuts in hostile environments. But if
> tools remain too difficult for people to use in real-life circumstances in
> which they are under duress, then that is a security problem in itself.
> The lesson of Cryptocat is that more learning and collaboration are
> needed. Donors, journalists, and technologists can work together more
> closely to bridge the gap between invention and use.
>