[liberationtech] Medill online Digital Safety Guide
frank at journalistsecurity.net
Sat Jun 1 11:37:22 PDT 2013
Rich,
I appreciate you taking the time to lay out your recommendations. If I
understood you correctly, you are suggesting that journalists should use
only open-source operating systems and other carefully selected
open-source software, and that the operating systems and software they
use should also be partly, if not largely, customized and modified as
required, on an ongoing basis, to suit their specific operational needs.
That certainly provides much food for thought, and it raises more
questions for me than it answers, especially over the long run, which is
always good. I have also heard of at least one project that may be
moving in that direction.
But I also wholeheartedly agree with Ella that users, not least of all
journalists, need to be met where they are, and, to quote Ella, that is
ok. No journalists I know, anywhere --from activists operating in
Internet-repressive environments to U.S. national security
correspondents-- would even consider the kind of personal,
custom-tech-heavy approach suggested above. Instead, imperfect
solutions are better than none, although at the very least journalists
and their sources should know the risks of storing and communicating any
information.
One thing that journalists are learning is to limit what they store or
communicate digitally, and to use technology in a more limited way, too.
For example, finding a safe (or safer) way to ping or signal a source --
not necessarily to communicate information, but to facilitate switching
to another (perhaps safer) form of communication or to a face-to-face
meeting -- depending, of course, on the suspected threat model in play.
Ella, I know you have spent a great deal of time looking at operational
security issues for journalists and other high-risk users from the
perspective of the security community.
This kind of dialogue involving people coming from different
perspectives is invaluable.
Best, Frank
> -------- Original Message --------
> Subject: Re: [liberationtech] Medill online Digital Safety Guide
> From: Eleanor Saitta <ella at dymaxion.org>
> Date: Sat, June 01, 2013 10:42 am
> To: liberationtech <liberationtech at mailman.stanford.edu>
> Cc: Rich Kulawiec <rsk at gsp.org>
>
>
> I'm going to step into this thread just once (and try to stick to
> that); apologies for top-posting this.
>
> I come from the security community. I understand very well many of
> the arguments you're making and even agree at a technical level with
> most of them. However, let's talk about human outcomes.
>
> Human outcomes are the only thing that matters even a little bit in
> security -- not "could you be owned", but "were you actually owned,
> and did that mean you didn't get out in time/accomplish your
> objective". Nothing else matters.
>
> You cannot force people to adopt "proper" security procedures. You
> cannot scare people into adopting them -- you can't scare people into
> doing anything. It's useful for us to understand where we'd like
> people to end up, but we have to start with where people are and deal
> with the fact that they have limited time, limited capability to
> adapt, and limited resources.
>
> This means that many people will be owned, and many people will get
> hurt. Our goal, our only real goal, in doing security work in this
> context, can be to *statistically* reduce the number of people who get
> hurt and *statistically* increase the number of people who achieve
> their objectives.
>
> Yes, I don't like the current set of trends we're seeing in computer
> architectures, and I've got my own projects that are fighting them,
> but we also have to work within those trends, because computing is a
> social practice, and if you aren't where the users are, you're crying
> alone in a corner.
>
> Asking people whose job is to be a professional journalist to go use a
> text-based mail client means you lose, because they're not going to.
> You *might* get them to give up the power and convenience of Gmail to
> switch to a thick client, but more than that? Forget it. You might
> get someone who's a writer to switch to a Linux box, but asking a
> professional photographer to ditch Lightroom and switch their whole
> workflow around? Forget it. You might even get your journalists to
> adopt a nice hardened workflow for document intake that keeps them
> safe from almost all of the malware that people try to pass them, but
> you have to understand that they will jump around the edge of that
> system sometimes when things are blowing up, and *THIS IS FINE*.
> Their job isn't to be secure, it's to get the story out. Sometimes
> this means they go to jail. It's ok. They signed up for it. It
> sucks, and we want to help them reduce the chances, we want to make
> sure they understand the risks of their actions, but in the end, it's
> their call.
>
> The point of humanitarian technology is to empower people, and that
> means empowering people to take risks and make decisions that we
> consider completely insane. It's ok.
>
> Yes, in some situations, all we can do is tell people "either you do
> all of this or you're going to be owned". That's fine. Our job for
> those situations is to see how much we can lower that bar, based on
> where the users are, what they need, and what we can do. Lowering the
> bar for other hackers doesn't help normal users, although in some
> cases it may be a prerequisite.
>
> I could go on for a while longer -- one of the things we need to think
> more about is how we can go from defense in depth to a more
> epidemiological approach to security, and how we can use the most
> sophisticated pattern-and-behavior matching processor we've ever
> invented and that we've been ignoring for years (the user), but I'll
> stop here.
>
> The bottom line: meet the user where they are, think about outcomes,
> not ideals, and understand that you're going to fail, most of the
> time, and that that's maybe ok.
>
> E.
>
> On 2013.06.01 07.40, Rich Kulawiec wrote:
> > On Wed, May 29, 2013 at 03:21:45PM -0700,
> > frank at journalistsecurity.net wrote:
> >> I appreciate your feedback and your bluntness, Rich.
> >>
> >> But you are providing far more guidance about what to avoid than
> >> what to use. If journalists and other users should avoid all
> >> commercial based operating systems including Macs, or any system
> >> requiring anti-virus software, then what operating system should
> >> they use? Linux maybe? Or something else?
> >>
> >> Similarly, if they shouldn't use GUI-based email clients, what
> >> email should they use?
> >
> > See below, I'll try to address these questions.
> >
> > That's actually not my blunt voice. That's my exasperated voice,
> > because I've grown exceedingly weary of listening to people
> > explain how to secure a closed-source OS/application environment.
> >
> > Can't. Be. Done.
> >
> > The evidence supporting that statement is already piled so high
> > that one could spend a lifetime examining it and not finish. And
> > more arrives all day, every day. Yet there are STILL people trying
> > to claim that yes, you can secure your Windows desktop if only you
> > use anti-spyware anti-virus anti-malware anti-anti-anti-whatever.
> > If only you spend enough money. If only you use
> > IDS/IPS/firewalls/yadda yadda yadda.
> >
> > No, you can't.
> >
> > Not for any reasonable value of "secure". (Yes, yes, I'm well
> > aware that nothing is absolutely secure; I'm using the term in the
> > sense of "adequate to stop attacks it might plausibly face".)
> > People do all those things and spend ferocious amounts of money on
> > them and yet they are STILL routinely 0wned. It never seems to
> > occur to them to step back and consider that they're doing
> > something fundamentally wrong; it always seems to occur to them
> > that throwing more money at the problem will fix it. It won't.
> >
> > And for your use case, where we can presume that the users'
> > systems will come under scrutiny from governments, criminal gangs,
> > and other unsavory people with substantial resources, including
> > possibly the implied or overt threat of physical force, it's absurd
> > to even consider that approach. You need to discard it entirely
> > and try something that has an appreciable chance of working -- NOT
> > something that's guaranteed, as I don't think that exists, but
> > something that at least gets you into the game and gives you a
> > fighting chance.
> >
> > Is Linux the best choice? Maybe. Maybe FreeBSD is. Maybe
> > something else. We could argue (and we *have* argued, for many
> > years) about their relative merits and drawbacks. I'll propose
> > that for this purpose it may well be entirely reasonable to create
> > a custom Linux or BSD distribution that has only the essentials
> > required for reporters/editors in the field to do their jobs.
> > Perhaps it should be based on something that already exists, e.g.,
> > Tails. But what all of those alternatives have in common is that
> > they raise your probability of success from zero to something
> > non-zero.
> >
> > That's not enough, of course. Reasonably secure software
> > environments only stay that way if they're used appropriately:
> > procedures are as important as code. So if someone equipped with
> > one of those is so insanely stupid as to log onto Facebook [1] or
> > some other scam site, then they've probably neatly undercut
> > themselves. Using these tools properly takes discipline, restraint
> > and thoughtfulness.
> >
> > For example: users need to actually look at URLs before they
> > follow them and they need to know enough to realize that
> > google-com.com is probably not Google and that CIT1BANK.COM is
> > probably not CitiBank. [2]
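> >
> > To make that concrete, here's a rough sketch in Python of the kind
> > of check I mean. It's illustrative only: the allowlist of expected
> > domains is made up, and a real tool would consult the public suffix
> > list instead of the naive last-two-labels heuristic below.
> >
> > # naive_url_check.py -- illustrative sketch, not a real defense.
> > from urllib.parse import urlparse
> >
> > # Hypothetical allowlist: the domains this user actually means to visit.
> > EXPECTED_DOMAINS = {"google.com", "citibank.com"}
> >
> > def registrable_domain(url):
> >     """Crudely take the last two labels of the URL's hostname."""
> >     host = (urlparse(url).hostname or "").lower()
> >     labels = host.split(".")
> >     return ".".join(labels[-2:]) if len(labels) >= 2 else host
> >
> > def looks_expected(url):
> >     """True only if the URL's registrable domain is on the allowlist."""
> >     return registrable_domain(url) in EXPECTED_DOMAINS
> >
> > for url in ("https://www.google.com/mail",
> >             "http://google-com.com/login",    # lookalike domain
> >             "http://CIT1BANK.COM/account"):   # digit 1, not letter I
> >     print(url, "->", "plausible" if looks_expected(url) else "SUSPECT")
> >
> > Even a check that crude flags both of the typosquats above; that's
> > the level of attention I'm asking for, nothing fancier.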
> >
> > Yes, that's tedious. Sorry. But it's not as tedious as being
> > locked in a cell for six years while the State Department tries to
> > negotiate your release.
> >
> > Anyway, my point is that a judicious combination of careful
> > procedures and minimal software applications on a robust operating
> > system will yield something that has a much higher level of
> > operational security than anything you can build around a
> > closed-source base. Or to put it another way, all the
> > discipline/restraint/thoughtfulness in the world will not help you
> > if you insist on using Adobe Acrobat, thus making yourself a member
> > of the Ginormous Acrobat Security Hole of the Month Victim's Club.
> > [3] Or if you choose to use Outlook instead of mutt (see
> > http://mutt.org), which is a pretty robust full-featured email
> > client that trained users can use far more efficiently than many
> > others.
> >
> > So a constructive approach to this might be ("might be"):
> >
> > 1. Write a functional specification. What computing tasks do
> > reporters/editors/etc. in the field have to do?
> >
> > 2. Determine what applications can perform those tasks.
> >
> > 3. Figure out which OS those applications will run on.
> >
> > 4. Figure out what the minimal installation looks like. (No point
> > in having FrozzleBlah 1.7 installed if it isn't used. Less software
> > = less attack surface, to a first approximation.)
> >
> > 5. Set the onboard firewall to bidirectional default deny. Then
> > start figuring out what holes need to be punched in it to make (2)
> > feasible. (A sketch of what that might look like follows this
> > outline.)
> >
> > 6. Think about network services, VPNs, encryption. Revisit (5).
> >
> > 7. Build an alpha release and give it, plus some training, to a
> > dozen working journalists. It will break horribly. That's a good
> > thing.
> >
> > 8. Revisit 1-6 and build a beta release. Repeat step 7.
> >
> > 9. Get someone (or better, a group of someones) with devious and
> > ingenious minds to attack it. It will break horribly. That's still
> > a good thing.
> >
> > 10. Revisit 1-8 and repeat 9 until either (a) sufficient confidence
> > exists that a serviceable product has been created or (b) it
> > becomes apparent that something about the approach is irrevocably
> > wrong and can't be fixed without starting over. If (a), then move
> > on. If (b), start over. (I'm a huge fan of the development
> > philosophy that you should always write one to throw away.
> > Sometimes two. Development efforts that aren't willing to do that
> > often don't survive well in the field.)
> >
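> > As flagged under (5), here's a rough sketch of the starting point
> > for that ruleset: a tiny Python script that emits a bidirectional
> > default-deny iptables ruleset in iptables-restore format. The
> > punched holes (DNS, HTTPS, a hypothetical OpenVPN port) and the
> > output filename are placeholders; the real list has to come out of
> > steps 1 and 2.
> >
> > # gen_ruleset.py -- sketch only: emit a bidirectional default-deny
> > # iptables ruleset with a few example holes punched for outbound
> > # traffic.  Derive the real port list from the functional spec.
> >
> > # (protocol, port) pairs -- hypothetical examples, not a recommendation
> > OUTBOUND_HOLES = [
> >     ("udp", 53),     # DNS
> >     ("tcp", 443),    # HTTPS
> >     ("udp", 1194),   # OpenVPN, if a VPN comes in at step 6
> > ]
> >
> > lines = [
> >     "*filter",
> >     ":INPUT DROP [0:0]",      # default deny inbound
> >     ":FORWARD DROP [0:0]",    # this box is not a router
> >     ":OUTPUT DROP [0:0]",     # default deny outbound
> >     "-A INPUT -i lo -j ACCEPT",
> >     "-A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT",
> >     "-A OUTPUT -o lo -j ACCEPT",
> >     "-A OUTPUT -m state --state ESTABLISHED,RELATED -j ACCEPT",
> > ]
> > for proto, port in OUTBOUND_HOLES:
> >     # one ACCEPT rule per punched hole
> >     lines.append("-A OUTPUT -p %s --dport %d -j ACCEPT" % (proto, port))
> > lines.append("COMMIT")
> >
> > # Write the ruleset out; the operator loads it separately, e.g.
> > #   iptables-restore < default-deny.rules
> > with open("default-deny.rules", "w") as f:
> >     f.write("\n".join(lines) + "\n")
> >
> > Loading it is one line for the operator; deciding what belongs on
> > the hole list is where the real work in steps 1-6 happens.
> >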
> > I submit that this outline, crude and incomplete as it is, has a
> > much higher probability of generating success than anything one
> > could possibly do using closed-source software.
> >
> > Yes, it's a lot of work. Sorry. There's no shortcut. As
> > tempting as it is to take something like MacOS and adapt it: you
> > can't. It's like trying to turn a 1975 Ford Pinto into a tank by
> > bolting armor plate onto it. Yeah, it kinda sorta vaguely looks
> > like a tank, but it's still a Pinto and always will be...and you
> > would not do well in battle with it.
> >
> > As to one of your remarks that I elided, in re electronic
> > communications in general:
> >
> > I certainly think that they should make *every* effort to minimize
> > their footprint. E.g., if their laptop can be switched off: it
> > should be. If it can be disconnected from the wireless network via
> > the hardware switch: it should be. If their mobile phone can be
> > turned off and the battery removed: it should be. Every minute
> > that these devices are connected provides potentially actionable
> > intelligence to the adversary, so I think it's sensible to minimize
> > those minutes. (And of course, VPNs, encryption, and other
> > techniques should be used to reduce the quantity and quality of
> > information available to an adversary.)
> >
> > ---rsk
> >
> >
> > [1] Some people may not consider this insanely stupid. Okay.
> > Fair enough. If you're one of those people, please explain to me
> > EXACTLY why you think that Facebook (or one of its freelancing
> > employees) would not cheerfully sell every scrap of data they have
> > on you to country X's intelligence service/secret police or to one
> > of country Y's indigenous criminal organizations. To borrow a line
> > from Feynman, what is the source of this fantastic faith in the
> > machinery?
> >
> > Once you've finished that explanation, please also explain to me
> > EXACTLY why you think that an operation with a long, long history
> > of massive security holes has now managed to close the last one,
> > thereby rendering itself impervious to attackers -- even though
> > that same organization has almost no motivation to do so. It's not
> > *their* data, after all.
> >
> > [2] I say "probably" because it's possible the real operations
> > have acquired those typosquatted domains by now.
> >
> > [3] I see that Adobe is shifting Photoshop to "the cloud". How
> > very nice. Now when some tinpot dictator wants to see if there are
> > any incriminating photos being prepped for publication, it's not
> > necessary to break into laptops and desktops and such; just break
> > into Adobe's cloud (or, perhaps more readily, pay off an Adobe
> > employee) and it's one-stop shopping. I suppose Adobe wasn't
> > content with merely having one massive security hole a month and
> > wanted to create something whose very existence is a massive,
> > ongoing, perpetual security hole.
> >
>
>
> --
> Ideas are my favorite toys.