[liberationtech] Government snooping on personal data stores

Yosem Companys companys at stanford.edu
Sat Feb 23 10:20:18 PST 2013


Interesting discussion taking place at Harvard's Project VRM:

From: T-Rob <t.rob.wyatt at us.ibm.com>
To: projectvrm <projectvrm at eon.law.harvard.edu>
Re: Government snooping on personal data stores

Except for the really paranoid, it's pretty much accepted that
not even our most advanced world governments can crack the stronger
encryption algorithms available today.  If true, then it is trivially easy
to encrypt communications in such a way that government agents would be
unable to read them, rendering mandated back doors useless against a
moderately sophisticated criminal.  One then wonders which population
governments hope to surveil by forcing back doors into communication
tech.  The dumbest criminals and law-abiding people who just want their
privacy protected?  Why cast such a wide net with no possibility of
catching the fish you really want?  It would be unusual for any
government official to take a position against mandated back doors,
but if anyone would, it would probably be Dr. Cavoukian.
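As an aside, the claim that strong encryption is trivially available is
easy to demonstrate.  Even Python's standard library suffices for a
one-time pad, which is provably unbreakable when the key is truly
random, as long as the message, and never reused.  (A rough sketch;
the function names below are my own, not taken from any particular
tool.)

```python
import secrets

def otp_encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    # Generate a truly random key exactly as long as the message.
    key = secrets.token_bytes(len(plaintext))
    # XOR every plaintext byte with the corresponding key byte.
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return key, ciphertext

def otp_decrypt(key: bytes, ciphertext: bytes) -> bytes:
    # XOR with the same key recovers the plaintext.
    return bytes(c ^ k for c, k in zip(ciphertext, key))

key, ct = otp_encrypt(b"meet at noon")
assert otp_decrypt(key, ct) == b"meet at noon"
```

Without the key, the ciphertext is consistent with every possible
message of the same length, so no back door short of key disclosure
helps.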

The paper "Privacy by Design and the Emerging Personal Data Ecosystem"
describes controls on the Personal Data Vault in terms of policy.  If
the host of the personal cloud has access to the encryption keys, even
temporarily, this would obviate the need for mandated back doors based
on broken crypto, key escrow or other encryption-based schemes: the
government could simply pass a law requiring its status as an
authorized recipient of some or all data, with no user-visible audit
trail.

Ironically, the justification would be that since live intercept access
has been denied this is the only way for law enforcement to get access to
the data they seek.  Of the two methods, live intercept requires
architectural concessions in the design stages or else expensive
retrofitting, whereas changing policy to allow government access as a
trusted recipient uses routine functionality in the system. Furthermore,
the amount and nature of the data leaked to government can start with
making very small concessions and gradually expand over time because these
things always ratchet up, never down.  So it's possible to oppose a
mandated intercept on one hand while still working to guarantee government
access to the data on the other.

As noted in an earlier thread, the fact that your data is held by many
different vendors poses a significant burden for either hackers or
governments trying to assemble a comprehensive digital mosaic of you.
Centralizing it into a personal data vault eliminates the assembly
step and renders the image in high definition.  That means Privacy by
Design, as applied to personal data vaults, must be considerably
different from the same objective applied to the existing personal
data ecosystem.  Because the incentives to attack are greater, the
controls must be not merely different but better: more rigorous, more
robust and easier to use.

http://privacybydesign.ca/content/uploads/2012/10/pbd-pde.pdf
http://www.mintpress.net/fbi-pushes-to-make-websites-wiretap-ready/
http://edition.cnn.com/2010/OPINION/01/23/schneier.google.hacking/index.html

I live in the USA, where my private data stored in the cloud by a
third party is already available to the government without due
process, and where I can be compelled to hand over my private keys.
According to the EFF:

----8<-----
As we already described in the "What Can The Government Do?" section, the
communications stored by your communications service providers are very
weakly protected compared to those you store yourself: after 180 days (or
after you've downloaded a copy, according to the DOJ), the government can
get those communications with only a subpoena and usually with no notice
to you. But the situation is even worse when it comes to data that you
store with someone other than your communications provider — so called
"remote computing services" (RCSs). Under the Stored Communications Act,
the government can obtain data that you send to an RCS for storage or
processing with only a subpoena regardless of how old it is, and although
the government is supposed to notify you before they do, the law makes it
very easy for investigators to delay that notice until after they've
gotten your data.
https://ssd.eff.org/3rdparties/protect/storage
---->8-----

Of course, I could encrypt the data myself but then they'd just compel me
to disclose the keys:
http://www.forbes.com/sites/jonmatonis/2012/09/12/key-disclosure-laws-can-be-used-to-confiscate-bitcoin-assets/

Perhaps if I do set up a personal cloud, I'll search for a hosting provider
in your jurisdiction.

> The small proportion of the population who need absolute privacy,
> even from (relatively) trustworthy governments, will realise that
> they should rely on software tools that are built for that purpose,
> rather than on a personal data ecosystem.

The problem with this line of reasoning is that it is based on how the
system behaves *routinely* and the notion that it will always behave that
way.  The issue of what the government would do with their back door does
concern me, especially in light of what Google's Transparency Report
reveals about data request trends.  The more interesting discussion though
is how the system behaves *outside* of normal parameters.  The description
of "the facility for government to eaves-drop via an invisible (but
otherwise conventional) relationship with an individual's PDS" translates
to a combination of 1) executable code implementing a system intentionally
designed with the capability to disable access controls and user reporting
in order to silently exfiltrate data; and 2) a policy that silent access
functionality is intended only for government use.

http://www.techdirt.com/articles/20110216/23535513143/its-back-fbi-announcing-desire-to-wiretap-internet.shtml
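Reduced to code, the described facility looks something like the
following hypothetical sketch: an ordinary access check plus a flag
that suppresses both the check and the user-visible log.  Once that
flag exists in the executable, only policy, not code, constrains who
sets it.

```python
def read_data(requester: str, owner: str, vault: dict,
              user_log: list, silent: bool = False) -> str:
    # The 'silent' flag IS the back door: it bypasses both the
    # authorization check and the user-visible audit trail.
    if not silent and requester != owner:
        raise PermissionError("access denied")
    if not silent:
        user_log.append(f"{requester} read {owner}'s data")
    return vault[owner]

vault = {"alice": "medical records"}
user_log: list = []

# Routine behavior: the owner reads her own data and it is logged.
assert read_data("alice", "alice", vault, user_log) == "medical records"
assert user_log == ["alice read alice's data"]

# Back-door behavior: any caller able to pass silent=True reads
# anything, and the owner's log shows nothing.
assert read_data("agent", "alice", vault, user_log, silent=True) \
    == "medical records"
assert len(user_log) == 1
```

Nothing in the code restricts silent=True to government agents; that
restriction lives entirely in policy, which is exactly the problem.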

Back doors implemented in code are routinely exploited by sufficiently
skilled hackers.  Back doors implemented in policy are easily exploited by
sufficiently skilled social engineers.  As an attacker I'd try to find
ways to bypass the policy enforcement point in the code or talk/coerce an
admin or legitimate user into doing it for me.  Furthermore, any back
doors mandated by government will require legislation to authorize vendors
to close holes should an exploit be discovered.  Either the sites will
remain operable and exposed during that time, or they will shut down,
neither of which is user friendly.  The third option of shutting off
government access along with hacker access to close the holes *and* allow
the service to continue is highly unlikely.  Finally, there is the issue
of insiders abusing legitimate access.  Earlier in my career when I worked
at Equifax and was looking for a new job, my manager confronted me about
my job search after my prospective employer ran a background check on me.
My next employer was a bank and I soon discovered that it was routine
practice for employees to hold personal accounts at competing banks
because snooping on co-workers was common.  Any silent access facility
intended for government use will be used by non-government actors.
Silently, of course.

You make the distinction between the proportion of the population who do
or do not require absolute privacy.  In today's system where data about an
individual is distributed across countless vendor databases and not
correlated easily, compromise of one vendor database exposes many
individuals to relatively small risk.  Compromise of a personal data store
exposes one individual to potentially catastrophic risk.  I don't require
absolute privacy, just that the security controls are appropriate for the
risk to which I'm exposed.

http://www.fbi.gov/news/stories/2008/march/housestealing_032508
http://www.infosecisland.com/blogview/8496-A-New-Twist-on-Identity-Theft-Hits-Home.html

If the ecosystem won't survive without the public sector and the public
sector is unlikely to cooperate if we design the software to permit
completely private relationships using unbreakable crypto, then viability
depends on winning an uphill battle with privacy advocates.  If you can't
budge on unbreakable encryption, I'd advise finding ways to assure privacy
advocates of unbreakable accountability.  For example, the system should
not provide silent access to anyone, just the ability to hide certain
access logs from the intended targets.  The access itself, all access in
fact, should be subject to strict audit logging.  Any discrepancy between
the system logs and the user logs would be subject to routine independent
review.  By focusing on live monitoring, log analysis, unbreakable
auditing, independent review and other back-end controls, it might be
possible to mitigate some of the risk of a mandatory back door.
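The "unbreakable auditing" idea can be made concrete.  One generic
approach (my own illustration, not any particular PDS design) is a
hash-chained log in which each entry commits to its predecessor, so
that any silent alteration or deletion is detectable by an independent
reviewer who holds the latest chain head:

```python
import hashlib
import json

GENESIS = "0" * 64

def append_entry(log: list, actor: str, action: str) -> None:
    # Each entry's hash covers its content plus the previous entry's
    # hash, so altering or removing any earlier record breaks the chain.
    prev = log[-1]["hash"] if log else GENESIS
    body = {"actor": actor, "action": action, "prev": prev}
    digest = hashlib.sha256(
        json.dumps(body, sort_keys=True).encode()).hexdigest()
    log.append({**body, "hash": digest})

def verify_chain(log: list) -> bool:
    # Recompute every hash and check the links between entries.
    prev = GENESIS
    for entry in log:
        body = {"actor": entry["actor"], "action": entry["action"],
                "prev": prev}
        if entry["prev"] != prev:
            return False
        if entry["hash"] != hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest():
            return False
        prev = entry["hash"]
    return True

log: list = []
append_entry(log, "govt-agent-7", "read:medical-records")
append_entry(log, "user", "read:own-profile")
assert verify_chain(log)

# Silently rewriting history is detectable:
log[0]["actor"] = "nobody"
assert not verify_chain(log)
```

Access could still be hidden from the target's own view, but not from
an auditor comparing the chained system log against the user log.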

It would also go a long way if the implementors and government underwrote
an independent victim reimbursement fund as a show of good faith that they
were willing to stand behind a system intentionally designed with access
control bypass functions.

Incidentally, you've framed the discussion as a choice between systems
which permit completely private relationships, using unbreakable crypto,
or not.  I would frame it as a choice between a system with built-in back
doors that allow silent data access by default but which preserves my
interests through policy enforcement -versus- a system engineered for
strict privacy and access control but which preserves the government's
interest through policy enforcement.  The burden of trusting the policy
enforcement should not be on the party who bears the lion's share of the
risk but rather on the party claiming their interest justifies the risk.

Summary:
* Acknowledge that the risk to individuals is orders of magnitude
greater than under the current system and design around that premise.
* Assume that breaches *will* occur and design around that premise.
* Stand behind the assertion of effective security by providing breach
insurance to users.
* Preserve citizens' interests through unbreakable crypto and the
government's interest through policy enforcement, instead of the other
way round.
* Provide strong mitigating back-end controls including unbreakable
auditing, routine review and independent oversight. Do this in all cases
as good practice but consider it mandatory in the absence of a
cryptographically strong design built for completely private
relationships.

-- T.Rob

