Preliminary Comments for Senate Commerce Committee Hearing on DNT

This Thursday, the US Senate Committee on Commerce, Science and Transportation is holding a hearing entitled “The Need for Privacy Protections: Is Self-Regulation Adequate?” Mozilla, along with several others, has been asked to comment at the hearing on the current state of: i) industry self-regulation; ii) Mozilla’s Do Not Track feature; and iii) the industry’s ability to provide consumers with adequate tools to protect their personal information online.

We’re planning to participate and provide comments based on our experience and perspective. We also posted the questions to governance for input.

In addition to core Mozilla messages about user choice, control, and transparency, the comments will include the following key points:

  • Industry self-regulation can work when it’s a multi-stakeholder process that reflects the views of all of the relevant parties involved in data transactions including users, developers, service providers, publishers, and the ad networks.
  • Non-voluntary regulatory measures are a last resort. They can introduce unintended consequences that can be harmful to a fragile web ecosystem. As a result, we should be cautious in this regard and give voluntary industry efforts every chance to succeed before interceding with regulation.
  • The desire to predict and deliver content that appeals to users is a core driver behind efforts to collect and analyze data about us. This will only increase particularly with the inclusion of the mobile data graph. This is not inherently bad, and delivering content that users want, when they want it, is a good thing if it’s done transparently and in harmony with user intent.
  • Commerce is a vital and beneficial Internet activity. Enabling and maintaining economic ecosystems on the web is essential to a robust and healthy Internet. Commercial imperatives and user choice/control are not mutually exclusive. They can and must coexist through a combination of technical capabilities and user-centric business and data practices.
  • DNT requires cooperative efforts of service providers, ad networks, browsers, and other parts of the web ecosystem. We’re optimistic that the multi-stakeholder process ongoing at the W3C will result in a consensus on both the meaning of DNT and how websites should respond.
  • DNT is one method to give users a voice in how third parties collect, use, and track information about them. It’s not the only method, nor the be all and end all of the data and privacy relationship that exists between users and service providers.
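Mechanically, DNT is simple: a browser with the feature enabled sends a `DNT: 1` HTTP header with each request, and a cooperating site checks for it before enabling tracking. As a rough illustration only (a hypothetical server-side sketch, not Mozilla’s or any specific site’s implementation):

```python
def tracking_allowed(headers):
    """Return False when the request carries the Do Not Track opt-out.

    `headers` is a plain dict of HTTP request headers. Browsers with DNT
    enabled send "DNT: 1"; how a site should respond to that signal is
    exactly what the W3C multi-stakeholder process aims to standardize.
    """
    return headers.get("DNT") != "1"

# A request from a browser with DNT enabled:
print(tracking_allowed({"DNT": "1"}))  # False: honor the opt-out
# A request expressing no preference:
print(tracking_allowed({}))            # True: no signal sent
```

The hard questions, of course, are not in the header check but in what “honoring the opt-out” obligates a site to do, which is the policy question the hearing addresses.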

We’re in the process of completing the comments now and will submit them in advance of the hearing on Wednesday.

Marching Along – Privacy Forward

A bunch of folks, including Alex Fowler, Sid Stamm, and Mike Hanson, to mention only a few, did some nice work developing Mozilla’s comments on the FTC’s proposed privacy framework. More details, including the comments, are available on the mozilla.com blog.

I’m still reading through some of the responses, and it’s really interesting seeing the diverse perspectives. Some say the creation of a comprehensive US privacy framework will stifle innovation, leading to economic collapse and ruin; others suggest the FTC hasn’t gone far enough. (+1 to an open government process with a robust debate and competing ideas.)

One theme that unfortunately seems to pervade the narrative is the notion that doing right by the user from a privacy perspective is somehow hostile to innovation and business. This is a false paradigm. (We saw the same themes in the net-neutrality debate, but that’s a different story.) Innovating in services and managing information while being user-centric and respectful aren’t competing values, in my view. What’s right for the user doesn’t mean being hostile (or captive) to commercial motivations, nor should it mean rolling over to the great data slurp in the cloud.

As Eben Moglen recently reminded us, the web is young – some 7,000+ days young. Thus, there’s so much more to come, and we can’t drive by looking in the rear-view mirror. So when I look forward and see some of the ideas kicking around, within and outside the Mozilla community, that give users both the benefits and control of their information in a “privacy forward” way, I see lots of opportunity and innovation.

This is pretty exciting, and on a good day, I feel lucky to observe and participate.

New FTC Privacy Proposal

Today the Federal Trade Commission released a proposal describing a new framework for protecting consumer privacy in both online and offline environments. The report reflects the new challenges users, publishers, service providers, and advertisers face in today’s digital environment and incorporates feedback from public roundtables conducted over the past year.  The report acknowledges the shortcomings of the current “notice and consent” framework, but doesn’t abandon it completely, rather it seeks to implement it in a way that makes more sense for users.

While we’ll need more time to digest and evaluate the details, we’re encouraged by what we’ve seen so far.  In particular, the FTC has proposed a set of principles that align well with the Mozilla manifesto and our approach to software development including:

  • privacy by design;
  • transparency;
  • user choice; and
  • no surprises.

Of course the devil is often in the details, but the first principles seem right.  The FTC should also be commended for continuing its efforts to seek a comprehensive proposal rather than focusing only on one aspect of the issue.

The Commission has also shown that it understands the complexity and nuance of many of the issues, for example, the blurring distinction between PII and non-PII, and the contextual nature of privacy issues. To that end, the Commission has articulated a robust set of questions on which it is seeking further public feedback. Comments on the proposal are due on January 13, 2011.

Over the next month, we’ll examine the questions and proposal in more detail and take advantage of this opportunity to share our experience, concerns, and views on the proposed framework.

If you have thoughts about the proposal let us know.

New European Commission Privacy Recommendations

The EC released its new privacy recommendations on Thursday to update the 15-year-old EU privacy regime. The report contains the Commission’s findings from their analysis over the past year and announces an intention to investigate a number of areas in more depth with the goal of proposing legislation in 2011. The impetus as described by the Commission is that today’s challenges “require the EU to develop a comprehensive and coherent approach guaranteeing that the fundamental right to data protection for individuals is fully respected within the EU and beyond.”

I suspect that for some the principles may be perceived as new administrative overhead and obstacles to an “optimum user experience.” My quick take (personal opinion) is that the findings and areas of study represent a move in the right direction. Of course, the devil is in the details, which will evolve over the coming year, so we’ll see. As the EC develops its new framework, finding reasonable and practical ways to implement the proposals will be essential to their success.

This is even more interesting given that the US Federal Trade Commission has indicated it’s coming out with recommendations soon. These would likely result in legislation next year as well. It would be great (if not just common sense) to see as much harmonization between the two frameworks as possible. We can still dream.

We welcome any thoughts or observations about the proposal. Some highlights from the report are shown below, but the full report is worth the read.

  • The Commission will consider how to ensure a coherent application of data protection rules, taking into account the impact of new technologies on individuals’ rights and freedoms and the objective of ensuring the free circulation of personal data within the internal market.
  • The Commission will examine ways of clarifying and strengthening the rules on consent.
  • The Commission will consider:
    • introducing a general principle of transparent processing of personal data in the legal framework;
    • introducing specific obligations for data controllers on the type of information to be provided and on the modalities for providing it, including in relation to children;
    • drawing up one or more EU standard forms (‘privacy information notices’) to be used by data controllers.
  • The Commission will therefore examine ways of:
    • strengthening the principle of data minimisation;
    • improving the modalities for the actual exercise of the rights of access, rectification, erasure or blocking of data (e.g., by introducing deadlines for responding to individuals’ requests, by allowing the exercise of rights by electronic means or by providing that right of access should be ensured free of charge as a principle);
    • clarifying the so-called ‘right to be forgotten’, i.e. the right of individuals to have their data no longer processed and deleted when they are no longer needed for legitimate purposes. This is the case, for example, when processing is based on the person’s consent and when he or she withdraws consent or when the storage period has expired;
    • complementing the rights of data subjects by ensuring ’data portability’, i.e., providing the explicit right for an individual to withdraw his/her own data (e.g., his/her photos or a list of friends) from an application or service so that the withdrawn data can be transferred into another application or service, as far as technically feasible, without hindrance from the data controllers.
  • The Commission will examine the following elements to enhance data controllers’ responsibility:
    • making the appointment of an independent Data Protection Officer mandatory and harmonising the rules related to their tasks and competences, while reflecting on the appropriate threshold to avoid undue administrative burdens, particularly on small and micro-enterprises;
    • including in the legal framework an obligation for data controllers to carry out a data protection impact assessment in specific cases, for instance, when sensitive data are being processed, or when the type of processing otherwise involves specific risks, in particular when using specific technologies, mechanisms or procedures, including profiling or video surveillance;
    • further promoting the use of PETs and the possibilities for the concrete implementation of the concept of ‘Privacy by Design’.

So Whose Data Is It?

An interim decision was issued this week in the Facebook v. Power case in the US federal court for the Northern District of California that disposed of some of the claims, but left others open because facts are still disputed.  Power.com aggregates users’ social network updates, e-mail, instant messages and contact lists together in one place after authorization by the user. The dispute is whether Power violated Facebook’s terms of service by acting as an agent for the user when logging into FB, as the user and with the user’s permission, instead of using Facebook Connect. Reports indicate that FB had initially encouraged Power to use the publicly available Connect program as a means to resolve the dispute.

In the decision, Judge Ware denied FB’s motion to find Power criminally liable under the California computer crime statute (section 502), at least at this point.  This case is far from final, however,  and the ultimate question of whether Power violated the TOS is still open.

To determine whether there was criminal liability as alleged by FB, the Court had to first define the term “access without permission” for the purposes of section 502. Judge Ware reasoned that “interpreting the statutory phrase ‘without permission’ in a manner that imposes liability for a violation of a term of use or receipt of a cease and desist letter would create a constitutionally untenable situation in which criminal penalties could be meted out on the basis of violating vague or ambiguous terms of use.” The Court found that the ultimate issue of whether Power engaged in activities that would constitute evading technical barriers is still a disputed question of fact; thus, summary judgment in favor of FB was not appropriate. The decision suggests that knowingly evading technical barriers to help users access their own data may be construed as access without permission. Unfortunately, what constitutes “evading technical barriers” remains imprecisely defined.

Underlying the host of legal claims and counter-claims in this case lurks the quintessential issue of whose data it is, and what rights users have to access, move, and manipulate their own data, directly or via third parties. Conversely, what rights do publishers have to restrain access to UGC and the incremental value added by the networks they facilitate? For lack of better legal mechanisms, this battle is being played out under the guise of copyright, DMCA, antitrust, and criminal computer crime statutes – a sure way to get a bad result.

Applying first principles of user centricity, choice, and control (UC3), it would seem, in my opinion, that the outcome in this particular case should be that retrieving your own data about you and your social graph (in its complete form) should not run afoul of civil or criminal law, regardless of whether you do it directly or via an authorized third party. Of course, this is only my personal opinion and is not meant to be a comprehensive policy statement, because there are many exceptions, edge cases, and other interests that would have to be reflected. Would this mean you have the right to hack into a system to get your data? Absolutely not. Would it mean that data hosts should offer reasonable and complete access – which many do – for authorized extraction of your data? Yes. Power clearly thinks so, and has articulated an Internet user bill of rights based on principles of data portability that calls for, in part, “The right to access, disseminate, transfer or aggregate their content on any platform, or to authorize third-parties to do so for them.”

No doubt this case underscores the importance of data freedoms enabled by “access” in the same way publishing source code helps to enable the FLOSS freedoms. Marcia Hofmann of the EFF summarized it, stating: “If the measure seeks to control access to or use of data, then evasion of it is almost certainly criminal. But if the restriction merely seeks to impose owner preferences or terms of service on otherwise authorized users, bypassing it should not be a crime.”

None of this is settled right now, of course, but releasing the data unlocks even more utility from your network and makes possible innovative services that we can’t even imagine. This suggests that there will be even more pressure, reason, and value from access and use of your own network data in multiple environments and applications.

It will be interesting to see how this develops, but Judge Ware’s decision so far is a positive step.

Privacy is Brewing

People think about Mozilla mostly in the context of our major product, Firefox, but we’ve got lots of activities, both related to Firefox and beyond, that touch on issues of user control and privacy.

It’s an incredibly active area right now across the industry, and we’re finding ourselves more involved, so I wanted to start writing about these issues as they develop. What’s below is a bit of an effort to divine some meaning from what, on its face, looks like a series of unrelated events. In aggregate, however, they suggest a bigger story is unfolding: users’ expectations about their ability to control their online information, at least for a growing segment, are not being satisfied.

In the last few months alone, Google Buzz and Facebook privacy practices have made the news more than once, resulting in inquiries or complaints in both the EU and the US. The US Federal Trade Commission announced it is planning to create new guidelines for online privacy, and just last week, new online privacy draft legislation (the Boucher bill) was introduced in Congress. The US Department of Commerce has started an initiative to explore privacy and innovation, including a notice seeking public comments. Similarly, the EU Article 29 Data Protection scheme continues to evolve as the Working Party adopted its new Work Programme for 2010-2011 with a goal to “address challenges linked to new technological development.” In this same period, there have been countless news stories, all of which say they are about “privacy” but – if you read them carefully – mainly appear to be about sharing and user control.

As the New York Times reported recently:

“Consumer groups have been fighting what they see as the prevalence of online tracking, companies like Google and Yahoo have adjusted their own privacy policies in response to consumer concern, and industry groups recently put forth self-governing principles while arguing that free Internet content depended on sophisticated advertising methods.”

Among many privacy thinkers (at least in the US) there is a view that the current “notice and consent” framework doesn’t work very well. Jonathan Zittrain, as well as many others, has written much about this already. The online privacy environment is more complex than ever before in part because of:

  • new ways to share, track, and analyze information (and accompanying new questions about the definition of “user information”);
  • users who want to connect and share (Facebook didn’t get 400M users accidentally); and
  • an increasing expectation among users that, when they do intend to share, they retain some reasonable control of their information and information about them.

It’s unclear whether the critique of notice and consent is driven by the framework itself, the way it has been implemented (i.e. privacy policies tucked away in the footers), or because of the inherent generative nature of the web. It’s really hard to tell whether the idea is fundamentally bad when the implementation doesn’t work that well.

One alternative framework under discussion contemplates a model with few restrictions on what is collected, but significant and enumerated limits on how the collected information may be used. Others have observed that current models are insufficient because they don’t reflect the changing context of the transaction – meaning privacy norms and expectations change depending on what you’re doing.  Helen Nissenbaum suggests a construct called contextual integrity that “ties adequate protection for privacy to norms of specific context.” The concept is developed more fully in her book, Privacy in Context: Technology, Policy and the Integrity of Social Life, which is worth the read.

Recently, we’ve also had the opportunity to share our experiences with some people in policy circles. These have included the FTC, congressional staffers,  and the Commerce Department. The discussions have helped me better understand the landscape, and provided a chance to share how our products are designed to help users manage their interactions on the web and control the information that they share.

In future posts, I’ll try to provide a summary of some of the activities here at Mozilla in this area.  In the interim, we’ll continue tracking and looking for ways to improve what we do.

CDT Comments in Federal Trade Commission Privacy Roundtable

Just had some plane time and a chance to read the Center for Democracy and Technology’s comments submitted to the FTC on business practices for the collection and use of consumer data. If you haven’t already read it – and you’re interested in privacy – it’s very informative and raises some compelling points.

The FTC is conducting a public roundtable discussion to explore this topic further and gather views and recommendations. A host of other parties have also submitted comments and the first discussion is in Washington, D.C. on December 7, 2009.

In short, the CDT argues that the current framework of notice, consent, and security is insufficient because consumers are still left exposed to unfair practices even though they were technically informed by the privacy policy of the service provider.  CDT goes on to urge the FTC to adopt a more comprehensive privacy framework described as Fair Information Practice Principles.

CDT also makes a number of specific recommendations for FTC action, including:

  • The FTC should reaffirm that violating FIPs can result in consumer harm. The Commission should pursue enforcement actions against those engaged in unfair practices, not just in the spyware space, but in the general realm of online consumer privacy.  The FTC should use these actions to highlight violations of any or all of the FIPs, not merely notice, choice, and security. Query whether this would provide a cause of action for toolbars or add-ons that furtively change user preferences?
  • The FTC should encourage Congress to pass general consumer privacy legislation that is based on a full set of FIPs. Self-regulation cannot adequately protect consumer privacy…
  • The FTC should consider drafting its own set of consumer privacy rules if it is granted standard rule making authority to clarify basic privacy expectations for consumers and businesses alike.
  • The FTC should explore creating benchmarks and metrics for evaluating company privacy policies.

Although very subjective, this notion of “fairness” really resonates and may have (should have) broader implications.  There is plenty of room to better incorporate such principles in privacy policies, terms of use, and new web services that are presented to users, but at the same time fairness should also include some reasonable balance between the interests of both users and service providers.  No doubt there’s going to be a lot more discussion on this topic and more to learn here.