
The Larger Facebook/Cambridge Analytica Question: Is this really what we signed up for?

Mark Zuckerberg’s testimony before the US Congress today, the flood of news about the privacy breach at Facebook, and revelations that the company mishandled the data of millions of people have me asking:

Is this really what we signed up for?

It is clear that we are not in control of our online information, nor do we have any real idea how it is bought, sold, or used.

For some of us, signing up for a social network like Facebook was about staying in touch with our kids and friends. For others, it was an easy way to reach new customers, or gather a community behind a social project. Yes, many of us figured out that our information was being used to serve up ‘relevant’ ads; in fact, that seems pretty standard in today’s online world. But that’s only a small part of a much bigger picture.

In the past few weeks we have found out – yet again – that information about ourselves and our friends and contacts was used far beyond what we intended. We have been profiled, pigeon-holed, politically manipulated, and played like pawns in someone else’s chess game. I’d challenge you to find anyone who says “yes – that’s what I was signing up for, and I knew it, and I am entirely comfortable with where we’ve ended up”.

No matter how or when it started to widen, this gap between what we reasonably expect and what is actually being done with our personal information reflects an unforgivable breach of ethics.

No one signed up for this.

The sense of outrage so many of us feel at such an egregious breach of trust – whereby Cambridge Analytica was able to make use of people’s personal data, collected from Facebook without their knowledge or explicit permission – hasn’t let up either.

Of course, we enjoy the benefits of free access to online platforms that let us connect with friends and meet new ones; share our stories, our recommendations; find new and interesting products, services and pastimes.

Yes, these services are “free”, but only in exchange for data about us – the information that grants us access. But our choices are based on a partial and misleading picture of what we’re really signing up for, and a false impression of the resulting risk.

We aren’t sharing a few personal messages with only our close friends and contacts. We are pouring our data into a vast and volatile market that has an economic momentum we can’t control, and a political influence we are only just starting to understand.

This bears no relation to most people’s understanding of what a social network platform is about. The agreements we thought we were making look more and more like Faustian bargains.

This current episode has already triggered a process of investigation and response by governments, regulators, and those entrusted with safeguarding the rights of citizens. Yet, big as it is, Facebook is just one element of the online ecosystem.

But amongst all of this – the testimony, apologies, criticisms, and questions – there has been little discussion of where we should end up.

So for anyone who collects, uses or shares information about us, here’s what we want:

  1. Fairness: Be fair with us. Respect our data, our attention, and our “social graph”. This means putting our interests above yours. This should be unequivocal. Your business model IS us. Seek our consent honestly, and when you use or share our information, don’t exceed what we consented to. If you do, we expect our lawmakers to hold you accountable in a meaningful way.
  2. Transparency: Make your privacy terms easier to understand so our consent actually means something. Be up-front and honest about your business model, your partners, your privacy policies and practices. Open your enterprise up to privacy audit, and then tell us what you are doing to address the findings.
  3. Choice: Give us genuine choices, starting with “opted out by default”. Let us opt in if we see fit. Let us opt out when we change our mind. Respect our right to stop using your products and services. Delete our data when we leave – and sooner if you no longer need it.
  4. Simplicity: You design your services for minimum friction and maximum convenience; apply your design efforts to privacy, too. Don’t expect us to manage our data, piece by piece, or fiddle with complex settings: let us express our preferences and intentions, and respect our choice.
  5. Respect: Show your respect for us and our interests, especially our privacy and autonomy. This means that public policies should prioritize our privacy, not corporate interests. Don’t treat us as mere raw material, or as the product you sell to your customers. From now on we will no longer shrug off privacy concerns and say ‘Well, I have nothing to hide.’ Now we know better.

Getting there is going to take action from all of us. It’s time to stand up and ask for a better deal.

The game has changed. We need to demand new rules.

Note: The blog post has been amended for accuracy.


Plenipot Update: 28 October 2014 – The Real Work Begins

Week 2 of the ITU Plenipotentiary opened by concluding the elections of the Radio Regulations Board and the ITU Council that began last week. This means that the parties and receptions come to an end, and the long, hard, and sometimes tedious work of negotiating text begins. The good news is that they gave us Sunday off so, in theory, people are rested and ready for a long week!

If you’ve been to an ITU meeting, you’ll know that at this stage of a 3-week conference, countries are still introducing their ideas and staking out their ground on various topics. Small group conversations explore opportunities for compromise, but in the main sessions, more and more text goes into square brackets to reflect that there is no consensus.

On the issue of Internationalized Domain Names (Res 133), countries are looking to update the 2010 Resolution in light of progress made in this area since then. There is a debate on whether to include references to work outside the ITU by the technical community and other multistakeholder processes where the work has taken place. The room was also divided on what work the ITU should do on this issue going forward.

With regard to IP networks (Res 101), there are numerous proposals to update the existing Resolution, including adding economic language on international interconnection costs, security, and “unlawful international surveillance”. There was very little progress made and most language is in square brackets. After a late-night debate over this Resolution, no agreement was reached and we’ll be back at it again either today or tomorrow.

A separate ad hoc group considered the ITRs, in particular whether to review the ITRs and, if so, on what schedule. Countries disagree on the starting point for a review – should it be 2012 (when the WCIT happened) or 2015 (when the new ITRs come into force for those who signed)? The scope, nature, and outcome of a possible ITR review have not yet been agreed and we expect another ad hoc meeting. A proposal that the ITU should host the next World Telecom Policy Forum (WTPF) on review of the ITRs is still under discussion.

Finally, another evening ad hoc meeting took up the issue of illicit use of ICTs (Res 174). A host of proposals from countries would include references to various UNGA Resolutions on cybersecurity, privacy, and other topics related to national security matters. Other countries expressed concern that adding these references would, in effect, expand the scope of the resolution to include topics that are outside the ITU mandate. In addition, there is a proposal to consider a global charter on ICT security, but this has not been agreed.

So, we’re on to another day.

For Wednesday, 28 October, the meeting will turn its focus to the following topics:

  • ITU role in the High-Level Review of WSIS
  • Alternative calling procedures (Res 21)
  • Apportionment of revenues (Res 22)
  • Confidence and security (Res 130)

You can find out more by visiting our page on Plenipot14 or by reading our Issues Matrix.