Microsoft Adopts Do Not Track

Do Not Track
Recently Microsoft announced a change in how DNT (Do Not Track) will be implemented in Internet Explorer. In a new pre-release version of IE 10, Microsoft will automatically send a DNT header on the user's behalf, signaling that the user does not wish to be tracked by third parties across the web.

We think it is absolutely great to see Microsoft put its full support behind DNT. It is worth noting that only a year ago Firefox was the only browser that supported DNT. This push on Microsoft’s part will move DNT further into the mainstream and bring issues of user control and privacy into the spotlight.

We are eagerly awaiting more information about Microsoft’s new DNT implementation. A name this big taking it on should go a long way toward setting standards for DNT. At the core of DNT, and indeed the reason for its existence, is the ability to give users a choice about whether they wish to be tracked. Believe it or not, this is a big deal: until now, the user has simply not been presented with that choice. It was never put in their hands.

The W3C working group, made up of leading consumer privacy groups and industry representatives including Microsoft, states: “Key to that notion of expression is that it must reflect the user’s preference, not the preference of some institutional or network-imposed mechanism outside the user’s control.”

DNT is exciting because it is not an off switch for a particular technology; rather, it is the user’s choice reflected in code. That is what makes it great. DNT goes beyond specific technologies to the heart of the matter: how user browsing habits are used.

Currently there are three different signals to consider when delivering the user’s tracking preference: the user can accept tracking, decline tracking, or express no preference at all. Firefox defaults to the third option and handles it as if the user declines tracking. Ultimately it will be up to each company how it wishes to handle the third option, but we commend Mozilla Firefox for protecting its users by default.
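To make the three signals concrete, here is a minimal sketch (not any browser's or site's actual code) of how a server might interpret the DNT request header. The header name "DNT" and the values "1" (decline tracking) and "0" (accept tracking) follow the draft Do Not Track convention; treating an absent header as a decline mirrors the conservative, Firefox-style default described above.

```python
def tracking_allowed(headers):
    """Return True only when the user has explicitly accepted tracking."""
    dnt = headers.get("DNT")  # header is absent when no preference was expressed
    if dnt == "1":            # user declines tracking
        return False
    if dnt == "0":            # user accepts tracking
        return True
    # No signal at all: be conservative and treat it as a decline.
    return False

print(tracking_allowed({"DNT": "1"}))  # False
print(tracking_allowed({"DNT": "0"}))  # True
print(tracking_allowed({}))            # False
```

A site that wanted to handle the third signal differently would only need to change the final return value; the point of the sketch is that the choice lives in the user's header, not in the site's code.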

All of this is extremely interesting and a great relief to the end user. It could easily have gone the other way, and we are simply ecstatic that a company like Microsoft is running with DNT in order to protect user choice.

Users Vote for Facebook Policy Changes

The founder of Europe vs. Facebook, Max Schrems, has forced Facebook to put proposed policy changes up for a vote by all of its users after motivating his privacy group to flood Facebook’s Site Governance page with messages. Facebook received far more than the 7,000 comments needed to trigger a vote. Europe vs. Facebook is demanding sweeping changes to Facebook’s product rather than the small policy changes in the proposal.

The one-week voting period opened on a set of relatively benign changes, and Facebook will notify users on the web and on mobile. If over 30% of Facebook’s active users (roughly 270 million people) vote for the changes, they will go into effect; if that many vote against them, they’ll be scrapped. Otherwise, Facebook will take the results “under advisement”. Facebook’s Chief Privacy Officer for Policy, Erin Egan, stated that Facebook will consider changing its site governance voting system to discourage votes triggered by low-quality comments and to adapt to the growing size of Facebook’s user base.

Egan continued: “I really don’t think any of our changes were controversial. [Max Schrems] is interested in us changing our product, but these revisions are about our policy. We can’t please everyone. We did reach the threshold because a viral meme was created [by Schrems asking users to blindly paste in the comment "I oppose the changes and want a vote about the demands on www.our-policy.org"], and unfortunately the result is a vote.” When the feedback period ended on May 18th, over 42,000 comments, most without any actual qualitative feedback, had been filed, and a vote was inevitable.

This is only the second governance vote in Facebook’s history. The voting system was set up in 2009 when the site had 200 million users, so the 7,000-comment threshold and the 30% turnout required to make a vote binding seemed more appropriate for the user base at the time. Facebook is now considering raising the comment threshold, or possibly even doing away with the voting procedure.

Egan stated “Max is a user of ours and we appreciate his feedback, but we worry the voting threshold number may be incentivizing quantity over quality”. A new system would seek to get users actually reviewing the changes themselves and giving their own opinion, rather than being used as pawns by privacy activists.

The demands include “We want Facebook to implement an ‘Opt-In’ instead of an ‘Opt-Out’ system for all data use and all features” and “We want Facebook to limit the use of our data for advertisement”. These are much grander changes that would seriously hamper Facebook’s ability to launch new features and make money, and they are unlikely to be adopted. There’s simply no way all 900 million+ users would be willing to constantly approve every little change Facebook makes.

By creating the “I oppose the changes” meme, Europe vs. Facebook showed it would rather obstruct progress, even progress it had lobbied for, than provide real constructive criticism. While its allegiance to strict privacy could be viewed as admirable, its tampering with the commenting system cannot.

It’s still important to note that despite flaws in the system, Facebook offers its users much, much more control over site governance than any other major website. When asked if it would like to see other sites adopt a policy feedback system, Schnitt said “Absolutely, we think users should demand this kind of thing, and they deserve it too.” When asked if Twitter and Google+ were giving people enough control, Schnitt replied “That’s for their users to decide.”

Facebook’s users will have until June 8th to vote on the Statement of Rights and Responsibilities and Data Use Policy changes. Users will be directed to the voting page from ads in the sidebar of Facebook’s website, and a banner at the top of its mobile interfaces. The most significant changes users will be voting on are:

  • A clarification regarding Facebook’s existing policy that it may use your data to serve you ads outside of Facebook.com while you’re on other websites
  • A detailed new chart of how Facebook uses cookies to improve Facebook but not track you across the web
  • A more detailed explanation of how, in some cases, Facebook will “retain [your] data as long as necessary to provide you services”, whether that’s less or more time

More Than You Wanted To Know

This blog has gone down many privacy avenues. Usually they are related to your online identity or your online privacy: avenues firmly rooted in the world this company exists in. But for many people, their online world and their real lives are colliding. What has long been known as “our private lives” is quickly losing ground and becoming far more public than we would like, despite our best attempts to compartmentalize our world.

This February CBS ran an article about a teacher, her Facebook, and what happens when public life meets private life. You can read the full article here. To summarize: a school teacher went on a summer vacation to travel Europe. She (like so many other people) took photos to chronicle her journey. You know the kind of photos: the ones you share with friends to show that little café in Italy that made the best gelato you have ever had, or what the Eiffel Tower looks like from the base. Images for when words aren’t enough. But the problem didn’t stem from those photos. It stemmed from her time in Ireland, where she was photographed with a glass of wine and a Guinness.

She took that photo, as well as all of the others, and put it on her Facebook to share with friends and family. She even set her Facebook to private to avoid the collision of her two worlds. Despite all of those conscientious precautions, a student’s family member saw the photo and reported the teacher to the administration. Shockingly, the teacher was then offered the choice of being suspended or resigning. She resigned and is now fighting for her job.

So why do we find this worth writing about? Why did that article make it to this blog? Because it is a prime example of our shrinking world. Our online identities as well as the details of our private lives are making it into the public eye (and by proxy our public lives) quite often and definitely more than we want, like, or expect.

The problem is that it is getting to the point where it is impossible to keep anything personal or private. The only way would be to never share our stories, photos, moments, or lives with other people, which runs directly against the fact that we are not solitary creatures. Our doctors, our nurses, our teachers, our police, our judges, our psychiatrists, and everyone else you can think of are normal people. They have normal impulses, and outside of work they lead normal lives. These same people are at a crossroads. Do they stay behind the times, have no online identity, and live in fear of the moment when public meets private? Or do they fight, as this teacher is doing, and push for privacy and the right to exist and be normal outside of the workplace? Where do you stand? How important is your privacy to you? Is it worth protecting?

FTC Calls for Privacy Legislation

The Federal Trade Commission (FTC), the arm of the government responsible for creating and enforcing national privacy policy, has published a report about how American businesses should protect the privacy of consumers and recommends the ways companies should give consumers greater control over the data that is collected about them. As part of the report, called “Protecting Consumer Privacy in an Era of Rapid Change: Recommendations For Businesses and Policymakers,” the FTC also calls for Congress to consider creating general privacy legislation, data security and breach notification legislation, and data broker legislation.

The report calls on American businesses to use best practices when it comes to privacy. Specifically, it calls for:

  • Privacy by Design - companies should build consumers’ privacy protections into every stage of developing their products. These include reasonable security for consumer data, limited collection and retention of such data, and reasonable procedures to promote data accuracy.
  • Simplified Choice for Businesses and Consumers - companies should give consumers the option to decide what information is shared about them, and with whom. This should include a Do-Not-Track mechanism that would provide a simple, easy way for consumers to control the tracking of their online activities.
  • Greater Transparency - companies should disclose details about their collection and use of consumers’ information, and provide consumers access to the data collected about them.
In an attempt not to burden small businesses, the report concludes that these recommendations should not apply to companies that collect non-sensitive data from fewer than 5,000 consumers a year.

“If companies adopt our final recommendations for best practices – and many of them already have – they will be able to innovate and deliver creative new services that consumers can enjoy without sacrificing their privacy,” said Jon Leibowitz, Chairman of the FTC. “We are confident that consumers will have an easy to use and effective Do Not Track option by the end of the year because companies are moving forward expeditiously to make it happen and because lawmakers will want to enact legislation if they don’t.”

Data Brokers

The report takes a swipe at data brokers – companies that exist solely to buy, collate, and sell highly personal information about consumers, often without consumers’ consent or knowledge of how this data is being used. The FTC reminds data brokers that existing legislation already gives consumers the right to access information held about them by data brokers. But it also recommends that data brokers make their operations more transparent and create a centralized website where consumers can learn about brokers’ practices and their options for controlling data use.

Concerns over data brokers rose last year after an investigation by The Associated Press which found that many such brokers frequently store incorrect or outdated information, including criminal records. The investigation found that some people were denied jobs because a data broker had incorrectly reported them as a convicted felon. Last year the data broker HireRight Solutions Inc. was forced to settle a class-action lawsuit for $28.4 million after widespread complaints about inaccurate records led to legal action against the company.

Do-Not-Track

The FTC has commended the work done by the major browsers (like Firefox and Chrome) to develop Do-Not-Track technology. With DNT, users can express a choice about whether to be tracked by third parties as they move across the web. The World Wide Web Consortium (W3C), the group that defines the various technology standards for the web, is currently developing a universal protocol for Do Not Track. “The Commission will work with these groups to complete implementation of an easy-to-use, persistent, and effective Do Not Track system,” the report says.

Monetising Privacy – Would You Reveal Private Information to Buy Something Cheaper?

‘Everyone has a price’ is the old saying, and it is certainly true today where privacy is concerned. In a world where personal data is traded like any other commodity, the European Network and Information Security Agency (ENISA) – a centre of network and information security expertise for the EU – has published a study of consumer behavior in relation to the disclosure of personal information during a purchase or transaction.

In a set of controlled experiments, Dr Nicola Jentzsch and her team discovered that people have a natural, built-in mechanism to protect their privacy, but only if it doesn’t cost them anything. In the experiments, participants simulated buying tickets for a movie from one of two sellers. One of the sellers asked for more personal data (e.g. a cell phone number) than the other. When the price was the same at both sellers, the majority of purchases were made with the privacy-friendly service (about 83% of all tickets sold). But when there was a price difference (with the vendor asking for more information being cheaper), most of the participants (more than two-thirds) happily revealed the information to get the tickets cheaper.

Another interesting aspect of the study: of those who opted for the privacy-unfriendly service, some participants tried to cheat the system by supplying false information (like giving their name as Donald Duck) in an attempt to get the discount. To offset this tendency, the researchers used a lie detector to ensure only truthful information was given! After buying the tickets, participants were asked if they had concerns about whether the ticket seller would protect their information. A majority expressed concerns, with only about 0.7% of participants saying they are ‘not interested at all’ in whether organizations that collect personal data also protect it.

What is startling about this study is the size of the price difference. Were the tickets that asked for the mobile phone number half price? Discounted by a third? No, the difference in price was just $0.65. Just over half a dollar: that is what private information is worth!

The report makes several recommendations, one of which is “Personal data protection and privacy is a human right. The European Commission, EU Member States and data protection authorities should enforce a clear and consistent legal data protection framework.” This should also be true in the USA.

It seems that asking for more personal information than is necessary is becoming a “normal” part of online life. According to the ENISA report, “43% of Internet users say they have been asked for more data than necessary when trying to obtain access to or use an online service.” It is essential that online users avoid (as much as possible) services that request unnecessary amounts of data. Once you have handed over private information, there is no way to get it back. Worse still, it seems some greedy workers are willing to break confidentiality rules (and privacy policies) to make extra money on the side: an investigation by the UK’s Sunday Times found that “corrupt Indian call center workers” sold confidential personal data of more than 500,000 customers to cyber criminals and marketing firms.

Use your common sense. Don’t reveal what isn’t necessary. Use privacy-friendly services, even if they cost $0.65 more!