Hello, Is That You?

It looks like Google has been recording your voice searches (article in German). There had been rumors all along, and it was widely assumed this was going on. They keep the actual voice recordings and their transcripts, and also generate a “fingerprint” of your voice to be able to verify it.

If you extrapolate from that, they can by now …

*shudder*

Data is not an asset, it’s a liability

A short blog post that drives home a very important point:

Here’s a hard truth: regardless of the boilerplate in your privacy policy, none of your users have given informed consent to being tracked. Every tracker and beacon script on your web site increases the privacy cost they pay for transacting with you, chipping away at the trust in the relationship.

Because

The all too typical corporate big data strategy boils down to three steps:

  1. Write down all the data
  2. ???
  3. Profit

This never makes sense. You can’t expect the value of data to just appear out of thin air. Data isn’t fissile material. It doesn’t spontaneously reach critical mass and start producing insights.

Which leads to the realization:

Think this way for a while, and you notice a key factor: old data usually isn’t very interesting. You’ll be much more interested in what your users are doing right now than what they were doing a year ago. Sure, spotting trends in historical data might be cool, but in all likelihood it isn’t actionable. Today’s data is.

So

Actionable insight is an asset. Data is a liability. And old data is a non-performing loan.


Facebook Tracking People Who Have Opted Out of Tracking

Facebook specifically and individually tracks everyone, even people who aren’t FB users. Using the opt-out mechanism leaves you even worse off, since setting the opt-out cookie makes you uniquely identifiable (again).

During the opt-out process, Facebook sets a long-term identifying cookie and then uses this to track visits to pages that have a Facebook social widget. In other words: “for those individuals who are not being tracked by Facebook (e.g. non-users who have never visited a page on the facebook.com domain, or Facebook users who clear their cookies after logging out from Facebook), using the ‘opt out’ mechanism proposed for the EU actually enables tracking by Facebook” (emphasis in original).
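The mechanism is easy to sketch. A minimal, self-contained Python simulation (all names are hypothetical; this is an illustration of the technique, not Facebook’s actual code): opting out hands the browser a long-lived unique cookie, and every page embedding a social widget sends that cookie, plus the referring page, back to the tracker.

```python
import uuid

class Tracker:
    """Simulates a third-party widget host that sets a unique
    'opt-out' cookie and logs every widget request carrying it."""

    def __init__(self):
        self.visits = {}  # cookie value -> list of pages seen

    def opt_out(self):
        # Ironically, "opting out" assigns the browser a unique ID.
        cookie = str(uuid.uuid4())
        self.visits[cookie] = []
        return cookie

    def widget_request(self, cookie, referer):
        # Every page with an embedded widget reports the visit.
        if cookie in self.visits:
            self.visits[cookie].append(referer)

tracker = Tracker()
cookie = tracker.opt_out()  # user "opts out", receives a unique cookie
tracker.widget_request(cookie, "https://news.example/article-1")
tracker.widget_request(cookie, "https://shop.example/cart")

# The tracker now holds a browsing history tied to one individual.
print(tracker.visits[cookie])
```

Before the opt-out, a cookie-less visitor left no per-person trail; after it, every widget-bearing page becomes a tracking point.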

When you opt-out …

[…] Facebook “promises to stop collecting browsing information, or use it only specifically for the purpose of showing advertisements.”

So, of what use is it then?!?

They Don’t Care About Your “Online” Privacy

Messenger apps show your friends’ online status. Any time you open the app, it notifies the service that you’re “online” at the moment, and everybody else can see that in their contact lists.

And by everybody I mean anybody! If you have someone’s phone number, you can check that person’s online status as often as you want, from wherever you want (no need to be friends or anything).

A group of researchers at the Friedrich-Alexander-Universität Erlangen-Nürnberg did exactly that. They used this “feature” to “find out how frequently and how long users spent with their popular messenger,” following a random sample of 1,000 people in different countries for over eight months.
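Their approach boils down to a polling loop: repeatedly query a contact’s “online” flag and record the transitions, which yields session start and end times, and from there, usage patterns. A minimal sketch, assuming samples have already been gathered by polling (the data here is made up for illustration):

```python
def track_sessions(status_samples):
    """Given a time-ordered list of (timestamp, online) samples,
    return the inferred usage sessions as (start, end) pairs."""
    sessions = []
    start = None
    for ts, online in status_samples:
        if online and start is None:
            start = ts                    # user just came online
        elif not online and start is not None:
            sessions.append((start, ts))  # user went offline
            start = None
    return sessions

# Samples as a poller might collect them, one per minute.
samples = [(0, False), (1, True), (2, True), (3, False),
           (4, False), (5, True), (6, False)]
print(track_sessions(samples))  # [(1, 3), (5, 6)]
```

From nothing but a public status flag, this recovers when and how long someone uses the app, which is exactly why an opt-out matters.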

Looking through the project’s website should make it clear how little the creators of those apps care …

Moreover, we were able to run our monitoring solution against the WhatsApp services from July 2013 to April 2014 without any interruption. Although we monitored personal information of thousands of users for several months — and thus strongly deviated from normal user behaviour — our monitoring efforts were not inhibited in any way.

… and that they don’t want you to be able to care.

Unfortunately, affected messenger services (like WhatsApp, Telegram, etc.) currently provide no option for disabling access to a user’s “online” status. Even WhatsApp’s newly introduced privacy controls fail to prevent online status tracking, as users still cannot opt-out of disclosing their availability to anonymous parties.

Apple’s Spotlight Search Phones Home

OS X Yosemite seems to have gained the ability to “phone home” when you do Spotlight searches: it sends your search terms and location data to Apple’s servers. Of course that’s perfectly in line with Apple’s recent “trust us, we won’t collect unnecessary data” rhetoric.

[…] Ashkan Soltani, an independent researcher and consultant, confirmed the behavior, labeling it “probably the worst example of ‘privacy by design’ I’ve seen yet.” Users don’t even have to search to give up their privacy. Apple immediately sends the user’s location to the company, according to Soltani.

You can turn it off, but it’s on by default.

Whispers of Betrayal

The Guardian exposed in a series of articles how the creators of the Whisper app track individual and group behavior.

Whisper violated claims made in their own terms of service and privacy policy, which were updated just days before the Guardian article was published (but after the Guardian had asked for comment). :/

    • They had tools to track and build profiles of users, despite claiming users would be “anonymous”
    • They tracked the location of people who had explicitly opted out of geolocation
    • They cooperated with the DoD, sharing information about messages from military personnel
    • They shared information with law-enforcement bodies like the FBI and MI5 under a lower legal threshold than is common practice

They process data with a staff of over 100 in the Philippines, despite claiming to process and store all data in the US.

Update: The Guardian has since published a clarification, removing some of the previous claims. It seems Whisper really had planned to change their ToS for quite some time and doesn’t store data on non-US servers. The claims about geolocation tracking of those who opted out are based on Whisper’s ability to geolocate IP addresses (which can be a rather rough estimate).

Less “Social Media,” More Passive Data Collection, Yay!

Foursquare had a great idea:

  • remove the social aspect of sharing, just track people silently all the time, it’s easier anyway
  • why bother with user-generated content, just have them follow “experts” and feed them ~~tips~~ ads

Among the great features of the revamped app are:

  • tracking your location all the time
  • virtually no privacy controls
  • virtually no way to interact
  • suggestions almost solely based on ~~paid advertisements~~ expert opinions and tips
  • promise of more targeted ads outside of Foursquare

ArsTechnica has a nice quote on this:

This is the cleverest portion of the service’s revamp: make customers feel like they are sharing nothing, when in reality they are sharing everything. Passive information sharing and collection without the social friction—why didn’t anyone think of this before? The tragic, realistic answer is most likely “battery life.”
— Casey Johnston, ArsTechnica

Individual Mass Manipulation

There is great commentary on how and why Facebook’s infamous “emotion study” is unethical. The main point is that the researchers and Facebook violated the “informed consent” principle of research on human subjects.

There have been other “individual mass manipulation” studies; e.g. showing that you could tip the outcome of close elections by manipulating search results. But manipulating the mood of people on a massive scale is “new.” Don’t get me wrong, I don’t mean this in the sense of “they try to influence what we’re thinking through TV and ads.” I mean individual manipulation: different things are manipulated, in varying amounts, for everyone individually. Basically anything that claims “to only show you the X most relevant to you” falls into this category (especially if it doesn’t offer a way out of the filter bubble).
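The “only show you the X most relevant” pattern is easy to sketch: score every item per user and keep only the top X. Whoever controls the scoring function controls what each individual sees. A hypothetical sketch (the items and weights are invented for illustration):

```python
def personalized_feed(items, score, x):
    """Return only the x items scoring highest for this user.
    Everything else silently disappears from their view."""
    return sorted(items, key=score, reverse=True)[:x]

# A scoring function nobody outside the platform can inspect;
# nudging it (e.g. down-weighting negative posts) reshapes what
# each user sees, individually.
items = ["happy post", "sad post", "ad", "news"]
mood_weight = {"happy post": 0.9, "sad post": 0.1, "ad": 0.8, "news": 0.5}
feed = personalized_feed(items, lambda i: mood_weight[i], 2)
print(feed)  # ['happy post', 'ad']
```

A per-user tweak to `mood_weight` is all it takes to run an “emotion study” on one person without them ever noticing anything but an ordinary feed.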

But what should we do, now that we know we have the tools to influence emotions? Why not actually press the “button of happiness”?

Imagine if Facebook could have a button which says “make the billion people who use Facebook each a little bit happier”. It’s quite hard to imagine a more effective, more powerful, cheaper way to make the world a little bit better than for that button to exist. I want them to be able to build the button of happiness. And then I want them to press it.

My dystopian senses tell me: it will be used, but not in the way suggested above. We can probably draw some conclusions from the fact that the work of one of the authors is funded by the DoD. Why would the DoD (or any military/government organization, for that matter) fund anything useful to the general good of mankind?

I see three use cases for manipulating emotions:

Or to put it more eloquently:

… large corporations (and governments and political campaigns) now have new tools and stealth methods to quietly model our personality, our vulnerabilities, identify our networks, and effectively nudge and shape our ideas, desires and dreams.
[…]
I identify this model of control as a Gramscian model of social control: one in which we are effectively micro-nudged into “desired behavior” as a means of societal control. Seduction, rather than fear and coercion are the currency, and as such, they are a lot more effective. (Yes, short of deep totalitarianism, legitimacy, consent and acquiescence are stronger models of control than fear and torture—there are things you cannot do well in a society defined by fear, and running a nicely-oiled capitalist market economy is one of them).

I think netzpolitik.org put it best in their conclusion (German):

The problem posed by these kinds of experiments, and by the systems that actually enable them, is not that they are illegal or creatively, intentionally evil. This isn’t the case, even if it might feel like it.
Instead [the problem is] that they’re only a tiny step away from legitimate everyday practice. That they look a lot like ordinary ads. That they build on an already-accepted construction of reality shaped by non-transparent providers. That, thanks to their scale and stealth, they can be hidden so efficiently and easily. That they didn’t create our loss of control, but merely exploit it.

The actual study: “Experimental evidence of massive-scale emotional contagion through social networks” (PDF)