Sarah Spiekermann


Isn't it strange that 88 percent of people claim they are worried about who has access to their data? 86 percent claim they are becoming increasingly security-conscious about their data. 85 percent expect governments to impose penalties on companies that misuse data. And in the US, some 83 percent of consumers say they would stop doing business with a company entirely if they heard or read that the company had misused customer information. At the same time, 845 million people are active on Facebook, and every day over 250 million photos are uploaded to the platform. In many countries over 70 percent of shoppers use loyalty cards. Isn't this a discrepancy? Scientists call the phenomenon "The Privacy Paradox" and - as far as the scientific world is concerned - I happened to discover it first in my lab at Humboldt University. Let me tell you the story:

"cost of privacy"

Straight out of consulting, I was convinced that the most precious asset an online company can possess is high-quality personal data. High-quality personal data is - I am sorry to say - still the backbone of current Internet economics, and the more of it companies can suck from their customers, the better for them. So I built a software agent called Luci who sold students cameras and winter jackets online and asked them tons of personal questions about their lifestyle. For example, Luci asked them whether they like to show off. Or what they would do with their photos. And I was deeply convinced that people would hesitate to answer this kind of personal question. I even built a formula to calculate a "cost of privacy", which was successfully published...

Students just LOVED the agent

But: Poppycock! Students just LOVED the agent. They would tell it everything, no matter how intimate the question was. Halfway through my experiments I realized that I wouldn't get the data I was looking for. So I took the bold step of splitting my sample and giving students a really harsh privacy statement to sign. It said that all data given to agent Luci would be handed on to an anonymous sponsor of the experiment, whose name we could not reveal and who would be free to do whatever he wanted with their revelations. Wow, I was sure that the experimental participants would now be more cautious. In particular because most of them had claimed in an entry questionnaire that they were really concerned about their privacy! But again: no sign of any caution. People continued to love the agent, told it everything it asked for, and walked off happily with a mediocre product recommendation. The privacy paradox was there.

Shall he step in and protect us for our own good?

So what do we take away from this? Let me ask you: What is more important: our attitudes and convictions? Or our actual behaviour? Does an attitude, such as our attitude towards privacy, become outdated just because we do not live up to it all the time? Or are we really living in times where privacy is becoming an outdated concept (as many data-hungry companies are starting to claim, to their own advantage...)? If we fail to live up to our own socially grown expectations in these bloody addictive online environments, what should the regulator do? Shall he step in and protect us for our own good?