After 65 years of trial and error, the first machine has passed the Turing Test
Yesterday the University of Reading published the big news: after 65 years of trial and error, the first machine has passed the Turing Test. This news means big responsibility! System designers and engineers must now – more than ever – think about the ethics of the machines they build.
I welcome the news with mixed feelings. On the one hand, I am thrilled. On the other hand, I feel that the computer science world is not ready for the responsibility ahead. Notably, I see very smart and influential people in the legal, political and economic fields recently talking (publicly!) about the outdatedness of informed consent and its supposed practical and technical infeasibility. Informed consent is that “stubborn” European insistence that legally requires machines to ask for our permission before they are allowed to process our personal data.
Now, in the face of Turing-level systems, I want the machine – more than ever – to ask for my permission, even if it is only for me to know that it is a machine I am talking to. I want to know what information that artificial being en face knows about me. And I want to be able to withdraw its permission to use my information at any time. If informed consent is taken away from us humans now, then “Good night, Mary”, as an old German saying would put it. The technical architectures of the coming superintelligent system world would go in the wrong direction.
Unfortunately, powerful people have recently started to argue for the abolishment of our consent to personal data processing. I am not sure whether this is for reasons of power (on the side of policy makers and companies), laziness and inflexibility (on the side of engineers), or simply extreme naivety and an inability to understand machines (mostly on the side of lawyers). Whatever it is, abolishing consent is wrong and it is dangerous.
Technically, giving consent can be “as reliable and easy as turning on a tap and revoking that consent as reliable and easy as turning it off again” (Edgar Whitley, London School of Economics; Whitley, 2009). So let's become a bit more technically frank for all those who still don't want to believe this message and confront me with the outmoded 1990s argument that “huge piles of consent forms are just too complicated for people …”:
Timely technical proposals foresee that Internet browsers serve as mediators between the “intelligent” infrastructure and us (see e.g. Langheinrich, 2003, 2005; Spiekermann, 2008). In the near future our browsers can become more sophisticated personal software agents. They learn and store our privacy preferences and then automatically permit or block requests to collect data about us. Requests for our data, as well as any data sharing, are logged on our side, the client side (Danezis, Kohlweiss, Livshits, & Rial, 2012), as well as by the requesting data-collecting entities. The agreed data exchange terms and conditions enter a kind of “sticky policy” that is attached to the data collected from us (Casassa Mont, Pearson, & Bramhall, 2003). These policies travel as metadata tags with our information into the databases of data controllers and processors, who then need to comply with the extent and conditions under which we allow them to use our data (Nguyen, Haynes, Maguire, & Friedberg, 2013). Policies either deny any further use of our personal data (opt-out) or allow for it (opt-in). Policies may also detail a more elaborate set of specific privacy preferences with the help of protocols similar to P3P (Cranor et al., 2006). Consent may also be withdrawn or granted by us dynamically at later points in time (Kaye et al., 2014).
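To make the mechanics above concrete, here is a minimal sketch of what such a “sticky policy” could look like as a data structure: consent terms that travel with a piece of personal data, with a client-side log of every grant, revocation and access request. All names here are illustrative assumptions, not taken from any of the cited systems or standards.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class StickyPolicy:
    """Hypothetical consent metadata attached to a piece of personal data."""
    data_subject: str                               # whose data this is
    allowed_purposes: set = field(default_factory=set)
    log: list = field(default_factory=list)         # client-side audit trail

    def _record(self, event, purpose):
        # Every grant, revocation and access request is logged with a timestamp,
        # mirroring the client-side logging the proposals foresee.
        self.log.append((datetime.now(timezone.utc).isoformat(), event, purpose))

    def grant(self, purpose):
        self.allowed_purposes.add(purpose)          # opt in
        self._record("grant", purpose)

    def revoke(self, purpose):
        self.allowed_purposes.discard(purpose)      # withdraw consent later on
        self._record("revoke", purpose)

    def permits(self, purpose):
        # A data controller would call this check before every use of the data.
        self._record("request", purpose)
        return purpose in self.allowed_purposes

# Usage: consent as easy to turn on and off as a tap.
policy = StickyPolicy(data_subject="alice")
policy.grant("service-personalisation")
assert policy.permits("service-personalisation")    # opt-in honoured
assert not policy.permits("ad-targeting")           # no consent was ever given
policy.revoke("service-personalisation")            # "turning off the tap"
assert not policy.permits("service-personalisation")
```

In a real deployment the policy would of course be serialised as a metadata tag, cryptographically bound to the data, and enforced and audited on the controller's side; the sketch only shows the opt-in/opt-out and dynamic-withdrawal logic.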
The true story is that all of these processes and technologies to manage our privacy have been around for some time, but industry has fought bitter battles to avoid using them (who is surprised?). Part of that battle is to tell some faithful politicians and lawyers that consent would simply overwhelm people and be so hard to implement technically … a blatant lie.
The time is right, therefore, for the regulator to step in and mandate – ideally in the forthcoming European Data Protection Regulation – that companies have to adhere to the consent information we send them. They need to be regularly audited for this adherence and must be obliged to store our policy meta-tags together with the data we send them. (Sarah Spiekermann, 10.6.2014)
Casassa Mont, M., Pearson, S., & Bramhall, P. 2003. Towards Accountable Management of Identity and Privacy: Sticky Policies and Enforceable Tracing Services. Bristol: Hewlett-Packard Laboratories.
Cranor, L. F., Dobbs, B., Egelman, S., Hogben, G., Humphrey, J., & Schunter, M. 2006. The Platform for Privacy Preferences 1.1 (P3P1.1) Specification. W3C Working Group Note, 13 November 2006. In R. Wenning & M. Schunter (Eds.). World Wide Web Consortium (W3C) P3P Working Group.
Danezis, G., Kohlweiss, M., Livshits, B., & Rial, A. 2012. Private Client-side Profiling with Random Forests and Hidden Markov Models. Paper presented at the 12th International Symposium on Privacy Enhancing Technologies (PETS 2012), Vigo, Spain.
Kaye, J., Whitley, E. A., Lund, D., Morrison, M., Teare, H., & Melham, K. 2014. Dynamic consent: a patient interface for twenty-first century research networks. European Journal of Human Genetics: 1-6.
Langheinrich, M. 2003. A Privacy Awareness System for Ubiquitous Computing Environments. Paper presented at the 4th International Conference on Ubiquitous Computing (UbiComp 2002), Göteborg, Sweden.
Langheinrich, M. 2005. Personal Privacy in Ubiquitous Computing – Tools and System Support. PhD dissertation, ETH Zürich, Zürich, Switzerland.
Nguyen, C., Haynes, P., Maguire, S., & Friedberg, J. (Microsoft Corporation). 2013. A User-Centred Approach to the Data Dilemma: Context, Architecture, and Policy. In M. Hildebrandt (Ed.), The Digital Enlightenment Yearbook 2013. Brussels: IOS Press.
Spiekermann, S. (Humboldt University). 2008. User Control in Ubiquitous Computing: Design Alternatives and User Acceptance. Aachen: Shaker Verlag.
Whitley, E. A. (London School of Economics). 2009. Informational privacy, consent and the “control” of personal data. Information Security Technical Report, 14: 154-159.