Dear CEO

Harrie van de Vlag and Paulien van Slingerland (TNO) and Linnet Taylor (Gr1p)

“The one fundamental rule about new technologies is that they are subject to function creep: they will be used for other purposes than their originators intended or even imagined.” [LT]

For Dear CEO, The Gr1p Foundation, at the request of the VPRO Medialab, connected some of the most prominent technology thinkers of the moment with representatives of companies and government to debate wearables and media. Through a unique letter exchange, they explore the rough edges of this technology and together unearth its most pressing controversies.

Inspired by these exchanges? Now it is up to you, the reader, to determine your position vis-à-vis wearable tech.

The writers of this conversation are:

Linnet Taylor writes for the Gr1p Foundation and is a researcher. Her research focuses on the use of new types of digital data in research and policymaking around issues of development, urban planning and mobility. Her pen pals are Harrie van de Vlag and Paulien van Slingerland, both data science consultants at TNO.

Harrie van de Vlag & Paulien van Slingerland

Thursday 28-09-2017 17:40

Dear Linnet,

We are writing you to discuss a new trend in data science: "affective computing".

Emotions and relationships have long been important in our economy. People do not buy a ticket for a concert, but for an unforgettable evening with friends. People are not looking for a new job, but for a place in an organisation with a mission that suits their world views and principles.

A stronger emotional connection has a higher value, both for companies and for consumers. This is why at TNO we are researching how affective states (emotions) can be interpreted, using wearables that record signals such as heart rate, brain activity (EEG) and skin conductance (sweating).
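To make this a little more concrete, here is a minimal sketch in Python of the kind of pipeline such research involves: per-window features from a wearable are mapped by a classifier to a coarse affective label. The features, labels, model choice and synthetic data are illustrative assumptions for the sake of the example, not our actual research setup.

```python
# Illustrative sketch only: synthetic data, assumed features and labels.
# A classifier maps per-window wearable features to a coarse affective state.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(seed=42)

# Toy feature windows: [mean heart rate (bpm), heart-rate variability (ms),
# skin conductance level (microsiemens)] for 200 hypothetical participants.
X = rng.normal(loc=[75.0, 45.0, 2.0], scale=[10.0, 15.0, 0.8], size=(200, 3))

# Toy labels: 0 = calm, 1 = aroused. Arousal here loosely follows heart rate
# and skin conductance, plus noise, so the example has learnable structure.
y = ((X[:, 0] - 75.0) / 10.0
     + (X[:, 2] - 2.0) / 0.8
     + rng.normal(0, 1, 200) > 0).astype(int)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```

In practice the interesting work lies in choosing features and labels that actually reflect felt emotion, which is exactly where the scientific and ethical questions begin.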

The starting point for our research was a question from Arnon Grunberg, who wanted to know how his readers felt while reading his books. For this purpose we conducted an experiment in a controlled environment with 200 voluntary participants. To bring this technology out of the lab and into the field, TNO, Effenaar Smart Venue and software developer Eagle Science are working towards new prototypes of applications based on emotion measurements.

The first one will be demonstrated during Dutch Design Week 2017 (October 21-29). Together with Studio Nick Verstand, we will present the audiovisual artwork AURA, an installation that displays emotions as organic pulsating light compositions, varying in form, colour and intensity.

Eventually this technology could be used, for instance, to develop new forms of market research, enabling companies to measure the emotional experience of consumers who volunteer, without disturbing that experience. This reveals which parts of the customer journey are perceived as positive and which as annoying. Acting on these insights allows companies to provide a better experience, for instance during shopping, while visiting a festival, or when following a training course in virtual reality.

At TNO, we are well aware that emotions are closely tied to the private sphere of individuals. The question arises whether consumers need to choose between their privacy on the one hand and the comfort of personalised services on the other. The upcoming privacy legislation (GDPR) also highlights the importance of this dilemma. This is why TNO is also researching technologies for sharing data analyses without disclosing the underlying sensitive data itself, for instance by keeping the data encrypted at all times. This way, from a technical point of view, the dilemma appears to be solved and there would no longer be a need to choose between privacy and convenience.
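To illustrate the principle, the toy sketch below uses additive secret sharing, one well-known building block for this kind of privacy-preserving computation: each reading is split into random shares held by different parties, so that no single party ever sees an individual value, yet the aggregate can still be reconstructed. This is an illustration of the general idea, not the actual system we are developing, which may rely on different cryptographic building blocks.

```python
# Toy illustration of computing an aggregate without any single party seeing
# the individual values, using additive secret sharing. A sketch of the
# general principle, not the system described in the letter.
import secrets

PRIME = 2**61 - 1  # all arithmetic is done modulo a large prime

def split_into_shares(value, n_parties=3):
    """Split a value into n random shares that sum to the value mod PRIME."""
    shares = [secrets.randbelow(PRIME) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares

def reconstruct(shares):
    return sum(shares) % PRIME

# Each participant's heart-rate reading is split across three servers,
# so no server ever holds a reading in the clear.
readings = [72, 88, 95, 64]
shares_per_server = list(zip(*(split_into_shares(r) for r in readings)))

# Each server sums only the shares it holds...
partial_sums = [sum(server_shares) % PRIME for server_shares in shares_per_server]

# ...and only combining the partial sums reveals the aggregate, not the inputs.
assert reconstruct(partial_sums) == sum(readings)  # 319
print("aggregate heart rate:", reconstruct(partial_sums))
```

In such a design, the shared result is an aggregate insight, while the individual measurements stay hidden from every party involved.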

At the same time, we expect that this can only be the case if people feel they can trust such a system, and that more is needed than just a technical solution. Therefore we are interested in your point of view. What else is needed to establish trust?

Best regards,

Paulien van Slingerland and Harrie van de Vlag
TNO
Innovators in Data Science


Linnet Taylor

Thursday 28-09-2017 23:07

Dear Paulien and Harrie,

I read with interest your explanation of your new project on measuring emotional experiences. It is exciting to be part of the birth of a new technology, and the wonder of innovation is clear in your AURA project which will translate sensed emotions into light. I think this will provide new opportunities to investigate processes of human emotion, especially for the ‘quantified self’ community already engaged in measuring and tracking their own experience of the world.

I question, however, whether tracking one's changing emotional state as one experiences media, or anything else in fact, is part of a 'customer journey'. This is not just about sensing, but about investigating the border between software and wetware – technology that aims to connect to and enhance the human brain.

It is interesting to its corporate sponsors because it promises new forms of access not to ‘the customer’ but to people, in all our idiosyncrasy and physicality. Those forms of access are not necessarily more accurate than asking people what they think, but they will be more seamless and frictionless, blending into our lives and becoming something we are rather than something we do.

You ask whether consumers need to choose between their privacy on the one hand and the comfort of personalised services on the other. I think this question may distract attention from a more central one: can we separate our existence as consumers from our existence as citizens, partners, workers, parents? Our emotions are an essential bridge between ourselves and others, and what we show or hold back determines the kinds of relationships we can form, and who we can be in relation to our social world.

The language of choice may not be the right language here: your project uses only volunteers, but is it clear what they are volunteering? Your technology is about 70 per cent accurate, according to its test subjects. But there is profound disagreement among brain specialists as to what we measure when we study emotions.

William James, one of the founders of psychology, argued that our experience of emotions actually results from their physical expression: we feel sad because we cry and we feel happy because we smile, not the other way around. If this is true, the sensors you are developing will have better access to the biological content of our emotions than we will, which has implications for – among other things – our freedom to form our own identities and to experience ourselves.

I am reminded of a project of Facebook’s that was recently discussed in the media. The company’s lab is attempting to produce a brain-computer speech-to-text interface, which could enable people to post on social media directly from the speech centre of their brains - whatever this means, since there is no scientific consensus that there is such a thing as a "speech centre".

The company’s research director claims this cannot invade people’s privacy because it merely decodes words they have already decided to share by sending them to this posited speech centre. Interestingly, the firm will not confirm that people’s thoughts, once captured, will not be used to create advertising revenue.

You ask what is needed to establish trust in such a system. This is a good question, because if trust is needed the problem is not solved. This is one of a myriad initiatives where people are being asked to trust that commercial actors, if given power over them, will not exploit it for commercial purposes. Yet this is tech and media companies’ only function. If their brief was to nurture our autonomy and personhood, they would be parents, priests or primary school teachers.

The one fundamental rule about new technologies is that they are subject to function creep: they will be used for other purposes than their originators intended or even imagined. A system such as this can measure many protected classes of information, such as children’s response to advertisements, or adults’ sexual arousal during media consumption.

These sources of information are potentially far more marketable than the forms of response the technology is currently being developed to measure. How will the boundary be set and enforced between what may and may not be measured, when a technology like this could potentially be pre-loaded in every entertainment device? Now that entertainment devices include our phones, tablets and laptops, as well as televisions and film screens, how are we to decide when we want to be watched and assessed?

Monitoring technologies produce data, and data’s main characteristic is that it becomes more valuable over time. Its tendency is to replicate, to leak, and to reveal. I am not sure we should trust commercial actors whose actions we cannot verify, because trust without verification is religious faith.

Yours,

Linnet Taylor
TILT (Tilburg Institute for Law, Technology and Society)

Harrie van de Vlag & Paulien van Slingerland

Thursday 05-10-2017 14:45

Dear Linnet,

Thank you for sharing your thoughts. The topics you describe underline the importance of discussing ethics and expectations concerning new technology in general, and affective computing in particular.

You end your letter saying that ‘if trust is needed, the problem is not solved’. This is true in cases where the trust would solely be based on a promise by a company or other party. However, there are two other levels of trust to take into account: trust based on law and trust based on technical design.

To start with trust based on law: the fact that a technology opens up new possibilities does not mean that these are also allowed by law. The fact that pencils can be used not only to write and draw, but also to kill someone, does not mean that the latter is allowed by law.

The same goes for affective computing: while the possibilities of affective computing and other forms of data analytics are expanding rapidly – your examples illustrate that – the possibilities for actually applying this technology are increasingly limited by law. As a matter of fact, the new privacy legislation (GDPR) will come into effect next year. Europe is significantly stricter in this respect than the United States (where companies like Facebook are based).

For example, as TNO is a Dutch party, we cannot collect data for our research during the AURA demonstration without the explicit consent of the voluntary participants: they have to sign a document. Moreover, we need to ensure that the data processing is adequately protected. For special categories of data, such as race, health and religion, extra strict rules apply.

Furthermore, we cannot use this data for any other purpose than the research described. For instance, VPRO was interested in our data for publication purposes. However, aside from the fact that we take the privacy of our participants very seriously, we are simply not allowed to do this by law. So TNO will not share this data with VPRO or any other party.

Altogether, applications of affective computing as well as systems for sharing analyses without disclosing data are both limited by law. We are actually developing the second category to facilitate practical implementation of the law, as the system is designed to guarantee technically that commercial companies (or anyone else, for that matter) cannot learn anything new about individuals.

This is trust by technical design, a novel concept that does not require a promise or law in order to work. At the same time, we realise that this is a new and unfamiliar way of thinking for many people. Therefore, we are interested to learn what is needed before such a system can be adopted as an acceptable solution.

To this end, let us rephrase our original question as follows: under what conditions would you recommend that people provide their data to such a system, given the technical guarantee that no company or other party would actually be able to see the data, even if they wanted to?

Best regards,

Paulien van Slingerland and Harrie van de Vlag
TNO
Innovators in Data Science


Linnet Taylor

Sunday 08-10-2017 21:56

Dear Paulien and Harrie,

Your response is a useful one. It has made me consider what we mean when we talk about trust, and how the word becomes stretched across very different contexts and processes. You ask under what conditions I would recommend that people provide their data to a system that can sense their response to media content, given the technical guarantee that no company or other party would actually be able to see the data, even if they wanted to.

This is, of course, a difficult question. People should be free to adopt any technology that they find useful, necessary, interesting, stimulating. And it is likely that this sensing system will be judged all of these things. Let us be honest here, though – it is not a citizen collective that has asked us to write this exchange of letters.

We are exchanging thoughts about the future activities of media corporations, at the request of a media corporation. If the technology were going to be used exclusively in a highly bounded context where the data produced could not be shared, sold or reused in any way, I am not sure we would have been asked to have this conversation.

I think the reason we have been asked to exchange ideas is because there are huge implications to a technology that purports to allow the user to view people’s emotional processes. This technology has the potential to help media providers shape content into personal filter bubbles, like our timelines on social media.

These bubbles have their own advantages and problems. There has been much recent analysis, for example, of how the populist parties coming to power around the world have benefited hugely from digital filter bubbles where people access personalised content that aligns strongly with their own views.

It is indeed important that such a system should be used in accordance with the law. But data protection law, in this case, is a necessary but insufficient safeguard against the misuse of data. The real issue here is volume. Most people living today are producing vast quantities of digital data every moment they are engaged with the world.

These data are stored, kept, used, and eventually anonymised – at which point data protection law ceases to apply, because how can privacy relate to anonymised data? Yet the system you are developing demonstrates exactly how. It is one of many technologies that will potentially make profiling easier. It will show providers our weak points, the characteristics that make it possible to sell to us – and it can do this even if we do not use it.

An example: someone wishes to live without a filter bubble and does not consent to any personalisation. But all the other data they emit in the course of their everyday life generate a commercial profile of them which is highly detailed and available on the open market. The features which make them sensitive to some types of content and not others are identifiable: they have children, they like strawberries, they are suffering domestic violence, they are made happy by images of cats. A jumble of many thousands of data points like these constitute our digital profiles.

But it is not only our characteristics. It is those of people around us, or like us. Knowledge about the attributes of users of a system such as yours (whose response to content can be directly measured) can be cross-referenced with the attributes of those who do not use it. Once this happens, it becomes possible to infer that my heart will beat harder when I watch one movie than when I watch another; that I will choose to go on watching that provider’s content; that my attention will be available for sale in a particular place at a particular time.

In this way, consent and privacy become meaningless if there are enough data points about us all: new technologies that pinpoint our behaviour, feelings and susceptibilities are valuable not for their immediate uses but as an addition to the long-term stockpile of data on all of us – and especially useful with regard to those who do not choose personalisation and are therefore harder to pinpoint and predict.

This is why I am sceptical about invoking ‘trust’ as something that can be generated by making sure individual applications of a particular technology comply with data protection law. Data protection is a cousin to privacy, but it is not at all the same thing. We may guard data without guarding privacy, and we may certainly trust that our data is being handled compliantly with the law, while also having reservations about the bigger picture.

Things that are perfectly permissible under data protection law, yet are also unfair, include charging different people different prices for the same goods online, following users' activities across devices to understand precisely what makes them respond to advertisements, and passing our personal data on to an unlimited number of subsidiary companies. Law is no panacea, nor can it be relied upon to predict what will happen next.

I do not cite these things to argue that you should stop developing affective computing technologies for commercial use. I use them to suggest two fundamental realities: first, that we are no longer capable of understanding the long-term or collective implications of the data we emit, and second, that our consent is not meaningful in that context.

Having made my argument for these two problems, and how they relate to your work, I can pose a question in return: how can we, as developers and users of data analytics technologies, collaborate to look beyond legal compliance to the many possible futures of those technologies, and create ways to shape those futures?

Yours with best regards,

Linnet Taylor

Other conversations

Curious what the other duos had to say? Check out all four 'Dear CEO' conversations here.

The writers will finish their conversation live during our 'We Know How You Feel Tonight' debate night on Wednesday October 25th 2017 in De Effenaar, Eindhoven.