Dear CEO

Geert-Jan Bogaerts (VPRO) and Tessel Renzenbrink (Gr1p)

“Technology is not inherently good or bad or neutral. It is what we make it.” [TR]

For Dear CEO, the Gr1p Foundation, at the request of the VPRO Medialab, connected some of the most prominent technology thinkers of the moment with representatives of companies and government to debate wearables and media. Through a unique letter exchange, they explore the rough edges of this technology and together unearth its most pressing controversies.

Inspired by these exchanges? Now it is up to you, the reader, to determine your position vis-à-vis wearable tech.

Writers of this conversation are:
Geert-Jan Bogaerts heads the digital media department of Dutch broadcaster VPRO, where he is responsible for the digital channels and for innovation and distribution strategy.

Tessel Renzenbrink is part of the network of the Gr1p Foundation. She is a freelance writer and web editor focusing on the impact of technology on society, particularly information and renewable energy technologies.

Geert-Jan Bogaerts

Sunday 17-09-2017 11:59

Dear Tessel,

It feels almost Victorian, like in an epistolary novel by Mary Shelley or Anne Brontë, to start corresponding with a complete stranger on a subject that's apparently close to both our hearts. I'm eager to learn what themes you will provide and I look forward to discussing them with you. Of course, at the same time there's a strange twist to it. We write to each other but we also know that our correspondence will be made public, and therefore one tends to — or rather, I tend to — put one's best foot forward. I mean, this is not without obligation.

But anyhow, we are meant to begin this correspondence with a short introduction. I don't have to mention my name as you already know it, just like my position — head of digital at VPRO. Rather than just stating the facts I think it's more interesting to tell you how I see myself, what I identify with most. I mean, of course I am a son, a father, a brother, a husband, a friend, a workmate — these are the parts we can all more or less see ourselves in. But what makes me different? What defines my identity most of all?

The first word that springs to mind is journalist. Although nowadays I am much more a manager, a strategist and a policy maker, my background as a journalist still shines through in everything I do. It determines what questions I ask, how I view the world and which solutions I come up with for problems I encounter. Fifteen years of editorial work — as a freelancer first and later writing for de Volkskrant (business desk and correspondent in Brussels) — does shape you for life.

At the time — we're talking the late nineties — I was stationed in Brussels and reported on the EU, NATO and Belgium, but in my own time I got involved in the online world. The strategic implications of this technological progress were far from clear then, but it was already evident that the internet would profoundly change our trade and society in general. In 2003 I made it my profession as well, first as head of online at de Volkskrant, from 2010 as a freelance writer, advisor and teacher, and since 2014 in my present job.

How do I observe technological progress now? Not just from the strategic mission that comes with my job, but explicitly also from the impact this progress has on our culture, our coexistence, our economy, our politics, our government. I feel it is very much a key task for public broadcasters to sketch the consequences, to explain developments and to ask questions. It is from that perspective that I look at our project "We Know How You Feel". What exactly does it mean when our thoughts and feelings are out in the open? How does that change us? As an individual, in our relationships and in our social interactions?

I hope and expect this project will bring us interesting new insights.

Warm regards,

GJ Bogaerts
Head of digital VPRO

 

What exactly does it mean when our thoughts and feelings are out in the open? How does that change us?

Geert-Jan Bogaerts

Tessel Renzenbrink

Sunday 24-09-2017 23:55

Hi Geert-Jan,

I must confess that I started out as a techno-optimist. I was convinced that the liberating possibilities of information and communication technology would actually lead to the most positive outcome. These possibilities lie mainly in the fundamental shift from centralised to decentralised. From a world ruled by a small group of people in positions of power to a world in which every voice is equal. I was convinced that this levelling would erode the power of institutional strongholds.

Take the mass media for example. Newsrooms at papers and tv stations used to both determine what the news was and how it was framed. The documentary Page One relates how The New York Times saw its authority diminished when the internet surpassed the paper as an information source.

In the days of old the NYT set the agenda. What the paper wrote determined what people talked about. That fact is presented with pride and a yearning for better days. No one asks if it is at all desirable when just a handful of editors sets the public debate, day in day out.

Another example of decentralisation is the rise of cryptocurrencies like Bitcoin. They enable monetary transactions without the interference of a central authority. Banks will no longer be too big to fail when that system takes hold; they will be obsolete.

As we all know, things turned out differently. The internet did not decentralise the world but the world centralised the internet. Once the web became popular, it was taken over by commercial parties. Almost 80 percent of web traffic now goes through Google and Facebook.

Google's algorithms determine which information comes up when you do a web search. Facebook has inserted itself into our personal interactions with family and friends and forces us to communicate by Facebook's rules. It will do everything to keep us on its platform as long as possible, so it can sell our time and attention to advertisers. And, of course, both companies collect enormous amounts of data on us.

By now I see that technology does not necessarily propel us towards the most positive (or negative) outcome. Technology is not inherently good or bad or neutral. It is what we make it. That is why I got involved in the Gr1p network. The Gr1p Foundation wants to give people more grip on their digital surroundings so they can make informed choices.

Our choice of technologies and the way we use them impact our society. But presently technological development is mainly corporate-driven. That's why, both with Gr1p and in my work as a writer, I strive for greater involvement of citizens in the digitalisation of society, so we can decide in a democratic way what kind of future we want to build using technology.

I fully agree with you that there is a task for public broadcasters here. And — more specifically about the subject of our correspondence — I find it useful that VPRO Medialab dives into emerging technologies. As a public institution you can study them from a different perspective than profit-driven companies do. My first question to you therefore concerns how you interpret and carry out that task.

If I understand correctly, each year you research a new technology and its impact on the process of media production. Last year it was virtual reality and this year it's wearables. You aim specifically at measuring emotions with wearable technology and the role this could play in creating and consuming media.

A practical application you research is the use of emotional data by broadcasters to offer people a personal, mood based viewing experience. With what purpose do you research that application? What kind of service would you like to offer your viewers by using wearables?

You wrote that the project aims to find out what it means when our thoughts and emotions are out there for everyone to see. How is that researched practically? Which questions are asked and what is being done to find the answers? What do you think could be the distinctive role of the Medialab in the questioning of wearable technologies?

Kind regards,

Tessel Renzenbrink
Gr1p network

 

Geert-Jan Bogaerts

Saturday 30-09-2017 21:04

Hi Tessel,

I did not only start out as a techno-optimist, I still am one. I just never believed that technological progress in itself was a sufficient condition for human happiness, collective or individual. It is a necessary condition though. Without technological advancement we would still be subject to the whims of nature.

But indeed, ultimately it's how we apply technology that determines its quality, positive or negative. So I agree that technology in itself is neutral. It's the scientists, the artists, the designers and the storytellers who can ultimately give it direction and meaning. In my view they set a standard, a standard we in turn need in order to determine how far we can stray from it.

We can assume a critical stance towards Google and Facebook and other data-driven companies because there is an entirely different group of people thinking about alternative approaches. They constitute the subculture of technological progress and they never cease to ask critical questions about applications, whether these are driven by profit or by a lust for power and control (the NSAs of this world).

Anyhow, as far as I'm concerned public broadcasters will remain, for as long as possible, a safe environment where this critical questioning and free thinking is possible, where alternatives can be thought out and where experimenting with new technologies is allowed. At the VPRO we even consider that to be a core task.

We apply it as often as possible in our own productions, but naturally we also have to work within a set of rules: we must reach a minimum number of viewers, listeners or visitors, and there is a limit to what productions can cost. We set up the Medialab in Eindhoven to be a truly free environment, where we try to liberate ourselves as much as possible from all these rules.

The Medialab is always on the lookout for relevant developments it can pick up and research, fed as much as possible by the available knowledge inside the VPRO and a wider network of artists, scientists, designers, authors and journalists.

Innovation in public broadcasting is always focused on media, both their production and consumption. That's another reason why it is a core task: we see our audience moving away from so-called linear viewing and embracing new platforms. So we have to get to know these platforms as well. We must be able to handle them and to judge if such a platform or new technology could be of any benefit to us. By doing so we get to know these technologies and we find out what their positive — and possibly negative — applications are.

We expect the influence of wearable technology on our media consumption to grow as it becomes more popular. We’ve already seen that very convincingly with the portables we now all carry: our smartphones, our tablets and our e-readers. But wearable technology is developing rapidly: from smart watches to sweatbands and underwear that can monitor our heart rate, blood pressure and body temperature. Even our sex life is not safe. Remote satisfaction no longer requires a tour de force…

Wearables can be used to produce media and to consume media. We will be able to create wonderful things using them, but we must also look at the flip side. My biggest worry concerns the data wearable technology can collect and exchange. And that is what this program predominantly focuses on.

Which personal data are we giving away without knowing it? How can we make our public conscious of that fact? What do my glance, my posture and the way I walk tell the shop where I get my daily groceries? We know that some clothing stores already experiment with personalised display advertising, based on a lightning-fast analysis of my personal traits.

"We Know How You Feel" aims at giving the audience insight in these developments and processes. Last year we did a similar project, called "We Are Data". The accompanying website clicklickclik.click received almost a million clicks. It is evident that the subject lives. It’s urgent and it calls out for critical questioning.

I see many similarities between the goals I mentioned above and your observations about Gr1p. My counter-question to you is: what do you see as the most effective way to reach these goals? Is it enough to make the public aware? And what is the best way to achieve this awareness?

Warm regards,

GJ Bogaerts
head of digital VPRO

Technology is always an expression of certain norms and values. That’s why it is necessary for scientists and artists to critically question it.

Tessel Renzenbrink

Tessel Renzenbrink

Sunday 4-10-2017 22:08

Hi Geert-Jan,

Technology is neither good nor bad, on that we agree. But unlike you I don't think technology is neutral. On the contrary. Every technological artefact is an expression of a set of cultural values. Algorithms, for example, can mimic the prejudices that live in a society.

To give an example: some courts in the United States use algorithms to determine the sentence a convict will receive. Based on data, a calculation is made of the risk that someone will reoffend in the future. In the case of a high score, the judge can decide to impose a harsher sentence.

Research shows these algorithms are biased: black people are often given higher scores than white people. Eric Holder, who served as attorney general under President Obama, spoke out against the use of such algorithms because they could 'exacerbate unwarranted and unjust disparities that are already far too common in our criminal justice system and in our society.'
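
To make the mechanism concrete, here is a deliberately simplified sketch of my own. Everything in it, the function name, the features, the weights, is made up; it does not reproduce any real court's model. It only shows how a score that never looks at race directly can still come out higher for one group, because a proxy feature such as the number of recorded arrests reflects where the police patrol most heavily.

```python
# Toy illustration only: a made-up risk score, not any real court's model.
# Race is never an input, but 'prior_arrests' acts as a proxy: in heavily
# policed neighbourhoods the same behaviour leaves more arrests on record,
# so the score comes out higher for people who live there.

def risk_score(prior_arrests: int, age: int, employed: bool) -> float:
    """Return a score from 0 (low risk) to 10 (high risk)."""
    score = 1.0 * prior_arrests          # weight chosen arbitrarily for this example
    score += 2.0 if age < 25 else 0.0    # younger defendants score higher
    score -= 1.5 if employed else 0.0
    return max(0.0, min(10.0, score))

# Two defendants with comparable behaviour and circumstances; one lives in an
# over-policed area and has more recorded arrests for the same conduct.
print(risk_score(prior_arrests=2, age=30, employed=True))  # 0.5 -> low score
print(risk_score(prior_arrests=7, age=30, employed=True))  # 5.5 -> much higher score
```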

Technology is always an expression of certain norms and values. That's why it is necessary — like you say — for scientists and artists to critically question technology, to make these implicit values visible and to discuss them when necessary. But that's not sufficient, because it is reactive. If you only react after the technology is brought to the market, you already operate within a certain paradigm.

As people and as a society we will have to act earlier in the process, so we can determine what we want from tech. It does matter what you build, and it matters that you think about what you want to build before you even start. And that is where values come in. Ultimately it is not about which technology you want to realise but about which values you wish to embody in technology.

In that sense it's interesting that Medialab does not just raise critical questions about wearables with "We Know How You Feel", but also experiments with them, in cooperation with artist Nick Verstand and TNO. By doing so, Medialab and its partners claim a creative role and with it the ability to determine the values.

The research question they pose is: can we tune our media offering to your mood on the basis of emotional data? But is that an interesting question? Which underlying values does such a goal embrace and which does it leave out?

I see how this use of emotional data could serve broadcasters. A personalised media offering might make people stay on your channel longer. It benefits the ratings. But how does it serve the public interest? Because to me the hunt for clicks and eyeballs that holds so many media in thrall is not a goal in itself.

And then there is of course the looming spectre of the filter bubble. Personalisation based on data — emotional or otherwise — will by definition lead to a media offering tuned to your interests and convictions. That way it will confirm and reproduce your view of the world, while I think it's a key task for a public broadcaster to make people familiar with the social environment of other groups in society.
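
To show what I mean, here is a toy sketch of my own. It is purely hypothetical, not how Medialab, TNO or any broadcaster actually works, and the catalogue and mood numbers are invented. It simply ranks titles by how closely they match your measured mood, which is the filter bubble mechanism in miniature: you are only ever shown what already fits your current state.

```python
# Hypothetical sketch of mood-based personalisation, not any broadcaster's system.
# Each title carries a hand-made 'mood profile'; the recommender picks the title
# closest to the viewer's measured mood, so it never surprises.

CATALOGUE = {
    "calm nature documentary": {"calm": 0.9, "excited": 0.1},
    "investigative report":    {"calm": 0.4, "excited": 0.6},
    "high-energy talk show":   {"calm": 0.1, "excited": 0.9},
}

def recommend(mood):
    """Return the title whose mood profile is closest to the measured mood."""
    def distance(profile):
        return sum((profile[k] - mood.get(k, 0.0)) ** 2 for k in profile)
    return min(CATALOGUE, key=lambda title: distance(CATALOGUE[title]))

# A wearable reports a calm state, so the calm item wins; repeat this every
# evening and the viewer's window on the world keeps narrowing.
print(recommend({"calm": 0.8, "excited": 0.2}))  # -> 'calm nature documentary'
```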

In your letter you asked if broad awareness is enough to steer technological development in a direction that serves the public interest. Well, I don’t think it is enough but it’s a start. It’s under the pressure of a collective conviction that things start to change.

Take for example this other technological revolution that is in full swing now: the energy transition. Over the decades people grew more and more conscious of the fact that the economy, and the energy sector in particular, had to become sustainable.

Because of this awareness action was taken in ever more domains. Governments introduced laws and treaties. Engineers started innovating. Tax money was made available to pay for this innovation. Consumers made different choices. Companies went green.

You asked for the best way to achieve this awareness and my answer is: alternatives. Without alternatives there is no course of action, and that leads to a certain resignation. Why bother about something you can't change anyway? Only when viable sustainable energy technologies became available were people able to turn their worries into actions.

But of course these alternatives did not come out of the blue. They were pioneered by people and institutions looking for other solutions, asking different questions because they took different values as their starting point. That's why I want to ask you what values underlie the AURA art installation.

Kind regards,

Tessel

Geert-Jan Bogaerts

Sunday 7-10-2017 18:11

Hi Tessel,

Let me start by answering your question on the values that underlie our Medialab project. By far the most important value to me is insight. You could make a sequence, starting with data. Data leads to facts; when you have facts you are informed, and information in turn can lead to understanding or insight. None of these steps is self-evident: it takes an effort to get from data to facts, from facts to information and from information to insight.

With our project "We Know How You Feel" we aim to question the ease with which some people in the world of media seem to take the use of algorithms and data for granted. For if we don’t use them in the right way we do indeed run the risks you described earlier: filter bubbles, the eradication of surprise and serendipity, choosing the common denominator instead of finding interesting niches that can truly teach people something new.

Only when we (as a society) have a real understanding of how data and algorithms influence our lives — and will influence them even more as smart appliances take over more of our environment — can we think of alternatives. I'm fortunate enough to work for a broadcaster that has a genuine interest in these alternatives and reports on them on a regular basis, most prominently in Tegenlicht/Backlight.

Imagination is the foundation of every technological innovation and every invention. The American author Neal Stephenson has teamed up with Arizona State University for an interesting collaboration, Project Hieroglyph. It connects science fiction writers to scientists and builds on a thought by Carl Sagan.

He once said that the road to the most groundbreaking science was paved by science fiction. It's the power of imagination leading the way for science. If we hadn't been able to imagine that one day people would no longer die of smallpox or pneumonia, the smallpox vaccine and antibiotics would never have been invented.

Shortly after World War II Arthur C. Clarke had the idea that global communication could be made a lot easier if we were able to launch satellites that would stay in a fixed place above the earth. Twenty years later, in the mid-1960s, the first geostationary satellite was launched and nowadays we can no longer live without them.

On the neutrality of technology: I agree with you that we as people create technology and that we do so from our own needs and biases. In that sense technology is not neutral indeed. I think the nuance can be found in Kevin Kelly's observations in 'What Technology Wants'. Kelly states that technology has its own evolution and creates its own progress, independent of mankind. In that view technology is not influenced by human prejudice.

Let's make this vision concrete. Google was recently blasted because its image-recognition software tagged photos of black people as gorillas. Shortly afterwards Google's AdWords turned out to show more ads for well-paid jobs to men than to women.

These are examples of technology being applied in a non-neutral way. But underneath these examples lies an instrument panel of statistical mathematics, with lots of attention to regression analysis and standard deviations in the programming languages Google applies. Scientists had been using this panel for much longer and it was only a matter of time before software developers would discover it as a tool to analyse the enormous amount of available data. That analysis in turn enables new applications. Siri and Alexa grow ever smarter, but at the same time they remain products of human imagination and consequently of human bias.
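
To make that panel a little more tangible, here is a deliberately tiny example of my own, with invented numbers and certainly not Google's actual system: fit an ordinary least-squares regression to historical ad logs in which high-paying job ads were mostly shown to men, and the 'neutral' mathematics will faithfully carry that skew into every new prediction.

```python
# Hypothetical data and a deliberately tiny model, not Google's actual system.
# An ordinary least-squares fit on skewed historical ad logs simply learns the
# skew and carries it forward; no prejudice has to be programmed in explicitly.
import numpy as np

# Historical log: gender encoded as 0 (women) / 1 (men); target is the share of
# high-paying job ads shown in past campaigns, already skewed by earlier choices.
gender     = np.array([0, 0, 0, 0, 1, 1, 1, 1])
shown_rate = np.array([0.04, 0.06, 0.06, 0.04, 0.30, 0.34, 0.28, 0.32])

slope, intercept = np.polyfit(gender, shown_rate, 1)  # plain least squares

def predicted_rate(g: int) -> float:
    return intercept + slope * g

print(f"predicted ad rate for women: {predicted_rate(0):.2f}")  # ~0.05
print(f"predicted ad rate for men:   {predicted_rate(1):.2f}")  # ~0.31
# The regression is mathematically 'neutral', yet it reproduces the disparity.
```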

In my view the real peril of this development is not contained in the observation itself. Human progress is only possible because we have ideals that spring from our own vision of the world. The peril is in the fact that the means to achieve this progress are in the hands of ever fewer people. Facebook, Google, Apple, Amazon and Microsoft are building our new world. It frightens me that these are companies that evade any form of democratic control, and are judged by their shareholders on just one thing: net profit per share.

To be honest, I'm not very optimistic that a 'collective conviction' can arise — as you wrote — to raise the pressure for change. These companies operate on a worldwide scale and there is not a trace of worldwide political consensus on how to handle them. The EU goes its own way and introduces a 'right to be forgotten'. Within the EU, Germany is the only country that holds platforms liable for allowing hate speech. US policy, meanwhile, is aimed at safeguarding the position of these companies and introduces laws to protect them. And then we haven't even started talking about the breaches of internet freedom by, for example, Russia and China.

But, entirely in line with my techno-optimistic vision, I also believe that technology will provide a solution for all of this in the end - blockchain FTW!

GJ Bogaerts
head of digital VPRO

Tessel Renzenbrink

Sunday 4-10-2017 22:08

Hey Geert-Jan,

There are three elements in your letter I find hard to reconcile. You say technology always expresses bias because it springs from the human brain, which is never value-free. I agree with you on that point: technology is not neutral. You continue by saying that the real danger lies in the fact that technology is developed by a small tech elite: the Amazons, the Facebooks. These companies are not subject to democratic control and their main steering mechanism is financial gain. That is a scary thought indeed.

But in the end you say everything will be fine, because technology itself — in the shape of blockchain — will provide a solution. Like an autonomous power that will dethrone the monopolists, irrespective of what we, the people, do. That conclusion is at odds with the first two statements. Technology after all is always an expression of human values. When technological development lies in the hands of a small group of people, it will spread and cultivate the values of that group. That will give these people more control over the playing field. The technological domain will become ever more homogeneous and assume a form that serves the interests of this group.

Yet you still trust blockchain technology to develop autonomously in this environment, so it can erode the power of the tech elite. I don't share your optimism on this. Blockchain, like any other technology, is subject to the economic, political and social structures in which it is developed. Why would the dynamic that led to the monopolisation of the internet by a few companies leave blockchain undisturbed? In the last two paragraphs of your letter you say you have little faith in social pressure bringing about change. According to you, blockchain will act as that change agent.

I have a totally opposite view and I’ll tell you why by means of an example. This fascinating graph shows the prosperity level of mankind over the last 2000 years, and is often cited when it comes to technological progress.

Reference: Max Roser (2017), 'Economic Growth', Our World in Data. CC BY-SA license.

The graph shows that prosperity barely increased through the ages. And then, in the middle of the 18th century, growth suddenly exploded exponentially. In their book 'The Second Machine Age', Erik Brynjolfsson and Andrew McAfee identify the cause of this turning point in history. The bend in the hockey stick curve coincides with the invention of the steam engine: the start of the first industrial revolution.

Not everyone was lifted by the waves of growing prosperity. On the contrary. The transition from an agrarian to an industrial society went hand in hand with terrible injustices. There was exploitation, there was child labour, and labourers worked fourteen hours a day while living in extreme deprivation. Things only started changing when our ancestors collectively demanded better living conditions. That is why social involvement in technological development does matter. The steam engine created an exponential increase in prosperity, but what is then done with that prosperity is not inherent to the steam engine. We, the people, decide on that.

Now we are on the brink of a third industrial revolution, Industry 4.0 or the second machine age. Whatever we wish to call it, we have to make sure that history does not repeat itself, and that this time around we guide the technological revolution in such a way that it benefits everyone.

I do not believe that blockchain will achieve that for us in some miraculous way or other. We will have to enforce it ourselves. Because that is the danger of techno-optimism: the belief that technology automatically leads to the best possible outcome and that therefore we don’t have to take responsibility.

Kind regards,

Tessel

Geert-Jan Bogaerts

Monday 16-10-2017 15:42

Hi Tessel,

Techno-optimism does not relieve us of our duty to act and to critically question!

So even though I consider technology to be both cause of and solution to many of our problems, I still think that parties like Gr1p and VPRO must promote our ongoing critical questioning of that technology.

Regards,
GJ Bogaerts

Tessel Renzenbrink

Monday 16-10-2017 17:01

Hi Geert-Jan,

Thanks for the lively correspondence. It was interesting to exchange thoughts with you. 

Good to continue on the 25th!

Kind regards,
Tessel

Other conversations

Curious what the other duos had to say? Check out all four 'Dear CEO' conversations here.

The writers will finish their conversation live during our 'We Know How You Feel Tonight' debate night on Wednesday October 25th 2017 in De Effenaar, Eindhoven.