25th January 2018
By Tom Law

The New Press Ethics

Kjersti Løken Stavrum

In the digital age, the choice between what you can do and what you choose to do often boils down to a question of ethics. Unfortunately, those decisions are hidden from most of us.

A few years ago, a programmer approached me after I had just finished a presentation for reporters on ethics in journalism. At the time I was secretary general of the Norwegian Press Association and regularly visited newsrooms across the country, challenging reporters and editors with current issues: should you publish this story or not? The programmer’s message was of a different kind. He was concerned about how much journalists, editors and CEOs really understood about where the core of ethical maneuvering had moved in the digital age.

He encouraged me to pay much more attention to data, algorithms and the development of artificial intelligence, arguing that these areas should also be part of the discussions on press ethics. Not least because these choices receive little or no scrutiny from readers. And they are not part of the debate on the development of journalism as such. Yet.

How digital journalistic products are programmed, how algorithms work, and how big data are harvested and utilized are increasingly matters decided by the programmer rather than by the editor-in-chief or the CEO. Neither the short-term nor the long-term impacts of the choices that are made are easy to foresee and understand, whether for readers, editors or CEOs.

The ethical decisions are virtually hidden under the hood.

In her book “Weapons of Math Destruction”, Cathy O’Neil describes some of these brand new dilemmas and pitfalls of the digital age. She writes: “For many of the businesses running these rogue algorithms, the money pouring in seems to prove that their models are working. Look at it through their eyes and it makes sense. When they’re building statistical systems to find customers or manipulate desperate borrowers, growing revenue appears to show that they’re on the right track. The software is doing its job. The trouble is that profits end up serving as a stand-in, or proxy, for truth. We’ll see this dangerous confusion crop up again and again.”

Last fall, when the Swedish digital economist Anna Felländer mentioned “ethics in algorithms” in a side remark during a debate at a Google seminar in Stockholm, we at the Tinius Trust immediately asked her to elaborate on these questions in a Tinius Talk for us. These were questions the Trust had already decided to prioritize on our agenda for 2018.

You can find Felländer’s article for the Tinius Trust here. In the article “AI Sustainability — A new form of CSR” she argues that this is a wake-up call for boards: “Going forward an ‘AI Sustainability Strategy’ will become the moral licence to operate. Thus, the boards of all companies aiming for a digital future should address the question: How do we deal with the ethical aspects of our data and algorithms?”

This week, the annual Edelman Trust Barometer was released. Edelman reports a reassuring rise of a few points in trust in journalism, while trust in social networks and search is declining. Interestingly enough, the technology sector as a whole remains the most trusted in the Edelman Trust Barometer 2018, yet technology-powered social networks and search are being questioned at scale for the first time.

In his intro, Richard Edelman quotes the Chinese philosopher Confucius:

“A state cannot survive without the confidence of its people.”

That may be so, and many of us share a strong anxiety about the repercussions of fake news, digital trolls and propaganda. But my guess is that we will experience an increasing awareness of the (un)ethical choices hidden in the digital infrastructure, and that this awareness will naturally lead to new insecurity and a lack of confidence, given how difficult it is to discover and understand the choices that are made.

I share this concern with Nic Newman, who in his recent report Journalism, Media and Technology Trends and Predictions 2018 writes, under the heading “An Uncertain Future”, that “We’ll be increasingly worried about who programmes the algorithms”.

Some of these worries can be addressed through conscientious compliance with the GDPR, but only if the focus of this compliance is the readers and users, and not fear of a huge fine from the European Commission.


Main photo: Press stand Paris – Florian Plag (Wikimedia Commons CC-BY-2.0)