Emily Bell’s seminal speech on the relationship between journalism and technology: It’s time to make up or break up

Last Friday, Emily Bell, director of the Tow Center for Digital Journalism at Columbia Journalism School, delivered what the World Editors Forum believes to be the year's most definitive speech on the future of journalism. In it, she challenged journalists and editors to hold dominant technology companies publicly accountable, to take the lead in technological innovation in news, and to help shape the 'new public sphere'. This is an edited version of her speech to the Reuters Institute in Oxford, published with permission.

by WAN-IFRA Staff | November 26, 2014

From Paul Reuter to John Reith at the BBC, the pioneers of journalism were also pioneers of communications technology. Today, however, we have reached a point of transition where news spaces are no longer owned by newsmakers. The press is no longer in charge of the free press and has lost control of the main conduits through which stories reach audiences. The public sphere is now operated by a small number of private companies based in Silicon Valley.

Professional journalism is augmented by untold numbers of citizen journalists who now break news, add context and report through social platforms. To have our free speech standards, our reporting tools and publishing rules set by unaccountable software companies is a defining issue not just for journalism but the whole of society.

I am not going to argue that this is a reversible trend. It isn’t. But I am going to argue that journalism has an important role in building and deploying new technologies, shaping non-commercial parts of a new public sphere and holding to account these new extensive systems of power.

Silicon Valley Executive to editors: “You lost”

A few years ago, a number of us attended a Saïd Business School 'Silicon Valley comes to Oxford' event. Ahead of the main symposium, several editors held a small side meeting with a group of successful entrepreneurs from Silicon Valley. Founders of Google, Facebook, LinkedIn (all young men, if I remember correctly) and other businesses agreed to discuss the issues facing the news industry. Inevitably, talk turned to how the business models of the press had come under enormous pressure from the disaggregation of news and the shifting of advertising revenue from print display models to search.

One of the executives, who was from a large and highly successful search company, finally lost his patience:

‘Look!’ he said, ‘We listened to what users wanted – to find information more easily – and we listened to what advertisers wanted – to reach people who might want to buy their products or services, and we connected them. Media companies didn’t do this, and as a result you lost’.

To hear how Silicon Valley’s most successful company viewed our failure was like an early introduction to the ice bucket challenge.

Huddling in a rather resentful knot of journalists afterwards, a senior editor piped up: 'This is all very well,' he said, 'but they are not really in our business, are they?'

If only that were true, how unnecessary this conversation would be.

Clash of cultures

The ‘two cultures’ of engineering and journalism are very different. They do not share the same motivations, they have not shared the same skills, they do not seek the same outcomes and they certainly do not share the same growth and revenue models. Yet now they occupy the same space in terms of conveying news and discussion to a broad public.

In one way the editor was right. Silicon Valley technology companies don't employ journalists to report, they don't tell stories, they don't seek cultural and political impact for the furtherance of democracy, and their core purpose extends little beyond delivering utility to users and returning money to shareholders. On the other hand, I thought then, and strongly believe now, that even as a distinct field of practice, journalism's future is inextricably linked to, and increasingly dependent on, communication technologies.

The fourth estate, which liked to think that it operated in splendid isolation from other systems of money and power, has slipped suddenly and conclusively into a world where it no longer owns the means of production, nor controls the routes to distribution.

How this happened is pretty well documented. None of us hold Stanford engineering PhDs. We lacked the fundamental technological literacy to understand how these new systems of distribution and expression were going to emerge; we lacked the institutional will or insight to move swiftly enough in the right directions; and we were held back in transformation by large, legacy organisations and the revenues that came with them. And doing journalism well is a hard, resource-hungry business.

News companies make it hard to publish; social media platforms make it easy to publish.

Journalism and free expression are now part of the commercial sphere

Consequently nearly everything these days is published or shared at some point on a social platform. As news organisations cease to print physical newspapers, as linear television struggles to survive the buffeting of on-demand services, as services become not just digital first but digital only, journalism and free expression become part of a commercial sphere where the activities of news and journalism are marginal.

It is worth noting that whilst engineers have developed Facebook, YouTube, Twitter, LinkedIn, Instagram, Reddit, Pinterest, Ello, Medium, Kickstarter and others, not one existing journalism or media company has conceived of or developed a widely adopted social platform. And only two, MySpace and Reddit, were acquired by media companies, by News Corp and Condé Nast respectively. That is quite a record for twenty years.

Equally, social media platforms have been insistent that they are not interested in the difficult and expensive business of employing actual journalists or making ‘editorial decisions’.

Their culture is as alien to reporting and editing as ours is to designing social software.

Of course, every algorithm contains editorial decisions, every piece of software design carries social implications. If the whole world connects at high speed in 140 characters it changes the nature of discourse and events.

It is thrilling and empowering, it is terrifying and threatening.

The language of news is shaped now by engineering protocols, not by newsroom norms, and on the whole the world is a better place for it.

If there is a free press, journalists are no longer in charge of it

Engineers who rarely think about journalism or cultural impact or democratic responsibility are making decisions every day that shape how news is created and disseminated.

In creating these amazingly easy-to-use tools, in encouraging the world to publish, the platform technologies now have a social purpose and responsibility far beyond their original intent. Legacy media did not understand what it was losing; Silicon Valley did not understand what it was creating.

Facebook has 1.3 billion users, around 20 per cent of the world’s population. A recent survey from Pew showed that 30 per cent of adults in America now get their news from Facebook.

YouTube has a billion users and a hundred hours of video are uploaded to the platform every minute. Twitter now has over 300 million users.

Weibo, the main Chinese social network, is bigger than all of them. Instagram, Snapchat, WhatsApp and WeChat are rapidly becoming default platforms for younger audiences.

To be clear, Facebook, Instagram, YouTube, Twitter and the like do not yet directly create stories, or film or record them; rather, the stories are pushed to the social platforms by news organisations that want Facebook's wide and powerful distribution, and by individuals 'sharing' and 'liking' news.

The Facebook monopoly

No other single branded platform in the history of journalism has had the concentration of power and attention that Facebook enjoys.

Facebook uses a series of complicated formulae to decide which news stories rise to the top of your page or news feed.

These mechanisms are known as algorithms. They dictate not only what we see but provide the foundation of the business model for social platforms. They are commercially sensitive and therefore remain secret. They can change without notice, and they can alter what we see without us even noticing.
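To make the point concrete, here is a purely hypothetical sketch of score-based feed ranking. Facebook's actual formulae are secret, so every feature and weight below is invented for illustration; the shape of the idea, scoring each post and sorting, is the general technique.

```python
# Hypothetical, drastically simplified feed ranker. Real platform
# algorithms are proprietary; these features and weights are invented.
from dataclasses import dataclass


@dataclass
class Post:
    title: str
    affinity: float    # how close the reader is to the poster (0-1)
    engagement: int    # likes, comments and shares so far
    age_hours: float   # how long ago the post was published


def score(post: Post) -> float:
    # Invented weights: favour close friends and fresh, popular posts.
    recency = 1.0 / (1.0 + post.age_hours)
    return 3.0 * post.affinity + 0.1 * post.engagement + 5.0 * recency


def rank_feed(posts: list[Post]) -> list[Post]:
    # Highest score first: this ordering IS the 'editorial' judgement.
    return sorted(posts, key=score, reverse=True)


feed = rank_feed([
    Post("Ice bucket challenge video", affinity=0.9, engagement=500, age_hours=2),
    Post("Breaking news from Ferguson", affinity=0.2, engagement=40, age_hours=1),
])
# Under these invented weights, the viral post from a close friend
# outranks the breaking news story.
```

The sketch shows why 'the algorithm' is never neutral: choosing to weight friendship and engagement over anything else is itself an editorial decision, baked into a formula rather than made in a morning news conference.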

A news executive described to me how the Facebook news algorithm is now the most important way readers find his journalists’ stories. He added: “But we don’t know much about how Facebook sorts stories, and that scares the hell out of me.”

If one believes the numbers attached to Facebook, then the world’s most powerful news executive is Greg Marra, the product manager for the Facebook News Feed. He is 26.

In a recent piece in The New York Times, Marra was quoted as saying: "We try to explicitly view ourselves as not editors. We don't want to have editorial judgment over the content that's in your feed. You've made your friends, you've connected to the pages that you want to connect to and you're the best decider for the things that you care about." However, even by deciding that people see news based on their social circles, Marra and his colleagues have made a profound editorial decision with broad societal impact.

'We don't want to have editorial judgment.' This is a refrain one hears over and over again within Silicon Valley companies: we are 'just a platform', the 'technology is neutral', 'we don't make editorial decisions'. I truly think engineers believe this, although it is demonstrably wrong.

Every time an algorithm is tweaked, an editorial decision is being made

When I interviewed Dick Costolo, chief executive of Twitter, a couple of years ago, I congratulated him on running the free press of the 21st century. He grimaced and countered, 'That's not really how I like to see it'. Silicon Valley entrepreneurs are not stupid; they know that editing and shaping culture is a hard, thankless, politically charged and financially unprofitable business.

But their unintentional editing carries with it unpredictable consequences. This summer, the academic and blogger Zeynep Tufekci, one of the most intelligent commentators on sociology, media and technology, was following the social unrest in the St Louis suburb of Ferguson after police shot an unarmed young black man, Michael Brown. She noted that whilst her Twitter feed was full of reports from Ferguson, nothing appeared on Facebook. Overnight, as the Facebook algorithm worked its filtering magic, stories began to appear, but long after the first reports and discussions.

Facebook's algorithm had decided that Tufekci was not as immediately interested in the unrest of a small Missouri suburb as she would be in the ice bucket challenge. Without other social signals from less filtered platforms, Tufekci wondered whether she would ever have seen events in Ferguson on Facebook at all.

In a world where we navigate our daily lives through social platforms, just how this information reaches us, what is on a ‘trending’ list, how these algorithms work, becomes not just of marginal interest but a central democratic concern.

Tufekci suggested that without what she called “the neutral side of the Internet” such as livestreams, and also Twitter, where feeds are determined not by opaque corporate algorithms but by individual choices, people might not be talking about Ferguson at all.

Accountability and ethics in the new public sphere

The general public has been relatively unaware of how these intimate social platforms might be used.

In June this year, the findings of an academic experiment were published in which Facebook had manipulated the news feeds of 700,000 users for a week, to see how different kinds of news in the feed might affect the users' moods. The answer was that happy posts did indeed make people more likely to be happy.

It was a legal academic experiment, but when the findings were made public there was predictable uproar. How dare Facebook literally toy with our emotions?

Facebook is under no obligation to disclose how it manipulates the news feed; it might well have conducted thousands of such experiments without anyone knowing at all. It was also striking that the public's expectation of how information reaches them is still, well, relatively naive.

If Facebook can nudge your emotions towards happiness or sadness by manipulating what you see, can it use obscure algorithms to influence something more sinister, such as, for instance, the way we vote?

Well, yes it can, as it turns out.

In 2010 Facebook conducted another experiment to see whether placing voting prompts in certain feeds increased turnout. It did, in fact. As the Harvard law professor Jonathan Zittrain asked: what would happen if the Facebook founder Mark Zuckerberg decided to tweak an algorithm so that only voters who favored a particular party or candidate were prompted to vote?

Twitter is now an editor

Twitter, perhaps the most useful tool for journalists since the invention of the telephone, has avoided filtering processes so far, but it is now under pressure from its status as a publicly listed company and is finding new 'editing' challenges of its own.

When Isis circulated the first videos of the beheading of American journalist James Foley earlier this year, they did it through Twitter. In a departure from established practice this led to Dick Costolo, the chief executive of the company, announcing that not only would the account distributing the video be closed, but so would any account retweeting the video.

An open and clearly editorial decision, Costolo’s action sat uneasily with those who had mistakenly thought Twitter was a ‘free’ platform open to all.

To those of us used to making editorial decisions it seemed a completely sensible call.

Citizen journalists, professional journalists and news companies are trying to work out how to secure more attention for their work in an environment they don’t control, and technologists are struggling to come to terms with the full implications of being news agencies to the world.

A senior executive of a social platform admitted to me recently that they knew editing their platform for problematic content was a persistent and growing problem. 'But we have no system for it,' he said. 'We scramble a small group into a war room and make decisions on an ad hoc basis. We know it is a problem.'

Journalism, Gmail and the Surveillance State

The most vivid example of the friction between the new platforms and the traditional role of the press lies, of course, in the remarkable set of stories published by Alan Rusbridger and his team. We saw through the excellent work of the Guardian and others on the NSA leaks brought to light by whistleblower Edward Snowden that the tools we use for journalism (Gmail, Skype, social media) are already fatally compromised by being part of a surveillance state. Platforms like Google were aghast, supposedly, at how their infrastructure was tapped for information by security agencies.

This week Ethan Zuckerman, who directs the Civic Media Lab at MIT, delivered a very thought-provoking talk at the Tow Center as part of a series we have called 'Journalism After Snowden', in which he argued persuasively that if the leaks of Edward Snowden taught us anything, it must be that journalism now has a role in creating non-surveilled spaces.

In order to preserve our role in any robust way, we must stop relying solely on the tools and platforms of others and build our own.

I do not think it is feasible for journalism to have a completely adversarial relationship with technology companies, but I also think it is absolutely imperative that there is a public sphere, of which journalism is a part, which is not wholly reliant on them.

Three initiatives for change

It is far too easy to throw up our hands in the face of Google and say ‘oh we will never catch up’. Twenty years ago Google was a doctoral thesis.

Ten years ago Twitter didn't exist. This morning at least some of you had not heard of WhatsApp. Change in technology is constant. Yes, as part of this, journalists – and editors – should learn to code, learn programmatic thinking, and be able to understand the world they operate in.

Instead of news executives enjoying monthly visits to the Googleplex to play around on the bicycles, they should convene serious forums about archiving, moderation, deletion, censorship, submission of user information to the authorities. These are all critical social issues at the heart of both fields.

The second unfashionable and unpopular call would be for regulation. In the US President Obama’s strong statement on net neutrality took many by surprise but is a welcome intervention. Journalism still has a powerful voice in influencing issues of regulation and it should use both its corporate presence and intellectual capital to surface and interrogate some of the issues we are seeing today.

And the third, and most achievable, is to report. Report, report, report. Cover technology as a human rights and political issue as if it were Parliament. Maybe even with more verve and clarity, were that possible. It is just as interesting and about ten thousand times more important. The beats of data, privacy and algorithmic accountability currently either don't exist or are inadequately staffed.

We have to stop coverage of technology being about queueing for an iPhone and make it about society and power. We need to explain these new systems of power to the world and hold them accountable. It is after all what we do best.
